Hi,
I want to query Flink's REST API from my IDE at runtime in order to get the jobID of the job that is currently running. Is there any way to do this? I found the RestClient class, but I can't seem to figure out how exactly to make this work. Any help much appreciated.

Best,
Annemarie

--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
Hi Annemarie,

You need to use an HTTP client to connect to the JobManager's REST endpoint:

    // Creating a HttpClient object
    CloseableHttpClient httpclient = HttpClients.createDefault();

    // Creating a HttpGet object
    HttpGet httpget = new HttpGet("http://${jobmanager}:${port}/jobs");

    // Executing the GET request
    HttpResponse httpresponse = httpclient.execute(httpget);

From httpresponse you will get the details of all running jobs.

On Fri, May 22, 2020 at 9:22 PM Annemarie Burger <[hidden email]> wrote:
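As a follow-up sketch: once you have the response body from /jobs, you still need to pull the job IDs out of the JSON. The sample body below follows the shape of Flink's /jobs response (a "jobs" array of id/status pairs); the regex-based extraction and the sample ID are only illustrations — a real client would use a proper JSON library.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class JobsResponseSketch {
    // Pull the "id" fields out of a /jobs response body.
    // A regex keeps this sketch dependency-free; use a JSON parser in real code.
    static List<String> extractJobIds(String body) {
        List<String> ids = new ArrayList<>();
        Matcher m = Pattern.compile("\"id\"\\s*:\\s*\"([^\"]+)\"").matcher(body);
        while (m.find()) {
            ids.add(m.group(1));
        }
        return ids;
    }

    public static void main(String[] args) {
        // Hand-written sample in the shape the /jobs endpoint returns.
        String body =
            "{\"jobs\":[{\"id\":\"a5b7ad5a54c623f235bdfa71e1a0c3e4\",\"status\":\"RUNNING\"}]}";
        System.out.println(extractJobIds(body));
    }
}
```

In your own program you would obtain the body from httpresponse.getEntity() (e.g. via EntityUtils.toString) before passing it to a helper like this.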
Hi,
Thanks for your response! I can't seem to get past a "java.net.ConnectException: Connection refused" though. Below is the relevant code and exception; any idea what I'm doing wrong?

    Configuration config = new Configuration();
    config.setString(JobManagerOptions.ADDRESS, "localhost");
    config.setInteger(RestOptions.RETRY_MAX_ATTEMPTS, 10);
    config.setLong(RestOptions.RETRY_DELAY, 1000);
    config.setString(RestOptions.ADDRESS, "localhost");

    // Unsure if I need this
    ExecutorService ex = WebMonitorEndpoint.createExecutorService(
            config.getInteger(RestOptions.SERVER_NUM_THREADS),
            config.getInteger(RestOptions.SERVER_THREAD_PRIORITY),
            "name");

    StreamExecutionEnvironment env =
            StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(config);

    CloseableHttpClient httpClient = HttpClients.createDefault();

    // Same problem when trying with port 6123
    HttpGet httpget = new HttpGet("http://localhost:8081/jobs");
    HttpResponse httpresponse = httpClient.execute(httpget);

Exception in thread "main" org.apache.flink.hadoop.shaded.org.apache.http.conn.HttpHostConnectException: Connect to localhost:8081 [localhost/127.0.0.1, localhost/0:0:0:0:0:0:0:1] failed: Connection refused: connect
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:159)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:359)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:381)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
    at gellyStreaming.gradoop.model.GraphState.<init>(GraphState.java:113)
    at gellyStreaming.gradoop.model.SimpleTemporalEdgeStream.buildState(SimpleTemporalEdgeStream.java:473)
    at gellyStreaming.gradoop.model.Tests.incrementalState(Tests.java:261)
    at gellyStreaming.gradoop.model.Tests.main(Tests.java:417)
Caused by: java.net.ConnectException: Connection refused: connect
    at java.base/java.net.PlainSocketImpl.connect0(Native Method)
    at java.base/java.net.PlainSocketImpl.socketConnect(PlainSocketImpl.java:101)
    at java.base/java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:399)
    at java.base/java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:242)
    at java.base/java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:224)
    at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:403)
    at java.base/java.net.Socket.connect(Socket.java:609)
    at org.apache.flink.hadoop.shaded.org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:75)
    at org.apache.flink.hadoop.shaded.org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
    ... 13 more
StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(config) does not actually create any resources yet; that only happens when you run a job. Upon execute() the Flink cluster is started, the job is run, and once the job finishes (and execute() returns) the cluster shuts down. So you can only query the REST API while the cluster is running: you will have to call execute() and query the REST API from a separate thread.

FYI: you should never query port 6123 via HTTP, since that is what Akka runs on.

On 25/05/2020 12:46, Annemarie Burger wrote:
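The execute-in-one-thread, poll-from-another pattern described above can be sketched with plain Java concurrency. The sleeping task below is only a stand-in for the blocking env.execute() call, and the loop body marks where the HTTP GET against the REST API would go:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PollWhileRunning {
    // Stand-in for env.execute(): blocks for a while, like a running job.
    static void blockingJob() throws InterruptedException {
        Thread.sleep(200);
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Run the blocking call in a background thread...
        Future<?> job = pool.submit(() -> { blockingJob(); return null; });

        // ...and poll from the main thread while the job is still running.
        int polls = 0;
        do {
            polls++;   // here you would issue the HTTP GET to http://localhost:8081/jobs
            Thread.sleep(50);
        } while (!job.isDone());

        pool.shutdown();
        System.out.println("polled " + polls + " times while the job ran");
    }
}
```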
Hi,
Thanks for your reply and explanation! Do you know of any way for a job to retrieve its own jobID while it's still running?

Best,
Annemarie
If you set DeploymentOptions.ATTACHED to false, then execute() does not block until the job finishes and instead returns a DetachedJobExecutionResult from which you can retrieve the job ID. If you need to know when the job finishes, you will have to continuously query the REST API; this is the only way to do it with the StreamExecutionEnvironment that I'm aware of.

On 25/05/2020 14:34, Annemarie Burger wrote:
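To sketch the second half of that: detecting completion comes down to polling /jobs/<jobid> and inspecting the state field of the response. The helper below shows only that check, on hand-written sample bodies; the JSON shape and the set of terminal states (FINISHED/FAILED/CANCELED) are assumptions based on Flink's REST API, and the regex stands in for a real JSON parser:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class JobStateSketch {
    private static final Pattern STATE = Pattern.compile("\"state\"\\s*:\\s*\"([A-Z]+)\"");

    // Given the body of GET /jobs/<jobid>, report whether the job reached a terminal state.
    static boolean isTerminal(String body) {
        Matcher m = STATE.matcher(body);
        if (!m.find()) {
            throw new IllegalArgumentException("no state field in response");
        }
        String state = m.group(1);
        return state.equals("FINISHED") || state.equals("FAILED") || state.equals("CANCELED");
    }

    public static void main(String[] args) {
        // Sample bodies in the assumed shape of a /jobs/<jobid> response.
        System.out.println(isTerminal("{\"jid\":\"abc\",\"state\":\"RUNNING\"}"));
        System.out.println(isTerminal("{\"jid\":\"abc\",\"state\":\"FINISHED\"}"));
    }
}
```

In a real poll loop you would fetch the body over HTTP, sleep between attempts, and stop once isTerminal returns true.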