Submit Flink 1.11 job from java

Submit Flink 1.11 job from java

Flavio Pompermaier
Hi to all,
in my current job server I submit jobs to the cluster by setting up an SSH session with the JobManager host and running the bin/flink run command remotely (since the jar is put in the flink-web-upload directory). Unfortunately, this approach makes it very difficult to capture all the exceptions that a job submission could raise.
Is there a better way to invoke the execution of a main class contained in a jar file uploaded on the Job Manager? Ideally I would invoke the Flink REST API, but the problem is that I need to call some code after env.execute(), and that's not possible with the REST API: any Java code after env.execute() is discarded, while this does not happen if I use the CLI client.
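Just to make the REST idea concrete, this is roughly the call I have in mind (a rough sketch with Java 11's HttpClient, assuming the jar was already uploaded via POST /jars/upload; the host, port, jar id and entry class below are just placeholders):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RestJarRun {
        public static void main(String[] args) throws Exception {
            // Placeholder jar id: in practice it comes from the response of POST /jars/upload.
            String jarId = "abcdef-myjob.jar";
            // Placeholder JobManager address and entry class.
            URI runUri = URI.create("http://jobmanager-host:8081/jars/" + jarId
                    + "/run?entry-class=com.example.MyJob");

            HttpRequest request = HttpRequest.newBuilder(runUri)
                    .POST(HttpRequest.BodyPublishers.noBody())
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // On success the response body contains the id of the submitted job.
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

This gives me back the job id, but whatever the jar's main() does after env.execute() is still skipped, which is exactly my problem.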

I know that there was some client refactoring in Flink 1.11, but I haven't found a solution to this problem yet.

Thanks in advance for any help,
Flavio

Re: Submit Flink 1.11 job from java

godfrey he
hi Flavio,
Maybe you can try the env.executeAsync() method,
which just submits the job and returns a JobClient.
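A minimal sketch (the pipeline below is just a placeholder):

    import org.apache.flink.core.execution.JobClient;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class ExecuteAsyncExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // placeholder pipeline
            env.fromElements(1, 2, 3)
               .map(i -> i * 2)
               .print();

            // returns as soon as the job has been submitted,
            // instead of blocking until the job finishes like execute() does
            JobClient jobClient = env.executeAsync("my-job");
            System.out.println("Submitted job " + jobClient.getJobID());

            // the JobClient can then be used to track the job
            System.out.println("Current status: " + jobClient.getJobStatus().get());
        }
    }

The code after executeAsync() keeps running in your program, and the JobClient lets you react to the outcome of the submission.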

Best,
Godfrey


Re: Submit Flink 1.11 job from java

Flavio Pompermaier
The problem with env.executeAsync is that I need to load the job classes on the client side, and this is something I'd like to avoid because it's a source of problems.
I'd like to tell Flink to run a jar that is available somewhere (on the Flink instances, on the blob server, or on a network filesystem like DFS or HDFS).
Probably what I'm looking for is to run the CLI client from a remote host, but I think this is not possible right now.

Best,
Flavio


Re: Submit Flink 1.11 job from java

David Anderson
Flavio,

Have you looked at application mode [1] [2] [3], added in 1.11? It offers at least some of what you are looking for -- the application jar and its dependencies can be pre-uploaded to HDFS, and the main() method runs on the job manager, so none of the classes have to be loaded in the client.
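For example, on YARN the submission looks roughly like this (the HDFS paths are just placeholders):

    ./bin/flink run-application -t yarn-application \
        -Dyarn.provided.lib.dirs="hdfs:///flink/dist-libs" \
        hdfs:///jobs/my-application.jar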


Best,
David


