Re: Re: Re: How to deploy dynamically generated flink jobs?
Posted by
Yun Gao on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/How-to-deploy-dynamically-generated-flink-jobs-tp39040p39070.html
Hi Alexander,
Sorry, I might not fully understand the issue: do you mean the "flink" jar is the same jar as the Spring app fat jar, or are they separate jars? In general, the value needed for the jarFiles parameter is the absolute path of the jar file, so we need some logic to determine that path. For example, if the "flink" jar containing the UDFs is the same as the Spring app fat jar containing the execute call, we could use a method like [1] to find the containing jar; otherwise we would need some mapping from the job name to its flink jar.
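As a minimal sketch of the approach in [1]: the idea is to ask the class loader for the `.class` resource of a class packaged in the jar (e.g. a UDF class), and, if the resulting URL has the `jar:file:` scheme, strip it down to the jar's absolute path. The class name and paths below are hypothetical placeholders.

```java
import java.net.URL;

public class ContainingJarFinder {

    // Extracts the jar path from a "jar:file:/path/to.jar!/pkg/Cls.class"
    // resource URL string; returns null for non-jar URLs (e.g. classes
    // loaded from a plain directory or from the JDK runtime image).
    static String jarPathFromResourceUrl(String url) {
        if (url != null && url.startsWith("jar:file:")) {
            int bang = url.indexOf('!');
            if (bang > 0) {
                return url.substring("jar:file:".length(), bang);
            }
        }
        return null;
    }

    // Absolute path of the jar containing clazz, or null if it cannot
    // be determined. Same approach as Hadoop's ClassUtil.findContainingJar.
    public static String findContainingJar(Class<?> clazz) {
        String resource = clazz.getName().replace('.', '/') + ".class";
        ClassLoader loader = clazz.getClassLoader();
        if (loader == null) {
            loader = ClassLoader.getSystemClassLoader();
        }
        URL url = loader.getResource(resource);
        return url == null ? null : jarPathFromResourceUrl(url.toString());
    }

    public static void main(String[] args) {
        // For a class actually loaded from a jar, findContainingJar would
        // yield something like "/opt/app/my-flink-job.jar":
        System.out.println(jarPathFromResourceUrl(
                "jar:file:/opt/app/my-flink-job.jar!/com/example/MyUdf.class"));
    }
}
```

The returned path could then be passed as the jarFiles argument, e.g. to StreamExecutionEnvironment.createRemoteEnvironment(host, port, jarPath). Note that for a Spring Boot fat jar the resource URL points at the nested jar inside the fat jar, so this simple parsing may need adjustment in that case.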
Best,
Yun
[1] https://github.com/apache/hadoop/blob/8ee6bc2518bfdf7ad257cc1cf3c73f4208c49fc0/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/ClassUtil.java#L38
------------------Original Mail ------------------
Send Date: Fri Oct 30 04:49:59 2020
Subject:Re: Re: How to deploy dynamically generated flink jobs?
Thanks, Yun. Makes sense. How would you reference a jar file from inside another jar for such an invocation?
In my case I would have an interactive application - a Spring Boot web app - where the job is configured and StreamExecutionEnvironment.execute(jobName) is called.
The Spring app is a runnable fat jar with my "flink" jar packaged alongside other jars. How would I specify the location of that jar so that StreamExecutionEnvironment can find it?
Thanks,
Alex