I want to manage the execution of Flink Jobs programmatically through Flink Monitoring API.
I.e. I want to run/delete jobs ONLY through the Monitoring API.
Now, it seems that Session Mode may fit my needs: “Session Mode: one JobManager instance manages multiple jobs sharing the same cluster of TaskManagers” (https://ci.apache.org/projects/flink/flink-docs-release-1.12/deployment/). However, I couldn’t find a way to start the API server (i.e. a JobManager) without already submitting a JAR file for job execution. Any suggestions?
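(For reference, a minimal sketch of the kind of REST calls involved, assuming the default JobManager REST endpoint at http://localhost:8081, Java 11's HttpClient, and the endpoint paths documented for the Flink 1.12 Monitoring API; the jar id and job id below are placeholders, and the initial POST /jars/upload multipart call is omitted for brevity:)

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FlinkRestSketch {
    public static void main(String[] args) throws Exception {
        String base = "http://localhost:8081";  // default REST port of the JobManager
        HttpClient client = HttpClient.newHttpClient();

        // List the jobs the JobManager currently knows about (GET /jobs).
        HttpRequest listJobs = HttpRequest.newBuilder(URI.create(base + "/jobs")).GET().build();
        System.out.println(client.send(listJobs, HttpResponse.BodyHandlers.ofString()).body());

        // Run a job from a jar uploaded earlier via POST /jars/upload.
        // "myjob.jar" is a placeholder for the jar id returned by that upload call.
        String jarId = "myjob.jar";
        HttpRequest runJob = HttpRequest.newBuilder(URI.create(base + "/jars/" + jarId + "/run"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"parallelism\": 1}"))
                .build();
        System.out.println(client.send(runJob, HttpResponse.BodyHandlers.ofString()).body());

        // Cancel a running job: PATCH /jobs/:jobid terminates it.
        // The job id is a placeholder for the id returned by the run call.
        String jobId = "00000000000000000000000000000000";
        HttpRequest cancelJob = HttpRequest.newBuilder(URI.create(base + "/jobs/" + jobId))
                .method("PATCH", HttpRequest.BodyPublishers.noBody())
                .build();
        System.out.println(client.send(cancelJob, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}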
Hi Barak,

Before you start the JobManager I don't think there is any API running at all. If you want to be able to submit/stop multiple jobs on the same cluster, session mode is indeed the way to go. But first you need to start the cluster (start-cluster.sh) [1].

Piotrek

Tue, 25 May 2021 at 14:10 Barak Ben Nathan <[hidden email]> wrote:
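A minimal sketch of that ordering, assuming a local standalone session cluster started with start-cluster.sh and the default REST port 8081: it simply polls GET /overview until the JobManager answers, after which the /jars and /jobs endpoints can be used for submission and cancellation.

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class WaitForSessionCluster {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // GET /overview only answers once the JobManager's REST endpoint is up.
        HttpRequest overview = HttpRequest.newBuilder(
                URI.create("http://localhost:8081/overview")).GET().build();
        for (int attempt = 0; attempt < 30; attempt++) {
            try {
                HttpResponse<String> resp = client.send(overview, HttpResponse.BodyHandlers.ofString());
                if (resp.statusCode() == 200) {
                    System.out.println("JobManager REST API is up: " + resp.body());
                    return;  // now the /jars and /jobs endpoints can be used
                }
            } catch (IOException e) {
                // Connection refused: the session cluster is not up yet.
            }
            Thread.sleep(1000);
        }
        System.err.println("Timed out waiting for the JobManager REST endpoint.");
    }
}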
Glad to hear it! Thanks for confirming that it works.

Piotrek

Wed, 26 May 2021 at 12:59 Barak Ben Nathan <[hidden email]> wrote: