Re: How to submit two Flink jobs to the same Flink cluster?

Posted by Fabian Hueske-2
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/How-to-submit-two-Flink-jobs-to-the-same-Flink-cluster-tp20675p20683.html

Hi Angelica,

The Flink cluster needs to provide a sufficient number of slots to process the tasks of all submitted jobs.
Besides that, there is no limit. However, if you run a very large number of jobs, you might need to tune a few configuration parameters.
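For example, a single TaskManager configured with 4 slots can run two jobs with parallelism 2 each. This is only a rough sketch of a standalone setup; the jar paths are placeholders for your own programs:

    # conf/flink-conf.yaml on the task manager machine
    taskmanager.numberOfTaskSlots: 4

    # submit both programs to the same standalone cluster
    ./bin/flink run -d -p 2 /path/to/job-a.jar
    ./bin/flink run -d -p 2 /path/to/job-b.jar

Each submission creates its own job graph, and the cluster schedules both as long as their combined parallelism fits into the available slots.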

Best, Fabian

2018-06-12 8:46 GMT+02:00 Sampath Bhat <[hidden email]>:
Hi Angelica

You can run any number of Flink jobs in a Flink cluster. There is no restriction as such, unless the jobs run into resource-sharing conflicts (e.g., two jobs trying to use the same port).
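As a sketch (the program arguments below are hypothetical and depend entirely on how your jobs parse their main() arguments), even submitting the same jar twice is fine as long as each instance gets its own external resources:

    # fine: each instance writes to its own output path
    ./bin/flink run -d my-job.jar --output /tmp/out-1
    ./bin/flink run -d my-job.jar --output /tmp/out-2

    # likely to clash: if the program binds a server socket on a fixed port,
    # the second instance will fail to bind
    ./bin/flink run -d my-job.jar --listen-port 9000
    ./bin/flink run -d my-job.jar --listen-port 9000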

On Tue, Jun 12, 2018 at 5:03 AM, Angelica <[hidden email]> wrote:
I have a Flink Standalone Cluster based on Flink 1.4.2 (1 job manager, 4 task slots) and want to submit two different Flink programs.
I am not sure whether this is possible at all, as some Flink archives say that a job manager can only run one job. If that is true, how can I get around this limitation? There is only one machine available for the Flink cluster, and we don't want to use a resource manager such as Mesos or YARN.

Any hints?