Re: Job manager URI rpc address:port

Posted by Som Lima
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Job-manager-URI-rpc-address-port-tp34437p34506.html

This is the code I was looking for; it will allow me to connect programmatically to a remote JobManager, just as with a Spark remote master.
The Spark master shares the compute load with its slaves; in Flink's case, the JobManager does the same with its TaskManagers.


import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;

// Register custom key/value parameters globally on the job's ExecutionConfig.
Configuration conf = new Configuration();
conf.setString("mykey", "myvalue");
final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
env.getConfig().setGlobalJobParameters(conf);

I found it at the bottom of this page:

https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/batch/index.html
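
For completeness, that page also shows how the parameters are read back inside a rich function; a minimal sketch following its pattern (the class and field names here are illustrative):

import org.apache.flink.api.common.ExecutionConfig;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

public final class MyMapper extends RichMapFunction<String, String> {
    private String mykey;

    @Override
    public void open(Configuration parameters) throws Exception {
        super.open(parameters);
        // Fetch the global job parameters registered via setGlobalJobParameters().
        ExecutionConfig.GlobalJobParameters globalParams =
                getRuntimeContext().getExecutionConfig().getGlobalJobParameters();
        Configuration globConf = (Configuration) globalParams;
        mykey = globConf.getString("mykey", null);
    }

    @Override
    public String map(String value) {
        return mykey + ": " + value;
    }
}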




On Sun, 19 Apr 2020, 11:02 tison, <[hidden email]> wrote:
You can change the "jobmanager.rpc.address" and "jobmanager.rpc.port" options in flink-conf.yaml before running the program, or take a look at RemoteStreamEnvironment, which lets you configure the host and port.
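
For example, a minimal sketch of both environments, assuming a JobManager reachable at <hostip>:6123 and your job classes packaged in a jar (host, port, and jar path are placeholders):

import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Batch: connect to a JobManager running on another machine.
final ExecutionEnvironment batchEnv = ExecutionEnvironment.createRemoteEnvironment(
        "<hostip>", 6123, "/path/to/your-job.jar");

// Streaming: the same idea, backed by RemoteStreamEnvironment.
final StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.createRemoteEnvironment(
        "<hostip>", 6123, "/path/to/your-job.jar");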

Best,
tison.


Som Lima <[hidden email]> wrote on Sunday, 19 April 2020 at 5:58 PM:
Hi,

After running 
$ ./bin/start-cluster.sh
the following line of code defaults the JobManager to localhost:6123:

final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

which is the same as in Spark:

val spark = SparkSession.builder.master("local[*]").appName("anapp").getOrCreate()

However, if I wish to run the servers on a different physical computer, then in Spark I can do it this way in my IDE, using the Spark master URI:

val conf = new SparkConf().setMaster("spark://<hostip>:<port>").setAppName("anapp")

Can you please tell me the equivalent change to make so that I can run my servers and my IDE on different physical computers?