Job manager URI rpc address:port

Posted by Som Lima
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Job-manager-URI-rpc-address-port-tp34437.html

Hi,

After running
$ ./bin/start-cluster.sh
the following line of code defaults the JobManager address to localhost:6123:

final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

which is the same as in Spark:

val spark = SparkSession.builder.master("local[*]").appName("anapp").getOrCreate()

However, I wish to run the servers on a different physical computer. In Spark I can do that from my IDE using the Spark master URI:

val conf = new SparkConf().setMaster("spark://<hostip>:<port>").setAppName("anapp")

Can you please tell me the equivalent change to make in Flink, so that I can run my servers and my IDE on different physical computers?
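
For reference, here is a sketch of what I am guessing the Flink equivalent might look like, assuming ExecutionEnvironment.createRemoteEnvironment is the right API and that conf/flink-conf.yaml on the cluster side has jobmanager.rpc.address set to that machine's address; the host, port and jar path below are placeholders:

import org.apache.flink.api.java.ExecutionEnvironment;

// "<jobmanager-host>", 6123 and the jar path are placeholders for my setup;
// the jar is the job artifact that gets shipped to the remote cluster.
final ExecutionEnvironment env = ExecutionEnvironment.createRemoteEnvironment(
        "<jobmanager-host>", 6123, "/path/to/my-job.jar");

I am not sure whether the port here should be the rpc port (6123) or the REST port (8081 by default), so please correct me if that guess is wrong.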