Re: Job manager URI rpc address:port
Posted by
Jeff Zhang on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Job-manager-URI-rpc-address-port-tp34437p34447.html
Som, let us know if you have any problems.
Thanks for the info and links.
I had a lot of problems; I am not sure what I was doing wrong.
Maybe there were conflicts with my Apache Spark setup. I think I may need to set up separate users for each development environment.
Anyway, I kept doing fresh installs, about four altogether I think.
Everything works fine now, including remote access to Zeppelin from machines across the local area network.
Next step: set up remote clusters.
Wish me luck!
Hi Som,
You can take a look at Flink on Zeppelin. In Zeppelin you can connect to a remote Flink cluster via a few configuration settings, and you don't need to worry about the jars; the Flink interpreter will ship the necessary jars for you. Here's a list of tutorials.
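A sketch of the interpreter settings this refers to (property names as found in Zeppelin 0.9's Flink interpreter; the host and port values are placeholders):

```properties
# In the Zeppelin UI: Interpreter -> flink -> properties
FLINK_HOME=/opt/flink                      # local Flink distribution used by the interpreter
flink.execution.mode=remote                # connect to an existing cluster instead of starting one
flink.execution.remote.host=192.168.1.10   # JobManager host of the remote cluster (placeholder)
flink.execution.remote.port=8081           # port of the remote cluster (placeholder)
```

With these set, paragraphs run in the %flink interpreter are submitted to the remote cluster, and the interpreter ships the jars it needs.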
Hi Tison,
I think I may have found what I want in example 22.
I need to create a Configuration object first, as shown.
Also, I think the flink-conf.yaml file may contain configuration for the client rather than the server, so editing it before starting is irrelevant.
I am going to play around and see, but if the Configuration class allows me to set configuration programmatically and override the yaml file, that would be great.
Thanks.
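For what it's worth, a minimal sketch of that programmatic route (host, port, and jar path are placeholders; this assumes Flink's DataStream API of roughly that era):

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RemoteEnvSketch {
    public static void main(String[] args) throws Exception {
        // Client-side settings passed in code take precedence over flink-conf.yaml.
        Configuration clientConfig = new Configuration();
        clientConfig.setString("akka.client.timeout", "600 s"); // example client-side option

        // Placeholder host/port/jar; the jar is shipped to the cluster on execute().
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
                "192.168.1.10", 6123, clientConfig, "/path/to/your-job.jar");

        env.fromElements(1, 2, 3).print();
        env.execute("remote sketch");
    }
}
```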
flink-conf.yaml does allow me to do what I need to do without making any changes to the client source code.
However, the RemoteStreamEnvironment constructor also expects jar files as its third parameter:
RemoteStreamEnvironment(String host, int port, String... jarFiles)
Creates a new RemoteStreamEnvironment that points to the master (JobManager) described by the given host name and port.
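As a sketch of that signature in use (the static factory on StreamExecutionEnvironment wraps this constructor; host, port, and jar path are placeholders):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Points the client at a remote JobManager; the jar(s) contain the job's user
// classes and are shipped to the cluster when execute() is called.
StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
        "192.168.1.10",            // JobManager host (placeholder)
        6123,                      // JobManager port (placeholder)
        "/path/to/your-job.jar");  // jar with the job classes (placeholder)
```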
You can change the flink-conf.yaml "jobmanager.rpc.address" or "jobmanager.rpc.port" options before running the program, or take a look at RemoteStreamEnvironment, which enables configuring the host and port.
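For reference, the corresponding entries in flink-conf.yaml would look like this (the full key names are jobmanager.rpc.address and jobmanager.rpc.port; the host IP is a placeholder):

```yaml
# Client-side flink-conf.yaml: where the client looks for the JobManager
jobmanager.rpc.address: 192.168.1.10   # placeholder host
jobmanager.rpc.port: 6123              # default RPC port
```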
Hi,
After running, the following line of code defaults the jobmanager to localhost:6123:
final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
which is the same as in Spark:
val spark = SparkSession.builder.master("local[*]").appName("anapp").getOrCreate()
However, if I wish to run the servers on a different physical computer, then in Spark I can do it this way, using the Spark master URI in my IDE:
val conf = new SparkConf().setMaster("spark://<hostip>:<port>").setAppName("anapp")
Can you please tell me the equivalent change to make so that I can run my servers and my IDE on different physical computers?
--
Best Regards
Jeff Zhang