Hi,
I am running Flink in YARN mode. My app consumes data from Kafka, does some internal processing, and then indexes the data into SolrCloud. I have a large SolrCloud setup (one collection per month per language, with the collection name derived from the date of the message read from Kafka) and need a configuration file to create the Solr writer instances.
My question is: how does one pass a config file to a Flink application? I am using the ParameterTool to pass simple parameters like the Kafka topic, but the Solr config file is quite large and I would like to read it as a file in the application code. How would one achieve this? Should the file be kept in HDFS and read from there, or is there a way to pass the config file contents through the ParameterTool? Any example code or references would be much appreciated.
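For context, here is roughly what I have in mind, just a sketch and not working code against my real setup; the SolrSink class, the "solr.config.path" parameter name, the HDFS path, and the parsing are all placeholders. Each parallel sink instance would read the config from HDFS in open() via Flink's FileSystem abstraction:

import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.fs.FSDataInputStream;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.stream.Collectors;

public class SolrSink extends RichSinkFunction<String> {

    private final String configPath;          // e.g. "hdfs:///configs/solr-collections.yaml" (placeholder)
    private transient String solrConfigText;  // loaded on each task manager in open()

    public SolrSink(String configPath) {
        this.configPath = configPath;
    }

    @Override
    public void open(Configuration parameters) throws Exception {
        // Read the config file once per parallel sink instance, straight from HDFS,
        // using Flink's FileSystem abstraction (works for hdfs://, file://, etc.).
        Path path = new Path(configPath);
        FileSystem fs = path.getFileSystem();
        try (FSDataInputStream in = fs.open(path);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(in, StandardCharsets.UTF_8))) {
            solrConfigText = reader.lines().collect(Collectors.joining("\n"));
        }
        // ... build the per-collection Solr writers from solrConfigText here ...
    }

    @Override
    public void invoke(String value) throws Exception {
        // ... pick the collection from the message date and index into Solr ...
    }
}

and then in main() wire it up with something like
stream.addSink(new SolrSink(params.getRequired("solr.config.path")));
where params comes from ParameterTool.fromArgs(args). Is that a reasonable approach, or is there a better-supported way?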
Thanks,
Dinesh