Organize env using files


Organize env using files

andy
I have 3 different env files: test, staging and production. Each of them has different parameters, like the Kafka host, endpoint URLs, the Redis connection host…
I read about `https://ci.apache.org/projects/flink/flink-docs-stable/dev/best_practices.html#register-the-parameters-globally`, but it has a few downsides: if I have a model class (for a transformation) and want to use an env variable, I have to pass it in through the constructor (because the global parameters can only be accessed inside the runtime context).
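For reference, this is the pattern from that page as I understand it (the class name, properties file name and key are just examples from my setup, not anything official):

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class GlobalParamsJob {
    public static void main(String[] args) throws Exception {
        // load one of the test/staging/production .properties files and register it globally
        ParameterTool params = ParameterTool.fromPropertiesFile("staging.properties");
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.getConfig().setGlobalJobParameters(params);

        env.fromElements("a", "b", "c")
           .map(new MyMapper())
           .print();

        env.execute("global-params-example");
    }

    // the values are only reachable through the runtime context, so only rich
    // functions can read them; a plain model class has to get them via its constructor
    public static class MyMapper extends RichMapFunction<String, String> {
        private transient String kafkaHost;

        @Override
        public void open(Configuration conf) {
            ParameterTool globals = (ParameterTool)
                    getRuntimeContext().getExecutionConfig().getGlobalJobParameters();
            kafkaHost = globals.get("kafka.host");
        }

        @Override
        public String map(String value) {
            return value + " (kafka: " + kafkaHost + ")";
        }
    }
}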

I wonder if there is any better solution for accessing the env anywhere in our jobs? I'd be fine with having a class that stores all the env values, and if I could somehow specify the env as "prod" I could init the ProdEnv class… I'm looking for any ideas here :)
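Something like this is what I have in mind, just to make it concrete (the class name, file names and keys are made up, nothing that exists yet):

import java.io.InputStream;
import java.io.Serializable;
import java.util.Properties;

// hypothetical holder for the per-env values; one .properties file per env
// (test.properties, staging.properties, production.properties) on the classpath
public class EnvConfig implements Serializable {
    private final String kafkaHost;
    private final String redisHost;
    private final String endpointUrl;

    private EnvConfig(Properties p) {
        this.kafkaHost = p.getProperty("kafka.host");
        this.redisHost = p.getProperty("redis.host");
        this.endpointUrl = p.getProperty("endpoint.url");
    }

    // e.g. EnvConfig.load("production") picks production.properties
    public static EnvConfig load(String envName) throws Exception {
        Properties p = new Properties();
        try (InputStream in = EnvConfig.class.getResourceAsStream("/" + envName + ".properties")) {
            p.load(in);
        }
        return new EnvConfig(p);
    }

    public String kafkaHost() { return kafkaHost; }
    public String redisHost() { return redisHost; }
    public String endpointUrl() { return endpointUrl; }
}

In main I would do something like EnvConfig config = EnvConfig.load(args[0]) and, since it's serializable, hand the whole object to the functions that need it instead of wiring every single value through a constructor. Not sure if that's the idiomatic way in Flink though.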

Thanks a bunch,

Andy

Re: Organize env using files

Rafi Aroch

On Wed, Apr 17, 2019 at 7:07 AM Andy Hoang <[hidden email]> wrote:
> I have 3 different env files: test, staging and production. Each of them has different parameters, like the Kafka host, endpoint URLs, the Redis connection host…
> I read about `https://ci.apache.org/projects/flink/flink-docs-stable/dev/best_practices.html#register-the-parameters-globally`, but it has a few downsides: if I have a model class (for a transformation) and want to use an env variable, I have to pass it in through the constructor (because the global parameters can only be accessed inside the runtime context).
>
> I wonder if there is any better solution for accessing the env anywhere in our jobs? I'd be fine with having a class that stores all the env values, and if I could somehow specify the env as "prod" I could init the ProdEnv class… I'm looking for any ideas here :)
>
> Thanks a bunch,
>
> Andy