Application-specific loggers configuration

Application-specific loggers configuration

Gwenhael Pasquiers

Hi,

 

We’re developing the first of (we hope) many Flink streaming applications.

 

We’d like to package the logging configuration (log4j) together with the jar, since different applications will probably have different logging configurations (e.g. sending to different Logstash ports).

 

Is there a way to “override” the various log4j properties files that are in flink/conf/*.properties?

 

In our environment, the Flink binaries would be on the PATH, and each of our apps would consist of:

- Jar file
- App configuration files
- Log configuration files
- Startup script

 

B.R.

 

Gwenhaël PASQUIERS


Re: Application-specific loggers configuration

Aljoscha Krettek
Hi Gwenhaël,
are you using the one-yarn-cluster-per-job mode of Flink? That is, are you starting your Flink job like this (from the docs):

flink run -m yarn-cluster -yn 4 -yjm 1024 -ytm 4096 ./examples/flink-java-examples-0.10-SNAPSHOT-WordCount.jar

If you are, then this is almost possible with the current version of Flink. What you have to do is copy Flink’s conf directory to a separate directory that is specific to your job, and make your modifications to the log configuration etc. there. Then, when you start your job, you do this instead:

export FLINK_CONF_DIR=/path/to/my/conf
flink run -m yarn-cluster -yn 4 -yjm 1024 -ytm 4096 ./examples/flink-java-examples-0.10-SNAPSHOT-WordCount.jar

You can easily put this into your startup script, of course.
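
For illustration, such a per-application startup script might look roughly like this (the directory layout and jar name are only placeholders):

#!/usr/bin/env bash
# Hypothetical per-application startup script; paths and the jar name are examples only.
APP_HOME="$(cd "$(dirname "$0")" && pwd)"

# Point Flink at the application-specific conf directory
# (a copy of flink/conf containing this app's own log4j.properties).
export FLINK_CONF_DIR="$APP_HOME/conf"

flink run -m yarn-cluster -yn 4 -yjm 1024 -ytm 4096 "$APP_HOME/lib/my-streaming-app.jar"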

I said almost possible because this requires a small fix in bin/flink. Around line 130, this line:
FLINK_CONF_DIR=$FLINK_ROOT_DIR_MANGLED/conf
needs to be replaced by this line:
if [ -z "$FLINK_CONF_DIR" ]; then FLINK_CONF_DIR=$FLINK_ROOT_DIR_MANGLED/conf; fi

(We will fix this in the upcoming version and the 0.9.1 bugfix release.)
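
As for the application-specific log configuration itself, the per-job conf directory would then contain your own log4j.properties. Just as a sketch (the Logstash host and port are placeholders, and you would keep whatever file appender the copied configuration already defines):

# keep the existing file appender from the copied Flink conf and additionally ship events to Logstash
log4j.rootLogger=INFO, file, logstash

# hypothetical Logstash endpoint for this particular application
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.RemoteHost=logstash.example.com
log4j.appender.logstash.Port=4560
log4j.appender.logstash.ReconnectionDelay=10000

(The SocketAppender sends serialized log4j events, so the Logstash side would need a matching log4j input.)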

Does this help? Let us know if you are not using the one-yarn-cluster-per-job mode, then we'll have to try to find another solution.

Best,
Aljoscha





RE: Application-specific loggers configuration

Gwenhael Pasquiers

Hi!

 

Yes, we’re starting our job with “flink run --jobmanager yarn-cluster”.

 

So it’s perfect; we’ll use your fix and, when it’s out, we’ll switch to Flink 0.9.1.

 

B.R.

 
