Facing issues with Logback

Facing issues with Logback

Teena K
I have a single-node Flink instance with the required logback jars in the lib folder (logback-classic.jar, logback-core.jar, log4j-over-slf4j.jar). I have removed the log4j jars from the lib folder (log4j-1.2.17.jar, slf4j-log4j12-1.7.7.jar). 'logback.xml' is also correctly updated in the 'conf' folder. I have also included 'logback.xml' in the classpath, although it does not seem to be considered while the job runs; Flink only refers to the logback.xml inside the conf folder. I have updated pom.xml as per Flink's documentation in order to exclude log4j. I have some log entries inside a few map and flatMap functions and some log entries outside those functions (e.g. "program execution started").
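
For reference, a minimal sketch of the kind of job I mean (the class names, log messages, and the sample source are placeholders, not my actual job code). The logger calls in main() are the ones that always get written; the ones inside the FlatMapFunction are the ones that stop appearing:

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingJob {

    // Logger used in the driver program; these entries do show up in the log file.
    private static final Logger LOG = LoggerFactory.getLogger(LoggingJob.class);

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        LOG.info("program execution started"); // logged outside the transformations

        env.fromElements("a", "b", "c")
           .flatMap(new MyFlatMap())
           .print();

        env.execute("logging test");
    }

    public static class MyFlatMap implements FlatMapFunction<String, String> {
        // Logger used inside the transformation; these entries only appear
        // in the first run after the logback jars are replaced.
        private static final Logger LOG = LoggerFactory.getLogger(MyFlatMap.class);

        @Override
        public void flatMap(String value, Collector<String> out) {
            LOG.info("processing element {}", value);
            out.collect(value);
        }
    }
}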

When I run the job, Flink writes only those logs that are coded outside the transformations. The logs coded inside the transformations (map, flatMap, etc.) are not written to the log file. If this happened every time, I could assume that the Task Manager is simply not writing the logs. But Flink shows strange behavior here: whenever I update the logback jars in the lib folder (due to version changes), all logs (even those inside map and flatMap) are written correctly to the log file during the next job run. In any run after that, those logs are no longer written. This tells me that my 'logback.xml' and the other settings are correct, but I don't understand why the same settings stop working when the same job is run again.



Re: Facing issues with Logback

Fabian Hueske-2
Hi Teena,

thanks for reaching out to the mailing list with this issue. This does indeed sound like a bug in Flink and should be investigated.
We are currently working on the new 1.4 release and the testing phase will start soon, so it would make sense to include this problem in the testing and hopefully ship a bugfix with the next release.

I've created a JIRA issue to track the problem [1].

I left out the "affects version" field because you didn't mention your Flink version.
Can you update the JIRA issue or reply with your version?

Thank you,
Fabian

[1] https://issues.apache.org/jira/browse/FLINK-7990
