Ability to partition logs per pipeline


Ability to partition logs per pipeline

Chawla,Sumit

Hi All

Does Flink provide any way to separate the logs generated by a pipeline? How can we keep the logs from two pipelines separate so that it's easy to debug a pipeline's execution (something dynamic that automatically partitions the logs per pipeline)?

Regards
Sumit Chawla


Re: Ability to partition logs per pipeline

rmetzger0
Hi Sumit,

What exactly do you mean by pipeline?
Are you talking about cases where multiple jobs are running concurrently on the same TaskManager, or are you referring to parallel instances of a Flink job?




Re: Ability to partition logs per pipeline

Chawla,Sumit
Hi Robert

I actually mean both: scenarios where multiple jobs are running on the cluster, and where the same job could be running on multiple TaskManagers. How can we make sure that each job logs to a different file, so that logs are not mixed and it's easy to debug a particular job? Something like Hadoop YARN, where each attempt of a task produces a separate log file.

Regards
Sumit Chawla
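
[Editor's note: Flink does not route logs this way out of the box, but the layout Sumit is asking for (one log file per job, in the style of YARN's per-attempt logs) can be sketched with ordinary logging machinery. The following is a minimal, Flink-independent illustration using Python's stdlib logging; the job names and file paths are made up for the example.]

```python
import logging

def job_logger(job_id, log_dir="."):
    """Return a logger whose records go to a file dedicated to one job,
    mimicking YARN's one-log-file-per-attempt layout."""
    logger = logging.getLogger(f"job.{job_id}")
    logger.setLevel(logging.INFO)
    if not logger.handlers:  # avoid attaching duplicate handlers on repeated calls
        handler = logging.FileHandler(f"{log_dir}/{job_id}.log")
        handler.setFormatter(
            logging.Formatter("%(asctime)s [%(name)s] %(message)s"))
        logger.addHandler(handler)
        logger.propagate = False  # keep these records out of the shared root log
    return logger

# Two concurrently running jobs write to separate files:
job_logger("wordcount").info("checkpoint completed")
job_logger("sessionizer").info("restoring from savepoint")
```

The same idea carries over to a JVM logging framework: key the appender (file target) on a per-job identifier so that two jobs sharing a process never interleave their output.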






Re: Ability to partition logs per pipeline

Aljoscha Krettek
Hi,
I'm afraid that's not possible right now. The preferred way of running would be to have a YARN cluster per job; that way you can isolate the logs.

Cheers,
Aljoscha
