flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Luis Mariano Guerra
hi,

Submitting a job, I get an NPE here:

https://github.com/apache/flink/blob/master/flink-clients/src/main/java/org/apache/flink/client/CliFrontend.java#L781

Building from source and adding some prints, I found that this.lastJobExecutionResult seems to be null here:

https://github.com/apache/flink/blob/master/flink-clients/src/main/java/org/apache/flink/client/program/ClusterClient.java#L329

Any hint of what I may be doing wrong for this to fail like this?

Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Luis Mariano Guerra
For context: I have two other similar jobs in the same project that run without problems.

Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Fabian Hueske-2
Hi Luis,

this looks like a bug.
Can you open a JIRA issue [1] and provide a more detailed description of what you are doing (environment, DataStream / DataSet, how you submit the program), and maybe add a small program that reproduces the problem on your setup?

Thanks, Fabian

Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Luis Mariano Guerra
On Mon, Sep 19, 2016 at 8:02 PM, Fabian Hueske <[hidden email]> wrote:
Hi Luis,

this looks like a bug.
Can you open a JIRA issue [1] and provide a more detailed description of what you are doing (environment, DataStream / DataSet, how you submit the program), and maybe add a small program that reproduces the problem on your setup?

The problem was that I was catching an exception during setup and logging the error, but for some reason the logger doesn't output anything at that point. Is there a way to avoid the "log and print" problem during setup, or should I just print?
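
For readers hitting the same symptom, here is a minimal sketch of the pattern described above, assuming the exception is thrown while the pipeline is being built; the class name, sample elements, and job name are illustrative, not the actual job. If the setup exception is only logged and swallowed, main() returns without a job ever being executed, and flink run then surfaces the client-side NPE instead of the real cause, so printing and re-throwing keeps the failure visible on the CLI.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class SetupErrorSketch {

    private static final Logger LOG = LoggerFactory.getLogger(SetupErrorSketch.class);

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        try {
            // Pipeline construction; in the reported case this is where the setup
            // exception was thrown, before any job was submitted.
            env.fromElements("a", "b", "c")
               .map(new MapFunction<String, String>() {
                   @Override
                   public String map(String value) {
                       return value.toUpperCase();
                   }
               })
               .print();

            env.execute("setup-error-sketch");
        } catch (Exception ex) {
            // Logging alone may never reach the console of the submitting client,
            // so print and re-throw instead of swallowing the exception.
            LOG.error("job setup/execution failed", ex);
            ex.printStackTrace();
            throw ex;
        }
    }
}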
 

Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Maximilian Michels
Hi Luis,

That looks like a bug, but looking at the code I don't yet see how it may occur. We definitely need more information to reproduce it. Do you have an example job? Are you using master or a Flink release? Are your Flink cluster and your job compiled with the exact same version of Flink?

Cheers,
Max

Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Luis Mariano Guerra
On Tue, Sep 20, 2016 at 12:49 PM, Maximilian Michels <[hidden email]> wrote:
Hi Luis,

That looks like a bug, but looking at the code I don't yet see how it may occur. We definitely need more information to reproduce it. Do you have an example job? Are you using master or a Flink release? Are your Flink cluster and your job compiled with the exact same version of Flink?

I had a job that mapped from DataStream<String> (JSON) to DataStream<SpecificRecordBase> (Avro). During setup I had a try { setup... } catch (Exception ex) { logger.error("error ... ", ex); } in there; setup threw an exception, but since I was logging it rather than using System.out.println, I didn't see the error. BTW, this is the error in case it's useful for you:

java.lang.IllegalStateException: Expecting type to be a PojoTypeInfo
        at org.apache.flink.api.java.typeutils.AvroTypeInfo.generateFieldsFromAvroSchema(AvroTypeInfo.java:58)
        at org.apache.flink.api.java.typeutils.AvroTypeInfo.<init>(AvroTypeInfo.java:48)
        at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1585)
        at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1493)
        at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:752)
        at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:580)
        at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:381)
        at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:310)
        at org.apache.flink.api.java.typeutils.TypeExtractor.getMapReturnTypes(TypeExtractor.java:125)
        at org.apache.flink.streaming.api.datastream.DataStream.map(DataStream.java:506)

followed by:

 The program finished with the following exception:

java.lang.NullPointerException
        at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:781)
        at org.apache.flink.client.CliFrontend.run(CliFrontend.java:250)
        at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1002)
        at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1045)

which was the one I was seeing.

I solved it by replacing SpecificRecordBase with Object.
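
For reference, a sketch of the before/after described above; the class name, the sample input, and the parse() helper are hypothetical placeholders, not the actual job code, and the graph is only constructed, not executed.

import org.apache.avro.specific.SpecificRecordBase;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class AvroMappingSketch {

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> jsonStream = env.fromElements("{\"id\": 1}");

        // Before: declaring the abstract Avro base class as the element type made the
        // type extractor fail with "Expecting type to be a PojoTypeInfo" while the
        // graph was being built:
        //
        // DataStream<SpecificRecordBase> records = jsonStream
        //     .map(new MapFunction<String, SpecificRecordBase>() {
        //         @Override
        //         public SpecificRecordBase map(String json) { return parse(json); }
        //     });

        // After: mapping to Object (the workaround from this thread) sidesteps the
        // Avro-specific type extraction. Using the concrete generated record class
        // instead of the abstract base class would presumably also avoid it, but
        // that is an assumption, not something verified here.
        DataStream<Object> records = jsonStream
            .map(new MapFunction<String, Object>() {
                @Override
                public Object map(String json) { return parse(json); }
            });

        records.print();

        // env.execute() is omitted on purpose: the error in question occurs during
        // graph construction, before anything is submitted or run.
    }

    // Hypothetical stand-in for the real JSON -> Avro decoding.
    private static SpecificRecordBase parse(String json) {
        throw new UnsupportedOperationException("placeholder for JSON -> Avro decoding");
    }
}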


Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Maximilian Michels
Hi Luis,

With your feedback I was able to find the problem. I have created an issue [1], and a fix is available which will be in Flink 1.1.3 and Flink 1.2.0.


Thanks,
Max

[1] https://issues.apache.org/jira/browse/FLINK-4677

Re: flink run throws NPE, JobSubmissionResult is null when interactive and not isDetached()

Luis Mariano Guerra
On Mon, Sep 26, 2016 at 2:07 PM, Maximilian Michels <[hidden email]> wrote:
Hi Luis,

With your feedback I was able to find the problem. I have created an issue [1], and a fix is available which will be in Flink 1.1.3 and Flink 1.2.0.

Thanks,
Max

[1] https://issues.apache.org/jira/browse/FLINK-4677

Thank you!
