Flink cluster and Java 8

Flavio Pompermaier
Hi to all,

I was trying to make my Java 8 application run on a Flink 0.10.1 cluster.
I've compiled both the Flink sources and my app with the same Java version (1.8.0_72), and I've set env.java.home to point to my Java 8 JVM in the flink-conf.yaml of every node of the cluster.

I always get the following Exception:

java.lang.UnsupportedClassVersionError: XXX: Unsupported major.minor version 52.0

Is there any other setting I forgot to check? Do I also have to change the source and target to 1.8 in the Maven compiler settings of the main pom?
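
To double-check which JVM the TaskManagers actually use, one option is a trivial job that prints java.version from inside a map function. This is only a minimal sketch against the 0.10 DataSet API (the class name is just an example):

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.ExecutionEnvironment;

public class JavaVersionCheck {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        // The map function runs on the TaskManagers, so this reports the JVM that
        // actually executes the tasks (i.e. whether env.java.home took effect).
        env.fromElements(1)
           .map(new MapFunction<Integer, String>() {
               @Override
               public String map(Integer ignored) {
                   return "task JVM: " + System.getProperty("java.version");
               }
           })
           .print();
    }
}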

Best,
Flavio

Re: Flink cluster and Java 8

Flavio Pompermaier
I've checked the compiled classes with javap -verbose and indeed they had major version 51 (Java 7).
So I've changed the source and target to 1.8 in the main pom.xml, and now the generated .class files have major version 52.
Unfortunately now I get this error:

[ERROR] /opt/flink-src/flink-java/src/main/java/org/apache/flink/api/java/typeutils/runtime/TupleSerializer.java:[104,63] incompatible types: void cannot be converted to java.lang.Object

How can I fix it? I also tried to upgrade the Maven compiler plugin to 3.5, but it didn't help :(
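
By the way, the major version check can also be done without javap by reading the class file header directly. A minimal sketch (the .class file path is passed as the first argument):

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersion {
    public static void main(String[] args) throws IOException {
        // Class file layout: u4 magic (0xCAFEBABE), u2 minor_version, u2 major_version.
        // Major version 51 corresponds to Java 7, 52 to Java 8.
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            int magic = in.readInt();
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();
            System.out.printf("magic=%08X, major=%d, minor=%d%n", magic, major, minor);
        }
    }
}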

Best,
Flavio

Re: Flink cluster and Java 8

Flavio Pompermaier
I've fixed it by changing the copy method in the TupleSerializer as follows:

@Override
public T copy(T from, T reuse) {
    for (int i = 0; i < arity; i++) {
        // Copy each field with its own serializer and write it back into the reuse tuple.
        Object copy = fieldSerializers[i].copy(from.getField(i));
        reuse.setField(copy, i);
    }
    return reuse;
}

And by commenting out line 50 in CollectionExecutionAccumulatorsTest:

assertEquals(NUM_ELEMENTS, result.getAccumulatorResult(ACCUMULATOR_NAME)); 

I hope it helps.

Re: Flink cluster and Java 8

Flavio Pompermaier
Is anyone looking into this? Java 7 reached its end of life in April 2015 with its last public update (number 80), and the ability to run Java 8 jobs will become more and more important in the future. IMHO, the default target of the Maven compiler plugin should be set to 1.8 in the 1.0 release. In most cases this would be backward compatible, and if it's not, you can always recompile with 1.7 (but as the exception this time).
Obviously this is not urgent; I just wanted to point this out and hopefully help someone else facing the same problem.

Best,
Flavio

Re: Flink cluster and Java 8

Stephan Ewen
Hi!

I have been running Java 8 for a year without an issue. The code is compiled for target Java 7, but it can be run with Java 8.
User code that targets Java 8 can be run as long as Flink itself is run with Java 8.

The initial error you got was probably because you compiled with Java 8 as the target and ran it with Java 7.

I would just leave the target at 1.7 and run it in a Java 8 JVM. User code can also be Java 8; that mixes seamlessly.
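
For example, user code along these lines runs on a Flink build compiled with target 1.7 as long as the cluster JVM is Java 8 (a minimal sketch against the 0.10 DataSet API; the class name is just an example):

import org.apache.flink.api.java.ExecutionEnvironment;

public class LambdaJob {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        // The lambda compiles to Java 8 bytecode (class version 52) inside the user jar,
        // so it only requires the cluster JVM to be Java 8; the Flink build itself can
        // stay at target 1.7.
        env.fromElements("flink", "java", "eight")
           .filter(s -> s.startsWith("f"))
           .print();
    }
}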

Stephan


Re: Flink cluster and Java 8

Flavio Pompermaier
I've tested several configurations (I also tried changing my compilation to 1.7, but then Sesame 4 was causing the error in [1]):
  1. Flink compiled with Java 1.7 (default), run within Eclipse with Java 8: OK
  2. Flink compiled with Java 1.7 (default), cluster run with Java 8: not able to run my job compiled with Java 1.8, which causes the reported exception (unsupported major.minor version)
  3. Flink compiled with Java 1.8: not able to compile without the reported modifications, but then the job was running fine
I don't know if you ever tested all those configurations, but I'm sure it wasn't working when deployed on the cluster.

[1] http://rdf4j.org/doc/4/release-notes/4.0.0.docbook?view


Re: Flink cluster and Java 8

Maximilian Michels
Hi Flavio,

To address your points:

1) It runs. That's fine.
2) Running a Flink job compiled with Java 8 on a Java 7 Flink cluster doesn't work if your job uses Java 8 features that are not backwards compatible.
3) I compile Flink daily with Java 8. Also, we have Travis CI tests which use OpenJDK and OracleJDK 7/8 to compile.

I think there is something wrong with the configuration of your build setup.

Cheers,
Max

Re: Flink cluster and Java 8

Flavio Pompermaier
Flink compiles correctly with Java 8 as long as you leave the Java 1.7 source and target in the Maven compiler plugin.
If you change them to 1.8, flink-core doesn't compile anymore.

Re: Flink cluster and Java 8

Maximilian Michels
I see. Did you perform a full "mvn clean package -DskipTests" after
you changed the source level to 1.8?

Re: Flink cluster and Java 8

Flavio Pompermaier
Yes, I did.
