Flink upgrade compatibility table


Flink upgrade compatibility table

Colin Williams
I recently tried to launch our application's 1.3 jars against a 1.4 cluster and got a java.lang.NoClassDefFoundError: org/apache/flink/streaming/api/checkpoint/CheckpointedRestoring when running our 1.3 Flink application against 1.4.
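
For illustration, here is a minimal sketch (class and state names are made up) of the kind of 1.3-era function that implements CheckpointedRestoring. Since that interface apparently isn't shipped with 1.4 any more, loading a class like this on a 1.4 cluster fails with exactly the NoClassDefFoundError above:

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.streaming.api.checkpoint.CheckpointedRestoring;

// Hypothetical 1.3-style mapper that restores legacy Checkpointed state
// through the CheckpointedRestoring interface, which 1.4 seemingly no longer ships.
public class CountingMapper extends RichMapFunction<String, String>
        implements CheckpointedRestoring<Long> {

    private long count;

    @Override
    public String map(String value) {
        count++;
        return value;
    }

    @Override
    public void restoreState(Long state) {
        // Restore the counter from the legacy (pre-CheckpointedFunction) state.
        count = state;
    }
}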

Then I googled around and didn't see a mention of 1.4 in the compatibility table: https://ci.apache.org/projects/flink/flink-docs-release-1.4/ops/upgrading.html#compatibility-table

Does 1.4 break compatibility? Maybe the 1.4 docs should be updated to reflect that?

Thanks,

Colin Williams

Re: Flink upgrade compatibility table

Colin Williams
Hi Kien,

Thanks for the feedback. I wasn't certain about binary compatibility between jars. I did bump the Flink library versions and the application started; I was just curious whether the previous jar would still work without upgrading.
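
As an aside, here is a quick sanity check to confirm which Flink version the application actually sees after the bump (assuming flink-runtime is on the classpath; as far as I know, EnvironmentInformation is the class Flink itself uses when logging its version at startup):

import org.apache.flink.runtime.util.EnvironmentInformation;

public class FlinkVersionCheck {
    public static void main(String[] args) {
        // Print the Flink version visible on the application classpath,
        // e.g. to confirm that the dependency bump from 1.3 to 1.4 took effect.
        System.out.println("Flink version on classpath: " + EnvironmentInformation.getVersion());
    }
}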

Regarding the savepoint table, someone should probably add the 1.4 information for consistency.


Thanks,

Colin Williams


On Dec 20, 2017 8:16 PM, "Kien Truong" <[hidden email]> wrote:
Hi Colin,

Did you try to rebuild the application with Flink 1.4? You cannot just take a jar built with 1.3 and run it on a 1.4 cluster. As far as I know, Flink doesn't make any guarantee about binary compatibility between major releases, so you always have to recompile your application code when you upgrade the cluster.
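
For reference, here is a minimal sketch (hypothetical class and state names) of a counter-style function rewritten against the CheckpointedFunction interface, which the Flink docs recommend instead of the deprecated Checkpointed/CheckpointedRestoring pair. Recompiling something along these lines with the 1.4 dependencies is the expected path:

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.runtime.state.FunctionInitializationContext;
import org.apache.flink.runtime.state.FunctionSnapshotContext;
import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction;

public class CountingMapper extends RichMapFunction<String, String>
        implements CheckpointedFunction {

    private transient ListState<Long> checkpointedCount;
    private long count;

    @Override
    public String map(String value) {
        count++;
        return value;
    }

    @Override
    public void snapshotState(FunctionSnapshotContext context) throws Exception {
        // Replace the previously checkpointed value with the current count.
        checkpointedCount.clear();
        checkpointedCount.add(count);
    }

    @Override
    public void initializeState(FunctionInitializationContext context) throws Exception {
        checkpointedCount = context.getOperatorStateStore()
                .getListState(new ListStateDescriptor<>("count", Long.class));
        if (context.isRestored()) {
            // Sum the restored partitions of the operator state.
            for (Long c : checkpointedCount.get()) {
                count += c;
            }
        }
    }
}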

Also, the compatibility table you mentioned only applies to savepoints, not application code.

Best regards,
Kien


Re: Flink upgrade compatibility table

Fabian Hueske
Hi Colin,

Thanks for pointing out this gap in the docs!
I created FLINK-8303 [1] to extend the table, and I updated the release process documentation [2] so that the page gets updated for future releases.

Thank you,
Fabian
