mixing java libraries between 1.12.x and 1.11.x


mixing java libraries between 1.12.x and 1.11.x

Jin Yi
(I also updated https://issues.apache.org/jira/browse/FLINK-19955 with this question.)

I'm in a situation where I want to use ParquetProtoWriters from flink-parquet 1.12.x, while the rest of our codebase depends on 1.11.1, since we anticipate possibly running on the fully managed AWS Flink offering in production.
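For concreteness, here is roughly how I plan to wire it up. This is only a minimal sketch; MyEvent is a placeholder for our actual protobuf-generated class, and the bucket path and class names are illustrative, not our real code:

import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.protobuf.ParquetProtoWriters;   // from flink-parquet 1.12.x
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;

public class ProtoToParquetJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // MyEvent stands in for our protobuf-generated message class; the source
        // is omitted here since it isn't relevant to the version question.
        DataStream<MyEvent> events = buildEventStream(env);

        // ParquetProtoWriters.forType returns a BulkWriter.Factory, which plugs
        // into the bulk-format API of the 1.11 StreamingFileSink.
        StreamingFileSink<MyEvent> sink = StreamingFileSink
                .forBulkFormat(new Path("s3://my-bucket/events"),
                        ParquetProtoWriters.forType(MyEvent.class))
                .build();

        events.addSink(sink);
        env.execute("proto-to-parquet");
    }

    private static DataStream<MyEvent> buildEventStream(StreamExecutionEnvironment env) {
        throw new UnsupportedOperationException("source omitted in this sketch");
    }
}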

I'm expecting this to work, since 1.11 to 1.12 is only a minor version bump and flink-parquet is more of a "terminal" package than a core, internal Flink library.  Is this assumption correct?

Locally, things appear to work as expected, but only for fairly trivial unit tests.  I haven't pushed this out to more meaningful environments where issues around checkpoints/savepoints and the like may creep up, so I'm looking for any gotchas folks may be aware of when mixing library versions like this.

Thanks!

Re: mixing java libraries between 1.12.x and 1.11.x

Arvid Heise
Hi Jin,

As Till already answered on the ticket: in general, there is no guarantee that things work across different versions. Everything that builds on public APIs is guaranteed to be forward compatible. In this case, however, you want things to be backward compatible, which cannot be guaranteed: something implemented in version X might use an API that was only added in X-1, and now you want to run it against X-2, where that API does not exist.

Nevertheless, you can usually assume that it works if it runs as it does in your tests. If the format depended on new methods in 1.12, you would have noticed quickly. It is still possible that certain edge cases (e.g. error handling) depend on new things and only fail in the failure path. That's a risk you have to live with.
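If you want a quick sanity check beyond your existing unit tests, a small smoke test along the following lines forces class loading and linking against the 1.11 classpath, so a missing 1.12-only method would already surface as a NoSuchMethodError or NoClassDefFoundError. This is just a sketch; MyEvent again stands in for your protobuf-generated class, and the test uses JUnit 5:

import java.io.File;

import org.apache.flink.api.common.serialization.BulkWriter;
import org.apache.flink.core.fs.FSDataOutputStream;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.protobuf.ParquetProtoWriters;
import org.junit.jupiter.api.Test;

class ParquetProtoWritersSmokeTest {

    @Test
    void writerLinksAgainstThe111Classpath() throws Exception {
        // Constructing the factory and writing one record exercises the writer
        // code path against the 1.11 dependencies on the test classpath.
        BulkWriter.Factory<MyEvent> factory = ParquetProtoWriters.forType(MyEvent.class);

        File out = File.createTempFile("proto-parquet-smoke", ".parquet");
        Path path = new Path(out.toURI());

        try (FSDataOutputStream stream =
                     path.getFileSystem().create(path, FileSystem.WriteMode.OVERWRITE)) {
            BulkWriter<MyEvent> writer = factory.create(stream);
            writer.addElement(MyEvent.getDefaultInstance());
            writer.flush();
            writer.finish();
        }
    }
}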

One way to reduce the risk further (after your initial test runs) is to check out Flink at the given version and backport the commit yourself. If everything compiles, you are not using any new API. If you also manage to backport the tests, you would even see some edge cases covered (assuming the class has sufficient test coverage). It might still happen that intermediate bugfixes relevant to ParquetProtoWriters are not in 1.11, but that chance is slim, as we usually port all bugfixes back.

