Hi Everyone,

Think of this as a pre-FLIP, but what does everyone think about dropping Scala 2.11 support from Flink?

The last 2.11 patch release was in 2017, and in that time the Scala community has released 2.13 and is working towards a 3.0 release. Apache Kafka and Spark have both dropped 2.11 support in recent versions. In fact, Flink's universal Kafka connector is stuck on Kafka 2.4 because that is the last version with Scala 2.11 support.

What are people's thoughts on dropping Scala 2.11? How many are still using it in production?

Seth
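For readers less familiar with Scala builds, some context on why this matters: the Scala binary version is baked into every Scala artifact's name, so each supported Scala line means a separate set of published jars. A minimal sketch in sbt syntax (version numbers are illustrative, not a recommendation):

  libraryDependencies ++= Seq(
    // `%%` appends the Scala binary suffix from scalaVersion, resolving to
    // e.g. flink-streaming-scala_2.11 or flink-streaming-scala_2.12.
    "org.apache.flink" %% "flink-streaming-scala" % "1.11.1",
    // Kafka's server artifact (used by the connector's tests) was last
    // published for Scala 2.11 in the 2.4.x line, which is what pins the
    // universal connector to Kafka 2.4.
    "org.apache.kafka" %% "kafka" % "2.4.1" % Test
  )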
Yes! I would be in favour of this since it's blocking us from upgrading certain dependencies. I would also be in favour of dropping Scala completely, but that's a different story.

Aljoscha
@Galen Yes, we would absolutely migrate StateFun. StateFun can be compiled with Scala 2.12 today; I'm not sure why it's not cross-released.

@Aljoscha :)

@Matthieu It's on the roadmap, but it's non-trivial and I'm not aware of anyone actively working on it.
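For the curious, cross-releasing means publishing one set of artifacts per supported Scala line from the same sources. A hedged sketch of what that looks like in sbt terms (StateFun's actual build is Maven-based; the module name here is hypothetical):

  // Build and publish one artifact per Scala line with `sbt +publishLocal`.
  ThisBuild / crossScalaVersions := Seq("2.11.12", "2.12.12")
  ThisBuild / scalaVersion := "2.12.12"

  lazy val root = (project in file("."))
    .settings(
      name := "statefun-scala-example" // placeholder module name
    )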
@Galen FYI: the upcoming StateFun release would use Scala 2.12.
We use a Cloudera 6.3 cluster in prod. I'd guess Scala 2.11 is still widely used in production, as Cloudera major-version upgrades are planned long in advance and take a significant amount of resources in big data lakes. On that 6.3 cluster, if I open spark-shell, I still see Scala 2.11 in use:

> Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)

To not mess with any dependencies, and since we have both Flink and Spark jobs, we declare all dependencies with the scala_2.11 suffix. But we never actually tried what happens if we change that. We just seek maximum compatibility while at the same time maintaining and upgrading Flink ourselves, to keep up with the fast development pace of the community and to be able to use the latest major version quickly.

So my suggestion: as long as it isn't much pain, I'd like to keep the Scala 2.11 support for a while. (Or, even better for me as a Java but never Scala developer: drop Scala completely :) )

Speaking for my project: I think our ops team plans to upgrade to Cloudera 7 somewhere around Q2 2021, so personally I'm fine with dropping it then.

Best regards
Theo
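To make the hard-coded-suffix approach concrete, a minimal sketch in sbt syntax (artifact versions are placeholders): with a single `%` the Scala suffix is spelled out by hand, so moving off 2.11 means editing every declaration.

  libraryDependencies ++= Seq(
    // Suffix written out explicitly: changing scalaVersion alone has no effect.
    "org.apache.flink" % "flink-streaming-scala_2.11" % "1.11.1",
    "org.apache.spark" % "spark-core_2.11"            % "2.4.0" % Provided
  )
  // The `%%` form derives the suffix from scalaVersion instead:
  // "org.apache.flink" %% "flink-streaming-scala" % "1.11.1"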