Flink 1.11.2 test cases fail with Scala 2.12.12

Flink 1.11.2 test cases fail with Scala 2.12.12

soumoks
Hi,

We have several Flink applications written with Flink 1.9.1 and Scala 2.11.12, and we are in the process of upgrading to Flink 1.11.2 and Scala 2.12.12. We use Maven to manage our application dependencies.

After updating the pom.xml file to use the upgraded versions of Scala and Flink mentioned above, all of the unit tests written with ScalaTest 3.0.5 (we use the FlatSpec style) fail with the following exception.

 org.apache.flink.shaded.guava18.com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: scala/math/Ordering$$anon$9
  at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201)
  at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
  at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
  at org.apache.flink.api.scala.typeutils.TraversableSerializer$.compileCbf(TraversableSerializer.scala:184)
  at org.apache.flink.api.scala.typeutils.TraversableSerializer.compileCbf(TraversableSerializer.scala:51)
  at org.apache.flink.api.scala.typeutils.TraversableSerializer.<init>(TraversableSerializer.scala:41)

  Cause: java.lang.NoClassDefFoundError: scala/math/Ordering$$anon$9
  at scala.tools.nsc.transform.LambdaLift$LambdaLifter.<init>(LambdaLift.scala:67)
  at scala.tools.nsc.transform.LambdaLift.newTransformer(LambdaLift.scala:49)
  at scala.tools.nsc.transform.Transform$Phase.apply(Transform.scala:30)
  at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:441)
  at scala.tools.nsc.Global$GlobalPhase.run(Global.scala:392)
  at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1467)
  at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1451)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.wrapInPackageAndCompile(ToolBoxFactory.scala:201)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.compile(ToolBoxFactory.scala:256)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.$anonfun$compile$13(ToolBoxFactory.scala:433)
  ...
  Cause: java.lang.ClassNotFoundException: scala.math.Ordering$$anon$9
  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
  at scala.tools.nsc.transform.LambdaLift$LambdaLifter.<init>(LambdaLift.scala:67)
  at scala.tools.nsc.transform.LambdaLift.newTransformer(LambdaLift.scala:49)
  at scala.tools.nsc.transform.Transform$Phase.apply(Transform.scala:30)
  at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:441)
  at scala.tools.nsc.Global$GlobalPhase.run(Global.scala:392)
  at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1467)


The app itself compiles without issues, which can be verified by running `mvn clean package -DskipTests`.

To troubleshoot and narrow down the issue, I upgraded the Flink dependency from 1.9.1 to 1.11.2 while keeping the Scala version at 2.11.12 (instead of moving to 2.12.12), and this seems to have resolved the issue.
The app compiles and the test cases pass as well.

Is this a known compatibility issue between Flink 1.11.2 and Scala 2.12.12?

Thanks,
Sourabh




Re: Flink 1.11.2 test cases fail with Scala 2.12.12

Chesnay Schepler
Scala 2.12.8 broke binary compatibility with 2.12.7, which Flink is currently compiled against.
As a result you must either stay on 2.12.7, or recompile Flink yourself against 2.12.12 as shown here.
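For the recompile route, the Flink source tree can be built against a custom Scala patch version with Maven. The exact profile and property names below are a sketch based on the Flink build documentation and may differ between Flink versions, so verify them against the "Build Flink" page for your release before relying on them:

```shell
# Sketch: build Flink 1.11.2 from source against Scala 2.12.12.
# Verify the exact flags against the "Build Flink" docs for your version.
git clone https://github.com/apache/flink.git
cd flink
git checkout release-1.11.2

# -Dscala-2.12 selects the Scala 2.12 build profile; -Dscala.version
# overrides the Scala patch version Flink is compiled against.
mvn clean install -DskipTests -Dscala-2.12 -Dscala.version=2.12.12
```

The resulting artifacts land in your local Maven repository, so the application build picks them up automatically if the versions match.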

On 1/28/2021 2:34 AM, Sourabh Mokhasi wrote:



Re: Flink 1.11.2 test cases fail with Scala 2.12.12

soumoks
Thank you for the response, but this error continues to happen with Scala 2.12.7.
The app itself continues to compile without errors, but the test cases fail with the same error.

Seems to be related to
https://issues.apache.org/jira/browse/FLINK-12461

I have set the Scala version in the pom.xml file, and I use this property value for all Scala dependencies present.

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.version>2.12.7</scala.version>
        <scala.compat.version>2.12</scala.compat.version>
        <aws.sdk.version>1.11.461</aws.sdk.version>
        <spec2.version>4.2.0</spec2.version>
    </properties>

 


However, I ran into the same error

org.apache.flink.shaded.guava18.com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: scala/math/Ordering$$anon$9
  at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201)
  at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
  at org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
  at org.apache.flink.api.scala.typeutils.TraversableSerializer$.compileCbf(TraversableSerializer.scala:184)
  at org.apache.flink.api.scala.typeutils.TraversableSerializer.compileCbf(TraversableSerializer.scala:51)
  at org.apache.flink.api.scala.typeutils.TraversableSerializer.<init>(TraversableSerializer.scala:41)




and it occurs whenever I use a Scala mutable or immutable Map data structure in my code.

Sample test code:

    val data: scala.collection.mutable.Map[String, String] =
      scala.collection.mutable.Map("key1" -> "v1", "key2" -> "v2")

    val kafkaMsg: String = write(
      SampleCaseClass(14L, "23FC", 10L, data)
    )

    val stream: DataStream[SampleCaseClass] =
      env.addSource(new MockKafkaSource(kafkaMsg))
        .flatMap(new SampleCaseClassMapper)


I am including the parts of the pom file related to the test packages for reference.

        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest_${scala.compat.version}</artifactId>
            <version>3.2.3</version>
            <scope>test</scope>
        </dependency>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>4.4.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                            <args>
                                <arg>-dependencyfile</arg>
                                <arg>${project.build.directory}/.scala_dependencies</arg>
                            </args>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.22.2</version>
                <configuration>
                    <!-- tests are run by the scalatest-maven-plugin below instead -->
                    <skipTests>true</skipTests>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.scalatest</groupId>
                <artifactId>scalatest-maven-plugin</artifactId>
                <version>2.0.2</version>
                <configuration>
                    <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
                    <junitxml>.</junitxml>
                    <filereports>TestSuiteReport.txt</filereports>
                </configuration>
                <executions>
                    <execution>
                        <id>test</id>
                        <goals>
                            <goal>test</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/

Re: Flink 1.11.2 test cases fail with Scala 2.12.12

Chesnay Schepler
Could you check your dependency tree for the version of scala-library?
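One way to run that check with Maven is `mvn dependency:tree`, filtering the output down to the scala-library artifact with the maven-dependency-plugin's `-Dincludes` option:

```shell
# Show every path through which scala-library is resolved, so a stray
# version (e.g. 2.12.8 instead of 2.12.7) pulled in transitively is visible.
mvn dependency:tree -Dincludes=org.scala-lang:scala-library

# -Dverbose additionally shows versions that were considered but omitted
# by Maven's nearest-wins conflict resolution.
mvn dependency:tree -Dverbose -Dincludes=org.scala-lang:scala-library
```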

On 2/24/2021 7:28 AM, soumoks wrote:




Re: Flink 1.11.2 test cases fail with Scala 2.12.12

soumoks
Thank you! I had scala-library 2.12.8 in my dependency tree (probably a
remnant from when I was testing with Scala 2.12.8).

To fix the issue, I removed scala-library 2.12.8 from my dependency tree and
added the dependency below.


<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.7</version>
</dependency>
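If a transitive dependency keeps reintroducing a different scala-library version, an alternative to declaring it as a direct dependency is pinning it in `dependencyManagement`, which also overrides the version Maven resolves for transitive dependencies. A sketch, reusing the `scala.version` property from the properties block shown earlier in this thread:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Force a single scala-library version across all transitive paths -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```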





