Is it possible to use Scala 2.11 and Java 8?
I'm able to get our project to compile correctly; however, there are runtime errors with the ReflectASM library (I'm guessing due to Kryo). I looked into the error and it seems Spark had the same issue (https://issues.apache.org/jira/browse/SPARK-6152, https://github.com/EsotericSoftware/reflectasm/issues/35) because of an outdated version of Kryo.

I'm also unsure whether we have to build Flink with Scala 2.11 (https://ci.apache.org/projects/flink/flink-docs-release-0.10/setup/building.html) in order to run Flink correctly with Java 8.

Cheers,

Cory
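For reference, a project setup along these lines would look roughly like the sketch below. It is only a sketch: the module names, versions, and settings are assumptions based on the linked Flink 0.10 documentation, not values taken from the project described above.

    // build.sbt -- minimal sketch of a Scala 2.11 project using the Flink 0.10
    // Scala APIs under Java 8; artifact names and versions are assumptions.
    scalaVersion := "2.11.7"

    libraryDependencies ++= Seq(
      "org.apache.flink" % "flink-scala_2.11"           % "0.10.1",
      "org.apache.flink" % "flink-streaming-scala_2.11" % "0.10.1"
    )

    // Emit Java 8 bytecode for any Java sources in the project.
    javacOptions ++= Seq("-source", "1.8", "-target", "1.8")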
Hi Cory,
Thanks for reporting the issue. Scala should run independently of the Java version. We are already using ASM version 5.0.4. However, some code uses the ASM4 opcodes, which don't seem to work with Java 8. This needs to be fixed. I'm filing a JIRA.

Cheers,
Max

On Mon, Dec 7, 2015 at 4:15 PM, Cory Monty <[hidden email]> wrote:
> [...]
For completeness, could you provide a stack trace of the error message?
On Mon, Dec 7, 2015 at 6:56 PM, Maximilian Michels <[hidden email]> wrote:
> [...]
Thanks, Max.

Here is the stack trace I receive:

java.lang.IllegalArgumentException:
    at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
    at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
    at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
    at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$getClassReader(ClosureCleaner.scala:47)
    at org.apache.flink.api.scala.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:90)
    at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:113)
    at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.scalaClean(StreamExecutionEnvironment.scala:555)
    at org.apache.flink.streaming.api.scala.DataStream.clean(DataStream.scala:764)
    at org.apache.flink.streaming.api.scala.DataStream.flatMap(DataStream.scala:473)

On Mon, Dec 7, 2015 at 11:58 AM, Maximilian Michels <[hidden email]> wrote:
> [...]
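The bottom frames show the trigger: DataStream.flatMap runs its closure through the Scala ClosureCleaner, whose ClassReader (the ASM copy bundled inside ReflectASM) rejects Java 8 class files. A minimal sketch of a job with that call shape follows; the data and the closure are illustrative only, and whether it actually fails depends on the bytecode version of the classes the cleaner ends up reading.

    import org.apache.flink.streaming.api.scala._

    object FlatMapSketch {
      def main(args: Array[String]): Unit = {
        val env = StreamExecutionEnvironment.getExecutionEnvironment

        // flatMap cleans its closure via ClosureCleaner, which reads class
        // files with ReflectASM's shaded ASM 4 ClassReader -- the constructor
        // throwing IllegalArgumentException in the trace above.
        env.fromElements("a b", "c d")
          .flatMap(line => line.split(" "))
          .print()

        env.execute("flatMap sketch")
      }
    }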
Flink's own ASM is 5.0, but the Kryo version used in Flink bundles ReflectASM with a dedicated ASM version 4 (no lambdas supported).

Might be as simple as bumping the Kryo version...

On Mon, Dec 7, 2015 at 7:59 PM, Cory Monty <[hidden email]> wrote:
> [...]
Sorry, correcting myself:

The ClosureCleaner uses Kryo's bundled ASM 4 for no particular reason; simply adjusting the imports to use the common ASM (which is 5.0) should do it ;-)

On Mon, Dec 7, 2015 at 8:18 PM, Stephan Ewen <[hidden email]> wrote:
> [...]
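The difference between the two ASM copies can be seen directly: the plain ASM 5 ClassReader parses Java 8 (major version 52) class files, while the ASM 4 copy shaded inside ReflectASM throws the IllegalArgumentException above. A small sketch follows, assuming the org.ow2.asm:asm 5.x artifact is on the classpath and running on Java 8; the class being read is just an arbitrary Java 8 interface.

    import org.objectweb.asm.{ClassReader, ClassVisitor, Opcodes}

    object Asm5ReadSketch {
      def main(args: Array[String]): Unit = {
        // The ASM 5 ClassReader accepts Java 8 class files ...
        val reader = new ClassReader(classOf[java.util.function.Function[_, _]].getName)
        // ... and can drive a no-op visitor over them without complaint.
        reader.accept(new ClassVisitor(Opcodes.ASM5) {}, ClassReader.SKIP_CODE)
        println(s"Parsed ${reader.getClassName} with ASM 5")
      }
    }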
Thanks for the stack trace, Cory. Looks like you were on the right
path with the Spark issue. We will file an issue and correct it soon.

Thanks,
Max

On Mon, Dec 7, 2015 at 8:20 PM, Stephan Ewen <[hidden email]> wrote:
> [...]
Hi Cory,
The issue has been fixed in the master and the latest Maven snapshot.

https://issues.apache.org/jira/browse/FLINK-3143

Cheers,
Max

On Tue, Dec 8, 2015 at 12:35 PM, Maximilian Michels <[hidden email]> wrote:
> [...]
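To pick up the fix before a release, a build can point at the Apache snapshot repository. The sketch below assumes an sbt build; the snapshot version shown is an assumption and should be checked against the repository rather than taken as a confirmed value.

    // build.sbt -- sketch only; the snapshot version is an assumption.
    resolvers += "Apache Snapshots" at "https://repository.apache.org/content/repositories/snapshots/"

    libraryDependencies ++= Seq(
      "org.apache.flink" % "flink-scala_2.11"           % "1.0-SNAPSHOT",
      "org.apache.flink" % "flink-streaming-scala_2.11" % "1.0-SNAPSHOT"
    )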
Thanks!

On Thu, Dec 10, 2015 at 12:32 PM, Maximilian Michels <[hidden email]> wrote:
> [...]