ClassCastException after upgrading Flink application to 1.11.2

ClassCastException after upgrading Flink application to 1.11.2

soumoks
Hi,

We have upgraded an application originally written for Flink 1.9.1 with
Scala 2.11 to Flink 1.11.2 with Scala 2.12.7, and we are seeing the
following error at runtime.


2021-03-16 20:37:08
java.lang.RuntimeException
  at org.apache.flink.streaming.runtime.io.RecordWriterOutput.pushToRecordWriter(RecordWriterOutput.java:110)
  at org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:89)
  at org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:45)
  at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:52)
  at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:30)
  at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollectWithTimestamp(StreamSourceContexts.java:310)
  at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collectWithTimestamp(StreamSourceContexts.java:409)
  at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordsWithTimestamps(AbstractFetcher.java:352)
  at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:185)
  at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:141)
  at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:755)
  at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
  at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
  at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213)
Caused by: java.lang.ClassCastException



The class in question was using Scala Long and Scala BigDecimal types.
We have since changed these to Java Long and Java BigDecimal types in an
attempt to resolve this error, but to no avail.
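
For reference, the change we made looks roughly like this (a simplified
sketch; "Payment" and its fields are placeholders, not our actual class):

  // Before (Flink 1.9.1 / Scala 2.11): Scala types
  case class Payment(id: Long, amount: BigDecimal)

  // After (Flink 1.11.2 / Scala 2.12): Java types, as an attempted fix
  case class Payment(id: java.lang.Long, amount: java.math.BigDecimal)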

The application runs on AWS EMR (release emr-6.2.0), if that helps.


Thanks,
Sourabh




Re: ClassCastException after upgrading Flink application to 1.11.2

Dawid Wysakowicz-2
Could you share the full stack trace with us? Could you also check the
stack trace in the task manager logs?

As a side note, make sure you are using the same version of all Flink
dependencies.
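
For example, with sbt that would mean pinning a single Flink version and
letting sbt pick the matching Scala 2.12 artifacts for every Flink module
(just an illustration, not your actual build file):

  // build.sbt sketch: one Flink version, one Scala suffix (_2.12) everywhere
  val flinkVersion = "1.11.2"
  libraryDependencies ++= Seq(
    "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % Provided,
    "org.apache.flink" %% "flink-connector-kafka" % flinkVersion
  )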

Best,

Dawid

On 17/03/2021 06:26, soumoks wrote:

> Hi,
>
> We have upgraded an application originally written for Flink 1.9.1 with
> Scala 2.11 to Flink 1.11.2 with Scala 2.12.7, and we are seeing the
> following error at runtime.
>
> [...]
>
> The class in question was using Scala Long and Scala BigDecimal types.
> We have since changed these to Java Long and Java BigDecimal types in an
> attempt to resolve this error, but to no avail.
>
> The application runs on AWS EMR (release emr-6.2.0), if that helps.
>
> Thanks,
> Sourabh

