Kafka issue

Kafka issue

Gyula Fóra
Hey,

For one of our jobs we ran into this issue. It's probably a dependency issue, but we can't figure it out, since a very similar setup works without issues for a different program.

java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at kafka.consumer.FetchRequestAndResponseMetrics.<init>(FetchRequestAndResponseStats.scala:32)
at kafka.consumer.FetchRequestAndResponseStats.<init>(FetchRequestAndResponseStats.scala:46)
at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
at kafka.utils.Pool.getAndMaybePut(Pool.scala:61)
at kafka.consumer.FetchRequestAndResponseStatsRegistry$.getFetchRequestAndResponseStats(FetchRequestAndResponseStats.scala:63)
at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
at kafka.javaapi.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:34)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:518)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:193)
at com.king.deduplo.source.EventSource.readRawInput(EventSource.java:46)
at com.king.deduplo.DeduploProgram.main(DeduploProgram.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
at org.apache.flink.client.program.Client.runBlocking(Client.java:248)
at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:866)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:333)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1189)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1239)

Any insights?

Cheers,
Gyula

Re: Kafka issue

Till Rohrmann
Hi Gyula,

could it be that you compiled against a different Scala version than the one you're using for running the job? This usually happens when you compile against 2.10 and let it run with version 2.11.

Cheers,
Till
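[Editor's note: one quick way to check for the mismatch Till describes is to look for mixed Scala binary-version suffixes in the Maven dependency tree. This is a sketch against captured `mvn dependency:tree` output; the sample tree lines below are illustrative, not from the actual job.]

```shell
# List the distinct Scala binary versions that suffixed artifacts target.
# Seeing both _2.10 and _2.11 in one build is the classic cause of this
# kind of NoSuchMethodError. Sample tree output for illustration:
cat > tree.txt <<'EOF'
[INFO] +- org.apache.flink:flink-streaming-java_2.10:jar:1.0-SNAPSHOT:compile
[INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
[INFO] |     +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:compile
EOF
# In a real project, replace the sample with: mvn dependency:tree > tree.txt
grep -oE '_2\.[0-9]+' tree.txt | sort -u
```

More than one line of output means the classpath mixes Scala binary versions.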



Re: Kafka issue

Gyula Fóra
I am not sure what is happening. I tried running against a Flink cluster that is definitely running the correct Scala version (2.10), and I still got the error. So it might be something in the pom.xml, but we just don't see how it differs from the correct one.

Gyula



Re: Kafka issue

rmetzger0
Are you building 1.0-SNAPSHOT yourself or are you relying on the snapshot repository?

We have had issues in the past where jars in the snapshot repo were incorrect.


Re: Kafka issue

Gyula Fóra
I was using the snapshot repo in this case; let me try building my own version...

Maybe this is interesting:
mvn dependency:tree | grep 2.11
[INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
[INFO] |     +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:compile
[INFO] |     +- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.2:compile
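[Editor's note: when wrong-Scala jars arrive transitively like this, a common Maven workaround is to exclude them and pin the correct-suffix artifact explicitly. The sketch below is hypothetical: the connector artifactId and versions are assumed from the thread's context, not confirmed by it.]

```xml
<!-- Hypothetical fix in the job's pom.xml: exclude the transitive 2.11
     Kafka jar from the Flink connector and depend on the 2.10 build
     of Kafka directly. Artifact names/versions are illustrative. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
  <version>1.0-SNAPSHOT</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.11</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.10</artifactId>
  <version>0.8.2.2</version>
</dependency>
```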




Re: Kafka issue

Gyula Fóra
That actually seemed to be the issue: now that I compiled my own version, it doesn't have these wrong jars in the dependency tree...



Re: Kafka issue

Gyula Fóra
Thanks Robert, so apparently the snapshot version was screwed up somehow and included the 2.11 dependencies.

Now it works.

Cheers,
Gyula



Re: Kafka issue

Gyula Fóra
Hey, 

Do we have any idea why this is happening in the snapshot repo? We have run into the same issue again...

Cheers,
Gyula



Re: Kafka issue

Till Rohrmann

Hi Gyula,

we discovered yesterday that our build process for Scala 2.11 is broken for the Kafka connector. The reason is that a property value is not properly resolved and thus pulls in the 2.10 Kafka dependencies. Max already opened a PR to fix this problem. I hope this will also solve your problem.

Cheers,
Till
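[Editor's note: the kind of property-driven dependency Till describes might look like the sketch below. If `${scala.binary.version}` is not resolved when the artifact's pom is published, consumers inherit the wrong-suffix dependency. The property name and values are illustrative.]

```xml
<properties>
  <!-- Overridden to 2.11 by a scala-2.11 build profile; defaults to 2.10.
       If this resolves incorrectly at release time, the published pom
       pins the wrong Kafka suffix for downstream users. -->
  <scala.binary.version>2.10</scala.binary.version>
</properties>

<!-- elsewhere in the connector pom -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_${scala.binary.version}</artifactId>
  <version>0.8.2.2</version>
</dependency>
```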

On Mar 3, 2016 9:36 AM, "Gyula Fóra" <[hidden email]> wrote:
>
> Hey, 
>
> Do we have any idea why this is happening in the snapshot repo? We have run into the same issue again...
>
> Cheers,
> Gyula
>
> Gyula Fóra <[hidden email]> ezt írta (időpont: 2016. febr. 26., P, 11:17):
>>
>> Thanks Robert, so apparently the snapshot version was screwed up somehow and included the 2.11 dependencies.
>>
>> Now it works.
>>
>> Cheers,
>> Gyula
>>
>> Gyula Fóra <[hidden email]> ezt írta (időpont: 2016. febr. 26., P, 11:09):
>>>
>>> That actually seemed to be the issue, not that I compiled my own version it doesnt have these wrond jars in the dependency tree...
>>>
>>> Gyula Fóra <[hidden email]> ezt írta (időpont: 2016. febr. 26., P, 11:01):
>>>>
>>>> I was using the snapshot repo in this case, let me try building my own version...
>>>>
>>>> Maybe this is interesting:
>>>> mvn dependency:tree | grep 2.11
>>>> [INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
>>>> [INFO] |     +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:compile
>>>> [INFO] |     +- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.2:compile
>>>>
>>>>
>>>> Robert Metzger <[hidden email]> ezt írta (időpont: 2016. febr. 26., P, 10:56):
>>>>>
>>>>> Are you building 1.0-SNAPSHOT yourself or are you relying on the snapshot repository?
>>>>>
>>>>> We had issues in the past that jars in the snapshot repo were incorrect
>>>>>
>>>>> On Fri, Feb 26, 2016 at 10:45 AM, Gyula Fóra <[hidden email]> wrote:
>>>>>>
>>>>>> I am not sure what is happening. I tried running against a Flink cluster that is definitely running the correct Scala version (2.10) and I still got the error. So it might be something with the pom.xml but we just don't see how it is different from the correct one.
>>>>>>
>>>>>> Gyula
>>>>>>
>>>>>> Till Rohrmann <[hidden email]> ezt írta (időpont: 2016. febr. 26., P, 10:42):
>>>>>>>
>>>>>>> Hi Gyula,
>>>>>>>
>>>>>>> could it be that you compiled against a different Scala version than the one you're using for running the job? This usually happens when you compile against 2.10 and let it run with version 2.11.
>>>>>>>
>>>>>>> Cheers,
>>>>>>> Till
>>>>>>>
>>>>>>> On Fri, Feb 26, 2016 at 10:09 AM, Gyula Fóra <[hidden email]> wrote:
>>>>>>>>
>>>>>>>> Hey,
>>>>>>>>
>>>>>>>> For one of our jobs we ran into this issue. It's probably some dependency issue but we cant figure it out as a very similar setup works without issues for a different program.
>>>>>>>>
>>>>>>>> java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>> at kafka.consumer.FetchRequestAndResponseMetrics.<init>(FetchRequestAndResponseStats.scala:32)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStats.<init>(FetchRequestAndResponseStats.scala:46)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
>>>>>>>> at kafka.utils.Pool.getAndMaybePut(Pool.scala:61)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStatsRegistry$.getFetchRequestAndResponseStats(FetchRequestAndResponseStats.scala:63)
>>>>>>>> at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
>>>>>>>> at kafka.javaapi.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:34)
>>>>>>>> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:518)
>>>>>>>> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
>>>>>>>> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:193)
>>>>>>>> at com.king.deduplo.source.EventSource.readRawInput(EventSource.java:46)
>>>>>>>> at com.king.deduplo.DeduploProgram.main(DeduploProgram.java:33)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>>>> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
>>>>>>>> at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
>>>>>>>> at org.apache.flink.client.program.Client.runBlocking(Client.java:248)
>>>>>>>> at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:866)
>>>>>>>> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:333)
>>>>>>>> at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1189)
>>>>>>>> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1239)
>>>>>>>>
>>>>>>>> Any insights?
>>>>>>>>
>>>>>>>> Cheers,
>>>>>>>> Gyula
>>>>>>>
>>>>>>>
>>>>>
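Till's suggestion above — check whether the job was compiled against a different Scala version than the one it runs with — can be verified quickly by scanning the dependency tree for mixed `_2.x` artifact suffixes. A minimal sketch (the inlined tree is a made-up sample; against a real project you would pipe the actual `mvn dependency:tree` output through the same pipeline):

```shell
# Sketch: detect mixed Scala binary-version suffixes in Maven dependency-tree
# output. The sample tree below is made up for illustration; with a real
# build you would run:  mvn dependency:tree | grep -o '_2\.1[01]' | sort -u
tree='[INFO] +- org.apache.flink:flink-streaming-java_2.10:jar:1.0-SNAPSHOT:compile
[INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
[INFO] |     +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:compile'

# Collect the distinct Scala suffixes that appear in artifact ids.
suffixes=$(printf '%s\n' "$tree" | grep -o '_2\.1[01]' | sort -u)
count=$(printf '%s\n' "$suffixes" | wc -l)

if [ "$count" -gt 1 ]; then
  echo "mixed Scala versions on the classpath:" $suffixes
else
  echo "single Scala version:" $suffixes
fi
```

For the sample tree this reports both `_2.10` and `_2.11`, which is exactly the situation that produces the `NoSuchMethodError` on `scala.Predef$.ArrowAssoc`.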


Re: Kafka issue

Gyula Fóra
This problem is kind of the other way around: our 2.10 build has 2.11 dependencies pulled in by Kafka. But let's see what happens. :)

Gyula

Till Rohrmann <[hidden email]> wrote (on Thu, 3 Mar 2016, 9:46):

Hi Gyula,

we discovered yesterday that our build process for Scala 2.11 is broken for the Kafka connector. The reason is that a property value is not properly resolved and thus pulls in the 2.10 Kafka dependencies. Max already opened a PR to fix this problem. I hope this will also solve your problem.

Cheers,
Till
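The failure mode Till describes can be sketched with the usual Scala-suffix property pattern in Maven (the snippet below is illustrative, not the actual Flink connector pom): the Kafka artifact id is built from a property, so if that property is not resolved when the pom is published, whatever default it carries leaks into the deployed artifact.

```xml
<!-- Illustrative sketch, not the real Flink pom: the Scala suffix of the
     Kafka dependency comes from a property. If ${scala.binary.version}
     is left unresolved in the published pom, every consumer gets the
     default value (here 2.10), no matter which Scala profile is active. -->
<properties>
    <scala.binary.version>2.10</scala.binary.version>
</properties>

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_${scala.binary.version}</artifactId>
    <version>0.8.2.2</version>
</dependency>
```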

On Mar 3, 2016 9:36 AM, "Gyula Fóra" <[hidden email]> wrote:
>
> Hey, 
>
> Do we have any idea why this is happening in the snapshot repo? We have run into the same issue again...
>
> Cheers,
> Gyula
>
> Gyula Fóra <[hidden email]> wrote (on Fri, 26 Feb 2016, 11:17):
>>
>> Thanks Robert, so apparently the snapshot version was screwed up somehow and included the 2.11 dependencies.
>>
>> Now it works.
>>
>> Cheers,
>> Gyula
>>
>> Gyula Fóra <[hidden email]> wrote (on Fri, 26 Feb 2016, 11:09):
>>>
>>> That actually seemed to be the issue; now that I compiled my own version, it doesn't have these wrong jars in the dependency tree...
>>>
>>> Gyula Fóra <[hidden email]> wrote (on Fri, 26 Feb 2016, 11:01):
>>>>
>>>> I was using the snapshot repo in this case, let me try building my own version...
>>>>
>>>> Maybe this is interesting:
>>>> mvn dependency:tree | grep 2.11
>>>> [INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
>>>> [INFO] |     +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:compile
>>>> [INFO] |     +- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.2:compile
>>>>
>>>>
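Until the published jars are fixed, one possible consumer-side workaround (a sketch only — the connector coordinates and versions below are assumptions; match them to the actual build) is to exclude the wrongly-suffixed transitive Kafka artifact and pin the matching one explicitly:

```xml
<!-- Sketch of a consumer-side workaround; coordinates and versions are
     assumed, adjust to the actual build. Drop the transitive kafka_2.11
     artifact and declare the 2.10 one directly. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
    <version>1.0-SNAPSHOT</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.11</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.2</version>
</dependency>
```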
>>>> Robert Metzger <[hidden email]> wrote (on Fri, 26 Feb 2016, 10:56):
>>>>>
>>>>> Are you building 1.0-SNAPSHOT yourself or are you relying on the snapshot repository?
>>>>>
>>>>> We had issues in the past where jars in the snapshot repo were incorrect.
>>>>>


Re: Kafka issue

Márton Balassi
Hey guys,

I have run into the same issue when developing against the master. Now, after Max's commit that supposedly fixes the issue, reimporting the project gives me all the dependencies for 2.10, except for scala-compiler and scala-reflect, which come in version 2.11. It seems very weird.

Do you have any suggestions?

Marton




Re: Kafka issue

Márton Balassi
I have inspected mvn dependency:tree in the meantime; fortunately, the Maven build looks healthy. It seems my IntelliJ is very keen on the 2.11 dependencies it has recently acquired.
