kafka.javaapi.consumer.SimpleConsumer class not found


kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan
I am trying to use the Flink Kafka connector. I have specified the Kafka connector dependency and created a fat jar, since the default Flink installation does not contain the Kafka connector jars. I have made sure that flink-streaming-demo-0.1.jar contains kafka.javaapi.consumer.SimpleConsumer.class, but I still see the class-not-found exception.

The code that creates the Kafka source in Flink (imports added for completeness):

import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

val env = StreamExecutionEnvironment.getExecutionEnvironment
val prop: Properties = new Properties()
prop.setProperty("zookeeper.connect", "somezookeer:2181")
prop.setProperty("group.id", "some-group")
prop.setProperty("bootstrap.servers", "somebroker:9092")

val stream = env
  .addSource(new FlinkKafkaConsumer082[String]("location", new SimpleStringSchema, prop))

jar tvf flink-streaming-demo-0.1.jar | grep kafka.javaapi.consumer.SimpleConsumer

  5111 Fri Mar 11 14:18:36 UTC 2016 kafka/javaapi/consumer/SimpleConsumer.class


flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

The command that I use to run the Flink program on the YARN cluster is below:

HADOOP_CONF_DIR=/etc/hadoop/conf /usr/share/flink/bin/flink run -c com.dataartisans.flink_demo.examples.DriverEventConsumer  -m yarn-cluster -yn 2 /home/balajirajagopalan/flink-streaming-demo-0.1.jar

java.lang.NoClassDefFoundError: kafka/javaapi/consumer/SimpleConsumer
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:691)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:281)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.<init>(FlinkKafkaConsumer082.java:49)
    at com.dataartisans.flink_demo.examples.DriverEventConsumer$.main(DriverEventConsumer.scala:53)
    at com.dataartisans.flink_demo.examples.DriverEventConsumer.main(DriverEventConsumer.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
    at org.apache.flink.client.program.Client.runBlocking(Client.java:252)
    at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)
    at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)
    at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)
    at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)
Caused by: java.lang.ClassNotFoundException: kafka.javaapi.consumer.SimpleConsumer
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 16 more


Any help appreciated. 


balaji 


Re: kafka.javaapi.consumer.SimpleConsumer class not found

rmetzger0
Hi,

You have to use the same version for all dependencies from the "org.apache.flink" group.

You said these are the versions you are using:

flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

For the connector, you also need to use 0.10.2.
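The advice above can be sketched concretely. The following pom fragment is a minimal illustration (not taken from the thread) of pinning every org.apache.flink artifact to one version; the exact artifactId and Scala suffix are assumptions that may differ per Flink release:

```xml
<properties>
    <flink.version>0.10.2</flink.version>
    <scala.version>2.10</scala.version>
</properties>

<dependencies>
    <!-- Every org.apache.flink artifact uses the same ${flink.version} -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_${scala.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_${scala.version}</artifactId>
        <!-- 0.10.2 here as well, not 0.9.1 -->
        <version>${flink.version}</version>
    </dependency>
</dependencies>
```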



On Fri, Mar 11, 2016 at 9:56 AM, Balaji Rajagopalan <[hidden email]> wrote:



Re: kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan
Robert,
  That did not fix it (using the same version for Flink and the connector). I tried with Scala version 2.11, so I will try whether Scala 2.10 makes any difference.

balaji 

On Fri, Mar 11, 2016 at 8:06 PM, Robert Metzger <[hidden email]> wrote:




Re: kafka.javaapi.consumer.SimpleConsumer class not found

rmetzger0
Can you send me the full build file to further investigate the issue?

On Fri, Mar 11, 2016 at 4:56 PM, Balaji Rajagopalan <[hidden email]> wrote:





Re: kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan
Robert,
   I have moved on to the latest version of Flink, 1.0.0, hoping that it will solve my problem with the Kafka connector. Here is my pom.xml, but now I cannot get the code to compile.

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (scala-compile-first) on project flink-streaming-demo: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile failed: For artifact {null:null:null:jar}: The groupId cannot be empty. -> [Help 1]

I read about this error; in most cases people were able to overcome it by deleting the .m2 directory, but that did not fix the issue for me.

What I noticed was that the error goes away if I remove the dependency on flink-connector-kafka.

Here is my pom.xml 

<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.dataArtisans</groupId>
<artifactId>flink-streaming-demo</artifactId>
<version>0.1</version>
<packaging>jar</packaging>

<name>Flink Streaming Demo</name>
<url>http://www.data-artisans.com</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<slf4j.version>1.7.12</slf4j.version>
<flink.version>1.0.0</flink.version>
<scala.version>2.10</scala.version>
</properties>

<dependencies>



<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-runtime-web_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>1.7.3</version>
<scope>compile</scope>
</dependency>

<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.7</version>
</dependency>

<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-native_${scala.version}</artifactId>
<version>3.3.0</version>
</dependency>


</dependencies>

<build>
<plugins>

<!-- Scala Compiler -->
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.1</version>
<executions>
<!-- Run scala compiler in the process-resources phase, so that dependencies on
scala classes can be resolved later in the (Java) compile phase -->
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>

<!-- Run scala compiler in the process-test-resources phase, so that dependencies on
scala classes can be resolved later in the (Java) test-compile phase -->
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<jvmArgs>
<jvmArg>-Xms128m</jvmArg>
<jvmArg>-Xmx512m</jvmArg>
</jvmArgs>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.9</version>
<executions>
<execution>
<id>unpack</id>
<!-- executed just before the package phase -->
<phase>prepare-package</phase>
<goals>
<goal>unpack</goal>
</goals>
<configuration>
<artifactItems>
<!-- For Flink connector classes -->
<artifactItem>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>1.0.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>org/apache/flink/**</includes>
</artifactItem>
<!-- For Kafka API classes -->
<artifactItem>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>kafka/**</includes>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>

<!--plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>

<execution>
<id>MBoxParser</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>

<configuration>
<classifier>MBoxParser</classifier>

<archive>
<manifestEntries>
<main-class>com.dataartisans.flinkTraining.dataSetPreparation.MBoxParser</main-class>
</manifestEntries>
</archive>

<includes>
<include>**/MBoxParser.class</include>
<include>**/MBoxParser$*.class</include>
</includes>
</configuration>
</execution>

</executions>
</plugin-->

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
<version>0.10</version><!--$NO-MVN-MAN-VER$-->
<inherited>false</inherited>
<executions>
<execution>
<phase>verify</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<excludeSubProjects>false</excludeSubProjects>
<numUnapprovedLicenses>0</numUnapprovedLicenses>
<licenses>
<!-- Enforce this license:
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<license implementation="org.apache.rat.analysis.license.SimplePatternBasedLicense">
<licenseFamilyCategory>AL2 </licenseFamilyCategory>
<licenseFamilyName>Apache License 2.0</licenseFamilyName>
<notes />
<patterns>
<pattern>Copyright 2015 data Artisans GmbH</pattern>
<pattern>Licensed under the Apache License, Version 2.0 (the "License");</pattern>
</patterns>
</license>
</licenses>
<licenseFamilies>
<licenseFamily implementation="org.apache.rat.license.SimpleLicenseFamily">
<familyName>Apache License 2.0</familyName>
</licenseFamily>
</licenseFamilies>
<excludes>
<!-- Additional files like .gitignore etc.-->
<exclude>**/.*</exclude>
<exclude>**/*.prefs</exclude>
<exclude>**/*.properties</exclude>
<exclude>**/*.log</exclude>
<exclude>*.txt/**</exclude>
<!-- Administrative files in the main trunk. -->
<exclude>**/README.md</exclude>
<exclude>CHANGELOG</exclude>
<!-- Build files -->
<exclude>**/*.iml</exclude>
<!-- Generated content -->
<exclude>**/target/**</exclude>
<exclude>**/build/**</exclude>
</excludes>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-checkstyle-plugin</artifactId>
<version>2.12.1</version>
<executions>
<execution>
<id>validate</id>
<phase>validate</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<configLocation>/tools/maven/checkstyle.xml</configLocation>
<logViolationsToConsole>true</logViolationsToConsole>
</configuration>
</plugin>

</plugins>

</build>
</project>
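For comparison, the pom above builds the fat jar by unpacking connector classes with maven-dependency-plugin. A common alternative is maven-shade-plugin; this is a hedged sketch only, and the plugin version and include patterns are assumptions, not part of the original build:

```xml
<!-- Sketch: shade-based fat jar; version and includes are illustrative -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <artifactSet>
                    <!-- bundle only what the Flink installation does not already provide -->
                    <includes>
                        <include>org.apache.flink:flink-connector-kafka*</include>
                        <include>org.apache.kafka:*</include>
                    </includes>
                </artifactSet>
            </configuration>
        </execution>
    </executions>
</plugin>
```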

On Mon, Mar 14, 2016 at 3:15 PM, Robert Metzger <[hidden email]> wrote:






Re: kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan

What I noticed was that the error goes away if I remove the dependency on flink-connector-kafka, so it is clearly something to do with that dependency.
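If flink-connector-kafka is indeed the culprit, one possible explanation, offered as an assumption rather than a confirmed diagnosis: starting with Flink 1.0.0 the Kafka connector is published per Kafka version, so an artifact named flink-connector-kafka_2.10 may simply not exist at version 1.0.0, leaving Maven unable to resolve it. For Kafka 0.8.x brokers, a dependency along these lines would then be needed instead:

```xml
<!-- Assumed fix for Flink 1.0.0 with Kafka 0.8.x brokers -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
    <version>1.0.0</version>
</dependency>
```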



On Mon, Mar 14, 2016 at 3:46 PM, Balaji Rajagopalan <[hidden email]> wrote:
Robert,
   I have  moved on to latest version of flink of 1.0.0 hoping that will solve my problem with kafka connector . Here is my pom.xml but now I cannot get the code compiled. 

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (scala-compile-first) on project flink-streaming-demo: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile failed: For artifact {null:null:null:jar}: The groupId cannot be empty. -> [Help 1]

I read about the above errors in most cases people where able to overcome is by deleting the .m2 directory, and that did not fix the issue for me. 

What I noticied was that, if I remove the dependency on 

Here is my pom.xml 

<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.dataArtisans</groupId>
<artifactId>flink-streaming-demo</artifactId>
<version>0.1</version>
<packaging>jar</packaging>

<name>Flink Streaming Demo</name>
<url>http://www.data-artisans.com</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<slf4j.version>1.7.12</slf4j.version>
<flink.version>1.0.0</flink.version>
<scala.version>2.10</scala.version>
</properties>

<dependencies>



<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-runtime-web_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>1.7.3</version>
<scope>compile</scope>
</dependency>

<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.7</version>
</dependency>

<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-native_${scala.version}</artifactId>
<version>3.3.0</version>
</dependency>


</dependencies>

<build>
<plugins>

<!-- Scala Compiler -->
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.1</version>
<executions>
<!-- Run scala compiler in the process-resources phase, so that dependencies on
scala classes can be resolved later in the (Java) compile phase -->
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>

<!-- Run scala compiler in the process-test-resources phase, so that dependencies on
scala classes can be resolved later in the (Java) test-compile phase -->
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<jvmArgs>
<jvmArg>-Xms128m</jvmArg>
<jvmArg>-Xmx512m</jvmArg>
</jvmArgs>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.9</version>
<executions>
<execution>
<id>unpack</id>
<!-- executed just before the package phase -->
<phase>prepare-package</phase>
<goals>
<goal>unpack</goal>
</goals>
<configuration>
<artifactItems>
<!-- For Flink connector classes -->
<artifactItem>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>1.0.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>org/apache/flink/**</includes>
</artifactItem>
<!-- For Kafka API classes -->
<artifactItem>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>kafka/**</includes>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>

<!--plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>

<execution>
<id>MBoxParser</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>

<configuration>
<classifier>MBoxParser</classifier>

<archive>
<manifestEntries>
<main-class>com.dataartisans.flinkTraining.dataSetPreparation.MBoxParser</main-class>
</manifestEntries>
</archive>

<includes>
<include>**/MBoxParser.class</include>
<include>**/MBoxParser$*.class</include>
</includes>
</configuration>
</execution>

</executions>
</plugin-->

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source> <!-- Java 8 -->
<target>1.8</target>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
<version>0.10</version><!--$NO-MVN-MAN-VER$-->
<inherited>false</inherited>
<executions>
<execution>
<phase>verify</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<excludeSubProjects>false</excludeSubProjects>
<numUnapprovedLicenses>0</numUnapprovedLicenses>
<licenses>
<!-- Enforce this license:
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<license implementation="org.apache.rat.analysis.license.SimplePatternBasedLicense">
<licenseFamilyCategory>AL2 </licenseFamilyCategory>
<licenseFamilyName>Apache License 2.0</licenseFamilyName>
<notes />
<patterns>
<pattern>Copyright 2015 data Artisans GmbH</pattern>
<pattern>Licensed under the Apache License, Version 2.0 (the "License");</pattern>
</patterns>
</license>
</licenses>
<licenseFamilies>
<licenseFamily implementation="org.apache.rat.license.SimpleLicenseFamily">
<familyName>Apache License 2.0</familyName>
</licenseFamily>
</licenseFamilies>
<excludes>
<!-- Additional files like .gitignore etc.-->
<exclude>**/.*</exclude>
<exclude>**/*.prefs</exclude>
<exclude>**/*.properties</exclude>
<exclude>**/*.log</exclude>
<exclude>*.txt/**</exclude>
<!-- Administrative files in the main trunk. -->
<exclude>**/README.md</exclude>
<exclude>CHANGELOG</exclude>
<!-- Build files -->
<exclude>**/*.iml</exclude>
<!-- Generated content -->
<exclude>**/target/**</exclude>
<exclude>**/build/**</exclude>
</excludes>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-checkstyle-plugin</artifactId>
<version>2.12.1</version>
<executions>
<execution>
<id>validate</id>
<phase>validate</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<configLocation>/tools/maven/checkstyle.xml</configLocation>
<logViolationsToConsole>true</logViolationsToConsole>
</configuration>
</plugin>

</plugins>

</build>
</project>
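As an aside on the fat-jar step: instead of unpacking dependency jars with maven-dependency-plugin as above, many Flink projects build the fat jar with maven-shade-plugin. A minimal sketch (the plugin version and the exclusion list are assumptions, not taken from this thread):

```xml
<!-- Hypothetical alternative to the dependency-plugin unpack approach above. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <artifactSet>
          <!-- Keep the connector and its Kafka dependency in the jar;
               core Flink classes are already provided by the cluster. -->
          <excludes>
            <exclude>org.apache.flink:flink-streaming-scala_${scala.version}</exclude>
            <exclude>org.apache.flink:flink-runtime-web_${scala.version}</exclude>
          </excludes>
        </artifactSet>
      </configuration>
    </execution>
  </executions>
</plugin>
```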

On Mon, Mar 14, 2016 at 3:15 PM, Robert Metzger <[hidden email]> wrote:
Can you send me the full build file to further investigate the issue?

On Fri, Mar 11, 2016 at 4:56 PM, Balaji Rajagopalan <[hidden email]> wrote:
Robert,
  That did not fix it (using the same version for Flink and the connector). I tried with Scala version 2.11, so I will check whether Scala 2.10 makes any difference. 

balaji 

On Fri, Mar 11, 2016 at 8:06 PM, Robert Metzger <[hidden email]> wrote:
Hi,

you have to use the same version for all dependencies from the "org.apache.flink" group.

You said these are the versions you are using:

flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

For the connector, you also need to use 0.10.2.
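In pom.xml terms, keeping the connector on the same version as the other Flink artifacts looks roughly like this (a sketch using the property names already used elsewhere in this thread):

```xml
<properties>
  <flink.version>0.10.2</flink.version>
  <scala.version>2.11</scala.version>
</properties>

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka_${scala.version}</artifactId>
  <!-- must match flink.version, not an independent 0.9.x number -->
  <version>${flink.version}</version>
</dependency>
```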



On Fri, Mar 11, 2016 at 9:56 AM, Balaji Rajagopalan <[hidden email]> wrote:
I am trying to use the Flink Kafka connector. For this I have specified the Kafka connector dependency and created a fat jar, since the default Flink installation does not contain the Kafka connector jars. I have made sure that flink-streaming-demo-0.1.jar contains kafka.javaapi.consumer.SimpleConsumer.class, but I still see the class-not-found exception. 

The code for the Kafka connector in Flink: 
val env = StreamExecutionEnvironment.getExecutionEnvironment
val prop:Properties = new Properties()
prop.setProperty("zookeeper.connect","somezookeer:2181")
prop.setProperty("group.id","some-group")
prop.setProperty("bootstrap.servers","somebroker:9092")

val stream = env
.addSource(new FlinkKafkaConsumer082[String]("location", new SimpleStringSchema, prop))

jar tvf flink-streaming-demo-0.1.jar | grep kafka.javaapi.consumer.SimpleConsumer

  5111 Fri Mar 11 14:18:36 UTC 2016 kafka/javaapi/consumer/SimpleConsumer.class


flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

The command that I use to run the Flink program on the YARN cluster is below: 

HADOOP_CONF_DIR=/etc/hadoop/conf /usr/share/flink/bin/flink run -c com.dataartisans.flink_demo.examples.DriverEventConsumer  -m yarn-cluster -yn 2 /home/balajirajagopalan/flink-streaming-demo-0.1.jar

java.lang.NoClassDefFoundError: kafka/javaapi/consumer/SimpleConsumer
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:691)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:281)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.<init>(FlinkKafkaConsumer082.java:49)
    at com.dataartisans.flink_demo.examples.DriverEventConsumer$.main(DriverEventConsumer.scala:53)
    at com.dataartisans.flink_demo.examples.DriverEventConsumer.main(DriverEventConsumer.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
    at org.apache.flink.client.program.Client.runBlocking(Client.java:252)
    at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)
    at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)
    at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)
    at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)
Caused by: java.lang.ClassNotFoundException: kafka.javaapi.consumer.SimpleConsumer
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 16 more


Any help appreciated. 


balaji 







Re: kafka.javaapi.consumer.SimpleConsumer class not found

rmetzger0
Hi,

flink-connector-kafka_&lt;scala-version&gt; doesn't exist for 1.0.0. You have to use either flink-connector-kafka-0.8_&lt;scala-version&gt; or flink-connector-kafka-0.9_&lt;scala-version&gt;.
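For a 0.8.x broker, as used in this thread, the Flink 1.0.0 dependency would therefore look roughly like this (Scala suffix shown for 2.10; adjust to your Scala version):

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
  <version>1.0.0</version>
</dependency>
```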


On Mon, Mar 14, 2016 at 11:17 AM, Balaji Rajagopalan <[hidden email]> wrote:

What I noticed was that the error goes away if I remove the dependency on flink-connector-kafka, so it clearly has something to do with that dependency. 



On Mon, Mar 14, 2016 at 3:46 PM, Balaji Rajagopalan <[hidden email]> wrote:
Robert,
   I have moved on to the latest version of Flink, 1.0.0, hoping that will solve my problem with the Kafka connector. Here is my pom.xml, but now I cannot get the code to compile. 

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (scala-compile-first) on project flink-streaming-demo: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile failed: For artifact {null:null:null:jar}: The groupId cannot be empty. -> [Help 1]

I read that in most cases people were able to overcome the above error by deleting the .m2 directory, but that did not fix the issue for me. 

What I noticed was that, if I remove the dependency on flink-connector-kafka, the compile error goes away. 



Re: kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan
Yes, figured that out, thanks for pointing it out, my bad. I have put back 0.10.2 as the Flink version and will try to reproduce the problem again; this time I have explicitly called out the Scala version as 2.11. 



Reply | Threaded
Open this post in threaded view
|

Re: kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan
Yep, the same issue as before (class not found) with Flink 0.10.2 and Scala version 2.11. I was not able to use Scala 2.10, since the flink-connector-kafka artifact for 0.10.2 is not available for it. 

balaji 

On Mon, Mar 14, 2016 at 4:20 PM, Balaji Rajagopalan <[hidden email]> wrote:
Yes, figured that out, thanks for pointing that out, my bad. I have put back 0.10.2 as the flink version and will try to reproduce the problem again; this time I have explicitly called out the Scala version as 2.11. 


On Mon, Mar 14, 2016 at 4:14 PM, Robert Metzger <[hidden email]> wrote:
Hi,

flink-connector-kafka_ doesn't exist for 1.0.0. You have to use either flink-connector-kafka-0.8_ or flink-connector-kafka-0.9_
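For reference, a sketch of what the renamed dependency could look like for a Flink 1.0.0 / Kafka 0.8.x setup. The `_2.10` Scala suffix is an assumption; verify the coordinates on Maven Central.

```xml
<!-- Sketch: the Kafka-versioned connector artifact used from Flink 1.0.0 on.
     The matching org.apache.kafka client is normally pulled in transitively. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
  <version>1.0.0</version>
</dependency>
```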


On Mon, Mar 14, 2016 at 11:17 AM, Balaji Rajagopalan <[hidden email]> wrote:

What I noticed was that the error goes away if I remove the dependency on flink-connector-kafka, so it is clearly related to that dependency. 



On Mon, Mar 14, 2016 at 3:46 PM, Balaji Rajagopalan <[hidden email]> wrote:
Robert,
   I have moved on to the latest version of Flink, 1.0.0, hoping that will solve my problem with the Kafka connector. Here is my pom.xml, but now I cannot get the code to compile. 

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (scala-compile-first) on project flink-streaming-demo: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile failed: For artifact {null:null:null:jar}: The groupId cannot be empty. -> [Help 1]

I read that in most cases people were able to overcome the above error by deleting the .m2 directory, but that did not fix the issue for me. 
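Instead of wiping all of ~/.m2, one option is to clear only the cached entry for the artifact that failed to resolve and force Maven to re-fetch metadata. This is a sketch; the local-repository path and the artifact directory name are assumptions, adjust them to the coordinates your pom actually declares.

```shell
# Sketch: delete only the cached entry for the unresolvable connector
# artifact (assumed path), then rebuild with -U to force Maven to
# re-check remote repositories instead of reusing stale metadata.
rm -rf ~/.m2/repository/org/apache/flink/flink-connector-kafka_2.10
mvn -U clean package
```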

What I noticed was that, if I remove the dependency on 

Here is my pom.xml 

<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.dataArtisans</groupId>
<artifactId>flink-streaming-demo</artifactId>
<version>0.1</version>
<packaging>jar</packaging>

<name>Flink Streaming Demo</name>
<url>http://www.data-artisans.com</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<slf4j.version>1.7.12</slf4j.version>
<flink.version>1.0.0</flink.version>
<scala.version>2.10</scala.version>
</properties>

<dependencies>



<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-runtime-web_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>1.7.3</version>
<scope>compile</scope>
</dependency>

<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.7</version>
</dependency>

<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-native_${scala.version}</artifactId>
<version>3.3.0</version>
</dependency>


</dependencies>

<build>
<plugins>

<!-- Scala Compiler -->
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.1</version>
<executions>
<!-- Run scala compiler in the process-resources phase, so that dependencies on
scala classes can be resolved later in the (Java) compile phase -->
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>

<!-- Run scala compiler in the process-test-resources phase, so that dependencies on
scala classes can be resolved later in the (Java) test-compile phase -->
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<jvmArgs>
<jvmArg>-Xms128m</jvmArg>
<jvmArg>-Xmx512m</jvmArg>
</jvmArgs>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.9</version>
<executions>
<execution>
<id>unpack</id>
<!-- executed just before the package phase -->
<phase>prepare-package</phase>
<goals>
<goal>unpack</goal>
</goals>
<configuration>
<artifactItems>
<!-- For Flink connector classes -->
<artifactItem>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>1.0.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>org/apache/flink/**</includes>
</artifactItem>
<!-- For Kafka API classes -->
<artifactItem>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>kafka/**</includes>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>

<!--plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>

<execution>
<id>MBoxParser</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>

<configuration>
<classifier>MBoxParser</classifier>

<archive>
<manifestEntries>
<main-class>com.dataartisans.flinkTraining.dataSetPreparation.MBoxParser</main-class>
</manifestEntries>
</archive>

<includes>
<include>**/MBoxParser.class</include>
<include>**/MBoxParser$*.class</include>
</includes>
</configuration>
</execution>

</executions>
</plugin-->

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
<version>0.10</version><!--$NO-MVN-MAN-VER$-->
<inherited>false</inherited>
<executions>
<execution>
<phase>verify</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<excludeSubProjects>false</excludeSubProjects>
<numUnapprovedLicenses>0</numUnapprovedLicenses>
<licenses>
<!-- Enforce this license:
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<license implementation="org.apache.rat.analysis.license.SimplePatternBasedLicense">
<licenseFamilyCategory>AL2 </licenseFamilyCategory>
<licenseFamilyName>Apache License 2.0</licenseFamilyName>
<notes />
<patterns>
<pattern>Copyright 2015 data Artisans GmbH</pattern>
<pattern>Licensed under the Apache License, Version 2.0 (the "License");</pattern>
</patterns>
</license>
</licenses>
<licenseFamilies>
<licenseFamily implementation="org.apache.rat.license.SimpleLicenseFamily">
<familyName>Apache License 2.0</familyName>
</licenseFamily>
</licenseFamilies>
<excludes>
<!-- Additional files like .gitignore etc.-->
<exclude>**/.*</exclude>
<exclude>**/*.prefs</exclude>
<exclude>**/*.properties</exclude>
<exclude>**/*.log</exclude>
<exclude>*.txt/**</exclude>
<!-- Administrative files in the main trunk. -->
<exclude>**/README.md</exclude>
<exclude>CHANGELOG</exclude>
<!-- Build files -->
<exclude>**/*.iml</exclude>
<!-- Generated content -->
<exclude>**/target/**</exclude>
<exclude>**/build/**</exclude>
</excludes>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-checkstyle-plugin</artifactId>
<version>2.12.1</version>
<executions>
<execution>
<id>validate</id>
<phase>validate</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<configLocation>/tools/maven/checkstyle.xml</configLocation>
<logViolationsToConsole>true</logViolationsToConsole>
</configuration>
</plugin>

</plugins>

</build>
</project>
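After `mvn package` with the unpack step above, it may be worth confirming that both class trees actually made it into the jar before deploying to YARN. A sketch; the jar name is assumed from the pom's artifactId/version:

```shell
# Sketch: list the packaged jar's contents and look for both the Flink
# connector classes and the Kafka API classes the connector needs.
jar tvf target/flink-streaming-demo-0.1.jar \
  | grep -E 'org/apache/flink/streaming/connectors/kafka|kafka/javaapi' \
  | head
```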

On Mon, Mar 14, 2016 at 3:15 PM, Robert Metzger <[hidden email]> wrote:
Can you send me the full build file to further investigate the issue?

On Fri, Mar 11, 2016 at 4:56 PM, Balaji Rajagopalan <[hidden email]> wrote:
Robert,
  That did not fix it (using flink and connector at the same version). I tried with Scala version 2.11, so I will check whether Scala 2.10 makes any difference. 

balaji 


Re: kafka.javaapi.consumer.SimpleConsumer class not found

Balaji Rajagopalan
Robert,
  I got it working for 1.0.0. 

balaji 



Re: kafka.javaapi.consumer.SimpleConsumer class not found

rmetzger0
Great to hear!

<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>1.0.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>org/apache/flink/**</includes>
</artifactItem>
<!-- For Kafka API classes -->
<artifactItem>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_${scala.version}</artifactId>
<version>0.8.2.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>kafka/**</includes>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>

<!--plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>

<execution>
<id>MBoxParser</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>

<configuration>
<classifier>MBoxParser</classifier>

<archive>
<manifestEntries>
<main-class>com.dataartisans.flinkTraining.dataSetPreparation.MBoxParser</main-class>
</manifestEntries>
</archive>

<includes>
<include>**/MBoxParser.class</include>
<include>**/MBoxParser$*.class</include>
</includes>
</configuration>
</execution>

</executions>
</plugin-->

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
<version>0.10</version><!--$NO-MVN-MAN-VER$-->
<inherited>false</inherited>
<executions>
<execution>
<phase>verify</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<excludeSubProjects>false</excludeSubProjects>
<numUnapprovedLicenses>0</numUnapprovedLicenses>
<licenses>
<!-- Enforce this license:
Copyright 2015 data Artisans GmbH

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<license implementation="org.apache.rat.analysis.license.SimplePatternBasedLicense">
<licenseFamilyCategory>AL2 </licenseFamilyCategory>
<licenseFamilyName>Apache License 2.0</licenseFamilyName>
<notes />
<patterns>
<pattern>Copyright 2015 data Artisans GmbH</pattern>
<pattern>Licensed under the Apache License, Version 2.0 (the "License");</pattern>
</patterns>
</license>
</licenses>
<licenseFamilies>
<licenseFamily implementation="org.apache.rat.license.SimpleLicenseFamily">
<familyName>Apache License 2.0</familyName>
</licenseFamily>
</licenseFamilies>
<excludes>
<!-- Additional files like .gitignore etc.-->
<exclude>**/.*</exclude>
<exclude>**/*.prefs</exclude>
<exclude>**/*.properties</exclude>
<exclude>**/*.log</exclude>
<exclude>*.txt/**</exclude>
<!-- Administrative files in the main trunk. -->
<exclude>**/README.md</exclude>
<exclude>CHANGELOG</exclude>
<!-- Build files -->
<exclude>**/*.iml</exclude>
<!-- Generated content -->
<exclude>**/target/**</exclude>
<exclude>**/build/**</exclude>
</excludes>
</configuration>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-checkstyle-plugin</artifactId>
<version>2.12.1</version>
<executions>
<execution>
<id>validate</id>
<phase>validate</phase>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
<configuration>
<configLocation>/tools/maven/checkstyle.xml</configLocation>
<logViolationsToConsole>true</logViolationsToConsole>
</configuration>
</plugin>

</plugins>

</build>
</project>
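The pom above assembles the fat jar by unpacking connector and Kafka classes with the maven-dependency-plugin. As an alternative sketch (my addition, not part of the original pom), the maven-shade-plugin can bundle all compile-scope dependencies into the jar in one step; the signature-file excludes are needed because shading signed jars otherwise fails with a SecurityException at runtime:

```xml
<!-- Hypothetical alternative to the dependency-plugin unpack approach:
     build the fat jar with maven-shade-plugin instead. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <!-- Strip signature files; they become invalid once classes
               from several jars are merged into one. -->
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` produces a single jar containing the application together with the Kafka and connector classes, so nothing has to be listed artifact by artifact.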

On Mon, Mar 14, 2016 at 3:15 PM, Robert Metzger <[hidden email]> wrote:
Can you send me the full build file to further investigate the issue?

On Fri, Mar 11, 2016 at 4:56 PM, Balaji Rajagopalan <[hidden email]> wrote:
Robert,
  That did not fix it (using the same version for Flink and the connector). I tried with Scala version 2.11, so I will try Scala 2.10 to see if it makes any difference.

balaji 

On Fri, Mar 11, 2016 at 8:06 PM, Robert Metzger <[hidden email]> wrote:
Hi,

you have to use the same version for all dependencies from the "org.apache.flink" group.

You said these are the versions you are using:

flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

For the connector, you also need to use 0.10.2.
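A minimal sketch of that alignment (my addition, assuming Flink 0.10.2 with Scala 2.10 artifacts) is to drive every org.apache.flink dependency from one property:

```xml
<properties>
  <flink.version>0.10.2</flink.version>
  <scala.version>2.10</scala.version>
</properties>

<dependencies>
  <!-- Both Flink artifacts use the same flink.version property,
       so they can never drift apart. -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_${scala.version}</artifactId>
    <version>${flink.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.version}</artifactId>
    <version>${flink.version}</version>
  </dependency>
</dependencies>
```

Any hard-coded Flink version elsewhere in the pom (for example in plugin artifactItems) should reference ${flink.version} as well.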



On Fri, Mar 11, 2016 at 9:56 AM, Balaji Rajagopalan <[hidden email]> wrote:
I am trying to use the Flink Kafka connector. For this I have specified the Kafka connector dependency and created a fat jar, since the default Flink installation does not contain the Kafka connector jars. I have made sure that flink-streaming-demo-0.1.jar contains kafka.javaapi.consumer.SimpleConsumer.class, but I still see the class-not-found exception. 

The code for kafka connector in flink. 
val env = StreamExecutionEnvironment.getExecutionEnvironment
val prop:Properties = new Properties()
prop.setProperty("zookeeper.connect","somezookeer:2181")
prop.setProperty("group.id","some-group")
prop.setProperty("bootstrap.servers","somebroker:9092")

val stream = env
.addSource(new FlinkKafkaConsumer082[String]("location", new SimpleStringSchema, prop))

jar tvf flink-streaming-demo-0.1.jar | grep kafka.javaapi.consumer.SimpleConsumer

  5111 Fri Mar 11 14:18:36 UTC 2016 kafka/javaapi/consumer/SimpleConsumer.class


flink.version = 0.10.2
kafka.version = 0.8.2
flink.kafka.connector.version = 0.9.1

The command that I use to run the flink program in yarn cluster is below, 

HADOOP_CONF_DIR=/etc/hadoop/conf /usr/share/flink/bin/flink run -c com.dataartisans.flink_demo.examples.DriverEventConsumer  -m yarn-cluster -yn 2 /home/balajirajagopalan/flink-streaming-demo-0.1.jar

java.lang.NoClassDefFoundError: kafka/javaapi/consumer/SimpleConsumer
	at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:691)
	at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:281)
	at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.<init>(FlinkKafkaConsumer082.java:49)
	at com.dataartisans.flink_demo.examples.DriverEventConsumer$.main(DriverEventConsumer.scala:53)
	at com.dataartisans.flink_demo.examples.DriverEventConsumer.main(DriverEventConsumer.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
	at org.apache.flink.client.program.Client.runBlocking(Client.java:252)
	at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:676)
	at org.apache.flink.client.CliFrontend.run(CliFrontend.java:326)
	at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:978)
	at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1028)
Caused by: java.lang.ClassNotFoundException: kafka.javaapi.consumer.SimpleConsumer
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 16 more


Any help appreciated. 


balaji