Hi,

I am following the basic steps to implement a consumer and a producer with Kafka for Flink. My Flink version is 1.2.0 and my Kafka version is 0.10.2.0, so in my pom.xml I added:

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.10_2.10</artifactId>
        <version>1.2.0</version>
    </dependency>

The problem is that the program works when I run it with Maven or in my IDE, but when I upload the jar to Flink I get:

    java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010

I googled a bit and found that problems like this are usually caused by a version mismatch, but I cannot understand where the error is.

Best,
Paolo
Hi Paolo,

Have you followed the instructions in this documentation [1]? The connectors are not part of the binary distribution, so you would need to bundle the dependencies with your code by building an uber jar.

Cheers,
Gordon

[1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/linking.html

On 6 July 2017 at 12:04:47 AM, Paolo Cristofanelli ([hidden email]) wrote:
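Since the connectors are not shipped with the Flink binary distribution, a common dependency setup (a sketch, not taken from the original messages) is to mark the core Flink APIs as `provided` (the cluster supplies them at runtime) while leaving the connector at the default `compile` scope so that it gets bundled into the uber jar:

```xml
<!-- Hypothetical pom.xml fragment illustrating the scoping.
     flink-streaming-java_2.10 is provided by the Flink cluster at
     runtime, so it does not need to be bundled into your jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.10</artifactId>
  <version>1.2.0</version>
  <scope>provided</scope>
</dependency>
<!-- The Kafka connector is NOT part of the binary distribution, so
     keep it at the default compile scope; that way the uber jar build
     includes FlinkKafkaProducer010 and its dependencies. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.10_2.10</artifactId>
  <version>1.2.0</version>
</dependency>
```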
Since you’re placing jars in the lib/ folder yourself instead of packaging an uber jar, you also need to provide the Kafka dependency jars. Using the maven-shade-plugin, you can build an uber jar. For example, add the following to your project Maven POM:

    <build>

Gordon
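The `<build>` snippet in the message above is cut off in the archive. As a rough sketch of what such a configuration typically looks like (the plugin version and the main class `com.example.MyFlinkJob` are placeholders, not from the original message):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <!-- Run the shade goal during the package phase -->
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <!-- Sets the entry point in the uber jar's manifest;
                   replace with your job's actual main class -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.example.MyFlinkJob</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After `mvn package`, the shaded jar in `target/` should contain the connector classes, which you can check with `jar tf <your-jar> | grep FlinkKafkaProducer010`.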
On 6 July 2017 at 1:02:40 AM, Paolo Cristofanelli ([hidden email]) wrote: