Hi, I would like to upgrade to the new stable version 1.2, but I get a NoClassDefFoundError when I start the application:

Caused by: java.lang.NoClassDefFoundError: com/codahale/metrics/Metric
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1367)
    at com.datastax.driver.core.Cluster.init(Cluster.java:162)
    at com.datastax.driver.core.Cluster.connectAsync(Cluster.java:333)
    at com.datastax.driver.core.Cluster.connectAsync(Cluster.java:308)
    at com.datastax.driver.core.Cluster.connect(Cluster.java:250)
    at org.apache.flink.streaming.connectors.cassandra.CassandraSinkBase.open(CassandraSinkBase.java:67)
    at org.apache.flink.streaming.connectors.cassandra.CassandraTupleSink.open(CassandraTupleSink.java:42)
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:112)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:386)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:262)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
    at java.lang.Thread.run(Thread.java:745)

So I think the Cassandra connector is the reason for it. Moreover, I don't see a version 1.2 of the connector in the Maven repository, even though the docs mention this dependency:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-cassandra_2.10</artifactId>
    <version>1.2.0</version>
</dependency>
Hi Nico,

The Cassandra connector should be available on Maven Central: http://search.maven.org/#artifactdetails%7Corg.apache.flink%7Cflink-connector-cassandra_2.10%7C1.2.0%7Cjar

The issue you've mentioned is potentially due to a shading problem. Is the "com/codahale/metrics/Metric" class in your user code jar?

On Thu, Feb 9, 2017 at 2:56 PM, Nico <[hidden email]> wrote:
Hi Robert & Nico, I am facing the same problem (java.lang.NoClassDefFoundError: com/codahale/metrics/Metric).
On Sun, Feb 12, 2017 at 1:56 AM, Robert Metzger <[hidden email]> wrote:
Hello,
I believe the Cassandra connector is not shading its dependencies properly. This didn't cause issues in the past since Flink used to have a dependency on Codahale metrics as well. Please open a JIRA for this issue.

Regards,
Chesnay

On 06.03.2017 11:32, Tarandeep Singh wrote:
Hi @all, I came back to this issue today... The "com/codahale/metrics/Metric" class was not available in the user code jar. Even after adding the Metric class to the build-jar profile of the pom file, more "class not found" errors occurred, so the only solution was to add the whole metrics dependency (see the sketch below). This worked for me.

Best,
Nico

2017-03-06 11:46 GMT+01:00 Chesnay Schepler <[hidden email]>:
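The dependency Nico added is not spelled out in the thread; a minimal sketch of what it likely looks like follows. The coordinates and version are assumptions: the missing Metric class ships in metrics-core, which is published under the group ID com.codahale.metrics in older releases and io.dropwizard.metrics in newer ones, so they should be aligned with whatever the Cassandra driver in use actually expects.

<!-- Sketch only: group ID and version are assumptions, not taken from the thread.
     Pick the metrics-core version that matches the Cassandra driver's own dependency. -->
<dependency>
    <groupId>com.codahale.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>3.0.2</version>
</dependency>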
Can we improve the Flink experience here by adding this dependency directly to the Cassandra connector's pom.xml (so that user jars always pull it in transitively)?
On Wed, Mar 15, 2017 at 4:09 PM, Nico <[hidden email]> wrote:
Yep, this is definitely a bug / misconfiguration in the build system. The Cassandra client defines metrics-core as a dependency, but the shading drops it when building the dependency-reduced POM. To resolve the issue, we need to add one line to the shading config of the cassandra module (see the sketch below). This makes the metrics dependency appear again in the dependency-reduced POM.

I've filed a JIRA: https://issues.apache.org/jira/browse/FLINK-6084 and will open a PR.

On Thu, Mar 16, 2017 at 1:08 PM, Stephan Ewen <[hidden email]> wrote:
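The exact line is not quoted in the thread; one plausible candidate, assuming the maven-shade-plugin is what builds the connector's dependency-reduced POM, is the plugin's promoteTransitiveDependencies option, which promotes transitive dependencies of shaded artifacts (such as the driver's metrics-core) back into that POM. This is a hedged guess at the shape of the fix, not the actual change from FLINK-6084.

<!-- Assumption: illustrative shade-plugin configuration, not the line from the actual fix. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <!-- Promote transitive dependencies of shaded artifacts (e.g. metrics-core,
             pulled in by the shaded Cassandra driver) so they reappear as direct
             dependencies in the dependency-reduced POM. -->
        <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
    </configuration>
</plugin>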
I've created a pull request for the fix: https://github.com/apache/flink/pull/3556

It would be nice if one of the issue reporters could validate that the Cassandra connector works after the fix. If it is a valid fix, I would like to include it in the upcoming 1.2.1 release.

On Thu, Mar 16, 2017 at 6:08 PM, Robert Metzger <[hidden email]> wrote: