Re: flink no class found error

Posted by Janardhan Reddy on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/flink-no-class-found-error-tp8432p8438.html

I have cross-checked that all our YARN nodes have Java 1.8 installed, but we are still getting the error: Unsupported major.minor version 52.0
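Since the nodes appear to have Java 8 but the error persists, it can help to verify two things from inside the job itself: which bytecode version the failing classes were compiled to, and which JVM version is actually running the container (on YARN this can differ from the node's default `java`). A minimal sketch — the class name and resource path to check are placeholders to adapt, e.g. the Metadata class from the stack trace:

```java
import java.io.DataInputStream;
import java.io.InputStream;

public class ClassVersionCheck {
    // Reads the bytecode "major version" from a .class file on the classpath.
    // 50 = Java 6, 51 = Java 7, 52 = Java 8.
    static int majorVersion(String resourcePath) throws Exception {
        try (InputStream in = ClassVersionCheck.class.getClassLoader()
                .getResourceAsStream(resourcePath);
             DataInputStream data = new DataInputStream(in)) {
            if (data.readInt() != 0xCAFEBABE) {
                throw new IllegalStateException("not a class file: " + resourcePath);
            }
            data.readUnsignedShort();         // minor version
            return data.readUnsignedShort();  // major version
        }
    }

    public static void main(String[] args) throws Exception {
        // The JVM this code is actually running on:
        System.out.println("runtime: " + System.getProperty("java.version"));
        // On the cluster, point this at a class from the failing job jar instead,
        // e.g. "com/olacabs/fabric/common/Metadata.class" (hypothetical path from the trace).
        System.out.println("compiled for major " + majorVersion("java/lang/String.class"));
    }
}
```

If the class reports major 52 but the job still fails with UnsupportedClassVersionError, the JVM launched for the container is older than Java 8 even though Java 8 is installed on the node (for example because of a different JAVA_HOME in the YARN environment), so the runtime version printed inside the job is the one that matters.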

On Thu, Aug 11, 2016 at 1:35 AM, Janardhan Reddy <[hidden email]> wrote:
Can you please explain a bit more about the last option? We are using YARN, so Guava might be in some classpath.

On Thu, Aug 11, 2016 at 1:29 AM, Robert Metzger <[hidden email]> wrote:
Can you check whether the jar you are submitting to the cluster contains a different Guava version than the one you use at compile time?

Also, it might happen that Guava is in your classpath already, for example on some YARN setups.

The last resort for resolving these issues is to use the maven-shade-plugin and relocate the Guava version you need into your own namespace.
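For reference, a shade-plugin relocation along these lines would look roughly like the fragment below; the `myorg.shaded` prefix is an arbitrary example namespace, and the rest of the pom is omitted:

```xml
<!-- pom.xml (build/plugins): relocate Guava into your own namespace so it
     cannot clash with the Guava shipped by Hadoop/YARN.
     "myorg.shaded" is an example prefix; pick your own. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>myorg.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, the job jar contains its own rewritten copy of Guava under the shaded package name, so whichever Guava Hadoop puts on the classpath no longer matters to your code.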

On Wed, Aug 10, 2016 at 9:56 PM, Janardhan Reddy <[hidden email]> wrote:
#1 is thrown from user code.

We use Hadoop 2.7, which uses Guava 11.2, but our application uses 18.0. I think Hadoop's Guava is getting picked up instead of ours.
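One way to confirm which Guava wins at runtime is to print where the class from the failing call was loaded from. A small sketch — on the cluster you would pass the Guava class from the stack trace, com.google.common.io.Resources; here a JDK class stands in so the snippet is self-contained:

```java
public class WhichJar {
    // Returns the jar/location a class was loaded from, or "bootstrap/JDK"
    // for platform classes that have no code source.
    static String locate(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap/JDK" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // In the Flink job, call locate("com.google.common.io.Resources")
        // and check whether the printed path points at a Hadoop jar or your own.
        System.out.println(locate("java.lang.String"));
    }
}
```

If the printed location is a Hadoop jar rather than your application jar, the older Guava is shadowing yours, which would explain the NoSuchMethodError for a method that only exists in the newer version.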

On Thu, Aug 11, 2016 at 1:24 AM, Robert Metzger <[hidden email]> wrote:
Hi Janardhan,

#1 Is the exception thrown from your user code, or from Flink?

#2 You compiled the code with Java 8, but you are trying to run it with an older JVM.

On Wed, Aug 10, 2016 at 9:46 PM, Janardhan Reddy <[hidden email]> wrote:
Hi,

We are getting the following errors when submitting Flink jobs to the cluster.

1. Caused by: java.lang.NoSuchMethodError: com.google.common.io.Resources.asCharSource

2. This is from an entirely different job:
Caused by: java.lang.UnsupportedClassVersionError: com/olacabs/fabric/common/Metadata : Unsupported major.minor version 52.0

But when we run Flink locally, neither job produces an error.