#1 is thrown from user code. We use Hadoop 2.7, which uses Guava 11.2, but our application uses 18.0. I think Hadoop's Guava is getting picked up instead of ours.

On Thu, Aug 11, 2016 at 1:24 AM, Robert Metzger <[hidden email]> wrote:

> Hi Janardhan,
>
> #1 Is the exception thrown from your user code, or from Flink?
>
> #2 is most likely caused by a compiler / runtime version mismatch:
> http://stackoverflow.com/questions/10382929/how-to-fix-java-lang-unsupportedclassversionerror-unsupported-major-minor-versi
> You compiled the code with Java 8, but you are trying to run it with an older JVM.
>
> On Wed, Aug 10, 2016 at 9:46 PM, Janardhan Reddy <[hidden email]> wrote:
>
>> Hi,
>>
>> We are getting the following errors when submitting Flink jobs to the cluster.
>>
>> 1. Caused by: java.lang.NoSuchMethodError: com.google.common.io.Resources.asCharSource
>>
>> 2. (This one is from an entirely different job.)
>>    Caused by: java.lang.UnsupportedClassVersionError: com/olacabs/fabric/common/Metadata : Unsupported major.minor version 52.0
>>
>> But when we run Flink locally, neither job produces an error.
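For #1, a quick way to confirm which Guava jar the cluster classloader actually resolves is to print the origin of the Resources class from inside the job. This is only a minimal diagnostic sketch (the class name GuavaOriginCheck is illustrative, not part of the thread):

import com.google.common.io.Resources;

public class GuavaOriginCheck {
    public static void main(String[] args) {
        // Prints the jar that Guava's Resources class was loaded from.
        // If this points at a Hadoop lib directory rather than the application
        // jar, Hadoop's older Guava (without Resources.asCharSource) is winning.
        System.out.println(
            Resources.class.getProtectionDomain().getCodeSource().getLocation());
    }
}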
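For #2, major.minor 52.0 corresponds to Java 8, so the deployed class was compiled for a newer JVM than the one running it on the cluster. The version baked into a class file can be read from its header; the following is a sketch (the path argument would be whichever class the error names, extracted from the job jar):

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersionCheck {
    public static void main(String[] args) throws IOException {
        // Usage: java ClassVersionCheck path/to/SomeClass.class
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            int magic = in.readInt();            // always 0xCAFEBABE
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();  // 50 = Java 6, 51 = Java 7, 52 = Java 8
            System.out.printf("magic=%x, major.minor=%d.%d%n", magic, major, minor);
        }
    }
}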