I am running Apache Flink 1.1.3 (Hadoop version 1.2.1) with the NiFi connector. When I run a program with a single NiFi source, I get the following stack trace in the logs:

2016-11-11 19:28:25,661 WARN  org.apache.flink.client.CliFrontend - Unable to locate custom CLI class org.apache.flink.yarn.cli.FlinkYarnSessionCli. Flink is not compiled with support for this class.
java.lang.ClassNotFoundException: org.apache.flink.yarn.cli.FlinkYarnSessionCli
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:195)
        at org.apache.flink.client.CliFrontend.loadCustomCommandLine(CliFrontend.java:1136)
        at org.apache.flink.client.CliFrontend.<clinit>(CliFrontend.java:128)
2016-11-11 19:28:25,855 INFO  org.apache.flink.client.CliFrontend
2016-11-11 19:28:25,855 INFO  org.apache.flink.client.CliFrontend - Starting Command Line Client (Version: 1.1.3, Rev:8e8d454, Date:10.10.2016 @ 13:26:32 UTC)
2016-11-11 19:28:25,855 INFO  org.apache.flink.client.CliFrontend - Current user: xxxxx
2016-11-11 19:28:25,856 INFO  org.apache.flink.client.CliFrontend - JVM: Java HotSpot(TM) 64-Bit Server VM - Oracle Corporation - 1.7/24.80-b11
2016-11-11 19:28:25,856 INFO  org.apache.flink.client.CliFrontend - Maximum heap size: 3545 MiBytes
2016-11-11 19:28:25,866 INFO  org.apache.flink.client.CliFrontend - Hadoop version: 1.2.1

It seems the NiFi connector requires the YARN-enabled build of Flink? Is there a dependency I can add to get over this hurdle?

Thanks,
Jim
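P.S. For reference, the NiFi source is wired up roughly along the lines of the connector example from the Flink documentation. The class name, NiFi URL, and output-port name below are placeholders rather than the actual values from my job:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.nifi.NiFiDataPacket;
import org.apache.flink.streaming.connectors.nifi.NiFiSource;
import org.apache.nifi.remote.client.SiteToSiteClient;
import org.apache.nifi.remote.client.SiteToSiteClientConfig;

public class NifiSourceSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Site-to-site client config; URL and output-port name are placeholders
        SiteToSiteClientConfig clientConfig = new SiteToSiteClient.Builder()
                .url("http://localhost:8080/nifi")
                .portName("Data for Flink")
                .requestBatchCount(5)
                .buildConfig();

        // Single NiFi source feeding the stream
        DataStream<NiFiDataPacket> packets = env.addSource(new NiFiSource(clientConfig));

        // Just print whatever comes in from NiFi
        packets.print();

        env.execute("NiFi source test");
    }
}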
Hi,

the problem is that Flink's YARN code is not available in the Hadoop 1.2.1 build. How do you try to execute the Flink job to trigger this error message?

On Fri, Nov 11, 2016 at 12:23 PM, PACE, JAMES <[hidden email]> wrote:
bin/flink run -c com.att.flink.poc.NifiTest jars/flinkpoc-0.0.1-SNAPSHOT.jar

I have another entry point in this jar that uses readFileStream, and that works fine.
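For what it's worth, that working entry point looks roughly like the sketch below; the class name and input path are placeholders, not the actual ones from the jar:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.FileMonitoringFunction;

public class FileStreamSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Poll the path every second and emit newly appended data as lines of text
        DataStream<String> lines = env.readFileStream(
                "file:///tmp/flink-input",
                1000L,
                FileMonitoringFunction.WatchType.PROCESS_ONLY_APPENDED);

        lines.print();

        env.execute("readFileStream test");
    }
}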