Exception while submitting jobs through Yarn


Exception while submitting jobs through Yarn

Garvit Sharma
Hi,

I am using flink-1.5.0-bin-hadoop27-scala_2.11 to submit jobs through YARN, but I am getting the exception below:

java.lang.NoClassDefFoundError: com/sun/jersey/core/util/FeaturesAndProperties
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)



Command: HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu default -p 20 test.jar

The class com/sun/jersey/core/util/FeaturesAndProperties is already present in test.jar, so I am not sure why I am getting this exception.
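(For reference, one way to confirm the class really is bundled in the jar, using the JDK's jar tool; test.jar here is the job jar from the command above:)

```shell
# List the jar's entries and look for the class the classloader reports missing
# (test.jar is the job jar from the submission command above)
jar tf test.jar | grep 'com/sun/jersey/core/util/FeaturesAndProperties'
```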

Please check and let me know.

Thanks,
--

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination that makes him master.

Re: Exception while submitting jobs through Yarn

Garvit Sharma
Can someone please tell me why I am facing this?


Re: Exception while submitting jobs through Yarn

Chesnay Schepler
My gut feeling is that these classes must be present in jars in the /lib directory. I don't think you can supply these with the submitted jar.
For a simple test, put your jar into the /lib folder before submitting it.
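A sketch of that test (installation paths are illustrative; adjust to your setup):

```shell
# Copy the job jar into Flink's lib/ directory so its classes end up on the
# client's system classpath, then submit as before (paths are illustrative)
cp test.jar /path/to/flink-1.5.0/lib/
HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster -ynm test -yqu default test.jar
```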


Re: Exception while submitting jobs through Yarn

Garvit Sharma
Thanks, Chesnay. Now it is connecting to the ResourceManager, but I am getting the exception below:

2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration - error parsing conf core-default.xml
javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
    at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)
    at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
    at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
    at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-15 09:12:44,825 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor - Error while getting queue information from YARN: null
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)
    at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
    at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)


Please help.

Thanks,



Re: Exception while submitting jobs through Yarn

Garvit Sharma
Does anyone have an idea how to get rid of the above parse exception while submitting a Flink job to YARN?

I have already searched on the internet and could not find a solution.

Please help.
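(A note on the trace above: the org.apache.xerces frame at the top suggests an old Xerces implementation on the classpath is shadowing the JDK's built-in XML parser, which does support the xinclude feature. One way to look for such a jar — paths are illustrative:)

```shell
# Search the job jar and Flink's lib/ for a bundled Xerces parser that may
# shadow the JDK's built-in one (paths are illustrative)
jar tf test.jar | grep -i 'xerces'
ls /path/to/flink-1.5.0/lib/ | grep -i 'xerces'
```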


Re: Exception while submitting jobs through Yarn

Till Rohrmann
Hi Garvit,

have you exported the HADOOP_CLASSPATH as described in the release notes [1]?


Cheers,
Till
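(A sketch of what exporting the Hadoop classpath looks like, assuming the hadoop CLI is on the PATH; exact paths depend on the installation:)

```shell
# Put the cluster's Hadoop jars - which provide the Jersey classes the YARN
# timeline client needs - onto the client classpath before submitting
export HADOOP_CLASSPATH=$(hadoop classpath)
export HADOOP_CONF_DIR=/etc/hadoop/conf
bin/flink run -m yarn-cluster test.jar
```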

On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma <[hidden email]> wrote:
Does someone has any idea how to get rid if the above parse exception while submitting flink job to Yarn.

Already searched on the internet, could not find any solution to it.

Please help.

On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma <[hidden email]> wrote:
Thanks Chesnay, Now it is connecting to the Resource Manager but I am getting the below exception :

2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)

at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)

at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-15 09:12:44,825 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)


Please help.

Thanks,


On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler <[hidden email]> wrote:
My gut feeling is that these classes must be present in jars in the /lib directory. I don't think you can supply these with the submitted jar.
For a simple test, put your jar into the /lib folder before submitting it.

On 14.06.2018 06:56, Garvit Sharma wrote:
Can someone please tell why am I facing this?

On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma <[hidden email]> wrote:
Hi,

I am using flink-1.5.0-bin-hadoop27-scala_2.11 to submit jobs through Yarn, but I am getting the below exception :

java.lang.NoClassDefFoundError: com/sun/jersey/core/util/FeaturesAndProperties

at java.lang.ClassLoader.defineClass1(Native Method)

at java.lang.ClassLoader.defineClass(ClassLoader.java:763)

at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)

at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)

at java.net.URLClassLoader.access$100(URLClassLoader.java:73)

at java.net.URLClassLoader$1.run(URLClassLoader.java:368)

at java.net.URLClassLoader$1.run(URLClassLoader.java:362)

at java.security.AccessController.doPrivileged(Native Method)

at java.net.URLClassLoader.findClass(URLClassLoader.java:361)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)

at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)

at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)

at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)

at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)

at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties

at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)



Command : HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu default -p 20 test.jar

The class com/sun/jersey/core/util/FeaturesAndProperties is already present in test.jar, so I am not sure why I am getting this exception.
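A class being packaged inside the job jar is not enough here: the YARN client resolves classes from its own application classpath (Flink's lib/ plus whatever HADOOP_CLASSPATH contributes), not from the submitted jar. A small stand-alone probe (a sketch, not Flink code; `test.jar` is the job jar above and is optional for the run) makes the distinction visible:

```java
import java.io.File;
import java.util.jar.JarFile;

// Sketch: shows that a class can be present inside a job jar while still
// being unresolvable from the JVM's application classpath -- which is the
// classpath the YARN client uses before any user code runs.
public class JerseyProbe {
    public static void main(String[] args) throws Exception {
        String entry = "com/sun/jersey/core/util/FeaturesAndProperties.class";
        File jar = new File("test.jar"); // the user's job jar, if present
        if (jar.exists()) {
            try (JarFile jf = new JarFile(jar)) {
                System.out.println("in job jar: " + (jf.getEntry(entry) != null));
            }
        }
        try {
            Class.forName("com.sun.jersey.core.util.FeaturesAndProperties");
            System.out.println("on client classpath: true");
        } catch (ClassNotFoundException e) {
            System.out.println("on client classpath: false");
        }
    }
}
```

If this prints `false` on the machine you submit from, the fix is to make the Jersey classes visible to the client (for example via HADOOP_CLASSPATH or Flink's lib/), not to repackage the job jar.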

Please check and let me know.

Thanks,
--

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination that makes him master.



Re: Exception while submitting jobs through Yarn

Garvit Sharma
Yes, I did. 

On Fri, Jun 15, 2018 at 6:17 PM Till Rohrmann <[hidden email]> wrote:
Hi Garvit,

have you exported the HADOOP_CLASSPATH as described in the release notes [1]?


Cheers,
Till
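As a sanity check that the export actually reached the JVM the Flink CLI starts, a tiny probe (a sketch, not part of Flink) can print whether the variable is visible to a freshly started process:

```java
import java.io.File;

// Sketch: verifies that HADOOP_CLASSPATH is visible to a newly started
// JVM, i.e. that the `export HADOOP_CLASSPATH=$(hadoop classpath)` step
// from the release notes happened in the same shell session that later
// runs bin/flink.
public class HadoopClasspathCheck {
    public static void main(String[] args) {
        String hcp = System.getenv("HADOOP_CLASSPATH");
        System.out.println("HADOOP_CLASSPATH set: " + (hcp != null && !hcp.isEmpty()));
        String cp = System.getProperty("java.class.path", "");
        System.out.println("java.class.path entries: " + cp.split(File.pathSeparator).length);
    }
}
```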

On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma <[hidden email]> wrote:
Does anyone have an idea how to get rid of the above parse exception while submitting a Flink job to YARN?

I have already searched the internet but could not find a solution.

Please help.

On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma <[hidden email]> wrote:
Thanks Chesnay. Now it is connecting to the ResourceManager, but I am getting the exception below:

2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)

at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)

at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-15 09:12:44,825 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
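The `Feature 'http://apache.org/xml/features/xinclude' is not recognized` failure typically indicates that an old standalone Xerces jar on the classpath (note the `org.apache.xerces` frames) is shadowing the JDK's built-in parser, which does support XInclude. A stand-alone probe (a sketch) shows which JAXP factory wins on a given classpath and whether it accepts the feature Hadoop's `Configuration` asks for:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

// Sketch: prints which DocumentBuilderFactory implementation JAXP picks
// on this classpath and whether it supports XInclude -- Hadoop's
// Configuration enables XInclude when loading *-site.xml, which old
// standalone Xerces builds reject with exactly the error seen above.
public class XIncludeProbe {
    public static void main(String[] args) {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        System.out.println("factory: " + f.getClass().getName());
        try {
            f.setXIncludeAware(true);
            f.newDocumentBuilder();
            System.out.println("xinclude: supported");
        } catch (ParserConfigurationException | UnsupportedOperationException e) {
            System.out.println("xinclude: NOT supported (" + e + ")");
        }
    }
}
```

If it prints `org.apache.xerces.jaxp.DocumentBuilderFactoryImpl` rather than the JDK-internal `com.sun.org.apache.xerces.internal...` implementation, a `xercesImpl` jar pulled in via HADOOP_CLASSPATH or the job jar is the likely culprit.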


Please help.

Thanks,



Re: Exception while submitting jobs through Yarn

Till Rohrmann
Hmm, could you maybe share the client logs with us?

Cheers,
Till


Re: Exception while submitting jobs through Yarn

Garvit Sharma

I am not able to figure this out and have been stuck on it for the last week. Any help would be appreciated.


2018-06-16 19:25:10,523 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 8

2018-06-16 19:25:10,578 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 1

2018-06-16 19:25:10,588 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 1 -> 8

2018-06-16 19:25:10,591 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 5

2018-06-16 19:25:10,597 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 5 -> 8

2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)

at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)

at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-16 19:25:10,620 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null

2018-06-16 19:25:10,621 DEBUG org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error details

java.lang.ExceptionInInitializerError

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

Caused by: java.lang.RuntimeException: javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2600)

at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)

at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

... 14 more

Caused by: javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)

... 19 more

2018-06-16 19:25:10,627 DEBUG org.apache.hadoop.service.AbstractService                     - Service: org.apache.hadoop.yarn.client.api.impl.YarnClientImpl entered state STOPPED

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                                  - stopping client from cache: org.apache.hadoop.ipc.Client@32c726ee

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                                  - removing client from cache: org.apache.hadoop.ipc.Client@32c726ee

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                                  - stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@32c726ee

2018-06-16 19:25:10,628 DEBUG org.apache.hadoop.ipc.Client                                  - Stopping client

2018-06-16 19:25:10,629 DEBUG org.apache.hadoop.service.AbstractService                     - Service: org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl entered state STOPPED

2018-06-16 19:25:10,630 ERROR org.apache.flink.client.cli.CliFrontend                       - Fatal error while running command line interface.

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:212)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)



On Fri, Jun 15, 2018 at 9:35 PM Till Rohrmann <[hidden email]> wrote:
Hmm could you maybe share the client logs with us.

Cheers,
Till

On Fri, Jun 15, 2018 at 4:54 PM Garvit Sharma <[hidden email]> wrote:
Yes, I did. 

On Fri, Jun 15, 2018 at 6:17 PM Till Rohrmann <[hidden email]> wrote:
Hi Garvit,

have you exported the HADOOP_CLASSPATH as described in the release notes [1]?


Cheers,
Till

On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma <[hidden email]> wrote:
Does someone has any idea how to get rid if the above parse exception while submitting flink job to Yarn.

Already searched on the internet, could not find any solution to it.

Please help.

On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma <[hidden email]> wrote:
Thanks Chesnay, Now it is connecting to the Resource Manager but I am getting the below exception :

2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)

at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)

at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-15 09:12:44,825 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)


Please help.

Thanks,


On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler <[hidden email]> wrote:
My gut feeling is that these classes must be present in jars in the /lib directory. I don't think you can supply these with the submitted jar.
For a simple test, put your jar into the /lib folder before submitting it.

On 14.06.2018 06:56, Garvit Sharma wrote:
Can someone please tell why am I facing this?

On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma <[hidden email]> wrote:
Hi,

I am using flink-1.5.0-bin-hadoop27-scala_2.11 to submit jobs through Yarn, but I am getting the below exception :

java.lang.NoClassDefFoundError: com/sun/jersey/core/util/FeaturesAndProperties
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)



Command : HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu default -p 20 test.jar

The class com/sun/jersey/core/util/FeaturesAndProperties is already present in test.jar, so I am not sure why I am getting this exception.
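Whether test.jar really ships that class can be verified without a JVM, since a jar is just a zip archive. A minimal sketch (equivalent to `jar tf test.jar | grep FeaturesAndProperties`):

```python
import zipfile

def jar_contains_class(jar_path: str, class_name: str) -> bool:
    """Return True if the jar archive carries a .class entry for the given
    fully qualified class name (dotted form is converted to a path)."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()
```

Note that even when this returns True, the class lives in the user jar, which is loaded by the job's user-code classloader; the YARN client code (`YarnClientImpl`) runs against the client's system classpath, which is consistent with the suggestion to place the jar in /lib.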

Please check and let me know.

Thanks,
--

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination that makes him master.



Re: Exception while submitting jobs through Yarn

Ted Yu
The error for core-default.xml is interesting. 

Flink doesn't have this file; it probably came with YARN. Please check the Hadoop version Flink was built against versus the Hadoop version running in your cluster.

Thanks
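One way to act on this advice: compare the Hadoop line the Flink distribution was built for (2.7 for flink-1.5.0-bin-hadoop27) with the cluster's `hadoop version` output. A small sketch; the version strings below are illustrative examples, not values reported in this thread.

```python
def hadoop_line(version: str) -> tuple:
    """Reduce a Hadoop version string such as '2.7.3' or '2.8.3-amzn-0'
    to its (major, minor) line for comparison."""
    major, minor = version.split("-")[0].split(".")[:2]
    return int(major), int(minor)

def versions_compatible(flink_built_against: str, cluster: str) -> bool:
    """Flag a client/cluster Hadoop line mismatch, e.g. a hadoop27 Flink
    distribution submitting against a 2.8 cluster."""
    return hadoop_line(flink_built_against) == hadoop_line(cluster)
```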

-------- Original message --------
From: Garvit Sharma <[hidden email]>
Date: 6/16/18 7:23 AM (GMT-08:00)
Cc: Chesnay Schepler <[hidden email]>, [hidden email]
Subject: Re: Exception while submitting jobs through Yarn

I have not been able to figure this out and have been stuck on it for the past week. Any help would be appreciated.


2018-06-16 19:25:10,523 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 8
2018-06-16 19:25:10,578 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 1
2018-06-16 19:25:10,588 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 1 -> 8
2018-06-16 19:25:10,591 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 5
2018-06-16 19:25:10,597 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 5 -> 8
2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml
javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
2018-06-16 19:25:10,620 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null
2018-06-16 19:25:10,621 DEBUG org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error details
java.lang.ExceptionInInitializerError
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
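The "Feature 'http://apache.org/xml/features/xinclude' is not recognized" failure typically means an old XML parser bundled on the classpath (an Apache Xerces at `org.apache.xerces.jaxp.DocumentBuilderFactoryImpl`, as the top frame shows) is shadowing the JDK's built-in parser when Hadoop loads core-default.xml. A hedged sketch for scanning a fat jar for such entries; the package prefixes are the usual suspects, not something reported in this thread:

```python
import zipfile

# Packages that commonly ship a competing XML parser implementation.
SUSPECT_PREFIXES = ("org/apache/xerces/", "org/apache/xalan/")

def find_bundled_parsers(jar_path: str) -> list:
    """List entries in a (fat) jar that bundle an XML parser implementation
    and may override the JDK's built-in Xerces via service loading."""
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist()
                if name.startswith(SUSPECT_PREFIXES)]
```

If this turns up hits in your job jar, excluding or shading that dependency is a plausible next step.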


Re: Exception while submitting jobs through Yarn

Till Rohrmann
Could you also please share the complete log file with us.

Cheers,
Till



Re: Exception while submitting jobs through Yarn

Garvit Sharma
Hi,

Please refer to my previous mail for complete logs.

Thanks,


Re: Exception while submitting jobs through Yarn

Amit Jain
Hi Garvit,

I think Till wants the classpath details printed at the start of the JM and TM logs. For example, the following lines show the classpath used by a TM in our case.

2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  - --------------------------------------------------------------------------------
2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Starting YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  OS current user: yarn
2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Current Hadoop/Kerberos user: hadoop
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Maximum heap size: 6647 MiBytes
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JAVA_HOME: /usr/lib/jvm/java-openjdk
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Hadoop version: 2.8.3
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM Options:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xms6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xmx6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -XX:MaxDirectMemorySize=4072m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlogback.configurationFile=file:./logback.xml
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog4j.configuration=file:./log4j.properties
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Program Arguments:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     --configDir
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     .
2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Classpath: lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.........

--
Thanks,
Amit
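That "Classpath:" line can also be inspected programmatically, e.g. to spot which jar contributes the jersey or xerces classes. A small sketch that splits the entry out of a JM/TM startup log:

```python
def classpath_entries(log_text: str) -> list:
    """Return the individual classpath entries from the 'Classpath:' line
    of a JobManager/TaskManager startup log, or [] if the line is absent.
    Empty entries (from '::') are dropped."""
    for line in log_text.splitlines():
        if "Classpath:" in line:
            classpath = line.split("Classpath:", 1)[1].strip()
            return [entry for entry in classpath.split(":") if entry]
    return []
```

For example, `[e for e in classpath_entries(log) if "jersey" in e or "xerces" in e]` narrows the list to the suspect jars.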



Re: Exception while submitting jobs through Yarn

Garvit Sharma
Ok, I have attached the log file.

Please check and let me know.

Thanks,

On Mon, Jun 18, 2018 at 2:07 PM Amit Jain <[hidden email]> wrote:
Hi Gravit,

I think Till is interested to know about classpath details present at the start of JM and TM logs e.g. following logs provide classpath details used by TM in our case.

2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  - --------------------------------------------------------------------------------
2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Starting YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  OS current user: yarn
2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Current Hadoop/Kerberos user: hadoop
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Maximum heap size: 6647 MiBytes
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JAVA_HOME: /usr/lib/jvm/java-openjdk
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Hadoop version: 2.8.3
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM Options:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xms6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xmx6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -XX:MaxDirectMemorySize=4072m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlogback.configurationFile=file:./logback.xml
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog4j.configuration=file:./log4j.properties
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Program Arguments:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     --configDir
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     .
2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Classpath: lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.........

--
Thanks,
Amit

On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <[hidden email]> wrote:
Hi,

Please refer to my previous mail for complete logs.

Thanks,

On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <[hidden email]> wrote:
Could you also please share the complete log file with us.

Cheers,
Till

On Sat, Jun 16, 2018 at 5:22 PM Ted Yu <[hidden email]> wrote:
The error for core-default.xml is interesting. 

Flink doesn't have this file. Probably it came with Yarn. Please check the hadoop version Flink was built with versus the hadoop version in your cluster.

Thanks

-------- Original message --------
From: Garvit Sharma <[hidden email]>
Date: 6/16/18 7:23 AM (GMT-08:00)
Cc: Chesnay Schepler <[hidden email]>, [hidden email]
Subject: Re: Exception while submitting jobs through Yarn

I am not able to figure out, got stuck badly in this since last 1 week. Any little help would be appreciated.


2018-06-16 19:25:10,523 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 8

2018-06-16 19:25:10,578 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 1
2018-06-16 19:25:10,588 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 1 -> 8
2018-06-16 19:25:10,591 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 5
2018-06-16 19:25:10,597 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 5 -> 8
2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml
javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
	at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
	at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
	at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
	at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
	at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
	at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
	at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
2018-06-16 19:25:10,620 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null
2018-06-16 19:25:10,621 DEBUG org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error details
java.lang.ExceptionInInitializerError
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
	at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
	at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
	at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)



--

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination that makes him master.




yarn.logs (43K) Download Attachment

Re: Exception while submitting jobs through Yarn

Till Rohrmann
Which Hadoop version have you installed? It looks as if Flink has been built with Hadoop 2.7, but I see /usr/hdp/2.6.3.0-235 in the classpath. If you want to run Flink on Hadoop 2.6, then try the Hadoop-free Flink binaries or the ones built for Hadoop 2.6.

Cheers,
Till

On Mon, Jun 18, 2018 at 10:48 AM Garvit Sharma <[hidden email]> wrote:
Ok, I have attached the log file.

Please check and let me know.

Thanks,

On Mon, Jun 18, 2018 at 2:07 PM Amit Jain <[hidden email]> wrote:
Hi Garvit,

I think Till is interested in the classpath details printed at the start of the JM and TM logs. For example, the following log lines show the classpath used by a TM in our case.

2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  - --------------------------------------------------------------------------------
2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Starting YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  OS current user: yarn
2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Current Hadoop/Kerberos user: hadoop
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Maximum heap size: 6647 MiBytes
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JAVA_HOME: /usr/lib/jvm/java-openjdk
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Hadoop version: 2.8.3
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM Options:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xms6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xmx6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -XX:MaxDirectMemorySize=4072m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlogback.configurationFile=file:./logback.xml
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog4j.configuration=file:./log4j.properties
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Program Arguments:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     --configDir
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     .
2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Classpath: lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.........

--
Thanks,
Amit
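As an aside on reading these dumps: the Classpath line above reflects what the JVM was launched with, and the effective classpath can also be printed from inside any JVM. A minimal sketch (the class name PrintClasspath is illustrative, not part of the thread):

```java
import java.io.File;

public class PrintClasspath {
    public static void main(String[] args) {
        // java.class.path is the classpath the JVM was launched with;
        // printing one entry per line makes version-suffixed jars
        // (e.g. xercesImpl-2.9.1.jar) easy to spot.
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println(entry);
        }
    }
}
```

Run inside a YARN container, this shows exactly which Hadoop and parser jars that container resolved, which may differ from what is installed on the host.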

On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <[hidden email]> wrote:
Hi,

Please refer to my previous mail for complete logs.

Thanks,

On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <[hidden email]> wrote:
Could you also please share the complete log file with us.

Cheers,
Till

On Sat, Jun 16, 2018 at 5:22 PM Ted Yu <[hidden email]> wrote:
The error for core-default.xml is interesting.

Flink doesn't ship this file; it probably came with YARN. Please check the Hadoop version Flink was built with versus the Hadoop version in your cluster.

Thanks

-------- Original message --------
From: Garvit Sharma <[hidden email]>
Date: 6/16/18 7:23 AM (GMT-08:00)
Cc: Chesnay Schepler <[hidden email]>, [hidden email]
Subject: Re: Exception while submitting jobs through Yarn

I have not been able to figure this out and have been stuck on it for the past week. Any help would be appreciated.



Re: Exception while submitting jobs through Yarn

Garvit Sharma
Hi,

Sorry for the confusion, but YARN is running on Hadoop 2.7, which is why I am using the Flink 1.5 binary built for Hadoop 2.7.

Below are the details reported by the yarn version command:

Hadoop 2.7.3.2.6.3.0-235
Subversion [hidden email]:hortonworks/hadoop.git -r 45bfd33bba8acadfa0e6024c80981c023b28d454
Compiled by jenkins on 2017-10-30T02:31Z
Compiled with protoc 2.5.0
From source with checksum cd1a4a466ef450f547c279989f3aa3
This command was run using /usr/hdp/2.6.3.0-235/hadoop/hadoop-common-2.7.3.2.6.3.0-235.jar

Please let me know if you have found the resolution to my issue :)

Thanks,



Re: Exception while submitting jobs through Yarn

Till Rohrmann
Could you check which Xerces version you have on your classpath? Apparently it cannot read core-default.xml, as Ted pointed out. This might be the root cause of the failure.

Cheers,
Till
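One way to answer this from inside the running client, rather than from files on disk, is to ask the JVM which jar actually defines the parser implementation it resolved. A minimal sketch (the class name WhichJar is illustrative, not part of the thread):

```java
import javax.xml.parsers.DocumentBuilderFactory;

public class WhichJar {
    public static void main(String[] args) {
        // The DocumentBuilderFactory implementation the JAXP lookup
        // resolves on this classpath (Hadoop's Configuration uses the
        // same lookup when parsing core-default.xml):
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        Class<?> impl = factory.getClass();
        System.out.println("Parser implementation: " + impl.getName());

        // Resolve the implementation class back to the URL it was
        // loaded from: for a classpath jar this names the jar file,
        // for the JDK's built-in parser it is an rt.jar or jrt:/ URL.
        String resource = "/" + impl.getName().replace('.', '/') + ".class";
        System.out.println("Loaded from: " + impl.getResource(resource));
    }
}
```

If the printed URL points at an xercesImpl jar from the HDP installation rather than the JDK, that is the parser rejecting the XInclude feature.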


Re: Exception while submitting jobs through Yarn

Garvit Sharma
Hi,

After adding the following log statement to my code, I can see that the Xerces version in use is Xerces-J 2.9.1:

log.info("Xerces version : {}", org.apache.xerces.impl.Version.getVersion());

Also, the following is the output of the $ locate xerces command on the server:


/usr/hdp/2.6.1.0-129/falcon/client/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl.jar
/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/hbase/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/livy/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/livy2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/oozie/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/oozie/libserver/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/oozie/libtools/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/slider/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/spark2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/zookeeper/lib/xercesMinimal-1.9.6.2.jar
/usr/hdp/2.6.3.0-235/falcon/client/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar
/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/hbase/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/livy/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/livy2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/oozie/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.3.0-235/oozie/libserver/xercesImpl-2.10.0.jar
/usr/hdp/2.6.3.0-235/oozie/libtools/xercesImpl-2.10.0.jar
/usr/hdp/2.6.3.0-235/ranger-admin/ews/webapp/WEB-INF/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/slider/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/spark2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/zookeeper/lib/xercesMinimal-1.9.6.2.jar
/usr/hdp/share/hst/hst-common/lib/xercesImpl-2.9.1.jar


Now I can say that the Xerces versions are the same.


So what is causing this issue if the Xerces version is in sync?
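Version strings alone don't show which of the many jars listed above actually wins on the runtime classpath. A minimal probe (the class name ClasspathProbe is my own, not part of Flink or Hadoop) can ask the JVM where each suspect class was loaded from:

```java
import java.security.CodeSource;

public class ClasspathProbe {
    /** Reports where a class was loaded from: its jar/directory URL,
     *  "bootstrap/JDK" for platform classes, or "not on classpath". */
    public static String origin(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            if (src == null || src.getLocation() == null) {
                return "bootstrap/JDK";
            }
            return src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // Classes involved in the two failures seen in this thread.
        String[] suspects = {
            "org.apache.xerces.impl.Version",
            "com.sun.jersey.core.util.FeaturesAndProperties",
            "javax.xml.parsers.DocumentBuilderFactory"
        };
        for (String name : suspects) {
            System.out.println(name + " -> " + origin(name));
        }
    }
}
```

Running this inside the job (or via `flink run`) would show whether the Xerces classes come from the Flink shaded uber jar or from one of the HDP jars.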


I am very excited to discover the issue :)


Thanks,


On Mon, Jun 18, 2018 at 6:27 PM Till Rohrmann <[hidden email]> wrote:
Could you check which xerces version you have on your classpath? Apparently, it cannot read core-default.xml as Ted pointed out. This might be the root cause for the failure.

Cheers,
Till

On Mon, Jun 18, 2018 at 1:31 PM Garvit Sharma <[hidden email]> wrote:
Hi,

Sorry for the confusion, but YARN is running on Hadoop 2.7, and hence I am using the Flink 1.5 Hadoop 2.7 binary.

Below are the details reported by the yarn version command:

Hadoop 2.7.3.2.6.3.0-235
Subversion [hidden email]:hortonworks/hadoop.git -r 45bfd33bba8acadfa0e6024c80981c023b28d454
Compiled by jenkins on 2017-10-30T02:31Z
Compiled with protoc 2.5.0
From source with checksum cd1a4a466ef450f547c279989f3aa3
This command was run using /usr/hdp/2.6.3.0-235/hadoop/hadoop-common-2.7.3.2.6.3.0-235.jar

Please let me know if you have found the resolution to my issue :)

Thanks,


On Mon, Jun 18, 2018 at 4:50 PM Till Rohrmann <[hidden email]> wrote:
Which Hadoop version have you installed? It looks as if Flink has been built with Hadoop 2.7, but I see /usr/hdp/2.6.3.0-235 on the classpath. If you want to run Flink on Hadoop 2.6, then try the Hadoop-free Flink binaries or the ones built for Hadoop 2.6.

Cheers,
Till

On Mon, Jun 18, 2018 at 10:48 AM Garvit Sharma <[hidden email]> wrote:
Ok, I have attached the log file.

Please check and let me know.

Thanks,

On Mon, Jun 18, 2018 at 2:07 PM Amit Jain <[hidden email]> wrote:
Hi Garvit,

I think Till is interested in the classpath details printed at the start of the JM and TM logs; e.g., the following lines show the classpath used by the TM in our case.

2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  - --------------------------------------------------------------------------------
2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Starting YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  OS current user: yarn
2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Current Hadoop/Kerberos user: hadoop
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Maximum heap size: 6647 MiBytes
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JAVA_HOME: /usr/lib/jvm/java-openjdk
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Hadoop version: 2.8.3
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM Options:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xms6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xmx6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -XX:MaxDirectMemorySize=4072m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlogback.configurationFile=file:./logback.xml
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog4j.configuration=file:./log4j.properties
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Program Arguments:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     --configDir
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     .
2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Classpath: lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.........

--
Thanks,
Amit

On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <[hidden email]> wrote:
Hi,

Please refer to my previous mail for complete logs.

Thanks,

On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <[hidden email]> wrote:
Could you also please share the complete log file with us.

Cheers,
Till

On Sat, Jun 16, 2018 at 5:22 PM Ted Yu <[hidden email]> wrote:
The error for core-default.xml is interesting.

Flink doesn't ship this file; it probably came with YARN. Please check the Hadoop version Flink was built against versus the Hadoop version in your cluster.

Thanks

-------- Original message --------
From: Garvit Sharma <[hidden email]>
Date: 6/16/18 7:23 AM (GMT-08:00)
Cc: Chesnay Schepler <[hidden email]>, [hidden email]
Subject: Re: Exception while submitting jobs through Yarn

I am not able to figure this out; I have been stuck on it for the past week. Any help would be appreciated.


2018-06-16 19:25:10,523 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 8

2018-06-16 19:25:10,578 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 1

2018-06-16 19:25:10,588 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 1 -> 8

2018-06-16 19:25:10,591 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 5

2018-06-16 19:25:10,597 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 5 -> 8

2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.

at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)

at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)

at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)

at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)

at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-16 19:25:10,620 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null

2018-06-16 19:25:10,621 DEBUG org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error details

java.lang.ExceptionInInitializerError

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)

at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)




Re: Exception while submitting jobs through Yarn

Till Rohrmann
Hmm, could you check whether core-default.xml contains any suspicious entries? Apparently Xerces 2.9.1 cannot read it.
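The feature that fails in the stack trace is the XInclude flag that Hadoop's Configuration enables before parsing core-default.xml. A small standalone check (the class name XIncludeCheck is hypothetical; this is a sketch of the failure mode, not Hadoop's actual code) shows whether the JAXP factory resolved on a given classpath accepts that flag:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class XIncludeCheck {
    /** Returns true if the resolved DocumentBuilderFactory accepts XInclude. */
    public static boolean supportsXInclude() {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        // Which implementation won: the JDK's built-in parser or an external Xerces.
        System.out.println("Factory implementation: " + dbf.getClass().getName());
        try {
            // Hadoop's Configuration.loadResource() enables XInclude the same way;
            // an old Xerces on the classpath fails at one of these two calls with
            // "Feature 'http://apache.org/xml/features/xinclude' is not recognized".
            dbf.setXIncludeAware(true);
            dbf.newDocumentBuilder();
            return true;
        } catch (ParserConfigurationException | UnsupportedOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("XInclude supported: " + supportsXInclude());
    }
}
```

On a plain JDK this returns true; if an old xercesImpl jar shadows the built-in parser, it returns false, which matches the ParserConfigurationException in the logs.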


Re: Exception while submitting jobs through Yarn

Garvit Sharma
I don't think I can access core-default.xml, as it is bundled inside the Hadoop jar.
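Although core-default.xml ships inside hadoop-common.jar rather than as a loose file, it can still be read off the classpath as a resource. A sketch (the class name CoreDefaultDump is hypothetical):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class CoreDefaultDump {
    /** Returns the first line of core-default.xml as seen by this JVM's
     *  classpath, or null when the resource is not present or unreadable. */
    public static String firstLine() {
        InputStream in = CoreDefaultDump.class.getClassLoader()
                .getResourceAsStream("core-default.xml");
        if (in == null) {
            return null;
        }
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            return r.readLine();
        } catch (IOException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        String line = firstLine();
        System.out.println(line == null
                ? "core-default.xml is not on this classpath"
                : "core-default.xml found; first line: " + line);
    }
}
```

Running this with the same classpath as the Flink client would confirm whether the file is reachable and which copy is being parsed.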

On Mon, 18 Jun 2018 at 7:30 PM, Till Rohrmann <[hidden email]> wrote:
Hmm, could you check whether core-default.xml contains any suspicious entries? Apparently xerces:2.9.1 cannot read it.

On Mon, Jun 18, 2018 at 3:40 PM Garvit Sharma <[hidden email]> wrote:
Hi,

After putting the following log in my code, I can see that the Xerces version is - Xerces version : Xerces-J 2.9.1

log.info("Xerces version : {}", org.apache.xerces.impl.Version.getVersion());
Also, following is the response of $ locate xerces command on the server -


/usr/hdp/2.6.1.0-129/falcon/client/lib/xercesImpl-2.10.0.jar

/usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl.jar

/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/hbase/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/livy/jars/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/livy2/jars/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/oozie/lib/xercesImpl-2.10.0.jar

/usr/hdp/2.6.1.0-129/oozie/libserver/xercesImpl-2.10.0.jar

/usr/hdp/2.6.1.0-129/oozie/libtools/xercesImpl-2.10.0.jar

/usr/hdp/2.6.1.0-129/slider/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/spark2/jars/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar

/usr/hdp/2.6.1.0-129/zookeeper/lib/xercesMinimal-1.9.6.2.jar

/usr/hdp/2.6.3.0-235/falcon/client/lib/xercesImpl-2.10.0.jar

/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar

/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/hbase/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/livy/jars/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/livy2/jars/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/oozie/lib/xercesImpl-2.10.0.jar

/usr/hdp/2.6.3.0-235/oozie/libserver/xercesImpl-2.10.0.jar

/usr/hdp/2.6.3.0-235/oozie/libtools/xercesImpl-2.10.0.jar

/usr/hdp/2.6.3.0-235/ranger-admin/ews/webapp/WEB-INF/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/slider/lib/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/spark2/jars/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar

/usr/hdp/2.6.3.0-235/zookeeper/lib/xercesMinimal-1.9.6.2.jar

/usr/hdp/share/hst/hst-common/lib/xercesImpl-2.9.1.jar


Now, I can say that the version of xerces are same.


So, what is causing this issue if Xerces version is in sync?


I am very excited to discover the issue :)


Thanks,


On Mon, Jun 18, 2018 at 6:27 PM Till Rohrmann <[hidden email]> wrote:
Could you check which xerces version you have on your classpath? Apparently, it cannot read core-default.xml as Ted pointed out. This might be the root cause for the failure.

Cheers,
Till

On Mon, Jun 18, 2018 at 1:31 PM Garvit Sharma <[hidden email]> wrote:
Hi,

Sorry for the confusion, but the yarn is running on Hadoop version 2.7 only and hence I am using Flink 1.5 Hadoop 2.7 binary.

Below are the details provided by Yarn version command :

Hadoop 2.7.3.2.6.3.0-235
Subversion [hidden email]:hortonworks/hadoop.git -r 45bfd33bba8acadfa0e6024c80981c023b28d454
Compiled by jenkins on 2017-10-30T02:31Z
Compiled with protoc 2.5.0
From source with checksum cd1a4a466ef450f547c279989f3aa3
This command was run using /usr/hdp/2.6.3.0-235/hadoop/hadoop-common-2.7.3.2.6.3.0-235.jar

Please let me know if you have found the resolution to my issue :)

Thanks,


On Mon, Jun 18, 2018 at 4:50 PM Till Rohrmann <[hidden email]> wrote:
Which Hadoop version have you installed? It looks as if Flink has been build with Hadoop 2.7 but I see /usr/hdp/2.6.3.0-235 in the class path. If you want to run Flink on Hadoop 2.6, then try to use the Hadoop free Flink binaries or the one built for Hadoop 2.6.

Cheers,
Till

On Mon, Jun 18, 2018 at 10:48 AM Garvit Sharma <[hidden email]> wrote:
Ok, I have attached the log file.

Please check and let me know.

Thanks,

On Mon, Jun 18, 2018 at 2:07 PM Amit Jain <[hidden email]> wrote:
Hi Gravit,

I think Till is interested to know about classpath details present at the start of JM and TM logs e.g. following logs provide classpath details used by TM in our case.

2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  - --------------------------------------------------------------------------------
2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Starting YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  OS current user: yarn
2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Current Hadoop/Kerberos user: hadoop
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Maximum heap size: 6647 MiBytes
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JAVA_HOME: /usr/lib/jvm/java-openjdk
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Hadoop version: 2.8.3
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  JVM Options:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xms6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Xmx6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -XX:MaxDirectMemorySize=4072m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlogback.configurationFile=file:./logback.xml
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     -Dlog4j.configuration=file:./log4j.properties
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Program Arguments:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     --configDir
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -     .
2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner                  -  Classpath: lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.........

--
Thanks,
Amit

On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <[hidden email]> wrote:
Hi,

Please refer to my previous mail for complete logs.

Thanks,

On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <[hidden email]> wrote:
Could you also please share the complete log file with us.

Cheers,
Till

On Sat, Jun 16, 2018 at 5:22 PM Ted Yu <[hidden email]> wrote:
The error for core-default.xml is interesting. 

Flink doesn't have this file. Probably it came with Yarn. Please check the hadoop version Flink was built with versus the hadoop version in your cluster.

Thanks

-------- Original message --------
From: Garvit Sharma <[hidden email]>
Date: 6/16/18 7:23 AM (GMT-08:00)
Cc: Chesnay Schepler <[hidden email]>, [hidden email]
Subject: Re: Exception while submitting jobs through Yarn

I am not able to figure this out; I have been stuck on it for the last week. Any help would be appreciated.


2018-06-16 19:25:10,523 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 8
2018-06-16 19:25:10,578 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 1
2018-06-16 19:25:10,588 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 1 -> 8
2018-06-16 19:25:10,591 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - Parallelism set: 1 for 5
2018-06-16 19:25:10,597 DEBUG org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  - CONNECTED: KeyGroupStreamPartitioner - 5 -> 8
2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration                          - error parsing conf core-default.xml
javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
	at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
	at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
	at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
	at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
	at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
	at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
	at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
2018-06-16 19:25:10,620 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error while getting queue information from YARN: null
2018-06-16 19:25:10,621 DEBUG org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error details
java.lang.ExceptionInInitializerError
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
	at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
	at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
	at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
	at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
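Both stack traces point at classpath conflicts: the `NoClassDefFoundError` for Jersey in the first mail, and here a Xerces `DocumentBuilderFactoryImpl` that does not recognize the XInclude feature Hadoop's `Configuration` asks for. A hedged way to find which jar ships the conflicting class is to scan the classpath directory; `FLINK_LIB` and the `/opt/flink/lib` default below are placeholders to adapt to your installation.

```shell
# Sketch: scan a lib directory for jars bundling the Xerces parser that
# breaks org.apache.hadoop.conf.Configuration. FLINK_LIB is a placeholder;
# point it at your Flink lib/ (or any directory on the container classpath).
FLINK_LIB="${FLINK_LIB:-/opt/flink/lib}"
pattern='org/apache/xerces/jaxp/DocumentBuilderFactoryImpl'
found=0
for j in "$FLINK_LIB"/*.jar; do
  [ -e "$j" ] || continue                          # glob matched nothing
  if unzip -l "$j" 2>/dev/null | grep -q "$pattern"; then
    echo "bundles Xerces: $j"
    found=1
  fi
done
[ "$found" -eq 1 ] || echo "no jar under $FLINK_LIB bundles $pattern"
```

Any jar it reports (e.g. a fat user jar or an extra dependency copied into lib/) is a candidate to shade or exclude so Hadoop picks up the JDK's own XML parser.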



--

Garvit Sharma
github.com/garvitlnmiit/

Nobody is a scholar by birth; it's only hard work and strong determination that makes him a master.


