Posted by
sohimankotia on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Flink-Yarn-Deployment-Issue-1-7-0-tp25015.html
Hi,
I have installed Flink 1.7.0 (Hadoop 2.7, Scala 2.11). We are using the
Hortonworks Hadoop distribution (hdp/2.6.1.0-129/).
The Flink lib folder looks like:

-rw-r--r-- 1 hdfs hadoop 93184216 Nov 29 02:15 flink-dist_2.11-1.7.0.jar
-rw-r--r-- 1 hdfs hadoop    79219 Nov 29 03:33 flink-hadoop-compatibility_2.11-1.7.0.jar
-rw-r--r-- 1 hdfs hadoop   141881 Nov 29 02:13 flink-python_2.11-1.7.0.jar
-rw-r--r-- 1 hdfs hadoop   489884 Nov 28 23:01 log4j-1.2.17.jar
-rw-r--r-- 1 hdfs hadoop     9931 Nov 28 23:01 slf4j-log4j12-1.7.15.jar
*My code:*

import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat;
import org.apache.flink.api.java.io.PrintingOutputFormat;
import org.apache.flink.hadoopcompatibility.HadoopInputs;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
String p = args[0];

// Read sequence files recursively under the input path.
Job job = Job.getInstance();
job.getConfiguration().setBoolean(FileInputFormat.INPUT_DIR_RECURSIVE, true);
SequenceFileInputFormat<Text, BytesWritable> inputFormat = new SequenceFileInputFormat<>();

final HadoopInputFormat<Text, BytesWritable> hInputEvents =
        HadoopInputs.readHadoopFile(inputFormat, Text.class, BytesWritable.class, p, job);

env.createInput(hInputEvents)
   .output(new PrintingOutputFormat<>());
env.execute("sample"); // without this the output sink never runs
pom.xml (flink.version = 1.7.0):
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-java</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients_2.11</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-hadoop-compatibility_2.11</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-shaded-hadoop2</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>
In the launch script:

export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH="/usr/hdp/2.6.1.0-129/hadoop/hadoop-*":`hadoop classpath`
echo ${HADOOP_CLASSPATH}
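One thing that may be worth checking here (my observation, not something confirmed for this setup): the JVM only expands a classpath wildcard when the entry is exactly `dir/*`; a prefix pattern such as `.../hadoop/hadoop-*` stays a literal string and matches no jars. A stand-alone sketch of the difference, using a temp dir in place of /usr/hdp/2.6.1.0-129/hadoop:

```shell
# Demonstrates that a quoted prefix glob stays literal, while letting the
# shell expand it yields real jar paths. The temp dir and jar names are
# hypothetical stand-ins for the HDP hadoop directory.
DIR=$(mktemp -d)
touch "$DIR/hadoop-common.jar" "$DIR/hadoop-hdfs.jar"

LITERAL="$DIR/hadoop-*"                        # one entry still containing '*'
EXPANDED=$(echo "$DIR"/hadoop-*.jar | tr ' ' ':')  # expanded by the shell

echo "literal:  $LITERAL"
echo "expanded: $EXPANDED"
```

If the literal form is what ends up in HADOOP_CLASSPATH, the `hadoop classpath` part after the colon still covers the same jars, so this may be harmless, but it is easy to rule out.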
PARALLELISM=1
JAR_PATH="jar"
CLASS_NAME="CLASS_NAME"
NODES=1
SLOTS=1
MEMORY_PER_NODE=2048
QUEUE="default"
NAME="sample"
IN="input-file-path"

/home/hdfs/flink-1.7.0/bin/flink run -m yarn-cluster \
  -yn ${NODES} -yqu ${QUEUE} -ys ${SLOTS} -ytm ${MEMORY_PER_NODE} \
  --parallelism ${PARALLELISM} -ynm ${NAME} -c ${CLASS_NAME} ${JAR_PATH} ${IN}
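Not related to the error itself, but a small guard that can be added before the `flink run` call (a sketch, reusing the placeholder values from the script above): fail fast if any launch variable is empty, since the CLI's own message for a missing main class or jar path can be confusing.

```shell
# Pre-flight check (sketch): abort before calling `flink run` if any
# required launch variable is unset or empty. Values are the same
# placeholders used in the script above.
PARALLELISM=1; JAR_PATH="jar"; CLASS_NAME="CLASS_NAME"
NODES=1; SLOTS=1; MEMORY_PER_NODE=2048; QUEUE="default"
NAME="sample"; IN="input-file-path"

for var in PARALLELISM JAR_PATH CLASS_NAME NODES SLOTS MEMORY_PER_NODE QUEUE NAME IN; do
  eval "val=\${$var}"
  if [ -z "$val" ]; then
    echo "missing launch variable: $var" >&2
    exit 1
  fi
done
echo "pre-flight OK"
```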
The classpath echo prints:

/usr/hdp/2.6.1.0-129/hadoop/hadoop-*:/usr/hdp/2.6.1.0-129/hadoop/conf:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop/.//*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/./:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/.//*:/usr/hdp/2.6.1.0-129/hadoop-yarn/lib/*:/usr/hdp/2.6.1.0-129/hadoop-yarn/.//*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/.//*:/usr/hdp/2.6.1.0-129/hadoop/conf:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop/.//*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/./:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/.//*:/usr/hdp/2.6.1.0-129/hadoop-yarn/lib/*:/usr/hdp/2.6.1.0-129/hadoop-yarn/.//*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/.//*::mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.6.1.0-129/tez/*:/usr/hdp/2.6.1.0-129/tez/lib/*:/usr/hdp/2.6.1.0-129/tez/conf:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.6.1.0-129/tez/*:/usr/hdp/2.6.1.0-129/tez/lib/*:/usr/hdp/2.6.1.0-129/tez/conf
But I am getting a class-not-found error for Hadoop-related classes. The
error is attached: error.txt
Another problem: if I add the shaded Hadoop jar to the lib folder:

-rw-r--r-- 1 hdfs hadoop 93184216 Nov 29 02:15 flink-dist_2.11-1.7.0.jar
-rw-r--r-- 1 hdfs hadoop    79219 Nov 29 03:33 flink-hadoop-compatibility_2.11-1.7.0.jar
-rw-r--r-- 1 hdfs hadoop   141881 Nov 29 02:13 flink-python_2.11-1.7.0.jar
*-rw-r--r-- 1 hdfs hadoop 41130742 Dec  8 22:38 flink-shaded-hadoop2-uber-1.7.0.jar*
-rw-r--r-- 1 hdfs hadoop   489884 Nov 28 23:01 log4j-1.2.17.jar
-rw-r--r-- 1 hdfs hadoop     9931 Nov 28 23:01 slf4j-log4j12-1.7.15.jar
then I get the following error, and this happens with every version greater
than 1.4.2:
java.lang.IllegalAccessError: tried to access method org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider.getProxyInternal()Ljava/lang/Object; from class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider
    at org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider.init(RequestHedgingRMFailoverProxyProvider.java:75)
    at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:163)
    at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:94)
    at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:187)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:985)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:273)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:451)
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:96)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:224)
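For what it's worth, an IllegalAccessError like this usually means the same YARN client classes are being loaded from two different Hadoop builds at once; my guess (not confirmed) is HDP's own jars from `hadoop classpath` mixed with the vanilla Hadoop 2.7 classes inside the shaded uber jar. One quick way to see every jar that bundles the conflicting class is to grep the raw jar bytes for the class name, since zip entry names are stored uncompressed. Sketched below with throwaway files standing in for real jars:

```shell
# Locate every "jar" that contains the conflicting class name via grep -l
# on the raw bytes (works on real jars because entry names appear as plain
# strings in the zip directory). Temp files stand in for the real
# /home/hdfs/flink-1.7.0/lib contents here.
LIB=$(mktemp -d)
printf 'org/apache/hadoop/yarn/client/RequestHedgingRMFailoverProxyProvider' \
  > "$LIB/flink-shaded-hadoop2-uber-1.7.0.jar"
printf 'unrelated contents' > "$LIB/flink-dist_2.11-1.7.0.jar"

grep -l 'RequestHedgingRMFailoverProxyProvider' "$LIB"/*.jar
```

Running the same grep over both the Flink lib folder and the HDP jar directories would show whether the class really exists in more than one place.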
Thanks in advance.
--
Sent from:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/