I start yarn-session.sh with sudo and then the flink run command with sudo. FlinkMain.java:70 is:
2015-06-04 17:17 GMT+02:00 Pa Rö <[hidden email]>:
Sorry, I see my YARN session ends before I can run my app. I have to set write access for yarn; maybe that solves my problem.
2015-06-04 17:33 GMT+02:00 Pa Rö <[hidden email]>:
I have changed the permissions from the cloudera user and tried the following commands. I get the same exception again; maybe the problem has another cause? And the files exist on HDFS ;) I set the files in my properties file like "flink.output=/user/cloudera/outputs/output_flink".

[cloudera@quickstart bin]$ sudo su yarn
bash-4.1$ hadoop fs -chmod 777 /user/cloudera/inputs
bash-4.1$ hadoop fs -chmod 777 /user/cloudera/outputs
bash-4.1$ exit
exit
[cloudera@quickstart bin]$ sudo ./flink run /home/cloudera/Desktop/ma-flink.jar
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Found YARN properties file /home/cloudera/Desktop/flink-0.9-SNAPSHOT/bin/../conf/.yarn-properties
Using JobManager address from YARN properties quickstart.cloudera/127.0.0.1:52601
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Failed to submit job fe78e9ee50cf76ac8b487919e1c951fa (KMeans Flink)
    at org.apache.flink.client.program.Client.run(Client.java:412)
    at org.apache.flink.client.program.Client.run(Client.java:355)
    at org.apache.flink.client.program.Client.run(Client.java:348)
    at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:63)
    at mgm.tp.bigdata.ma_flink.FlinkMain.main(FlinkMain.java:70)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:437)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:353)
    at org.apache.flink.client.program.Client.run(Client.java:315)
    at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:584)
    at org.apache.flink.client.CliFrontend.run(CliFrontend.java:290)
    at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:880)
    at org.apache.flink.client.CliFrontend.main(CliFrontend.java:922)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Failed to submit job fe78e9ee50cf76ac8b487919e1c951fa (KMeans Flink)
    at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:595)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:192)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at org.apache.flink.yarn.ApplicationMasterActor$$anonfun$receiveYarnMessages$1.applyOrElse(ApplicationMasterActor.scala:99)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:36)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:29)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:29)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:94)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
    at akka.dispatch.Mailbox.run(Mailbox.scala:221)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: File /user/cloudera/inputs does not exist or the user running Flink ('yarn') has insufficient permissions to access it.
    at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:162)
    at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:471)
    at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:535)
    ... 21 more
Caused by: java.io.FileNotFoundException: File /user/cloudera/inputs does not exist or the user running Flink ('yarn') has insufficient permissions to access it.
    at org.apache.flink.core.fs.local.LocalFileSystem.getFileStatus(LocalFileSystem.java:106)
    at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:390)
    at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:51)
    at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:146)
    ... 23 more

2015-06-04 17:38 GMT+02:00 Pa Rö <[hidden email]>:
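Note that the innermost frame in the trace above is org.apache.flink.core.fs.local.LocalFileSystem.getFileStatus, which suggests the scheme-less path /user/cloudera/inputs is being resolved against the local file system of the machine rather than against HDFS. Below is a minimal sketch, not the poster's FlinkMain, of passing fully qualified hdfs:// URIs to the Flink 0.9 DataSet API; the namenode address quickstart.cloudera:8020 and the file name points.csv are assumptions based on the quickstart VM defaults.

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class HdfsPathSketch {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        // Fully qualified HDFS URIs (host and file name are assumptions, not from the thread).
        DataSet<String> points =
                env.readTextFile("hdfs://quickstart.cloudera:8020/user/cloudera/inputs/points.csv");
        points.writeAsText("hdfs://quickstart.cloudera:8020/user/cloudera/outputs/output_flink");
        env.execute("KMeans Flink");
    }
}

With an explicit hdfs:// scheme the path can no longer silently fall back to the local file system, so a genuine permission problem on HDFS would then show up as such.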
Here is my main class: maybe I can't use the following for HDFS?
2015-06-04 17:53 GMT+02:00 Pa Rö <[hidden email]>:
No, the permissions are still not correct; otherwise Flink would not complain. The error message from Flink is actually pretty precise: "Caused by: java.io.FileNotFoundException: File /user/cloudera/inputs does not exist or the user running Flink ('yarn') has insufficient permissions to access it." Does the file exist, and does the user "yarn" have permission to access it?
On Thu, Jun 4, 2015 at 5:57 PM, Pa Rö <[hidden email]> wrote:
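One way to answer those two questions programmatically is the Hadoop FileSystem API. This is only a sketch; the namenode address quickstart.cloudera:8020 is assumed from the quickstart VM defaults, not taken from the thread.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PermissionCheck {
    public static void main(String[] args) throws Exception {
        // Connect to the (assumed) namenode and inspect the input directory.
        FileSystem fs = FileSystem.get(URI.create("hdfs://quickstart.cloudera:8020"), new Configuration());
        FileStatus status = fs.getFileStatus(new Path("/user/cloudera/inputs"));
        // Prints owner, group and permission bits; throws FileNotFoundException if the path is missing.
        System.out.println(status.getOwner() + " " + status.getGroup() + " " + status.getPermission());
    }
}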
Hi Robert,

I saw that you wrote to me on Stack Overflow, thanks. Now the path is right and I get the old exception:

org.apache.flink.runtime.JobException: Creating the input splits caused an error: File file:/127.0.0.1:8020/home/user/cloudera/outputs/seed-1 does not exist or the user running Flink ('yarn') has insufficient permissions to access it.

[cloudera@quickstart bin]$ hdfs dfs -ls
Found 9 items
drwxrwxrwt   - cloudera cloudera          0 2015-06-03 04:24 .Trash
drwxrwxrwt   - cloudera cloudera          0 2015-06-08 01:17 .flink
drwxrwxrwt   - cloudera cloudera          0 2015-06-04 06:51 .staging
drwxrwxrwt   - cloudera cloudera          0 2015-02-17 08:33 gdelt
drwxrwxrwt   - cloudera cloudera          0 2015-06-02 06:42 inputs
-rwxrwxrwt   1 cloudera cloudera   31223141 2015-06-03 03:53 ma-mahout.jar
-rwxrwxrwt   1 cloudera cloudera   30037418 2015-06-03 03:53 ma-mapreduce.jar
drwxrwxrwt   - cloudera cloudera          0 2015-06-04 07:38 oozie-oozi
drwxrwxrwt   - cloudera cloudera          0 2015-06-03 03:59 outputs
[cloudera@quickstart bin]$ sudo hdfs dfs -chown -R yarn:hadoop inputs
chown: `inputs': No such file or directory
[cloudera@quickstart bin]$ sudo hdfs dfs -chown -R yarn:hadoop outputs
chown: `outputs': No such file or directory

Helpful: https://hadoop.apache.org/docs/r2.4.1/hadoop-project-dist/hadoop-common/FileSystemShell.html

I am doing something wrong; maybe you have an idea?
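A likely reason the chown calls fail, although it is not confirmed in the thread: sudo runs the hdfs command as root, and a relative HDFS path such as inputs is resolved against the calling user's home directory (/user/root), which does not exist; the absolute form /user/cloudera/inputs avoids that. A small sketch of the difference, with the namenode address assumed from the quickstart VM defaults:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RelativePathCheck {
    public static void main(String[] args) throws Exception {
        // Assumption: the quickstart VM's namenode listens on quickstart.cloudera:8020.
        FileSystem fs = FileSystem.get(URI.create("hdfs://quickstart.cloudera:8020"), new Configuration());
        // Relative HDFS paths resolve against the calling user's home directory,
        // so "inputs" means /user/root/inputs when run as root but
        // /user/cloudera/inputs when run as the cloudera user.
        System.out.println("home directory: " + fs.getHomeDirectory());
        System.out.println("relative 'inputs' exists: " + fs.exists(new Path("inputs")));
        System.out.println("absolute '/user/cloudera/inputs' exists: " + fs.exists(new Path("/user/cloudera/inputs")));
    }
}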
I assume that the path
On Mon, Jun 8, 2015 at 11:23 AM Pa Rö <[hidden email]> wrote:
It works, now I have set the permissions for the yarn user, but my Flink app does not find the path. I tried the following path and get the same exception: file:///127.0.0.1:8020/user/cloudera/inputs/
How must I set the path to HDFS?
2015-06-08 11:38 GMT+02:00 Till Rohrmann <[hidden email]>:
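For reference, an HDFS location is written as hdfs://<namenode-host>:<port>/<absolute path>; file:///127.0.0.1:8020/... combines the local-file scheme with a namenode address, which is why the earlier exception still came from the local file system. Below is a sketch of supplying such URIs through the properties file mentioned earlier in the thread; the file name ma-flink.properties and the flink.input key are assumptions, only flink.output appears in the thread.

import java.io.FileInputStream;
import java.util.Properties;

public class JobPaths {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // File name is an assumption; the thread only shows the property values.
        try (FileInputStream in = new FileInputStream("ma-flink.properties")) {
            props.load(in);
        }
        // Expected form: hdfs://<namenode-host>:<port>/<absolute path>, e.g.
        //   flink.input=hdfs://quickstart.cloudera:8020/user/cloudera/inputs
        //   flink.output=hdfs://quickstart.cloudera:8020/user/cloudera/outputs/output_flink
        String input = props.getProperty("flink.input");
        String output = props.getProperty("flink.output");
        System.out.println(input + " -> " + output);
    }
}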
On Mon, Jun 8, 2015 at 12:41 PM Pa Rö <[hidden email]> wrote: