Hi,

Could you please list what exactly is in your submitted jar file, for example using "jar tf my-jar-file.jar"? And also which files exactly are in your Flink lib directory.

Best,
Aljoscha

On 19. Dec 2017, at 20:10, shashank agarwal <[hidden email]> wrote:

Hi Timo,

I am using RocksDBStateBackend with an HDFS path. I have the following Flink dependencies in my sbt:

    "org.slf4j" % "slf4j-log4j12" % "1.7.21",
    "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
    "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
    "org.apache.flink" %% "flink-cep-scala" % flinkVersion,
    "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion,
    "org.apache.flink" %% "flink-connector-filesystem" % flinkVersion,
    "org.apache.flink" %% "flink-statebackend-rocksdb" % flinkVersion,
    "org.apache.flink" %% "flink-connector-cassandra" % "1.3.2",
    "org.apache.flink" % "flink-shaded-hadoop2" % flinkVersion,

When I start the Flink YARN session it works fine; it even creates the Flink checkpointing directory and copies the libs into HDFS. But when I submit the application to this YARN session it prints the following logs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*

But the application fails continuously with the logs which I have sent earlier.
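The runtime failure being debugged in this thread (Flink unable to find a file system implementation for the 'hdfs' scheme) comes from scheme-based factory lookup. The following toy Scala sketch — not Flink's actual code; all names in it are illustrative — shows why a scheme whose implementation classes are missing from the runtime classpath only fails when a path with that scheme is first resolved, not at compile time:

```scala
import java.net.URI

// Toy model of scheme-based file system resolution. In Flink the registry is
// populated from classes found on the runtime classpath; here it is a plain
// Map. "hdfs" would only appear if the Hadoop classes were actually loadable.
object SchemeLookup {
  val factories: Map[String, String] = Map("file" -> "LocalFileSystem")

  def fileSystemFor(uri: String): Either[String, String] = {
    // A path without a scheme falls back to the local file system.
    val scheme = Option(new URI(uri).getScheme).getOrElse("file")
    factories
      .get(scheme)
      .toRight(s"Could not find a file system implementation for scheme '$scheme'")
  }
}
```

Resolving a local path succeeds, while an hdfs:// checkpoint path fails with a message shaped like the exception quoted below — which is why the job only breaks once checkpointing touches HDFS.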
I have tried adding the flink-hadoop-compatibility*.jar as suggested by Jorn, but it's not working.

On Tue, Dec 19, 2017 at 5:08 PM, shashank agarwal <[hidden email]> wrote:

Yes, it's working fine now; I am not getting the compile-time error anymore. But when I try to run this on the cluster or on YARN, I get the following runtime error:

org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
    at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:405)
    at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:320)
    at org.apache.flink.core.fs.Path.getFileSystem(Path.java:293)
    at org.apache.flink.runtime.state.filesystem.FsCheckpointStreamFactory.<init>(FsCheckpointStreamFactory.java:99)
    at org.apache.flink.runtime.state.filesystem.FsStateBackend.createStreamFactory(FsStateBackend.java:277)
    at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createStreamFactory(RocksDBStateBackend.java:273)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.createCheckpointStreamFactory(StreamTask.java:787)
    at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:247)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop File System abstraction does not support scheme 'hdfs'. Either no file system implementation exists for that scheme, or the relevant classes are missing from the classpath.
    at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:102)
    at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:401)
    ... 12 more
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786)
    at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:99)
    ... 13 more

While submitting the job it prints the following logs, so I think it is including the Hadoop libs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf
:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*

On Fri, Dec 8, 2017 at 9:24 PM, shashank agarwal <[hidden email]> wrote:

Sure, I'll try that. Thanks.

On Fri, 8 Dec 2017 at 9:18 PM, Stephan Ewen <[hidden email]> wrote:

I would recommend adding "flink-shaded-hadoop2". That is a bundle of all Hadoop dependencies used by Flink.

On Fri, Dec 8, 2017 at 3:44 PM, Aljoscha Krettek <[hidden email]> wrote:

I see, thanks for letting us know!

On 8. Dec 2017, at 15:42, shashank agarwal <[hidden email]> wrote:

I had to include two dependencies:

    hadoop-hdfs (for the HDFS configuration)
    hadoop-common (for Path)
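In sbt, the two dependencies mentioned above might be declared roughly as follows. This is a sketch, not taken from the thread: hadoopVersion is a placeholder that should match the cluster's Hadoop version, and the "provided" scope is one possible choice when the cluster (as the 'hadoop classpath' output above suggests) already supplies the Hadoop jars at runtime:

```scala
// build.sbt sketch — hadoopVersion is a hypothetical placeholder.
// "provided" keeps these jars out of the fat jar so they don't clash with
// the versions the cluster ships; drop the scope if you need them bundled.
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-hdfs"   % hadoopVersion % "provided",
  "org.apache.hadoop" % "hadoop-common" % hadoopVersion % "provided"
)
```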
On Fri, Dec 8, 2017 at 7:38 PM, Aljoscha Krettek <[hidden email]> wrote:

I think hadoop-hdfs might be sufficient.

On 8. Dec 2017, at 14:48, shashank agarwal <[hidden email]> wrote:

Can you guide me specifically on which dependencies I should add to extend this?

On Fri, Dec 8, 2017 at 6:58 PM, shashank agarwal <[hidden email]> wrote:

It's a compilation error. I think I have to include the Hadoop dependencies.

On Fri, Dec 8, 2017 at 6:54 PM, Aljoscha Krettek <[hidden email]> wrote:

Hi,

Is this a compilation error or at runtime? In general, yes, you have to include the Hadoop dependencies if they're not there.

Best,
Aljoscha

On 8. Dec 2017, at 14:10, shashank agarwal <[hidden email]> wrote:

Hello,

I am trying to test 1.4.0-RC3; the Hadoop libraries were removed in this version. I have created a custom Bucketer for the bucketing sink. I am extending org.apache.flink.streaming.connectors.fs.bucketing.Bucketer, and in the class I have to use org.apache.hadoop.fs.Path, but since the Hadoop libraries were removed it gives the error:

"object hadoop is not a member of package org.apache"

Should I include the Hadoop client libs in the build.sbt dependencies?
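A custom Bucketer of the kind described above can be sketched as follows. To keep the sketch self-contained and compilable without Flink or Hadoop on the classpath, toy stand-ins for org.apache.hadoop.fs.Path and the Bucketer trait are defined inline; in a real project those come from hadoop-common and flink-connector-filesystem (which is exactly why the import fails when the Hadoop libs are absent), and the real getBucketPath also receives a clock argument. The Event type and the bucketing rule are purely hypothetical:

```scala
// Toy stand-in for org.apache.hadoop.fs.Path (hadoop-common).
final case class Path(path: String) {
  def suffix(child: String): Path = Path(s"$path/$child")
}

// Toy stand-in for the Bucketer contract (flink-connector-filesystem);
// the real interface also takes a Clock parameter.
trait Bucketer[T] {
  def getBucketPath(basePath: Path, element: T): Path
}

// Hypothetical element type for illustration.
final case class Event(userId: String, payload: String)

// Example Bucketer: write each user's events under their own subdirectory.
class UserIdBucketer extends Bucketer[Event] {
  override def getBucketPath(basePath: Path, element: Event): Path =
    basePath.suffix(element.userId)
}
```

With the real Flink and Hadoop types substituted back in, a class of this shape is what the quoted compile error prevents from building until the Hadoop dependency is on the compile classpath.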
Thanks,
Shashank

--
Thanks Regards
SHASHANK AGARWAL
--- Trying to mobilize the things....