Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
Hello,

I am trying to test 1.4.0-RC3; the Hadoop libraries were removed in this version. I have created a custom Bucketer for the bucketing sink, extending

org.apache.flink.streaming.connectors.fs.bucketing.Bucketer

In that class I have to use org.apache.hadoop.fs.Path, but since the Hadoop libraries were removed, it gives the error

"object hadoop is not a member of package org.apache"

Do I have to include the Hadoop client libs in my build.sbt dependencies?

Thanks
Shashank
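
For reference, a custom Bucketer along these lines is what makes the Hadoop import necessary — a minimal sketch, assuming the Flink 1.4 Bucketer interface; the Event type and the bucketing rule are purely illustrative:

```scala
import org.apache.flink.streaming.connectors.fs.Clock
import org.apache.flink.streaming.connectors.fs.bucketing.Bucketer
// Path lives in hadoop-common, so compiling this requires a Hadoop
// dependency on the classpath in Flink 1.4.
import org.apache.hadoop.fs.Path

// Hypothetical event type, for illustration only.
case class Event(id: String)

class IdBucketer extends Bucketer[Event] {
  // Put each element into a sub-directory of the base path named after its id.
  override def getBucketPath(clock: Clock, basePath: Path, element: Event): Path =
    new Path(basePath, element.id)
}
```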

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Aljoscha Krettek
Hi,

Is this a compilation error or a runtime error? In general, yes, you have to include the Hadoop dependencies if they're not there.

Best,
Aljoscha



Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
It's a compilation error. I think I have to include the Hadoop dependencies.




--
Thanks Regards

SHASHANK AGARWAL
 ---  Trying to mobilize the things....


Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
Can you tell me specifically which dependencies I should add in order to extend this?




Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Aljoscha Krettek
I think hadoop-hdfs might be sufficient.



Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
I had to include two dependencies:

hadoop-hdfs (for the HDFS configuration)
hadoop-common (for Path)
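
In build.sbt, that looks roughly like this — a sketch; the Hadoop version here is an assumption and should match your cluster:

```scala
// build.sbt fragment (sketch). The version number is an assumption;
// use the Hadoop version your cluster actually runs.
val hadoopVersion = "2.7.3"

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % hadoopVersion, // org.apache.hadoop.fs.Path
  "org.apache.hadoop" % "hadoop-hdfs"   % hadoopVersion  // HDFS configuration classes
)
```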





Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Aljoscha Krettek
I see, thanks for letting us know!



Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Stephan Ewen
I would recommend adding "flink-shaded-hadoop2". That is a bundle of all the Hadoop dependencies used by Flink.
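
In build.sbt that would be a single dependency — a sketch, assuming the artifact version tracks the Flink release you are testing:

```scala
// build.sbt fragment (sketch): flink-shaded-hadoop2 bundles the Hadoop
// dependencies that Flink itself was built against. The version is an
// assumption; match it to your Flink release.
libraryDependencies += "org.apache.flink" % "flink-shaded-hadoop2" % "1.4.0"
```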




Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
Sure, I'll try that. Thanks.

--
Sent from iPhone 5

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
Yes, it's working fine now; I'm no longer getting the compile-time error.

But when I try to run this on the cluster or on YARN, I get the following runtime error:

org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:405)
	at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:320)
	at org.apache.flink.core.fs.Path.getFileSystem(Path.java:293)
	at org.apache.flink.runtime.state.filesystem.FsCheckpointStreamFactory.<init>(FsCheckpointStreamFactory.java:99)
	at org.apache.flink.runtime.state.filesystem.FsStateBackend.createStreamFactory(FsStateBackend.java:277)
	at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createStreamFactory(RocksDBStateBackend.java:273)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.createCheckpointStreamFactory(StreamTask.java:787)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:247)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop File System abstraction does not support scheme 'hdfs'. Either no file system implementation exists for that scheme, or the relevant classes are missing from the classpath.
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:102)
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:401)
	... 12 more
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786)
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:99)
	... 13 more




While submitting the job it prints the following log, so I think it is including the Hadoop libs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*



Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Timo Walther
Hi Shashank,

it seems that HDFS is still not on the classpath. Could you quickly explain how I can reproduce the error?

Regards,
Timo




Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Jörn Franke
You need to put flink-hadoop-compatibility*.jar in the lib folder of your Flink distribution or on the classpath of your cluster nodes.
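
For example, something along these lines on each node — a sketch; the jar name pattern and the FLINK_HOME location are assumptions that depend on your installation:

```shell
# Copy the compatibility jar into Flink's lib folder so the 'hdfs' scheme
# can be resolved at runtime (paths and jar names are examples).
cp flink-hadoop-compatibility*.jar "$FLINK_HOME/lib/"
# Restart the cluster so the new jar is picked up.
"$FLINK_HOME/bin/stop-cluster.sh" && "$FLINK_HOME/bin/start-cluster.sh"
```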


Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
I have tried adding this both to the lib folder of Flink and to the assembly jar as a dependency, but I am getting the same error.




On Tue, Dec 19, 2017 at 11:28 PM, Jörn Franke <[hidden email]> wrote:
You need to put flink-hadoop-compatibility*.jar in the lib folder of your Flink distribution or on the classpath of your cluster nodes.

On 19. Dec 2017, at 12:38, shashank agarwal <[hidden email]> wrote:

Yes, it's working fine now; I no longer get the compile-time error.

But when I try to run this on the cluster or on YARN, I get the following runtime error:

org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:405)
	at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:320)
	at org.apache.flink.core.fs.Path.getFileSystem(Path.java:293)
	at org.apache.flink.runtime.state.filesystem.FsCheckpointStreamFactory.<init>(FsCheckpointStreamFactory.java:99)
	at org.apache.flink.runtime.state.filesystem.FsStateBackend.createStreamFactory(FsStateBackend.java:277)
	at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createStreamFactory(RocksDBStateBackend.java:273)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.createCheckpointStreamFactory(StreamTask.java:787)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:247)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop File System abstraction does not support scheme 'hdfs'. Either no file system implementation exists for that scheme, or the relevant classes are missing from the classpath.
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:102)
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:401)
	... 12 more
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786)
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:99)
	... 13 more
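The root cause ("No FileSystem for scheme: hdfs") is thrown by Hadoop's own FileSystem lookup, not by Flink. A quick standalone check of that lookup (a sketch; the namenode address is a placeholder) fails with the same exception whenever the hadoop-hdfs classes are missing from the classpath:

```scala
import java.net.URI

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem

// Throws "java.io.IOException: No FileSystem for scheme: hdfs" unless
// hadoop-hdfs (DistributedFileSystem) is visible on the classpath.
object HdfsSchemeCheck {
  def main(args: Array[String]): Unit = {
    val fs = FileSystem.get(new URI("hdfs://namenode-host:8020/"), new Configuration())
    println(fs.getClass.getName)
  }
}
```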




While submitting the job it prints the following log line, so I think it is including the Hadoop libs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*

On Fri, Dec 8, 2017 at 9:24 PM, shashank agarwal <[hidden email]> wrote:
Sure i’ll Try that. Thanks


Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
In reply to this post by shashank734
Hi Timo,

I am using the RocksDB state backend with an HDFS path. I have the following Flink dependencies in my sbt build:

"org.slf4j" % "slf4j-log4j12" % "1.7.21",
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-cep-scala" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion,
  "org.apache.flink" %% "flink-connector-filesystem" % flinkVersion,
  "org.apache.flink" %% "flink-statebackend-rocksdb" % flinkVersion,
  "org.apache.flink" %% "flink-connector-cassandra" % "1.3.2",
  "org.apache.flink" % "flink-shaded-hadoop2" % flinkVersion,

When I start the Flink YARN session it works fine; it even creates the Flink checkpointing directory and copies the libs into HDFS.

But when I submit the application to this YARN session it prints the following logs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*
But the application fails continuously with the logs I sent earlier.


I have tried to add flink-hadoop-compatibility*.jar as suggested by Jörn, but it's not working.
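One thing that may be worth checking (an assumption on my side, not something confirmed in this thread): if flink-shaded-hadoop2 ends up inside the assembly jar it can conflict with the Hadoop classes already on the cluster classpath. Scoping it as provided keeps it out of the fat jar:

```scala
// Sketch: compile against Flink's shaded Hadoop bundle, but rely on the
// cluster classpath at runtime (flinkVersion assumed defined).
"org.apache.flink" % "flink-shaded-hadoop2" % flinkVersion % "provided"
```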




--
Thanks Regards

SHASHANK AGARWAL
 ---  Trying to mobilize the things....


Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Aljoscha Krettek
Hi,

Could you please list what exactly is in your submitted jar file, for example using "jar tf my-jar-file.jar"? And also what files exactly are in your Flink lib directory.

Best,
Aljoscha



Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
Hi,

Please find attached the list of the jar file contents and the flink/lib/ contents. (I have removed my own class files from the jar listing.) I later added flink-hadoop-compatibility_2.11-1.4.0.jar to flink/lib/, but with no success.

I have also tried removing flink-shaded-hadoop2 from my project, but still no success.


Thanks
Shashank



flinklib_content.txt (238 bytes) Download Attachment
jar_content.txt (3M) Download Attachment

Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
One more thing: when I submit the job or start the YARN session, it prints the following logs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flink/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]


So I think it's adding the Hadoop libs to the classpath too, since it's able to create the checkpointing directories from the flink-conf file on HDFS.
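For reference, a hedged sketch of the setup that produces the "Using the result of 'hadoop classpath'" log line above: Flink 1.4 no longer bundles Hadoop, so the cluster's Hadoop jars are picked up from the environment. The flags and paths below are illustrative placeholders, not taken from this cluster:

```shell
# Illustrative only: make the cluster's Hadoop jars visible to Flink 1.4,
# which no longer bundles Hadoop itself. 'hadoop' must be on the PATH.
export HADOOP_CLASSPATH=$(hadoop classpath)

# Then start the yarn session as usual; the flags here are placeholders.
./bin/yarn-session.sh -n 2 -jm 1024 -tm 2048
```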







On Wed, Dec 20, 2017 at 2:31 PM, shashank agarwal <[hidden email]> wrote:
Hi,

Please find attached the list of jar file contents and the flink/lib/ contents. I have removed my class files from the jar list, and I added flink-hadoop-compatibility_2.11-1.4.0.jar to flink/lib/ afterwards, but no success.

I have also tried removing flink-shaded-hadoop2 from my project, but still no success.


Thanks
Shashank


On Wed, Dec 20, 2017 at 2:14 PM, Aljoscha Krettek <[hidden email]> wrote:
Hi,

Could you please list what exactly is in your submitted jar file, for example using "jar tf my-jar-file.jar"? And also, what files exactly are in your Flink lib directory?

Best,
Aljoscha


On 19. Dec 2017, at 20:10, shashank agarwal <[hidden email]> wrote:

Hi Timo,

I am using the RocksDB state backend with an HDFS path. I have the following Flink dependencies in my sbt:

"org.slf4j" % "slf4j-log4j12" % "1.7.21",
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-cep-scala" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion,
  "org.apache.flink" %% "flink-connector-filesystem" % flinkVersion,
  "org.apache.flink" %% "flink-statebackend-rocksdb" % flinkVersion,
  "org.apache.flink" %% "flink-connector-cassandra" % "1.3.2",
  "org.apache.flink" % "flink-shaded-hadoop2" % flinkVersion,

When I start the Flink yarn session it works fine; it even creates the Flink checkpointing directory and copies the libs into HDFS.

But when I submit the application to this yarn session, it prints the following logs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*
But the application fails continuously with the logs I have sent earlier.


I have tried adding flink-hadoop-compatibility*.jar as suggested by Jorn, but it's not working.



On Tue, Dec 19, 2017 at 5:08 PM, shashank agarwal <[hidden email]> wrote:
Yes, it's working fine; I'm no longer getting the compile-time error.

But when I try to run this on the cluster or YARN, I get the following runtime error:

org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:405)
	at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:320)
	at org.apache.flink.core.fs.Path.getFileSystem(Path.java:293)
	at org.apache.flink.runtime.state.filesystem.FsCheckpointStreamFactory.<init>(FsCheckpointStreamFactory.java:99)
	at org.apache.flink.runtime.state.filesystem.FsStateBackend.createStreamFactory(FsStateBackend.java:277)
	at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createStreamFactory(RocksDBStateBackend.java:273)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.createCheckpointStreamFactory(StreamTask.java:787)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:247)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop File System abstraction does not support scheme 'hdfs'. Either no file system implementation exists for that scheme, or the relevant classes are missing from the classpath.
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:102)
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:401)
	... 12 more
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786)
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:99)
	... 13 more




While submitting the job it prints the following logs, so I think it's including the Hadoop libs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*
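One plausible cause worth checking, offered here as an assumption rather than a confirmed diagnosis: Hadoop resolves the "hdfs" scheme through service files under META-INF/services, and a fat-jar build can overwrite those files when several jars ship the same one, which produces exactly "java.io.IOException: No FileSystem for scheme: hdfs". If the jar is built with sbt-assembly, a merge strategy along these lines keeps the service registrations:

```scala
// Hypothetical build.sbt fragment (assumes the sbt-assembly plugin is used).
// Merge the distinct lines of duplicated service files instead of keeping
// only one copy, so the hdfs FileSystem registration survives assembly.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "services", _*) => MergeStrategy.filterDistinctLines
  case x =>
    // Fall back to the default strategy for everything else.
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```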

On Fri, Dec 8, 2017 at 9:24 PM, shashank agarwal <[hidden email]> wrote:
Sure, I'll try that. Thanks

On Fri, 8 Dec 2017 at 9:18 PM, Stephan Ewen <[hidden email]> wrote:
I would recommend adding "flink-shaded-hadoop2". That is a bundle of all the Hadoop dependencies used by Flink.


On Fri, Dec 8, 2017 at 3:44 PM, Aljoscha Krettek <[hidden email]> wrote:
I see, thanks for letting us know!


On 8. Dec 2017, at 15:42, shashank agarwal <[hidden email]> wrote:

I had to include two dependencies:

hadoop-hdfs (for the HDFS configuration)
hadoop-common (for Path)



On Fri, Dec 8, 2017 at 7:38 PM, Aljoscha Krettek <[hidden email]> wrote:
I think hadoop-hdfs might be sufficient.


On 8. Dec 2017, at 14:48, shashank agarwal <[hidden email]> wrote:

Can you guide specifically which dependencies I should add to extend this:




Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Aljoscha Krettek
Hi,

That jar file looks like it has too much stuff in it that shouldn't be there. This can explain the errors you're seeing, because of classloading conflicts.

Could you try not building a fat jar, and having only your code in your jar?

Best,
Aljoscha



Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

shashank734
Hi,

In that case, it won't find the dependencies, because I have other dependencies as well. And what about CEP etc.? That is not part of flink-dist.

Best
Shashank






Re: Can not resolve org.apache.hadoop.fs.Path in 1.4.0

Timo Walther
Libraries such as CEP or the Table API should have the "compile" scope and should be in both the fat and non-fat jars.

The non-fat jar should contain everything that is not in flink-dist or your lib directory.

Regards,
Timo
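A sketch of how this advice maps onto build.sbt (the scopes shown are the usual convention, adjust to your setup; flinkVersion is illustrative):

```scala
// build.sbt sketch: "provided" = already in flink-dist / lib on the cluster,
// default ("compile") scope = must be packaged into the job jar
val flinkVersion = "1.4.0"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"                % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala"      % flinkVersion % "provided",
  "org.apache.flink" %% "flink-cep-scala"            % flinkVersion,  // not in flink-dist: ship it
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion   // connectors: ship them
)
```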


On 12/20/17 at 3:07 PM, shashank agarwal wrote:
Hi,

In that case it won't find the dependencies, since I have other dependencies as well. And what about CEP etc.? That is not part of flink-dist.

Best
Shashank




On Wed, Dec 20, 2017 at 3:16 PM, Aljoscha Krettek <[hidden email]> wrote:
Hi,

That jar file looks like it has too much stuff in it that shouldn't be there. This can explain the errors you're seeing, because of classloading conflicts.

Could you try not building a fat jar and having only your code in your jar?

Best,
Aljoscha


On 20. Dec 2017, at 10:15, shashank agarwal <[hidden email]> wrote:

One more thing: when I submit the job or start the yarn session, it prints the following logs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flink/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]


So I think it's adding the Hadoop libs to the classpath too, since it's able to create the checkpointing directories (configured in flink-conf) on HDFS.







On Wed, Dec 20, 2017 at 2:31 PM, shashank agarwal <[hidden email]> wrote:
Hi,

Please find attached the list of the jar file contents and the flink/lib/ contents. I have removed my own class files from the jar list. I also added flink-hadoop-compatibility_2.11-1.4.0.jar to flink/lib/ later, but with no success.

I have also tried removing flink-shaded-hadoop2 from my project, still with no success.


Thanks
Shashank


On Wed, Dec 20, 2017 at 2:14 PM, Aljoscha Krettek <[hidden email]> wrote:
Hi,

Could you please list what exactly is in your submitted jar file, for example using "jar tf my-jar-file.jar"? And also which files exactly are in your Flink lib directory?

Best,
Aljoscha


On 19. Dec 2017, at 20:10, shashank agarwal <[hidden email]> wrote:

Hi Timo,

I am using the RocksDB state backend with an HDFS path. I have the following Flink dependencies in my sbt:

"org.slf4j" % "slf4j-log4j12" % "1.7.21",
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-cep-scala" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion,
  "org.apache.flink" %% "flink-connector-filesystem" % flinkVersion,
  "org.apache.flink" %% "flink-statebackend-rocksdb" % flinkVersion,
  "org.apache.flink" %% "flink-connector-cassandra" % "1.3.2",
  "org.apache.flink" % "flink-shaded-hadoop2" % flinkVersion,

When I start the Flink yarn session it works fine; it even creates the Flink checkpointing directory and copies the libs into HDFS.

But when I submit the application to this yarn session, it prints the following logs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*
But the application fails continuously with the logs which I sent earlier.


I have tried to add flink-hadoop-compatibility*.jar as suggested by Jorn, but it's not working.



On Tue, Dec 19, 2017 at 5:08 PM, shashank agarwal <[hidden email]> wrote:
Yes, it's working fine now; I'm not getting the compile-time error anymore.

But when I try to run this on the cluster or yarn, I get the following runtime error:

org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:405)
	at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:320)
	at org.apache.flink.core.fs.Path.getFileSystem(Path.java:293)
	at org.apache.flink.runtime.state.filesystem.FsCheckpointStreamFactory.<init>(FsCheckpointStreamFactory.java:99)
	at org.apache.flink.runtime.state.filesystem.FsStateBackend.createStreamFactory(FsStateBackend.java:277)
	at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createStreamFactory(RocksDBStateBackend.java:273)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.createCheckpointStreamFactory(StreamTask.java:787)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:247)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop File System abstraction does not support scheme 'hdfs'. Either no file system implementation exists for that scheme, or the relevant classes are missing from the classpath.
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:102)
	at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:401)
	... 12 more
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786)
	at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:99)
	... 13 more

While submitting the job it prints the following logs, so I think it is including the Hadoop libs:

Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*

On Fri, Dec 8, 2017 at 9:24 PM, shashank agarwal <[hidden email]> wrote:
Sure, I'll try that. Thanks

On Fri, 8 Dec 2017 at 9:18 PM, Stephan Ewen <[hidden email]> wrote:
I would recommend to add "flink-shaded-hadoop2". That is a bundle of all Hadoop dependencies used by Flink.


On Fri, Dec 8, 2017 at 3:44 PM, Aljoscha Krettek <[hidden email]> wrote:
I see, thanks for letting us know!


On 8. Dec 2017, at 15:42, shashank agarwal <[hidden email]> wrote:

I had to include two dependencies.

hadoop-hdfs (this for HDFS configuration) 
hadoop-common (this for Path)
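For reference, the two dependencies above might look like this in build.sbt (a sketch: the version is illustrative and should match your cluster's Hadoop, and the "provided" scope assumes the cluster exposes Hadoop via 'hadoop classpath'):

```scala
// build.sbt fragment: version and scope are assumptions, match your cluster
val hadoopVersion = "2.7.3"

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % hadoopVersion % "provided", // brings org.apache.hadoop.fs.Path
  "org.apache.hadoop" % "hadoop-hdfs"   % hadoopVersion % "provided"  // HDFS filesystem / config
)
```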



On Fri, Dec 8, 2017 at 7:38 PM, Aljoscha Krettek <[hidden email]> wrote:
I think hadoop-hdfs might be sufficient.


On 8. Dec 2017, at 14:48, shashank agarwal <[hidden email]> wrote:

Can you guide me specifically on which dependencies I should add to extend this:


On Fri, Dec 8, 2017 at 6:58 PM, shashank agarwal <[hidden email]> wrote:
It's a compilation error. I think I have to include the Hadoop dependencies.




On Fri, Dec 8, 2017 at 6:54 PM, Aljoscha Krettek <[hidden email]> wrote:
Hi,

Is this a compilation error or at runtime. But in general, yes you have to include the Hadoop dependencies if they're not there.

Best,
Aljoscha


On 8. Dec 2017, at 14:10, shashank agarwal <[hidden email]> wrote:

Hello,

I am trying to test 1.4.0-RC3; the Hadoop libraries were removed in this version. I have created a custom Bucketer for the bucketing sink, extending


org.apache.flink.streaming.connectors.fs.bucketing.Bucketer

In the class I have to use org.apache.hadoop.fs.Path, but since the Hadoop libraries were removed it gives the error

"object hadoop is not a member of package org.apache"

Should I include the Hadoop client libs in my build.sbt dependencies?


Thanks
Shashank




-- 
Thanks Regards

SHASHANK AGARWAL
 ---  Trying to mobilize the things....






12