Facing Issues while migrating to S3A

Facing Issues while migrating to S3A

Vinay Patil
Hi All,

I am facing issues after switching from S3N to S3A. I am using hadoop-aws-2.7.2 and aws-java-sdk-1.7.4 as dependencies.

I am getting this error: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access
I tried changing the fs.s3.buffer.dir path, but I still get the same error.

According to the documentation I should provide the credentials in these properties, but when I do, it does not work and throws a credentials exception:

<property>
  <name>fs.s3.awsAccessKeyId</name>
  <value></value>
</property>

<property>
  <name>fs.s3.awsSecretAccessKey</name>
  <value></value>
</property>

When I change these properties to fs.s3a.access.key and fs.s3a.secret.key, the credentials are accepted, but I then get the UnsatisfiedLinkError above.
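
For reference, a minimal sketch of the equivalent S3A configuration (the values below are placeholders, not the real keys):

<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>

<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>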

Can you please guide me on the correct way of migrating from S3N to S3A?

P.S. I have followed all the steps in the documentation.

Re: Facing Issues while migrating to S3A

Stefan Richter
Hi,

from the exception, it seems like some native library is missing in your classpath. The code of the native method should be contained in something like a HADOOP_HOME\bin\hadoop.dll.

Best,
Stefan


Re: Facing Issues while migrating to S3A

Vinay Patil
Hi Stefan,

The migration from S3N to S3A works fine when I try it on the cluster (a Linux box); the issue only occurs on my local Windows machine.

I am not clear on this; can you please elaborate:
"The code of the native method should be contained in something like a HADOOP_HOME\bin\hadoop.dll"

I want to test it locally so that I don't have to run on the cluster every time.

Regards,
Vinay Patil


Re: Facing Issues while migrating to S3A

Stefan Richter
Hi,

yes, it is only an issue on Windows because the system is using native functions, and it seems your classpath lacks the Windows-specific binary. Those functions are called from Java through JNI, and their implementation for Windows is in a DLL file that you need to put on your classpath. I think that file is $HADOOP_HOME\bin\hadoop.dll, where $HADOOP_HOME is the location of your Hadoop installation on Windows. Locate this file on your machine and include it in your classpath. This should help.
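
As a rough sketch (the install path below is only an example, not taken from your setup), one common way to make the DLL visible on Windows is to set HADOOP_HOME and put its bin directory on the PATH, which the JVM's default java.library.path includes:

set HADOOP_HOME=C:\hadoop-2.7.2
set PATH=%PATH%;%HADOOP_HOME%\bin

Alternatively, the directory can be passed to the JVM directly with -Djava.library.path=%HADOOP_HOME%\bin.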

Best,
Stefan 


Re: Facing Issues while migrating to S3A

yinhua.dai
Hi,

I still have the same problem; I have googled many approaches but they all failed.
I have downloaded hadoop.dll and winutils.exe and added them to the classpath.

To verify that this is working, I called System.loadLibrary("hadoop") at the beginning of my Java program, and it succeeded.

BTW: I run my program on Windows 7 64-bit.
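
In case it helps to narrow this down, here is a small diagnostic sketch (not code from this thread; the class names are from hadoop-common) that checks whether Hadoop's own native loader can see hadoop.dll, which is a stricter test than a plain System.loadLibrary call:

import org.apache.hadoop.io.nativeio.NativeIO;
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // Directories the JVM searches for native libraries such as hadoop.dll
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        // True only if Hadoop's NativeCodeLoader managed to load the native hadoop library
        System.out.println("native hadoop loaded = " + NativeCodeLoader.isNativeCodeLoaded());
        // True only if the JNI-based NativeIO extensions (including NativeIO$Windows) are usable
        System.out.println("NativeIO available = " + NativeIO.isAvailable());
    }
}

If these checks pass but the UnsatisfiedLinkError persists, one possible cause is a hadoop.dll built for a different Hadoop version or bitness than the Hadoop jars on the classpath.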


