Hi All,
I am facing issues after changing from S3N to S3A. I am using hadoop-aws-2.7.2 and aws-java-sdk-1.7.4 as dependencies.

I am getting this error:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access

I tried changing the fs.s3.buffer.dir path, but I get the same error.

When I provide values in these properties according to the documentation, it does not work and gives a credential exception:

<property>
  <name>fs.s3.awsAccessKeyId</name>
  <value></value>
</property>

<property>
  <name>fs.s3.awsSecretAccessKey</name>
  <value></value>
</property>

When I change these properties to fs.s3a.access.key and fs.s3a.secret.key, the credentials are accepted, but I get the above exception.

Can you please guide me on the correct way of migrating from S3N to S3A?

P.S. I have followed all the steps in the documentation.
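P.P.S. For reference, this is roughly how I pass the S3A settings programmatically (only a sketch; the key values and the bucket name are placeholders):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class S3AConfigCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // S3A reads its own keys (fs.s3a.*), not the old fs.s3/fs.s3n ones.
        conf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem");
        conf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY"); // placeholder
        conf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY"); // placeholder
        conf.set("fs.s3a.buffer.dir", "/tmp/s3a");        // local buffer for uploads
        // "my-bucket" is a made-up name; this call fails fast if the credentials are wrong.
        FileSystem fs = FileSystem.get(URI.create("s3a://my-bucket/"), conf);
        System.out.println("Connected to " + fs.getUri());
    }
}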
Hi,
from the exception, it seems like some native library is missing from your classpath. The code of the native method should be contained in something like HADOOP_HOME\bin\hadoop.dll.

Best,
Stefan
Hi Stefan,

The migration from S3N to S3A works fine when I run it on the cluster (a Linux box); the issue only occurs on my local Windows machine.

I am not clear on this point, can you please elaborate: "The code of the native method should be contained in something like HADOOP_HOME\bin\hadoop.dll"?

I want to test locally so that I don't have to run on the cluster every time.

Regards,
Vinay Patil
Hi,
yes, it is only an issue on Windows because the system uses native functions there, and it seems your classpath lacks the Windows-specific binary. Those functions are called from Java through JNI, and their Windows implementation lives in a DLL file that you need to put on your classpath. I think that file is $HADOOP_HOME\bin\hadoop.dll, where $HADOOP_HOME is the location of your Hadoop installation on Windows. Locate this file on your machine and include it in your classpath. This should help.

Best,
Stefan
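P.S. A quick way to check whether the JVM can find the native library is something like the following sketch ("C:\hadoop" is a placeholder for your Hadoop installation directory):

public class HadoopDllCheck {
    public static void main(String[] args) {
        // Placeholder path: the directory that contains bin\hadoop.dll and bin\winutils.exe.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        // On Windows the JVM resolves native libraries via java.library.path,
        // which defaults to the PATH directories.
        System.out.println(System.getProperty("java.library.path"));
        // Throws UnsatisfiedLinkError if hadoop.dll cannot be found.
        System.loadLibrary("hadoop");
        System.out.println("hadoop.dll loaded successfully");
    }
}

If loading already fails here, adding the DLL's directory to PATH or passing -Djava.library.path=... when starting the JVM should fix it.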
Hi,
I still have the same problem; I have googled many ways but still failed. I have downloaded hadoop.dll and winutils.exe and added them to the classpath. To verify that this works, I call System.loadLibrary("hadoop") at the beginning of my Java program, and it succeeds.

BTW: I run my program on Windows 7, 64-bit.
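For completeness, this is the kind of check I am running (a sketch, assuming hadoop-common 2.7.2 on the classpath; C:\tmp is an arbitrary existing directory):

import org.apache.hadoop.io.nativeio.NativeIO;

public class NativeIOCheck {
    public static void main(String[] args) throws Exception {
        // The JVM and hadoop.dll must have the same bitness (64-bit here).
        System.out.println("JVM arch: " + System.getProperty("os.arch"));
        // This is the exact native method from the stack trace. Even when
        // System.loadLibrary("hadoop") succeeds, this lookup can still fail
        // if the DLL was built from a different Hadoop version.
        boolean readable = NativeIO.Windows.access(
                "C:\\tmp", NativeIO.Windows.AccessRight.ACCESS_READ);
        System.out.println("NativeIO.Windows.access: " + readable);
    }
}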