Forgot to cc ;)
Hi,
Can someone please help with this issue? We have even tried to set fs.s3a.impl in core-site.xml, but it is still not working.
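For reference, the core-site.xml entry we are talking about is along these lines (a rough sketch with placeholder values; the standard Hadoop S3A credential properties are included here only for illustration):

    <property>
        <name>fs.s3a.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>
    <!-- standard S3A credential properties, placeholder values -->
    <property>
        <name>fs.s3a.access.key</name>
        <value>ACCESS_KEY</value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value>SECRET_KEY</value>
    </property>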
Regards,
Vinay Patil
On Fri, Jan 11, 2019 at 5:03 PM Taher Koitawala <[hidden email]> wrote:
Hi All,

We have implemented an S3 sink in the following way:
StreamingFileSink<GenericRecord> sink = StreamingFileSink
        .forBulkFormat(
                new Path("s3a://mybucket/myfolder/output/"),
                ParquetAvroWriters.forGenericRecord(schema))
        .withBucketCheckInterval(50L)
        .withBucketAssigner(new CustomBucketAssigner())
        .build();
The problem we are facing is that StreamingFileSink initializes the S3AFileSystem class to write to S3 but is not able to find the S3 credentials to write data. However, other Flink applications on the same cluster that use "s3://" paths are able to write to the same S3 bucket and folders; we are only facing this issue with StreamingFileSink.
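For completeness, the credential setup we would expect to be picked up is roughly one of the following (a sketch with placeholder values, not our exact configuration): either the S3 keys in flink-conf.yaml, which Flink forwards to its S3 filesystems,

    s3.access-key: ACCESS_KEY    # placeholder
    s3.secret-key: SECRET_KEY    # placeholder

or the default AWS credential provider chain, e.g. an EC2/EMR instance profile on the cluster nodes.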
Regards,
Taher Koitawala
GS Lab Pune
+91 8407979163