AWS Client Builder with default credentials


AWS Client Builder with default credentials

David Magalhães
I'm using org.apache.flink.fs.s3base.shaded.com.amazonaws.client.builder.AwsClientBuilder to create an S3 client to copy and delete objects inside a TwoPhaseCommitSinkFunction.

Shouldn't there be another way to set up these configurations without hardcoding them, something like core-site.xml or flink-conf.yaml?

Right now I have to hardcode them like this:

AmazonS3ClientBuilder.standard
      .withPathStyleAccessEnabled(true)
      .withEndpointConfiguration(
        new EndpointConfiguration("http://minio:9000", "us-east-1")
      )
      .withCredentials(
        new AWSStaticCredentialsProvider(new BasicAWSCredentials("minio", "minio123"))
      )
      .build


Thanks

Re: AWS Client Builder with default credentials

Chesnay Schepler
First things first, we do not intend for users to use anything in the S3 filesystem modules except the filesystems themselves, meaning that you're somewhat treading on unsupported ground here.

Nevertheless, the S3 modules contain a large variety of AWS-provided CredentialsProvider implementations that can derive credentials from environment variables, system properties, files on the classpath, and more.
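For example, a rough, untested sketch of the same builder call that lets the default credential provider chain resolve the credentials instead of hardcoding them. Plain AWS SDK package names are used here; the copies shaded into the Flink S3 modules under org.apache.flink.fs.s3base.shaded.com.amazonaws expose the same classes, and the endpoint/region parameters are just placeholders:

import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3ClientFactory {

    public static AmazonS3 create(String endpoint, String region) {
        return AmazonS3ClientBuilder.standard()
                .withPathStyleAccessEnabled(true)
                .withEndpointConfiguration(new EndpointConfiguration(endpoint, region))
                // resolves credentials from env vars, system properties,
                // ~/.aws/credentials, or the EC2/ECS instance role
                .withCredentials(DefaultAWSCredentialsProviderChain.getInstance())
                .build();
    }
}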

Ultimately though, you're kind of asking us how to use AWS APIs, for which I would direct you to the AWS documentation.


Re: AWS Client Builder with default credentials

rmetzger0
There are multiple ways of passing configuration parameters to your user-defined code in Flink:

a) Use getRuntimeContext().getUserCodeClassLoader().getResource() to load a config file from your user code jar or the classpath.

b) Use getRuntimeContext().getExecutionConfig().getGlobalJobParameters() to access a configuration object serialized from the main method. You can pass a custom object to the job parameters, or use Flink's "Configuration" object in your main method:

final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Configuration config = new Configuration();
config.setString("foo", "bar");
env.getConfig().setGlobalJobParameters(config);

(A sketch of reading a) and b) back in user code follows below.)

c) Load the flink-conf.yaml:

Configuration conf = GlobalConfiguration.loadConfiguration();

I'm not 100% sure if this approach works, as it is not intended to be used in user code (I believe).
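
For a) and b), here is a rough, untested sketch of reading the values back inside a rich function; the file name "s3.properties" is only a placeholder:

import java.io.InputStream;
import java.util.Collections;
import java.util.Map;
import java.util.Properties;

import org.apache.flink.api.common.ExecutionConfig;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

public class ConfiguredSink extends RichSinkFunction<String> {

    private transient Properties fileProps;
    private transient Map<String, String> jobParams;

    @Override
    public void open(Configuration parameters) throws Exception {
        // a) a config file shipped in the user code jar / on the classpath
        fileProps = new Properties();
        try (InputStream in = getRuntimeContext()
                .getUserCodeClassLoader()
                .getResourceAsStream("s3.properties")) {
            if (in != null) {
                fileProps.load(in);
            }
        }

        // b) the global job parameters set in the main method
        ExecutionConfig.GlobalJobParameters globalParams =
                getRuntimeContext().getExecutionConfig().getGlobalJobParameters();
        jobParams = globalParams == null ? Collections.emptyMap() : globalParams.toMap();
    }

    @Override
    public void invoke(String value, Context context) {
        // use fileProps / jobParams here, e.g. to configure an S3 client lazily
    }
}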


Let me know if this helps!

Best,
Robert


Re: AWS Client Builder with default credentials

David Magalhães
Hi Robert, thanks for your reply.

GlobalConfiguration.loadConfiguration was useful to check whether a flink-conf.yaml file was on the test resources, for the integration tests that I'm doing. On the cluster I will use the default configuration.
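
For reference, the test wiring can end up looking roughly like this. The s3.* key names and the MinIO fallback values are assumptions for a local setup, not something Flink mandates:

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.GlobalConfiguration;

public class TestS3ClientFactory {

    public static AmazonS3 fromFlinkConf() {
        // loads flink-conf.yaml from the resolved configuration directory (e.g. FLINK_CONF_DIR)
        Configuration conf = GlobalConfiguration.loadConfiguration();

        return AmazonS3ClientBuilder.standard()
                .withPathStyleAccessEnabled(true)
                .withEndpointConfiguration(new EndpointConfiguration(
                        conf.getString("s3.endpoint", "http://minio:9000"),
                        conf.getString("s3.region", "us-east-1")))
                .withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials(
                        conf.getString("s3.access-key", "minio"),
                        conf.getString("s3.secret-key", "minio123"))))
                .build();
    }
}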


Re: AWS Client Builder with default credentials

sri hari kali charan Tummala


--
Thanks & Regards
Sri Tummala


Re: AWS Client Builder with default credentials

Suneel Marthi
In reply to this post by David Magalhães
Not sure if this helps - this is how I invoke a SageMaker endpoint model from a Flink pipeline.



