Flink Read S3 Intellij IDEA Error

21 messages
Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala

Hi Flink Experts,

I am trying to read an S3 file from IntelliJ using Flink, but I am running into an AWS authentication error. Can someone help? All the details are below.

I have AWS credentials in ~/.aws/credentials
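For reference, the AWS SDK expects that profile file to look roughly like this (a minimal sketch with placeholder values, not real keys):

```ini
# ~/.aws/credentials -- placeholder values
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = examplesecretkey
# aws_session_token = ...   ; only needed for temporary/STS credentials
```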

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
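As an aside, the flink-s3-fs-hadoop filesystem can also take credentials directly from flink-conf.yaml (Flink mirrors these keys into the Hadoop S3A configuration); a minimal sketch with placeholder values:

```yaml
# flink-conf.yaml -- hypothetical values, shown in addition to fs.hdfs.hadoopconf
s3.access-key: AKIAEXAMPLE
s3.secret-key: examplesecretkey
```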
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
<property>
<name>fs.s3.impl</name>
<value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>

<property>
<name>fs.s3.buffer.dir</name>
<value>/tmp</value>
</property>

<property>
<name>fs.s3a.server-side-encryption-algorithm</name>
<value>AES256</value>
</property>

<!--<property>
<name>fs.s3a.aws.credentials.provider</name>
<value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
</property>-->

<property>
<name>fs.s3a.aws.credentials.provider</name>
<value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
</property>
<property>
<name>fs.s3a.access.key</name>
<value></value>
</property>
<property>
<name>fs.s3a.secret.key</name>
<value></value>
</property>
<property>
<name>fs.s3a.session.token</name>
<value></value>
</property>

<property>
<name>fs.s3a.proxy.host</name>
<value></value>
</property>
<property>
<name>fs.s3a.proxy.port</name>
<value>8099</value>
</property>
<property>
<name>fs.s3a.proxy.username</name>
<value></value>
</property>
<property>
<name>fs.s3a.proxy.password</name>
<value></value>
</property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>FlinkStreamAndSql</groupId>
<artifactId>FlinkStreamAndSql</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
<plugin>
<!-- see http://davidb.github.com/scala-maven-plugin -->
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.1.3</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.13</version>
<configuration>
<useFile>false</useFile>
<disableXmlReport>true</disableXmlReport>
<!-- If you have classpath issue like NoDefClassError,... -->
<!-- useManifestOnlyJar>false</useManifestOnlyJar -->
<includes>
<include>**/*Test.*</include>
<include>**/*Suite.*</include>
</includes>
</configuration>
</plugin>

<!-- "package" command plugin -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.8.1</version>
</dependency>


<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.13.1.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-jdbc_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java</artifactId>
<version>1.8.1</version>
</dependency>


<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_2.11</artifactId>
<version>1.8.1</version>
</dependency>


<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-json</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>


<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kinesis_2.11</artifactId>
<version>1.8.0</version>
<scope>system</scope>
<systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.11_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>amazon-kinesis-client</artifactId>
<version>1.8.8</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-kinesis</artifactId>
<version>1.11.579</version>
</dependency>

<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.1</version>
</dependency>

<dependency>
<groupId>commons-cli</groupId>
<artifactId>commons-cli</artifactId>
<version>1.4</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.7</version>
</dependency>

<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
<version>1.4.1</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>dynamodb-streams-kinesis-adapter</artifactId>
<version>1.4.0</version>
</dependency>


<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.579</version>
</dependency>


<!-- For Parquet -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-hadoop-compatibility_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.parquet</groupId>
<artifactId>parquet-avro</artifactId>
<version>1.10.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>3.1.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-twitter_2.10</artifactId>
<version>1.1.4-hadoop1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-filesystem_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-jackson_2.11</artifactId>
<version>3.6.7</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-cloudsearch</artifactId>
<version>1.11.500</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-shaded-hadoop2</artifactId>
<version>2.8.3-1.8.3</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-s3-fs-hadoop</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.8.5</version>
</dependency>


</dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {

    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)

    /* create a table from the CSV file on S3 */
    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")

    /* query with the Table API */
    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")
  }

  class Row1 {
    var month: String = _
    var sum: java.lang.Integer = _
    override def toString: String = month + "," + sum
  }
}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
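One observation on this error: the provider chain it lists (BasicAWSCredentialsProvider, EnvironmentVariableCredentialsProvider, InstanceProfileCredentialsProvider) never reads ~/.aws/credentials, and SimpleAWSCredentialsProvider with empty fs.s3a.access.key / fs.s3a.secret.key values supplies no credentials. A possible fix, assuming the SDK's profile provider class is reachable on the classpath, is to point the S3A provider at the profile file instead of the empty inline keys:

```xml
<!-- core-site.xml sketch: load credentials from ~/.aws/credentials
     rather than from the (empty) fs.s3a.access.key / fs.s3a.secret.key values -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>com.amazonaws.auth.profile.ProfileCredentialsProvider</value>
</property>
```

Note the caveat that flink-s3-fs-hadoop ships the AWS SDK relocated (shaded), so the class name may need the shaded package prefix. A simpler alternative that avoids the shading question entirely: export AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the IntelliJ run configuration, so EnvironmentVariableCredentialsProvider in the default chain picks them up.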


Thanks



The information contained in this e-mail is confidential and/or proprietary to Capital One and/or its affiliates and may only be used solely in performance of work or services for Capital One. The information transmitted herewith is intended only for use by the individual or entity to which it is addressed. If the reader of this message is not the intended recipient, you are hereby notified that any review, retransmission, dissemination, distribution, copying or other use of, or taking of any action in reliance upon this information is strictly prohibited. If you have received this communication in error, please contact the sender and delete the material from your computer.




--
Thanks & Regards
Sri Tummala

Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:




--
Thanks & Regards
Sri Tummala

Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my Intellij using Flink I am.comimg across Aws Auth error can someone help below are all the details.
   
I have Aws credentials in homefolder/.aws/credentials

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
<property>
<name>fs.s3.impl</name>
<value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>

<property>
<name>fs.s3.buffer.dir</name>
<value>/tmp</value>
</property>

<property>
<name>fs.s3a.server-side-encryption-algorithm</name>
<value>AES256</value>
</property>

<!--<property>
<name>fs.s3a.aws.credentials.provider</name>
<value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
</property>-->

<property>
<name>fs.s3a.aws.credentials.provider</name>
<value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
</property>
<property>
<name>fs.s3a.access.key</name>
<value></value>
</property>
<property>
<name>fs.s3a.secret.key</name>
<value></value>
</property>
<property>
<name>fs.s3a.session.token</name>
<value></value>
</property>

<property>
<name>fs.s3a.proxy.host</name>
<value></value>
</property>
<property>
<name>fs.s3a.proxy.port</name>
<value>8099</value>
</property>
<property>
<name>fs.s3a.proxy.username</name>
<value></value>
</property>
<property>
<name>fs.s3a.proxy.password</name>
<value></value>
</property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>FlinkStreamAndSql</groupId>
<artifactId>FlinkStreamAndSql</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
<plugin>
<!-- see http://davidb.github.com/scala-maven-plugin -->
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.1.3</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.13</version>
<configuration>
<useFile>false</useFile>
<disableXmlReport>true</disableXmlReport>
<!-- If you have classpath issue like NoDefClassError,... -->
<!-- useManifestOnlyJar>false</useManifestOnlyJar -->
<includes>
<include>**/*Test.*</include>
<include>**/*Suite.*</include>
</includes>
</configuration>
</plugin>

<!-- "package" command plugin -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.13.1.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-jdbc_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java</artifactId>
<version>1.8.1</version>
</dependency>


<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_2.11</artifactId>
<version>1.8.1</version>
</dependency>


<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-json</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kinesis_2.11</artifactId>
<version>1.8.0</version>
<scope>system</scope>
<systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.11_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>amazon-kinesis-client</artifactId>
<version>1.8.8</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-kinesis</artifactId>
<version>1.11.579</version>
</dependency>

<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.1</version>
</dependency>

<dependency>
<groupId>commons-cli</groupId>
<artifactId>commons-cli</artifactId>
<version>1.4</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.7</version>
</dependency>

<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
<version>1.4.1</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>dynamodb-streams-kinesis-adapter</artifactId>
<version>1.4.0</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>dynamodb-streams-kinesis-adapter</artifactId>
<version>1.4.0</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.579</version>
</dependency>


<!-- For Parquet -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-hadoop-compatibility_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.parquet</groupId>
<artifactId>parquet-avro</artifactId>
<version>1.10.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>3.1.1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-twitter_2.10</artifactId>
<version>1.1.4-hadoop1</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-filesystem_2.11</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-jackson_2.11</artifactId>
<version>3.6.7</version>
</dependency>

<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-cloudsearch</artifactId>
<version>1.11.500</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-shaded-hadoop2</artifactId>
<version>2.8.3-1.8.3</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-s3-fs-hadoop</artifactId>
<version>1.8.1</version>
</dependency>

<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.8.5</version>
</dependency>


</dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

def main(args: Array[String]): Unit = {

val env: ExecutionEnvironment =
ExecutionEnvironment.getExecutionEnvironment
val tableEnv: BatchTableEnvironment =
TableEnvironment.getTableEnvironment(env)
/* create table from csv */

val tableSrc = CsvTableSource
.builder()
.path("s3a://bucket/csvfolder/avg.txt")
.fieldDelimiter(",")
.field("date", Types.STRING)
.field("month", Types.STRING)
.field("category", Types.STRING)
.field("product", Types.STRING)
.field("profit", Types.INT)
.build()

tableEnv.registerTableSource("CatalogTable", tableSrc)

val catalog: Table = tableEnv.scan("CatalogTable")
/* querying with Table API */

val order20: Table = catalog
.filter(" category === 'Category5'")
.groupBy("month")
.select("month, profit.sum as sum")
.orderBy("sum")

val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

order20Set.writeAsText("src/main/resources/table1/table1")

//tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
env.execute("State")

}

class Row1 {

var month: String = _

var sum: java.lang.Integer = _

override def toString(): String = month + "," + sum

}

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
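A note on this error: the shaded provider chain it names (BasicAWSCredentialsProvider, EnvironmentVariableCredentialsProvider, InstanceProfileCredentialsProvider) contains no profile-file provider, so the credentials in homefolder/.aws/credentials are apparently never consulted by this chain. A minimal sketch of one workaround, exporting the standard AWS variables in the IntelliJ run configuration so the EnvironmentVariableCredentialsProvider can pick them up (values below are placeholders, not real credentials):

```shell
# Sketch with placeholder values: the provider chain in the error reads
# environment variables, not ~/.aws/credentials.
export AWS_ACCESS_KEY_ID="AKIA_EXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret"
# AWS_SESSION_TOKEN is only needed for temporary (STS) credentials.
echo "$AWS_ACCESS_KEY_ID"
```

In IntelliJ these would go into the run configuration's "Environment variables" field rather than a shell.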


Thanks



The information contained in this e-mail is confidential and/or proprietary to Capital One and/or its affiliates and may only be used solely in performance of work or services for Capital One. The information transmitted herewith is intended only for use by the individual or entity to which it is addressed. If the reader of this message is not the intended recipient, you are hereby notified that any review, retransmission, dissemination, distribution, copying or other use of, or taking of any action in reliance upon this information is strictly prohibited. If you have received this communication in error, please contact the sender and delete the material from your computer.




--
Thanks & Regards
Sri Tummala




Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri


Re: Flink Read S3 Intellij IDEA Error

Lasse Nedergaard-2
Hi,

I had the same problem. Flink uses plugins to access S3, and when you run locally it starts a MiniCluster, which doesn't load plugins, so this isn't possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink State Processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard



Re: Flink Read S3 Intellij IDEA Error

Chesnay Schepler
Well, you could do this before running the job:

// Set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable,
// pointing to a directory containing the plugins, then initialize the
// file systems before building the execution environment.
// (PluginManager/PluginUtils are in org.apache.flink.core.plugin,
// FileSystem in org.apache.flink.core.fs.)

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);
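The environment variable referred to by ConfigConstants.ENV_FLINK_PLUGINS_DIR is FLINK_PLUGINS_DIR. A minimal sketch of preparing such a plugins root locally (all paths are illustrative): plugin discovery expects one sub-directory per plugin, mirroring the plugins/ folder of a Flink distribution.

```shell
# Sketch with illustrative paths: one sub-directory per plugin under the root.
mkdir -p /tmp/flink-plugins/s3-fs-hadoop
# Copy flink-s3-fs-hadoop-1.8.1.jar into /tmp/flink-plugins/s3-fs-hadoop/
# before running the job.
export FLINK_PLUGINS_DIR=/tmp/flink-plugins
echo "$FLINK_PLUGINS_DIR"
```

With this variable set in the IntelliJ run configuration, the snippet above can discover and load the S3 filesystem plugin.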

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink use a plugins to access s3. When you run local it starts a mini cluster and the mini cluster don’t load plugins. So it’s not possible without modifying Flink.  In my case I wanted to investigate save points through Flink processor API and the workaround was to build my own version of the processor API and include the missing part. 

Med venlig hilsen / Best regards
Lasse Nedergaard


Den 10. mar. 2021 kl. 17.33 skrev sri hari kali charan Tummala [hidden email]:


Flink,

I am able to access Kinesis from Intellij but not S3 I have edited my stack overflow question with kinesis code , Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my Intellij using Flink I am.comimg across Aws Auth error can someone help below are all the details.
   
I have Aws credentials in homefolder/.aws/credentials

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kinesis_2.11</artifactId>
            <version>1.8.0</version>
            <scope>system</scope>
            <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>amazon-kinesis-client</artifactId>
            <version>1.8.8</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-kinesis</artifactId>
            <version>1.11.579</version>
        </dependency>

        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.2.2</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.1</version>
        </dependency>

        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.4</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>
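One thing worth flagging in the POM above: flink-s3-fs-hadoop is meant to be loaded by Flink itself, in 1.8 by copying the jar into the distribution's lib/ folder and in later versions via the plugins/ mechanism, and having it on the application's compile classpath next to flink-shaded-hadoop2 and hadoop-common is a frequent source of conflicting filesystem and AWS classes. A sketch of a more conservative declaration, if compiling against it is needed at all (scope choice is a suggestion, not something from the thread):

```xml
<!-- Keep the S3 filesystem off the application classpath; Flink loads it
     itself from lib/ (1.8) or plugins/ (1.9+), so `provided` is enough. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
    <scope>provided</scope>
</dependency>
```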

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
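Two details stand out in the stack trace above. The provider chain it names (BasicAWSCredentialsProvider, EnvironmentVariableCredentialsProvider, InstanceProfileCredentialsProvider) is S3A's default chain, which suggests the custom core-site.xml is not being picked up by the shaded filesystem at all. And even where that file is read, org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider only consumes fs.s3a.access.key and fs.s3a.secret.key and ignores fs.s3a.session.token; for temporary STS credentials the matching provider in Hadoop 2.8+ is TemporaryAWSCredentialsProvider. A sketch of that change (values left blank, as in the original config):

```xml
<property>
    <name>fs.s3a.aws.credentials.provider</name>
    <!-- Unlike SimpleAWSCredentialsProvider, this provider also reads
         fs.s3a.session.token, which temporary STS credentials require. -->
    <value>org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider</value>
</property>
```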


Thanks



The information contained in this e-mail is confidential and/or proprietary to Capital One and/or its affiliates and may only be used solely in performance of work or services for Capital One. The information transmitted herewith is intended only for use by the individual or entity to which it is addressed. If the reader of this message is not the intended recipient, you are hereby notified that any review, retransmission, dissemination, distribution, copying or other use of, or taking of any action in reliance upon this information is strictly prohibited. If you have received this communication in error, please contact the sender and delete the material from your computer.




--
Thanks & Regards
Sri Tummala






Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
I am not getting what you both are talking about; let's be clear.

Plugin? What is it? Is it a jar that I have to download from the internet and place in a folder? Is flink-s3-fs-hadoop the jar I have to download?

Will the solution below work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses plugins to access S3. When you run locally, it starts a MiniCluster, and the MiniCluster doesn't load plugins, so it isn't possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink processor API, and the workaround was to build my own version of the processor API and include the missing part.
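For context on the plugin loading described above, here is a sketch with hypothetical paths: on a standalone Flink installation each filesystem plugin gets its own subdirectory under $FLINK_HOME/plugins (Flink 1.9+; on 1.8 the jar is copied straight into lib/ instead), and the ENABLE_BUILT_IN_PLUGINS variable from the original post appears to be honored only by Flink's docker entrypoint, not by a MiniCluster started inside IntelliJ:

```shell
# Sketch (paths hypothetical): newer Flink versions load each filesystem
# plugin from its own folder under $FLINK_HOME/plugins; on 1.8 the jar goes
# into lib/. Here we just create the expected layout.
FLINK_HOME="${FLINK_HOME:-$HOME/flink-1.8.1}"
mkdir -p "$FLINK_HOME/plugins/s3-fs-hadoop"
# The jar ships in the distribution's opt/ directory; copy it in place if a
# real installation is present (guarded, since the path is hypothetical):
if [ -f "$FLINK_HOME/opt/flink-s3-fs-hadoop-1.8.1.jar" ]; then
    cp "$FLINK_HOME/opt/flink-s3-fs-hadoop-1.8.1.jar" "$FLINK_HOME/plugins/s3-fs-hadoop/"
fi
ls "$FLINK_HOME/plugins"
```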

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri



Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will that work?

Thanks
Sri

                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kinesis_2.11</artifactId>
                   <version>1.8.0</version>
                   <scope>system</scope>
                   <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
               </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
                   <version>1.8.1</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>amazon-kinesis-client</artifactId>
                   <version>1.8.8</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>aws-java-sdk-kinesis</artifactId>
                   <version>1.11.579</version>
               </dependency>

               <dependency>
                   <groupId>commons-dbcp</groupId>
                   <artifactId>commons-dbcp</artifactId>
                   <version>1.2.2</version>
               </dependency>
               <dependency>
                   <groupId>com.google.code.gson</groupId>
                   <artifactId>gson</artifactId>
                   <version>2.1</version>
               </dependency>

               <dependency>
                   <groupId>commons-cli</groupId>
                   <artifactId>commons-cli</artifactId>
                   <version>1.4</version>
               </dependency>

               <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>


        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
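The provider chain in that message explains the failure: BasicAWSCredentialsProvider reads the fs.s3a.access.key/fs.s3a.secret.key properties (empty above), EnvironmentVariableCredentialsProvider reads environment variables, and InstanceProfileCredentialsProvider queries the EC2 metadata endpoint — the "service endpoint" that cannot be reached from a laptop. One workaround sketch (placeholder values, to be set in the IntelliJ run configuration's environment) is to satisfy the environment-variable provider, which reads exactly these variable names:

```shell
# Placeholder values — substitute your own credentials.
# EnvironmentVariableCredentialsProvider reads exactly these names.
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
# only needed for temporary (STS) credentials:
export AWS_SESSION_TOKEN="your-session-token"
```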


Thanks



The information contained in this e-mail is confidential and/or proprietary to Capital One and/or its affiliates and may only be used solely in performance of work or services for Capital One. The information transmitted herewith is intended only for use by the individual or entity to which it is addressed. If the reader of this message is not the intended recipient, you are hereby notified that any review, retransmission, dissemination, distribution, copying or other use of, or taking of any action in reliance upon this information is strictly prohibited. If you have received this communication in error, please contact the sender and delete the material from your computer.




--
Thanks & Regards
Sri Tummala





Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Let's close this issue, guys. Please answer my questions; I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will that work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

Plugin? What is it? Is it a jar which I have to download from the Internet and place in a folder? Is flink-s3-fs-hadoop the jar I have to download?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses plugins to access S3. When you run locally it starts a mini cluster, and the mini cluster doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink State Processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:


--
Thanks & Regards
Sri Tummala


Re: Flink Read S3 Intellij IDEA Error

Chesnay Schepler
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.
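Concretely, on 1.8.1 that means having the filesystem artifact on the application classpath — e.g. the dependency block already present in the POM above (version matched to the Flink version in use):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
```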

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue guys please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR I only see ConfigConstants.ENV_FLINK_LIB_DIR will this work ?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about lets be clear.

Plugin ? what is it ? Is it a Jar which I have to download from the Internet and place it in a folder ? Is this the Jar which I have to download ? (flink-s3-fs-hadoop) ?

Will this belo solution work ?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
Filesystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink use a plugins to access s3. When you run local it starts a mini cluster and the mini cluster don’t load plugins. So it’s not possible without modifying Flink.  In my case I wanted to investigate save points through Flink processor API and the workaround was to build my own version of the processor API and include the missing part. 

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my IntelliJ using Flink, and I am coming across an AWS auth error. Can someone help? Below are all the details.
I have AWS credentials in my home folder (~/.aws/credentials).
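For reference, the default credentials file that the AWS SDK's profile provider reads has this shape (the key values here are placeholders, not real credentials):

```ini
# ~/.aws/credentials — read by the AWS SDK's default provider chain
[default]
aws_access_key_id     = AKIA...
aws_secret_access_key = ...
# only needed for temporary (STS) credentials:
# aws_session_token   = ...
```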

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
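In addition to pointing at the Hadoop config, Flink's bundled S3 filesystems can read credentials directly from flink-conf.yaml; a sketch (key names as in the Flink S3 filesystem docs, values are placeholders):

```yaml
# flink-conf.yaml — forwarded to the shaded Hadoop S3 filesystem
s3.access-key: YOUR_ACCESS_KEY
s3.secret-key: YOUR_SECRET_KEY
```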
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kinesis_2.11</artifactId>
            <version>1.8.0</version>
            <scope>system</scope>
            <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>amazon-kinesis-client</artifactId>
            <version>1.8.8</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-kinesis</artifactId>
            <version>1.11.579</version>
        </dependency>

        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.2.2</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.1</version>
        </dependency>

        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.4</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter(" category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
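Since the provider chain above ends with "Unable to load credentials", one quick local sanity check is whether the JVM can actually see a credentials file at all. A minimal sketch (the file location and profile names are the AWS defaults; the helper class itself is hypothetical, not part of any SDK):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class CredentialsCheck {

    // Collect profile names like [default] from an AWS credentials file.
    static List<String> profiles(Path path) throws IOException {
        List<String> names = new ArrayList<>();
        for (String line : Files.readAllLines(path)) {
            String l = line.trim();
            if (l.startsWith("[") && l.endsWith("]")) {
                names.add(l.substring(1, l.length() - 1));
            }
        }
        return names;
    }

    public static void main(String[] args) throws IOException {
        Path path = Paths.get(System.getProperty("user.home"), ".aws", "credentials");
        if (Files.exists(path)) {
            System.out.println("profiles found: " + profiles(path));
        } else {
            System.out.println("no credentials file at " + path);
        }
    }
}
```

If this prints no profiles (or no file), the SDK's profile provider has nothing to load and the auth error above is expected.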


Thanks



The information contained in this e-mail is confidential and/or proprietary to Capital One and/or its affiliates and may only be used solely in performance of work or services for Capital One. The information transmitted herewith is intended only for use by the individual or entity to which it is addressed. If the reader of this message is not the intended recipient, you are hereby notified that any review, retransmission, dissemination, distribution, copying or other use of, or taking of any action in reliance upon this information is strictly prohibited. If you have received this communication in error, please contact the sender and delete the material from your computer.




--
Thanks & Regards
Sri Tummala






Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Which I already did (see my POM above); still it's not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue guys please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR I only see ConfigConstants.ENV_FLINK_LIB_DIR will this work ?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about lets be clear.

Plugin ? what is it ? Is it a Jar which I have to download from the Internet and place it in a folder ? Is this the Jar which I have to download ? (flink-s3-fs-hadoop) ?

Will this belo solution work ?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
Filesystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink use a plugins to access s3. When you run local it starts a mini cluster and the mini cluster don’t load plugins. So it’s not possible without modifying Flink.  In my case I wanted to investigate save points through Flink processor API and the workaround was to build my own version of the processor API and include the missing part. 

Med venlig hilsen / Best regards
Lasse Nedergaard


Den 10. mar. 2021 kl. 17.33 skrev sri hari kali charan Tummala [hidden email]:


Flink,

I am able to access Kinesis from Intellij but not S3 I have edited my stack overflow question with kinesis code , Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my Intellij using Flink I am.comimg across Aws Auth error can someone help below are all the details.
   
I have Aws credentials in homefolder/.aws/credentials

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kinesis_2.11</artifactId>
                   <version>1.8.0</version>
                   <scope>system</scope>
                   <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
               </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
                   <version>1.8.1</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>amazon-kinesis-client</artifactId>
                   <version>1.8.8</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>aws-java-sdk-kinesis</artifactId>
                   <version>1.11.579</version>
               </dependency>

               <dependency>
                   <groupId>commons-dbcp</groupId>
                   <artifactId>commons-dbcp</artifactId>
                   <version>1.2.2</version>
               </dependency>
               <dependency>
                   <groupId>com.google.code.gson</groupId>
                   <artifactId>gson</artifactId>
                   <version>2.1</version>
               </dependency>

               <dependency>
                   <groupId>commons-cli</groupId>
                   <artifactId>commons-cli</artifactId>
                   <version>1.4</version>
               </dependency>

               <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter(" category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint


Thanks



The information contained in this e-mail is confidential and/or proprietary to Capital One and/or its affiliates and may only be used solely in performance of work or services for Capital One. The information transmitted herewith is intended only for use by the individual or entity to which it is addressed. If the reader of this message is not the intended recipient, you are hereby notified that any review, retransmission, dissemination, distribution, copying or other use of, or taking of any action in reliance upon this information is strictly prohibited. If you have received this communication in error, please contact the sender and delete the material from your computer.




--
Thanks & Regards
Sri Tummala





Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ in a public GitHub repo, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM; still it's not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys; please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will that work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

A plugin? What is it? Is it a jar which I have to download from the Internet and place in a folder? Is flink-s3-fs-hadoop the jar I have to download?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// Set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable,
// pointing to a directory containing the plugins.

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses plugins to access S3, and when you run locally it starts a mini cluster that doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink Processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri



Re: Flink Read S3 Intellij IDEA Error

Chesnay Schepler
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
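
A sketch of what that could look like; the path below is an assumption based on the fs.hdfs.hadoopconf value shown earlier in the thread, and the check_conf helper is illustrative:

```shell
# Illustrative: point HADOOP_CONF_DIR at the folder holding core-site.xml.
# In IntelliJ this would go under Run Configuration > Environment variables.
export HADOOP_CONF_DIR="$HOME/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config"

# Helper to verify the Hadoop config file is where we claim it is.
check_conf() {
  [ -f "$1/core-site.xml" ] && echo "core-site.xml found in $1"
}

check_conf "$HADOOP_CONF_DIR" || echo "core-site.xml missing from $HADOOP_CONF_DIR"
```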

import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
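The provider list in that last message is the S3A default credential chain (BasicAWSCredentialsProvider reading fs.s3a.access.key/fs.s3a.secret.key, then environment variables, then the EC2 instance profile); note that none of those three read homefolder/.aws/credentials. One workaround sketch, assuming the keys themselves are valid (all values below are placeholders, not real credentials), is to pass them as environment variables in the IntelliJ run configuration:

```shell
# Placeholders only - substitute real values in the IDE run configuration.
export AWS_ACCESS_KEY_ID="AKIA-PLACEHOLDER"
export AWS_SECRET_ACCESS_KEY="SECRET-PLACEHOLDER"
# Only needed when using temporary (STS/session) credentials:
export AWS_SESSION_TOKEN="TOKEN-PLACEHOLDER"
```

EnvironmentVariableCredentialsProvider in the chain above picks these up without any change to core-site.xml.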


Thanks







--
Thanks & Regards
Sri Tummala



Reply | Threaded
Open this post in threaded view
|

Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Same error.
 


On Fri, 12 Mar 2021 at 09:01, Chesnay Schepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
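A sketch of that suggestion (the path is the example one from earlier in the thread; point it at whatever directory actually holds core-site.xml):

```shell
# HADOOP_CONF_DIR must name the directory *containing* core-site.xml,
# not the file itself; set the same variable in the IntelliJ run config.
export HADOOP_CONF_DIR="/Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config"
```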

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone has working Flink 1.8.1 code that reads from S3 in IntelliJ, in a public GitHub repo, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM; still it's not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys; please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will this work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

Plugin? What is it? Is it a jar which I have to download from the internet and place in a folder? Is flink-s3-fs-hadoop the jar I have to download?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses plugins to access S3, and when you run locally it starts a mini cluster that doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri


Reply | Threaded
Open this post in threaded view
|

Re: Flink Read S3 Intellij IDEA Error

rmetzger0
Since this error is happening in your IDE, I would recommend using the IntelliJ debugger to follow the filesystem initialization process and see where it fails to pick up the credentials.

On Fri, Mar 12, 2021 at 11:11 PM sri hari kali charan Tummala <[hidden email]> wrote:
Same error.
 


On Fri, 12 Mar 2021 at 09:01, ChesnaSchepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone working have flink version 1.8.1 code reading S3 in Intellij in public GitHub please pass it on that will be huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my pin still its not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue guys please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR I only see ConfigConstants.ENV_FLINK_LIB_DIR will this work ?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about lets be clear.

Plugin ? what is it ? Is it a Jar which I have to download from the Internet and place it in a folder ? Is this the Jar which I have to download ? (flink-s3-fs-hadoop) ?

Will this belo solution work ?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
Filesystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink use a plugins to access s3. When you run local it starts a mini cluster and the mini cluster don’t load plugins. So it’s not possible without modifying Flink.  In my case I wanted to investigate save points through Flink processor API and the workaround was to build my own version of the processor API and include the missing part. 

Med venlig hilsen / Best regards
Lasse Nedergaard


Den 10. mar. 2021 kl. 17.33 skrev sri hari kali charan Tummala [hidden email]:


Flink,

I am able to access Kinesis from Intellij but not S3 I have edited my stack overflow question with kinesis code , Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my Intellij using Flink I am.comimg across Aws Auth error can someone help below are all the details.
   
I have Aws credentials in homefolder/.aws/credentials

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kinesis_2.11</artifactId>
            <version>1.8.0</version>
            <scope>system</scope>
            <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>amazon-kinesis-client</artifactId>
            <version>1.8.8</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-kinesis</artifactId>
            <version>1.11.579</version>
        </dependency>

        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.2.2</version>
        </dependency>

        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.1</version>
        </dependency>

        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.4</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException:
--
Thanks & Regards
Sri Tummala
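One detail worth checking in the core-site.xml above: Hadoop's `SimpleAWSCredentialsProvider` reads only the inline `fs.s3a.access.key` and `fs.s3a.secret.key` properties (left empty here) and does not consult `~/.aws/credentials` or `fs.s3a.session.token`. A sketch of an alternative provider setting that lets S3A fall back to the shared credentials file, among other sources; this is a starting point, not a verified fix, since whether the unshaded class name resolves depends on how flink-s3-fs-hadoop shades the AWS SDK:

```xml
<!-- Sketch only: the default chain checks environment variables,
     system properties, the ~/.aws/credentials profile file, and
     instance metadata, unlike SimpleAWSCredentialsProvider, which
     reads only the inline key properties. -->
<property>
    <name>fs.s3a.aws.credentials.provider</name>
    <value>com.amazonaws.auth.DefaultAWSCredentialsProviderChain</value>
</property>
```

If the credentials file contains temporary (session) credentials, Hadoop also ships `org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider`, which is the provider that actually honors `fs.s3a.session.token`.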


Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Below is the complete stack trace from running my job in IntelliJ debug mode.

Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52571,suspend=y,server=n -javaagent:/Users/hmf743/Library/Caches/JetBrains/IntelliJIdea2020.3/captureAgent/debugger-agent.jar -Dfile.encoding=UTF-8 -classpath /Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/charsets.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/cldrdata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/dnsns.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jaccess.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jfxrt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/localedata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/nashorn.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunec.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/zipfs.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jce.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfr.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfxswt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jsse.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/management-agent.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-
1.8.0_275/Contents/Home/jre/lib/resources.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/rt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/ant-javafx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/dt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/javafx-mx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/jconsole.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/packager.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/sa-jdi.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/tools.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/target/classes:/Users/hmf743/.m2/repository/org/apache/flink/flink-core/1.8.1/flink-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-annotations/1.8.1/flink-annotations-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-metrics-core/1.8.1/flink-metrics-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm/5.0.4-6.0/flink-shaded-asm-5.0.4-6.0.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/kryo/kryo/2.24.0/kryo-2.24.0.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/hmf743/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/hmf743/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-guava/18.0-6.0/flink-shaded-guava-18.0-6.0.jar:/Users/hmf743/.m2/repository/org/slf4j/slf4j-api/1.7.15/slf4j-api-1.7.15.jar:/Users/hmf743/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/hmf743/.m2/reposito
ry/org/apache/flink/force-shading/1.8.1/force-shading-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-clients_2.11/1.8.1/flink-clients_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-runtime_2.11/1.8.1/flink-runtime_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-queryable-state-client-java_2.11/1.8.1/flink-queryable-state-client-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-hadoop-fs/1.8.1/flink-hadoop-fs-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-netty/4.1.32.Final-6.0/flink-shaded-netty-4.1.32.Final-6.0.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-jackson/2.7.9-6.0/flink-shaded-jackson-2.7.9-6.0.jar:/Users/hmf743/.m2/repository/org/javassist/javassist/3.19.0-GA/javassist-3.19.0-GA.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-actor_2.11/2.4.20/akka-actor_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/config/1.3.0/config-1.3.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-java8-compat_2.11/0.7.0/scala-java8-compat_2.11-0.7.0.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-stream_2.11/2.4.20/akka-stream_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/reactivestreams/reactive-streams/1.0.0/reactive-streams-1.0.0.jar:/Users/hmf743/.m2/repository/com/typesafe/ssl-config-core_2.11/0.2.1/ssl-config-core_2.11-0.2.1.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-protobuf_2.11/2.4.20/akka-protobuf_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-slf4j_2.11/2.4.20/akka-slf4j_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/clapper/grizzled-slf4j_2.11/1.3.2/grizzled-slf4j_2.11-1.3.2.jar:/Users/hmf743/.m2/repository/com/github/scopt/scopt_2.11/3.5.0/scopt_2.11-3.5.0.jar:/Users/hmf743/.m2/repository/com/twitter/chill_2.11/0.7.6/chill_2.11-0.7.6.jar:/Users/hmf743/.m2/repository/com/twitter/chill-java/0.7.6/chill-java-0.7.6.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-optimiz
er_2.11/1.8.1/flink-optimizer_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-java/1.8.1/flink-java-1.8.1.jar:/Users/hmf743/.m2/repository/commons-cli/commons-cli/1.3.1/commons-cli-1.3.1.jar:/Users/hmf743/.m2/repository/org/apache/derby/derby/10.13.1.1/derby-10.13.1.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-jdbc_2.11/1.8.1/flink-jdbc_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-scala_2.11/1.8.1/flink-table-api-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-common/1.8.1/flink-table-common-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java/1.8.1/flink-table-api-java-1.8.1.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-table_2.11-1.7.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-planner_2.11/1.8.1/flink-table-planner_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java-bridge_2.11/1.8.1/flink-table-api-java-bridge_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-json/1.8.1/flink-json-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-scala_2.11/1.8.1/flink-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm-6/6.2.1-6.0/flink-shaded-asm-6-6.2.1-6.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-reflect/2.11.12/scala-reflect-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-library/2.11.12/scala-library-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-compiler/2.11.12/scala-compiler-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.5/scala-xml_2.11-1.0.5.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-streaming-scala_2.11/1.8.1/flink-streaming-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flin
k/flink-streaming-java_2.11/1.8.1/flink-streaming-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/hmf743/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk/1.11.579/aws-java-sdk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-applicationinsights/1.11.579/aws-java-sdk-applicationinsights-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/jmespath-java/1.11.579/jmespath-java-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicequotas/1.11.579/aws-java-sdk-servicequotas-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeevents/1.11.579/aws-java-sdk-personalizeevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalize/1.11.579/aws-java-sdk-personalize-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeruntime/1.11.579/aws-java-sdk-personalizeruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ioteventsdata/1.11.579/aws-java-sdk-ioteventsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotevents/1.11.579/aws-java-sdk-iotevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotthingsgraph/1.11.579/aws-java-sdk-iotthingsgraph-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-groundstation/1.11.579/aws-java-sdk-groundstation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackagevod/1.11.579/aws-java-sdk-mediapackagevod-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-managedblockchain/1.11.579/aws-java-sdk-managedblockchain-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-textract/1.11.579/aws-java-sdk-textract-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-worklink/1.11.579/aws-java-sdk-worklink-1.11.579.jar:/Users/hmf743/.m2/repository/c
om/amazonaws/aws-java-sdk-backup/1.11.579/aws-java-sdk-backup-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-docdb/1.11.579/aws-java-sdk-docdb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewayv2/1.11.579/aws-java-sdk-apigatewayv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewaymanagementapi/1.11.579/aws-java-sdk-apigatewaymanagementapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kafka/1.11.579/aws-java-sdk-kafka-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appmesh/1.11.579/aws-java-sdk-appmesh-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-licensemanager/1.11.579/aws-java-sdk-licensemanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-securityhub/1.11.579/aws-java-sdk-securityhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fsx/1.11.579/aws-java-sdk-fsx-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconnect/1.11.579/aws-java-sdk-mediaconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisanalyticsv2/1.11.579/aws-java-sdk-kinesisanalyticsv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehendmedical/1.11.579/aws-java-sdk-comprehendmedical-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-globalaccelerator/1.11.579/aws-java-sdk-globalaccelerator-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transfer/1.11.579/aws-java-sdk-transfer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datasync/1.11.579/aws-java-sdk-datasync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-robomaker/1.11.579/aws-java-sdk-robomaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-amplify/1.11.579/aws-java-sdk-amplify-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-quicksight/1.11.579/aws-java-
sdk-quicksight-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rdsdata/1.11.579/aws-java-sdk-rdsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53resolver/1.11.579/aws-java-sdk-route53resolver-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ram/1.11.579/aws-java-sdk-ram-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3control/1.11.579/aws-java-sdk-s3control-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointsmsvoice/1.11.579/aws-java-sdk-pinpointsmsvoice-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointemail/1.11.579/aws-java-sdk-pinpointemail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-chime/1.11.579/aws-java-sdk-chime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-signer/1.11.579/aws-java-sdk-signer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dlm/1.11.579/aws-java-sdk-dlm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-macie/1.11.579/aws-java-sdk-macie-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-eks/1.11.579/aws-java-sdk-eks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediatailor/1.11.579/aws-java-sdk-mediatailor-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-neptune/1.11.579/aws-java-sdk-neptune-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pi/1.11.579/aws-java-sdk-pi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickprojects/1.11.579/aws-java-sdk-iot1clickprojects-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickdevices/1.11.579/aws-java-sdk-iot1clickdevices-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotanalytics/1.11.579/aws-java-sdk-iotanalytics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-acmpca/1.11.579/aws-java-sdk-acmpca
-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-secretsmanager/1.11.579/aws-java-sdk-secretsmanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fms/1.11.579/aws-java-sdk-fms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-connect/1.11.579/aws-java-sdk-connect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transcribe/1.11.579/aws-java-sdk-transcribe-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscalingplans/1.11.579/aws-java-sdk-autoscalingplans-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workmail/1.11.579/aws-java-sdk-workmail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicediscovery/1.11.579/aws-java-sdk-servicediscovery-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloud9/1.11.579/aws-java-sdk-cloud9-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-serverlessapplicationrepository/1.11.579/aws-java-sdk-serverlessapplicationrepository-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-alexaforbusiness/1.11.579/aws-java-sdk-alexaforbusiness-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroups/1.11.579/aws-java-sdk-resourcegroups-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehend/1.11.579/aws-java-sdk-comprehend-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-translate/1.11.579/aws-java-sdk-translate-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemaker/1.11.579/aws-java-sdk-sagemaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotjobsdataplane/1.11.579/aws-java-sdk-iotjobsdataplane-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemakerruntime/1.11.579/aws-java-sdk-sagemakerruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisvideo/1.11.579/aws-java-sdk-kines
isvideo-1.11.579.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec-http/4.1.17.Final/netty-codec-http-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec/4.1.17.Final/netty-codec-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-handler/4.1.17.Final/netty-handler-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-buffer/4.1.17.Final/netty-buffer-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-common/4.1.17.Final/netty-common-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-transport/4.1.17.Final/netty-transport-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-resolver/4.1.17.Final/netty-resolver-4.1.17.Final.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appsync/1.11.579/aws-java-sdk-appsync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-guardduty/1.11.579/aws-java-sdk-guardduty-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mq/1.11.579/aws-java-sdk-mq-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconvert/1.11.579/aws-java-sdk-mediaconvert-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastore/1.11.579/aws-java-sdk-mediastore-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastoredata/1.11.579/aws-java-sdk-mediastoredata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-medialive/1.11.579/aws-java-sdk-medialive-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackage/1.11.579/aws-java-sdk-mediapackage-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-costexplorer/1.11.579/aws-java-sdk-costexplorer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pricing/1.11.579/aws-java-sdk-pricing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mobile/1.11.579/aws-java-sdk-mobile-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudhsmv2/1.11.579/aws-
java-sdk-cloudhsmv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glue/1.11.579/aws-java-sdk-glue-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-migrationhub/1.11.579/aws-java-sdk-migrationhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dax/1.11.579/aws-java-sdk-dax-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-greengrass/1.11.579/aws-java-sdk-greengrass-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-athena/1.11.579/aws-java-sdk-athena-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-marketplaceentitlement/1.11.579/aws-java-sdk-marketplaceentitlement-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codestar/1.11.579/aws-java-sdk-codestar-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lexmodelbuilding/1.11.579/aws-java-sdk-lexmodelbuilding-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroupstaggingapi/1.11.579/aws-java-sdk-resourcegroupstaggingapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpoint/1.11.579/aws-java-sdk-pinpoint-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-xray/1.11.579/aws-java-sdk-xray-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworkscm/1.11.579/aws-java-sdk-opsworkscm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-support/1.11.579/aws-java-sdk-support-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpledb/1.11.579/aws-java-sdk-simpledb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicecatalog/1.11.579/aws-java-sdk-servicecatalog-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servermigration/1.11.579/aws-java-sdk-servermigration-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpleworkflow/1.11.579/aws-java-sdk-simpleworkflow-1.11.579.jar:/Users/hmf743/.m
2/repository/com/amazonaws/aws-java-sdk-storagegateway/1.11.579/aws-java-sdk-storagegateway-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53/1.11.579/aws-java-sdk-route53-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3/1.11.579/aws-java-sdk-s3-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-importexport/1.11.579/aws-java-sdk-importexport-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sts/1.11.579/aws-java-sdk-sts-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sqs/1.11.579/aws-java-sdk-sqs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rds/1.11.579/aws-java-sdk-rds-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-redshift/1.11.579/aws-java-sdk-redshift-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticbeanstalk/1.11.579/aws-java-sdk-elasticbeanstalk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glacier/1.11.579/aws-java-sdk-glacier-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iam/1.11.579/aws-java-sdk-iam-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datapipeline/1.11.579/aws-java-sdk-datapipeline-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticloadbalancing/1.11.579/aws-java-sdk-elasticloadbalancing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticloadbalancingv2/1.11.579/aws-java-sdk-elasticloadbalancingv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-emr/1.11.579/aws-java-sdk-emr-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticache/1.11.579/aws-java-sdk-elasticache-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elastictranscoder/1.11.579/aws-java-sdk-elastictranscoder-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ec2/1.11.579/aws-java-sdk-ec2-1.11.579.jar:/Users/hmf
743/.m2/repository/com/amazonaws/aws-java-sdk-dynamodb/1.11.579/aws-java-sdk-dynamodb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sns/1.11.579/aws-java-sdk-sns-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-budgets/1.11.579/aws-java-sdk-budgets-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudtrail/1.11.579/aws-java-sdk-cloudtrail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudwatch/1.11.579/aws-java-sdk-cloudwatch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-logs/1.11.579/aws-java-sdk-logs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-events/1.11.579/aws-java-sdk-events-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitoidentity/1.11.579/aws-java-sdk-cognitoidentity-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitosync/1.11.579/aws-java-sdk-cognitosync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-directconnect/1.11.579/aws-java-sdk-directconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudformation/1.11.579/aws-java-sdk-cloudformation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudfront/1.11.579/aws-java-sdk-cloudfront-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-clouddirectory/1.11.579/aws-java-sdk-clouddirectory-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesis/1.11.579/aws-java-sdk-kinesis-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworks/1.11.579/aws-java-sdk-opsworks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ses/1.11.579/aws-java-sdk-ses-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscaling/1.11.579/aws-java-sdk-autoscaling-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudsearch/1.11.579/aws-java-sdk-cloudsearch-1.11.579.jar:/Users
[classpath continues: aws-java-sdk-* 1.11.579 modules, hadoop-aws-2.8.5, flink-shaded-hadoop2-2.4.1-1.8.1, hadoop-common-2.4.1, flink-s3-fs-hadoop-1.8.1, flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar, plus Jackson, Jetty, Jersey, and other transitive jars; truncated for readability] examples.s3.FlinkReadS3
Connected to the target VM, address: '127.0.0.1:52571', transport: 'socket'
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.util.FlinkException: Could not close resource.
at org.apache.flink.util.AutoCloseableAsync.close(AutoCloseableAsync.java:42)
at org.apache.flink.client.LocalExecutor.stop(LocalExecutor.java:155)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:227)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at examples.s3.FlinkReadS3$.main(FlinkReadS3.scala:124)
at examples.s3.FlinkReadS3.main(FlinkReadS3.scala)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run$$$capture(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:76)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:351)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 8 more
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:267)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:853)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:232)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1198)
at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1178)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:287)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:83)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:37)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 11 more
Caused by: java.net.SocketTimeoutException: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateInterruptedException(S3AUtils.java:330)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:171)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:587)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:62)
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:253)
... 20 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:5086)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:5060)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4309)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
... 33 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:183)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:162)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:151)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:117)
... 50 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:110)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:79)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:174)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:122)
... 53 more

On Mon, Mar 15, 2021 at 4:59 AM Robert Metzger <[hidden email]> wrote:
Since this error is happening in your IDE, I would recommend using the IntelliJ debugger to follow the filesystem initialization process and see where it fails to pick up the credentials.

On Fri, Mar 12, 2021 at 11:11 PM sri hari kali charan Tummala <[hidden email]> wrote:
Same error.
 


On Fri, 12 Mar 2021 at 09:01, Chesnay Schepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
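Concretely, that suggestion would look something like this (a sketch only; the path is taken from the poster's flink-conf.yaml and is assumed to be the directory containing core-site.xml). It can equally be set as an environment variable in the IntelliJ run configuration:

```shell
# Point Hadoop (and the S3A filesystem) at the directory holding core-site.xml.
# The path below is the hadoop-config directory mentioned earlier in the thread.
export HADOOP_CONF_DIR=/Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
```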

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ, in a public GitHub repo, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM; still it's not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.
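For reference, such a dependency would look roughly like this in the POM (a sketch; the 1.8.1 version matches the Flink version used in the thread):

```xml
<!-- Sketch: flink-s3-fs-hadoop as a plain dependency, which should suffice
     on Flink 1.8.1 where the plugin mechanism does not yet exist -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
```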

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys; please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will this work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

A plugin? What is it? Is it a jar that I have to download from the internet and place in a folder? Is (flink-s3-fs-hadoop) the jar I have to download?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses plugins to access S3. When you run locally, it starts a mini cluster, and the mini cluster doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink State Processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021, at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my IntelliJ using Flink, and I am coming across an AWS auth error; can someone help? Below are all the details.
   
I have AWS credentials in homefolder/.aws/credentials
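This is not from the thread, but a quick sanity-check sketch in plain Scala (no Flink involved) for parsing the INI layout of an AWS credentials file, to confirm the expected profile and keys are actually present before debugging the provider chain. The `CredentialsCheck` name and sample values are hypothetical.

```scala
// Hypothetical helper: extract the key/value pairs of one profile section
// (e.g. [default]) from credentials-file contents in the standard INI layout.
object CredentialsCheck {
  def profileKeys(contents: String, profile: String): Map[String, String] = {
    val lines = contents.split("\n").map(_.trim).toList
    val start = lines.indexOf(s"[$profile]")
    if (start < 0) Map.empty
    else lines.drop(start + 1)
      .takeWhile(l => !l.startsWith("["))   // stop at the next [section]
      .filter(_.contains("="))              // keep only key = value lines
      .map { l =>
        val Array(k, v) = l.split("=", 2)   // split on the first '=' only
        k.trim -> v.trim
      }
      .toMap
  }
}
```

Feeding it the contents of `~/.aws/credentials` and checking that `aws_access_key_id` and `aws_secret_access_key` come back non-empty would rule out a malformed file as the cause.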

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kinesis_2.11</artifactId>
            <version>1.8.0</version>
            <scope>system</scope>
            <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>amazon-kinesis-client</artifactId>
            <version>1.8.8</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-kinesis</artifactId>
            <version>1.11.579</version>
        </dependency>

        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.2.2</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.1</version>
        </dependency>

        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.4</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>
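Side note on the config above: since FLINK_CONF_DIR is already set, the flink-s3-fs-hadoop filesystem can also take credentials straight from flink-conf.yaml, which forwards `s3.*` keys into the shaded Hadoop configuration. A sketch per the Flink 1.8 filesystem docs — the key values are placeholders, not real credentials:

```yaml
# flink-conf.yaml (sketch) — s3.* entries are forwarded to the shaded S3A filesystem
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
s3.access-key: <placeholder-access-key>
s3.secret-key: <placeholder-secret-key>
```

Putting the keys here avoids duplicating them in core-site.xml; having both set can make it unclear which configuration actually wins at runtime.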

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException:
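For what it's worth: "Unable to load credentials from service endpoint" is typically thrown by the AWS SDK's EC2 instance-metadata provider, which the credential chain falls back to when no earlier provider yields credentials — and with fs.s3a.access.key / fs.s3a.secret.key left empty in core-site.xml, falling through to that provider (and failing off-EC2) is expected. A sketch of switching the provider so the SDK resolves homefolder/.aws/credentials instead — class name per the Hadoop 2.8.x S3A docs, assuming the unshaded hadoop-aws/aws-java-sdk path is the one serving the filesystem:

```xml
<!-- core-site.xml (sketch): let the SDK's default chain resolve credentials,
     which includes ~/.aws/credentials and environment variables -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>com.amazonaws.auth.DefaultAWSCredentialsProviderChain</value>
</property>
```

With this in place, remove the empty fs.s3a.access.key / fs.s3a.secret.key / fs.s3a.session.token properties — empty values can shadow the profile file. This is a sketch against Hadoop 2.8.x S3A, not a verified fix for this exact setup.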
--
Thanks & Regards
Sri Tummala



--
Thanks & Regards
Sri Tummala


Re: Flink Read S3 Intellij IDEA Error

rmetzger0
Mh, this looks like a network issue. Is it possible that you can not access some AWS services from your network? 
On Mon, Mar 15, 2021 at 6:39 PM sri hari kali charan Tummala <[hidden email]> wrote:
Below is the complete stack trace from running my job in IntelliJ debug mode.

Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52571,suspend=y,server=n -javaagent:/Users/hmf743/Library/Caches/JetBrains/IntelliJIdea2020.3/captureAgent/debugger-agent.jar -Dfile.encoding=UTF-8 -classpath /Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/charsets.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/cldrdata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/dnsns.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jaccess.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jfxrt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/localedata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/nashorn.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunec.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/zipfs.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jce.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfr.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfxswt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jsse.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/management-agent.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-
1.8.0_275/Contents/Home/jre/lib/resources.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/rt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/ant-javafx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/dt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/javafx-mx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/jconsole.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/packager.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/sa-jdi.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/tools.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/target/classes:/Users/hmf743/.m2/repository/org/apache/flink/flink-core/1.8.1/flink-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-annotations/1.8.1/flink-annotations-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-metrics-core/1.8.1/flink-metrics-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm/5.0.4-6.0/flink-shaded-asm-5.0.4-6.0.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/kryo/kryo/2.24.0/kryo-2.24.0.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/hmf743/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/hmf743/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-guava/18.0-6.0/flink-shaded-guava-18.0-6.0.jar:/Users/hmf743/.m2/repository/org/slf4j/slf4j-api/1.7.15/slf4j-api-1.7.15.jar:/Users/hmf743/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/hmf743/.m2/reposito
ry/org/apache/flink/force-shading/1.8.1/force-shading-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-clients_2.11/1.8.1/flink-clients_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-runtime_2.11/1.8.1/flink-runtime_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-queryable-state-client-java_2.11/1.8.1/flink-queryable-state-client-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-hadoop-fs/1.8.1/flink-hadoop-fs-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-netty/4.1.32.Final-6.0/flink-shaded-netty-4.1.32.Final-6.0.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-jackson/2.7.9-6.0/flink-shaded-jackson-2.7.9-6.0.jar:/Users/hmf743/.m2/repository/org/javassist/javassist/3.19.0-GA/javassist-3.19.0-GA.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-actor_2.11/2.4.20/akka-actor_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/config/1.3.0/config-1.3.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-java8-compat_2.11/0.7.0/scala-java8-compat_2.11-0.7.0.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-stream_2.11/2.4.20/akka-stream_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/reactivestreams/reactive-streams/1.0.0/reactive-streams-1.0.0.jar:/Users/hmf743/.m2/repository/com/typesafe/ssl-config-core_2.11/0.2.1/ssl-config-core_2.11-0.2.1.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-protobuf_2.11/2.4.20/akka-protobuf_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-slf4j_2.11/2.4.20/akka-slf4j_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/clapper/grizzled-slf4j_2.11/1.3.2/grizzled-slf4j_2.11-1.3.2.jar:/Users/hmf743/.m2/repository/com/github/scopt/scopt_2.11/3.5.0/scopt_2.11-3.5.0.jar:/Users/hmf743/.m2/repository/com/twitter/chill_2.11/0.7.6/chill_2.11-0.7.6.jar:/Users/hmf743/.m2/repository/com/twitter/chill-java/0.7.6/chill-java-0.7.6.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-optimiz
er_2.11/1.8.1/flink-optimizer_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-java/1.8.1/flink-java-1.8.1.jar:/Users/hmf743/.m2/repository/commons-cli/commons-cli/1.3.1/commons-cli-1.3.1.jar:/Users/hmf743/.m2/repository/org/apache/derby/derby/10.13.1.1/derby-10.13.1.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-jdbc_2.11/1.8.1/flink-jdbc_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-scala_2.11/1.8.1/flink-table-api-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-common/1.8.1/flink-table-common-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java/1.8.1/flink-table-api-java-1.8.1.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-table_2.11-1.7.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-planner_2.11/1.8.1/flink-table-planner_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java-bridge_2.11/1.8.1/flink-table-api-java-bridge_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-json/1.8.1/flink-json-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-scala_2.11/1.8.1/flink-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm-6/6.2.1-6.0/flink-shaded-asm-6-6.2.1-6.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-reflect/2.11.12/scala-reflect-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-library/2.11.12/scala-library-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-compiler/2.11.12/scala-compiler-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.5/scala-xml_2.11-1.0.5.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-streaming-scala_2.11/1.8.1/flink-streaming-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flin
k/flink-streaming-java_2.11/1.8.1/flink-streaming-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/hmf743/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk/1.11.579/aws-java-sdk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-applicationinsights/1.11.579/aws-java-sdk-applicationinsights-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/jmespath-java/1.11.579/jmespath-java-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicequotas/1.11.579/aws-java-sdk-servicequotas-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeevents/1.11.579/aws-java-sdk-personalizeevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalize/1.11.579/aws-java-sdk-personalize-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeruntime/1.11.579/aws-java-sdk-personalizeruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ioteventsdata/1.11.579/aws-java-sdk-ioteventsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotevents/1.11.579/aws-java-sdk-iotevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotthingsgraph/1.11.579/aws-java-sdk-iotthingsgraph-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-groundstation/1.11.579/aws-java-sdk-groundstation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackagevod/1.11.579/aws-java-sdk-mediapackagevod-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-managedblockchain/1.11.579/aws-java-sdk-managedblockchain-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-textract/1.11.579/aws-java-sdk-textract-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-worklink/1.11.579/aws-java-sdk-worklink-1.11.579.jar:/Users/hmf743/.m2/repository/c
om/amazonaws/aws-java-sdk-backup/1.11.579/aws-java-sdk-backup-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-docdb/1.11.579/aws-java-sdk-docdb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewayv2/1.11.579/aws-java-sdk-apigatewayv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewaymanagementapi/1.11.579/aws-java-sdk-apigatewaymanagementapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kafka/1.11.579/aws-java-sdk-kafka-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appmesh/1.11.579/aws-java-sdk-appmesh-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-licensemanager/1.11.579/aws-java-sdk-licensemanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-securityhub/1.11.579/aws-java-sdk-securityhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fsx/1.11.579/aws-java-sdk-fsx-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconnect/1.11.579/aws-java-sdk-mediaconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisanalyticsv2/1.11.579/aws-java-sdk-kinesisanalyticsv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehendmedical/1.11.579/aws-java-sdk-comprehendmedical-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-globalaccelerator/1.11.579/aws-java-sdk-globalaccelerator-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transfer/1.11.579/aws-java-sdk-transfer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datasync/1.11.579/aws-java-sdk-datasync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-robomaker/1.11.579/aws-java-sdk-robomaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-amplify/1.11.579/aws-java-sdk-amplify-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-quicksight/1.11.579/aws-java-
sdk-quicksight-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rdsdata/1.11.579/aws-java-sdk-rdsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53resolver/1.11.579/aws-java-sdk-route53resolver-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ram/1.11.579/aws-java-sdk-ram-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3control/1.11.579/aws-java-sdk-s3control-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointsmsvoice/1.11.579/aws-java-sdk-pinpointsmsvoice-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointemail/1.11.579/aws-java-sdk-pinpointemail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-chime/1.11.579/aws-java-sdk-chime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-signer/1.11.579/aws-java-sdk-signer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dlm/1.11.579/aws-java-sdk-dlm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-macie/1.11.579/aws-java-sdk-macie-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-eks/1.11.579/aws-java-sdk-eks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediatailor/1.11.579/aws-java-sdk-mediatailor-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-neptune/1.11.579/aws-java-sdk-neptune-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pi/1.11.579/aws-java-sdk-pi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickprojects/1.11.579/aws-java-sdk-iot1clickprojects-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickdevices/1.11.579/aws-java-sdk-iot1clickdevices-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotanalytics/1.11.579/aws-java-sdk-iotanalytics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-acmpca/1.11.579/aws-java-sdk-acmpca
-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-secretsmanager/1.11.579/aws-java-sdk-secretsmanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fms/1.11.579/aws-java-sdk-fms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-connect/1.11.579/aws-java-sdk-connect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transcribe/1.11.579/aws-java-sdk-transcribe-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscalingplans/1.11.579/aws-java-sdk-autoscalingplans-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workmail/1.11.579/aws-java-sdk-workmail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicediscovery/1.11.579/aws-java-sdk-servicediscovery-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloud9/1.11.579/aws-java-sdk-cloud9-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-serverlessapplicationrepository/1.11.579/aws-java-sdk-serverlessapplicationrepository-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-alexaforbusiness/1.11.579/aws-java-sdk-alexaforbusiness-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroups/1.11.579/aws-java-sdk-resourcegroups-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehend/1.11.579/aws-java-sdk-comprehend-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-translate/1.11.579/aws-java-sdk-translate-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemaker/1.11.579/aws-java-sdk-sagemaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotjobsdataplane/1.11.579/aws-java-sdk-iotjobsdataplane-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemakerruntime/1.11.579/aws-java-sdk-sagemakerruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisvideo/1.11.579/aws-java-sdk-kines
isvideo-1.11.579.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec-http/4.1.17.Final/netty-codec-http-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec/4.1.17.Final/netty-codec-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-handler/4.1.17.Final/netty-handler-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-buffer/4.1.17.Final/netty-buffer-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-common/4.1.17.Final/netty-common-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-transport/4.1.17.Final/netty-transport-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-resolver/4.1.17.Final/netty-resolver-4.1.17.Final.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appsync/1.11.579/aws-java-sdk-appsync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-guardduty/1.11.579/aws-java-sdk-guardduty-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mq/1.11.579/aws-java-sdk-mq-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconvert/1.11.579/aws-java-sdk-mediaconvert-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastore/1.11.579/aws-java-sdk-mediastore-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastoredata/1.11.579/aws-java-sdk-mediastoredata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-medialive/1.11.579/aws-java-sdk-medialive-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackage/1.11.579/aws-java-sdk-mediapackage-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-costexplorer/1.11.579/aws-java-sdk-costexplorer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pricing/1.11.579/aws-java-sdk-pricing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mobile/1.11.579/aws-java-sdk-mobile-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudhsmv2/1.11.579/aws-
java-sdk-cloudhsmv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glue/1.11.579/aws-java-sdk-glue-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-migrationhub/1.11.579/aws-java-sdk-migrationhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dax/1.11.579/aws-java-sdk-dax-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-greengrass/1.11.579/aws-java-sdk-greengrass-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-athena/1.11.579/aws-java-sdk-athena-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-marketplaceentitlement/1.11.579/aws-java-sdk-marketplaceentitlement-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codestar/1.11.579/aws-java-sdk-codestar-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lexmodelbuilding/1.11.579/aws-java-sdk-lexmodelbuilding-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroupstaggingapi/1.11.579/aws-java-sdk-resourcegroupstaggingapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpoint/1.11.579/aws-java-sdk-pinpoint-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-xray/1.11.579/aws-java-sdk-xray-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworkscm/1.11.579/aws-java-sdk-opsworkscm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-support/1.11.579/aws-java-sdk-support-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpledb/1.11.579/aws-java-sdk-simpledb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicecatalog/1.11.579/aws-java-sdk-servicecatalog-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servermigration/1.11.579/aws-java-sdk-servermigration-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpleworkflow/1.11.579/aws-java-sdk-simpleworkflow-1.11.579.jar:/Users/hmf743/.m
2/repository/com/amazonaws/aws-java-sdk-storagegateway/1.11.579/aws-java-sdk-storagegateway-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53/1.11.579/aws-java-sdk-route53-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3/1.11.579/aws-java-sdk-s3-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-importexport/1.11.579/aws-java-sdk-importexport-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sts/1.11.579/aws-java-sdk-sts-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sqs/1.11.579/aws-java-sdk-sqs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rds/1.11.579/aws-java-sdk-rds-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-redshift/1.11.579/aws-java-sdk-redshift-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticbeanstalk/1.11.579/aws-java-sdk-elasticbeanstalk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glacier/1.11.579/aws-java-sdk-glacier-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iam/1.11.579/aws-java-sdk-iam-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datapipeline/1.11.579/aws-java-sdk-datapipeline-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticloadbalancing/1.11.579/aws-java-sdk-elasticloadbalancing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticloadbalancingv2/1.11.579/aws-java-sdk-elasticloadbalancingv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-emr/1.11.579/aws-java-sdk-emr-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticache/1.11.579/aws-java-sdk-elasticache-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elastictranscoder/1.11.579/aws-java-sdk-elastictranscoder-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ec2/1.11.579/aws-java-sdk-ec2-1.11.579.jar:/Users/hmf
743/.m2/repository/com/amazonaws/aws-java-sdk-dynamodb/1.11.579/aws-java-sdk-dynamodb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sns/1.11.579/aws-java-sdk-sns-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-budgets/1.11.579/aws-java-sdk-budgets-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudtrail/1.11.579/aws-java-sdk-cloudtrail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudwatch/1.11.579/aws-java-sdk-cloudwatch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-logs/1.11.579/aws-java-sdk-logs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-events/1.11.579/aws-java-sdk-events-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitoidentity/1.11.579/aws-java-sdk-cognitoidentity-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitosync/1.11.579/aws-java-sdk-cognitosync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-directconnect/1.11.579/aws-java-sdk-directconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudformation/1.11.579/aws-java-sdk-cloudformation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudfront/1.11.579/aws-java-sdk-cloudfront-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-clouddirectory/1.11.579/aws-java-sdk-clouddirectory-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesis/1.11.579/aws-java-sdk-kinesis-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworks/1.11.579/aws-java-sdk-opsworks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ses/1.11.579/aws-java-sdk-ses-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscaling/1.11.579/aws-java-sdk-autoscaling-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudsearch/1.11.579/aws-java-sdk-cloudsearch-1.11.579.jar:/Users
/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudwatchmetrics/1.11.579/aws-java-sdk-cloudwatchmetrics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codedeploy/1.11.579/aws-java-sdk-codedeploy-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codepipeline/1.11.579/aws-java-sdk-codepipeline-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kms/1.11.579/aws-java-sdk-kms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-config/1.11.579/aws-java-sdk-config-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lambda/1.11.579/aws-java-sdk-lambda-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ecs/1.11.579/aws-java-sdk-ecs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ecr/1.11.579/aws-java-sdk-ecr-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudhsm/1.11.579/aws-java-sdk-cloudhsm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ssm/1.11.579/aws-java-sdk-ssm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workspaces/1.11.579/aws-java-sdk-workspaces-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-machinelearning/1.11.579/aws-java-sdk-machinelearning-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-directory/1.11.579/aws-java-sdk-directory-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-efs/1.11.579/aws-java-sdk-efs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codecommit/1.11.579/aws-java-sdk-codecommit-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-devicefarm/1.11.579/aws-java-sdk-devicefarm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticsearch/1.11.579/aws-java-sdk-elasticsearch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-waf/1.11.579/aws-java-sdk-waf-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/a
ws-java-sdk-marketplacecommerceanalytics/1.11.579/aws-java-sdk-marketplacecommerceanalytics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-inspector/1.11.579/aws-java-sdk-inspector-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot/1.11.579/aws-java-sdk-iot-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-api-gateway/1.11.579/aws-java-sdk-api-gateway-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-acm/1.11.579/aws-java-sdk-acm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-gamelift/1.11.579/aws-java-sdk-gamelift-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dms/1.11.579/aws-java-sdk-dms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-marketplacemeteringservice/1.11.579/aws-java-sdk-marketplacemeteringservice-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitoidp/1.11.579/aws-java-sdk-cognitoidp-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-discovery/1.11.579/aws-java-sdk-discovery-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-applicationautoscaling/1.11.579/aws-java-sdk-applicationautoscaling-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-snowball/1.11.579/aws-java-sdk-snowball-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rekognition/1.11.579/aws-java-sdk-rekognition-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-polly/1.11.579/aws-java-sdk-polly-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lightsail/1.11.579/aws-java-sdk-lightsail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-stepfunctions/1.11.579/aws-java-sdk-stepfunctions-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-health/1.11.579/aws-java-sdk-health-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-costandusagereport/1.11.579/aws-java-sd
k-costandusagereport-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codebuild/1.11.579/aws-java-sdk-codebuild-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appstream/1.11.579/aws-java-sdk-appstream-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-shield/1.11.579/aws-java-sdk-shield-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-batch/1.11.579/aws-java-sdk-batch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lex/1.11.579/aws-java-sdk-lex-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mechanicalturkrequester/1.11.579/aws-java-sdk-mechanicalturkrequester-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-organizations/1.11.579/aws-java-sdk-organizations-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workdocs/1.11.579/aws-java-sdk-workdocs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-core/1.11.579/aws-java-sdk-core-1.11.579.jar:/Users/hmf743/.m2/repository/org/apache/httpcomponents/httpclient/4.5.5/httpclient-4.5.5.jar:/Users/hmf743/.m2/repository/software/amazon/ion/ion-java/1.0.2/ion-java-1.0.2.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-cbor/2.6.7/jackson-dataformat-cbor-2.6.7.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-models/1.11.579/aws-java-sdk-models-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-swf-libraries/1.11.22/aws-java-sdk-swf-libraries-1.11.22.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-aws/2.8.5/hadoop-aws-2.8.5.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.2.3/jackson-core-2.2.3.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.2.3/jackson-databind-2.2.3.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.2.3/jackson-annotations-2.2.3.jar:/Users/hmf743/.m2/repository/joda-time/joda-
time/2.9.4/joda-time-2.9.4.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-hadoop2/2.4.1-1.8.1/flink-shaded-hadoop2-2.4.1-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/Users/hmf743/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar:/Users/hmf743/.m2/repository/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-math3/3.5/commons-math3-3.5.jar:/Users/hmf743/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/hmf743/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/Users/hmf743/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/hmf743/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/Users/hmf743/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/hmf743/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/hmf743/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/Users/hmf743/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/Users/hmf743/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/hmf743/.m2/repository/commons-configuration/commons-configuration/1.7/commons-configuration-1.7.jar:/Users/hmf743/.m2/repository/commons-digester/commons-digester/1.8.1/commons-digester-1.8.1.jar:/Users/hmf743/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/Users/hmf743/.m2/repository/org/apache/zookeeper/zookeeper/3.4.10/zookeeper-3.4.10.jar:/Users/hmf743/.m2/repository/commons-beanutils/commons-beanutils/1.9.3/commons-beanutils-1.9.3.jar:/Users/hmf743/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/hmf743/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/hmf743/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/
hmf743/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-s3-fs-hadoop/1.8.1/flink-s3-fs-hadoop-1.8.1.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-common/2.4.1/hadoop-common-2.4.1.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-annotations/2.4.1/hadoop-annotations-2.4.1.jar:/Users/hmf743/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/Users/hmf743/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/hmf743/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/Users/hmf743/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/Users/hmf743/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/Users/hmf743/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.3/jackson-jaxrs-1.8.3.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.3/jackson-xc-1.8.3.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/Users/hmf743/.m2/repository/asm/asm/3.1/asm-3.1.jar:/Users/hmf743/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/Users/hmf743/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/Users/hmf743/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/Users/hmf743/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/hmf743/.m2/repository/net/java/dev/jets3t/jets3t/0.9.0/jets3t-0.9.0.jar:/Users/hmf743/.m2/repository/org/apache/httpcomponents/httpcore/4.1.2/httpcore-4.1.2.jar:/Users/hmf743/.m2/repository/org/slf4j/slf4j-log4j12
/1.7.5/slf4j-log4j12-1.7.5.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/Users/hmf743/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-auth/2.4.1/hadoop-auth-2.4.1.jar:/Users/hmf743/Library/Application Support/JetBrains/Toolbox/apps/IDEA-U/ch-0/203.5981.155/IntelliJ IDEA.app/Contents/lib/idea_rt.jar examples.s3.FlinkReadS3
Connected to the target VM, address: '127.0.0.1:52571', transport: 'socket'
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.util.FlinkException: Could not close resource.
at org.apache.flink.util.AutoCloseableAsync.close(AutoCloseableAsync.java:42)
at org.apache.flink.client.LocalExecutor.stop(LocalExecutor.java:155)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:227)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at examples.s3.FlinkReadS3$.main(FlinkReadS3.scala:124)
at examples.s3.FlinkReadS3.main(FlinkReadS3.scala)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run$$$capture(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:76)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:351)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 8 more
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:267)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:853)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:232)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1198)
at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1178)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:287)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:83)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:37)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 11 more
Caused by: java.net.SocketTimeoutException: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateInterruptedException(S3AUtils.java:330)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:171)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:587)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:62)
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:253)
... 20 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:5086)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:5060)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4309)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
... 33 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:183)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:162)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:151)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:117)
... 50 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:110)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:79)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:174)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:122)
... 53 more
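The bottom of the stack trace shows the actual failure mode: `fs.s3a.access.key` and `fs.s3a.secret.key` are empty in core-site.xml, so BasicAWSCredentialsProvider and EnvironmentVariableCredentialsProvider yield nothing and the chain falls through to InstanceProfileCredentialsProvider, which queries the EC2 instance-metadata endpoint; on a laptop that request has nowhere to go, hence "Unable to load credentials from service endpoint" and the final "Read timed out". One way to hand credentials to flink-s3-fs-hadoop is via flink-conf.yaml, since Flink forwards `s3.`-prefixed keys to the S3 filesystem as `fs.s3a.*` options. A sketch (the placeholder values are assumptions, not real keys):

```yaml
# flink-conf.yaml -- sketch; replace the placeholders with real values.
# Flink mirrors s3.* keys into the S3A filesystem config as fs.s3a.*.
s3.access-key: YOUR_ACCESS_KEY_ID
s3.secret-key: YOUR_SECRET_ACCESS_KEY
```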

On Mon, Mar 15, 2021 at 4:59 AM Robert Metzger <[hidden email]> wrote:
Since this error is happening in your IDE, I would recommend using the IntelliJ debugger to follow the filesystem initialization process and see where it fails to pick up the credentials.

On Fri, Mar 12, 2021 at 11:11 PM sri hari kali charan Tummala <[hidden email]> wrote:
Same error.
 


On Fri, 12 Mar 2021 at 09:01, Chesnay Schepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
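For example, set it in the shell that launches the IDE, or in the IntelliJ run configuration's environment variables. A sketch, reusing the hadoop-config path from the flink-conf.yaml shown earlier (your path may differ):

```shell
# Point HADOOP_CONF_DIR at the directory that contains core-site.xml so
# the S3A filesystem picks the file up (example path, adjust as needed).
export HADOOP_CONF_DIR=/Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
echo "$HADOOP_CONF_DIR"
```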

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ in a public GitHub repo, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM; still it's not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.
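That dependency does appear in the POM posted in this thread; for reference, the declaration looks like this (version matching the Flink version in use):

```xml
<!-- Bundled S3A filesystem for Flink; on 1.8.1 a plain classpath
     dependency is enough, no plugin mechanism is involved. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
```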

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys; please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will this work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

A plugin? What is it? Is it a jar that I have to download from the internet and place in a folder? Is flink-s3-fs-hadoop the jar I have to download?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

Hi. I had the same problem. Flink uses a plugin to access S3. When you run locally it starts a mini cluster, and the mini cluster doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink processor API, and the workaround was to build my own version of the processor API and include the missing part.
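For later Flink versions (1.9+), where the plugin mechanism exists, installing a filesystem as a plugin is purely a matter of directory layout. A sketch with hypothetical paths:

```shell
# Hypothetical layout: each plugin lives in its own subdirectory under
# ${FLINK_HOME}/plugins and is loaded with an isolated classloader.
FLINK_HOME=/tmp/flink-example
mkdir -p "$FLINK_HOME/plugins/s3-fs-hadoop"
# In a real install the jar is copied from the distribution's opt/ folder;
# here an empty file stands in for it.
touch "$FLINK_HOME/plugins/s3-fs-hadoop/flink-s3-fs-hadoop-1.9.0.jar"
ls "$FLINK_HOME/plugins/s3-fs-hadoop"
```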

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ, but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my IntelliJ using Flink, and I am coming across an AWS auth error. Can someone help? Below are all the details.
   
I have Aws credentials in homefolder/.aws/credentials
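For reference, that file is expected to look like the sketch below (placeholder values, not real keys). Note that the SimpleAWSCredentialsProvider configured in core-site.xml does not read this file; it only uses `fs.s3a.access.key` and `fs.s3a.secret.key` from the Hadoop configuration.

```ini
# ~/.aws/credentials -- sketch with placeholder values
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```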

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kinesis_2.11</artifactId>
                   <version>1.8.0</version>
                   <scope>system</scope>
                   <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
               </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
                   <version>1.8.1</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>amazon-kinesis-client</artifactId>
                   <version>1.8.8</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>aws-java-sdk-kinesis</artifactId>
                   <version>1.11.579</version>
               </dependency>

               <dependency>
                   <groupId>commons-dbcp</groupId>
                   <artifactId>commons-dbcp</artifactId>
                   <version>1.2.2</version>
               </dependency>
               <dependency>
                   <groupId>com.google.code.gson</groupId>
                   <artifactId>gson</artifactId>
                   <version>2.1</version>
               </dependency>

               <dependency>
                   <groupId>commons-cli</groupId>
                   <artifactId>commons-cli</artifactId>
                   <version>1.4</version>
               </dependency>

               <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>
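One thing worth checking in a POM like this: `flink-shaded-hadoop2` bundles shaded Hadoop classes, while `hadoop-common`, `hadoop-aws`, and `hadoop-mapreduce-client-core` pull in unshaded ones, so two copies of the Hadoop/S3A classes can end up on the classpath and the wrong credential provider can get loaded. A minimal sketch of a more consistent setup (whether the plain Hadoop jars are actually droppable depends on what else the job uses; that part is an assumption):

<dependency>
    <!-- Sketch: flink-s3-fs-hadoop already ships a shaded S3A filesystem,
         so the unshaded hadoop-aws/hadoop-common jars may not be needed
         just for reading s3a:// paths. -->
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>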

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
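Since the job reads `s3a://` paths through `flink-s3-fs-hadoop`, credentials can also be supplied directly in `flink-conf.yaml` instead of the Hadoop `core-site.xml`: the shaded filesystem forwards keys with the `s3.` prefix to the underlying S3A configuration. A minimal sketch (values are placeholders):

# flink-conf.yaml — picked up by flink-s3-fs-hadoop
s3.access-key: <your-access-key>
s3.secret-key: <your-secret-key>

With this in place the empty `fs.s3a.access.key` / `fs.s3a.secret.key` entries in core-site.xml are no longer needed.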
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException:
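For what it's worth, "Unable to load credentials from service endpoint" usually comes from the SDK's `InstanceProfileCredentialsProvider`: the provider chain fell all the way through to querying the EC2 instance metadata endpoint, which is unreachable from a laptop. That typically means the earlier providers (environment variables, `~/.aws/credentials`) found nothing usable. A quick sanity check on the shape the credentials file must have (a sketch; the temp path and key values are placeholders, not real credentials):

```shell
# Sketch: the [default] profile layout the default provider chain expects
# in ~/.aws/credentials (sample written to /tmp purely for illustration).
cat > /tmp/credentials.sample <<'EOF'
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
EOF

# Section header plus both keys must be present — count should be 3:
grep -c -E '^\[default\]|^aws_access_key_id|^aws_secret_access_key' /tmp/credentials.sample
# prints 3
```

If the real file parses but the error persists, the configured provider (here `SimpleAWSCredentialsProvider` with empty keys in core-site.xml) may simply never be consulting the profile file.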
--
Thanks & Regards
Sri Tummala



--
Thanks & Regards
Sri Tummala

Reply | Threaded
Open this post in threaded view
|

Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
I can access AWS Kinesis from Flink under the same account from IntelliJ, and I am able to access S3 from Spark too.

Thanks
Sri

On Mon, Mar 15, 2021 at 11:23 AM Robert Metzger <[hidden email]> wrote:
Mh, this looks like a network issue. Is it possible that you cannot access some AWS services from your network?
On Mon, Mar 15, 2021 at 6:39 PM sri hari kali charan Tummala <[hidden email]> wrote:
Below is the complete stack trace from running my job in IntelliJ debug mode.

Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52571,suspend=y,server=n -javaagent:/Users/hmf743/Library/Caches/JetBrains/IntelliJIdea2020.3/captureAgent/debugger-agent.jar -Dfile.encoding=UTF-8 -classpath /Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/charsets.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/cldrdata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/dnsns.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jaccess.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jfxrt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/localedata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/nashorn.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunec.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/zipfs.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jce.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfr.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfxswt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jsse.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/management-agent.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-
1.8.0_275/Contents/Home/jre/lib/resources.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/rt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/ant-javafx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/dt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/javafx-mx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/jconsole.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/packager.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/sa-jdi.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/tools.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/target/classes:/Users/hmf743/.m2/repository/org/apache/flink/flink-core/1.8.1/flink-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-annotations/1.8.1/flink-annotations-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-metrics-core/1.8.1/flink-metrics-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm/5.0.4-6.0/flink-shaded-asm-5.0.4-6.0.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/kryo/kryo/2.24.0/kryo-2.24.0.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/hmf743/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/hmf743/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-guava/18.0-6.0/flink-shaded-guava-18.0-6.0.jar:/Users/hmf743/.m2/repository/org/slf4j/slf4j-api/1.7.15/slf4j-api-1.7.15.jar:/Users/hmf743/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/hmf743/.m2/reposito
ry/org/apache/flink/force-shading/1.8.1/force-shading-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-clients_2.11/1.8.1/flink-clients_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-runtime_2.11/1.8.1/flink-runtime_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-queryable-state-client-java_2.11/1.8.1/flink-queryable-state-client-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-hadoop-fs/1.8.1/flink-hadoop-fs-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-netty/4.1.32.Final-6.0/flink-shaded-netty-4.1.32.Final-6.0.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-jackson/2.7.9-6.0/flink-shaded-jackson-2.7.9-6.0.jar:/Users/hmf743/.m2/repository/org/javassist/javassist/3.19.0-GA/javassist-3.19.0-GA.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-actor_2.11/2.4.20/akka-actor_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/config/1.3.0/config-1.3.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-java8-compat_2.11/0.7.0/scala-java8-compat_2.11-0.7.0.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-stream_2.11/2.4.20/akka-stream_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/reactivestreams/reactive-streams/1.0.0/reactive-streams-1.0.0.jar:/Users/hmf743/.m2/repository/com/typesafe/ssl-config-core_2.11/0.2.1/ssl-config-core_2.11-0.2.1.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-protobuf_2.11/2.4.20/akka-protobuf_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-slf4j_2.11/2.4.20/akka-slf4j_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/clapper/grizzled-slf4j_2.11/1.3.2/grizzled-slf4j_2.11-1.3.2.jar:/Users/hmf743/.m2/repository/com/github/scopt/scopt_2.11/3.5.0/scopt_2.11-3.5.0.jar:/Users/hmf743/.m2/repository/com/twitter/chill_2.11/0.7.6/chill_2.11-0.7.6.jar:/Users/hmf743/.m2/repository/com/twitter/chill-java/0.7.6/chill-java-0.7.6.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-optimiz
er_2.11/1.8.1/flink-optimizer_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-java/1.8.1/flink-java-1.8.1.jar:/Users/hmf743/.m2/repository/commons-cli/commons-cli/1.3.1/commons-cli-1.3.1.jar:/Users/hmf743/.m2/repository/org/apache/derby/derby/10.13.1.1/derby-10.13.1.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-jdbc_2.11/1.8.1/flink-jdbc_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-scala_2.11/1.8.1/flink-table-api-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-common/1.8.1/flink-table-common-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java/1.8.1/flink-table-api-java-1.8.1.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-table_2.11-1.7.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-planner_2.11/1.8.1/flink-table-planner_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java-bridge_2.11/1.8.1/flink-table-api-java-bridge_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-json/1.8.1/flink-json-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-scala_2.11/1.8.1/flink-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm-6/6.2.1-6.0/flink-shaded-asm-6-6.2.1-6.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-reflect/2.11.12/scala-reflect-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-library/2.11.12/scala-library-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-compiler/2.11.12/scala-compiler-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.5/scala-xml_2.11-1.0.5.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-streaming-scala_2.11/1.8.1/flink-streaming-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flin
k/flink-streaming-java_2.11/1.8.1/flink-streaming-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/hmf743/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk/1.11.579/aws-java-sdk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-applicationinsights/1.11.579/aws-java-sdk-applicationinsights-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/jmespath-java/1.11.579/jmespath-java-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicequotas/1.11.579/aws-java-sdk-servicequotas-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeevents/1.11.579/aws-java-sdk-personalizeevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalize/1.11.579/aws-java-sdk-personalize-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeruntime/1.11.579/aws-java-sdk-personalizeruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ioteventsdata/1.11.579/aws-java-sdk-ioteventsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotevents/1.11.579/aws-java-sdk-iotevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotthingsgraph/1.11.579/aws-java-sdk-iotthingsgraph-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-groundstation/1.11.579/aws-java-sdk-groundstation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackagevod/1.11.579/aws-java-sdk-mediapackagevod-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-managedblockchain/1.11.579/aws-java-sdk-managedblockchain-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-textract/1.11.579/aws-java-sdk-textract-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-worklink/1.11.579/aws-java-sdk-worklink-1.11.579.jar:/Users/hmf743/.m2/repository/c
om/amazonaws/aws-java-sdk-backup/1.11.579/aws-java-sdk-backup-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-docdb/1.11.579/aws-java-sdk-docdb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewayv2/1.11.579/aws-java-sdk-apigatewayv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewaymanagementapi/1.11.579/aws-java-sdk-apigatewaymanagementapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kafka/1.11.579/aws-java-sdk-kafka-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appmesh/1.11.579/aws-java-sdk-appmesh-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-licensemanager/1.11.579/aws-java-sdk-licensemanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-securityhub/1.11.579/aws-java-sdk-securityhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fsx/1.11.579/aws-java-sdk-fsx-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconnect/1.11.579/aws-java-sdk-mediaconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisanalyticsv2/1.11.579/aws-java-sdk-kinesisanalyticsv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehendmedical/1.11.579/aws-java-sdk-comprehendmedical-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-globalaccelerator/1.11.579/aws-java-sdk-globalaccelerator-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transfer/1.11.579/aws-java-sdk-transfer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datasync/1.11.579/aws-java-sdk-datasync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-robomaker/1.11.579/aws-java-sdk-robomaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-amplify/1.11.579/aws-java-sdk-amplify-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-quicksight/1.11.579/aws-java-
sdk-quicksight-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rdsdata/1.11.579/aws-java-sdk-rdsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53resolver/1.11.579/aws-java-sdk-route53resolver-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ram/1.11.579/aws-java-sdk-ram-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3control/1.11.579/aws-java-sdk-s3control-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointsmsvoice/1.11.579/aws-java-sdk-pinpointsmsvoice-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointemail/1.11.579/aws-java-sdk-pinpointemail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-chime/1.11.579/aws-java-sdk-chime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-signer/1.11.579/aws-java-sdk-signer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dlm/1.11.579/aws-java-sdk-dlm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-macie/1.11.579/aws-java-sdk-macie-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-eks/1.11.579/aws-java-sdk-eks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediatailor/1.11.579/aws-java-sdk-mediatailor-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-neptune/1.11.579/aws-java-sdk-neptune-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pi/1.11.579/aws-java-sdk-pi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickprojects/1.11.579/aws-java-sdk-iot1clickprojects-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickdevices/1.11.579/aws-java-sdk-iot1clickdevices-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotanalytics/1.11.579/aws-java-sdk-iotanalytics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-acmpca/1.11.579/aws-java-sdk-acmpca
-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-secretsmanager/1.11.579/aws-java-sdk-secretsmanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fms/1.11.579/aws-java-sdk-fms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-connect/1.11.579/aws-java-sdk-connect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transcribe/1.11.579/aws-java-sdk-transcribe-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscalingplans/1.11.579/aws-java-sdk-autoscalingplans-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workmail/1.11.579/aws-java-sdk-workmail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicediscovery/1.11.579/aws-java-sdk-servicediscovery-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloud9/1.11.579/aws-java-sdk-cloud9-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-serverlessapplicationrepository/1.11.579/aws-java-sdk-serverlessapplicationrepository-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-alexaforbusiness/1.11.579/aws-java-sdk-alexaforbusiness-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroups/1.11.579/aws-java-sdk-resourcegroups-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehend/1.11.579/aws-java-sdk-comprehend-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-translate/1.11.579/aws-java-sdk-translate-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemaker/1.11.579/aws-java-sdk-sagemaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotjobsdataplane/1.11.579/aws-java-sdk-iotjobsdataplane-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemakerruntime/1.11.579/aws-java-sdk-sagemakerruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisvideo/1.11.579/aws-java-sdk-kines
isvideo-1.11.579.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec-http/4.1.17.Final/netty-codec-http-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec/4.1.17.Final/netty-codec-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-handler/4.1.17.Final/netty-handler-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-buffer/4.1.17.Final/netty-buffer-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-common/4.1.17.Final/netty-common-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-transport/4.1.17.Final/netty-transport-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-resolver/4.1.17.Final/netty-resolver-4.1.17.Final.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appsync/1.11.579/aws-java-sdk-appsync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-guardduty/1.11.579/aws-java-sdk-guardduty-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mq/1.11.579/aws-java-sdk-mq-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconvert/1.11.579/aws-java-sdk-mediaconvert-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastore/1.11.579/aws-java-sdk-mediastore-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastoredata/1.11.579/aws-java-sdk-mediastoredata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-medialive/1.11.579/aws-java-sdk-medialive-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackage/1.11.579/aws-java-sdk-mediapackage-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-costexplorer/1.11.579/aws-java-sdk-costexplorer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pricing/1.11.579/aws-java-sdk-pricing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mobile/1.11.579/aws-java-sdk-mobile-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudhsmv2/1.11.579/aws-
java-sdk-cloudhsmv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glue/1.11.579/aws-java-sdk-glue-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-migrationhub/1.11.579/aws-java-sdk-migrationhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dax/1.11.579/aws-java-sdk-dax-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-greengrass/1.11.579/aws-java-sdk-greengrass-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-athena/1.11.579/aws-java-sdk-athena-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-marketplaceentitlement/1.11.579/aws-java-sdk-marketplaceentitlement-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codestar/1.11.579/aws-java-sdk-codestar-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lexmodelbuilding/1.11.579/aws-java-sdk-lexmodelbuilding-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroupstaggingapi/1.11.579/aws-java-sdk-resourcegroupstaggingapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpoint/1.11.579/aws-java-sdk-pinpoint-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-xray/1.11.579/aws-java-sdk-xray-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworkscm/1.11.579/aws-java-sdk-opsworkscm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-support/1.11.579/aws-java-sdk-support-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpledb/1.11.579/aws-java-sdk-simpledb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicecatalog/1.11.579/aws-java-sdk-servicecatalog-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servermigration/1.11.579/aws-java-sdk-servermigration-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpleworkflow/1.11.579/aws-java-sdk-simpleworkflow-1.11.579.jar:/Users/hmf743/.m
2/repository/com/amazonaws/aws-java-sdk-storagegateway/1.11.579/aws-java-sdk-storagegateway-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53/1.11.579/aws-java-sdk-route53-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3/1.11.579/aws-java-sdk-s3-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-importexport/1.11.579/aws-java-sdk-importexport-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sts/1.11.579/aws-java-sdk-sts-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sqs/1.11.579/aws-java-sdk-sqs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rds/1.11.579/aws-java-sdk-rds-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-redshift/1.11.579/aws-java-sdk-redshift-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticbeanstalk/1.11.579/aws-java-sdk-elasticbeanstalk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glacier/1.11.579/aws-java-sdk-glacier-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iam/1.11.579/aws-java-sdk-iam-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datapipeline/1.11.579/aws-java-sdk-datapipeline-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticloadbalancing/1.11.579/aws-java-sdk-elasticloadbalancing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticloadbalancingv2/1.11.579/aws-java-sdk-elasticloadbalancingv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-emr/1.11.579/aws-java-sdk-emr-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticache/1.11.579/aws-java-sdk-elasticache-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elastictranscoder/1.11.579/aws-java-sdk-elastictranscoder-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ec2/1.11.579/aws-java-sdk-ec2-1.11.579.jar:/Users/hmf
743/.m2/repository/com/amazonaws/aws-java-sdk-dynamodb/1.11.579/aws-java-sdk-dynamodb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sns/1.11.579/aws-java-sdk-sns-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-budgets/1.11.579/aws-java-sdk-budgets-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudtrail/1.11.579/aws-java-sdk-cloudtrail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudwatch/1.11.579/aws-java-sdk-cloudwatch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-logs/1.11.579/aws-java-sdk-logs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-events/1.11.579/aws-java-sdk-events-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitoidentity/1.11.579/aws-java-sdk-cognitoidentity-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitosync/1.11.579/aws-java-sdk-cognitosync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-directconnect/1.11.579/aws-java-sdk-directconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudformation/1.11.579/aws-java-sdk-cloudformation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudfront/1.11.579/aws-java-sdk-cloudfront-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-clouddirectory/1.11.579/aws-java-sdk-clouddirectory-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesis/1.11.579/aws-java-sdk-kinesis-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworks/1.11.579/aws-java-sdk-opsworks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ses/1.11.579/aws-java-sdk-ses-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscaling/1.11.579/aws-java-sdk-autoscaling-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudsearch/1.11.579/aws-java-sdk-cloudsearch-1.11.579.jar:/Users
/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudwatchmetrics/1.11.579/aws-java-sdk-cloudwatchmetrics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codedeploy/1.11.579/aws-java-sdk-codedeploy-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codepipeline/1.11.579/aws-java-sdk-codepipeline-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kms/1.11.579/aws-java-sdk-kms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-config/1.11.579/aws-java-sdk-config-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lambda/1.11.579/aws-java-sdk-lambda-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ecs/1.11.579/aws-java-sdk-ecs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ecr/1.11.579/aws-java-sdk-ecr-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudhsm/1.11.579/aws-java-sdk-cloudhsm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ssm/1.11.579/aws-java-sdk-ssm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workspaces/1.11.579/aws-java-sdk-workspaces-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-machinelearning/1.11.579/aws-java-sdk-machinelearning-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-directory/1.11.579/aws-java-sdk-directory-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-efs/1.11.579/aws-java-sdk-efs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codecommit/1.11.579/aws-java-sdk-codecommit-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-devicefarm/1.11.579/aws-java-sdk-devicefarm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-elasticsearch/1.11.579/aws-java-sdk-elasticsearch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-waf/1.11.579/aws-java-sdk-waf-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/a
ws-java-sdk-marketplacecommerceanalytics/1.11.579/aws-java-sdk-marketplacecommerceanalytics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-inspector/1.11.579/aws-java-sdk-inspector-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot/1.11.579/aws-java-sdk-iot-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-api-gateway/1.11.579/aws-java-sdk-api-gateway-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-acm/1.11.579/aws-java-sdk-acm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-gamelift/1.11.579/aws-java-sdk-gamelift-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dms/1.11.579/aws-java-sdk-dms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-marketplacemeteringservice/1.11.579/aws-java-sdk-marketplacemeteringservice-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cognitoidp/1.11.579/aws-java-sdk-cognitoidp-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-discovery/1.11.579/aws-java-sdk-discovery-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-applicationautoscaling/1.11.579/aws-java-sdk-applicationautoscaling-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-snowball/1.11.579/aws-java-sdk-snowball-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rekognition/1.11.579/aws-java-sdk-rekognition-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-polly/1.11.579/aws-java-sdk-polly-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lightsail/1.11.579/aws-java-sdk-lightsail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-stepfunctions/1.11.579/aws-java-sdk-stepfunctions-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-health/1.11.579/aws-java-sdk-health-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-costandusagereport/1.11.579/aws-java-sd
k-costandusagereport-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codebuild/1.11.579/aws-java-sdk-codebuild-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appstream/1.11.579/aws-java-sdk-appstream-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-shield/1.11.579/aws-java-sdk-shield-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-batch/1.11.579/aws-java-sdk-batch-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lex/1.11.579/aws-java-sdk-lex-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mechanicalturkrequester/1.11.579/aws-java-sdk-mechanicalturkrequester-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-organizations/1.11.579/aws-java-sdk-organizations-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workdocs/1.11.579/aws-java-sdk-workdocs-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-core/1.11.579/aws-java-sdk-core-1.11.579.jar:/Users/hmf743/.m2/repository/org/apache/httpcomponents/httpclient/4.5.5/httpclient-4.5.5.jar:/Users/hmf743/.m2/repository/software/amazon/ion/ion-java/1.0.2/ion-java-1.0.2.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-cbor/2.6.7/jackson-dataformat-cbor-2.6.7.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-models/1.11.579/aws-java-sdk-models-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-swf-libraries/1.11.22/aws-java-sdk-swf-libraries-1.11.22.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-aws/2.8.5/hadoop-aws-2.8.5.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.2.3/jackson-core-2.2.3.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.2.3/jackson-databind-2.2.3.jar:/Users/hmf743/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.2.3/jackson-annotations-2.2.3.jar:/Users/hmf743/.m2/repository/joda-time/joda-
time/2.9.4/joda-time-2.9.4.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-hadoop2/2.4.1-1.8.1/flink-shaded-hadoop2-2.4.1-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/Users/hmf743/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar:/Users/hmf743/.m2/repository/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-math3/3.5/commons-math3-3.5.jar:/Users/hmf743/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/hmf743/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/Users/hmf743/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/hmf743/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/Users/hmf743/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/hmf743/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/hmf743/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/Users/hmf743/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/Users/hmf743/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/hmf743/.m2/repository/commons-configuration/commons-configuration/1.7/commons-configuration-1.7.jar:/Users/hmf743/.m2/repository/commons-digester/commons-digester/1.8.1/commons-digester-1.8.1.jar:/Users/hmf743/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/Users/hmf743/.m2/repository/org/apache/zookeeper/zookeeper/3.4.10/zookeeper-3.4.10.jar:/Users/hmf743/.m2/repository/commons-beanutils/commons-beanutils/1.9.3/commons-beanutils-1.9.3.jar:/Users/hmf743/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/hmf743/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/hmf743/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/
hmf743/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-s3-fs-hadoop/1.8.1/flink-s3-fs-hadoop-1.8.1.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-common/2.4.1/hadoop-common-2.4.1.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-annotations/2.4.1/hadoop-annotations-2.4.1.jar:/Users/hmf743/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/Users/hmf743/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/hmf743/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/Users/hmf743/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/Users/hmf743/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/Users/hmf743/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.3/jackson-jaxrs-1.8.3.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.3/jackson-xc-1.8.3.jar:/Users/hmf743/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/Users/hmf743/.m2/repository/asm/asm/3.1/asm-3.1.jar:/Users/hmf743/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/Users/hmf743/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/Users/hmf743/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/Users/hmf743/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/hmf743/.m2/repository/net/java/dev/jets3t/jets3t/0.9.0/jets3t-0.9.0.jar:/Users/hmf743/.m2/repository/org/apache/httpcomponents/httpcore/4.1.2/httpcore-4.1.2.jar:/Users/hmf743/.m2/repository/org/slf4j/slf4j-log4j12
/1.7.5/slf4j-log4j12-1.7.5.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/Users/hmf743/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-auth/2.4.1/hadoop-auth-2.4.1.jar:/Users/hmf743/Library/Application Support/JetBrains/Toolbox/apps/IDEA-U/ch-0/203.5981.155/IntelliJ IDEA.app/Contents/lib/idea_rt.jar examples.s3.FlinkReadS3
Connected to the target VM, address: '127.0.0.1:52571', transport: 'socket'
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.util.FlinkException: Could not close resource.
at org.apache.flink.util.AutoCloseableAsync.close(AutoCloseableAsync.java:42)
at org.apache.flink.client.LocalExecutor.stop(LocalExecutor.java:155)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:227)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at examples.s3.FlinkReadS3$.main(FlinkReadS3.scala:124)
at examples.s3.FlinkReadS3.main(FlinkReadS3.scala)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run$$$capture(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:76)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:351)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 8 more
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:267)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:853)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:232)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1198)
at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1178)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:287)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:83)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:37)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 11 more
Caused by: java.net.SocketTimeoutException: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateInterruptedException(S3AUtils.java:330)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:171)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:587)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:62)
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:253)
... 20 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:5086)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:5060)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4309)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
... 33 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:183)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:162)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:151)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:117)
... 50 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:110)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:79)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:174)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:122)
... 53 more

On Mon, Mar 15, 2021 at 4:59 AM Robert Metzger <[hidden email]> wrote:
Since this error is happening in your IDE, I would recommend using the IntelliJ debugger to follow the filesystem initialization process and see where it fails to pick up the credentials.

On Fri, Mar 12, 2021 at 11:11 PM sri hari kali charan Tummala <[hidden email]> wrote:
Same error.
 


On Fri, 12 Mar 2021 at 09:01, Chesnay Schepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
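Chesnay's suggestion can be sketched as a shell setup. The path below is taken from the flink-conf.yaml posted in this thread and is only an assumption about where core-site.xml lives; IntelliJ must be started from the same shell, or the variable added to the run configuration's environment:

```shell
# Point HADOOP_CONF_DIR at the directory containing core-site.xml.
# Path copied from the flink-conf.yaml in this thread -- adjust to yours.
export HADOOP_CONF_DIR=/Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config

# Sanity check: the S3A settings can only be picked up if the file exists here.
if [ -f "$HADOOP_CONF_DIR/core-site.xml" ]; then
  echo "core-site.xml found"
else
  echo "core-site.xml MISSING -- fix the path"
fi
```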

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ in a public GitHub repo, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM, but it's still not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.
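In POM terms, the dependency Chesnay refers to looks like the fragment below (the POM later in this thread already includes it, which suggests the problem is configuration pickup rather than a missing artifact):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
```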

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys. Please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will this work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

Plugin? What is it? Is it a jar which I have to download from the internet and place in a folder? Is this the jar which I have to download (flink-s3-fs-hadoop)?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses a plugin to access S3. When you run locally it starts a mini cluster, and the mini cluster doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021 at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my IntelliJ using Flink and I am coming across an AWS auth error; can someone help? Below are all the details.
   
I have AWS credentials in homefolder/.aws/credentials
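For reference, a shared credentials file has this shape (placeholder values only). Note that the provider chain named in the error (BasicAWSCredentialsProvider, EnvironmentVariableCredentialsProvider, InstanceProfileCredentialsProvider) does not list a profile-file provider, so the presence of this file is not by itself a guarantee that it is consulted:

```ini
# ~/.aws/credentials -- placeholder values only
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = exampleSecretAccessKeyValue
```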

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kinesis_2.11</artifactId>
                   <version>1.8.0</version>
                   <scope>system</scope>
                   <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
               </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
                   <version>1.8.1</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>amazon-kinesis-client</artifactId>
                   <version>1.8.8</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>aws-java-sdk-kinesis</artifactId>
                   <version>1.11.579</version>
               </dependency>

               <dependency>
                   <groupId>commons-dbcp</groupId>
                   <artifactId>commons-dbcp</artifactId>
                   <version>1.2.2</version>
               </dependency>
               <dependency>
                   <groupId>com.google.code.gson</groupId>
                   <artifactId>gson</artifactId>
                   <version>2.1</version>
               </dependency>

               <dependency>
                   <groupId>commons-cli</groupId>
                   <artifactId>commons-cli</artifactId>
                   <version>1.4</version>
               </dependency>

               <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>


        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException:
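The "Unable to load credentials from service endpoint" message comes from the SDK's instance-metadata credentials provider, which suggests the keys configured in core-site.xml are not being picked up (the `fs.s3a.access.key` / `fs.s3a.secret.key` values above are empty, and `SimpleAWSCredentialsProvider` reads only those properties). One thing worth trying — a sketch, not a verified fix, assuming the shared credentials file in `~/.aws/credentials` is the intended source — is to point the provider setting at the SDK's default chain instead:

```xml
<!-- Sketch: replace the SimpleAWSCredentialsProvider entry in core-site.xml
     with the SDK's default chain, which checks environment variables, JVM
     system properties, ~/.aws/credentials, and finally the EC2 metadata
     endpoint, in that order. -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>com.amazonaws.auth.DefaultAWSCredentialsProviderChain</value>
</property>
```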
--
Thanks & Regards
Sri Tummala




Re: Flink Read S3 Intellij IDEA Error

Guowei Ma
Hi,

Could you try the test case `PrestoS3FileSystemITCase` (1.8.1) in your IDE and see what happens? (You need to provide `IT_CASE_S3_BUCKET`, `IT_CASE_S3_ACCESS_KEY`, and `IT_CASE_S3_SECRET_KEY`.)
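For reference, running that ITCase from a 1.8.1 source checkout might look like the following. This is a sketch: the bucket and key values are placeholders, and the module path assumes the Presto S3 tests live under `flink-filesystems/flink-s3-fs-presto` as in the 1.8 source tree.

```
# Placeholder values -- substitute your own test bucket and credentials.
export IT_CASE_S3_BUCKET=my-test-bucket
export IT_CASE_S3_ACCESS_KEY=AKIAxxxxxxxxxxxxxxxx
export IT_CASE_S3_SECRET_KEY=xxxxxxxxxxxxxxxxxxxx

# From the root of a Flink 1.8.1 source checkout:
mvn -pl flink-filesystems/flink-s3-fs-presto test -Dtest=PrestoS3FileSystemITCase
```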

Best,
Guowei


On Tue, Mar 16, 2021 at 2:31 AM sri hari kali charan Tummala <[hidden email]> wrote:
I can access AWS Kinesis from Flink under the same account from IntelliJ, and I am able to access S3 from Spark too.

Thanks
Sri

On Mon, Mar 15, 2021 at 11:23 AM Robert Metzger <[hidden email]> wrote:
Hmm, this looks like a network issue. Is it possible that you cannot access some AWS services from your network?
On Mon, Mar 15, 2021 at 6:39 PM sri hari kali charan Tummala <[hidden email]> wrote:
Below is the complete console output (launch command and classpath) from running my job in IntelliJ debug mode.

/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52571,suspend=y,server=n -javaagent:/Users/hmf743/Library/Caches/JetBrains/IntelliJIdea2020.3/captureAgent/debugger-agent.jar -Dfile.encoding=UTF-8 -classpath <very long classpath; notable entries:>
- JDK 1.8 (Corretto 1.8.0_275) runtime jars
- Flink 1.8.1 modules: flink-core, flink-clients_2.11, flink-runtime_2.11, flink-table-planner_2.11, flink-streaming-java_2.11, flink-s3-fs-hadoop-1.8.1, ...
- flink-table_2.11-1.7.2.jar and flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar (added from src/main/resources)
- aws-java-sdk 1.11.579 modules (aws-java-sdk-core, aws-java-sdk-s3, aws-java-sdk-sts, ...)
- hadoop-aws-2.8.5.jar, flink-shaded-hadoop2-2.4.1-1.8.1.jar, hadoop-common-2.4.1.jar
[classpath truncated in the original message]
/1.7.5/slf4j-log4j12-1.7.5.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/Users/hmf743/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/Users/hmf743/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-auth/2.4.1/hadoop-auth-2.4.1.jar:/Users/hmf743/Library/Application Support/JetBrains/Toolbox/apps/IDEA-U/ch-0/203.5981.155/IntelliJ IDEA.app/Contents/lib/idea_rt.jar examples.s3.FlinkReadS3
Connected to the target VM, address: '127.0.0.1:52571', transport: 'socket'
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.util.FlinkException: Could not close resource.
at org.apache.flink.util.AutoCloseableAsync.close(AutoCloseableAsync.java:42)
at org.apache.flink.client.LocalExecutor.stop(LocalExecutor.java:155)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:227)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at examples.s3.FlinkReadS3$.main(FlinkReadS3.scala:124)
at examples.s3.FlinkReadS3.main(FlinkReadS3.scala)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run$$$capture(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:76)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:351)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 8 more
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:267)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:853)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:232)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1198)
at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1178)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:287)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:83)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:37)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 11 more
Caused by: java.net.SocketTimeoutException: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateInterruptedException(S3AUtils.java:330)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:171)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:587)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:62)
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:253)
... 20 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:5086)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:5060)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4309)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
... 33 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:183)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:162)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:151)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:117)
... 50 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:110)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:79)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:174)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:122)
... 53 more

On Mon, Mar 15, 2021 at 4:59 AM Robert Metzger <[hidden email]> wrote:
Since this error is happening in your IDE, I would recommend using the IntelliJ debugger to follow the filesystem initialization process and see where it fails to pick up the credentials.

On Fri, Mar 12, 2021 at 11:11 PM sri hari kali charan Tummala <[hidden email]> wrote:
Same error.
 


On Fri, 12 Mar 2021 at 09:01, Chesnay Schepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
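A minimal sketch of that suggestion, assuming the core-site.xml sits in the hadoop-config directory referenced by the flink-conf.yaml quoted in this thread (the path comes from the original post; adjust it for your machine):

```shell
# Point the Hadoop S3A filesystem at the directory containing core-site.xml.
# Path taken from the flink-conf.yaml quoted in this thread; adjust as needed.
export HADOOP_CONF_DIR=/Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
```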

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone has working Flink 1.8.1 code reading S3 in IntelliJ on public GitHub, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM; still it's not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.
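For reference, a dependency block of the kind described here, matching the 1.8.1 version used elsewhere in the thread (the poster's POM further down already contains an equivalent entry):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
```

With that jar on the classpath, the shaded Hadoop S3 filesystem can also read credentials from flink-conf.yaml instead of core-site.xml (key names per the Flink S3 filesystem documentation; verify them against your Flink version):

```yaml
s3.access-key: <your-access-key>
s3.secret-key: <your-secret-key>
```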

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys; please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will this work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

Plugin? What is it? Is it a jar which I have to download from the internet and place in a folder? Is the jar I have to download (flink-s3-fs-hadoop)?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses plugins to access S3. When you run locally it starts a mini cluster, and the mini cluster doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink Processor API, and the workaround was to build my own version of the Processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


Den 10. mar. 2021 kl. 17.33 skrev sri hari kali charan Tummala [hidden email]:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my IntelliJ using Flink, and I am coming across an AWS auth error. Can someone help? Below are all the details.
   
I have Aws credentials in homefolder/.aws/credentials

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.11</artifactId>
           <version>1.8.1</version>
       </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kinesis_2.11</artifactId>
                   <version>1.8.0</version>
                   <scope>system</scope>
                   <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
               </dependency>

               <dependency>
                   <groupId>org.apache.flink</groupId>
                   <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
                   <version>1.8.1</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>amazon-kinesis-client</artifactId>
                   <version>1.8.8</version>
               </dependency>

               <dependency>
                   <groupId>com.amazonaws</groupId>
                   <artifactId>aws-java-sdk-kinesis</artifactId>
                   <version>1.11.579</version>
               </dependency>

               <dependency>
                   <groupId>commons-dbcp</groupId>
                   <artifactId>commons-dbcp</artifactId>
                   <version>1.2.2</version>
               </dependency>
               <dependency>
                   <groupId>com.google.code.gson</groupId>
                   <artifactId>gson</artifactId>
                   <version>2.1</version>
               </dependency>

               <dependency>
                   <groupId>commons-cli</groupId>
                   <artifactId>commons-cli</artifactId>
                   <version>1.4</version>
               </dependency>

               <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter(" category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException:
--
Thanks & Regards
Sri Tummala



--
Thanks & Regards
Sri Tummala



--
Thanks & Regards
Sri Tummala


Re: Flink Read S3 Intellij IDEA Error

sri hari kali charan Tummala
Hi Guowei Ma,

Below is the error I get when I run the test case PrestoS3FileSystemITCase. I passed the `IT_CASE_S3_BUCKET`, `IT_CASE_S3_ACCESS_KEY`, and `IT_CASE_S3_SECRET_KEY` values before the test; I am testing on Flink 1.8.1.
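As an aside, the environment that test expects can be sketched like this (variable names are from the message above; the values are placeholders only, not real credentials):

```shell
# Placeholder values only – substitute your own bucket name and IAM credentials.
export IT_CASE_S3_BUCKET=my-test-bucket
export IT_CASE_S3_ACCESS_KEY=placeholder-access-key
export IT_CASE_S3_SECRET_KEY=placeholder-secret-key
```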

apps/IDEA-U/ch-0/203.5981.155/IntelliJ IDEA.app/Contents/lib/idea_rt.jar examples.s3.FlinkReadS3
Connected to the target VM, address: '127.0.0.1:52571', transport: 'socket'
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.util.FlinkException: Could not close resource.
at org.apache.flink.util.AutoCloseableAsync.close(AutoCloseableAsync.java:42)
at org.apache.flink.client.LocalExecutor.stop(LocalExecutor.java:155)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:227)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at examples.s3.FlinkReadS3$.main(FlinkReadS3.scala:124)
at examples.s3.FlinkReadS3.main(FlinkReadS3.scala)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run$$$capture(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:76)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:351)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 8 more
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:267)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:853)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:232)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1198)
at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1178)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:287)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:83)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:37)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 11 more
Caused by: java.net.SocketTimeoutException: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateInterruptedException(S3AUtils.java:330)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:171)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:587)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:62)
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:253)
... 20 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:5086)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:5060)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4309)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
... 33 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:183)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:162)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:151)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:117)
... 50 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:110)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:79)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:174)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:122)
... 53 more
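
The chain in the trace tries BasicAWSCredentialsProvider (the `fs.s3a.access.key` / `fs.s3a.secret.key` pair), then EnvironmentVariableCredentialsProvider, then InstanceProfileCredentialsProvider; the final `SocketTimeoutException` is the instance-metadata lookup, which can only succeed on an EC2 host. A minimal sketch (all values are placeholders) that satisfies the second provider from a local shell, so the chain never falls through to the metadata endpoint:

```shell
# Placeholder credentials -- substitute real values or copy them from
# ~/.aws/credentials; EnvironmentVariableCredentialsProvider reads these names.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
export AWS_SECRET_ACCESS_KEY="examplesecret"
# Only needed for temporary STS credentials:
export AWS_SESSION_TOKEN="exampletoken"
# Launch IntelliJ from this same shell so the IDE (and the JVM it forks
# for the job) inherit the variables, e.g. on macOS:
#   open -a "IntelliJ IDEA"
echo "AWS env configured for key id $AWS_ACCESS_KEY_ID"
```

Launching the IDE from the Dock instead of this shell would lose the variables, which is a common reason the environment provider is skipped.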

Thanks
Sri

On Mon, Mar 15, 2021 at 8:04 PM Guowei Ma <[hidden email]> wrote:
Hi,

Could you try the test case `PrestoS3FileSystemITCase` (1.8.1) in your IDE and see what happens? (You need to provide `IT_CASE_S3_BUCKET`, `IT_CASE_S3_ACCESS_KEY`, and `IT_CASE_S3_SECRET_KEY`.)
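
For reference, a sketch of how those variables might be supplied from a terminal (the bucket name, keys, and module path are assumptions; substitute your own values):

```shell
# Hypothetical bucket and keys -- replace with real ones before running.
export IT_CASE_S3_BUCKET="my-it-bucket"
export IT_CASE_S3_ACCESS_KEY="AKIAEXAMPLE"
export IT_CASE_S3_SECRET_KEY="examplesecret"
# Then, from a Flink 1.8.1 source checkout, something like:
#   mvn -pl flink-filesystems/flink-s3-fs-presto test -Dtest=PrestoS3FileSystemITCase
echo "IT case env: bucket=$IT_CASE_S3_BUCKET"
```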

Best,
Guowei


On Tue, Mar 16, 2021 at 2:31 AM sri hari kali charan Tummala <[hidden email]> wrote:
I can access AWS Kinesis from Flink under the same account from IntelliJ, and I am able to access S3 from Spark too.

Thanks
Sri

On Mon, Mar 15, 2021 at 11:23 AM Robert Metzger <[hidden email]> wrote:
Hm, this looks like a network issue. Is it possible that you cannot access some AWS services from your network?
On Mon, Mar 15, 2021 at 6:39 PM sri hari kali charan Tummala <[hidden email]> wrote:
Below is the complete stack trace from running my job in IntelliJ debug mode.

Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52571,suspend=y,server=n -javaagent:/Users/hmf743/Library/Caches/JetBrains/IntelliJIdea2020.3/captureAgent/debugger-agent.jar -Dfile.encoding=UTF-8 -classpath /Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/charsets.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/cldrdata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/dnsns.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jaccess.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/jfxrt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/localedata.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/nashorn.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunec.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/ext/zipfs.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jce.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfr.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jfxswt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/jsse.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/management-agent.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-
1.8.0_275/Contents/Home/jre/lib/resources.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/jre/lib/rt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/ant-javafx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/dt.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/javafx-mx.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/jconsole.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/packager.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/sa-jdi.jar:/Users/hmf743/Library/Java/JavaVirtualMachines/corretto-1.8.0_275/Contents/Home/lib/tools.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/target/classes:/Users/hmf743/.m2/repository/org/apache/flink/flink-core/1.8.1/flink-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-annotations/1.8.1/flink-annotations-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-metrics-core/1.8.1/flink-metrics-core-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm/5.0.4-6.0/flink-shaded-asm-5.0.4-6.0.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/kryo/kryo/2.24.0/kryo-2.24.0.jar:/Users/hmf743/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/hmf743/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/hmf743/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-guava/18.0-6.0/flink-shaded-guava-18.0-6.0.jar:/Users/hmf743/.m2/repository/org/slf4j/slf4j-api/1.7.15/slf4j-api-1.7.15.jar:/Users/hmf743/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/hmf743/.m2/reposito
ry/org/apache/flink/force-shading/1.8.1/force-shading-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-clients_2.11/1.8.1/flink-clients_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-runtime_2.11/1.8.1/flink-runtime_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-queryable-state-client-java_2.11/1.8.1/flink-queryable-state-client-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-hadoop-fs/1.8.1/flink-hadoop-fs-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-netty/4.1.32.Final-6.0/flink-shaded-netty-4.1.32.Final-6.0.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-jackson/2.7.9-6.0/flink-shaded-jackson-2.7.9-6.0.jar:/Users/hmf743/.m2/repository/org/javassist/javassist/3.19.0-GA/javassist-3.19.0-GA.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-actor_2.11/2.4.20/akka-actor_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/config/1.3.0/config-1.3.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-java8-compat_2.11/0.7.0/scala-java8-compat_2.11-0.7.0.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-stream_2.11/2.4.20/akka-stream_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/reactivestreams/reactive-streams/1.0.0/reactive-streams-1.0.0.jar:/Users/hmf743/.m2/repository/com/typesafe/ssl-config-core_2.11/0.2.1/ssl-config-core_2.11-0.2.1.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-protobuf_2.11/2.4.20/akka-protobuf_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/com/typesafe/akka/akka-slf4j_2.11/2.4.20/akka-slf4j_2.11-2.4.20.jar:/Users/hmf743/.m2/repository/org/clapper/grizzled-slf4j_2.11/1.3.2/grizzled-slf4j_2.11-1.3.2.jar:/Users/hmf743/.m2/repository/com/github/scopt/scopt_2.11/3.5.0/scopt_2.11-3.5.0.jar:/Users/hmf743/.m2/repository/com/twitter/chill_2.11/0.7.6/chill_2.11-0.7.6.jar:/Users/hmf743/.m2/repository/com/twitter/chill-java/0.7.6/chill-java-0.7.6.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-optimiz
er_2.11/1.8.1/flink-optimizer_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-java/1.8.1/flink-java-1.8.1.jar:/Users/hmf743/.m2/repository/commons-cli/commons-cli/1.3.1/commons-cli-1.3.1.jar:/Users/hmf743/.m2/repository/org/apache/derby/derby/10.13.1.1/derby-10.13.1.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-jdbc_2.11/1.8.1/flink-jdbc_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-scala_2.11/1.8.1/flink-table-api-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-common/1.8.1/flink-table-common-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java/1.8.1/flink-table-api-java-1.8.1.jar:/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-table_2.11-1.7.2.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-planner_2.11/1.8.1/flink-table-planner_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-table-api-java-bridge_2.11/1.8.1/flink-table-api-java-bridge_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-json/1.8.1/flink-json-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-scala_2.11/1.8.1/flink-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-asm-6/6.2.1-6.0/flink-shaded-asm-6-6.2.1-6.0.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-reflect/2.11.12/scala-reflect-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-library/2.11.12/scala-library-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/scala-compiler/2.11.12/scala-compiler-2.11.12.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.5/scala-xml_2.11-1.0.5.jar:/Users/hmf743/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/Users/hmf743/.m2/repository/org/apache/flink/flink-streaming-scala_2.11/1.8.1/flink-streaming-scala_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/flin
k/flink-streaming-java_2.11/1.8.1/flink-streaming-java_2.11-1.8.1.jar:/Users/hmf743/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/hmf743/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk/1.11.579/aws-java-sdk-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-applicationinsights/1.11.579/aws-java-sdk-applicationinsights-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/jmespath-java/1.11.579/jmespath-java-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicequotas/1.11.579/aws-java-sdk-servicequotas-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeevents/1.11.579/aws-java-sdk-personalizeevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalize/1.11.579/aws-java-sdk-personalize-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-personalizeruntime/1.11.579/aws-java-sdk-personalizeruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ioteventsdata/1.11.579/aws-java-sdk-ioteventsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotevents/1.11.579/aws-java-sdk-iotevents-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotthingsgraph/1.11.579/aws-java-sdk-iotthingsgraph-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-groundstation/1.11.579/aws-java-sdk-groundstation-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackagevod/1.11.579/aws-java-sdk-mediapackagevod-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-managedblockchain/1.11.579/aws-java-sdk-managedblockchain-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-textract/1.11.579/aws-java-sdk-textract-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-worklink/1.11.579/aws-java-sdk-worklink-1.11.579.jar:/Users/hmf743/.m2/repository/c
om/amazonaws/aws-java-sdk-backup/1.11.579/aws-java-sdk-backup-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-docdb/1.11.579/aws-java-sdk-docdb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewayv2/1.11.579/aws-java-sdk-apigatewayv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-apigatewaymanagementapi/1.11.579/aws-java-sdk-apigatewaymanagementapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kafka/1.11.579/aws-java-sdk-kafka-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appmesh/1.11.579/aws-java-sdk-appmesh-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-licensemanager/1.11.579/aws-java-sdk-licensemanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-securityhub/1.11.579/aws-java-sdk-securityhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fsx/1.11.579/aws-java-sdk-fsx-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconnect/1.11.579/aws-java-sdk-mediaconnect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisanalyticsv2/1.11.579/aws-java-sdk-kinesisanalyticsv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehendmedical/1.11.579/aws-java-sdk-comprehendmedical-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-globalaccelerator/1.11.579/aws-java-sdk-globalaccelerator-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transfer/1.11.579/aws-java-sdk-transfer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-datasync/1.11.579/aws-java-sdk-datasync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-robomaker/1.11.579/aws-java-sdk-robomaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-amplify/1.11.579/aws-java-sdk-amplify-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-quicksight/1.11.579/aws-java-
sdk-quicksight-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-rdsdata/1.11.579/aws-java-sdk-rdsdata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-route53resolver/1.11.579/aws-java-sdk-route53resolver-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-ram/1.11.579/aws-java-sdk-ram-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-s3control/1.11.579/aws-java-sdk-s3control-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointsmsvoice/1.11.579/aws-java-sdk-pinpointsmsvoice-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpointemail/1.11.579/aws-java-sdk-pinpointemail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-chime/1.11.579/aws-java-sdk-chime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-signer/1.11.579/aws-java-sdk-signer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dlm/1.11.579/aws-java-sdk-dlm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-macie/1.11.579/aws-java-sdk-macie-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-eks/1.11.579/aws-java-sdk-eks-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediatailor/1.11.579/aws-java-sdk-mediatailor-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-neptune/1.11.579/aws-java-sdk-neptune-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pi/1.11.579/aws-java-sdk-pi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickprojects/1.11.579/aws-java-sdk-iot1clickprojects-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iot1clickdevices/1.11.579/aws-java-sdk-iot1clickdevices-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotanalytics/1.11.579/aws-java-sdk-iotanalytics-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-acmpca/1.11.579/aws-java-sdk-acmpca
-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-secretsmanager/1.11.579/aws-java-sdk-secretsmanager-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-fms/1.11.579/aws-java-sdk-fms-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-connect/1.11.579/aws-java-sdk-connect-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-transcribe/1.11.579/aws-java-sdk-transcribe-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-autoscalingplans/1.11.579/aws-java-sdk-autoscalingplans-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-workmail/1.11.579/aws-java-sdk-workmail-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicediscovery/1.11.579/aws-java-sdk-servicediscovery-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloud9/1.11.579/aws-java-sdk-cloud9-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-serverlessapplicationrepository/1.11.579/aws-java-sdk-serverlessapplicationrepository-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-alexaforbusiness/1.11.579/aws-java-sdk-alexaforbusiness-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroups/1.11.579/aws-java-sdk-resourcegroups-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-comprehend/1.11.579/aws-java-sdk-comprehend-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-translate/1.11.579/aws-java-sdk-translate-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemaker/1.11.579/aws-java-sdk-sagemaker-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-iotjobsdataplane/1.11.579/aws-java-sdk-iotjobsdataplane-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-sagemakerruntime/1.11.579/aws-java-sdk-sagemakerruntime-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-kinesisvideo/1.11.579/aws-java-sdk-kines
isvideo-1.11.579.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec-http/4.1.17.Final/netty-codec-http-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-codec/4.1.17.Final/netty-codec-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-handler/4.1.17.Final/netty-handler-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-buffer/4.1.17.Final/netty-buffer-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-common/4.1.17.Final/netty-common-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-transport/4.1.17.Final/netty-transport-4.1.17.Final.jar:/Users/hmf743/.m2/repository/io/netty/netty-resolver/4.1.17.Final/netty-resolver-4.1.17.Final.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-appsync/1.11.579/aws-java-sdk-appsync-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-guardduty/1.11.579/aws-java-sdk-guardduty-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mq/1.11.579/aws-java-sdk-mq-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediaconvert/1.11.579/aws-java-sdk-mediaconvert-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastore/1.11.579/aws-java-sdk-mediastore-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediastoredata/1.11.579/aws-java-sdk-mediastoredata-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-medialive/1.11.579/aws-java-sdk-medialive-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mediapackage/1.11.579/aws-java-sdk-mediapackage-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-costexplorer/1.11.579/aws-java-sdk-costexplorer-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pricing/1.11.579/aws-java-sdk-pricing-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-mobile/1.11.579/aws-java-sdk-mobile-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-cloudhsmv2/1.11.579/aws-
java-sdk-cloudhsmv2-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-glue/1.11.579/aws-java-sdk-glue-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-migrationhub/1.11.579/aws-java-sdk-migrationhub-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-dax/1.11.579/aws-java-sdk-dax-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-greengrass/1.11.579/aws-java-sdk-greengrass-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-athena/1.11.579/aws-java-sdk-athena-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-marketplaceentitlement/1.11.579/aws-java-sdk-marketplaceentitlement-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-codestar/1.11.579/aws-java-sdk-codestar-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-lexmodelbuilding/1.11.579/aws-java-sdk-lexmodelbuilding-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-resourcegroupstaggingapi/1.11.579/aws-java-sdk-resourcegroupstaggingapi-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-pinpoint/1.11.579/aws-java-sdk-pinpoint-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-xray/1.11.579/aws-java-sdk-xray-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-opsworkscm/1.11.579/aws-java-sdk-opsworkscm-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-support/1.11.579/aws-java-sdk-support-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpledb/1.11.579/aws-java-sdk-simpledb-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servicecatalog/1.11.579/aws-java-sdk-servicecatalog-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-servermigration/1.11.579/aws-java-sdk-servermigration-1.11.579.jar:/Users/hmf743/.m2/repository/com/amazonaws/aws-java-sdk-simpleworkflow/1.11.579/aws-java-sdk-simpleworkflow-1.11.579.jar:/Users/hmf743/.m
[classpath abridged for readability] The IntelliJ run classpath contains the full AWS Java SDK 1.11.579 module set (aws-java-sdk-core, aws-java-sdk-s3, aws-java-sdk-sts, aws-java-sdk-kinesis, aws-java-sdk-dynamodb, and the remaining per-service jars), plus, among others:
/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-aws/2.8.5/hadoop-aws-2.8.5.jar
/Users/hmf743/.m2/repository/org/apache/flink/flink-shaded-hadoop2/2.4.1-1.8.1/flink-shaded-hadoop2-2.4.1-1.8.1.jar
/Users/hmf743/.m2/repository/org/apache/flink/flink-s3-fs-hadoop/1.8.1/flink-s3-fs-hadoop-1.8.1.jar
/Users/hmf743/Documents/CapOneCode/ashwincode/flink-poc/src/main/resources/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar
/Users/hmf743/.m2/repository/org/apache/hadoop/hadoop-common/2.4.1/hadoop-common-2.4.1.jar (with hadoop-auth-2.4.1 and hadoop-annotations-2.4.1)
together with Jackson 2.2.3, joda-time 2.9.4, httpclient 4.5.5, avro 1.8.2, log4j 1.2.17, and assorted commons-*/jersey/jetty/zookeeper transitive dependencies, ending with IntelliJ's idea_rt.jar; main class: examples.s3.FlinkReadS3
Connected to the target VM, address: '127.0.0.1:52571', transport: 'socket'
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.util.FlinkException: Could not close resource.
at org.apache.flink.util.AutoCloseableAsync.close(AutoCloseableAsync.java:42)
at org.apache.flink.client.LocalExecutor.stop(LocalExecutor.java:155)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:227)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at examples.s3.FlinkReadS3$.main(FlinkReadS3.scala:124)
at examples.s3.FlinkReadS3.main(FlinkReadS3.scala)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run$$$capture(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:76)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:351)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 8 more
Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:267)
at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:853)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:232)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1198)
at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1178)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:287)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:83)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:37)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 11 more
Caused by: java.net.SocketTimeoutException: doesBucketExist on cof-card-apollo-finicity-qa: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateInterruptedException(S3AUtils.java:330)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:171)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:587)
at org.apache.flink.api.common.io.FileInputFormat.createInputSplits(FileInputFormat.java:62)
at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:253)
... 20 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:5086)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:5060)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4309)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
... 33 more
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:183)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:162)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:151)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:117)
... 50 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:110)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:79)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:174)
at org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:122)
... 53 more

On Mon, Mar 15, 2021 at 4:59 AM Robert Metzger <[hidden email]> wrote:
Since this error is happening in your IDE, I would recommend using the IntelliJ debugger to follow the filesystem initialization process and see where it fails to pick up the credentials.

On Fri, Mar 12, 2021 at 11:11 PM sri hari kali charan Tummala <[hidden email]> wrote:
Same error.
 


On Fri, 12 Mar 2021 at 09:01, Chesnay Schepler <[hidden email]> wrote:
From the exception I would conclude that your core-site.xml file is not being picked up.

AFAIK fs.hdfs.hadoopconf only works for HDFS, not for S3 filesystems, so try setting HADOOP_CONF_DIR to the directory that the file resides in.
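Concretely, that means exporting the variable before launching the job (or adding it to the IntelliJ run configuration's environment variables). The path below is the hadoop-config directory from the original post; substitute whichever directory actually holds your core-site.xml:

```shell
# Point HADOOP_CONF_DIR at the directory containing core-site.xml,
# so the S3A filesystem can pick up the credentials configuration.
export HADOOP_CONF_DIR=/Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
```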

On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ in a public GitHub repo, please pass it on; that would be a huge help.


Thanks
Sri

On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <[hidden email]> wrote:
Which I already did in my POM, but it is still not working.

Thanks
Sri

On Fri, 12 Mar 2021 at 06:18, Chesnay Schepler <[hidden email]> wrote:
The concept of plugins does not exist in 1.8.1. As a result it should be sufficient for your use-case to add a dependency on flink-s3-fs-hadoop to your project.
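In Maven terms, that dependency would look something like the following (using the 1.8.1 version discussed in this thread):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-s3-fs-hadoop</artifactId>
    <version>1.8.1</version>
</dependency>
```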

On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
Let's close this issue, guys; please answer my questions. I am using Flink 1.8.1.

Thanks
Sri

On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <[hidden email]> wrote:
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see ConfigConstants.ENV_FLINK_LIB_DIR. Will that work?

Thanks
Sri

On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <[hidden email]> wrote:
I am not getting what you both are talking about; let's be clear.

A plugin? What is it? Is it a jar that I have to download from the internet and place in a folder? Is flink-s3-fs-hadoop the jar I have to download?

Will the below solution work?

Thanks
Sri



On Wed, Mar 10, 2021 at 11:34 AM Chesnay Schepler <[hidden email]> wrote:
Well, you could do this before running the job:

// set the ConfigConstants.ENV_FLINK_PLUGINS_DIR environment variable, pointing to a directory containing the plugins
// (imports: org.apache.flink.configuration.Configuration, org.apache.flink.core.fs.FileSystem,
//  org.apache.flink.core.plugin.PluginManager, org.apache.flink.core.plugin.PluginUtils)

PluginManager pluginManager = PluginUtils.createPluginManagerFromRootFolder(new Configuration());
FileSystem.initialize(new Configuration(), pluginManager);

On 3/10/2021 8:16 PM, Lasse Nedergaard wrote:
Hi. 

I had the same problem. Flink uses a plugin to access S3. When you run locally, it starts a mini cluster, and the mini cluster doesn't load plugins, so it's not possible without modifying Flink. In my case I wanted to investigate savepoints through the Flink processor API, and the workaround was to build my own version of the processor API and include the missing part.

Med venlig hilsen / Best regards
Lasse Nedergaard


On 10 Mar 2021, at 17:33, sri hari kali charan Tummala [hidden email] wrote:


Flink,

I am able to access Kinesis from IntelliJ but not S3. I have edited my Stack Overflow question with the Kinesis code; Flink is still having issues reading S3.



Thanks
Sri

On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <[hidden email]> wrote:

On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <[hidden email]> wrote:

Hi Flink Experts,

I am trying to read an S3 file from my IntelliJ using Flink. I am coming across an AWS auth error; can someone help? All the details are below.
   
I have AWS credentials in ~/.aws/credentials
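For reference, a default profile in that file typically looks like the following (the key values here are placeholders, not real credentials):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
# aws_session_token is only needed for temporary (STS) credentials
# aws_session_token = YOUR_SESSION_TOKEN
```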

My Intellij Environment Variables:-
ENABLE_BUILT_IN_PLUGINS=flink-s3-fs-hadoop-1.8.1
FLINK_CONF_DIR=/Users/Documents/FlinkStreamAndSql-master/src/main/resources/flink-config

flink-conf.yaml file content:-
fs.hdfs.hadoopconf: /Users/blah/Documents/FlinkStreamAndSql-master/src/main/resources/hadoop-config
core-site.xml file content:-
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
    </property>

    <property>
        <name>fs.s3.buffer.dir</name>
        <value>/tmp</value>
    </property>

    <property>
        <name>fs.s3a.server-side-encryption-algorithm</name>
        <value>AES256</value>
    </property>

    <!--<property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SharedInstanceProfileCredentialsProvider</value>
    </property>-->

    <property>
        <name>fs.s3a.aws.credentials.provider</name>
        <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
    </property>
    <property>
        <name>fs.s3a.access.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.session.token</name>
        <value></value>
    </property>

    <property>
        <name>fs.s3a.proxy.host</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.port</name>
        <value>8099</value>
    </property>
    <property>
        <name>fs.s3a.proxy.username</name>
        <value></value>
    </property>
    <property>
        <name>fs.s3a.proxy.password</name>
        <value></value>
    </property>

</configuration>
POM.xml file:-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>FlinkStreamAndSql</groupId>
    <artifactId>FlinkStreamAndSql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.1.3</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.13.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-jdbc_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kinesis_2.11</artifactId>
            <version>1.8.0</version>
            <scope>system</scope>
            <systemPath>${project.basedir}/Jars/flink-connector-kinesis_2.11-1.8-SNAPSHOT.jar</systemPath>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>amazon-kinesis-client</artifactId>
            <version>1.8.8</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-kinesis</artifactId>
            <version>1.11.579</version>
        </dependency>

        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.2.2</version>
        </dependency>

        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.1</version>
        </dependency>

        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.4</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-csv -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.7</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.4.1</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.579</version>
        </dependency>


        <!-- For Parquet -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>1.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-twitter_2.10</artifactId>
            <version>1.1.4-hadoop1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_2.11</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.json4s</groupId>
            <artifactId>json4s-jackson_2.11</artifactId>
            <version>3.6.7</version>
        </dependency>

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-cloudsearch</artifactId>
            <version>1.11.500</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop2</artifactId>
            <version>2.8.3-1.8.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.8.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.5</version>
        </dependency>


    </dependencies>

</project>

Scala Code:-
package com.aws.examples.s3


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.{DataSet, ExecutionEnvironment}
import org.apache.flink.table.api.{Table, TableEnvironment}
import org.apache.flink.table.api.java.BatchTableEnvironment
import org.apache.flink.table.sources.CsvTableSource

object Batch {

  def main(args: Array[String]): Unit = {
    
    val env: ExecutionEnvironment =
      ExecutionEnvironment.getExecutionEnvironment
    val tableEnv: BatchTableEnvironment =
      TableEnvironment.getTableEnvironment(env)
    /* create table from csv */

    val tableSrc = CsvTableSource
      .builder()
      .path("s3a://bucket/csvfolder/avg.txt")
      .fieldDelimiter(",")
      .field("date", Types.STRING)
      .field("month", Types.STRING)
      .field("category", Types.STRING)
      .field("product", Types.STRING)
      .field("profit", Types.INT)
      .build()

    tableEnv.registerTableSource("CatalogTable", tableSrc)

    val catalog: Table = tableEnv.scan("CatalogTable")
    /* querying with Table API */

    val order20: Table = catalog
      .filter("category === 'Category5'")
      .groupBy("month")
      .select("month, profit.sum as sum")
      .orderBy("sum")

    val order20Set: DataSet[Row1] = tableEnv.toDataSet(order20, classOf[Row1])

    order20Set.writeAsText("src/main/resources/table1/table1")

    //tableEnv.toAppendStream(order20, classOf[Row]).writeAsText("/home/jivesh/table")
    env.execute("State")

  }

  class Row1 {

    var month: String = _

    var sum: java.lang.Integer = _

    override def toString(): String = month + "," + sum

  }

}
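For reference, the `Row1` result class above only works with `toDataSet` if it is a valid Flink POJO: a (statically accessible) class with a public no-argument constructor and public mutable fields whose names match the selected columns (`month` and `sum` here). A minimal standalone sketch of that contract, with no Flink dependencies:

```scala
// Standalone sketch of the POJO contract expected by toDataSet.
// Field names must match the selected columns: "month", "profit.sum as sum".
class Row1 {
  var month: String = _            // matches the "month" column
  var sum: java.lang.Integer = _   // matches the "sum" column

  override def toString: String = month + "," + sum
}

val r = new Row1
r.month = "Jan"
r.sum = 42
println(r.toString) // prints "Jan,42"
```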
Error:-
Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Caused by: org.apache.flink.fs.s3base.shaded.com.amazonaws.AmazonClientException:
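This error usually means the (shaded) AWS SDK walked its entire credential provider chain without finding usable keys and finally failed against the EC2 instance-metadata endpoint, which does not exist on a local machine. Note that the `core-site.xml` above pins `fs.s3a.aws.credentials.provider` to `SimpleAWSCredentialsProvider` while leaving `fs.s3a.access.key` and `fs.s3a.secret.key` empty, so that provider has nothing to offer. A minimal sketch of one alternative, pointing S3A at the `~/.aws/credentials` file instead (the provider class name comes from the AWS SDK; with `flink-s3-fs-hadoop` it may need the shaded package prefix visible in the stack trace, so verify against your jar):

```xml
<!-- core-site.xml sketch: read credentials from ~/.aws/credentials
     instead of falling through to the instance-metadata endpoint. -->
<property>
    <name>fs.s3a.aws.credentials.provider</name>
    <value>com.amazonaws.auth.profile.ProfileCredentialsProvider</value>
</property>
```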
--
Thanks & Regards
Sri Tummala


