Multiple Elasticsearch sinks not working in Flink

10 messages

Multiple Elasticsearch sinks not working in Flink

Teena Kappen // BPRISE

Hi,

I am running Flink 1.4 on a single node. My job has two Kafka consumers reading from separate topics. After fetching the data, the job writes it to two separate Elasticsearch sinks, so the pipeline looks like this:

KafkaTopic1 -> KafkaConsumer1 -> create output record -> ElasticsearchSink1

KafkaTopic2 -> KafkaConsumer2 -> create output record -> ElasticsearchSink2

The two streams and their processing are completely unrelated. The first sink works as expected and writes the output for all input records. The second sink writes to Elasticsearch only once and then stops, even when more data is fed into Kafka; sometimes it does not write even once. We reproduced this in two other jobs, and the same issue occurs in all of them.

I have attached sample code that illustrates the issue. We are using Elasticsearch version 5.6.4, so the dependency used is ‘flink-connector-elasticsearch5_2.11’.

Regards,

Teena

Attachment: Flink-ElasticSearch-MultipleSinks.txt (6K)

Re: Multiple Elasticsearch sinks not working in Flink

Timo Walther
Hi Teena,

What happens if you replace the second sink with a non-Elasticsearch sink? Do you see the same result? Is the data read from KafkaTopic2?

We should determine which system is the bottleneck.

Regards,
Timo




RE: Multiple Elasticsearch sinks not working in Flink

Teena Kappen // BPRISE

Hi Timo,

 

It works fine when the second sink is a Cassandra sink. The data is read from KafkaTopic2 and written to Cassandra as expected.

 

Regards,

Teena

 



Re: Multiple Elasticsearch sinks not working in Flink

Fabian Hueske-2
Hi Teena,

I created FLINK-8489 [1] to track the issue.
Please have a look and add any information that might be relevant.

Best, Fabian

[1] https://issues.apache.org/jira/browse/FLINK-8489




RE: Multiple Elasticsearch sinks not working in Flink

Teena Kappen // BPRISE

Thanks Fabian. I will go through it and add info if required.

 



Re: Multiple Elasticsearch sinks not working in Flink

Stephan Ewen
As mentioned in the issue, please check whether using two separate config map objects solves the problem.
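For context on why two separate config map objects can matter: if a sink's constructor removes the entries it consumes from the map the caller passed in, a second sink built from the same map instance sees those entries missing. The sketch below is a simplified, self-contained illustration of that failure mode and the workaround; `MutatingSink` is a hypothetical stand-in, not the actual Flink `ElasticsearchSink`, although the config key name is borrowed from the Flink Elasticsearch connector.

```java
import java.util.HashMap;
import java.util.Map;

public class SharedConfigMapDemo {

    // Hypothetical sink that, like some connectors, strips the settings it
    // handles itself out of the user-supplied map, mutating the caller's map.
    static class MutatingSink {
        final String flushSetting;

        MutatingSink(Map<String, String> userConfig) {
            // remove() both reads the entry and deletes it from the shared map.
            this.flushSetting = userConfig.remove("bulk.flush.max.actions");
        }
    }

    public static void main(String[] args) {
        // Problematic pattern: one map instance shared by both sinks.
        Map<String, String> shared = new HashMap<>();
        shared.put("bulk.flush.max.actions", "1");
        MutatingSink first = new MutatingSink(shared);
        MutatingSink second = new MutatingSink(shared);
        System.out.println("shared map, first sink:  " + first.flushSetting);
        System.out.println("shared map, second sink: " + second.flushSetting);

        // Workaround: give each sink its own map (or a defensive copy).
        Map<String, String> configA = new HashMap<>();
        configA.put("bulk.flush.max.actions", "1");
        Map<String, String> configB = new HashMap<>(configA); // independent copy
        MutatingSink third = new MutatingSink(configA);
        MutatingSink fourth = new MutatingSink(configB);
        System.out.println("separate maps: " + third.flushSetting
                + ", " + fourth.flushSetting);
    }
}
```

With the shared map, the second sink's setting comes back null because the first sink's constructor already removed it; with separate maps, both sinks see their own value. The same reasoning explains why constructing both Elasticsearch sinks from distinct config maps is a sensible thing to try here.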




Re: Multiple Elasticsearch sinks not working in Flink

Fabian Hueske-2
Hi Teena,

A potential fix for the issue has been merged: https://issues.apache.org/jira/browse/FLINK-8489
It would be great if you could check whether it fixes the problem and report back.

Thank you,
Fabian





RE: Multiple Elasticsearch sinks not working in Flink

Teena Kappen // BPRISE

@Fabian: I will run the code against the Git repo source and let you know the results.

@Stephan: Sorry, I somehow missed your email. I understand from the JIRA link that you have already identified the cause. I tried using two separate config map objects in my code, and that resolved the issue: both sinks wrote to Elasticsearch as expected.

 

Thank you for taking this up. I will report back on the test results soon.

 

Regards,

Teena

 

From: Fabian Hueske [mailto:[hidden email]]
Sent: 31 January 2018 19:41
To: Stephan Ewen <[hidden email]>
Cc: Teena Kappen // BPRISE <[hidden email]>; Timo Walther <[hidden email]>; [hidden email]
Subject: Re: Multiple Elasticsearch sinks not working in Flink

 

Hi Teena,

a potential fix for the issue has been merged: https://issues.apache.org/jira/browse/FLINK-8489

It would be great if you could check if that fixes the problem and report back.

Thank you,

Fabian

 

2018-01-23 20:04 GMT+01:00 Stephan Ewen <[hidden email]>:

As mentioned in the issue, please check if using two different config map objects solves the issue.

 

On Tue, Jan 23, 2018 at 1:32 PM, Teena Kappen // BPRISE <[hidden email]> wrote:

Thanks Fabian. I will go through it and add info if required.

 

From: Fabian Hueske [mailto:[hidden email]]
Sent: 23 January 2018 15:20
To: Teena Kappen // BPRISE <[hidden email]>
Cc: Timo Walther <[hidden email]>; [hidden email]


Subject: Re: Multiple Elasticsearch sinks not working in Flink

 

Hi Teena,

I created FLINK-8489 [1] to track the issue.

Please have a look and add information that might be relevant.

 

Best, Fabian

 

2018-01-18 14:16 GMT+01:00 Teena Kappen // BPRISE <[hidden email]>:

Hi Timo,

 

It works fine when the second sink is a Cassandra Sink. The data gets read from KafkaTopic2 and it gets written to Cassandra as expected.

 

Regards,

Teena

 

From: Timo Walther [mailto:[hidden email]]
Sent: 18 January 2018 18:41
To: [hidden email]
Subject: Re: Multiple Elasticsearch sinks not working in Flink

 

Hi Teena,

what happens if you replace the second sink with a non-ElasticSearchSink? Is there the same result? Is the data read from the KafkaTopic2?

We should determine which system is the bottleneck.

Regards,
Timo


Am 1/18/18 um 9:53 AM schrieb Teena Kappen // BPRISE:

Hi,

 

I am running flink 1.4 in single node. My job has two Kafka consumers reading from separate topics. After fetching the data, the job writes it to two separate Elasticsearch sinks. So the process is like this

 

KafkaTopic1 -> Kafkaconsumer1 -> create output record -> Elasticsearchsink1

KafkaTopic2 -> Kafkaconsumer2 -> create output record -> Elasticsearchsink2

 

Both the streams and their processing are completely unrelated. The first sink works as expected and it writes the output for all input records. The second sink writes to Elasticsearch only once and after that it stops writing to Elasticsearch even if there is more data that gets fed into Kafka. Sometimes, it does not even write once. We tested this in two other jobs and the same issue is there in all of them.

 

I have attached a sample code I had created to illustrate the issue. We are using Elasticsearch version 5.6.4 and hence the dependency used is ‘flink-connector-elasticsearch5_2.11’.

 

Regards,

Teena

 

 

 

 

 

 

 

 


RE: Multiple Elasticsearch sinks not working in Flink

Teena Kappen // BPRISE

Hi Fabian,

 

We tried the fix that was merged and the sinks are working correctly now. Thank you for resolving the issue.

 

Regards,

Teena

 



Re: Multiple Elasticsearch sinks not working in Flink

Fabian Hueske-2
Great, thanks for the feedback!

Best, Fabian
