Want to write Kafka Sink to SQL Client by Flink-1.5

Want to write Kafka Sink to SQL Client by Flink-1.5

Shivam Sharma
Hi All,

We want to write Kafka sink functionality for the Flink (1.5) SQL Client. We have read the code and chalked out a rough plan for the implementation.

Any guidance for this implementation will be very helpful.

Thanks
--
Shivam Sharma
Data Engineer @ Goibibo
Indian Institute Of Information Technology, Design and Manufacturing Jabalpur
Mobile No- (+91) 8882114744
Re: Want to write Kafka Sink to SQL Client by Flink-1.5

Rong Rong
Hi Shivam,
Thank you for your interest in contributing a Kafka sink for the SQL Client. Could you share your implementation plan? I have some questions, as there might be some overlap with current implementations.

On a higher level, 
1. Are you using some type of metadata store to host topic schemas (Kafka can essentially be schema-less)? It might be great to take a look at the TableSource/SinkFactory [1][2].
2. There are already a KafkaTableSource and a KafkaTableSink available. I am assuming you are trying to contribute to the configuration in the SQL Client, to make it easier to interact with a Kafka table?
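For context, the SQL Client reads table definitions from a YAML environment file, so a Kafka sink table could be described roughly along these lines. This is a hypothetical sketch: the table name, topic, field names, and exact property keys are illustrative, and the final key layout depends on the table factory and connector/format work discussed in this thread.

```yaml
# Hypothetical SQL Client environment file entry for a Kafka sink table.
# Property names below are illustrative, not a confirmed schema.
tables:
  - name: Results          # name the table is registered under in SQL
    type: sink
    connector:
      type: kafka
      topic: results-topic
      properties:
        bootstrap.servers: localhost:9092
    format:
      type: json           # serialization format for records written to Kafka
    schema:
      - name: user_id
        type: BIGINT
      - name: score
        type: DOUBLE
```

A matching table factory would then be discovered from these `connector`/`format` properties and instantiate the actual KafkaTableSink.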

Thanks,
Rong


On Tue, Jul 10, 2018 at 3:28 AM Shivam Sharma <[hidden email]> wrote:

Re: Want to write Kafka Sink to SQL Client by Flink-1.5

Timo Walther
Hi Shivam,

a Kafka sink for the SQL Client will be part of Flink 1.6. For this we need to provide basic interfaces that sinks can extend, as Rong mentioned (FLINK-8866). In order to support all formats that sources also support, we are also working on separating the connectors from the formats [1]. PRs for these features are ready and I'm working on integrating them right now. Once this is done and we have support for INSERT INTO in the SQL Client, a Kafka sink implementation is straightforward.
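Once INSERT INTO support lands, writing into a Kafka-backed sink table from the SQL Client would look something like the following. The table names here are purely illustrative, assuming both tables have been registered via the environment file:

```sql
-- Illustrative: 'Results' is a Kafka-backed sink table registered in the
-- SQL Client environment file; 'Transactions' is some registered source table.
INSERT INTO Results
SELECT user_id, SUM(amount) AS score
FROM Transactions
GROUP BY user_id;
```

The SQL Client would translate this into a streaming job that continuously writes the query results to the Kafka topic backing `Results`.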

Regards,
Timo


[1] https://issues.apache.org/jira/browse/FLINK-8558

On 12.07.18 at 02:45, Rong Rong wrote:


Re: Want to write Kafka Sink to SQL Client by Flink-1.5

Shivam Sharma

Awesome!!! It would be helpful if you could share those PRs.

Thanks