Hi all,
I'd like to know whether there is an example of how to leverage Debezium as a CDC source to feed a Flink Table (from MySQL, for example). Best, Flavio
I'm actually thinking about this option as well. I'm assuming the correct way to implement it is to integrate Debezium's embedded engine into a source function? On Wed, Jul 17, 2019 at 7:08 PM Flavio Pompermaier <[hidden email]> wrote:
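A rough sketch of that integration, assuming the Debezium embedded API (`io.debezium.engine.DebeziumEngine`) and Flink's `RichSourceFunction`, could look like the following. This is illustrative and untested: the connection properties (host, user, password, server name) are made-up placeholders, and it skips checkpointed offset storage, which a production source would need.

```java
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.apache.flink.streaming.api.functions.source.RichSourceFunction;

import io.debezium.engine.ChangeEvent;
import io.debezium.engine.DebeziumEngine;
import io.debezium.engine.format.Json;

/** Sketch: wraps Debezium's embedded engine in a Flink source, emitting raw JSON change events. */
public class DebeziumSourceFunction extends RichSourceFunction<String> {

    private transient DebeziumEngine<ChangeEvent<String, String>> engine;
    private transient ExecutorService executor;
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        Properties props = new Properties();
        props.setProperty("name", "mysql-cdc-engine");           // hypothetical engine name
        props.setProperty("connector.class", "io.debezium.connector.mysql.MySqlConnector");
        props.setProperty("database.hostname", "localhost");     // placeholder
        props.setProperty("database.port", "3306");
        props.setProperty("database.user", "debezium");          // placeholder
        props.setProperty("database.password", "secret");        // placeholder
        props.setProperty("database.server.name", "dbserver1");
        // In-memory offsets for the sketch; real deployments need durable offset storage.
        props.setProperty("offset.storage",
                "org.apache.kafka.connect.storage.MemoryOffsetBackingStore");

        engine = DebeziumEngine.create(Json.class)
                .using(props)
                .notifying(record -> {
                    // Emit each change event as its raw JSON payload.
                    synchronized (ctx.getCheckpointLock()) {
                        ctx.collect(record.value());
                    }
                })
                .build();

        // Debezium runs on its own thread; the source thread just waits for cancellation.
        executor = Executors.newSingleThreadExecutor();
        executor.execute(engine);
        while (running) {
            Thread.sleep(100);
        }
    }

    @Override
    public void cancel() {
        running = false;
        try {
            if (engine != null) engine.close();
        } catch (Exception ignored) {
        }
        if (executor != null) executor.shutdownNow();
    }
}
```

The emitted JSON events would then have to be parsed and turned into table changes downstream, which is exactly the part Flavio is asking about below.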
I think that using Kafka to get CDC events is fine. The problem, in my case, is really about how to proceed:
1) Do I need to create Flink tables before reading CDC events, or is there a way to create Flink tables automatically when they get created via a DDL event (assuming a filter on the table names)?
2) How do I handle changes in the table structure (adding or removing columns)? Is Flink able to react to this?
3) CDC is a common use case (IMHO) and it's perfect for migrating or testing an event-driven architecture, so I'd expect Flink to easily allow querying dynamic tables coming from a database (via Debezium) without me implementing the logic to handle insert/delete/update statements.
What do you think? On Thu, 18 Jul 2019, 13:17 miki haiat <[hidden email]> wrote:
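[Editor's note for later readers: point 3 was eventually addressed upstream. Flink 1.11+ ships a `debezium-json` format for the Kafka connector that interprets Debezium change events as a changelog (INSERT/UPDATE/DELETE) rather than plain appends, so no hand-written merge logic is needed. A hedged sketch of such a DDL follows; the topic, column names, and broker address are made up, and it does not solve points 1 and 2 (tables must still be declared up front, and schema evolution is not automatic).]

```sql
-- Illustrative Flink SQL (1.11+): read Debezium change events from Kafka
-- as a changelog-backed dynamic table. Names are placeholders.
CREATE TABLE orders (
  order_id BIGINT,
  customer STRING,
  amount   DECIMAL(10, 2)
) WITH (
  'connector' = 'kafka',
  'topic' = 'dbserver1.inventory.orders',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'debezium-json'
);
```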
Anyone else with experience on this topic who could provide additional feedback here? On Thu, Jul 18, 2019 at 1:18 PM Flavio Pompermaier <[hidden email]> wrote:
Indeed, Kafka Connect is perfect, but I think Flink could easily do the same without much extra work. That's what I'm asking about: whether anybody has ever thought about it.