On Mon, Dec 14, 2015 at 11:32 AM, Flavio Pompermaier <[hidden email]> wrote:
Hi flinkers,
I was going to evaluate whether Flink streaming could fit a use case we have, where data comes into the system, gets transformed, and is then added to a db (a very common problem).
In such a use case you have to manage the merging of existing records as new data comes in. How can you ensure that only one row/entity in the db is updated at a time with Flink?
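The merge-on-write described here is essentially an upsert: insert the record if its key is new, otherwise merge it with the existing row. As a rough illustration of those semantics only (an in-memory toy with a sum as the merge function, not Flink or JDBC API; the class and method names are made up for this sketch):

```java
import java.util.HashMap;
import java.util.Map;

public class UpsertSketch {
    // A toy stand-in for the database table: primary key -> value.
    static final Map<String, Integer> table = new HashMap<>();

    // Upsert semantics: insert if the key is absent,
    // otherwise merge with the existing record (here: sum).
    static void upsert(String key, int value) {
        table.merge(key, value, Integer::sum);
    }

    public static void main(String[] args) {
        upsert("id-1", 10);
        upsert("id-1", 5);   // merges with the existing record
        upsert("id-2", 7);
        System.out.println(table.get("id-1")); // 15
        System.out.println(table.get("id-2")); // 7
    }
}
```

In a real sink this merge step usually maps to a database-level upsert statement, e.g. `INSERT ... ON DUPLICATE KEY UPDATE` in MySQL or `INSERT ... ON CONFLICT ... DO UPDATE` in PostgreSQL, so the db itself resolves the insert-vs-update decision atomically.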
On Mon, Dec 14, 2015 at 8:18 PM, Stephan Ewen <[hidden email]> wrote:
Hi!
If the sink that writes to the database is partitioned by the primary key, then this should naturally prevent row conflicts.
Greetings, Stephan
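Stephan's suggestion amounts to keying the stream by the primary key (`keyBy` in Flink's DataStream API) so that all records for a given row are routed to the same parallel sink instance, meaning no two sink tasks ever write the same row concurrently. A minimal stand-in for that routing guarantee, assuming a simple hash-based partitioner (the class and method names here are illustrative, not Flink's internals):

```java
public class KeyedPartitioning {
    // Stand-in for key-based routing: records with equal keys always
    // map to the same subtask index, so one subtask owns each row.
    static int partitionFor(String primaryKey, int parallelism) {
        return Math.floorMod(primaryKey.hashCode(), parallelism);
    }

    public static void main(String[] args) {
        int parallelism = 4;
        int owner = partitionFor("row-42", parallelism);
        // Every update for "row-42" lands on the same subtask,
        // regardless of how many times it arrives.
        for (int i = 0; i < 10; i++) {
            if (partitionFor("row-42", parallelism) != owner) {
                throw new AssertionError("routing must be deterministic");
            }
        }
        System.out.println("row-42 -> subtask " + owner);
    }
}
```

Because each sink subtask then sees every update for its keys in arrival order, per-row writes are serialized without any explicit locking in the sink.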