Hi Flink community,

I'm wondering whether it is possible to change the sink topology dynamically (while the job is running) by adding a new sink on the fly. Say we have an input stream and, by analyzing a message property, we decide which Kafka topic each message should go to, i.e. chosen_topic = function(message.property).

Simplifying: sink_topic = "logger_group_" + message.groupId. When the job is launched, we don't know the list of all possible groupIds. These sink topics (and groupIds) are created (and even removed) dynamically by external rules over the entire lifetime of the job.

Best,
Sergey
Hi Sergey,

I would not consider this to be a topology change (the sink operator would still be a Kafka producer). It seems that dynamic topic selection is possible with a KeyedSerializationSchema (check "Advanced Serialization Schema" [1]).

Best,
Fabian

On Thu, Jun 6, 2019 at 13:50, Smirnov Sergey Vladimirovich <[hidden email]> wrote:
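To make the suggestion concrete, here is a minimal sketch of per-record topic routing. The KeyedSerializationSchema interface is redeclared inline (mirroring Flink's pre-1.9 org.apache.flink.streaming.util.serialization version) so the sketch compiles without Flink on the classpath; the Message type and its groupId field are hypothetical stand-ins for the question's message property. In a real job you would pass such a schema to a FlinkKafkaProducer, and topics that do not yet exist would be created on first write only if the broker permits auto-creation.

```java
import java.io.Serializable;
import java.nio.charset.StandardCharsets;

public class DynamicTopicExample {

    // Mirrors Flink's (pre-1.9) KeyedSerializationSchema; redeclared here
    // so the sketch is self-contained.
    interface KeyedSerializationSchema<T> extends Serializable {
        byte[] serializeKey(T element);
        byte[] serializeValue(T element);
        // Returning null falls back to the producer's default topic.
        String getTargetTopic(T element);
    }

    // Hypothetical message type carrying the groupId from the question.
    static class Message implements Serializable {
        final String groupId;
        final String payload;
        Message(String groupId, String payload) {
            this.groupId = groupId;
            this.payload = payload;
        }
    }

    // Routes each record to "logger_group_" + groupId, computed per message,
    // so the set of target topics need not be known when the job starts.
    static class LoggerGroupSchema implements KeyedSerializationSchema<Message> {
        @Override public byte[] serializeKey(Message m) {
            return m.groupId.getBytes(StandardCharsets.UTF_8);
        }
        @Override public byte[] serializeValue(Message m) {
            return m.payload.getBytes(StandardCharsets.UTF_8);
        }
        @Override public String getTargetTopic(Message m) {
            return "logger_group_" + m.groupId;
        }
    }

    public static void main(String[] args) {
        LoggerGroupSchema schema = new LoggerGroupSchema();
        Message msg = new Message("42", "hello");
        System.out.println(schema.getTargetTopic(msg)); // logger_group_42
    }
}
```

Note that this changes only where records are written, not the job graph itself, which is why it does not count as a topology change.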
Great, thanks!