Flink SQL dynamic configuration

Flink SQL dynamic configuration

Jaqie Chan
Hello,

I use the Flink SQL API to process a data stream from Kafka. To process the data, I use some configuration loaded from an HTTP endpoint once at initialization.

Since the configuration is loaded only once, at job initialization, this works well with a static configuration but does not handle dynamic changes.

How can I handle dynamic configuration without having to reload the configuration for each message?

Thanks
嘉琪
Re: Flink SQL dynamic configuration

Till Rohrmann
Hi Jaqie,

I am not sure whether this is easily possible with Flink's SQL API, but if you used the DataStream API directly you could create a connected stream with two inputs. One input would be the normal message stream and the other the configuration stream. Whenever there is a configuration change, you would stream it into your application (e.g. by writing it to Kafka), and the connected stream operators could then apply the configuration change.

Cheers,
Till
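Till's suggestion can be sketched in plain Java. This is only an illustration of the two-input pattern, not a runnable Flink job: in a real job this logic would live in a `CoProcessFunction` (or a `BroadcastProcessFunction` using broadcast state), with the configuration stream read from a Kafka topic. The class and method names here are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of a two-input ("connected stream") operator in plain Java.
// onConfig() plays the role of processElement2 (the configuration input);
// onMessage() plays the role of processElement1 (the normal message input).
public class ConfigurableEnricher {
    // Latest configuration, replaced whenever a config event arrives.
    private final Map<String, String> config = new ConcurrentHashMap<>();

    // Configuration input: update the current configuration in place.
    public void onConfig(String key, String value) {
        config.put(key, value);
    }

    // Message input: apply whatever configuration is currently in effect.
    public String onMessage(String message) {
        String prefix = config.getOrDefault("prefix", "");
        return prefix + message;
    }

    public static void main(String[] args) {
        ConfigurableEnricher enricher = new ConfigurableEnricher();
        System.out.println(enricher.onMessage("a")); // no config applied yet
        enricher.onConfig("prefix", "v2:");          // a config change arrives
        System.out.println(enricher.onMessage("b")); // new config now applied
    }
}
```

In an actual Flink job the configuration would typically be kept in broadcast state rather than a plain map, so that every parallel instance of the operator sees the same configuration updates.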

Re: Flink SQL dynamic configuration

Jaqie Chan
Thanks for your help, Till. I appreciate it.
