Hi,
We have a use case where we need to demultiplex an incoming stream into multiple output streams.
We read from one Kafka topic and, as output, produce multiple Kafka topics. The logic for generating each output topic is different and not known beforehand. Users of the system keep adding new logic, and the system then needs to produce data into the corresponding new topic by applying that logic to the incoming stream.
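To make the fan-out part concrete, below is roughly the shape we have in mind with the DataStream API. This is only a minimal sketch: the broker address, the topic names and the hard-coded routing lambda are placeholders, and that hard-coded routing is exactly the part we want to make configurable.

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DemuxSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Single input topic ("events" and the broker address are placeholders).
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("broker:9092")
                .setTopics("events")
                .setGroupId("demux")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> input =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "input");

        // One KafkaSink whose target topic is chosen per record; the routing
        // lambda stands in for the user-supplied logic we want to make pluggable.
        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("broker:9092")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopicSelector((String value) ->
                                value.startsWith("orders") ? "orders-out" : "other-out")
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .build();

        input.sinkTo(sink);
        env.execute("demux-sketch");
    }
}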
The input to the system would be a piece of logic code or a SQL statement, plus a destination Kafka topic or S3 location. The system should be able to read this configuration and start emitting to those destinations, ideally at runtime.
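Where the logic arrives as SQL, we picture something along these lines with the Table API. Again just a sketch under our assumptions: the table names, DDL options and the Route holder are invented for illustration, and the routes are hard-coded here instead of read from configuration. The part we are unsure about is whether a running job could pick up newly added routes, or whether every new route means submitting another job.

import java.util.List;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.StatementSet;
import org.apache.flink.table.api.TableEnvironment;

public class SqlRoutesSketch {

    // A user-supplied route: destination table plus the SELECT to apply.
    static final class Route {
        final String sinkTable;
        final String selectSql;
        Route(String sinkTable, String selectSql) {
            this.sinkTable = sinkTable;
            this.selectSql = selectSql;
        }
    }

    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Source table over the single input topic (DDL details are illustrative).
        tEnv.executeSql(
                "CREATE TABLE events (payload STRING) WITH ("
                        + " 'connector' = 'kafka',"
                        + " 'topic' = 'events',"
                        + " 'properties.bootstrap.servers' = 'broker:9092',"
                        + " 'scan.startup.mode' = 'latest-offset',"
                        + " 'format' = 'raw')");

        // One sink table per destination; a filesystem connector table pointing
        // at an S3 path would take the place of the Kafka one for S3 destinations.
        tEnv.executeSql(
                "CREATE TABLE orders_out (payload STRING) WITH ("
                        + " 'connector' = 'kafka',"
                        + " 'topic' = 'orders-out',"
                        + " 'properties.bootstrap.servers' = 'broker:9092',"
                        + " 'format' = 'raw')");

        // In the real system these would be read from configuration.
        List<Route> routes = List.of(
                new Route("orders_out",
                        "SELECT payload FROM events WHERE payload LIKE 'orders%'"));

        // One StatementSet submits all routes as a single job.
        StatementSet set = tEnv.createStatementSet();
        for (Route r : routes) {
            set.addInsertSql("INSERT INTO " + r.sinkTable + " " + r.selectSql);
        }
        set.execute();
    }
}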
Any guidance on whether this is possible in Flink, and some pointers on how it can be achieved, would be much appreciated.
regards,
Dhuranda