Hi Team,
We are trying to build a data pipeline that needs two Kafka consumers, one per Kafka topic, feeding a single SNS sink. Below is sample code; however, events from one of the sources do not appear to be flowing into the cluster. We are using the union API (DataStream#union) to merge the two input sources:

    DataStream<Tuple2<String, AuditEvent>> inputStream1 = env.addSource(flinkKafkaConsumer);

In the snippet above, allStreams only pulls events from inputStream1, but the expectation is that it pulls events from both inputStream1 and inputStream2. Could you please help us understand whether this is the right approach, or whether we are missing something?

Thanks,
Sudhansu
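[Editor's note: a minimal, self-contained sketch of the two-consumers-plus-union pattern described above. The topic names, broker address, consumer group, and the use of String records in place of Tuple2<String, AuditEvent> are placeholder assumptions, and the print() call stands in for the SNS sink, which is not shown in the original post.]

    import java.util.Properties;

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    public class TwoTopicUnionJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092"); // assumption: broker address
            props.setProperty("group.id", "audit-pipeline");          // assumption: consumer group

            // One consumer per topic; topic names are placeholders.
            FlinkKafkaConsumer<String> consumer1 =
                    new FlinkKafkaConsumer<>("audit-topic-1", new SimpleStringSchema(), props);
            FlinkKafkaConsumer<String> consumer2 =
                    new FlinkKafkaConsumer<>("audit-topic-2", new SimpleStringSchema(), props);

            DataStream<String> inputStream1 = env.addSource(consumer1);
            DataStream<String> inputStream2 = env.addSource(consumer2);

            // union() merges streams of the same type into one logical stream,
            // so downstream operators see events from both topics.
            DataStream<String> allStreams = inputStream1.union(inputStream2);

            allStreams.print(); // stand-in for the SNS sink
            env.execute("two-kafka-topics-union");
        }
    }

Note that FlinkKafkaConsumer also accepts a List<String> of topics, so when both topics share the same record type, a single consumer subscribed to both is an alternative to union().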
For debugging, you can first consume data only from inputStream2, to check whether that source produces events on its own.

sudhansu069 [via Apache Flink User Mailing List archive.] <[hidden email]> wrote on Thursday, May 27, 2021 at 11:22 PM:
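[Editor's note: a minimal sketch of that debugging step, reusing the hypothetical names from the example above.]

    // Temporarily bypass the union and wire only the second source to a print sink.
    DataStream<String> inputStream2 = env.addSource(consumer2);
    inputStream2.print(); // no output here suggests the source/topic config is the problem, not union()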