Hi,
In addition to unit tests for the individual components (map, flatMap, reduce, etc.) of my application, I would like to write tests covering the entire flow of my Flink application. My application reads from Kafka, does various processing, and writes output to both Kafka and files.

This means I need a controlled (mock) Kafka broker into which I can insert a specific sequence of messages, and a way to run my application against it so that it reads those messages and writes its output somewhere I can inspect.

What is the recommended way of doing this? What tools/example projects are available for building such tests?

Thanks.

Best regards / Met vriendelijke groeten,
Niels Basjes
Hi Niels,

Sorry for the late response. You can launch a Kafka broker within a JVM and use it for testing purposes. Flink's Kafka connector uses this approach extensively in its integration tests. Here is the code that starts the Kafka server: https://github.com/apache/flink/blob/770f2f83a81b2810aff171b2f56390ef686f725a/flink-streaming-connectors/flink-connector-kafka-0.8/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaTestEnvironmentImpl.java#L358

In our tests we also start a Flink mini cluster to submit jobs to. I hope you can find the right lines of code to copy for your purposes.

Regards,
Robert

On Fri, Oct 21, 2016 at 4:00 PM, Niels Basjes <[hidden email]> wrote:
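A minimal sketch of the embedded-broker setup Robert describes, loosely modeled on Flink's KafkaTestEnvironmentImpl. It assumes the 0.8/0.9-era Kafka broker artifact and curator-test are on the test classpath; the ports and log directory are placeholders:

import java.nio.file.Files;
import java.util.Properties;

import kafka.server.KafkaConfig;
import kafka.server.KafkaServerStartable;
import org.apache.curator.test.TestingServer;

public class EmbeddedKafkaSetup {

    public static void main(String[] args) throws Exception {
        // Kafka needs ZooKeeper; curator-test provides an in-process server.
        TestingServer zookeeper = new TestingServer(2181);

        Properties brokerProps = new Properties();
        brokerProps.put("broker.id", "0");
        brokerProps.put("zookeeper.connect", zookeeper.getConnectString());
        brokerProps.put("port", "9092");
        brokerProps.put("log.dirs", Files.createTempDirectory("kafka-logs").toString());

        // Start the broker inside this JVM.
        KafkaServerStartable broker = new KafkaServerStartable(new KafkaConfig(brokerProps));
        broker.startup();

        // ... produce the test input, run the Flink job, assert on the output ...

        broker.shutdown();
        zookeeper.stop();
    }
}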
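And a sketch of the test job itself, run on a local in-JVM execution environment (standing in for the mini cluster that Flink's own tests submit to). The topic name, output path, and consumer properties are assumptions; FlinkKafkaConsumer08 matches the connector version linked above:

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class EndToEndFlowTest {

    public static void main(String[] args) throws Exception {
        // Runs the whole job in this JVM; no external cluster needed.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();

        Properties consumerProps = new Properties();
        consumerProps.setProperty("bootstrap.servers", "localhost:9092");
        consumerProps.setProperty("zookeeper.connect", "localhost:2181");
        consumerProps.setProperty("group.id", "end-to-end-test");

        DataStream<String> input = env.addSource(
                new FlinkKafkaConsumer08<>("test-input", new SimpleStringSchema(), consumerProps));

        // ... the application's map/flatMap/reduce pipeline under test goes here ...
        input.writeAsText("/tmp/test-output");

        env.execute("end-to-end flow test");
    }
}

Note that the Kafka source is unbounded, so in a real test you would produce a known sequence of messages to test-input, run env.execute() in a separate thread, and cancel the job once the expected output has appeared in the file.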
Thanks. This is exactly what I needed.

Niels

On Wed, Oct 26, 2016 at 11:03 AM, Robert Metzger <[hidden email]> wrote:
Best regards / Met vriendelijke groeten,
Niels Basjes