Hi,
Flink Table & SQL has a testing suite for integration tests.
You can have a look at `org.apache.flink.table.planner.runtime.stream.sql.AggregateITCase`.
We use a bounded source stream and a testing sink to collect and verify the results.
To access the testing utilities, add the following dependency:
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
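
For reference, a test in that style might look roughly like the minimal sketch below. It is only an illustration, not code from the Flink repository: the class name MyJoinITCase, the table names, the data, and the query are made up, the package paths shown (StreamingTestBase and TestingAppendSink under org.apache.flink.table.planner.runtime.utils) are those of the Blink planner introduced in 1.9, and the comma-separated row string format varies between Flink versions.

import org.apache.flink.api.scala._
import org.apache.flink.table.api.scala._
import org.apache.flink.table.planner.runtime.utils.{StreamingTestBase, TestingAppendSink}
import org.apache.flink.types.Row
import org.junit.Assert.assertEquals
import org.junit.Test

// Hypothetical IT case: StreamingTestBase provides the `env` and `tEnv` fields.
class MyJoinITCase extends StreamingTestBase {

  @Test
  def testJoin(): Unit = {
    // Bounded sources: finite collections, so env.execute() terminates on its own.
    val left = env.fromCollection(Seq((1, "a"), (2, "b"))).toTable(tEnv, 'id, 'name)
    val right = env.fromCollection(Seq((1, 100), (2, 200))).toTable(tEnv, 'id2, 'amount)
    tEnv.registerTable("L", left)
    tEnv.registerTable("R", right)

    // Testing sink that collects every emitted row as a string.
    val sink = new TestingAppendSink
    tEnv.sqlQuery("SELECT name, amount FROM L JOIN R ON L.id = R.id2")
      .toAppendStream[Row]
      .addSink(sink)
    env.execute()

    // Sort both sides so the assertion is independent of arrival order.
    assertEquals(List("a,100", "b,200").sorted, sink.getAppendResults.sorted)
  }
}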
Best,
Jark
Hello,
I have built a small Flink app which receives events (JSON), deserializes them into objects, and then uses the Table API to create two tables, perform a join, and write the results back to a Kafka stream.
What is the suggested method to correctly test that the code written with the Table API works as expected? Are there any best practices for ITs?
Or is the best way simply to run the app through Flink and then test it with some external tool, such as a small Python script?
(Using Flink 1.8.3)
Thank you,