public class TestDeserializer extends MyKafkaDeserializer<MyBean> {

    public static final String END_APP_MARKER = "END_APP_MARKER";

    public TestDeserializer() {
        super(new MyParser(), new PlainMsgContentExtractor());
    }

    @Override
    public boolean isEndOfStream(ParseResult<MyBean> nextElement) {
        // Successfully parsed message: keep consuming.
        if (nextElement.getParseError() == null) {
            return false;
        }
        // Unparseable message: if its raw payload is the end marker,
        // signal end-of-stream so the source (and the job) can shut down.
        return END_APP_MARKER.equals(nextElement.getParseError().getRawData());
    }
}
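For context, here is a minimal sketch of how such a deserializer could be wired into a job. It assumes MyKafkaDeserializer implements Flink's DeserializationSchema<ParseResult<MyBean>> and that the universal flink-connector-kafka is on the classpath; the broker address and topic name are placeholders. Once the END_APP_MARKER record is consumed, the source finishes and env.execute() returns instead of blocking forever, so a test can continue to its assertions:

import java.util.Properties;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class JobRunner {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed test broker
        props.setProperty("group.id", "it-test");

        // The deserializer returns true from isEndOfStream() when the
        // END_APP_MARKER record arrives, so the source terminates cleanly.
        FlinkKafkaConsumer<ParseResult<MyBean>> source =
                new FlinkKafkaConsumer<>("input-topic", new TestDeserializer(), props);

        env.addSource(source)
           // ... real job logic and sink would go here; print() is a stand-in
           .print();

        env.execute("end-to-end test job");
    }
}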
Hi,
I am new to Flink, at least to the testing part.
We need an end-to-end integration test for a Flink job.
Where can I find documentation for this?
I am envisaging a test similar to this:
1) Start a local job instance in an IDE or a Maven test
2) Fire event JSONs at the data source (i.e. a Kafka topic)
3) Retrieve result JSONs from the data sink (i.e. a Kafka topic or an Elasticsearch index)
4) Compare the result JSONs with the expected ones
Since our Flink job is a streaming one, how can we tear down the Flink job instance running in an IDE?
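One way to handle the tear-down, sketched below, assumes the job reads from Kafka with a deserialization schema like the TestDeserializer at the top of this thread: after firing the test events, the test driver sends a final END_APP_MARKER record, the schema's isEndOfStream() returns true, and the streaming job finishes on its own. The broker address, topic name and sample JSON are placeholders:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TestDriver {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed test broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // step 2): fire the test event JSONs at the input topic
            producer.send(new ProducerRecord<>("input-topic", "{\"id\":1,\"value\":\"foo\"}"));

            // then send the marker; the deserializer turns it into end-of-stream,
            // so the streaming job terminates instead of having to be killed manually
            producer.send(new ProducerRecord<>("input-topic", "END_APP_MARKER"));
            producer.flush();
        }
        // steps 3)/4): afterwards, consume the output topic (or query Elasticsearch)
        // and compare the results with the expected JSONs.
    }
}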
Regards,
Min