How to handle exceptions in Kafka sink?


HarshithBolar
Hi all,

I have a Flink job that writes data into Kafka. The Kafka topic has its maximum
message size set to 5 MB, so if I try to write any record larger than that, the
producer throws the following exception and brings the job down.

<http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/file/t1476/Screen_Shot_2018-09-13_at_1.png>
Now I have checkpointing configured in my job, so when the job fails, it
restarts. The problem is that every time it restarts, it fails on the same
record and goes into an infinite loop of failures and restarts. Is there a
way to handle this Kafka exception in my code so that it doesn't bring down
the entire job?
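For context, one workaround that is sometimes suggested for this situation (a sketch under assumptions, not something confirmed in this thread) is to filter out oversized records upstream of the Kafka sink, e.g. with a `filter()` before `addSink()`, so the producer never sees a record it cannot write. The size check itself is plain Java; the `RecordSizeGuard` class and its limits below are hypothetical, chosen to match the 5 MB topic limit described above:

```java
import java.nio.charset.StandardCharsets;

// Hypothetical helper (not from the thread): decides whether a record's
// serialized size fits under the topic's max message size, so oversized
// records can be dropped (or routed elsewhere) before the Kafka sink
// instead of crashing the job.
public class RecordSizeGuard {
    // The broker's limit applies to the serialized record as a whole,
    // so leave some headroom for the key, headers, and protocol overhead.
    static final long MAX_MESSAGE_BYTES = 5L * 1024 * 1024;
    static final long SAFETY_MARGIN_BYTES = 64L * 1024;

    public static boolean fits(String value) {
        // Measure the UTF-8 byte length, not the char count: multi-byte
        // characters make these differ.
        long size = value.getBytes(StandardCharsets.UTF_8).length;
        return size <= MAX_MESSAGE_BYTES - SAFETY_MARGIN_BYTES;
    }
}
```

In a Flink job this could be applied as `stream.filter(RecordSizeGuard::fits).addSink(kafkaProducer)`; records that fail the check could be sent to a side output or a dead-letter topic for inspection rather than silently dropped.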

Thanks,
Harshith

--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/