Producing binary Avro to Kafka

Producing binary Avro to Kafka

Elliot West
Hello,

What is the recommended Flink streaming approach for serialising a POJO to Avro according to a schema, and pushing the resulting byte array into a Kafka sink? Also, is there any existing approach for prepending the schema id to the payload (following the Confluent pattern)?

Thanks,

Elliot.

Re: Producing binary Avro to Kafka

Suneel Marthi
This was presented at Flink Forward Berlin 2017 - see the slide deck here: https://smarthi.github.io/flink-forward-berlin-2017-moving-beyond-moving-bytes/#/19

You should be able to leverage the Confluent or Hortonworks schema registries from Flink pipelines.
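As a rough sketch of what that wiring could look like (not from the original thread): a hand-rolled Flink SerializationSchema that writes the Confluent wire format - magic byte 0x0, a 4-byte big-endian schema id, then the Avro binary payload. The class name ConfluentWireFormatSchema, the pre-registered schemaId, and the MyPojo type used later are illustrative assumptions; a real pipeline would normally fetch the id from the schema registry client rather than hard-coding it.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.avro.Schema;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.reflect.ReflectData;
import org.apache.avro.reflect.ReflectDatumWriter;
import org.apache.flink.api.common.serialization.SerializationSchema;

// Hypothetical sketch: serialises a POJO to Avro and prepends the Confluent
// framing (magic byte + 4-byte schema id) before handing the bytes to Kafka.
public class ConfluentWireFormatSchema<T> implements SerializationSchema<T> {

    private final Class<T> type;
    private final int schemaId;                       // assumed already registered in the registry
    private transient Schema schema;
    private transient ReflectDatumWriter<T> writer;

    public ConfluentWireFormatSchema(Class<T> type, int schemaId) {
        this.type = type;
        this.schemaId = schemaId;
    }

    @Override
    public byte[] serialize(T element) {
        if (writer == null) {
            // Derive the Avro schema from the POJO via reflection; a generated
            // SpecificRecord with SpecificDatumWriter would work the same way.
            schema = ReflectData.get().getSchema(type);
            writer = new ReflectDatumWriter<>(schema);
        }
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(0);                                                 // Confluent magic byte
            out.write(ByteBuffer.allocate(4).putInt(schemaId).array());   // 4-byte schema id
            BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(out, null);
            writer.write(element, encoder);
            encoder.flush();
            return out.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException("Failed to Avro-serialize record", e);
        }
    }
}

Plugging it into a Kafka sink is then a one-liner (topic name and POJO class are placeholders):

stream.addSink(new FlinkKafkaProducer<>("my-topic",
        new ConfluentWireFormatSchema<>(MyPojo.class, schemaId), kafkaProps));

Depending on your Flink version, the flink-avro-confluent-registry module may already ship registry-aware (de)serialization schemas that handle this framing for you, so check there before rolling your own.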
