Can I use the Avro serializer as default instead of the Kryo serializer in Flink's Scala API?


Can I use the Avro serializer as default instead of the Kryo serializer in Flink's Scala API?

shashank734
Hello Team,

As our schema evolves with changing business logic, we want to use an evolvable schema format such as Avro as the default serializer and deserializer for our Flink program and its state.

My question is: we are using the Scala API in our Flink program, but Avro by default supports Java POJOs. How can we use it from the Scala API? Do we have to use a library such as avro4s, or can we use plain Avro in our Scala Flink app, and if so, what are the steps?

Please guide.

--
Thanks Regards

SHASHANK AGARWAL
 ---  Trying to mobilize the things....





Re: Can I use the Avro serializer as default instead of the Kryo serializer in Flink's Scala API?

Nico Kruber
Hi Shashank,
enabling Avro as the default de/serializer for Flink should be as simple as
the following, according to [1]:

val env = StreamExecutionEnvironment.getExecutionEnvironment
// force Flink to use Avro (instead of Kryo) for serializing POJO types
env.getConfig.enableForceAvro()
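
For context, a minimal end-to-end sketch of what that could look like from Scala, assuming a hypothetical UserEvent class generated by the Avro compiler from an .avsc schema (Avro-generated classes are plain Java, so Scala can use them directly):

import org.apache.flink.streaming.api.scala._

object AvroSerializationSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // Serialize POJO types with Avro instead of falling back to Kryo
    env.getConfig.enableForceAvro()

    // UserEvent is a hypothetical Avro-generated Java class (from an .avsc schema);
    // Scala can construct and call into Java classes without any wrapper.
    val events: DataStream[UserEvent] = env.fromElements(
      new UserEvent("user-1", 42L),
      new UserEvent("user-2", 7L)
    )

    events
      .keyBy(_.getUserId.toString)  // Avro string fields are CharSequence; key by the String value
      .print()

    env.execute("avro-serialization-sketch")
  }
}

This only illustrates the Scala/Java interop side; beyond the enableForceAvro() call itself it is a sketch under the stated assumptions, not a documented pattern.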

I am, however, no expert on this or on the implications of using Avro from
Scala, so I have included Gordon (cc'd), who may know more.



Nico


[1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/types_serialization.html




Re: Can I use the Avro serializer as default instead of the Kryo serializer in Flink's Scala API?

shashank734
Hi,

When I forcibly enable Avro, that code doesn't work, because some dependencies in the Flink CEP library also need the generic serializer. So I have a few more questions.

We are using Scala for our Flink program and need schema-evolution support for our managed state, because the fields in our models change.

Do case classes support that, or do we have to use Avro?

Should we generate our model classes with the Avro schema code generator and use those?

Avro generates Java classes; is it a problem to use them in our Scala program and Scala streams?

Is avro4s supported by Flink?









--
Thanks Regards

SHASHANK AGARWAL
 ---  Trying to mobilize the things....