Scala/ReactiveMongo: type classes, macros and java.util.Serializable

stefanobaghino
Hello everybody,

Over the past few days, I've written batch input/output formats for MongoDB.

Initially, I tried to use the non-blocking ReactiveMongo driver, which uses the type class pattern in Scala for its serialization logic. The library also exposes some pretty neat macros that automatically generate the type class instances for you, given a case class.

Now, the problem is that these macros are not directly useful in Flink, because the generated type class instances would have to be serializable (something that has understandably been left out of the macros). Has anyone ever faced a similar problem? I've encountered it again when using upickle, which has a similar facility for JSON serialization.

In the end I resorted to writing my own serialization logic and explicitly extending java.io.Serializable, but I feel there may be a way to avoid this (short of rewriting/extending the macros to make the generated classes serializable).
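For context, the failure can be reproduced without either library. The following sketch assumes nothing from ReactiveMongo or upickle (Writer, User, and userWriter are made-up illustrative names): an ordinary type class instance is rejected by Java serialization, which is what Flink uses to ship user code to the cluster.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object NonSerializableDemo {
  // Stand-in for a type class like ReactiveMongo's BSONDocumentWriter.
  trait Writer[T] { def write(value: T): String }
  case class User(name: String)

  // Macro-generated instances are ordinary objects that do not extend
  // java.io.Serializable, so Java serialization rejects them.
  val userWriter: Writer[User] = new Writer[User] {
    def write(u: User): String = s"{ name: ${u.name} }"
  }

  def main(args: Array[String]): Unit = {
    val out = new ObjectOutputStream(new ByteArrayOutputStream)
    try {
      out.writeObject(userWriter)
      println("serialized fine")
    } catch {
      case _: NotSerializableException => println("NotSerializableException")
    }
  }
}
```

Running this prints "NotSerializableException": the instance works fine locally, but cannot travel with a Flink job graph.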

--
BR,
Stefano Baghino

Software Engineer @ Radicalbit

Re: Scala/ReactiveMongo: type classes, macros and java.util.Serializable

Aljoscha Krettek
Hi,
could you maybe write TypeInformation/TypeSerializer wrappers that lazily instantiate a type class-based serializer? It might even work using a "lazy val". Something like this:

class ScalaTypeSerializer[T] extends TypeSerializer[T] {
  // @transient + lazy: the instance is not shipped with the job,
  // but re-created on first use after deserialization
  @transient lazy val serializer = ??? // create the Scala serializer here
  // ... other TypeSerializer methods ...

  def serialize(value: T, out: DataOutputView): Unit = {
    serializer.serialize(value, out) // not sure how the generated serializers would be used, just a placeholder
  }
}

Cheers,
Aljoscha
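The lazy-instantiation idea above can be shown end to end without Flink. In this self-contained sketch (Writer, User, SerializableWriter, and makeWriter are illustrative names, not Flink or ReactiveMongo API), the wrapper extends java.io.Serializable, keeps the type class instance in a @transient lazy val, and rebuilds it on first use after deserialization; the same pattern would apply inside a TypeSerializer.

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object LazyWrapperDemo {
  trait Writer[T] { def write(value: T): String }
  case class User(name: String)

  // A serializable wrapper: the non-serializable instance is @transient,
  // so it is skipped during serialization and rebuilt lazily afterwards.
  abstract class SerializableWriter[T] extends Serializable {
    protected def makeWriter(): Writer[T]
    @transient private lazy val underlying: Writer[T] = makeWriter()
    def write(value: T): String = underlying.write(value)
  }

  class UserWriter extends SerializableWriter[User] {
    // In real code this would summon the macro-generated instance,
    // e.g. via implicitly; here it is built by hand.
    protected def makeWriter(): Writer[User] = new Writer[User] {
      def write(u: User): String = s"{ name: ${u.name} }"
    }
  }

  def roundTrip[A](a: A): A = {
    val bos = new ByteArrayOutputStream
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(a); oos.close()
    new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
      .readObject().asInstanceOf[A]
  }

  def main(args: Array[String]): Unit = {
    val copy = roundTrip(new UserWriter)
    println(copy.write(User("Ada"))) // prints "{ name: Ada }"
  }
}
```

The @transient lazy val combination is the key: the field is never written out, and the lazy initializer runs again on the deserializing side, so only the (trivially serializable) wrapper class travels with the job.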

On Thu, 23 Jun 2016 at 16:23 Stefano Baghino <[hidden email]> wrote: