Hi, I've tried to specify such a schema when I read from Kafka and convert the input stream to a table, but I got the exception:

And the code is here:

Thanks!
sen
I'm sorry, the whole code is:

Thanks~
sen
Hi,
Row is a very special data type for which Flink cannot generate serializers based on the generics. By default, DeserializationSchema uses reflection-based type analysis; you need to override the getResultType() method in WormholeDeserializationSchema and specify the type information manually there.

Hope this helps.

Regards,
Timo

On 06.06.18 at 13:22, 孙森 wrote:
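Timo's advice can be sketched in a self-contained way. Note this is an illustration, not the thread's actual code: `TypeInformation`, `DeserializationSchema`, and `RowTypeInfo` below are simplified stand-ins for the real Flink classes (in current Flink the override on `DeserializationSchema` is named `getProducedType()`; the method name and the field layout here are assumptions following the thread).

```scala
// Simplified stand-ins for Flink's type system. In real Flink these live in
// org.apache.flink.api.common.serialization / typeinfo; names follow the thread.
final case class TypeInformation[T](typeName: String)

trait DeserializationSchema[T] {
  def deserialize(message: Array[Byte]): T
  // Reflection-based analysis cannot see through the generic Row type,
  // so subclasses must supply the type information manually here.
  def getResultType: TypeInformation[T]
}

// A Row is an untyped record: field types cannot be derived from generics.
final case class Row(fields: Array[Any])

object RowTypeInfo {
  // Manually assembled type information: field names plus field types.
  def of(names: Array[String], types: Array[String]): TypeInformation[Row] =
    TypeInformation[Row](
      names.zip(types).map { case (n, t) => s"$n: $t" }
        .mkString("Row(", ", ", ")"))
}

// Sketch of the thread's schema class with the manual override in place.
class WormholeDeserializationSchema extends DeserializationSchema[Row] {
  override def deserialize(message: Array[Byte]): Row =
    Row(new String(message, "UTF-8").split(",").toArray[Any])

  override def getResultType: TypeInformation[Row] =
    RowTypeInfo.of(Array("id", "name"), Array("Int", "String"))
}
```

With the override in place, downstream operators see a fully described row type instead of a generic one, which is what lets table conversion succeed.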
Sorry, I didn't see your last mail. The code actually looks good. What is the result of `inputStream.getType` if you print it to the console?

Timo

On 07.06.18 at 08:24, Timo Walther wrote:
Yes, that's a workaround. I found the cause of the problem. It is a Scala API specific problem. See: https://issues.apache.org/jira/browse/FLINK-9556

Thanks for reporting it!

Regards,
Timo

On 08.06.18 at 09:43, 孙森 wrote:

Yes, I really did override the method, but it did not work. Finally, I used ds.map()(Types.ROW()), and then it works fine, but I didn't know why. The code is
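Why `ds.map()(Types.ROW())` helps can be sketched in a self-contained way: in Flink's Scala API, `map` takes the result `TypeInformation` as an implicit parameter, so an explicit second argument list supplies it directly and bypasses the broken type extraction. The `DataStream`, `TypeInformation`, and `Types.ROW` below are simplified stand-ins for the real Flink classes, not the thread's actual code (the real `Types.ROW` takes `TypeInformation` arguments, not strings).

```scala
// Toy stand-ins: in Flink, DataStream.map[R: TypeInformation](f) receives the
// result type information implicitly; ds.map(f)(info) supplies it explicitly.
final case class TypeInformation[T](typeName: String)

final case class DataStream[T](elems: List[T], info: TypeInformation[T]) {
  // Mirrors the Scala API shape: result type info is an implicit argument.
  def map[R](f: T => R)(implicit outInfo: TypeInformation[R]): DataStream[R] =
    DataStream(elems.map(f), outInfo)
}

final case class Row(fields: Array[Any])

object Types {
  // Assumed analogue of Flink's Types.ROW(...), here keyed by type names.
  def ROW(fieldTypes: String*): TypeInformation[Row] =
    TypeInformation[Row](fieldTypes.mkString("Row(", ", ", ")"))
}
```

An identity `map` leaves the data untouched; its only effect is to attach the explicit row type information, which is why the workaround makes the downstream table conversion succeed.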