Hi,
In https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/formats/,
the page does not mention a Protobuf format. Does Flink SQL support Protobuf? If not, is there any plan to support it in the near future?
Thanks!
Hello Shipeng, I am not an expert in Flink, just sharing some of my thoughts; maybe others can give you better ideas. As far as I know, there is no directly available Protobuf support for Flink SQL. However, you can write a user-defined format to support it [1].
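As a rough illustration of what such a user-defined format involves, a minimal factory skeleton against Flink 1.12's table API might look like the sketch below. The class name, the `my-protobuf` identifier, and the omitted decoding logic are all hypothetical, not an existing Flink format:

```java
import java.util.Collections;
import java.util.Set;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.configuration.ConfigOption;
import org.apache.flink.configuration.ReadableConfig;
import org.apache.flink.table.connector.format.DecodingFormat;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.factories.DeserializationFormatFactory;
import org.apache.flink.table.factories.DynamicTableFactory;

// Hypothetical sketch of a user-defined Protobuf format factory. The
// actual decoding logic (Protobuf bytes -> RowData) is omitted and would
// have to be written against your protoc-generated classes.
public class ProtobufFormatFactory implements DeserializationFormatFactory {

    @Override
    public DecodingFormat<DeserializationSchema<RowData>> createDecodingFormat(
            DynamicTableFactory.Context context, ReadableConfig formatOptions) {
        // Return a DecodingFormat whose DeserializationSchema parses
        // Protobuf-encoded bytes into RowData rows.
        throw new UnsupportedOperationException("sketch only");
    }

    @Override
    public String factoryIdentifier() {
        // Referenced as 'format' = 'my-protobuf' in CREATE TABLE DDL.
        return "my-protobuf";
    }

    @Override
    public Set<ConfigOption<?>> requiredOptions() {
        return Collections.emptySet();
    }

    @Override
    public Set<ConfigOption<?>> optionalOptions() {
        return Collections.emptySet();
    }
}
```

Note that the factory also has to be registered via Java SPI, i.e. listed in `META-INF/services/org.apache.flink.table.factories.Factory`, before the `'format' = 'my-protobuf'` option can be resolved.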
If you use the DataStream API, you can leverage the Kryo serializer to serialize and deserialize data in Protobuf format; there is an out-of-the-box Protobuf integration described in [2]. You would then need to convert the resulting DataStream to a table for Flink SQL after ingestion.
[2] https://ci.apache.org/projects/flink/flink-docs-stable/dev/custom_serializers.html
Best, Fuyao
Hi Shipeng, it looks like there is an open Jira issue FLINK-18202 [1] addressing this topic. You might want to follow up on that one. I'm adding Timo and Jark to this thread. They might have more insights. Best, Matthias On Sat, May 1, 2021 at 2:00 AM Fuyao Li <[hidden email]> wrote:
Hi Shipeng, Matthias is correct. FLINK-18202 should address this topic. There is already a pull request there that is in good shape. You can also check out the PR and build the format jar yourself; it should then work with Flink 1.12. Best, Jark On Mon, 3 May 2021 at 21:41, Matthias Pohl <[hidden email]> wrote:
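For reference, building the format jar from an open PR might look like the following sketch. The PR number is not given in the thread, and the `flink-formats/flink-protobuf` module path is an assumption for illustration; check FLINK-18202 for the actual branch and module:

```shell
# Hypothetical build sketch: substitute the real PR number from
# FLINK-18202 and adjust the module path to whatever the PR adds.
git clone https://github.com/apache/flink.git
cd flink
git fetch origin pull/<PR_NUMBER>/head:protobuf-format
git checkout protobuf-format
mvn clean package -DskipTests -pl flink-formats/flink-protobuf -am
# Then put the resulting format jar on the classpath of your
# Flink 1.12 deployment (e.g. under lib/).
```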