Hello,
I am pretty new to Apache Flink.
I am trying to figure out how Flink parses an Apache Calcite SQL query and
translates it into its own streaming API, so that I could perhaps extend it,
because, as far as I know, many operations are still under development and
not yet supported (such as TUMBLE windows). I also need to be able to load
queries from a file, like so:

  tableEnv.sql([File])

and to do that I need a fully functional streaming SQL parser.
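For context, this is roughly what I have in mind. It is only a minimal
sketch, assuming the Scala Table API around the commit linked in [1]; the
object name, file path and the commented-out table registration are
placeholders of mine, and the exact class and method names may differ in
other versions:

  import scala.io.Source

  import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
  import org.apache.flink.api.table.TableEnvironment

  object SqlFromFile {
    def main(args: Array[String]): Unit = {
      val env = StreamExecutionEnvironment.getExecutionEnvironment
      val tableEnv = TableEnvironment.getTableEnvironment(env)

      // a source table would have to be registered first, e.g.
      // tableEnv.registerDataStream("MyTable", someStream, 'a, 'b)

      // load the query text from a file (path is a placeholder)
      val query = Source.fromFile("/path/to/query.sql").mkString

      // sql() takes a plain String, so reading the file myself should be
      // enough; the limiting factor is which SQL features the parser and
      // translation support
      val result = tableEnv.sql(query)
    }
  }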
I am currently reading through the StreamTableEnvironment class on GitHub [1]
to understand the sql method, but I can't figure out where the parsing
actually happens. Can someone point me in the right direction?
[1]
https://github.com/apache/flink/blob/d7b59d761601baba6765bb4fc407bcd9fd6a9387/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/StreamTableEnvironment.scala
Best Regards,
Pedro Chaves