Hey guys,

In a university project we are storing our collected sensor data in an OpenTSDB database. I have seen that there is no existing connector for this database, but I read in the docs that it is possible to implement a custom (Batch/Streaming)TableSource. So I created a new Java class "OpenTSDBTableSource" that implements "StreamTableSource", "DefinedProctimeAttribute", "DefinedRowtimeAttributes" and "LookupableTableSource", as suggested in the docs.

I could also write a "SourceFunction" myself, query the OpenTSDB database from there and return a DataStream built from the fetched collection, but I am not sure whether this would be an efficient way. Any help is much appreciated, even if it is just a small pointer in the right direction.

Thanks in advance!

Sincerely,
Lucas
Hi, Lucas.

There was a lot of refactoring in the Table API / SQL in the last release, so the user experience is not ideal at the moment; sorry for that. You can try using the DDL syntax to create your table, as shown in [1,2]. I'm CC'ing Timo and Jark, who should be able to help you further.

Marta

[1] https://flink.apache.org/news/2020/02/20/ddl.html

On Tue, Apr 21, 2020 at 7:02 PM Lucas Kinne <[hidden email]> wrote:
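For illustration, a complete setup for the DDL approach in Flink 1.10, including the EnvironmentSettings boilerplate; the 'opentsdb' connector properties are hypothetical, since such a connector would first have to be implemented (see Jark's reply below):

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.java.StreamTableEnvironment;

    public class OpenTSDBDdlExample {
        public static void main(String[] args) {
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment();
            // Blink planner in streaming mode, as recommended for Flink 1.10.
            EnvironmentSettings settings = EnvironmentSettings.newInstance()
                    .useBlinkPlanner()
                    .inStreamingMode()
                    .build();
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(env, settings);

            // Hypothetical DDL: 'opentsdb' is not a shipped connector and would be
            // resolved through a custom TableSourceFactory (see below).
            tEnv.sqlUpdate(
                    "CREATE TABLE sensor_data (" +
                    "  metric STRING," +
                    "  ts TIMESTAMP(3)," +
                    "  val DOUBLE" +
                    ") WITH (" +
                    "  'connector.type' = 'opentsdb'," +
                    "  'connector.url' = 'http://opentsdb-host:4242'" +
                    ")");
        }
    }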
|
For the sake of brevity, the code example does not show the complete code for setting up the environment using the EnvironmentSettings class. As you can see, by comparison the same protocol is not followed when showing how to set up the environment.
Is there complete example code somewhere? Please give me a link.

On Wed, 22 Apr 2020, 09:36 Marta Paes Moreira, <[hidden email]> wrote:
|
Hi Som,

You can have a look at the documentation: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/common.html#create-a-tableenvironment

It describes how to create the different TableEnvironments based on EnvironmentSettings. EnvironmentSettings holds the settings used to set up a table environment. ExecutionEnvironment is the entry point of the DataSet API, and StreamExecutionEnvironment is the entry point of the DataStream API, so they have nothing to do with EnvironmentSettings.

Hi Lucas,

I'm sorry that the documentation is missing the piece on how to develop connectors for SQL DDL. The docs will be refined once the new connector API is ready, before the 1.11 release. If you want to develop an OpenTSDB source for SQL DDL, you should also develop a factory that implements TableSourceFactory, and add its full class path to the `META-INF/services/org.apache.flink.table.factories.TableFactory` file so that it can be discovered by the framework. You can take `KafkaTableSourceSinkFactory` [1] as an example.

Please let me know if you have other problems.

Best,
Jark

On Wed, 22 Apr 2020 at 17:51, Som Lima <[hidden email]> wrote:
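To make the factory part concrete, here is a minimal sketch following the pattern Jark describes; the 'opentsdb' connector type, the property names, and the OpenTSDBTableSource constructor are assumptions, not an existing connector:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import org.apache.flink.table.factories.StreamTableSourceFactory;
    import org.apache.flink.table.sources.StreamTableSource;
    import org.apache.flink.types.Row;

    // Lets the SQL DDL's "'connector.type' = 'opentsdb'" resolve to the
    // custom OpenTSDBTableSource from the original question.
    public class OpenTSDBTableSourceFactory implements StreamTableSourceFactory<Row> {

        @Override
        public Map<String, String> requiredContext() {
            Map<String, String> context = new HashMap<>();
            // Matched against the WITH clause of the CREATE TABLE statement.
            context.put("connector.type", "opentsdb");
            return context;
        }

        @Override
        public List<String> supportedProperties() {
            List<String> properties = new ArrayList<>();
            properties.add("connector.url");   // illustrative connector property
            properties.add("schema.#.name");   // allow arbitrary schema fields
            properties.add("schema.#.data-type");
            return properties;
        }

        @Override
        public StreamTableSource<Row> createStreamTableSource(Map<String, String> properties) {
            // Assumes OpenTSDBTableSource exposes a constructor taking the base URL.
            return new OpenTSDBTableSource(properties.get("connector.url"));
        }
    }

The service file then lists the fully qualified class name, one per line (the package here is illustrative):

    # META-INF/services/org.apache.flink.table.factories.TableFactory
    com.example.connectors.OpenTSDBTableSourceFactory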
|
Thanks for the link.
On Wed, 22 Apr 2020, 12:19 Jark Wu, <[hidden email]> wrote:
|