Hello,
I'm trying to create a `StreamTableSource` for Snowflake using `JdbcTableSourceSinkFactory.createStreamTableSource` (in package org.apache.flink.connector.jdbc.table), but it fails with the following error because `JdbcDialects` has no dialect for Snowflake. My goal is to fully read a Snowflake table through Flink. Is there any way to work around this?

```
java.lang.IllegalStateException: Cannot handle such jdbc url: jdbc:snowflake://abc123.us-east-1.snowflakecomputing.com/?db=TEST
    at org.apache.flink.util.Preconditions.checkState(Preconditions.java:195)
    at org.apache.flink.table.descriptors.JdbcValidator.validateCommonProperties(JdbcValidator.java:79)
    at org.apache.flink.table.descriptors.JdbcValidator.validate(JdbcValidator.java:64)
    at org.apache.flink.connector.jdbc.table.JdbcTableSourceSinkFactory.getValidatedProperties(JdbcTableSourceSinkFactory.java:173)
    at org.apache.flink.connector.jdbc.table.JdbcTableSourceSinkFactory.createStreamTableSource(JdbcTableSourceSinkFactory.java:138)
```

Thanks,
Abhishek
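For context, a minimal sketch of the kind of call that hits this check, assuming the legacy `connector.*` descriptor keys; the table name and driver below are placeholders, and the `schema.*` properties needed for a complete source definition are omitted:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.connector.jdbc.table.JdbcTableSourceSinkFactory;

public class SnowflakeSourceRepro {

    public static void main(String[] args) {
        // Legacy descriptor properties for the JDBC connector (placeholder values).
        Map<String, String> properties = new HashMap<>();
        properties.put("connector.type", "jdbc");
        properties.put("connector.url",
                "jdbc:snowflake://abc123.us-east-1.snowflakecomputing.com/?db=TEST");
        properties.put("connector.table", "MY_TABLE");
        properties.put("connector.driver", "net.snowflake.client.jdbc.SnowflakeDriver");

        // Throws in JdbcValidator.validateCommonProperties because no registered
        // JdbcDialect canHandle() the jdbc:snowflake:// URL.
        new JdbcTableSourceSinkFactory().createStreamTableSource(properties);
    }
}
```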
Hello,

Unfortunately, this driver is not currently supported by the Table API [1]. You can implement a dialect for it [2] and construct a JdbcTableSource [3] manually.

Alternatively, you can switch to the DataStream API and use JdbcInputFormat [4], which doesn't require a dialect.

I'm also pulling in Jingsong Li and Jark Wu as they might know better.

[1] https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/connectors/jdbc.html
[2] https://ci.apache.org/projects/flink/flink-docs-stable/api/java/org/apache/flink/connector/jdbc/dialect/JdbcDialect.html
[3] https://ci.apache.org/projects/flink/flink-docs-stable/api/java/org/apache/flink/connector/jdbc/table/JdbcTableSource.html
[4] https://ci.apache.org/projects/flink/flink-docs-stable/api/java/org/apache/flink/connector/jdbc/JdbcInputFormat.html

Regards,
Roman
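For anyone hitting the same limitation later, a rough, untested sketch of options [2] and [3], assuming the 1.11/1.12-era flink-connector-jdbc API; import paths, the exact set of methods to implement, and the AbstractJdbcRowConverter base class used here vary between releases, and `SnowflakeDialect`, the table, columns, and credentials are made-up placeholders:

```java
import java.util.Optional;

import org.apache.flink.connector.jdbc.dialect.JdbcDialect;
import org.apache.flink.connector.jdbc.internal.converter.AbstractJdbcRowConverter;
import org.apache.flink.connector.jdbc.internal.converter.JdbcRowConverter;
import org.apache.flink.connector.jdbc.internal.options.JdbcOptions;
import org.apache.flink.connector.jdbc.table.JdbcTableSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.types.logical.RowType;

/** Hypothetical Snowflake dialect: the URL check and driver name are the Snowflake-specific parts. */
public class SnowflakeDialect implements JdbcDialect {

    @Override
    public boolean canHandle(String url) {
        // This is the check that the built-in JdbcDialects fail for jdbc:snowflake:// URLs.
        return url.startsWith("jdbc:snowflake:");
    }

    @Override
    public JdbcRowConverter getRowConverter(RowType rowType) {
        // Reuse the generic converter; add Snowflake-specific type mappings here if needed.
        return new AbstractJdbcRowConverter(rowType) {
            @Override
            public String converterName() {
                return "Snowflake";
            }
        };
    }

    @Override
    public Optional<String> defaultDriverName() {
        return Optional.of("net.snowflake.client.jdbc.SnowflakeDriver");
    }

    @Override
    public String quoteIdentifier(String identifier) {
        return "\"" + identifier + "\"";
    }

    /** Constructs a JdbcTableSource manually, bypassing JdbcTableSourceSinkFactory. */
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        JdbcOptions options = JdbcOptions.builder()
                .setDBUrl("jdbc:snowflake://abc123.us-east-1.snowflakecomputing.com/?db=TEST")
                .setTableName("MY_TABLE")
                .setDialect(new SnowflakeDialect())
                .setUsername("<user>")
                .setPassword("<password>")
                .build();

        TableSchema schema = TableSchema.builder()
                .field("ID", DataTypes.INT())
                .field("NAME", DataTypes.STRING())
                .build();

        JdbcTableSource tableSource = JdbcTableSource.builder()
                .setOptions(options)
                .setSchema(schema)
                .build();

        // Register the source so it can be queried with the Table API / SQL.
        tableEnv.registerTableSource("snowflake_table", tableSource);
        tableEnv.sqlQuery("SELECT * FROM snowflake_table").execute().print();
    }
}
```

As far as I know, the built-in JdbcDialects list is fixed in these versions, so the factory path shown in the stack trace cannot pick up a custom dialect; constructing the JdbcTableSource by hand, as sketched above, sidesteps that.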
Thanks Roman, I ended up switching to the DataStream API and using
JdbcInputFormat like you suggested, and that worked out fine. Thanks!
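For the archives, a minimal sketch of that approach, with placeholder connection details, query, and column types (adjust the RowTypeInfo to match the SELECT list):

```java
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.connector.jdbc.JdbcInputFormat;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.types.Row;

public class SnowflakeJdbcInputFormatExample {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Column types must match the SELECT list below (ID INT, NAME VARCHAR).
        RowTypeInfo rowTypeInfo = new RowTypeInfo(
                BasicTypeInfo.INT_TYPE_INFO,
                BasicTypeInfo.STRING_TYPE_INFO);

        JdbcInputFormat inputFormat = JdbcInputFormat.buildJdbcInputFormat()
                .setDrivername("net.snowflake.client.jdbc.SnowflakeDriver")
                .setDBUrl("jdbc:snowflake://abc123.us-east-1.snowflakecomputing.com/?db=TEST")
                .setUsername("<user>")
                .setPassword("<password>")
                .setQuery("SELECT ID, NAME FROM MY_TABLE")
                .setRowTypeInfo(rowTypeInfo)
                .finish();

        // Bounded read: runs the query once and emits each result row.
        DataStream<Row> rows = env.createInput(inputFormat);
        rows.print();

        env.execute("Read Snowflake table via JdbcInputFormat");
    }
}
```

Note that JdbcInputFormat produces a bounded stream: it runs the query once and then finishes, which matches the goal of fully reading a Snowflake table.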