Hi,
I'm trying to write to a Postgres table, but the job fails on a column of type varchar[]. I get the following error:

Caused by: java.lang.IllegalStateException: Writing ARRAY type is not yet supported in JDBC:PostgreSQL.

After getting data from a Kafka topic, my code looks like this:

tableEnv.executeSql(
  s"""
    |create table cities (
    |  id varchar primary key,
    |  cities varchar array
    |) WITH (
    |  'connector' = 'jdbc',
    |  'url' = 'jdbc:postgresql://$postgresHost:$postgresPort/$postgresDatabase',
    |  'username' = '$postgresUsername',
    |  'password' = '$postgresPassword',
    |  'table-name' = 'public.cities'
    |)
    |""".stripMargin
)

tableEnv.executeSql(
  """insert into cities
    |select
    |  id,
    |  cities
    |from source_table""".stripMargin
)

tableEnv.execute("SomeJob")

The column `cities` is indeed the problem: changing its data type to varchar (in both the table and the stream) makes the error go away.

Can anybody suggest a workaround?

flink 1.12.1
flink-connector-jdbc_2.12-1.12.1
postgres jdbc driver 42.2.20.jre7

Thanks
Federico

--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
Hi Federico,
could you also share the full stack trace with us? According to the docs, the ARRAY type should be supported:

https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/connectors/jdbc.html#data-type-mapping

Can you also try `cities ARRAY<STRING>` in your CREATE TABLE? Maybe it is `VARCHAR` that causes the issue.

Regards,
Timo

On 30.04.21 11:07, fgahan wrote:
> I'm trying to write to a postgres table but the job fails on a column of
> type varchar[]. I get the following error:
>
> Caused by: java.lang.IllegalStateException: Writing ARRAY type is not yet
> supported in JDBC:PostgreSQL. [...]
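[A minimal sketch of the DDL variant Timo suggests, differing from the original only in the column types; the connection options are placeholders here, not real values:]

```sql
-- Same table as in the original mail, but with the array column
-- declared as ARRAY<STRING> instead of "varchar array"
create table cities (
  id STRING primary key,
  cities ARRAY<STRING>
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:postgresql://<host>:<port>/<db>',   -- placeholder
  'username' = '<user>',                            -- placeholder
  'password' = '<password>',                        -- placeholder
  'table-name' = 'public.cities'
);
```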
Hi Timo,
I'm attaching the stacktrace. I did try the array<T> syntax with a few different options (STRING, VARCHAR, CHARACTER VARYING ...) and they all end up with the same error.

Thanks,
Federico

flink_stacktrace.log
<http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/file/t3127/flink_stacktrace.log>
Hi Federico,
if ARRAY<STRING> doesn't work, this is definitely a bug, either in the documentation or in the implementation. I will loop in Jingsong Li, who can help. In any case, feel free to open a JIRA ticket already.

Regards,
Timo

On 30.04.21 14:44, fgahan wrote:
> I'm attaching the stacktrace. I did try the array<T> syntax with a few
> different options (STRING, VARCHAR, CHARACTER VARYING ...) and they all
> end up with the same error. [...]
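[Until the bug is resolved, one possible workaround follows from Federico's own observation that a plain varchar sink column writes fine: flatten the array into a delimited string before the JDBC sink. A hedged sketch — `ARRAY_JOIN` is a hypothetical user-defined scalar function that would have to be implemented and registered by the user, since Flink 1.12 SQL has no built-in array-to-string function as far as I know:]

```sql
-- Sink column declared as plain varchar (the type Federico
-- confirmed writes without error); same JDBC options as before
create table cities (
  id varchar primary key,
  cities varchar            -- flattened, e.g. 'Rome,Milan,Turin'
) WITH ( ... );

-- ARRAY_JOIN is a hypothetical UDF that concatenates the
-- array elements with the given delimiter
insert into cities
select id, ARRAY_JOIN(cities, ',') from source_table;
```

[The flattened string could be split back into a Postgres varchar[] downstream, e.g. with Postgres's own string_to_array, if the target schema must keep the array type.]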