Hi,

Is the CSV format supported for Kafka in Flink 1.10? The error says I need to specify connector.type as filesystem, but the documentation says CSV is supported for Kafka.

import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;

This code generates the following error:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath.

Reason: Required context properties mismatch.

The matching candidates:
org.apache.flink.table.sources.CsvAppendTableSourceFactory
Mismatched properties:
'connector.type' expects 'filesystem', but is 'kafka'

The following properties are requested:
connector.property-version=1
connector.topic=test-topic1
connector.type=kafka
connector.version=0.11
format.property-version=1
format.type=csv
schema.0.data-type=VARCHAR(2147483647)
schema.0.name=f0
update-mode=append

The following factories have been considered:
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
    at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
    at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
    at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:96)
    at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:52)
    ... 34 more
Hi Kant,

CSV is supported for Kafka, but you need to download the flink-csv SQL jar and load it into the SQL CLI using `--library`. The CSV format factory is implemented in a separate module and is not bundled by default.

On Sun, 1 Mar 2020 at 03:48, kant kodali <[hidden email]> wrote:
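For example, starting the SQL CLI with the CSV format jar loaded might look like the sketch below. The version (1.10.0) and local paths are assumptions; use the jar matching your Flink distribution.

```shell
# Download the flink-csv SQL jar for your Flink version (1.10.0 assumed here).
# The format factory lives in this separate module, which is why the CLI
# cannot find it without --library.
wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-csv/1.10.0/flink-csv-1.10.0-sql-jar.jar

# Start the SQL CLI with the jar on the classpath so the CSV format factory
# is discoverable by the table factory service.
./bin/sql-client.sh embedded --library /path/to/flink-csv-1.10.0-sql-jar.jar
```

You will also need the Kafka connector SQL jar (e.g. flink-sql-connector-kafka-0.11) loaded the same way for connector.type=kafka to resolve.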