Hi,

I used to read data from HDFS on Hadoop 2 by adding the following dependencies:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-filesystem_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.5</version>
</dependency>

But using Hadoop 3 and the following dependencies, I got the error:

could not find a filesystem implementation for scheme 'hdfs'

<dependency>

How can I resolve that?
UPDATE: I noticed that it runs fine from IntelliJ IDEA, but packaging the fat jar and deploying it on the cluster causes the 'hdfs' scheme error!

On Thu, May 9, 2019 at 2:43 AM Soheil Pourbafrani <[hidden email]> wrote:
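One possible explanation for this exact symptom (works in the IDE, fails only in the fat jar), assuming the fat jar is built with the Maven Shade Plugin: Flink discovers filesystem implementations via Java's ServiceLoader, and shading multiple jars can let one jar's META-INF/services files overwrite another's, dropping the hdfs entry. A minimal sketch of a shade configuration that merges those service files (the plugin version is illustrative, not taken from the original post):

```xml
<!-- Sketch: merge META-INF/services/* files when building the fat jar,
     so ServiceLoader-based factories (including the hdfs filesystem)
     remain discoverable. Version number is illustrative. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Concatenates META-INF/services files from all jars
                         instead of keeping only one copy -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```

If the services files are already merged correctly, another common cause is that the Hadoop 3 client jars are simply absent from the cluster classpath at runtime.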