Hi Sunny,
As Fabian said, try to see whether including the Postgres classes in the shaded jar solves the problem. If it doesn't, you're probably hitting the same problem I had with an older version of Flink (https://issues.apache.org/jira/plugins/servlet/mobile#issue/FLINK-4061), and then you have to copy the Postgres JDBC jar into the Flink lib directory.
Best,
Flavio
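For reference, pulling the driver into the shaded jar means declaring it as a regular (non-provided) dependency in the pom. A minimal sketch, assuming the standard `org.postgresql:postgresql` coordinates; the version shown is only an example, pick whichever matches your server:

```xml
<!-- PostgreSQL JDBC driver; the version here is an example, not prescriptive -->
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <version>9.4.1211</version>
</dependency>
```

If the maven-shade-plugin's exclusion list filters this artifact out, remove the corresponding exclude so the driver classes actually end up in the fat jar.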
Hi Sunny,

please avoid crossposting to all mailing lists. The [hidden email] list is for issues related to the development of Flink, not the development of Flink applications.

The error message is actually quite descriptive: Flink does not find the JDBC driver class. You need to add the driver to the classpath, for example by adding the corresponding Maven dependency to your pom file.

Fabian

2016-10-12 23:18 GMT+02:00 sunny patel <[hidden email]>:

Hi Guys,

I am facing a JDBC error, could someone please advise me on it?

$ java -version
java version "1.8.0_102"
Java(TM) SE Runtime Environment (build 1.8.0_102-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.102-b14, mixed mode)
$ scala -version
Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL
=============== Scala Code
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
import org.apache.flink.api.scala._
import org.apache.flink.api.table.typeutils.RowTypeInfo

object WordCount {
  def main(args: Array[String]) {
    val PATH = getClass.getResource("").getPath

    // set up the execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Read data from JDBC (Kylin in our case)
    val stringColumn: TypeInformation[Int] = createTypeInformation[Int]
    val DB_ROWTYPE = new RowTypeInfo(Seq(stringColumn))

    val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
      .setDrivername("org.postgresql.jdbc.Driver")
      .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
      .setUsername("MI")
      .setPassword("MI")
      .setQuery("select * FROM identity")
      .setRowTypeInfo(DB_ROWTYPE)
      .finish()

    val dataset = env.createInput(inputFormat)
    dataset.print()
    println(PATH)
  }
}
==============================
=============== POM.XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>flink-parent</artifactId>
<groupId>org.apache.flink</groupId>
<version>1.2-SNAPSHOT</version>
</parent>
<groupId>org.apache.flink.quickstart</groupId>
<artifactId>flink-scala-project</artifactId>
<version>0.1</version>
<packaging>jar</packaging>
<name>Flink Quickstart Job</name>
<url>http://www.myorganization.org</url>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
</snapshots>
</repository>
</repositories>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.1.2</flink.version>
</properties>
<!--
Execute "mvn clean package -Pbuild-jar"
to build a jar file out of this project!
How to use the Flink Quickstart pom:
a) Adding new dependencies:
You can add dependencies to the list below.
Please check if the maven-shade-plugin below is filtering out your dependency
and remove the exclude from there.
b) Build a jar for running on the cluster:
There are two options for creating a jar from this project
b.1) "mvn clean package" -> this will create a fat jar which contains all
dependencies necessary for running the jar created by this pom in a cluster.
The "maven-shade-plugin" excludes everything that is provided on a running Flink cluster.
b.2) "mvn clean package -Pbuild-jar" -> This will also create a fat-jar, but with much
nicer dependency exclusion handling. This approach is preferred and leads to
much cleaner jar files.
-->
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-jdbc</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
</dependencies>
<profiles>
<profile>
<!-- Profile for packaging correct JAR files -->
<id>build-jar</id>
<activation>
</activation>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_2.11</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_2.11</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<!-- disable the exclusion rules -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes combine.self="override"></excludes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
except Flink and its transitive dependencies. The resulting fat jar can be executed
on a cluster. Change the value of Program-Class if your program entry point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<!-- This list contains all dependencies of flink-dist
Everything else will be packaged into the fat-jar
-->
<exclude>org.apache.flink:flink-annotations</exclude>
<exclude>org.apache.flink:flink-shaded-hadoop1_2.11</exclude>
<exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
<exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
<exclude>org.apache.flink:flink-core</exclude>
<exclude>org.apache.flink:flink-java</exclude>
<exclude>org.apache.flink:flink-scala_2.11</exclude>
<exclude>org.apache.flink:flink-runtime_2.11</exclude>
<exclude>org.apache.flink:flink-optimizer_2.11</exclude>
<exclude>org.apache.flink:flink-clients_2.11</exclude>
<exclude>org.apache.flink:flink-avro_2.11</exclude>
<exclude>org.apache.flink:flink-examples-batch_2.11</exclude>
<exclude>org.apache.flink:flink-examples-streaming_2.11</exclude>
<exclude>org.apache.flink:flink-streaming-java_2.11</exclude>
<!-- Also exclude very big transitive dependencies of Flink
WARNING: You have to remove these excludes if your code relies on other
versions of these dependencies.
-->
<exclude>org.scala-lang:scala-library</exclude>
<exclude>org.scala-lang:scala-compiler</exclude>
<exclude>org.scala-lang:scala-reflect</exclude>
<exclude>com.typesafe.akka:akka-actor_*</exclude>
<exclude>com.typesafe.akka:akka-remote_*</exclude>
<exclude>com.typesafe.akka:akka-slf4j_*</exclude>
<exclude>io.netty:netty-all</exclude>
<exclude>io.netty:netty</exclude>
<exclude>commons-fileupload:commons-fileupload</exclude>
<exclude>org.apache.avro:avro</exclude>
<exclude>commons-collections:commons-collections</exclude>
<exclude>com.thoughtworks.paranamer:paranamer</exclude>
<exclude>org.xerial.snappy:snappy-java</exclude>
<exclude>org.apache.commons:commons-compress</exclude>
<exclude>org.tukaani:xz</exclude>
<exclude>com.esotericsoftware.kryo:kryo</exclude>
<exclude>com.esotericsoftware.minlog:minlog</exclude>
<exclude>org.objenesis:objenesis</exclude>
<exclude>com.twitter:chill_*</exclude>
<exclude>com.twitter:chill-java</exclude>
<exclude>commons-lang:commons-lang</exclude>
<exclude>junit:junit</exclude>
<exclude>org.apache.commons:commons-lang3</exclude>
<exclude>org.slf4j:slf4j-api</exclude>
<exclude>org.slf4j:slf4j-log4j12</exclude>
<exclude>log4j:log4j</exclude>
<exclude>org.apache.commons:commons-math</exclude>
<exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
<exclude>commons-logging:commons-logging</exclude>
<exclude>commons-codec:commons-codec</exclude>
<exclude>com.fasterxml.jackson.core:jackson-core</exclude>
<exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
<exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
<exclude>stax:stax-api</exclude>
<exclude>com.typesafe:config</exclude>
<exclude>org.uncommons.maths:uncommons-maths</exclude>
<exclude>com.github.scopt:scopt_*</exclude>
<exclude>commons-io:commons-io</exclude>
<exclude>commons-cli:commons-cli</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<artifact>org.apache.flink:*</artifact>
<excludes>
<!-- exclude shaded google but include shaded curator -->
<exclude>org/apache/flink/shaded/com/**</exclude>
<exclude>web-docs/**</exclude>
</excludes>
</filter>
<filter>
<!-- Do not copy the signatures in the META-INF folder.
Otherwise, this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<!-- If you want to use ./bin/flink run <quickstart jar> uncomment the following lines.
This will add a Main-Class entry to the manifest file -->
<!--
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
</transformer>
</transformers>
-->
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.1.4</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Eclipse Integration -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<version>2.8</version>
<configuration>
<downloadSources>true</downloadSources>
<projectnatures>
<projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
<projectnature>org.eclipse.jdt.core.javanature</projectnature>
</projectnatures>
<buildcommands>
<buildcommand>