Problem loading JDBC driver

Problem loading JDBC driver

Nicholas Walton
Hi,
I have a pipeline that sinks into an Apache Derby database, but I'm constantly receiving the error
java.lang.IllegalArgumentException: JDBC driver class not found.
The Scala libraries I'm loading are:
val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  "org.apache.flink" %% "flink-table" % "1.7.1",
  "org.apache.flink" % "flink-table_2.11" % "1.7.2",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-table-uber" % flinkVersion,
  "org.apache.flink" %% "flink-jdbc" % flinkVersion,
  "org.apache.derby" % "derby" % "10.15.1.3" % Test
)
The Scala code for the sink is
val sink: JDBCAppendTableSink = JDBCAppendTableSink.builder()
  .setDrivername("org.apache.derby.jdbc.EmbeddedDriver")
  .setDBUrl("jdbc:derby:/Volumes/HD1/nwalton/Databases/mydb")
  .setQuery("INSERT INTO mydb (bearing, sample, value, hash, prevrepeats) VALUES (?,?,?,?,?)")
  .setParameterTypes(INT_TYPE_INFO, LONG_TYPE_INFO, DOUBLE_TYPE_INFO, STRING_TYPE_INFO, INT_TYPE_INFO)
  .build()

tableEnv.registerTableSink(
  "jdbcOutputTable",
  // specify table schema
  Array[String]("mydb"),
  Array[TypeInformation[_]](Types.INT, Types.LONG, Types.DOUBLE, Types.STRING, Types.INT),
  sink)

val table: Table = tableEnv.fromDataStream(signalFourBuckets)
table.insertInto("jdbcOutputTable")
I note that all the examples of Derby usage in Flink I have found are for in-memory databases. Is there anything particular about Derby in that respect? I have checked the jar file (built using sbt assembly) and it appears to include the Derby driver, and I have started the cluster, which runs on a single machine, with CLASSPATH set to include the driver.
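A quick way to confirm the driver class is actually loadable at runtime (a minimal sketch, assuming the assembled jar is on the classpath it is run with; the object name is just for illustration):

object DerbyDriverCheck extends App {
  // Hypothetical standalone check: throws ClassNotFoundException
  // if the Derby embedded driver is not on the classpath.
  Class.forName("org.apache.derby.jdbc.EmbeddedDriver")
  println("org.apache.derby.jdbc.EmbeddedDriver is loadable")
}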

Nick Walton

Re: Problem loading JDBC driver

Caizhi Weng
Hi Nick,

The "Test" after "org.apache.derby" % "derby" % "10.15.1.3" seems suspicious. Is that intended?

On Tue, 26 Nov 2019 at 16:46, Nicholas Walton <[hidden email]> wrote: