On 2 Jul 2018, at 02:18, Mich Talebzadeh <[hidden email]> wrote:

Hi,

I am still not there.

This is the simple Scala program that I created a jar file for using mvn:

import java.util.Properties
import java.util.Arrays
import org.apache.flink.api.common.functions.MapFunction
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.datastream.DataStream
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
import org.apache.flink.streaming.util.serialization.DeserializationSchema
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object md_streaming
{
def main(args: Array[String]) {
val env = StreamExecutionEnvironment.getExecutionEnvironment
val properties = new Properties()
//val kafkaParams = Map[String, String]("bootstrap.servers" -> "rhes75:9092", "schema.registry.url" -> "http://rhes75:8081", "zookeeper.connect" -> "rhes75:2181", "group.id" -> flinkAppName )
properties.setProperty("bootstrap.servers", "rhes75:9092")
properties.setProperty("zookeeper.connect", "rhes75:2181")
properties.setProperty("group.id", "md_streaming")
val stream = env
.addSource(new FlinkKafkaConsumer09[String]("md", new SimpleStringSchema(), properties))
.writeAsText("/tmp/md_streaming.txt")
env.execute("Flink Kafka Example")
}
}
Compiles OK and creates this jar file:

jar tvf /home/hduser/dba/bin/flink/md_streaming/target/md_streaming-1.5.0.jar
157 Sun Jul 01 18:57:46 BST 2018 META-INF/MANIFEST.MF
0 Sun Jul 01 18:57:46 BST 2018 META-INF/
0 Sun Jul 01 18:57:46 BST 2018 META-INF/maven/
0 Sun Jul 01 18:57:46 BST 2018 META-INF/maven/flink/
0 Sun Jul 01 18:57:46 BST 2018 META-INF/maven/flink/md_streaming/
7641 Sun Jul 01 18:57:34 BST 2018 META-INF/maven/flink/md_streaming/pom.xml
102 Sun Jul 01 18:08:54 BST 2018 META-INF/maven/flink/md_streaming/pom.properties

And I try to run it as below:

bin/flink run /home/hduser/dba/bin/flink/md_streaming/target/md_streaming-1.5.0.jar

------------------------------------------------------------
The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The program's entry point class 'md_streaming' was not found in the jar file.
at org.apache.flink.client.program.PackagedProgram.loadMainClass(PackagedProgram.java:616)
at org.apache.flink.client.program.PackagedProgram.<init>(PackagedProgram.java:199)
at org.apache.flink.client.program.PackagedProgram.<init>(PackagedProgram.java:128)
at org.apache.flink.client.cli.CliFrontend.buildProgram(CliFrontend.java:833)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:201)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.ClassNotFoundException: md_streaming
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.client.program.PackagedProgram.loadMainClass(PackagedProgram.java:613)

Any ideas?

Thanks,

Dr Mich Talebzadeh
Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.
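[Editor's note: the `jar tvf` listing above contains no `.class` entries at all, only META-INF metadata, so Flink has no main class to find. Two things need to be in place: the Scala sources must actually be compiled into the jar, and the fat jar's manifest must name the entry class. A sketch of the latter, assuming the object `md_streaming` sits in the default package (adjust if it lives in a package such as `myPackage`):]

```xml
<!-- Sketch only: have the shade plugin write a Main-Class attribute
     into the fat jar's manifest. Assumes md_streaming is in the
     default package; prefix it with its package name otherwise. -->
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
  <mainClass>md_streaming</mainClass>
</transformer>
```

[Alternatively the entry class can be passed on the command line, e.g. `bin/flink run -c md_streaming <jar>`, but that still requires the class file to be present in the jar, which the listing above shows it is not.]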
On Sun, 1 Jul 2018 at 18:35, Mich Talebzadeh <[hidden email]> wrote:

Apologies, Jörn.

Dr Mich Talebzadeh
On Sun, 1 Jul 2018 at 17:51, Mich Talebzadeh <[hidden email]> wrote:

Thanks Franke. That did the trick!

[INFO] Excluding org.slf4j:slf4j-api:jar:1.7.7 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.7.7 from the shaded jar.
[INFO] Excluding log4j:log4j:jar:1.2.17 from the shaded jar.
[DEBUG] Processing JAR /home/hduser/dba/bin/flink/md_streaming/target/maven-compiler-plugin-1.5.0.jar
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /home/hduser/dba/bin/flink/md_streaming/target/maven-compiler-plugin-1.5.0.jar with /home/hduser/dba/bin/flink/md_streaming/target/maven-compiler-plugin-1.5.0-shaded.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.868 s
[INFO] Finished at: 2018-07-01T17:35:33+01:00
[INFO] Final Memory: 19M/736M
[INFO] ------------------------------------------------------------------------
Completed compiling.

Regards,

Dr Mich Talebzadeh
On Sun, 1 Jul 2018 at 17:26, Jörn Franke <[hidden email]> wrote:

Shouldn’t it be 1.5.0 instead of 1.5?

Ok, some clumsy work by me not creating the pom.xml file in the flink sub-directory (it was putting it in spark)! Anyhow, this is the current issue I am facing:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.857 s
[INFO] Finished at: 2018-07-01T17:07:10+01:00
[INFO] Final Memory: 22M/962M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project maven-compiler-plugin: Could not resolve dependencies for project flink:maven-compiler-plugin:jar:1.5: The following artifacts could not be resolved: org.apache.flink:flink-java:jar:1.5, org.apache.flink:flink-streaming-java_2.11:jar:1.5: Could not find artifact org.apache.flink:flink-java:jar:1.5 in central (https://repo.maven.apache.org/maven2) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project maven-compiler-plugin: Could not resolve dependencies for project flink:maven-compiler-plugin:jar:1.5: The following artifacts could not be resolved: org.apache.flink:flink-java:jar:1.5, org.apache.flink:flink-streaming-java_2.11:jar:1.5: Could not find artifact org.apache.flink:flink-java:jar:1.5 in central (https://repo.maven.apache.org/maven2)

FYI, I removed the ~/.m2 directory to get rid of anything cached.

Thanks,

Dr Mich Talebzadeh
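[Editor's note: the resolution failure above is the `1.5` vs `1.5.0` issue Jörn raises earlier in the thread — Flink releases are published to Maven Central under three-part versions, so `org.apache.flink:flink-java:jar:1.5` does not exist. A corrected dependency sketch:]

```xml
<!-- Sketch: use the full three-part release version so the
     artifacts resolve from Maven Central. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-java</artifactId>
  <version>1.5.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>1.5.0</version>
  <scope>provided</scope>
</dependency>
```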
On Sun, 1 Jul 2018 at 15:07, zhangminglei <[hidden email]> wrote:

Hi, Mich

[WARNING] Expected all dependencies to require Scala version: 2.10.4
[WARNING] spark:scala:1.0 requires scala version: 2.11.7
[WARNING] Multiple versions of scala libraries detected!

I think you should make your Scala version 2.11 first, and try again.

Cheers,
Minglei

On 1 Jul 2018, at 21:24, Mich Talebzadeh <[hidden email]> wrote:

Hi Minglei,

Many thanks. My flink version is 1.5. This is the pom.xml from GitHub as suggested:

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>1.5</version>
<packaging>jar</packaging>

<name>Flink Quickstart Job</name>
<url>http://www.myorganization.org</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>@project.version@</flink.version>
<java.version>1.8</java.version>
<scala.binary.version>2.11</scala.binary.version>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>

<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>

<dependencies>
<!-- Apache Flink dependencies -->
<!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.5</version>
<scope>provided</scope>
</dependency>

<!-- Add connector dependencies here. They must be in the default scope (compile). -->

<!-- Example:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10_2.11</artifactId>
<version>1.5</version>
</dependency>
-->

<!-- Add logging framework, to produce console output when running in the IDE. -->
<!-- These dependencies are excluded from the application JAR by default. -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.7</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
<scope>runtime</scope>
</dependency>
</dependencies>

<build>
<plugins>

<!-- Java Compiler -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>

<!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
<!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.0.0</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.flink:force-shading</exclude>
<exclude>com.google.code.findbugs:jsr305</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>log4j:*</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<!-- Do not copy the signatures in the META-INF folder.
Otherwise, this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>${package}.StreamingJob</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>

<pluginManagement>
<plugins>

<!-- If you want to use Java 8 Lambda Expressions uncomment the following lines -->
<!--
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
<compilerId>jdt</compilerId>
</configuration>
<dependencies>
<dependency>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-compiler-jdt</artifactId>
<version>0.21.0</version>
</dependency>
</dependencies>
</plugin>
-->

<!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<versionRange>[3.0.0,)</versionRange>
<goals>
<goal>shade</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<versionRange>[3.1,)</versionRange>
<goals>
<goal>testCompile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>

<!-- This profile helps to make things run out of the box in IntelliJ -->
<!-- Its adds Flink's core classes to the runtime class path. -->
<!-- Otherwise they are missing in IntelliJ, because the dependency is 'provided' -->
<profiles>
<profile>
<id>add-dependencies-for-IDEA</id>

<activation>
<property>
<name>idea.version</name>
</property>
</activation>

<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.5</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.5</version>
<scope>compile</scope>
</dependency>
</dependencies>
</profile>
</profiles>
</project>

But I am still getting the same errors:

[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building scala 1.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ scala ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/hduser/dba/bin/flink/md_streaming/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ scala ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-scala-plugin:2.15.2:compile (default) @ scala ---
[INFO] Checking for multiple versions of scala
[WARNING] Expected all dependencies to require Scala version: 2.10.4
[WARNING] spark:scala:1.0 requires scala version: 2.11.7
[WARNING] Multiple versions of scala libraries detected!
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[INFO] /home/hduser/dba/bin/flink/md_streaming/src/main/scala:-1: info: compiling
[INFO] Compiling 1 source files to /home/hduser/dba/bin/flink/md_streaming/target/classes at 1530451461171
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:3: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.api.common.functions.MapFunction
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:4: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.api.java.utils.ParameterTool
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:5: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.api.datastream.DataStream
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:6: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:7: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:8: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:9: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.util.serialization.DeserializationSchema
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:10: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:18: error: not found: value StreamExecutionEnvironment
[INFO] val env = StreamExecutionEnvironment.getExecutionEnvironment
[INFO] ^
[ERROR] 9 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE

Dr Mich Talebzadeh
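[Editor's note: the Scala warnings in the log above ("Expected all dependencies to require Scala version: 2.10.4" vs "spark:scala:1.0 requires scala version: 2.11.7") come from the maven-scala-plugin detecting two Scala lines in one build. They typically go away once the pom declares a single Scala line and every `_2.xx`-suffixed artifact matches it. A hedged sketch; the patch version shown is illustrative, not from the thread:]

```xml
<!-- Sketch: pin one Scala line and keep every _2.11-suffixed
     artifact consistent with it. Patch version is illustrative. -->
<properties>
  <scala.version>2.11.7</scala.version>
  <scala.binary.version>2.11</scala.binary.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
</dependencies>
```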
On Sun, 1 Jul 2018 at 14:07, zhangminglei <[hidden email]> wrote:

Hi, Mich.

Is there a basic MVN pom file for flink? The default one from GitHub does not seem to be working!

Please take a look at https://github.com/apache/flink/blob/master/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml

Cheers,
Minglei

On 1 Jul 2018, at 19:44, Mich Talebzadeh <[hidden email]> wrote:

I have done this many times with sbt or maven for Spark streaming. Trying to compile a simple program that compiles OK in flink-scala.sh. The imports are as follows:

import java.util.Properties
import java.util.Arrays
import org.apache.flink.api.common.functions.MapFunction
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.datastream.DataStream
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
import org.apache.flink.streaming.util.serialization.DeserializationSchema
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

With the following jars added to the classpath, it compiles:

flink-connector-kafka-0.9_2.11-1.5.0.jar
flink-connector-kafka-base_2.11-1.5.0.jar

I guess my pom.xml is incorrect. I have added these two dependencies to the pom.xml file:

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.4.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.5.0</version>
</dependency>

However, I am getting these basic errors!

[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:4: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.api.java.utils.ParameterTool
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:5: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.api.datastream.DataStream
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:6: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:7: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:8: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:9: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.util.serialization.DeserializationSchema
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:10: error: object flink is not a member of package org.apache
[INFO] import org.apache.flink.streaming.util.serialization.SimpleStringSchema
[INFO] ^
[ERROR] /home/hduser/dba/bin/flink/md_streaming/src/main/scala/myPackage/md_streaming.scala:18: error: not found: value StreamExecutionEnvironment
[INFO] val env = StreamExecutionEnvironment.getExecutionEnvironment

Is there a basic MVN pom file for flink? The default one from GitHub does not seem to be working!

Thanks,

Dr Mich Talebzadeh
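[Editor's note: since the program uses FlinkKafkaConsumer09 and compiled only after the two `flink-connector-kafka` jars were added to the classpath by hand, the missing piece in this pom is most likely the Kafka connector dependency, and the mixed versions (1.4.2 and 1.5.0) should be aligned to one release. A sketch, assuming a Flink 1.5.0 cluster:]

```xml
<!-- Sketch: add the 0.9 Kafka connector (matching the jars added to the
     classpath by hand) and keep all Flink artifacts on one release. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>1.5.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.9_2.11</artifactId>
  <version>1.5.0</version>
</dependency>
```

[The connector stays in the default (compile) scope so the shade plugin packages it into the fat jar.]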