Hi all,
I want to run the Dataflow WordCount example on a Flink cluster. The local execution with "mvn exec:exec -Dinput=kinglear.txt -Doutput=wordcounts.txt" is already working. What is the command to execute it on the cluster?

Best regards,
Lydia
It's described here: https://ci.apache.org/projects/flink/flink-docs-release-0.9/quickstart/setup_quickstart.html#run-example

Cheers,
Till
Hi Lydia,
Till already pointed you to the documentation. If you want to run the WordCount example, you can do so by executing the following command:

./bin/flink run -c com.dataartisans.flink.dataflow.examples.WordCount /path/to/dataflow.jar --input /path/to/input --output /path/to/output

If you want, you can try this command on a local cluster first. You can start a local cluster by using ./bin/start-local.sh.

Best,
Max
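For reference, the end-to-end sequence against a local cluster would look roughly like this (the jar location and the input/output paths are placeholders to adapt):

    # start a local Flink cluster, run the job, then shut the cluster down again
    ./bin/start-local.sh
    ./bin/flink run -c com.dataartisans.flink.dataflow.examples.WordCount \
        /path/to/dataflow.jar --input /path/to/kinglear.txt --output /path/to/wordcounts.txt
    ./bin/stop-local.sh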
In reply to this post by Till Rohrmann
Hi Till,
I want to execute your matrix completion program "ALSJoin". Locally it works perfectly. Now I want to execute it on the cluster with:

run -c com.github.projectflink.als.ALSJoin -cp /tmp/icklerly/flink-jobs-0.1-SNAPSHOT.jar 0 2 0.001 10 1 1

but I get the following error:

java.lang.NoSuchMethodError: org.apache.flink.api.scala.typeutils.CaseClassTypeInfo.<init>(Ljava/lang/Class;Lscala/collection/Seq;Lscala/collection/Seq;)V

I guess something like the flink-scala-0.10-SNAPSHOT.jar is missing. How can I add that to the path?

Best regards,
Lydia
I think there is a version mismatch between the Flink version you've used to compile your job and the Flink version installed on the cluster. Maven automagically pulls newer 0.10-SNAPSHOT versions every time you're building your job.
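For reference, one way to see which versions are actually involved is roughly the following (the installation path is a placeholder):

    # version the job was compiled against
    mvn dependency:tree -Dincludes=org.apache.flink
    # version installed on the cluster (also printed in the JobManager log at startup)
    ls <flink-install-dir>/lib/flink-dist*.jar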
Hi,
but inside the POM of the flink-job project the Flink version is set to 0.8:

<flink.version>0.8-incubating-SNAPSHOT</flink.version>

How can I change it to the newest one? Setting

<flink.version>0.10-SNAPSHOT</flink.version>

is not working.
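For reference, a job POM built against the snapshot version would typically contain something like the following (a sketch; the Apache snapshot repository entry is needed so Maven can resolve 0.10-SNAPSHOT artifacts):

    <properties>
      <flink.version>0.10-SNAPSHOT</flink.version>
    </properties>

    <repositories>
      <repository>
        <id>apache.snapshots</id>
        <name>Apache Development Snapshot Repository</name>
        <url>https://repository.apache.org/content/repositories/snapshots/</url>
        <releases><enabled>false</enabled></releases>
        <snapshots><enabled>true</enabled></snapshots>
      </repository>
    </repositories>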
Are you relying on a feature only available in 0.10-SNAPSHOT? Otherwise, I would recommend using the latest stable release (0.9.1) for your Flink job and on the cluster.
It is a university cluster which we have to use.
So I am forced to use it :( How can I bypass that version conflict?
In reply to this post by rmetzger0
@Lydia Did you create the POM files for your job with a 0.8.x quickstart? Can you try to simply re-create your project's POM files with a new quickstart? I think that the POMs between 0.8-incubating-SNAPSHOT and 0.10-SNAPSHOT may not be quite compatible any more...
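For reference, a fresh quickstart project can be generated with the Maven archetype, roughly like this (use flink-quickstart-scala for a Scala job; the version shown is illustrative and should match the cluster):

    mvn archetype:generate \
      -DarchetypeGroupId=org.apache.flink \
      -DarchetypeArtifactId=flink-quickstart-java \
      -DarchetypeVersion=0.9.1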
In reply to this post by Stephan Ewen
Hi,
I did not create anything myself. I just downloaded the files from here: https://github.com/tillrohrmann/flink-perf and then executed mvn clean install -DskipTests. Then I opened the project within IntelliJ, and there it works fine. Then I exported it to the cluster, which runs 0.10-SNAPSHOT.
I had problems running a Flink job with Maven; probably there is some issue with classloading. What worked for me was to run a simple java command with the uberjar. So I build the jar using Maven and then run it this way:

java -Xmx2g -cp target/youruberjar.jar yourclass arg1 arg2

Hope it helps,
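For reference, such an uber jar is usually produced by adding the Maven shade plugin to the job's POM, roughly like this (a sketch; the plugin version is illustrative):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>

After mvn clean package, the shaded jar in target/ can then be run with a plain java -cp command as above.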
Lydia, can you check the log of the Flink installed on the cluster? During startup, it writes the exact commit your 0.10-SNAPSHOT is based on. I would recommend checking out exactly that commit locally and then building Flink locally. After that, you can rebuild your job's jar again. With that method, there is certainly no version mismatch. (That's one of the reasons why using SNAPSHOT versions on such environments is not recommended.)
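For reference, that workflow looks roughly like this (the log file name and the commit id are placeholders):

    # the JobManager log prints the git commit of the running 0.10-SNAPSHOT at startup
    grep -i commit log/flink-*-jobmanager-*.log
    # build exactly that commit locally and install it into the local Maven repository
    git clone https://github.com/apache/flink.git && cd flink
    git checkout <commit-id-from-the-log>
    mvn clean install -DskipTests
    # afterwards, rebuild the job jar against the locally installed 0.10-SNAPSHOT artifacts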
Hi Lydia,

I think the APIs of the versions 0.8-incubating-SNAPSHOT and 0.10-SNAPSHOT are not compatible. Thus, it's not simply a matter of setting the dependencies to 0.10-SNAPSHOT; the code may also have to be adapted to the newer API.

Cheers,
Till
Thanks, Till!
I used the ALS from FlinkML and it works :)

Best regards,
Lydia