Submitting jobs from within Scala code

7 messages

Submitting jobs from within Scala code

Philipp Goetze
Hi community,

In our project we try to submit compiled Flink programs to the JobManager from within Scala code. The test program runs correctly when submitted via the wrapper script "bin/flink run ..." and also via the web client. But when submitted from within our Scala code nothing seems to happen, and the following warning appears in the log:
10:47:18,153 WARN  akka.remote.ReliableDeliverySupervisor                        - Association with remote system [[hidden email]] has failed, address is now gated for [5000] ms. Reason is: [org.apache.flink.runtime.jobgraph.AbstractJobVertex]

Our submit method looks like this:

import java.io.File

import org.apache.flink.client.program.{Client, PackagedProgram, ProgramInvocationException}
import org.apache.flink.configuration.Configuration

def submitJar(master: String, path: String, className: String, args: String*) = {
  val file = new File(path)
  val parallelism = 1
  val wait = true
  try {
    val program = new PackagedProgram(file, className, args: _*)
    // helper of ours that resolves a "host:port" string to an InetSocketAddress
    val jobManagerAddress = getInetFromHostport(master)
    val client = new Client(jobManagerAddress, new Configuration(), program.getUserCodeClassLoader(), 1)
    println("Executing " + path)
    client.run(program, parallelism, wait)
  } catch {
    case e: ProgramInvocationException => e.printStackTrace()
  }
}
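(For readers unfamiliar with the `args: _*` expansion used when constructing the PackagedProgram: it is plain Scala varargs forwarding. A minimal, self-contained illustration, with made-up method names:)

```scala
// Forwarding a varargs parameter: inside a method, a String* parameter is
// seen as a Seq[String], and `: _*` expands it back into individual
// varargs arguments for the next call.
def joinArgs(args: String*): String = args.mkString(" ")

def forward(prefix: String, args: String*): String =
  joinArgs((prefix +: args): _*) // splat the Seq back into varargs
```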

I took this as a reference: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/How-to-submit-flink-jars-from-plain-Java-programs-td656.html

I hope you can help.

Best Regards,
Philipp Götze

Re: Submitting jobs from within Scala code

Till Rohrmann
Hi Philipp,

Could you post the complete log output? It might help us get to the bottom of the problem.

Cheers,
Till

On Thu, Jul 16, 2015 at 11:01 AM, Philipp Goetze <[hidden email]> wrote:


Re: Submitting jobs from within Scala code

Philipp Goetze
Hi Till,

The problem is that this is the only output :( Is it possible to get more verbose log output?

It may be important to note that both Flink and our project are built with Scala 2.11.
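(A Scala binary-version mismatch between client and cluster is a common source of silent remoting failures. As a quick sanity check, the Scala library version actually on the classpath at runtime can be printed with the standard library alone, nothing Flink-specific:)

```scala
// Print the Scala library version loaded at runtime; if the client and the
// Flink cluster report different binary versions (e.g. 2.10 vs 2.11),
// Akka remoting between them can be expected to fail.
val runtimeScalaVersion = scala.util.Properties.versionNumberString
println(s"Running on Scala $runtimeScalaVersion")
```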

Best Regards,
Philipp

On 16.07.2015 11:12, Till Rohrmann wrote:



Re: Submitting jobs from within Scala code

Till Rohrmann

When you run your program from the IDE, you can specify a log4j.properties file where you configure where and what to log. It should be enough to place the log4j.properties file in the resources folder of your project. An example properties file could look like this:

log4j.rootLogger=INFO, testlogger

log4j.appender.testlogger=org.apache.log4j.ConsoleAppender
log4j.appender.testlogger.target = System.err
log4j.appender.testlogger.layout=org.apache.log4j.PatternLayout
log4j.appender.testlogger.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n

Alternatively, you can specify it via a JVM option: -Dlog4j.configuration=<path to properties file>
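(The same JVM option can also be set from code, as a sketch, before any logger is initialized, using the standard log4j 1.x system property — the file path below is a placeholder:)

```scala
// Point log4j 1.x at an explicit configuration file by setting the standard
// "log4j.configuration" system property; this must happen before the first
// logger is created. The path here is a placeholder, not a real file.
sys.props("log4j.configuration") = "file:/path/to/log4j.properties"
```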

Cheers,
Till


On Thu, Jul 16, 2015 at 11:23 AM, Philipp Goetze <[hidden email]> wrote:




Re: Submitting jobs from within Scala code

Stephan Ewen
Could you also look into the JobManager logs? You may be submitting a corrupt JobGraph...

On Thu, Jul 16, 2015 at 11:45 AM, Till Rohrmann <[hidden email]> wrote:




Re: Submitting jobs from within Scala code

Philipp Goetze
Hey Till,

here is the console output, now with log4j:

0    [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Starting program in interactive mode
121  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.scala.ClosureCleaner$  - accessedFields: Map()
137  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.scala.ClosureCleaner$  - accessedFields: Map()
183  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.api.java.ExecutionEnvironment  - The job has 0 registered types and 0 default Kryo serializers
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo with Serializers types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo with Serializer Classes types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo default Serializers:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo default Serializers Classes
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered POJO types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Static code analysis mode: DISABLE
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.client.program.Client  - Set parallelism 1, plan default parallelism 1
198  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.optimizer.Optimizer  - Beginning compilation of program 'Starting Query'
198  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.optimizer.Optimizer  - Using a default parallelism of 1
198  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.optimizer.Optimizer  - Using default data exchange mode PIPELINED
266  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.common.io.FileInputFormat  - Opening input split file:/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/src/it/resources/file.txt [0,32]
269  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.common.io.FileInputFormat  - Opening input split file:/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/src/it/resources/file.txt [16,16]
412  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - JobManager actor system address is localhost/127.0.0.1:6123
412  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Starting client actor system
415  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.runtime.client.JobClient  - Starting JobClient actor system
922  [flink-akka.actor.default-dispatcher-2] INFO  akka.event.slf4j.Slf4jLogger  - Slf4jLogger started
953  [flink-akka.actor.default-dispatcher-2] DEBUG akka.event.EventStream  - logger log1-Slf4jLogger started
954  [flink-akka.actor.default-dispatcher-2] DEBUG akka.event.EventStream  - Default Loggers started
1044 [flink-akka.actor.default-dispatcher-4] INFO  Remoting  - Starting remoting
1117 [flink-akka.remote.default-remote-dispatcher-6] DEBUG org.jboss.netty.channel.socket.nio.SelectorUtil  - Using select timeout of 500
1118 [flink-akka.remote.default-remote-dispatcher-6] DEBUG org.jboss.netty.channel.socket.nio.SelectorUtil  - Epoll-bug workaround enabled = false
1325 [flink-akka.actor.default-dispatcher-2] INFO  Remoting  - Remoting started; listening on addresses :[[hidden email]]
1343 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.runtime.client.JobClient  - Started JobClient actor system at 127.0.0.1:58455
1343 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Looking up JobManager
1542 [flink-akka.actor.default-dispatcher-2] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [akka.actor.Identify]
1567 [flink-akka.actor.default-dispatcher-2] DEBUG akka.remote.EndpointWriter  - Drained buffer with maxWriteCount: 50, fullBackoffCount: 1, smallBackoffCount: 0, noBackoffCount: 0 , adaptiveBackoff: 1000
1599 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - JobManager runs at [hidden email]
1600 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Communication between client and JobManager will have a timeout of 100000 milliseconds
1600 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Checking and uploading JAR files
1606 [flink-akka.actor.default-dispatcher-2] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [org.apache.flink.runtime.messages.JobManagerMessages$RequestBlobManagerPort$]
1638 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.runtime.blob.BlobClient  - PUT content addressable BLOB stream to /127.0.0.1:42947
1660 [flink-akka.actor.default-dispatcher-4] INFO  org.apache.flink.runtime.client.JobClient  - Sending message to JobManager [hidden email] to submit job Starting Query (227a3733c283899991ba8a5237a0f2a8) and wait for progress
1667 [flink-akka.actor.default-dispatcher-2] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [org.apache.flink.runtime.messages.JobManagerMessages$SubmitJob]
1712 [flink-akka.actor.default-dispatcher-4] DEBUG akka.remote.RemoteWatcher  - Watching: [akka://flink/user/$a -> [hidden email]]
1781 [flink-akka.actor.default-dispatcher-4] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [akka.dispatch.sysmsg.Watch]
1819 [flink-akka.actor.default-dispatcher-4] DEBUG org.apache.flink.runtime.client.JobClient  - Received failure from JobManager
org.apache.flink.runtime.client.JobSubmissionException: The vertex null (null) has no invokable class.
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:511)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:507)
    at scala.collection.Iterator$class.foreach(Iterator.scala:743)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:507)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:190)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:43)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:29)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:29)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:92)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
    at akka.dispatch.Mailbox.run(Mailbox.scala:221)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
1858 [flink-akka.actor.default-dispatcher-3] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [akka.dispatch.sysmsg.Unwatch]
1862 [flink-akka.actor.default-dispatcher-3] DEBUG akka.remote.RemoteWatcher  - Unwatching: [akka://flink/user/$a -> [hidden email]]
1863 [flink-akka.actor.default-dispatcher-3] DEBUG akka.remote.RemoteWatcher  - Cleanup self watch of [[hidden email]]
1879 [flink-akka.actor.default-dispatcher-3] DEBUG akka.remote.RemoteWatcher  - Unwatched last watchee of node: [[hidden email]]
1932 [flink-akka.actor.default-dispatcher-2] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Shutting down remote daemon.
1935 [flink-akka.actor.default-dispatcher-2] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remote daemon shut down; proceeding with flushing remote transports.
2037 [flink-akka.actor.default-dispatcher-4] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remoting shut down.
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: The vertex null (null) has no invokable class.
    at org.apache.flink.client.program.Client.run(Client.java:412)
    at org.apache.flink.client.program.Client.run(Client.java:355)
    at org.apache.flink.client.program.Client.run(Client.java:348)
    at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:63)
    at org.apache.flink.api.scala.ExecutionEnvironment.execute(ExecutionEnvironment.scala:590)
    at load$.main(/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/load/load.scala:20)
    at load.main(/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/load/load.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:437)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:353)
    at org.apache.flink.client.program.Client.run(Client.java:315)
    at dbis.pig.tools.FlinkRun.submitJar(FlinkRun.scala:62)
    at dbis.pig.tools.FlinkRun.execute(FlinkRun.scala:37)
    at dbis.pig.PigCompiler$.run(PigCompiler.scala:106)
    at dbis.pig.PigCompiler$.main(PigCompiler.scala:69)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(FlinkCompileIt.scala:62)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(FlinkCompileIt.scala:53)
    at org.scalatest.prop.TableFor4$$anonfun$apply$17.apply(TableFor1.scala:797)
    at org.scalatest.prop.TableFor4$$anonfun$apply$17.apply(TableFor1.scala:795)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:778)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:777)
    at org.scalatest.prop.TableFor4.apply(TableFor1.scala:795)
    at org.scalatest.prop.TableDrivenPropertyChecks$class.forAll(TableDrivenPropertyChecks.scala:418)
    at org.scalatest.prop.TableDrivenPropertyChecks$.forAll(TableDrivenPropertyChecks.scala:665)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1.apply$mcV$sp(FlinkCompileIt.scala:53)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1.apply(FlinkCompileIt.scala:53)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1.apply(FlinkCompileIt.scala:53)
    at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
    at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    at org.scalatest.Transformer.apply(Transformer.scala:22)
    at org.scalatest.Transformer.apply(Transformer.scala:20)
    at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
    at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
    at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
    at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
    at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
    at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
    at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
    at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
    at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
    at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
    at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
    at org.scalatest.Suite$class.run(Suite.scala:1424)
    at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
    at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
    at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
    at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
    at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
    at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
    at sbt.TestRunner.runTest$1(TestFramework.scala:76)
    at sbt.TestRunner.run(TestFramework.scala:85)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
    at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
    at sbt.TestFunction.apply(TestFramework.scala:207)
    at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:239)
    at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
    at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
    at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
    at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: The vertex null (null) has no invokable class.
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:511)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:507)
    at scala.collection.Iterator$class.foreach(Iterator.scala:743)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:507)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:190)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:43)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:29)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:29)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:92)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
    at akka.dispatch.Mailbox.run(Mailbox.scala:221)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Best Regards,
Philipp







Re: Submitting jobs from within Scala code

Till Rohrmann

Hi Philipp,

it seems that Stephan was right and that your JobGraph is somehow corrupted. The JobSubmissionException shows that the JobGraph contains a vertex whose invokable class name is null; even the vertex ID and name are null, which is a strong indicator that the JobGraph is not correct.
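
(The check that fails on the JobManager side can be sketched roughly as below. This is an illustrative stand-in only: the real vertex type is org.apache.flink.runtime.jobgraph.AbstractJobVertex, and the field names and the class name "org.example.SourceTask" here are made up for the example.)

```scala
// Illustrative stand-in for Flink's submission-time sanity check; the real
// vertex class is org.apache.flink.runtime.jobgraph.AbstractJobVertex.
case class Vertex(id: String, name: String, invokableClassName: String)

// Collect every vertex that could not run on a TaskManager because no
// invokable class is attached to it.
def invalidVertices(graph: Seq[Vertex]): Seq[Vertex] =
  graph.filter(_.invokableClassName == null)

val graph = Seq(
  Vertex("v1", "DataSource", "org.example.SourceTask"), // hypothetical name
  Vertex(null, null, null)                              // corrupted vertex
)
// The message "The vertex null (null) has no invokable class" matches the
// second vertex exactly: its name and ID are null as well.
println(invalidVertices(graph).size) // 1
```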

Can you also post the log of the JobManager? Do you have the code of your job online?

Cheers,
Till


On Thu, Jul 16, 2015 at 1:20 PM, Philipp Goetze <[hidden email]> wrote:
Hey Till,

here is the console output, now with log4j enabled:

0    [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Starting program in interactive mode
121  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.scala.ClosureCleaner$  - accessedFields: Map()
137  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.scala.ClosureCleaner$  - accessedFields: Map()
183  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.api.java.ExecutionEnvironment  - The job has 0 registered types and 0 default Kryo serializers
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo with Serializers types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo with Serializer Classes types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo default Serializers:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered Kryo default Serializers Classes
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Registered POJO types:
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.java.ExecutionEnvironment  - Static code analysis mode: DISABLE
188  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.client.program.Client  - Set parallelism 1, plan default parallelism 1
198  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.optimizer.Optimizer  - Beginning compilation of program 'Starting Query'
198  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.optimizer.Optimizer  - Using a default parallelism of 1
198  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.optimizer.Optimizer  - Using default data exchange mode PIPELINED
266  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.common.io.FileInputFormat  - Opening input split file:/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/src/it/resources/file.txt [0,32]
269  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.common.io.FileInputFormat  - Opening input split file:/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/src/it/resources/file.txt [16,16]
412  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - JobManager actor system address is localhost/127.0.0.1:6123
412  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Starting client actor system
415  [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.runtime.client.JobClient  - Starting JobClient actor system
922  [flink-akka.actor.default-dispatcher-2] INFO  akka.event.slf4j.Slf4jLogger  - Slf4jLogger started
953  [flink-akka.actor.default-dispatcher-2] DEBUG akka.event.EventStream  - logger log1-Slf4jLogger started
954  [flink-akka.actor.default-dispatcher-2] DEBUG akka.event.EventStream  - Default Loggers started
1044 [flink-akka.actor.default-dispatcher-4] INFO  Remoting  - Starting remoting
1117 [flink-akka.remote.default-remote-dispatcher-6] DEBUG org.jboss.netty.channel.socket.nio.SelectorUtil  - Using select timeout of 500
1118 [flink-akka.remote.default-remote-dispatcher-6] DEBUG org.jboss.netty.channel.socket.nio.SelectorUtil  - Epoll-bug workaround enabled = false
1325 [flink-akka.actor.default-dispatcher-2] INFO  Remoting  - Remoting started; listening on addresses :[[hidden email]]
1343 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.runtime.client.JobClient  - Started JobClient actor system at 127.0.0.1:58455
1343 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Looking up JobManager
1542 [flink-akka.actor.default-dispatcher-2] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [akka.actor.Identify]
1567 [flink-akka.actor.default-dispatcher-2] DEBUG akka.remote.EndpointWriter  - Drained buffer with maxWriteCount: 50, fullBackoffCount: 1, smallBackoffCount: 0, noBackoffCount: 0 , adaptiveBackoff: 1000
1599 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - JobManager runs at [hidden email]
1600 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Communication between client and JobManager will have a timeout of 100000 milliseconds
1600 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client  - Checking and uploading JAR files
1606 [flink-akka.actor.default-dispatcher-2] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [org.apache.flink.runtime.messages.JobManagerMessages$RequestBlobManagerPort$]
1638 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.runtime.blob.BlobClient  - PUT content addressable BLOB stream to /127.0.0.1:42947
1660 [flink-akka.actor.default-dispatcher-4] INFO  org.apache.flink.runtime.client.JobClient  - Sending message to JobManager [hidden email] to submit job Starting Query (227a3733c283899991ba8a5237a0f2a8) and wait for progress
1667 [flink-akka.actor.default-dispatcher-2] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [org.apache.flink.runtime.messages.JobManagerMessages$SubmitJob]
1712 [flink-akka.actor.default-dispatcher-4] DEBUG akka.remote.RemoteWatcher  - Watching: [akka://flink/user/$a -> [hidden email]]
1781 [flink-akka.actor.default-dispatcher-4] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [akka.dispatch.sysmsg.Watch]
1819 [flink-akka.actor.default-dispatcher-4] DEBUG org.apache.flink.runtime.client.JobClient  - Received failure from JobManager
org.apache.flink.runtime.client.JobSubmissionException: The vertex null (null) has no invokable class.
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:511)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:507)
    at scala.collection.Iterator$class.foreach(Iterator.scala:743)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:507)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:190)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:43)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:29)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:29)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:92)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
    at akka.dispatch.Mailbox.run(Mailbox.scala:221)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
1858 [flink-akka.actor.default-dispatcher-3] DEBUG akka.serialization.Serialization(akka://flink)  - Using serializer[akka.serialization.JavaSerializer] for message [akka.dispatch.sysmsg.Unwatch]
1862 [flink-akka.actor.default-dispatcher-3] DEBUG akka.remote.RemoteWatcher  - Unwatching: [akka://flink/user/$a -> [hidden email]]
1863 [flink-akka.actor.default-dispatcher-3] DEBUG akka.remote.RemoteWatcher  - Cleanup self watch of [[hidden email]]
1879 [flink-akka.actor.default-dispatcher-3] DEBUG akka.remote.RemoteWatcher  - Unwatched last watchee of node: [[hidden email]]
1932 [flink-akka.actor.default-dispatcher-2] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Shutting down remote daemon.
1935 [flink-akka.actor.default-dispatcher-2] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remote daemon shut down; proceeding with flushing remote transports.
2037 [flink-akka.actor.default-dispatcher-4] INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator  - Remoting shut down.
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: The vertex null (null) has no invokable class.
    at org.apache.flink.client.program.Client.run(Client.java:412)
    at org.apache.flink.client.program.Client.run(Client.java:355)
    at org.apache.flink.client.program.Client.run(Client.java:348)
    at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:63)
    at org.apache.flink.api.scala.ExecutionEnvironment.execute(ExecutionEnvironment.scala:590)
    at load$.main(/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/load/load.scala:20)
    at load.main(/home/blaze/Documents/TU_Ilmenau/Masterthesis/projects/pigspark/load/load.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:437)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:353)
    at org.apache.flink.client.program.Client.run(Client.java:315)
    at dbis.pig.tools.FlinkRun.submitJar(FlinkRun.scala:62)
    at dbis.pig.tools.FlinkRun.execute(FlinkRun.scala:37)
    at dbis.pig.PigCompiler$.run(PigCompiler.scala:106)
    at dbis.pig.PigCompiler$.main(PigCompiler.scala:69)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(FlinkCompileIt.scala:62)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(FlinkCompileIt.scala:53)
    at org.scalatest.prop.TableFor4$$anonfun$apply$17.apply(TableFor1.scala:797)
    at org.scalatest.prop.TableFor4$$anonfun$apply$17.apply(TableFor1.scala:795)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:778)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:777)
    at org.scalatest.prop.TableFor4.apply(TableFor1.scala:795)
    at org.scalatest.prop.TableDrivenPropertyChecks$class.forAll(TableDrivenPropertyChecks.scala:418)
    at org.scalatest.prop.TableDrivenPropertyChecks$.forAll(TableDrivenPropertyChecks.scala:665)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1.apply$mcV$sp(FlinkCompileIt.scala:53)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1.apply(FlinkCompileIt.scala:53)
    at dbis.test.flink.FlinkCompileIt$$anonfun$1.apply(FlinkCompileIt.scala:53)
    at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
    at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    at org.scalatest.Transformer.apply(Transformer.scala:22)
    at org.scalatest.Transformer.apply(Transformer.scala:20)
    at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
    at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
    at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
    at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
    at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
    at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
    at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
    at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
    at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
    at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
    at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
    at org.scalatest.Suite$class.run(Suite.scala:1424)
    at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
    at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
    at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
    at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
    at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
    at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
    at sbt.TestRunner.runTest$1(TestFramework.scala:76)
    at sbt.TestRunner.run(TestFramework.scala:85)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
    at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
    at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
    at sbt.TestFunction.apply(TestFramework.scala:207)
    at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:239)
    at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
    at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
    at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
    at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: The vertex null (null) has no invokable class.
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:511)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:507)
    at scala.collection.Iterator$class.foreach(Iterator.scala:743)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:507)
    at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:190)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:43)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:29)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
    at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:29)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:92)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
    at akka.dispatch.Mailbox.run(Mailbox.scala:221)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Best Regards,
Philipp


