I got this error in Gelly, which I believe comes from Flink:
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply$mcV$sp(JobManager.scala:822)
	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.IllegalArgumentException: Too few memory segments provided. Hash Table needs at least 33 memory segments.
	at org.apache.flink.runtime.operators.hash.CompactingHashTable.<init>(CompactingHashTable.java:206)
	at org.apache.flink.runtime.operators.hash.CompactingHashTable.<init>(CompactingHashTable.java:191)
	at org.apache.flink.runtime.iterative.task.IterationHeadTask.initCompactingHashTable(IterationHeadTask.java:175)
	at org.apache.flink.runtime.iterative.task.IterationHeadTask.run(IterationHeadTask.java:272)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:351)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
	at java.lang.Thread.run(Thread.java:745)

I found a related topic: http://mail-archives.apache.org/mod_mbox/flink-dev/201503.mbox/%3CCAK5ODX4KJ9TB4yJ=BcNwsozbOoXwdB7HM9qvWoa1P9HK-Gb-Dg@mail.gmail.com%3E but I don't think the problem is the same. The code is as follows:

	ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

	DataSource twitterEdges = env.readCsvFile("./datasets/out.munmun_twitter_social")
		.fieldDelimiter(" ")
		.ignoreComments("%")
		.types(Long.class, Long.class);

	Graph graph = Graph.fromTuple2DataSet(twitterEdges, new testinggraph.InitVertices(), env);
	DataSet verticesWithCommunity = (DataSet) graph.run(new LabelPropagation(1));
	System.out.println(verticesWithCommunity.count());

And it has only a couple of edges. I tried adding a config file to the project to set a couple of the settings described here: https://ci.apache.org/projects/flink/flink-docs-release-0.8/config.html but that didn't work either. I have no idea how to fix this at the moment. It's not just LabelPropagation that fails: every Gelly method that uses an iteration gives this exact error.
By default Flink allocates only 2048 network buffers (64 MiB at 32 KiB per buffer). Have you increased the value of taskmanager.network.numberOfBuffers in flink-conf.yaml?

On Thu, Oct 20, 2016 at 11:24 AM, otherwise777 <[hidden email]> wrote:
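For reference, the setting Greg mentions goes in flink-conf.yaml. A minimal fragment might look like the following; the value 4096 is only an illustrative starting point, not a recommendation from this thread:

```yaml
# Number of 32 KiB network buffers the TaskManager allocates at startup.
# The default of 2048 (64 MiB total) can be too small for iterative jobs.
taskmanager.network.numberOfBuffers: 4096
```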
Also pay attention to the Flink version you are using. The configuration link you provided points to an old version (0.8); Gelly wasn't part of Flink then :) You probably need to look in [1].

Cheers,
-Vasia.

On 20 October 2016 at 17:53, Greg Hogan <[hidden email]> wrote:
I tried increasing taskmanager.network.numberOfBuffers to 4k and later to 8k, but I'm not sure my configuration file is even read. It's stored inside my IDE as follows: http://prntscr.com/cx0vrx — I build the Flink program from the IDE and run it there. I created several config files in different places to see if that helped, but nothing changed about the error.
Afaik I'm using Flink 1.1.2 and Gelly 1.2-SNAPSHOT; here's my pom.xml: http://paste.thezomg.com/19868/41341147/ I see that the document I linked to points to an older config file; that's probably because it's the first hit on Google. Thanks for pointing it out.
Hi,

On 21 October 2016 at 11:17, otherwise777 <[hidden email]> wrote:
> I tried increasing the taskmanager.network.

That's correct: if you're running your application through your IDE, the config file is not read. For passing configuration options to the local environment, please refer to [1]. Alternatively, you can start Flink from the command line and submit your job as a jar using the bin/flink command or the web interface. In that case, the configuration options you set in flink-conf.yaml will be taken into account. Please refer to [2] for more details.

I hope this helps!
-Vasia.
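A sketch of the first option Vasia describes: when running inside an IDE, you can build the local environment yourself and hand it a Configuration, instead of calling getExecutionEnvironment() and hoping a flink-conf.yaml is picked up. This assumes the Flink 1.1.x batch API; the buffer count 8192 and the class name are illustrative, not from the thread:

```java
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.ConfigConstants;
import org.apache.flink.configuration.Configuration;

public class LocalEnvWithBuffers {
    public static void main(String[] args) throws Exception {
        // Raise the network buffer count for this local run only.
        Configuration conf = new Configuration();
        conf.setInteger(ConfigConstants.TASK_MANAGER_NETWORK_NUM_BUFFERS_KEY, 8192);

        // createLocalEnvironment(Configuration) starts an embedded mini cluster
        // that honors these settings; getExecutionEnvironment() in an IDE does
        // not read flink-conf.yaml.
        ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment(conf);

        // ... build the Gelly graph and run the iterative algorithm as before ...
    }
}
```

The same Configuration object can carry any other flink-conf.yaml key, so this is also a convenient way to experiment with memory settings before deploying the job as a jar.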
Thank you so much, it worked immediately.