Hi to all,
in my vertex-centric iteration I get the following exception. Am I doing something wrong, or is it a bug in Gelly?

starting iteration [1]: CoGroup (Messaging) (6/8)
IterationHead(WorksetIteration (Vertex-centric iteration (test.gelly.functions.VUpdateFunction@1814786f | test.gelly.functions.VMessagingFunction@67eecbc2))) (4/8) switched to FAILED with exception.
java.io.EOFException
    at org.apache.flink.runtime.operators.hash.InMemoryPartition$WriteView.nextSegment(InMemoryPartition.java:333)
    at org.apache.flink.runtime.memorymanager.AbstractPagedOutputView.advance(AbstractPagedOutputView.java:140)
    at org.apache.flink.runtime.memorymanager.AbstractPagedOutputView.write(AbstractPagedOutputView.java:201)
    at org.apache.flink.api.java.typeutils.runtime.DataOutputViewStream.write(DataOutputViewStream.java:39)
    at com.esotericsoftware.kryo.io.Output.flush(Output.java:163)
    at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.serialize(KryoSerializer.java:187)
    at org.apache.flink.api.java.typeutils.runtime.PojoSerializer.serialize(PojoSerializer.java:372)
    at org.apache.flink.api.java.typeutils.runtime.TupleSerializer.serialize(TupleSerializer.java:116)
    at org.apache.flink.api.java.typeutils.runtime.TupleSerializer.serialize(TupleSerializer.java:30)
    at org.apache.flink.runtime.operators.hash.InMemoryPartition.appendRecord(InMemoryPartition.java:219)
    at org.apache.flink.runtime.operators.hash.CompactingHashTable.insertOrReplaceRecord(CompactingHashTable.java:536)
    at org.apache.flink.runtime.operators.hash.CompactingHashTable.buildTableWithUniqueKey(CompactingHashTable.java:347)
    at org.apache.flink.runtime.iterative.task.IterationHeadPactTask.readInitialSolutionSet(IterationHeadPactTask.java:209)
    at org.apache.flink.runtime.iterative.task.IterationHeadPactTask.run(IterationHeadPactTask.java:270)
    at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:362)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:559)
    at java.lang.Thread.run(Thread.java:745)

Best,
Flavio
Hi Flavio,

Could you also show us a code snippet?

On Tue, Jul 14, 2015 at 2:06 PM, Flavio Pompermaier <[hidden email]> wrote:
Hi,
looks very similar to this bug: https://issues.apache.org/jira/browse/FLINK-1916

Best,
Mihail

On 14.07.2015 14:09, Andra Lungu wrote:
Hello,

Sorry for the delay. The bug is not in Gelly but, as hinted in the exception and as can be seen in the logs, in Flink's runtime. Mihail may actually be on to something: the bug is very similar to the one described in FLINK-1916.

On Tue, Jul 14, 2015 at 2:12 PM, Mihail Vieru <[hidden email]> wrote:
Thanks to all for the help... now let's hope for a fix ;)

On 14 Jul 2015 22:01, "Andra Lungu" <[hidden email]> wrote:
I thought a bit about this error... in my job I was generating multiple vertices with the same id.
Could this cause such errors? Maybe Gelly could check for such situations...

On Tue, Jul 14, 2015 at 10:00 PM, Andra Lungu <[hidden email]> wrote:
For now, there is a validator that checks whether the vertex ids correspond to the source/target ids in the edges. If you want to check for vertex id uniqueness, you'll have to implement your own custom check... I know people who hit the same error outside Gelly, so I doubt that the lack of unique ids is what triggered the exception :)

On Thu, Jul 16, 2015 at 6:14 PM, Flavio Pompermaier <[hidden email]> wrote:
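A minimal sketch of the kind of custom uniqueness check mentioned above, written against the plain DataSet API rather than Gelly's validator classes. The Long/Double types, the class name, and the toy input are all just illustrative, not taken from the job in this thread:

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.graph.Vertex;

public class DuplicateIdCheck {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Toy vertex set with a duplicate id (1L appears twice).
        DataSet<Vertex<Long, Double>> vertices = env.fromElements(
                new Vertex<Long, Double>(1L, 0.0),
                new Vertex<Long, Double>(1L, 5.0),
                new Vertex<Long, Double>(2L, 0.0));

        // Count occurrences per id and keep only the ids that appear more than once.
        DataSet<Tuple2<Long, Long>> duplicateIds = vertices
                .map(new MapFunction<Vertex<Long, Double>, Tuple2<Long, Long>>() {
                    @Override
                    public Tuple2<Long, Long> map(Vertex<Long, Double> v) {
                        return new Tuple2<Long, Long>(v.getId(), 1L);
                    }
                })
                .groupBy(0)
                .sum(1)
                .filter(new FilterFunction<Tuple2<Long, Long>>() {
                    @Override
                    public boolean filter(Tuple2<Long, Long> idAndCount) {
                        return idAndCount.f1 > 1L;
                    }
                });

        // Any output here means the vertex ids are not unique.
        duplicateIds.print();
    }
}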
But isn't it a problem to have multiple vertices with the same id?

On 16 Jul 2015 19:34, "Andra Lungu" <[hidden email]> wrote:
Hi Flavio,

Gelly currently makes no sanity checks on the input graph data; for performance reasons, we decided to leave it to the user to verify that the graph is valid. That means some Gelly methods may assume the input graph is valid, i.e. that no duplicate vertices exist, there are no invalid edge ids, etc. I can't tell whether this particular error is caused by the duplicate vertices, but it might be.

Cheers,
Vasia

On 16 July 2015 at 20:43, Flavio Pompermaier <[hidden email]> wrote:
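If the duplicate vertices do turn out to be the problem, one way to guard against them is to deduplicate the vertex DataSet before building the graph. A minimal sketch, again with illustrative Long/Double types and toy input rather than the actual job:

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.graph.Edge;
import org.apache.flink.graph.Graph;
import org.apache.flink.graph.Vertex;

public class DedupVerticesExample {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Toy input with a duplicate id (1L appears twice).
        DataSet<Vertex<Long, Double>> vertices = env.fromElements(
                new Vertex<Long, Double>(1L, 0.0),
                new Vertex<Long, Double>(1L, 5.0),
                new Vertex<Long, Double>(2L, 0.0));
        DataSet<Edge<Long, Double>> edges = env.fromElements(
                new Edge<Long, Double>(1L, 2L, 1.0));

        // Vertex is a Tuple2<id, value>, so distinct(0) keeps one vertex per id.
        DataSet<Vertex<Long, Double>> uniqueVertices = vertices.distinct(0);

        Graph<Long, Double, Double> graph = Graph.fromDataSet(uniqueVertices, edges, env);
        graph.getVertices().print();
    }
}

Note that distinct(0) keeps an arbitrary one of the duplicates; if the duplicate vertices carry different values, a groupBy(0) with an explicit reduce would let you decide which value to keep.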