Hello :) I get an InvalidTypesException while starting a Stratosphere job. Could anyone help me? I would welcome your response.

xxx@xxx:~$ command time -o /home/hoenicke/stratosphere-0.5.1-hadoop2/zeitenKleinVernica.txt -a -f '%e' /home/hoenicke/stratosphere-0.5.1-hadoop2/bin/stratosphere run /home/hoenicke/stratosphere-0.5.1-hadoop2/jar/vernica.jar 0.9 \
    file:///home/hoenicke/stratosphere-0.5.1-hadoop2/input/klein file:///home/hoenicke/stratosphere-0.5.1-hadoop2/output -v

(attachment: setJoin.zip, 25K)
Hi! It is pretty much as the exception says. In the current state, Flink does (as Stratosphere did) not support abstract types for the function signatures. The reason is that we need to do some type introspection and inference to set up our runtime utilities.

What you can do is use type variables with bounds. This helps you in most cases:

class MyGenericFilter<T extends AbstractType> implements FilterFunction<T> {
    public boolean filter(T value) {
        // inside here, you can assume all that "AbstractType" has
    }
}

DataSet<ConcreteType> data = ...
data.filter(new MyGenericFilter<ConcreteType>()).print();

Greetings,
Stephan

On Wed, Sep 17, 2014 at 8:18 PM, Rockstar Flo <[hidden email]> wrote:
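To make that pattern concrete, here is a minimal, self-contained sketch. The AbstractType/ConcreteType classes and the getId() accessor are hypothetical (not from the mail above), and it assumes the newer Flink API where FilterFunction is an interface in org.apache.flink.api.common.functions:

import org.apache.flink.api.common.functions.FilterFunction;

// Hypothetical bound: the filter relies only on what AbstractType declares.
abstract class AbstractType {
    public abstract int getId();
}

class ConcreteType extends AbstractType {
    private final int id;
    ConcreteType(int id) { this.id = id; }
    @Override
    public int getId() { return id; }
}

class PositiveIdFilter<T extends AbstractType> implements FilterFunction<T> {
    @Override
    public boolean filter(T value) {
        // Only methods from the bound are used; the concrete type parameter
        // is fixed at the call site, so type extraction sees a concrete type.
        return value.getId() > 0;
    }
}

// Usage, e.g. inside a job's main():
//   DataSet<ConcreteType> data = env.fromElements(new ConcreteType(1), new ConcreteType(-2));
//   data.filter(new PositiveIdFilter<ConcreteType>()).print();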
This is the type signature of the FlatMap where the exception is thrown:

public static class FlatMapPK extends FlatMapFunction<String, Tuple3<Integer, Integer, int[]>>

On the current version it works. Did anything maybe change, so that arrays did not work yet in 0.5.x?

On Wed, Sep 17, 2014 at 9:23 PM, Stephan Ewen <[hidden email]> wrote:
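For context, a minimal sketch of what the whole class around that signature could look like. Only the signature is from the mail; the parsing/emit logic and the Stratosphere 0.5.x package names (eu.stratosphere.*) are assumptions for illustration:

import eu.stratosphere.api.java.functions.FlatMapFunction;
import eu.stratosphere.api.java.tuple.Tuple3;
import eu.stratosphere.util.Collector;

// Only the class signature is from the original mail; the body is a made-up
// example that emits one record per input line, with a primitive int array
// as the third tuple field (the field that trips the 0.5.x type extraction).
public static class FlatMapPK extends FlatMapFunction<String, Tuple3<Integer, Integer, int[]>> {

    @Override
    public void flatMap(String line, Collector<Tuple3<Integer, Integer, int[]>> out) {
        String[] parts = line.split(" ");
        int[] tokens = new int[parts.length - 2];
        for (int i = 2; i < parts.length; i++) {
            tokens[i - 2] = Integer.parseInt(parts[i]);
        }
        out.collect(new Tuple3<Integer, Integer, int[]>(
                Integer.parseInt(parts[0]), Integer.parseInt(parts[1]), tokens));
    }
}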
Problem solved! :)
Thanks a lot. I changed "int[]" to "Integer[]" and now it works. Florian

On 18.09.2014 07:07, Aljoscha Krettek wrote:
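So on 0.5.x the workaround is the boxed array type in the output tuple. A sketch of the changed signature; only the int[] to Integer[] change is from the thread, the rest mirrors the earlier sketch:

// Same class as before, but with a boxed Integer[] field, which the
// 0.5.x type extraction can handle.
public static class FlatMapPK extends FlatMapFunction<String, Tuple3<Integer, Integer, Integer[]>> {

    @Override
    public void flatMap(String line, Collector<Tuple3<Integer, Integer, Integer[]>> out) {
        // ... same parsing as before, but building an Integer[] instead of an int[]
    }
}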
In newer versions it also works with int[]. :D On Thu, Sep 18, 2014 at 12:28 PM, Florian Hönicke <[hidden email]> wrote:
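For readers on a current release: the same mapper compiles and runs with the primitive array again, for example against the newer interface-based API (the org.apache.flink package names below are an assumption, not from the thread):

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.util.Collector;

// In newer Flink versions, the type extractor supports primitive arrays,
// so the original int[] tuple field needs no workaround.
public static class FlatMapPK implements FlatMapFunction<String, Tuple3<Integer, Integer, int[]>> {

    @Override
    public void flatMap(String line, Collector<Tuple3<Integer, Integer, int[]>> out) {
        // ... same parsing as in the 0.5.x sketch above
    }
}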