Hi Flinksters,

Thanks! I just tried to build the current YARN version of Flink. The second error is probably because my Maven version is too old, but the first one seems to be a genuine error.

albermax@hadoop1:~/bumpboost/working/programs/flink/incubator-flink$ mvn clean package -DskipTests -Dhadoop.profile=2
[INFO] Scanning for projects...
[ERROR] The build could not read 2 projects -> [Help 1]
[ERROR]
[ERROR] The project org.apache.flink:flink-shaded-hadoop:0.10-SNAPSHOT (/home/albermax/bumpboost/working/programs/flink/incubator-flink/flink-shaded-hadoop/pom.xml) has 1 error
[ERROR]   Child module /home/albermax/bumpboost/working/programs/flink/incubator-flink/flink-shaded-hadoop/error of /home/albermax/bumpboost/working/programs/flink/incubator-flink/flink-shaded-hadoop/pom.xml does not exist
[ERROR]
[ERROR] The project org.apache.flink:flink-hbase:0.10-SNAPSHOT (/home/albermax/bumpboost/working/programs/flink/incubator-flink/flink-staging/flink-hbase/pom.xml) has 1 error
[ERROR]   'dependencies.dependency.version' for org.apache.hbase:hbase-server:jar must be a valid version but is '${hbase.version}'. @ line 97, column 13
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
Hi Max!

Nowadays, the default target when building from source is Hadoop 2, so a simple mvn clean package -DskipTests should do it. You only need the flag when you build for Hadoop 1: -Dhadoop.profile=1.

On Tue, Jun 23, 2015 at 2:03 PM, Maximilian Alber <[hidden email]> wrote:
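For reference, a minimal sketch of the two invocations described above, assuming you build from the repository root:

    # default build: targets Hadoop 2, skips tests
    mvn clean package -DskipTests

    # explicit Hadoop 1 build
    mvn clean package -DskipTests -Dhadoop.profile=1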
Thanks! I'm still used to 0.7 :)

Cheers,

On Tue, Jun 23, 2015 at 2:18 PM, Maximilian Michels <[hidden email]> wrote:
I think it is a valid concern that the POM should be written such that the given command is valid. It should only require a simple change to the root POM: make sure the Hadoop 2 profile is activated not only in the absence of a profile selection, but also when the selected profile is "2".

Stephan

On Tue, Jun 23, 2015 at 2:18 PM, Maximilian Michels <[hidden email]> wrote:
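A minimal sketch of what such a root POM change could look like (the profile ids here are made up for illustration and are not the actual Flink POM contents): one profile activates when no hadoop.profile property is given, and a second profile carrying the same Hadoop 2 settings activates when the user explicitly passes -Dhadoop.profile=2.

    <profiles>
      <!-- active when no -Dhadoop.profile=... is passed on the command line -->
      <profile>
        <id>hadoop-2</id>
        <activation>
          <property>
            <name>!hadoop.profile</name>
          </property>
        </activation>
        <!-- Hadoop 2 dependencies and properties go here -->
      </profile>

      <!-- active when the user explicitly passes -Dhadoop.profile=2 -->
      <profile>
        <id>hadoop-2-explicit</id>
        <activation>
          <property>
            <name>hadoop.profile</name>
            <value>2</value>
          </property>
        </activation>
        <!-- same Hadoop 2 dependencies and properties -->
      </profile>
    </profiles>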