Maven artifacts scala 2.11 bug?

Maven artifacts scala 2.11 bug?

David Kim
Hello again,

I saw the recent change to Flink 1.0-SNAPSHOT that explicitly adds the Scala version suffix to the artifact names.

I have an sbt project that now fails to build. I don't believe it's a misconfiguration on my end, because the logs show everything being resolved with the _2.11 suffix.

Could this possibly be a bug in the Flink build pipeline for these new artifact names?

Here's the error together with the resolve logs:

[info]  [SUCCESSFUL ] org.apache.flink#flink-scala_2.11;1.0-SNAPSHOT!flink-scala_2.11.jar (4733ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-clients_2.11;1.0-SNAPSHOT!flink-clients_2.11.jar (3677ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-streaming-scala_2.11;1.0-SNAPSHOT!flink-streaming-scala_2.11.jar (3832ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-connector-kafka-0.8_2.11;1.0-SNAPSHOT!flink-connector-kafka-0.8_2.11.jar (3422ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-core;1.0-SNAPSHOT!flink-core.jar (3624ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-core;1.0-SNAPSHOT!flink-core.jar(test-jar) (2376ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-java;1.0-SNAPSHOT!flink-java.jar (3164ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-optimizer_2.11;1.0-SNAPSHOT!flink-optimizer_2.11.jar (4014ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-annotations;1.0-SNAPSHOT!flink-annotations.jar (1511ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-shaded-hadoop2;1.0-SNAPSHOT!flink-shaded-hadoop2.jar (7671ms)
[info]  [SUCCESSFUL ] javax.servlet#servlet-api;2.5!servlet-api.jar (433ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-runtime_2.11;1.0-SNAPSHOT!flink-runtime_2.11.jar (4392ms)
[info]  [SUCCESSFUL ] io.netty#netty-all;4.0.27.Final!netty-all.jar (6098ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-streaming-java_2.11;1.0-SNAPSHOT!flink-streaming-java_2.11.jar (3794ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-connector-kafka-base_2.11;1.0-SNAPSHOT!flink-connector-kafka-base_2.11.jar (3608ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-shaded-curator-recipes;1.0-SNAPSHOT!flink-shaded-curator-recipes.jar (3803ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-test-utils_2.11;1.0-SNAPSHOT!flink-test-utils_2.11.jar (2831ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-streaming-contrib_2.11;1.0-SNAPSHOT!flink-streaming-contrib_2.11.jar (2503ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-annotations_2.11;1.0-SNAPSHOT!flink-annotations_2.11.jar (1823ms)
[info]  [SUCCESSFUL ] org.apache.flink#flink-shaded-hadoop2_2.11;1.0-SNAPSHOT!flink-shaded-hadoop2_2.11.jar (6010ms)
[info] Done updating.
[error] Modules were resolved with conflicting cross-version suffixes in {file:/home/vagrant/host-bt/mint/}planchets:
[error]    org.apache.flink:flink-shaded-hadoop2 <none>, _2.11
[error]    org.apache.flink:flink-core <none>, _2.11
[error]    org.apache.flink:flink-annotations <none>, _2.11
java.lang.RuntimeException: Conflicting cross-version suffixes in: org.apache.flink:flink-shaded-hadoop2, org.apache.flink:flink-core, org.apache.flink:flink-annotations
        at scala.sys.package$.error(package.scala:27)
        at sbt.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:46)
        at sbt.ConflictWarning$.apply(ConflictWarning.scala:32)
        at sbt.Classpaths$$anonfun$66.apply(Defaults.scala:1164)

Thanks,
David

Re: Maven artifacts scala 2.11 bug?

rmetzger0
Hi David,

Can you post your SBT build file as well?

Re: Maven artifacts scala 2.11 bug?

David Kim
Hi Robert,

Here's the relevant snippet of my sbt config.

My dependencies are listed in a file called Dependencies.scala:

import sbt._

object Dependencies {

  val flinkVersion = "1.0-SNAPSHOT"

  val flinkDependencies = Seq(
    "org.apache.flink" %% "flink-scala" % flinkVersion,
    "org.apache.flink" %% "flink-clients" % flinkVersion,
    "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
    "org.apache.flink" %% "flink-connector-kafka-0.8" % flinkVersion
  )

  val testDependencies = Seq(
    "org.apache.flink" %% "flink-core" % flinkVersion % "it,test" classifier "tests",
    "org.apache.flink" %% "flink-test-utils" % flinkVersion % "it,test",
    "org.apache.flink" %% "flink-streaming-contrib" % flinkVersion % "it,test",
    "org.scalatest" %% "scalatest" % "2.2.4" % "it,test",
    "org.scalacheck" %% "scalacheck" % "1.12.5" % "it,test",
    "org.scalamock" %% "scalamock-scalatest-support" % "3.2" % "it,test",
    "net.manub" %% "scalatest-embedded-kafka" % "0.4.1" % "it,test"
  )
}

My project settings are in a file called MyBuild.scala:

import sbt._
import Keys._

object MyBuild extends Build {
  override lazy val settings = super.settings ++ Seq(
    scalaVersion := "2.11.7",
    scalacOptions += "-target:jvm-1.8",
    javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
  )
}
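
For context, roughly how these sequences are wired into the build inside MyBuild (sketch only; the project name is the one from the error output, the rest is approximate):

  // approximate wiring; the "it" scope requires the IntegrationTest configuration
  lazy val planchets = Project("planchets", file("."))
    .configs(IntegrationTest)
    .settings(Defaults.itSettings: _*)
    .settings(
      libraryDependencies ++= Dependencies.flinkDependencies ++ Dependencies.testDependencies
    )
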
Thanks,
David

Re: Maven artifacts scala 2.11 bug?

Stephan Ewen
Hi David!

The dependencies that SBT marks as wrong (org.apache.flink:flink-shaded-hadoop2, org.apache.flink:flink-core, org.apache.flink:flink-annotations) are actually those that are Scala-independent, and have no suffix at all.

Is it possible your sbt build does not like mixing dependencies with and without the suffix?
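
For illustration, %% appends the Scala binary version to the artifact name at resolution time, while a plain % leaves the name untouched. A sketch of a mixed set of declarations:

  // cross-built module: resolves to flink-scala_2.11 when scalaVersion is 2.11.x
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  // Scala-independent module: resolves to flink-core, no suffix
  "org.apache.flink" %  "flink-core"  % flinkVersion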

Greetings,
Stephan

Re: Maven artifacts scala 2.11 bug?

David Kim
Hi Stephan, Robert,

Yes, I found a solution. It turns out I shouldn't specify a suffix for flink-core, so I changed that dependency from %% to a plain %:

// before
"org.apache.flink" %% "flink-core" % flinkVersion % "it,test" classifier "tests",
// after
"org.apache.flink" %  "flink-core" % flinkVersion % "it,test" classifier "tests"


Once I did that I was able to resolve and build. Thanks for the help!
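
For completeness, the testDependencies sequence now reads as follows (sketch; only the flink-core line changed):

  val testDependencies = Seq(
    "org.apache.flink" %  "flink-core" % flinkVersion % "it,test" classifier "tests",
    "org.apache.flink" %% "flink-test-utils" % flinkVersion % "it,test",
    "org.apache.flink" %% "flink-streaming-contrib" % flinkVersion % "it,test",
    "org.scalatest" %% "scalatest" % "2.2.4" % "it,test",
    "org.scalacheck" %% "scalacheck" % "1.12.5" % "it,test",
    "org.scalamock" %% "scalamock-scalatest-support" % "3.2" % "it,test",
    "net.manub" %% "scalatest-embedded-kafka" % "0.4.1" % "it,test"
  )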

Cheers,
David

Re: Maven artifacts scala 2.11 bug?

Stephan Ewen
Good to hear!

Sorry for the hassle you had to go through; there is a lot of restructuring going on to make things clean for 1.0.

Greetings,
Stephan

