Testing Kafka interface using Flink interactive shell


Testing Kafka interface using Flink interactive shell

Mich Talebzadeh
Hi,

In the Spark shell I can load the Kafka jar file through the spark-shell option --jars:

spark-shell --master spark://50.140.197.217:7077 --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar

This works fine.

In Flink I have added the jar file /home/hduser/jars/flink-connector-kafka-0.10.1.jar to the CLASSPATH.

However, the Flink Scala shell does not pick it up:

Scala-Flink> import org.apache.flink.streaming.connectors.kafka
<console>:54: error: object connectors is not a member of package org.apache.flink.streaming
            import org.apache.flink.streaming.connectors.kafka
                                              ^

Any ideas will be appreciated.


Re: Testing Kafka interface using Flink interactive shell

Chiwan Park-2
Hi Mich,

You can add external dependencies to the Scala shell using the `--addclasspath` option. There is a more detailed description in the documentation [1].

[1]: https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/scala_shell.html#adding-external-dependencies
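Concretely, the shell would be started with the connector jar passed via that option. A minimal sketch, reusing the jar path from the original message (the `local` mode argument and flag name are as documented for the Flink 1.0 Scala shell):

```shell
# Start the Flink Scala shell with the Kafka connector jar on the classpath
bin/start-scala-shell.sh local \
  --addclasspath /home/hduser/jars/flink-connector-kafka-0.10.1.jar
```

After this, the `import org.apache.flink.streaming.connectors.kafka...` statement should resolve inside the shell.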

Regards,
Chiwan Park

> On Apr 17, 2016, at 6:04 PM, Mich Talebzadeh <[hidden email]> wrote:
>
> [quoted text of the original message trimmed]
>
> Dr Mich Talebzadeh
>
> LinkedIn  https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
> http://talebzadehmich.wordpress.com


Re: Testing Kafka interface using Flink interactive shell

Mich Talebzadeh
Thanks Chiwan. It worked.

Now I have a simple streaming program in Spark Scala that reads streaming data via Kafka; please see the attachment.

I am trying to make the equivalent work with Flink + Kafka.

Any hints will be appreciated.
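For reference, a minimal Flink Kafka consumer in Scala could look like the sketch below. This is not the attached Spark program; it assumes the 0.8-era consumer class (`FlinkKafkaConsumer082`) shipped with the flink-connector-kafka 0.10.x jar mentioned earlier, and the broker/ZooKeeper addresses, topic name, and group id are placeholders:

```scala
import java.util.Properties
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object KafkaTest {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Consumer configuration -- addresses and group id are assumptions
    val props = new Properties()
    props.setProperty("bootstrap.servers", "localhost:9092")
    props.setProperty("zookeeper.connect", "localhost:2181")
    props.setProperty("group.id", "flink-kafka-test")

    // Read the topic as a stream of strings and print each record
    val stream = env.addSource(
      new FlinkKafkaConsumer082[String]("test", new SimpleStringSchema(), props))
    stream.print()

    env.execute("Flink Kafka test")
  }
}
```

Inside the Scala shell the pre-created streaming environment would be used instead of creating one, but the source and sink calls are the same.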

Thanks




On 18 April 2016 at 02:43, Chiwan Park <[hidden email]> wrote:
[quoted text of the previous two messages trimmed]



Attachment: spark-kafka.scala.txt (2K)