http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Testing-Kafka-interface-using-Flink-interactive-shell-tp6141p6155.html
You can add external dependencies to the Scala shell using the `--addclasspath` option. There is a more detailed description in the documentation [1].
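For example, a minimal sketch (assuming the bin/start-scala-shell.sh script from a Flink binary distribution; the exact invocation varies by Flink version, and the jar path is the one from the question below):

bin/start-scala-shell.sh --addclasspath /home/hduser/jars/flink-connector-kafka-0.10.1.jar

Scala-Flink> import org.apache.flink.streaming.connectors.kafka._

Note that the connector jar is not self-contained, so its transitive dependencies (e.g. the Kafka client jars) may also need to be added to the classpath.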
> On Apr 17, 2016, at 6:04 PM, Mich Talebzadeh <[hidden email]> wrote:
>
> Hi,
>
> In the Spark shell I can load the Kafka jar file through the spark-shell option --jars:
>
> spark-shell --master spark://50.140.197.217:7077 --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar
>
> This works fine.
>
> In Flink I have added the jar file /home/hduser/jars/flink-connector-kafka-0.10.1.jar to the CLASSPATH.
>
> However, I don't get any support for it within the Flink shell:
>
> Scala-Flink> import org.apache.flink.streaming.connectors.kafka
> <console>:54: error: object connectors is not a member of package org.apache.flink.streaming
> import org.apache.flink.streaming.connectors.kafka
>                                   ^
>
> Any ideas will be appreciated.
>
> Dr Mich Talebzadeh
>
> LinkedIn <https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw>
>
> <http://talebzadehmich.wordpress.com>