Flink 1.7 Notebook Environment

Flink 1.7 Notebook Environment

Faizan Ahmed
Hi all,
I have been searching around quite a bit and running my own experiments to get the latest Flink release (1.7.1) working with Apache Zeppelin; however, Apache Zeppelin's Flink interpreter is quite outdated (Flink 1.3). AFAIK it's not possible to use Flink running on YARN via Zeppelin, as the interpreter only works with a local cluster.
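For context, the local-cluster limitation shows up in the interpreter settings: Zeppelin's bundled Flink interpreter is configured through a host/port pair pointing at a JobManager, with no YARN option. A sketch of the relevant properties (names and defaults as I remember them from Zeppelin 0.8, so treat them as an assumption):

```
# Zeppelin > Interpreter > flink (assumed property names)
host   local   # "local" starts an embedded mini-cluster; a hostname connects to a standalone JobManager
port   6123    # JobManager RPC port (Flink's default)
```

So it can reach a standalone cluster's JobManager, but there is no property for a YARN session or application id.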

Has anyone been able to run Flink's latest release on Zeppelin? If yes, then please share some instructions/tutorials. If not, is there any other suitable notebook environment for running Flink (maybe Jupyter)? I want to prototype my ideas in Flink, and since I'm coming from a Spark background it would be really useful to have a notebook environment for validation of Flink apps.

Looking forward to your response

Thanks

Re: Flink 1.7 Notebook Environment

Jeff Zhang
Hi Faizan,

I have implemented a Flink interpreter for Blink, which Alibaba recently donated to the Flink community. You may have seen the news.

Here are some tutorials you may be interested in:




Faizan Ahmed <[hidden email]> wrote on Mon, Feb 11, 2019 at 11:44 AM:


--
Best Regards

Jeff Zhang