Hi guys!
I’m new to Flink, and actually to this mailing list as well :) this is my first message. I’m still reading the documentation, and I have to say Flink is an amazing system! Thanks to everybody who participated in its development!

One thing I didn’t find in the documentation: is it possible to describe a data(stream) transformation without any code (Java/Scala)? That is, can the data source functions, all of the operators, the connections between them, and the sinks be described in a plain-text configuration file and then fed to Flink? That would make it possible to change the data flow without recompilation or redeployment. Is there such functionality in Flink, or maybe a third-party plugin?

Thank you,
Alex
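For reference, the kind of hard-coded program in question might look roughly like the sketch below; the socket source, uppercase map, and print sink are arbitrary placeholders.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class HardCodedPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Source, transformation, and sink are all fixed at compile time;
        // changing any of them means recompiling and redeploying the job.
        DataStream<String> lines = env.socketTextStream("localhost", 9999);

        lines.map(new MapFunction<String, String>() {
            @Override
            public String map(String value) {
                return value.toUpperCase();
            }
        }).print();

        env.execute("hard-coded pipeline");
    }
}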
Hi Alex,

welcome to the Flink community! This has been done before, but for other programming APIs (Flink's own libraries such as the Table API, Gelly, and FlinkML, and external projects such as Apache Beam / Google Dataflow, Mahout, Cascading, ...). However, all of these are again programming APIs, some of them specialized for certain use cases. Specifying Flink programs through config files (or graphically) would require a data model, a DataStream/DataSet program generator, and probably a code-generation component.

Best,
Fabian
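A very rough sketch of what such a "DataStream program generator" could look like for a deliberately tiny configuration format is shown below; the property names (source.host, source.port, operator, sink) and the supported operator vocabulary are invented for illustration.

import java.io.FileInputStream;
import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ConfigDrivenPipeline {
    public static void main(String[] args) throws Exception {
        // Load the pipeline description, e.g. pipeline.properties.
        Properties conf = new Properties();
        conf.load(new FileInputStream(args[0]));

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // "Data model": here simply a stream of Strings read from a socket.
        DataStream<String> stream = env.socketTextStream(
                conf.getProperty("source.host", "localhost"),
                Integer.parseInt(conf.getProperty("source.port", "9999")));

        // "Program generator": map config keywords to concrete operators.
        if ("uppercase".equals(conf.getProperty("operator"))) {
            stream = stream.map(new MapFunction<String, String>() {
                @Override
                public String map(String value) {
                    return value.toUpperCase();
                }
            });
        }

        // Sink selection from the config.
        if ("print".equals(conf.getProperty("sink", "print"))) {
            stream.print();
        }

        env.execute("config-driven pipeline");
    }
}

Anything beyond such a fixed vocabulary of operators quickly calls for the code-generation component mentioned above, since arbitrary user logic cannot be expressed in plain key/value settings.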
Hi,

I think if the Table API / SQL API evolves enough, it should be possible to supply a Flink program as just an SQL query plus source/sink definitions. Hopefully, in the future. :-)

Cheers,
Aljoscha
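As a sketch of what that could eventually look like, the following assumes a much later Flink version with SQL DDL support; the TableEnvironment calls and the datagen/print connectors are not available in the Flink version discussed in this thread.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class SqlOnlyPipeline {
    public static void main(String[] args) throws Exception {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Source definition as DDL instead of Java code.
        tEnv.executeSql(
            "CREATE TABLE events (word STRING) WITH ('connector' = 'datagen')");

        // Sink definition as DDL.
        tEnv.executeSql(
            "CREATE TABLE results (word STRING) WITH ('connector' = 'print')");

        // The whole "program" is a single SQL statement.
        tEnv.executeSql("INSERT INTO results SELECT UPPER(word) FROM events").await();
    }
}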
Hi Alexander,

you might also want to have a look at Meteor and Sopremo [1], an earlier effort in the Stratosphere project (Flink's predecessor) to specify data flows with a declarative scripting language instead of Java/Scala code.

Best,
Flavio

[1] http://stratosphere.eu/assets/papers/Sopremo_Meteor%20BigData.pdf
Thank you so much for the responses, guys!