Hi, I see that you need to tell the Flink Stateful Functions runtime about remote stateful function modules via a YAML file provided at deploy time. Given that remote modules and stateful functions are an external deployment concern anyway, is it possible to dynamically associate remote modules and their remote function endpoints with an existing, already running Flink StateFun application? The use case is allowing dynamic composability of functions: the StateFun application would receive a request to dynamically string together a new route based on some input data, and the new route would make calls to, say, a newly created Flink stateful application deployed remotely in an arbitrary language. Is this possible? On the roadmap? A bad idea? Why or why not? Thank you!
Hello!
Yes, the upcoming StateFun release introduces exactly that :) We are adding the capability to dynamically register and modify function types without needing to redeploy Flink. Igal.
That's awesome, thank you! Is there a JIRA I can follow?
Is it possible to dynamically inject new SQL to be executed against a stream while the Flink application is running? Thank you!
Hi Ahmad, AFAIK that's not directly supported, and it would not work well with how Flink is designed today, since new joins would potentially require new network connections. You can, however, execute ad-hoc SQL queries against an already running Flink cluster. Additionally, if your SQL is rather simple (filter/transform/join in static forms), you can use Calcite to execute SQL inside a UDF: you would send your SQL query through a broadcast to all operators, then interpret and apply it in your user code (a rough sketch of that pattern follows below). I have built something similar in the past and could give more details.
-- Arvid Heise
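For illustration only, here is a minimal sketch of the broadcast pattern described above, using Flink's broadcast state. The socket sources, port numbers, class name, and the trivial contains() check standing in for real SQL/Calcite evaluation are hypothetical placeholders, not the actual implementation referred to in the reply:

```java
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
import org.apache.flink.util.Collector;

public class DynamicRuleJob {

    // Broadcast state holding the latest rule/expression under a fixed key.
    private static final MapStateDescriptor<String, String> RULES =
            new MapStateDescriptor<>("rules", String.class, String.class);

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical sources: the events to process and a stream of rule updates.
        DataStream<String> events = env.socketTextStream("localhost", 9999);
        DataStream<String> ruleUpdates = env.socketTextStream("localhost", 9998);

        // Broadcast the rules so every parallel instance sees the same expression.
        BroadcastStream<String> broadcastRules = ruleUpdates.broadcast(RULES);

        events.connect(broadcastRules)
              .process(new BroadcastProcessFunction<String, String, String>() {

                  @Override
                  public void processElement(String event, ReadOnlyContext ctx,
                                             Collector<String> out) throws Exception {
                      String rule = ctx.getBroadcastState(RULES).get("current");
                      // Placeholder: a real implementation would parse and evaluate the
                      // expression (e.g. via Calcite) against the event instead of this
                      // naive substring check.
                      if (rule == null || event.contains(rule)) {
                          out.collect(event);
                      }
                  }

                  @Override
                  public void processBroadcastElement(String rule, Context ctx,
                                                      Collector<String> out) throws Exception {
                      // Store the newest rule; it takes effect for subsequent events.
                      ctx.getBroadcastState(RULES).put("current", rule);
                  }
              })
              .print();

        env.execute("dynamic-rule-sketch");
    }
}
```

Each new string arriving on the rule stream replaces the previous one on all parallel instances, which is what lets the logic change without restarting the job.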
Hi Ahmad,
Yes, there is [1]. This feature is already in master, and if you would like, you can already try it out. Take a look at the statefun-python-greeter-example [2]. With that example, you can:
(a) Make changes to the greeter function, such as introducing a new state type [3] or changing the logic, without restarting Flink.
(b) Modify the example's module.yaml by removing this line [4], and then:
(b.1) Add additional function types under the "example" namespace that will be served by the http://python-worker:8000/statefun endpoint.
(b.2) Use the templating feature to define different endpoints (host and path) for different function types. For example: [5] (an illustrative sketch follows below).
Please note that changes to module.yaml still require a restart of the Flink cluster, but after that you can make arbitrary changes to the function definitions, add or remove functions, and create new endpoints that serve new functions, without restarting Flink.
Good luck, Igal.
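As an illustrative sketch only (the exact module.yaml schema depends on the StateFun version, so treat the linked example [5] and the release docs as authoritative), a templated endpoint fragment along the lines of (b.2) might look roughly like this, reusing the namespace, host, and port from the greeter example:

```yaml
# Rough sketch only -- field names and layout may differ between StateFun versions.
endpoints:
  - endpoint:
      meta:
        kind: http
      spec:
        # Every function type under the "example" namespace is routed to this endpoint;
        # {function.name} is substituted per function type.
        functions: example/*
        urlPathTemplate: http://python-worker:8000/statefun/{function.name}
```

If your release supports a wildcard like this, serving a brand-new function type in the "example" namespace from the templated endpoint requires no change on the Flink side.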