Re: data enrichment with SQL use case
Posted by Ken Krugler
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/data-enrichment-with-SQL-use-case-tp19520p19556.html
Hi Miki,
I haven’t tried mixing AsyncFunctions with SQL queries.
Normally I’d create a regular DataStream workflow that first reads from Kafka, then has an AsyncFunction to read from the SQL database.
If there are often duplicate keys in the Kafka-based stream, you could keyBy(key) before the AsyncFunction, and then cache the result of the SQL query.
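A minimal sketch of that kind of workflow is below; the class name, JDBC URL, and table/column names are just placeholders, and you'd swap in your own driver, query, and types.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Collections;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.async.ResultFuture;
import org.apache.flink.streaming.api.functions.async.RichAsyncFunction;

public class DbEnrichmentFunction extends RichAsyncFunction<String, Tuple2<String, String>> {

    private transient Connection connection;
    // Per-subtask cache. If you keyBy(key) upstream, records with the same key always
    // land on the same subtask, so duplicate keys hit this cache instead of the database.
    private transient Map<String, String> cache;

    @Override
    public void open(Configuration parameters) throws Exception {
        connection = DriverManager.getConnection("jdbc:mysql://db-host/enrichment"); // placeholder URL
        cache = new ConcurrentHashMap<>();
    }

    @Override
    public void asyncInvoke(String key, ResultFuture<Tuple2<String, String>> resultFuture) {
        String cached = cache.get(key);
        if (cached != null) {
            resultFuture.complete(Collections.singleton(Tuple2.of(key, cached)));
            return;
        }
        // Run the blocking JDBC lookup off the operator thread.
        CompletableFuture
            .supplyAsync(() -> lookup(key))
            .thenAccept(value -> {
                cache.put(key, value);
                resultFuture.complete(Collections.singleton(Tuple2.of(key, value)));
            })
            .exceptionally(t -> {
                resultFuture.completeExceptionally(t);
                return null;
            });
    }

    private String lookup(String key) {
        try (PreparedStatement stmt = connection.prepareStatement(
                "SELECT value FROM enrichment_table WHERE id = ?")) { // placeholder table/column
            stmt.setString(1, key);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}

Wiring it up looks roughly like this (timeout and capacity are arbitrary values here):

DataStream<Tuple2<String, String>> enriched = AsyncDataStream.unorderedWait(
    kafkaStream.keyBy(key -> key),   // optional, but lets the cache above see duplicate keys
    new DbEnrichmentFunction(),
    30, TimeUnit.SECONDS,            // per-request timeout
    100);                            // max in-flight async requests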
— Ken
Hi, thanks for the reply. I will try to break your reply down into the flow execution order:
the first data stream will use async I/O and select from the table,
the second stream will be Kafka, and then I can join the streams and map them?
If that is the case, then will I select the table only once, on load?
How can I make sure that my stream table is "fresh"?
I'm thinking to myself: is there a way to use the Flink state backend (RocksDB) to create a read/write-through
mechanism, roughly along the lines of the sketch below?
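Something like this is what I have in mind (I'm only guessing at how it might look; the class, connection, and table names are placeholders): keyed state acts as the cache, stored in RocksDB when the RocksDB state backend is configured, and on a miss or a stale entry it reads through to the database and updates the state.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

public class StateBackedDbCache extends RichFlatMapFunction<String, Tuple2<String, String>> {

    private static final long MAX_AGE_MS = 10 * 60 * 1000; // treat cached rows older than 10 min as stale

    private transient ValueState<Tuple2<Long, String>> cached; // (load time, value), one entry per key
    private transient Connection connection;

    @Override
    public void open(Configuration parameters) throws Exception {
        cached = getRuntimeContext().getState(new ValueStateDescriptor<>(
            "db-cache", TypeInformation.of(new TypeHint<Tuple2<Long, String>>() {})));
        connection = DriverManager.getConnection("jdbc:mysql://db-host/enrichment"); // placeholder URL
    }

    @Override
    public void flatMap(String key, Collector<Tuple2<String, String>> out) throws Exception {
        Tuple2<Long, String> entry = cached.value();
        long now = System.currentTimeMillis();
        if (entry == null || now - entry.f0 > MAX_AGE_MS) {
            // Read-through: on a miss (or stale entry) go back to the database and refresh the state.
            entry = Tuple2.of(now, lookup(key));
            cached.update(entry);
        }
        out.collect(Tuple2.of(key, entry.f1));
    }

    private String lookup(String key) throws Exception {
        try (PreparedStatement stmt = connection.prepareStatement(
                "SELECT value FROM enrichment_table WHERE id = ?")) { // placeholder table/column
            stmt.setString(1, key);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }
}

It would be used on a keyed stream, e.g. kafkaStream.keyBy(k -> k).flatMap(new StateBackedDbCache()), since keyed state is only available after a keyBy. The trade-off versus the AsyncFunction approach is that a cache miss blocks the operator thread on the JDBC call, but the cache itself lives in checkpointed, RocksDB-backed state.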
Thanks
miki