Hi
I have attached a fictitious example (PDF file) about processing event traces from data streams (or batch data). I hope the picture in the attachment is clear and understandable.
I would be very interested in how best to solve this with Flink, or whether it is possible at all. If it is possible, can it be solved with, for example, CEP, the Table/SQL API, or Gelly?
A little explanation: the data processing reads three different, parallel streams (or batch datasets): A, B and C. Each of them carries events (records) with different key-value fields (like K1-K4).
I want to find all event traces that have certain dependencies or patterns between the streams (or batches). Finding a pattern takes three steps (see the rough sketch after the list):
1) Search for an event in stream A whose K1 has the value "X"; if it is found, store it in global data for later use and continue to the next step.
2) Search for an event in stream B whose K2 has the value A(K1); if it is found, store it in global data for later use and continue to the next step.
3) Search for an event in stream C whose K1 has the value A(K1) and whose K2 has the value B(K3); if it is found, continue (back to step 1).
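To make the intended pattern a bit more concrete, here is a rough, untested sketch of how the three steps might be expressed with Flink CEP, assuming the three streams can first be mapped to a common Event type. The class name Event and the fields source/k1/k2/k3/k4 are just placeholders matching the attached example, not a real schema:

    import org.apache.flink.cep.CEP;
    import org.apache.flink.cep.PatternStream;
    import org.apache.flink.cep.pattern.Pattern;
    import org.apache.flink.cep.pattern.conditions.IterativeCondition;
    import org.apache.flink.cep.pattern.conditions.SimpleCondition;
    import org.apache.flink.streaming.api.datastream.DataStream;

    public class TraceSketch {

        // Hypothetical common event type: events from streams A, B and C are
        // mapped into this shape, with `source` telling which stream they came from.
        public static class Event {
            public String source; // "A", "B" or "C"
            public String k1;
            public String k2;
            public String k3;
            public String k4;
        }

        public static PatternStream<Event> buildPattern(DataStream<Event> a,
                                                        DataStream<Event> b,
                                                        DataStream<Event> c) {
            Pattern<Event, ?> trace = Pattern.<Event>begin("stepA")
                // Step 1: event in stream A with K1 == "X"
                .where(new SimpleCondition<Event>() {
                    @Override
                    public boolean filter(Event e) {
                        return "A".equals(e.source) && "X".equals(e.k1);
                    }
                })
                // Step 2: event in stream B whose K2 equals A's K1
                .followedBy("stepB")
                .where(new IterativeCondition<Event>() {
                    @Override
                    public boolean filter(Event e, Context<Event> ctx) throws Exception {
                        if (!"B".equals(e.source)) {
                            return false;
                        }
                        Event stepA = ctx.getEventsForPattern("stepA").iterator().next();
                        return stepA.k1.equals(e.k2);
                    }
                })
                // Step 3: event in stream C whose K1 equals A's K1 and K2 equals B's K3
                .followedBy("stepC")
                .where(new IterativeCondition<Event>() {
                    @Override
                    public boolean filter(Event e, Context<Event> ctx) throws Exception {
                        if (!"C".equals(e.source)) {
                            return false;
                        }
                        Event stepA = ctx.getEventsForPattern("stepA").iterator().next();
                        Event stepB = ctx.getEventsForPattern("stepB").iterator().next();
                        return stepA.k1.equals(e.k1) && stepB.k3.equals(e.k2);
                    }
                });

            // Match the pattern on the union of the three (already unified) streams.
            return CEP.pattern(a.union(b, c), trace);
        }
    }

The sketch simply unions the three streams into one and relies on CEP's ordering; whether that assumption fits the real data, and whether keeping the A/B matches in "global data" should instead be done with keyed state (e.g. in a CoProcessFunction), is exactly the part I am unsure about.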
If that is not possible with Flink, do you have any ideas about tools that could solve this?
Best, Esa