Hi,
calling DataStream.partitionCustom() with a custom Partitioner and the
user-id field (or a key selector) before adding the sink should do the
trick, I think.
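
Roughly along these lines (just a sketch, assuming the user id is the
first field of a Tuple2 and non-negative; the PrintSinkFunction is only
a stand-in for your actual Elasticsearch sink):

import org.apache.flink.api.common.functions.Partitioner;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;

public class PartitionBeforeSink {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // (userId, payload) -- stands in for your real event stream
        DataStream<Tuple2<Long, String>> events = env.fromElements(
                Tuple2.of(1L, "login"),
                Tuple2.of(2L, "click"),
                Tuple2.of(1L, "logout"));

        events
                // send every event of one user id to the same parallel sink instance
                .partitionCustom(new Partitioner<Long>() {
                    @Override
                    public int partition(Long userId, int numPartitions) {
                        // assumes non-negative user ids
                        return (int) (userId % numPartitions);
                    }
                }, 0) // 0 = position of the user id field in the tuple
                // replace the PrintSinkFunction with your ElasticsearchSink
                .addSink(new PrintSinkFunction<Tuple2<Long, String>>());

        env.execute("partition-before-sink");
    }
}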
Cheers,
Konstantin
On 14.04.2016 01:22, neo21 zerro wrote:
> Hello everybody,
>
> I have an Elasticsearch sink in my Flink topology.
> My requirement is to write the data to my sink in a partitioned fashion.
>
> For example, I have a Tuple which contains a user id. I want to group all events by user id and route all events for one particular user to the same ES sink.
>
> Is it possible to achieve something like this in Flink?
>
>
> Thanks!
>
--
Konstantin Knauf * [hidden email] * +49-174-3413182
TNG Technology Consulting GmbH, Betastr. 13a, 85774 Unterföhring
Geschäftsführer: Henrik Klagges, Christoph Stock, Dr. Robert Dahlke
Sitz: Unterföhring * Amtsgericht München * HRB 135082