I’ve collected some data into a DataSet via JDBC, and I’d like to pour it into a Kafka topic, but I think I need to transform my DataSet into a DataStream first. If anyone has a clue how to proceed, I’d appreciate it; or let me know if I’m completely off track. Prez Cannady
You could, I suppose, write the dataset to a file sink and then read the file into a data stream.
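For what it’s worth, a minimal sketch of that two-step workaround in Java; the staging path and the fromElements placeholder standing in for the JDBC-sourced DataSet are illustrative:

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FileStagingExample {
    public static void main(String[] args) throws Exception {
        // Batch job: dump the DataSet to a staging file.
        ExecutionEnvironment batchEnv = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<String> rows = batchEnv.fromElements("row1", "row2"); // stand-in for the JDBC result
        rows.writeAsText("/tmp/staging", FileSystem.WriteMode.OVERWRITE);
        batchEnv.execute("dataset to file");

        // Streaming job: read the staged file back as a DataStream.
        StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> stream = streamEnv.readTextFile("/tmp/staging");
        stream.print(); // attach a Kafka sink or other processing here instead
        streamEnv.execute("file to stream");
    }
}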
Since the data is already collected, why add one more layer with Kafka? You could just start processing your data directly.
This is roughly the solution I have now. I was hoping, though, for an approach that doesn’t involve checking whether a file has updated.
Prez Cannady
Hi! It should be quite straightforward to write an "OutputFormat" that wraps the "FlinkKafkaProducer". That way you can write to Kafka from a DataSet program. Stephan
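A rough sketch of what that could look like, assuming string records; note it talks to the plain Kafka producer client directly rather than wrapping the FlinkKafkaProducer itself, and the class name, broker list, and topic are made up for illustration:

import java.io.IOException;
import java.util.Properties;

import org.apache.flink.api.common.io.OutputFormat;
import org.apache.flink.configuration.Configuration;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Publishes each DataSet record to a Kafka topic via the plain Kafka client.
public class KafkaOutputFormat implements OutputFormat<String> {

    private final String brokers;
    private final String topic;
    private transient KafkaProducer<String, String> producer; // not serializable, created in open()

    public KafkaOutputFormat(String brokers, String topic) {
        this.brokers = brokers;
        this.topic = topic;
    }

    @Override
    public void configure(Configuration parameters) {
        // nothing to read from the Flink config in this sketch
    }

    @Override
    public void open(int taskNumber, int numTasks) throws IOException {
        // one producer per parallel task instance
        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void writeRecord(String record) throws IOException {
        producer.send(new ProducerRecord<>(topic, record));
    }

    @Override
    public void close() throws IOException {
        if (producer != null) {
            producer.close(); // flushes any buffered records before returning
        }
    }
}

On the DataSet side it would then plug in via output(), e.g. myDataSet.output(new KafkaOutputFormat("localhost:9092", "my-topic"));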