Batch loading into postgres database

Posted by Dylan Forciea on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Batch-loading-into-postgres-database-tp39943.html

I am setting up a Flink job that will reload a table in a Postgres database using the Flink SQL functionality. I just wanted to make sure that, given the current feature set, I am going about this the correct way. I am currently using version 1.11.2, but I plan on upgrading to 1.12 as soon as it is finalized.
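
For context, I am registering the staging table as a JDBC sink roughly along the lines of the sketch below. The schema, connection URL, and credentials are just placeholders for this example:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class RegisterStagingTable {
    public static void main(String[] args) {
        // Batch-mode table environment (the Blink planner is the default here).
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .inBatchMode()
                .build();
        TableEnvironment tEnv = TableEnvironment.create(settings);

        // Staging table backed by the Postgres JDBC connector.
        // Columns, URL, and credentials below are placeholders.
        tEnv.executeSql(
            "CREATE TABLE staging_table (" +
            "  id BIGINT," +
            "  name STRING," +
            "  PRIMARY KEY (id) NOT ENFORCED" +
            ") WITH (" +
            "  'connector' = 'jdbc'," +
            "  'url' = 'jdbc:postgresql://localhost:5432/mydb'," +
            "  'table-name' = 'staging_table'," +
            "  'username' = 'flink'," +
            "  'password' = 'secret'" +
            ")");
    }
}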

I have set up a staging table and a final table in a Postgres database. My plan is to have a Flink application that uses JDBC to truncate the staging table before the job begins, runs the job to completion, and then, after the job completes, uses JDBC to delete from the final table and insert into it from the staging table within a single transaction.
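
Concretely, the orchestration I have in mind looks roughly like the sketch below. Table names, the connection URL, and credentials are placeholders, and TableResult#await() is only available from 1.12 (on 1.11 I would block on the job client instead):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ReloadFinalTable {
    private static final String URL = "jdbc:postgresql://localhost:5432/mydb";
    private static final String USER = "flink";
    private static final String PASS = "secret";

    public static void main(String[] args) throws Exception {
        // 1. Clear the staging table with plain JDBC before the Flink job starts.
        try (Connection conn = DriverManager.getConnection(URL, USER, PASS);
             Statement stmt = conn.createStatement()) {
            stmt.executeUpdate("TRUNCATE TABLE staging_table");
        }

        // 2. Run the batch Flink SQL job that loads the staging table.
        //    source_table and staging_table are assumed to be registered already.
        EnvironmentSettings settings = EnvironmentSettings.newInstance().inBatchMode().build();
        TableEnvironment tEnv = TableEnvironment.create(settings);
        tEnv.executeSql("INSERT INTO staging_table SELECT * FROM source_table")
            .await(); // TableResult#await() blocks until the job finishes (1.12+)

        // 3. Move staging into final atomically in a single JDBC transaction.
        try (Connection conn = DriverManager.getConnection(URL, USER, PASS)) {
            conn.setAutoCommit(false);
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("DELETE FROM final_table");
                stmt.executeUpdate("INSERT INTO final_table SELECT * FROM staging_table");
                conn.commit();
            } catch (Exception e) {
                conn.rollback();
                throw e;
            }
        }
    }
}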

Is this the expected way to interact with Postgres in a batch job like this, or is there some functionality or method that I am missing?

Regards,

Dylan Forciea