Getting started with Flink / Scala, I wonder whether the Scala base library should be excluded from the assembly as a best practice:

// exclude Scala library from assembly
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

Also, I would like to know whether https://github.com/tillrohrmann/flink-project is the most up-to-date getting-started sample project for flink-scala that you would recommend. Best, Georg
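For reference, a minimal build.sbt sketch of how this setting is typically wired up. The project name, Scala version, and Flink version below are placeholders rather than anything stated in the thread, and the syntax assumes sbt-assembly 0.14.x as used at the time:

// project/plugins.sbt (assumed plugin version):
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

name := "flink-scala-quickstart"   // placeholder project name
scalaVersion := "2.11.12"          // example version, match your Flink build

val flinkVersion = "1.4.0"         // example version

// Flink dependencies are "provided" because the cluster ships them;
// the Scala standard library is likewise left out of the fat jar.
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"           % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
)

// exclude the Scala library from the assembled jar
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

The "provided" scope keeps the cluster-supplied Flink jars out of the fat jar for the same reason the Scala library is excluded; when running locally from the IDE those dependencies then have to be put on the classpath some other way (for example via a separate run configuration).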
If you want to really learn, then I recommend starting with a Flink project that contains unit tests and integration tests (perhaps augmented with https://wiki.apache.org/hadoop/HowToDevelopUnitTests to simulate an HDFS cluster during unit tests). It should also include coverage reporting. These aspects are equally crucial for developing high-quality big data applications, and virtually all companies will expect you to know them. I am not sure whether a hello-world Flink project exists that covers all of this, but creating one would be a good learning task.
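To make that concrete, here is a minimal sketch of the kind of unit test such a project could contain. The class and test names (UppercaseMapper, UppercaseMapperSpec) are invented for illustration; the sketch only assumes ScalaTest (3.0-style FlatSpec) and Flink's MapFunction interface:

import org.apache.flink.api.common.functions.MapFunction
import org.scalatest.{FlatSpec, Matchers}

// a trivial, deterministic transformation that can be tested without any cluster
class UppercaseMapper extends MapFunction[String, String] {
  override def map(value: String): String = value.toUpperCase
}

class UppercaseMapperSpec extends FlatSpec with Matchers {
  "UppercaseMapper" should "upper-case its input" in {
    new UppercaseMapper().map("flink") shouldBe "FLINK"
  }
}

Coverage reporting could then be added with a tool such as sbt-scoverage (run as sbt coverage test coverageReport), though which coverage tool to use is a choice, not something prescribed here.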
Thanks, this sounds like a good idea - can you recommend such a project? Jörn Franke <[hidden email]> wrote on Wed., 29 Nov. 2017 at 22:30:
Would you suggest https://github.com/ottogroup/flink-spector for unit tests? Georg Heiler <[hidden email]> wrote on Wed., 29 Nov. 2017 at 22:33:
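flink-spector is one option; as a point of comparison, Flink also ships test utilities of its own in flink-test-utils. Below is a rough sketch of an integration test built on MiniClusterWithClientResource rather than flink-spector. The class names (IncrementPipelineSpec, CollectSink) are invented for the example, and the exact utility classes and packages depend on the Flink version, so treat this as a pattern rather than copy-paste code:

import java.util.{Collections, List => JList}

import org.apache.flink.runtime.testutils.MiniClusterResourceConfiguration
import org.apache.flink.streaming.api.functions.sink.SinkFunction
import org.apache.flink.streaming.api.scala._
import org.apache.flink.test.util.MiniClusterWithClientResource
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

// collects results into a static list so the test can inspect them after env.execute();
// a static holder is needed because the sink instance is serialized and shipped to the cluster
class CollectSink extends SinkFunction[Long] {
  override def invoke(value: Long): Unit = CollectSink.values.add(value)
}

object CollectSink {
  val values: JList[Long] = Collections.synchronizedList(new java.util.ArrayList[Long]())
}

class IncrementPipelineSpec extends FlatSpec with Matchers with BeforeAndAfterAll {

  // small embedded Flink cluster shared by the tests in this spec
  private val flinkCluster = new MiniClusterWithClientResource(
    new MiniClusterResourceConfiguration.Builder()
      .setNumberTaskManagers(1)
      .setNumberSlotsPerTaskManager(1)
      .build())

  override def beforeAll(): Unit = flinkCluster.before()
  override def afterAll(): Unit = flinkCluster.after()

  "the increment pipeline" should "add one to every element" in {
    CollectSink.values.clear()

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    env.fromElements(1L, 21L, 22L)
      .map(_ + 1)
      .addSink(new CollectSink())

    env.execute("increment-pipeline-test")

    CollectSink.values should contain allOf (2L, 22L, 23L)
  }
}

Whether you then prefer flink-spector's matcher-style API or Flink's own test harnesses is largely a matter of taste; the important part, as noted above, is that the sample project has this kind of test at all.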