schedule tasks `inside` Flink
Posted by Michal Fijolek
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/schedule-tasks-inside-Flink-tp4903.html
Hello.
My app needs a Map[K, V] as a simple cache for business data, which needs to be invalidated periodically, let's say once per day.
Right now I'm using a rather naive approach, which is:
import java.util.concurrent.{Executors, TimeUnit}

trait Dictionary[K, V] extends Serializable {
  @volatile private var cache: Map[K, V] = Map()
  def lookup(id: K): Option[V] = cache.get(id)
  private def fetchDictionary: Map[K, V] = ???
  private def updateDictionary(): Unit = {
    cache = fetchDictionary
  }
  val invalidate = new Runnable with Serializable {
    override def run(): Unit = updateDictionary()
  }
  // refresh once per day on a plain JVM scheduler, outside of Flink's control
  Executors.newSingleThreadScheduledExecutor()
    .scheduleAtFixedRate(invalidate, 0, 1, TimeUnit.DAYS)
}
This seems wrong, because I guess I should do this kind of thing `inside` Flink, and when I stop the Flink job, nothing is going to stop the scheduled invalidation tasks.
What would be the idiomatic Flink way to approach this problem? How can I schedule tasks and make Flink aware of them?
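For what it's worth, here is a rough sketch of the kind of thing I imagine might work: keeping the scheduler inside a rich function so that Flink's own lifecycle (open/close) starts and stops it. The class name and the abstract fetchDictionary below are just placeholders I made up for the example, not anything from my real job.

import java.util.concurrent.{Executors, ScheduledExecutorService, TimeUnit}

import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration

// Placeholder: an enrichment step that looks each id up in the cached dictionary.
abstract class DictionaryLookup[K, V] extends RichMapFunction[K, Option[V]] {

  // Same placeholder as in my code above: how the dictionary is loaded is not the point here.
  protected def fetchDictionary: Map[K, V]

  @volatile private var cache: Map[K, V] = Map()
  @transient private var scheduler: ScheduledExecutorService = _

  override def open(parameters: Configuration): Unit = {
    scheduler = Executors.newSingleThreadScheduledExecutor()
    // First refresh right away, then once per day.
    scheduler.scheduleAtFixedRate(new Runnable {
      override def run(): Unit = cache = fetchDictionary
    }, 0, 1, TimeUnit.DAYS)
  }

  override def map(id: K): Option[V] = cache.get(id)

  // close() is called when the task finishes or the job is cancelled,
  // so the executor would be shut down together with the job.
  override def close(): Unit = {
    if (scheduler != null) scheduler.shutdownNow()
  }
}

At least that way the scheduler's lifetime would be tied to the task's lifetime, but I don't know if this is the recommended pattern or if Flink has a more built-in mechanism for periodic tasks.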
Thanks,
Michal