Hi,
when I run my Flink job on a small dataset, it finishes successfully.
However, with a bigger dataset I get multiple exceptions:
- Caused by: java.io.IOException: Cannot write record to fresh sort
buffer. Record too large.
- Thread 'SortMerger Reading Thread' terminated due to an exception: null
A full stack trace can be found here [0].
I tried reducing taskmanager.memory.fraction and also the job's
parallelism, but that did not help much.
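For reference, the changes I tried in flink-conf.yaml looked roughly like
this (the values are illustrative, not the exact ones I used):

    # flink-conf.yaml -- roughly what I tried
    taskmanager.heap.mb: 4096          # TaskManager heap size
    taskmanager.memory.fraction: 0.5   # reduced from the 0.7 default
    parallelism.default: 4             # lowered default parallelism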
I am using Flink 1.0.3 (the Hadoop 2.7 build).
Any tips are appreciated.
Kind regards,
Sebastian
[0]: http://paste.gehaxelt.in/?1f24d0da3856480d#/dR8yriXd/VQn5zTfZACS52eWiH703bJbSTZSifegwI=