Re: Cannot write record to fresh sort buffer. Record too large.

Posted by Kurt Young on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Cannot-write-record-to-fresh-sort-buffer-Record-too-large-tp13644p13666.html

Hi,

I think the reason is that your records are too large for the in-memory combine. You can try to disable your combiner.
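
For illustration, here is a minimal sketch of what that can look like with the DataSet API (the class and data below are made up, not taken from the original job): Flink only runs an in-memory combine for a groupReduce when the function also implements GroupCombineFunction (or the operator is otherwise marked combinable), so a plain GroupReduceFunction like the one below executes without a combiner and the large records go straight to the sort/merge phase.

import org.apache.flink.api.common.functions.GroupReduceFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class NoCombinerExample {

    // Plain GroupReduceFunction: it does NOT implement GroupCombineFunction,
    // so Flink will not insert an in-memory combine step before the shuffle.
    public static class ConcatReducer
            implements GroupReduceFunction<Tuple2<String, String>, Tuple2<String, String>> {
        @Override
        public void reduce(Iterable<Tuple2<String, String>> values,
                           Collector<Tuple2<String, String>> out) {
            String key = null;
            StringBuilder sb = new StringBuilder();
            for (Tuple2<String, String> value : values) {
                key = value.f0;
                sb.append(value.f1);
            }
            out.collect(new Tuple2<>(key, sb.toString()));
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<Tuple2<String, String>> input = env.fromElements(
                new Tuple2<>("a", "x"), new Tuple2<>("a", "y"), new Tuple2<>("b", "z"));

        input.groupBy(0)
             .reduceGroup(new ConcatReducer())  // runs without an in-memory combine
             .print();
    }
}

If your function currently implements GroupCombineFunction (or is otherwise marked combinable), removing that combine path is the quickest way to check whether the combiner's sort buffer is the problem.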

Best,
Kurt

On Mon, Jun 12, 2017 at 9:55 PM, Sebastian Neef <[hidden email]> wrote:
Hi,

when I run my Flink job on a small dataset, it finishes successfully. However, with a bigger dataset I get multiple exceptions:

- Caused by: java.io.IOException: Cannot write record to fresh sort buffer. Record too large.
- Thread 'SortMerger Reading Thread' terminated due to an exception: null

A full stack trace can be found here [0].

I tried reducing taskmanager.memory.fraction and also the degree of parallelism, but that did not help much.
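
For reference, that setting lives in flink-conf.yaml; the values below are only an illustration of that kind of tuning, not the actual configuration used here:

# flink-conf.yaml (illustrative values)
taskmanager.heap.mb: 4096          # total TaskManager heap
taskmanager.memory.fraction: 0.5   # share of free heap given to managed memory
                                   # (sort buffers etc.); the default is 0.7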

I'm using Flink 1.0.3-Hadoop2.7.

Any tips are appreciated.

Kind regards,
Sebastian

[0]:
http://paste.gehaxelt.in/?1f24d0da3856480d#/dR8yriXd/VQn5zTfZACS52eWiH703bJbSTZSifegwI=