Re: Cannot write record to fresh sort buffer. Record too large.

Posted by Stephan Ewen on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Cannot-write-record-to-fresh-sort-buffer-Record-too-large-tp13644p13748.html

Here are some pointers:

  - You would rather need MORE managed memory, not less, because the sorter uses that memory (see the configuration sketch below these points).

  - We added the "large record handler" to the sorter for exactly these use cases. Can you check in the code whether it is enabled? You'll have to go through a bit of the code to see that; it is an older Flink version, so I am not quite sure any more how exactly it was wired up there. The sketch below also shows the configuration switch that usually controls it.
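
For illustration, here is a minimal sketch of both knobs for a local test run. The keys taskmanager.memory.fraction and taskmanager.runtime.large-record-handler are the ones I would expect in a 1.x configuration, so please verify them against your Flink version; on a real cluster they belong in flink-conf.yaml rather than in code:

    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.configuration.Configuration;

    // Sketch only: on a cluster these keys go into flink-conf.yaml instead.
    Configuration config = new Configuration();

    // Give the sorter a larger share of the TaskManager's free memory (the default fraction is 0.7).
    config.setFloat("taskmanager.memory.fraction", 0.8f);

    // Allow the sorter to spill oversized records instead of failing on them.
    config.setBoolean("taskmanager.runtime.large-record-handler", true);

    // Only takes effect for a local environment; remote setups read flink-conf.yaml.
    ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment(config);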

Stephan


On Wed, Jun 14, 2017 at 8:59 PM, Ted Yu <[hidden email]> wrote:
For #2, XmlInputFormat was involved.

Is it possible to prune unneeded fields so that the heap requirement is lower? A rough illustration of the idea is below.
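
Purely as a sketch of what that pruning could look like (the XML layout, field names, and extraction logic here are made-up placeholders, not your actual job):

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.tuple.Tuple2;

    public class PruneBeforeDistinct {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Stand-in for the XmlInputFormat output: each element is one (large) XML record.
            DataSet<String> fullRecords = env.fromElements(
                    "<page><id>1</id><text>very large body ...</text></page>",
                    "<page><id>1</id><text>very large body ...</text></page>",
                    "<page><id>2</id><text>another large body ...</text></page>");

            // Keep only the field(s) that distinct() actually needs, so the sorter
            // buffers small tuples instead of multi-megabyte XML blobs.
            DataSet<Tuple2<String, Integer>> slim = fullRecords.map(
                    new MapFunction<String, Tuple2<String, Integer>>() {
                        @Override
                        public Tuple2<String, Integer> map(String xml) {
                            String id = xml.replaceAll("(?s).*<id>(.*?)</id>.*", "$1"); // toy extraction
                            return new Tuple2<>(id, xml.length());
                        }
                    });

            // Deduplicate on the pruned key only.
            slim.distinct(0).print();
        }
    }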

On Wed, Jun 14, 2017 at 8:47 AM, Sebastian Neef <[hidden email]> wrote:
Hi Ted,

sure.

Here's the stack trace with .distinct(), where the exception occurs in the
'SortMerger Reading Thread': [1]

Here's the stack trace without .distinct(), with the 'Requested array
size exceeds VM limit' error: [2]

If you need anything else, I can more or less reliably reproduce the issue.

All the best,
Sebastian

[1]
http://paste.gehaxelt.in/?2757c33ed3a3733b#jHQPPQNKKrE2wq4o9KCR48m+/V91S55kWH3dwEuyAkc=
[2]
http://paste.gehaxelt.in/?b106990deccecf1a#y22HgySqCYEOaP2wN6xxApGk/r4YICRkLCH2HBNN9yQ=