Re: Cannot write record to fresh sort buffer. Record too large.

Posted by Ted Yu on
URL: http://deprecated-apache-flink-user-mailing-list-archive.369.s1.nabble.com/Cannot-write-record-to-fresh-sort-buffer-Record-too-large-tp13644p13665.html

Sebastian:
Are you using JDK 7 or JDK 8?

For JDK 7, there was a bug where the code cache could fill up, which degrades performance:

https://bugs.openjdk.java.net/browse/JDK-8051955

https://bugs.openjdk.java.net/browse/JDK-8074288

http://blog.andresteingress.com/2016/10/19/java-codecache
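If the code cache turns out to be the culprit, the JVM flags below can raise its limit and report its usage; in Flink they can be passed to the TaskManager JVMs via env.java.opts in flink-conf.yaml. This is only a sketch, and the 256m size is an assumption to be tuned for your workload:

```yaml
# flink-conf.yaml (hypothetical sizing; tune for your workload)
# -XX:ReservedCodeCacheSize raises the code cache cap;
# -XX:+PrintCodeCache prints code cache usage when the JVM exits,
# which helps confirm whether the cache actually filled up.
env.java.opts: "-XX:ReservedCodeCacheSize=256m -XX:+PrintCodeCache"
```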


Cheers

On Mon, Jun 12, 2017 at 1:08 PM, Flavio Pompermaier <[hidden email]> wrote:
Try to see if there is some log about an OOM in the output of the dmesg command; the OS logs such info there. I had a similar experience recently... see [1]
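For reference, when the kernel OOM killer terminates a process it leaves lines like the ones in the sample below. The sample log and PIDs here are invented for illustration; the grep pattern is what you would run against the real dmesg output:

```shell
# Hypothetical kernel log lines, similar to what dmesg shows after an OOM kill.
sample_log='java invoked oom-killer: gfp_mask=0x201da, order=0, oom_score_adj=0
Out of memory: Kill process 12345 (java) score 850 or sacrifice child
Killed process 12345 (java) total-vm:8388608kB, anon-rss:4194304kB'

# On a real machine, pipe dmesg instead of the sample:
#   dmesg | grep -i -E 'oom-killer|out of memory|killed process'
echo "$sample_log" | grep -i -E 'oom-killer|out of memory|killed process'
```

If this prints nothing, the TaskManager was likely not killed by the OS and the problem lies elsewhere.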

On 12 Jun 2017 21:51, "Sebastian Neef" <[hidden email]> wrote:
Hi Stefan,

thanks for the answer and the advice, which I've already seen in another
email.

Anyway, I played around with the taskmanager.numberOfTaskSlots and
taskmanager.memory.fraction options. I noticed that decreasing the
former and increasing the latter led to longer execution times and more
processed data before the failure.
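For anyone trying the same knobs: both options live in flink-conf.yaml. The concrete values below are just an example of the direction described above (fewer slots, larger managed-memory fraction), not recommended settings:

```yaml
# flink-conf.yaml (example values only)
# Fewer slots per TaskManager means more memory per slot.
taskmanager.numberOfTaskSlots: 2
# Fraction of TaskManager memory given to Flink's managed memory,
# which backs the sort/hash buffers involved in this error (default 0.7).
taskmanager.memory.fraction: 0.8
```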

The error messages and exceptions from an affected TaskManager are here
[1]. Unfortunately, I cannot find a java.lang.OutOfMemoryError in there.

Do you have another idea or something to try?

Thanks in advance,
Sebastian


[1]
http://paste.gehaxelt.in/?e669fabc1d4c15be#G1Ioq/ASwGUdCaK2rQ1AY3ZmCkA7LN4xVOHvM9NeI2g=