Question

I start some Java code with -Xmx1024m, and at some point I get an hprof dump due to an OOM. The hprof shows only about 320mb used, and gives me this stack trace:

    at java.util.Arrays.copyOfRange([CII)[C (Arrays.java:3209)
    at java.lang.String.<init>([CII)V (String.java:215)
    at java.lang.StringBuilder.toString()Ljava/lang/String; (StringBuilder.java:430)
    ...

This comes from a large string I am copying.
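For context, here is a minimal sketch (the class name and loop size are made up) of the kind of code that produces exactly this trace: StringBuilder.toString() constructs a new String, which copies the builder's backing char[] via Arrays.copyOfRange, so at that instant the JVM needs roughly twice the string's character data alive at once.

    public class BigStringCopy {
        public static void main(String[] args) {
            // Build up a large amount of character data; 50 million chars is
            // roughly 100mb of char[] on its own (the figure is illustrative).
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 50000000; i++) {
                sb.append('x');
            }
            // The copy happens here:
            // StringBuilder.toString() -> String.<init> -> Arrays.copyOfRange
            String copy = sb.toString();
            System.out.println("copied " + copy.length() + " chars");
        }
    }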

I remember reading somewhere (I cannot find where) that what happens in these cases is:

  • the process has not yet consumed 1gb of memory; it is well below that
  • even though the heap is still below 1gb, the copy needs a certain amount of memory, and for copyOfRange() it has to be contiguous memory, so even though the heap is not over the limit yet, the JVM cannot find a large enough contiguous block and fails with an OOM (see the sketch after this list for a way to check the heap figures at the point of failure)
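One simple way to test the first bullet is to log what the heap actually looked like at the moment of the failure, using the standard Runtime API. This is only a sketch; the growing StringBuilder loop stands in for whatever code is really building the large string:

    public class HeapAtFailure {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            try {
                // Placeholder for the real work that builds the large string.
                StringBuilder sb = new StringBuilder();
                while (true) {
                    sb.append("some large chunk of text ");
                }
            } catch (OutOfMemoryError e) {
                long usedMb = (rt.totalMemory() - rt.freeMemory()) >> 20;
                long committedMb = rt.totalMemory() >> 20;
                long maxMb = rt.maxMemory() >> 20;   // the -Xmx ceiling
                System.err.println("used=" + usedMb + "m committed="
                        + committedMb + "m max=" + maxMb + "m");
                throw e;
            }
        }
    }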

I have tried to find documentation on this (that copyOfRange() needs a block of contiguous memory), but could not find any.

The other possible culprit would be not enough permgen memory.

Can someone confirm or refute the contiguous-memory hypothesis? Any pointer to documentation would help too.

Solution

If you are using the concurrent mark sweep (CMS) collector you can get heap fragmentation. However, for new objects, provided there is enough young-generation space, you don't need to worry about fragmentation, because the free Eden space is always contiguous.

In many applications, only a small portion of the heap is given to the young generation, so if the tenured space is fragmented and you allocate a relatively modest object (even one as small as 5% of the maximum heap size) you can get an OutOfMemoryError.
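If you want to see how the heap is actually divided between the young and tenured pools on your JVM, the java.lang.management memory-pool beans report it at runtime. A small sketch (pool names vary between collectors, so it just prints whatever the VM exposes):

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryPoolMXBean;

    public class PrintMemoryPools {
        public static void main(String[] args) {
            // Typical names: "Eden Space" / "Par Eden Space", "Tenured Gen" /
            // "CMS Old Gen", "Perm Gen"; they depend on the collector in use.
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                long usedMb = pool.getUsage().getUsed() >> 20;
                long maxMb = pool.getUsage().getMax() >> 20; // -1 if no limit is set
                System.out.println(pool.getName() + ": used=" + usedMb
                        + "m, max=" + maxMb + "m");
            }
        }
    }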

Given that performance is usually very poor when you run close to the maximum heap size, I would suggest you either make your application use less memory or increase the maximum. Increasing -Xmx increases the young-generation size as well. Alternatively, you could set the young-generation size explicitly with -XX:NewSize=512m.
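For example (the main class name and sizes below are placeholders, not a recommendation for your application), the launch line would look something like:

    java -Xmx2048m -XX:NewSize=512m MyApp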

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow