Question

I've set the max heap to 8 GB. When my program's heap usage reaches about 6.4 GB (as reported in VisualVM), the garbage collector starts consuming most of the CPU, and the program crashes with an OutOfMemoryError while making a ~100 MB allocation. I am using Oracle Java 1.7.0_21 on Windows.

My question is whether there are GC options that would help with this. I'm not passing anything except -Xmx8g.

My guess is the heap is getting fragmented, but shouldn't the GC compact it?


Solution

Collecting bits and pieces of information (which is surprisingly difficult, since the official documentation is quite bad), I've determined...

There are generally two reasons this may happen, both related to fragmentation of free space (i.e., free space existing in pieces so small that a large object cannot be allocated in any of them). First, the garbage collector might not do compaction, which is to say it does not defragment the memory; even a collector that does compaction may not do it perfectly. Second, the garbage collector typically splits the memory area into regions that it reserves for different kinds of objects, and it may not think to take free memory from the region that has it and give it to the region that needs it.

The CMS garbage collector does not do compaction, while the others (the serial, parallel, ParallelOld, and G1 collectors) do. The default collector in Java 8 is ParallelOld.

All garbage collectors split memory into regions, and, AFAIK, all of them are too lazy to try very hard to prevent an OOM error. The command line option -XX:+PrintGCDetails is very helpful for some of the collectors in showing the sizes of the regions and how much free space they have.

It is possible to experiment with different garbage collectors and tuning options. Regarding my question, the G1 collector (enabled with the JVM flag -XX:+UseG1GC) solved the issue I was having. However, this was basically down to chance (in other situations, it OOMs more quickly). Some of the collectors (the serial, cms, and G1) have extensive tuning options for selecting the sizes of the various regions, to enable you to waste time in futilely trying to solve the problem.

Ultimately, the real solutions are rather unpleasant. The first is to install more RAM. The second is to use smaller arrays. The third is to use ByteBuffer.allocateDirect. Direct byte buffers (and their int/float/double wrappers) are array-like objects with array-like performance that are allocated on the OS's native heap. The OS heap uses the CPU's virtual-memory hardware, so it is free from fragmentation issues and can even make effective use of the disk's swap space (allowing you to allocate more memory than is available as RAM). A big drawback, however, is that the JVM doesn't really know when direct buffers should be deallocated, which makes this option better suited to long-lived objects. The final, possibly best, and certainly most unpleasant option is to allocate and deallocate memory natively through JNI calls and use it in Java by wrapping it in a ByteBuffer.
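As a rough sketch of the allocateDirect approach (the buffer size and access pattern here are illustrative, not taken from the question), note that the storage lives on the native heap, so it does not contribute to Java-heap fragmentation:

```java
import java.nio.ByteBuffer;
import java.nio.DoubleBuffer;

public class DirectBufferDemo {
    public static void main(String[] args) {
        // Allocate one million doubles (8 MB) on the native heap,
        // outside the garbage-collected Java heap.
        int count = 1_000_000;
        DoubleBuffer data = ByteBuffer.allocateDirect(8 * count) // 8 bytes per double
                                      .asDoubleBuffer();

        // Array-like indexed access.
        for (int i = 0; i < count; i++) {
            data.put(i, i * 0.5);
        }
        System.out.println(data.get(10)); // prints 5.0
    }
}
```

The buffer is deallocated only when the DoubleBuffer object itself is eventually garbage collected, which is why this works best for long-lived data.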

OTHER TIPS

Which garbage collector are you using? CMS doesn't do any compaction. Try using the new G1 garbage collector - this does some compaction.

For a bit of context: the G1 garbage collector, or "Garbage First" collector, splits the heap into chunks, and after identifying (marking) all the garbage it evacuates a chunk by copying all the live objects into a different chunk - this is what achieves compaction.

To use it, include the option -XX:+UseG1GC

This article gives a great explanation of G1, and of garbage collection in Java in general.

Whenever this problem has shown up in the past, the actual free memory was much lower than it appeared. You can print the amount of free memory when an OutOfMemoryError occurs:

try {
    byte[] array = new byte[largeMemorySize];

} catch (OutOfMemoryError e) {
    System.out.printf("Failed to allocate %,d bytes, free memory= %,d%n",
        largeMemorySize, Runtime.getRuntime().freeMemory());
    throw e;
}

Most likely, you are trying to allocate a large amount of contiguous memory, but all of the free memory is scattered in little bits and pieces all over the place. Also, when the garbage collector starts taking up all of the processing time, that means it is busy trying to find the maybe one or two objects in your whole set of objects that can be freed. In this case, all I think you can do is work on breaking your objects down so that they do not need quite as much contiguous memory (at least, not all at one time).
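One hedged illustration of that idea (the class and chunk size below are invented for the example): instead of one huge array, which needs a single contiguous free block, store the data as a list of fixed-size chunks, so each allocation is small enough to fit in fragmented free space:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a "chunked array": behaves like one big byte array,
// but is backed by many small allocations.
public class ChunkedArray {
    private static final int CHUNK = 1 << 20; // 1M elements per chunk (arbitrary choice)
    private final List<byte[]> chunks = new ArrayList<>();

    public ChunkedArray(long size) {
        long remaining = size;
        while (remaining > 0) {
            int n = (int) Math.min(remaining, CHUNK);
            chunks.add(new byte[n]); // each allocation needs only a small contiguous block
            remaining -= n;
        }
    }

    public byte get(long index) {
        return chunks.get((int) (index / CHUNK))[(int) (index % CHUNK)];
    }

    public void set(long index, byte value) {
        chunks.get((int) (index / CHUNK))[(int) (index % CHUNK)] = value;
    }

    public static void main(String[] args) {
        ChunkedArray a = new ChunkedArray(3_000_000L);
        a.set(2_999_999L, (byte) 7);
        System.out.println(a.get(2_999_999L)); // prints 7
    }
}
```

The trade-off is an extra index calculation on every access, but none of the individual allocations is large enough to be defeated by fragmentation.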

Edit: As far as I know, there is no way that you can get Java to pack the memory so that you can use that full 8 GB, as it would involve the Garbage Collector having to pause all of the other threads, moving their objects around, updating all of the references to those objects, then refreshing stale cache entries, and so on...a very, very expensive operation.

See this article about memory fragmentation.

-Xmx only ensures that the heap will not exceed 8GB in size but makes no guarantees that this much memory will actually be allocated. Does your machine only have 8GB of memory?
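You can see the difference between the limit and the actual allocation from inside the JVM; a small sketch using the standard java.lang.Runtime methods:

```java
public class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory(): the -Xmx limit.
        // totalMemory(): how much the JVM has actually reserved so far.
        // freeMemory(): the unused portion of totalMemory().
        System.out.printf("max   = %,d bytes%n", rt.maxMemory());
        System.out.printf("total = %,d bytes%n", rt.totalMemory());
        System.out.printf("free  = %,d bytes%n", rt.freeMemory());
    }
}
```

If total stays well below max while the machine is swapping, the OS, not the JVM, is the bottleneck.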

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow