Why does setting the -Xmx too high sometimes cause the JVM to fail, even if there's available RAM?

StackOverflow https://stackoverflow.com/questions/8751480

Question

Basically we've noticed that on some computers, setting the JVM option -Xmx (max heap size) sometimes causes the JVM to fail to initialize, even if there's more than adequate RAM on the system.

So for example, on a 4 GB machine, -Xmx1024m fails but -Xmx800m works. I could understand it on a 1 GB machine, even a 2 GB machine, but on a 4 GB machine, especially considering that Windows, Linux, etc. can swap RAM out, why does this fail?

I've seen a lot of threads and questions saying to reduce your max heap size, but no one can explain why it fails, which is what I'm really looking for.

Also, how do you then tell the JVM to consume as much memory as it wants, up to a certain size?


Solution

It's possible that this is due to virtual address space fragmentation. It may not be possible to reserve a contiguous 1024MB address range for the maximum potential size of the heap, depending on the load addresses of DLLs, threads' stack locations, immovable native memory allocations, kernel reserved addresses and so forth, especially in a 32-bit process.
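A quick way to check what the JVM actually reserved is to query `Runtime.maxMemory()` at startup; if the contiguous reservation fails, the JVM exits with an error before `main()` even runs. A minimal sketch (the class name is mine):

```java
// Sketch: print the heap ceiling this JVM managed to reserve.
// Run with e.g. `java -Xmx1024m MaxHeapProbe`; on a fragmented 32-bit
// address space the launch itself may fail with
// "Could not reserve enough space for object heap".
public class MaxHeapProbe {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory(); // upper bound the heap can grow to
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Because the reservation is a single contiguous range of virtual addresses, the failure depends on address-space layout rather than on free RAM, which is why the same -Xmx works on one machine and fails on another.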

Other tips

I came across this issue a while ago with Windows XP. On most XP machines I could allocate 1400MB, while on others only 1200MB. The consensus was fragmentation, as Jeffrey Hantin says in the other answer.
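As for the asker's follow-up, "consume as much memory as you want up to a certain size" is exactly what the -Xms/-Xmx pair expresses: a small initial (committed) heap that grows on demand toward the -Xmx cap. A sketch illustrating the distinction, assuming it is run with something like `-Xms16m -Xmx256m` (the class name is mine):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: totalMemory() is the heap currently committed by the JVM,
// while maxMemory() is the -Xmx ceiling it may grow to. Allocating
// forces the committed heap to grow on demand toward the cap.
public class HeapGrowth {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.printf("committed=%dMB max=%dMB%n",
                rt.totalMemory() >> 20, rt.maxMemory() >> 20);
        List<byte[]> hold = new ArrayList<>();
        for (int i = 0; i < 64; i++) {
            hold.add(new byte[1 << 20]); // 1 MB chunks, kept live so GC can't reclaim them
        }
        System.out.printf("after ~64MB allocated: committed=%dMB max=%dMB%n",
                rt.totalMemory() >> 20, rt.maxMemory() >> 20);
    }
}
```

Note that even with a small -Xms, the full -Xmx range of virtual addresses is reserved at startup, which is why a too-large -Xmx fails immediately rather than later as the heap grows.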

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow