Question

I have a loop in my code that generates many byte[] arrays (around 1 to 2 MB each), fills them with data, and then discards the reference. So even though each reference is only held for a short time, I can see the private working set growing.

Now, if I try to allocate a large array (~ 400 MB) after the loop, could I get an out of memory exception? Or will the allocation force the GC to collect the transient data?

Thanks!


Solution

Generating many 1-2 MB arrays is a bad idea. Even if you avoid an out-of-memory situation, performance suffers badly: arrays that large exceed the 85,000-byte threshold, so they are allocated on the large object heap (LOH), and allocating many short-lived objects on the LOH is a pattern the current GC does not handle well.

I strongly recommend recycling the arrays whenever possible. Implement a pool into which you return them once you no longer need them, and when allocating, first check whether the pool can satisfy the request. That pattern produced huge performance benefits in one of my programs.
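On modern .NET you don't even have to hand-roll the pool: `System.Buffers.ArrayPool<T>` implements exactly this recycling pattern. A minimal sketch of the rent/return cycle (the buffer size and loop count are illustrative, matching the question's numbers):

```csharp
using System;
using System.Buffers;

class PoolDemo
{
    static void Main()
    {
        ArrayPool<byte> pool = ArrayPool<byte>.Shared;

        for (int i = 0; i < 1000; i++)
        {
            // Rent returns an array of *at least* the requested size,
            // reusing a previously returned one when possible.
            byte[] buffer = pool.Rent(2 * 1024 * 1024);
            try
            {
                // ... fill buffer with data and process it ...
            }
            finally
            {
                // Hand the array back instead of letting it die on the LOH.
                pool.Return(buffer);
            }
        }
    }
}
```

Because the same few arrays are reused across iterations, the large object heap stays small instead of filling up with short-lived 2 MB garbage.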

As far as I know, a managed allocation that cannot be satisfied forces a garbage collection before the runtime gives up, but if unmanaged allocations happen around the same time, or the LOH has become fragmented, you can still get an OutOfMemoryException.

OTHER TIPS

If you are worried about it, you can always call GC.Collect(); just before the large allocation after the loop; that forces a garbage collection of all generations. Don't do it inside the loop, however, unless you are not concerned about time: a full collection is rather slow (too slow for a loop, not a problem as a one-off).
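For the scenario in the question, that would look roughly like this (the sizes and loop count are the hypothetical ones from the question, not measured values):

```csharp
using System;

class CollectBeforeBigAlloc
{
    static void Main()
    {
        for (int i = 0; i < 200; i++)
        {
            // Transient 2 MB buffer; the reference dies at the end of the iteration.
            byte[] temp = new byte[2 * 1024 * 1024];
            temp[0] = 1; // ... fill with data ...
        }

        // One-off full collection before the big allocation -- fine here,
        // but too slow to call on every iteration of the loop above.
        GC.Collect();
        GC.WaitForPendingFinalizers();

        byte[] big = new byte[400 * 1024 * 1024];
        Console.WriteLine(big.Length);
    }
}
```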

This really depends. You cannot be sure that the garbage collector will collect in time. With plain byte arrays you are reasonably safe, but objects that wrap unmanaged resources are often freed too late if you use them heavily without calling Dispose().
This can result in out-of-memory exceptions even though you have discarded all references.
If you are experiencing problems, you could try GC.Collect(0);, although this is usually ill-advised.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow