Question

I'm developing a desktop application that we are having memory problems with. The technologies are .NET, C++, and Fortran. The application is currently built as x86 due to restrictions from third-party components. It is a memory-intensive application that frequently creates large arrays (up to 50 MB) during the calculation process. In specific situations I can get the application to run out of memory relatively quickly, with Task Manager showing only 350 MB allocated to the process. The application will fail to allocate a 50 MB array and throw an error. This allocation error can occur in either the Fortran or the .NET code. I've been trying to diagnose the error using ANTS Memory Profiler, but it shows no Large Object Heap fragmentation, which to me indicates that the problem is not memory fragmentation in .NET.

Is it still possible that this is memory fragmentation, given that ANTS claims there is very little stored on the Large Object Heap at the point the array is allocated? If so, what tools are available to diagnose and deal with this sort of problem?


Solution

I do not know how much influence this would have, but you could try activating the Low Fragmentation Heap (a sketch of enabling it follows).
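As a rough, Windows-specific sketch, opting the process default heap into the Low Fragmentation Heap looks roughly like this. Note that on Windows Vista and later the LFH is already enabled by default, and the call fails when the debug heap is active (e.g. when running under a debugger):

```cpp
// Sketch: enable the Low Fragmentation Heap on the process default heap.
#include <windows.h>
#include <cstdio>

int main()
{
    ULONG  heapInfo    = 2;                // 2 = enable the Low Fragmentation Heap
    HANDLE processHeap = GetProcessHeap();

    if (HeapSetInformation(processHeap,
                           HeapCompatibilityInformation,
                           &heapInfo,
                           sizeof(heapInfo)))
    {
        std::printf("LFH enabled on the process heap\n");
    }
    else
    {
        // Typically fails under the debug heap or on heaps that cannot use the LFH.
        std::printf("HeapSetInformation failed: %lu\n", GetLastError());
    }
    return 0;
}
```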

In addition to that, you could try to avoid continuously allocating and deallocating memory. If you frequently require large temporary blocks, allocate one sufficiently large block up front and pass it as working memory to your subroutines, as in the sketch below.
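A minimal sketch of that idea (all names here are illustrative, not part of your application): one scratch buffer is allocated once and reused by every calculation call instead of each call allocating and freeing its own 50 MB temporaries:

```cpp
#include <cstddef>
#include <vector>

// One large buffer, allocated once and reused as working memory.
class Workspace
{
public:
    explicit Workspace(std::size_t bytes) : buffer_(bytes) {}

    // Hand out a view into the preallocated buffer; returns nullptr if the
    // request exceeds the capacity. The caller must not use more than `bytes`.
    void* data(std::size_t bytes)
    {
        return bytes <= buffer_.size() ? buffer_.data() : nullptr;
    }

    std::size_t capacity() const { return buffer_.size(); }

private:
    std::vector<unsigned char> buffer_;  // allocated once, reused for every call
};

// Hypothetical calculation routine that receives working memory instead of
// allocating its own temporary arrays.
void runCalculation(const double* input, std::size_t n, Workspace& scratch)
{
    double* temp = static_cast<double*>(scratch.data(n * sizeof(double)));
    if (!temp)
        return;  // workspace too small
    // ... use `temp` for intermediate results ...
    (void)input;
    (void)temp;
}
```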

You may also overload the new operator for specific classes and implement your own, smarter memory handling; a sketch follows.
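As an illustrative sketch only (the class and its payload are made up), a class-specific operator new/delete can route allocations to a private Windows heap, keeping these objects out of the general CRT heap:

```cpp
#include <windows.h>
#include <new>

class LargeBlock
{
public:
    static void* operator new(std::size_t size)
    {
        void* p = HeapAlloc(privateHeap(), 0, size);
        if (!p)
            throw std::bad_alloc();
        return p;
    }

    static void operator delete(void* p) noexcept
    {
        if (p)
            HeapFree(privateHeap(), 0, p);
    }

private:
    static HANDLE privateHeap()
    {
        // Created once on first use; a growable heap dedicated to LargeBlock.
        static HANDLE heap = HeapCreate(0, 0, 0);
        return heap;
    }

    double data[1024];  // placeholder payload
};
```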
