Question

The setup: I have an unmanaged/native Win32 application that I inject my code into. My code is primarily managed, written in C++/CLI and compiled as a DLL. My loader/injector patches the application's crt0 startup code (essentially swapping the call to the entry-point function with a call to LoadLibrary). On attaching to the host process, the DLL loads several component DLLs into memory (the injected DLL itself is written in unmanaged code; only the component DLLs are managed assemblies) and patches various memory locations to act as interfaces to the component DLLs.

Now my code in the component DLLs seems to be triggering a memory leak [as mentioned in Proper Object Disposal In C++/CLI], and I've been looking into debugging it with the .NET Memory Profiler tool. Unfortunately, my unusual environment hinders full use of the tool: attempting to attach the profiler to the process causes it to report that concurrent GC has been enabled for the process, which prevents it from using its internal API to track references and the like.

I've tried creating a config file to disable that GC mode, but it seems to have no effect; the profiler continues to display the error message. (I'm assuming this is because the host app is primarily native and therefore doesn't parse the config file at startup.) I've also tried modifying the machine.config file, in vain.
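For reference, the config I tried was essentially the standard gcConcurrent switch, something along these lines (hostapp.exe.config is a placeholder name; it has to match the host executable's name and sit next to it):

    <?xml version="1.0" encoding="utf-8"?>
    <!-- hostapp.exe.config, placed alongside the (hypothetical) hostapp.exe -->
    <configuration>
      <runtime>
        <!-- turn off concurrent workstation GC -->
        <gcConcurrent enabled="false"/>
      </runtime>
    </configuration>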

Would there be some other way to forcibly disable the concurrent garbage collector?


Solution

Fixed: I forced the server GC mode by passing the appropriate flag to CorBindToRuntimeEx() when binding the runtime myself.
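Roughly what the startup path in the injected (unmanaged) DLL looks like now. This is a trimmed-down sketch rather than the exact code, and the helper name StartClrWithoutConcurrentGC is just for illustration:

    #include <windows.h>
    #include <mscoree.h>                       // CorBindToRuntimeEx, ICLRRuntimeHost
    #pragma comment(lib, "mscoree.lib")

    // Bind the CLR explicitly before any managed component DLL is loaded,
    // so the GC mode is decided here instead of by the default settings.
    bool StartClrWithoutConcurrentGC()
    {
        ICLRRuntimeHost *pHost = NULL;

        HRESULT hr = CorBindToRuntimeEx(
            NULL,                              // NULL = latest installed runtime version
            L"svr",                            // request the server GC build flavor
            0,                                 // startup flags: STARTUP_CONCURRENT_GC deliberately NOT set
            CLSID_CLRRuntimeHost,
            IID_ICLRRuntimeHost,
            (LPVOID *)&pHost);
        if (FAILED(hr))
            return false;

        hr = pHost->Start();                   // CLR comes up with the GC mode chosen above
        pHost->Release();                      // the runtime itself stays loaded in the process
        return SUCCEEDED(hr);
    }

Since server GC (at least on the runtimes where CorBindToRuntimeEx applies) doesn't perform concurrent collections, this also makes the profiler's complaint go away. The call has to happen before the first managed component DLL pulls the runtime in, because the GC mode cannot be changed once the CLR has started.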

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow