Question

I am currently trying to debug the memory usage of my Python program (on Windows with CPython 2.7). But unfortunately, I can't even find any way to reliably measure the amount of memory it's currently using.

I've been using Task Manager/Resource Monitor to measure the process's memory, but this appears to be useful only for determining peak memory consumption. Often, Python will not reduce the Commit or Working Set even long after the relevant objects have been garbage collected.

Is there any way to find out how much memory Python is actually using, or failing that, to force it to free up its unused memory? I'd prefer not to use anything that would require recompiling the interpreter.

An example of the behavior, showing that it isn't freeing unused memory promptly:

(after some calculations)   # 290k
gc.collect()                # still 290k
x = range(9999999)          # 444k
del x                       # 405k
gc.collect()                # 40k

Solution

Is there any way to find out how much memory Python is actually using,

Not from within Python.

You can get a rough idea of per-object memory usage with sys.getsizeof; however, that doesn't capture total memory usage: it misses over-allocation, fragmentation, and memory that is unused but not yet freed back to the OS.
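To illustrate the limitation: sys.getsizeof reports only the shallow size of one object, so summing over contained objects is the best you can do by hand, and even that says nothing about what the allocator holds. A minimal sketch (the variable names here are illustrative, not from the original answer):

```python
import sys

lst = list(range(1000))

# Shallow size: the list object itself (header + pointer array),
# NOT the integers it refers to.
shallow = sys.getsizeof(lst)

# A rough "deep" size requires walking the references yourself.
total = shallow + sum(sys.getsizeof(i) for i in lst)

print(shallow, total)
```

Even `total` is only approximate: shared objects are double-counted, and interpreter-level overhead (freelists, arenas, fragmentation) is invisible to it.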

There is a third-party tool called Pympler that can help with memory analysis. There is also a programming environment called Guppy for object and heap memory sizing, profiling, and analysis, and a similar project called PySizer that provides a memory-usage profiler for Python code.

or failing that, to force it to free up its unused memory?

There is no public API for forcing memory to be released.
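A standard workaround (a sketch of a common technique, not something from the original answer) is to run memory-hungry work in a child process, e.g. via multiprocessing. When the child exits, the operating system reclaims all of its memory, regardless of what the Python allocator was holding on to:

```python
import multiprocessing


def allocate(n):
    # The large list exists only in the worker process.
    data = list(range(n))
    return len(data)


if __name__ == "__main__":
    # Run the allocation in a one-worker pool; the result is pickled
    # back to the parent, and the worker's memory is freed on exit.
    pool = multiprocessing.Pool(processes=1)
    result = pool.apply(allocate, (10 ** 6,))
    pool.close()
    pool.join()
    print(result)
```

This works on CPython 2.7 as well as 3.x; the trade-off is the cost of spawning the process and pickling arguments and results across the process boundary.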

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow