Question

I have a dictionary saved in a file. I load the dictionary into memory from a Python interactive shell, and my system monitor reports that the Python process consumes 4 GB. The following commands give the following outputs:

size1 = sys.getsizeof(mydict)/(1024**2)
print size1

96

size2 = 0
for i in mydict.keys():
    size2 += sys.getsizeof(i)
print size2/(1024**2)

37

size3 = 0
for i in mydict.keys():
    size3 += sys.getsizeof(mydict[i])
print size3/(1024**2)

981

size4 = 0
for i in mydict.keys():
    for j in mydict[i]:
        size4 += sys.getsizeof(j)
print size4/(1024**2)

2302

print str(size1 + size2 + size3 + size4)

3416
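(Aside: sys.getsizeof only reports the shallow size of a container, which is why the sizes above have to be summed manually level by level. A minimal sketch of a recursive deep-size helper — total_size is a hypothetical name, not part of the standard library — that walks common containers and tracks visited ids so shared objects are not double-counted:)

```python
import sys

def total_size(obj, seen=None):
    # Recursively sum sys.getsizeof over an object and its contents.
    # `seen` holds ids of already-visited objects so that shared
    # sub-objects are only counted once.
    if seen is None:
        seen = set()
    oid = id(obj)
    if oid in seen:
        return 0
    seen.add(oid)
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        for k, v in obj.items():
            size += total_size(k, seen) + total_size(v, seen)
    elif isinstance(obj, (list, tuple, set, frozenset)):
        for item in obj:
            size += total_size(item, seen)
    return size
```

For the dictionary above, total_size(mydict) would roughly correspond to size1 + size2 + size3 + size4 in a single call.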

Now if I delete the dictionary

del mydict
gc.collect()

less than 400 MB are freed. Even if I first remove all the items one by one from the lists inside the dictionary, no more than 450-500 MB are freed. So I end up with no variables in my shell, but 3.5 GB are still consumed. Can anyone explain what is happening?


Solution

There are two things to keep in mind:

Even if you delete the entire object, Python may hold on to that memory for later reuse (instead of allocating it again from the OS). So the footprint that the Python process leaves in the OS won't change significantly.

On Linux and other UNIX-like systems, a process does not necessarily return allocated memory to the OS until it exits.

To release as much memory as possible within the process, you can invoke the garbage collector explicitly with gc.collect(), which forces Python to free objects that are only kept alive by reference cycles.
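If the goal is to guarantee that the memory actually goes back to the OS, a common workaround is to do the memory-hungry work in a child process: all of its memory is returned when it exits, and only the small result crosses back to the parent. A minimal sketch using multiprocessing — the file name and the dict built inside the worker are hypothetical stand-ins for your actual loading code:

```python
from multiprocessing import Process, Queue

def build_and_summarize(path, out):
    # Hypothetical worker: build the large dict here (e.g. load it
    # from `path`), compute the small result you need, and put only
    # that result on the queue. Everything else is freed to the OS
    # when this process exits.
    mydict = {i: list(range(100)) for i in range(1000)}  # stand-in for loading `path`
    out.put(sum(len(v) for v in mydict.values()))

if __name__ == '__main__':
    q = Queue()
    p = Process(target=build_and_summarize, args=('data.pkl', q))
    p.start()
    result = q.get()
    p.join()
    print(result)
```

The parent process's footprint stays small because the 4 GB allocation only ever exists in the short-lived child.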

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow