Question

Consider a dictionary of dictionaries:

bigDict = { 'user1': userDict1, 'user2': userDict2 } 

Say it contains millions of users and I want to free the memory after I'm done processing the whole thing.

Will the following suffice:

del bigDict

Or do I need to do something like:

for userId, userDict in bigDict.items():
    del userDict

Solution

If the only reference to your dictionaries is the one held by bigDict, then deleting it is enough: CPython's reference counting will free the whole structure immediately. If there are references from elsewhere, your loop will not help either, because del only decrements an object's reference count by one (and in the loop it merely unbinds the local name userDict); an object is deallocated only when its count reaches zero.
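A minimal sketch of that counting behaviour, assuming CPython (note that sys.getrefcount reports one extra reference for its own argument):

import sys

userDict1 = {'name': 'a'}
bigDict = {'user1': userDict1}

# Two real references to userDict1: the name and the bigDict entry.
print(sys.getrefcount(userDict1))  # -> 3 (2 + the temporary argument)

del bigDict  # drops bigDict and its reference to userDict1
print(sys.getrefcount(userDict1))  # -> 2: userDict1 is still alive

del userDict1  # last reference gone; the object is deallocated now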

The easiest way to make sure your object is freed is to create and use it inside a function (a local scope), when possible: the function's body creates the big dictionary, processes it, and returns a set of result values; when the function returns, the local reference disappears and the dictionary can be freed. This way you are certain there are no stray references. Of course, for many practical purposes this is not possible.
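A minimal sketch of that pattern; process_users and its toy data are hypothetical:

def process_users():
    # bigDict exists only inside this function's scope.
    bigDict = {f'user{i}': {'score': i} for i in range(1_000_000)}
    total = sum(u['score'] for u in bigDict.values())
    return total  # bigDict's refcount drops to zero on return

result = process_users()
print(result)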

It must be noted that if you have many small objects alive at the same time, their memory will not be returned to the OS but kept in CPython's internal pools for reuse by new objects (this behaviour has been partially improved in Python 3). If memory usage is critical, you may want to store the data in array objects instead, so it no longer consists of many small objects (and you save per-object overhead in the process).
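A minimal sketch using the standard array module, assuming the per-user data can be reduced to numeric fields (the names here are hypothetical):

from array import array

# One machine-level double per user in a single contiguous buffer,
# instead of millions of individually allocated Python objects.
scores = array('d', (0.0 for _ in range(1_000_000)))
scores[42] = 3.14

# Deleting the array releases one large block rather than many small ones.
del scores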
