Question

I am using Django 1.3, and I am running a script outside of a web context under supervisor.

The memory usage of the process grows every minute.

The code looks more or less like this:

    from time import sleep

    from django import db

    while True:
        # auction_list is fetched earlier from the database
        for auction in auction_list:
            auction.update_auction()

        db.reset_queries()
        db.close_connection()
        sleep(1)

Adding close_connection() helped me avoid locks on the table, but now I have this growing-memory problem.

How can I manage things to avoid this?

Solution

I found a solution. The close_connection() call was responsible for the growing memory; it seems the growth came from repeatedly connecting to and disconnecting from the database.

I proceeded this way:

    from time import sleep

    from django import db
    from django.db import transaction

    while True:
        # Get the auction list
        auction_list = Auction.objects.all()

        # Check the auctions
        for auction in auction_list:
            auction.update_auction()

        # Clear the query log Django keeps while DEBUG = True
        db.reset_queries()

        # Commit so the locks on the tables are released
        transaction.commit_unless_managed()

        # Pause
        sleep(1)

With commit_unless_managed(), the daemon keeps its database connection open, its memory footprint stays flat, and the MyISAM table is never completely locked.
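For context, here is a minimal sketch of why reset_queries() matters in a long-running process (this assumes a configured Django project with DEBUG = True; Auction stands in for this project's model):

    from django import db

    # With settings.DEBUG = True, Django appends every executed SQL query
    # to db.connection.queries, so the list grows without bound in a daemon.
    for _ in range(100):
        list(Auction.objects.all())  # each pass appends entries to the log

    print(len(db.connection.queries))  # large, and still growing
    db.reset_queries()                 # empties the log and frees that memory
    print(len(db.connection.queries))  # 0 again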

OTHER TIPS

Umm... because you have the code wrapped in a while(1) loop? Of course it's growing out of control. You've created an infinite loop that continuously queries and updates the database until the end of eternity.

Let the process run to completion, then let it die. The operating system will reclaim some or all of the Python process's resources.
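For instance, a hedged sketch of the run-to-completion approach (the settings module, app path, and cron schedule below are illustrative assumptions; this targets the Django 1.x era, before django.setup() existed):

    # update_auctions.py -- runs once and exits; the OS reclaims memory each run
    import os
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings')  # assumed settings module

    from auctions.models import Auction  # hypothetical app path

    for auction in Auction.objects.all():
        auction.update_auction()

An illustrative crontab entry would then restart it every minute:

    * * * * * /usr/bin/python /path/to/update_auctions.py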

Or you could consider using something like Celery, which is built for this.
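For example, here is a minimal sketch of the same job as a Celery periodic task (this uses the modern Celery 4+ API; the app name, broker URL, and module paths are assumptions, not part of the original answer):

    # tasks.py -- illustrative sketch
    from celery import Celery

    app = Celery('auctions', broker='redis://localhost:6379/0')  # assumed broker

    @app.task
    def update_auctions():
        # Each run is a short-lived task, so there is no long-lived loop to leak memory.
        from auctions.models import Auction  # hypothetical app path
        for auction in Auction.objects.all():
            auction.update_auction()

    # Schedule the task every second with Celery beat
    app.conf.beat_schedule = {
        'update-auctions-every-second': {
            'task': 'tasks.update_auctions',
            'schedule': 1.0,  # seconds
        },
    }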

Licensed under: CC-BY-SA with attribution