Question

I'm using a script to check whether a website is running smoothly: roughly every 20 minutes I open the site and measure the response time, and so on. Like this:

import mechanize
import time

while True:
    MechBrowser = mechanize.Browser()
    Response = MechBrowser.open("http://example.com")
    time.sleep(1000)

I know Python does garbage collection itself and we shouldn't really have to bother, but when I check a network monitor I always find several unclosed connections, each alive for an hour or more. Not every connection that gets opened hangs around, just some of them. I'm confused. Is there a way to destroy these instances manually?


Solution

Try also closing your response object.
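Concretely, call `close()` on the response as soon as you are done with it, so the underlying socket is released immediately rather than whenever the garbage collector gets to it. A minimal standard-library sketch of the same periodic check (a sketch only: `urllib.request` stands in for mechanize here, and the URL is a placeholder):

```python
import time
import urllib.request
from contextlib import closing

def check_url(url):
    """Open the URL, time the request, and close the connection deterministically."""
    start = time.monotonic()
    # closing() guarantees response.close() runs even if an exception is raised
    with closing(urllib.request.urlopen(url)) as response:
        status = getattr(response, "status", None)
    elapsed = time.monotonic() - start
    print("%s -> %s in %.2fs" % (url, status, elapsed))
    return elapsed

def main():
    # Call main() to start the periodic check (placeholder URL and interval).
    while True:
        check_url("http://example.com")
        time.sleep(1200)  # roughly every 20 minutes
```

With mechanize the idea is the same: close the response object (and the browser, if you create a fresh one each iteration) instead of relying on garbage collection to do it for you.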

OTHER TIPS

`del` the object manually. Note that this does not delete the object itself; it only decrements the object's reference count. When the reference count of an object reaches zero, the garbage collector removes it from memory.
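A small illustration of that distinction, using `sys.getrefcount` (which reports one extra reference for its own argument):

```python
import sys

class Conn:
    """Stand-in for an object holding a resource."""
    pass

c = Conn()
alias = c                    # a second reference to the same object
before = sys.getrefcount(c)  # counts c, alias, and getrefcount's own argument
del alias                    # del removes one reference, not the object
after = sys.getrefcount(c)   # one lower than before; the object still exists
del c                        # last reference gone; the object can now be reclaimed
```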

You can also use multiprocessing, so that every resource used by the check is released when the child process exits:

from multiprocessing import Process
import time
import urllib.request

def check_url(url):
    try:
        f = urllib.request.urlopen(url)
        f.close()
        print("%s working fine" % url)
    except Exception as exc:
        print("Error", exc)

if __name__ == '__main__':
    while True:
        p = Process(target=check_url, args=("http://www.google.com",))
        p.start()
        p.join()  # wait for the child; its sockets die with the process
        time.sleep(5)
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow