Question

Python 2.7.3 under Linux: I'm getting strange behaviour when trying to use the timeout parameter of urlopen.

from urllib2 import urlopen, Request, HTTPError, URLError

url = "http://speedtest.website-solution.net/speedtest/random350x350.jpg"

try:
    #f = urlopen(url, timeout=30)   #never works - always times out
    f = urlopen(url)    #always works fine, returns after < 2 secs
    print("opened")
    f.close()
    print("closed")

except IOError as e:
    print(e)

EDIT:

Digging into this further, the problem seems to be at a lower level: the following code, using the socket module directly, has the same issue:

    import socket

    s = socket.socket()
    s.settimeout(30)
    s.connect(("speedtest.website-solution.net", 80))    #times out
    print("opened socket")
    s.close()

It's running behind a SOCKS proxy, launched with tsocks python test.py. I wonder if that could be interfering with the socket timeout for some reason? It seems strange that timeout=None works fine, though.
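One detail worth knowing here: with timeout=None, CPython issues a plain blocking connect(2), but as soon as a numeric timeout is set it switches the socket into an internal non-blocking mode and drives the connect with select()/poll(). LD_PRELOAD interceptors such as tsocks hook connect(2) and perform the SOCKS handshake inside it, which can misbehave on a non-blocking socket — a plausible (though unconfirmed) explanation for the symptom above. The mode change itself is easy to observe without any network access:

```python
import socket

# settimeout() does more than store a number: it flips the socket into
# CPython's internal timeout mode, where connect() becomes a
# non-blocking connect(2) plus select()/poll() -- the case that an
# LD_PRELOAD wrapper like tsocks may not handle.
s = socket.socket()
print(s.gettimeout())   # None: fully blocking connect(2)
s.settimeout(30)
print(s.gettimeout())   # 30.0: non-blocking connect + select() internally
s.close()
```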


Solution

OK, I figured it out. This is indeed related to the proxy. I have no idea why, but the following code fixes it:

Source: https://code.google.com/p/socksipy-branch/

Put this at the start of the code:

import urllib2
from urllib2 import urlopen, Request, HTTPError, URLError
import httplib

import socks
import socket
socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, "192.168.56.1", 101)
socks.wrapmodule(urllib2)

Now everything works fine.
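A likely reason this helps: SocksiPy's wrapmodule replaces the socket class that the wrapped module sees, so the SOCKS handshake happens in pure Python, where settimeout is honoured, rather than inside a connect(2) intercepted by tsocks. A minimal sketch of that patching idea, using a hypothetical TracingSocket stand-in instead of the real socksocket:

```python
import socket
import types

class TracingSocket(socket.socket):
    """Hypothetical stand-in for socks.socksocket: logs each connect."""
    def connect(self, address):
        print("connecting to %r" % (address,))
        return socket.socket.connect(self, address)

def wrapmodule(module):
    # The same trick SocksiPy uses: swap the socket class in the
    # module's view of the socket module, so the wrapped module
    # silently constructs the subclass from then on.
    module.socket.socket = TracingSocket

# Demonstrate on a throwaway fake module rather than patching urllib2:
fake = types.ModuleType("fake")
fake.socket = types.ModuleType("socket_view")
fake.socket.socket = socket.socket
wrapmodule(fake)
print(fake.socket.socket is TracingSocket)   # True
```

Because the replacement happens at module-attribute level, it only affects modules you explicitly wrap, which is why the snippet above wraps urllib2 specifically.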

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow