Question

I'm writing a program that downloads files of up to 1 GB in size. Right now I'm using the requests package, and although it works (I think it sometimes times out), it is very slow. I've seen some multi-part download examples using urllib2, but I'm looking for a way to do this with urllib3 or requests, if either package has that ability.

No correct solution

Other tips

How closely have you looked at requests' documentation?

The Quickstart documentation describes the following:

r = requests.get(url, stream=True)
r.raw.read(amount)

The better way to do this, however, is:

r = requests.get(url, stream=True)
with open(filename, 'wb') as fd:
    for chunk in r.iter_content(chunk_size=amount):
        fd.write(chunk)

(Assuming you are saving the downloaded content to a file.)
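Since the question asks specifically about multi-part downloads: requests can also fetch a file in pieces by sending an HTTP Range header, provided the server supports range requests (it advertises this with an Accept-Ranges: bytes response header). Below is a minimal sketch of that idea; the helper names (byte_ranges, download_part) and the 30-second timeout are my own assumptions, not part of the requests API.

```python
import requests

def byte_ranges(total_size, num_parts):
    # Split total_size bytes into num_parts contiguous (start, end)
    # pairs, with end inclusive, as the Range header expects.
    part = total_size // num_parts
    ranges = []
    for i in range(num_parts):
        start = i * part
        end = total_size - 1 if i == num_parts - 1 else start + part - 1
        ranges.append((start, end))
    return ranges

def download_part(url, start, end):
    # Ask the server for just this slice of the file. Only works if
    # the server honours Range requests (HTTP 206 Partial Content).
    headers = {"Range": "bytes=%d-%d" % (start, end)}
    r = requests.get(url, headers=headers, stream=True, timeout=30)
    r.raise_for_status()
    return r.content
```

You would read the total size from a HEAD request's Content-Length header, fetch each range (for example concurrently with concurrent.futures.ThreadPoolExecutor), and write the parts to the output file in order. Whether this is actually faster than a single streamed download depends on the server and the network, so it's worth measuring before committing to the extra complexity.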

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow