Question

I'm writing a program that downloads files up to 1 GB in size. Right now I'm using the requests package to download files, and although it works (I think it times out sometimes), it is very slow. I've seen some multi-part download examples using urllib2, but I'm looking for a way to do this with urllib3 or requests, if either package has that ability.

No correct solution

Other suggestions

How closely have you looked at requests' documentation?

The Quickstart documentation describes the following:

import requests

r = requests.get(url, stream=True)
r.raw.read(amount)  # read `amount` bytes of the raw, undecoded response body

A better way to do this, however, is:

import requests

# `with` closes the file even if an exception interrupts the download
with open(filename, 'wb') as fd:
    r = requests.get(url, stream=True)
    for chunk in r.iter_content(chunk_size=amount):
        fd.write(chunk)

(Assuming you are saving the downloaded content to a file.)
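Since the question also asks about multi-part downloads: a minimal sketch of that approach, assuming the server honors HTTP `Range` requests and reports a `Content-Length`. The helper names (`byte_ranges`, `download_ranged`) and the part count and timeout values are hypothetical choices for illustration, not part of the requests API:

```python
import requests


def byte_ranges(total_size, parts):
    """Split total_size bytes into `parts` contiguous (start, end) ranges, inclusive."""
    chunk = total_size // parts
    ranges = []
    start = 0
    for i in range(parts):
        # Last range absorbs any remainder from the integer division.
        end = total_size - 1 if i == parts - 1 else start + chunk - 1
        ranges.append((start, end))
        start = end + 1
    return ranges


def download_ranged(url, filename, parts=4, timeout=30):
    # Hypothetical helper: fetch each byte range separately, streaming
    # each part to disk. Requires Range support on the server.
    head = requests.head(url, timeout=timeout, allow_redirects=True)
    total = int(head.headers["Content-Length"])
    with open(filename, "wb") as fd:
        for start, end in byte_ranges(total, parts):
            r = requests.get(
                url,
                headers={"Range": "bytes=%d-%d" % (start, end)},
                stream=True,
                timeout=timeout,  # a timeout also helps with the stalls you mention
            )
            for chunk in r.iter_content(chunk_size=64 * 1024):
                fd.write(chunk)
```

Fetching the parts sequentially like this mainly helps with resumability; to actually speed things up you would fetch the ranges concurrently (e.g. with `concurrent.futures.ThreadPoolExecutor`) and write each part at its offset.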

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow