Question

I've developed a REST API to handle file uploads. Since the system handles large files, the API accepts uploads with chunked transfer encoding. The API works fine with jQuery's file upload plugin, so there are no problems on the server side.

I'm currently working on a Python script to do some batch processing on a large number of big files. However, I can't find any examples, code snippets, or guides on how to do a chunked upload with Python. I've looked at urllib2 and httplib but can't get them to work. I've also dug into curl, but without any luck.


Solution

You can use PycURL for this (see the sketch below). If you omit the pycurl.INFILESIZE option, PycURL performs the upload with chunked transfer encoding, since it does not know the size of the data it is about to send.
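
A minimal sketch of this approach, assuming a PUT endpoint; the URL, file path, and function name are placeholders, not the API from the question:

    import pycurl

    def upload_chunked(url, path):
        c = pycurl.Curl()
        c.setopt(pycurl.URL, url)
        c.setopt(pycurl.UPLOAD, 1)  # issue an HTTP PUT
        with open(path, 'rb') as f:
            # Feed the body through READFUNCTION and do NOT set INFILESIZE,
            # so libcurl sends it with Transfer-Encoding: chunked.
            c.setopt(pycurl.READFUNCTION, f.read)
            c.perform()
        status = c.getinfo(pycurl.RESPONSE_CODE)
        c.close()
        return status

    print(upload_chunked('http://example.com/upload', 'bigfile.bin'))

Setting pycurl.UPLOAD makes libcurl issue a PUT; if your API expects a different method, pycurl.CUSTOMREQUEST can override it.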

OTHER TIPS

You need to upload the file as multipart form data. In curl this can be done by setting the content type to "multipart/form-data" (the -F option). A similar upload can be done from Python using something like what is described here: Using MultipartPostHandler to POST form-data with Python.
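
If the older MultipartPostHandler approach is not convenient, here is a rough alternative sketch using the requests library instead; the URL and the 'file' field name are assumptions about the API:

    import requests

    def upload_multipart(url, path):
        # 'file' is a hypothetical form field name; use whatever the API expects.
        with open(path, 'rb') as f:
            response = requests.post(url, files={'file': f})
        return response.status_code

    print(upload_multipart('http://example.com/upload', 'bigfile.bin'))

Note that requests builds the multipart body in memory; for very large files, the MultipartEncoder from requests-toolbelt can stream it instead.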

Licensed under: CC-BY-SA with attribution