I have working code that uploads files of any size from the user to Azure Blob storage. The issue is that if a file is larger than a certain size, I switch to chunked uploads (I also support resuming an upload).

While chunk uploading, the FileReader is very slow. I am unsure how to fix this -- web workers, I assume? Would that involve Parallel.js?

Another constraint is that I will eventually need to support uploads on IE9, although the code currently only works in HTML5 browsers because I use FileReader. How am I supposed to do this without web workers, and without it being soul-crushingly slow? (I have 50 Mbps upload and was only getting 5 Mbps.)

EDIT: Thanks guys, that did it. Instead of reading the file in with FileReader, I just:

// Slice the next chunk directly from the File object -- no FileReader needed.
var requestData = o['files'][0].slice(o.uploadedBytes, o.uploadedBytes + o.maxChunkSize);
o.data = requestData;
o.dataType = requestData.type;
o.type = "PUT";
o.beforeSend = function(xhr) {
    // Azure requires the blob type header on the PUT request.
    xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
};

And it works like a charm!


Solution

In modern browsers you can .slice() a File. Slicing only creates a reference to a byte range instead of reading the data into memory, so it is very fast and solves your performance issue.
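For example, here is a minimal sketch of slice-based chunking with plain XMLHttpRequest, not tied to any particular upload plugin; the names uploadInChunks, uploadUrl and chunkSize are illustrative, and the x-ms-blob-type header mirrors what the edit above already sends:

function uploadInChunks(file, uploadUrl, chunkSize) {
    var offset = 0;

    function uploadNext() {
        if (offset >= file.size) {
            return; // all chunks sent
        }
        // slice() only references a byte range of the File; nothing is
        // read into memory here, which is why it is fast.
        var chunk = file.slice(offset, offset + chunkSize);

        var xhr = new XMLHttpRequest();
        xhr.open('PUT', uploadUrl, true);
        xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
        xhr.onload = function () {
            offset += chunkSize;
            uploadNext(); // continue with the next chunk
        };
        xhr.send(chunk); // the browser streams the blob slice
    }

    uploadNext();
}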

Older browsers don't support this. There you will have to find a workaround that works for you, perhaps a Flash-based uploader, or simply limit the maximum file size in IE9.
