Question

Can someone please point me in the right direction?

  1. I need to create a Windows timer service that will upload files from the local file system to Azure blobs.
  2. Each file (video) may be anywhere between 2GB and 16GB. Is there a limit on the size? Do I need to split the file?
  3. Because the files are very large, can I throttle the upload speed to Azure?
  4. Is it possible for another application (WPF) to see the progress of the upload? i.e., a progress bar showing how much data has been transferred and at what speed it is transferring?

Solution

The upper limit for a block blob, the type you want here, is 200GB. Page blobs, used for VHDs, can go up to 1TB.

Block blobs are so named because uploading one is a two-step process: upload a set of blocks, then commit the block list. Client APIs can hide some of this complexity, but since you want to control the uploads and keep track of their status, you should upload the files in blocks - the maximum block size is 4MB - and manage that flow and its success yourself. At the end of the upload you commit the block list.
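The two-step flow above can be sketched as follows. This is a minimal illustration, not the real SDK: the `stage_block` callback stands in for the actual storage call (for example, `stage_block`/`commit_block_list` in the Azure Python SDK, or `PutBlock`/`PutBlockList` in the REST API), and block ids here are plain zero-padded strings for readability, whereas the real service requires base64-encoded ids of equal length. The upload is stubbed so the chunking and progress arithmetic can run locally.

```python
import io

BLOCK_SIZE = 4 * 1024 * 1024  # 4MB maximum block size, as noted above

def split_into_blocks(stream, block_size=BLOCK_SIZE):
    """Yield (index, chunk) pairs; each chunk becomes one staged block."""
    index = 0
    while True:
        chunk = stream.read(block_size)
        if not chunk:
            break
        yield index, chunk
        index += 1

def upload_in_blocks(stream, total_size, stage_block, on_progress=None):
    """Stage every block, reporting bytes sent after each one, then
    return the ordered block-id list to commit in the second step."""
    sent = 0
    block_ids = []
    for index, chunk in split_into_blocks(stream):
        block_id = f"{index:08d}"       # real ids must be base64 and equal length
        stage_block(block_id, chunk)    # hypothetical stand-in for the SDK call
        block_ids.append(block_id)
        sent += len(chunk)
        if on_progress:
            on_progress(sent, total_size)  # feeds a progress bar / transfer stats
    return block_ids

# Usage: a 9MB in-memory "file" splits into three blocks (4 + 4 + 1 MB).
data = io.BytesIO(b"x" * (9 * 1024 * 1024))
staged = {}
progress = []
ids = upload_in_blocks(
    data, 9 * 1024 * 1024,
    stage_block=lambda bid, chunk: staged.__setitem__(bid, len(chunk)),
    on_progress=lambda sent, total: progress.append(round(100 * sent / total)),
)
```

The per-block `on_progress` callback is the natural hook for the progress-bar requirement in the question: the uploader can publish `(bytes sent, total)` after each 4MB block, and a separate UI can read and display it.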

Kevin Williamson, who has done a number of spectacular blog posts, has a post showing how to do "Asynchronous Parallel Blob Transfers with Progress Change Notification 2.0."

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow