Question

If I try to upload a large file - 388.7 MB in this case - to azure blob storage using the demo code, it fails like so:

begin
  content = File.open("big_file.dat", "rb") { |file| file.read }
  blob = azure_blob_service.create_block_blob(container.name, "image-blob", content)
  puts blob.name
rescue StandardError => e
  $stderr.puts e.message
end

# RequestBodyTooLarge (413): The request body is too large and exceeds the maximum permissible limit.

I've read in the blob storage documentation that blobs can be up to 200 GB in size, so it looks as though the Ruby API doesn't correctly chunk its file uploads. Am I missing something?


Solution

The current Ruby Azure SDK does indeed have methods for doing chunked uploads, but there are no usage examples anywhere, and everything in the specs is a mock, which doesn't really help.

Getting chunked uploads to work is so fiddly that this is absolutely something that should be included in the library. It took me a number of hours to get this right and I hope this code snippet helps.

Here is a very basic usage example:

require 'azure'
require 'base64'
require 'digest/md5'

class ::File
  # Yield the file in fixed-size chunks (1 MB by default) until EOF
  def each_chunk(chunk_size = 2**20)
    yield read(chunk_size) until eof?
  end
end

container  = 'your container name'
blob       = 'your blob name'
block_list = []
service    = Azure::BlobService.new
counter    = 1

open('path/to/file', 'rb') do |f|
  f.each_chunk do |chunk|
    # Block IDs must all be the same length, hence the zero padding
    block_id = counter.to_s.rjust(5, '0')
    block_list << [block_id, :uncommitted]

    # You will likely want the MD5 so you can verify each block and retry on mismatch
    options = {
      content_md5: Base64.strict_encode64(Digest::MD5.digest(chunk)),
      timeout:     300 # 5 minutes
    }

    # create_blob_block returns the MD5 the service computed for the block
    md5 = service.create_blob_block(container, blob, block_id, chunk, options)
    counter += 1
  end
end

service.commit_blob_blocks(container, blob, block_list)
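To act on the "MD5 for retries" comment above, one approach is to compare the MD5 the service returns for each block against the locally computed digest and re-upload on mismatch. This is a hypothetical sketch, not part of the SDK: `upload_block_with_retry` and its retry policy are my own names, and it assumes (as the snippet above does) that `create_blob_block` returns the service-side MD5.

```ruby
require 'base64'
require 'digest/md5'

# Hypothetical helper (not part of the Azure SDK): upload one block and
# retry if the MD5 echoed back by the service does not match ours.
def upload_block_with_retry(service, container, blob, block_id, chunk, attempts: 3)
  local_md5 = Base64.strict_encode64(Digest::MD5.digest(chunk))
  options   = { content_md5: local_md5, timeout: 300 }

  attempts.times do
    returned = service.create_blob_block(container, blob, block_id, chunk, options)
    return returned if returned == local_md5
  end
  raise "block #{block_id}: MD5 mismatch after #{attempts} attempts"
end
```

You would call this inside the `each_chunk` loop in place of the bare `create_blob_block` call.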

Give me a couple of days and I should have something more reasonably encapsulated committed to https://github.com/dmichael/azure-contrib

Other tips

Even though block blobs can be up to 200 GB in size, you can upload a block blob without splitting it into chunks only if it is smaller than 64 MB. Any blob larger than that must be uploaded in chunks, where each chunk (block) can be at most 4 MB.
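To make those limits concrete, here is a small arithmetic sketch using the figures stated above (64 MB single-shot limit, 4 MB per block) and the 388.7 MB file size from the question; the constants are taken from this answer, not queried from Azure:

```ruby
# Limits as described above (values from this answer, current at time of writing)
SINGLE_UPLOAD_LIMIT = 64 * 1024 * 1024   # above this, chunking is required
MAX_BLOCK_SIZE      = 4 * 1024 * 1024    # maximum size of one block

file_size = (388.7 * 1024 * 1024).round  # the 388.7 MB file from the question

needs_chunking = file_size > SINGLE_UPLOAD_LIMIT
blocks_needed  = (file_size.to_f / MAX_BLOCK_SIZE).ceil

puts needs_chunking  # => true
puts blocks_needed   # => 98 (blocks of up to 4 MB each)
```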

Looking at the source code, create_block_blob uploads the file in one go without splitting it into chunks. Since your file is larger than 64 MB, you get this error. The solution is to split the file into chunks and upload them with the create_blob_block function.

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow