Question

I set up a test script almost exactly like in the example here: https://github.com/GoodCloud/django-ajax-uploader

It seems to start uploading the file (javascript updates the name and size of the file), but the view gives me a 500 error with this message. I can't find anything on how to fix it.

S3ResponseError: S3ResponseError: 400 Bad Request
<Error><Code>MalformedXML</Code><Message>The XML you provided was not well-formed or did not validate against our published schema</Message><RequestId>26E6EF8296A0E585</RequestId><HostId>F4QUOsVT4LxC+6OUP2lE1/9uPC77keOejyWs57GpS5kjvHXpun3U+81ntL8ZTgDa</HostId></Error>

I was able to upload a file in the shell with boto using the commands here: Upload 0 byte file to Amazon S3
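For reference, the boto (v2) shell session that did upload successfully looked roughly like this; the key name and empty contents are placeholders, and the credentials are the same placeholders as in settings.py below:

```python
# Sketch of the boto (v2) shell upload that worked, following the
# approach from the linked answer; names here are placeholders.
import boto
from boto.s3.key import Key

conn = boto.connect_s3("myAccessKey", "mySecretKey")
bucket = conn.get_bucket("bucketName")
key = Key(bucket)
key.key = "test.txt"
key.set_contents_from_string("")  # a 0 byte object, as in the linked answer
```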

The view:

from ajaxuploader.views import AjaxFileUploader
from ajaxuploader.backends.s3 import S3UploadBackend

import_uploader = AjaxFileUploader(backend=S3UploadBackend)

javascript:

var uploader = new qq.FileUploader({
    action: "/ajax/profile-upload/",
    element: $('#file-uploader')[0],
    multiple: true,
    onComplete: function(id, fileName, responseJSON) {
        if(responseJSON.success) {
            alert("success!");
        } else {
            alert("upload failed!");
        }
    },
    onAllComplete: function(uploads) {
        // uploads is an array of maps
        // the maps look like this: {file: FileObject, response: JSONServerResponse}
        alert("All complete!");
    },
    params: {
        'csrf_token': $('[name=csrfmiddlewaretoken]').val(),
        'csrf_name': 'csrfmiddlewaretoken',
        'csrf_xname': 'X-CSRFToken',
    },
});

template:

<div id="file-uploader">       
    <noscript>          
        <p>Please enable JavaScript to use file uploader.</p>
    </noscript>         
</div>

I have the S3 access variables in my settings.py file (they are read in the ajaxuploader/backends/s3.py file):

AWS_ACCESS_KEY_ID = "myAccessKey"
AWS_SECRET_ACCESS_KEY = "mySecretKey"
AWS_BUCKET_NAME = "bucketName"


OTHER TIPS

I solved this problem with a custom S3 backend that overrides the upload function and uses django-storages instead of boto to save files. Try this:

from ajaxuploader.backends.base import AbstractUploadBackend
from django.conf import settings  # needed for settings.S3_URL below
from django.core.files.storage import default_storage
# ...plus an import for your own Upload model used in upload_complete()

class S3CustomUpload(AbstractUploadBackend):
    NUM_PARALLEL_PROCESSES = 4

    def setup(self, filename):
        # Open a handle on the default storage (django-storages S3 backend).
        self._fd = default_storage.open('%s/%s' % ('uploads/materials/', str(filename)), 'wb')

    def upload_chunk(self, chunk, *args, **kwargs):
        # Write each chunk to S3 through the storage handle.
        self._fd.write(chunk)

    def upload(self, uploaded, filename, raw_data, *args, **kwargs):
        try:
            if raw_data:
                # File was uploaded via ajax, and is streaming in.
                chunk = uploaded.read(self.BUFFER_SIZE)
                while len(chunk) > 0:
                    self.upload_chunk(chunk, *args, **kwargs)
                    chunk = uploaded.read(self.BUFFER_SIZE)
            else:
                # File was uploaded via a POST, and is here.
                for chunk in uploaded.chunks():
                    self.upload_chunk(chunk, *args, **kwargs)
            # Close only after every chunk is written, so the whole file is
            # flushed to S3 (closing inside upload_chunk would truncate the
            # upload to the first chunk).
            self._fd.close()
            return True
        except Exception:
            # Things went badly.
            return False

    def upload_complete(self, request, filename, *args, **kwargs):
        upload = Upload()
        upload.upload = settings.S3_URL + "uploads/materials/" + filename
        upload.name = filename
        upload.save()

        return {'pk': upload.pk}
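To wire the custom backend in, the view from the question just needs to point at it. The django-storages settings below (the storage class path and the S3_URL format) are assumptions based on django-storages' boto backend and the placeholder bucket name, not something the original answer spelled out:

```python
# views.py -- use the custom backend in place of S3UploadBackend
from ajaxuploader.views import AjaxFileUploader

import_uploader = AjaxFileUploader(backend=S3CustomUpload)

# settings.py -- make django-storages the default storage so that
# default_storage.open() in the backend writes to S3 (values assumed)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_STORAGE_BUCKET_NAME = "bucketName"
S3_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
```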
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow