Question

When using the aws-sdk npm package for Node.js, I can upload a 50 KB PDF to AWS S3 with the following code:

var params = {
    Bucket: BUCKET,
    Key: pdf_key,
    Body: file,
    ContentType: 'application/pdf'
};
var s3 = new AWS.S3();

s3.putObject(params, function(error, data) {
    console.log(data);
    console.log(error);
    if (error) {
        console.log(error);
        callback(error, null);
    } else {
        callback(null, pdf_key);
    }
});

But when uploading an 11 MB PDF, even when specifying the ContentLength, the upload just runs forever, even with a timeout of 2 minutes.

The question is: how do I make AWS S3 accept the large PDF file?

UPDATE

I have still not found any documentation or answers for this question.

UPDATE 2

I will accept answers that show how this or another framework can do this. The framework will also need to allow authenticated-read access on the object.

UPDATE 3

I got it working for now, but I haven't found a reason why it shouldn't have worked before.

Thanks in advance!


Solution

Connecting to S3 isn't fast, and depending on network fluctuations you can get timeouts and other odd behavior.

The code you provided is fine, but you could take advantage of multipart uploads, which can solve problems especially with files larger than 5 MB.

I made a rough implementation of a multipart upload that also retries the upload of any failing part up to 3 times; this also works for files smaller than 5 MB.
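A minimal sketch of such a multipart upload with per-part retries, using the same aws-sdk package, could look like this. It assumes file is a Buffer and that BUCKET, pdf_key and callback are in scope as in the question; the 5 MB part size, the retry limit and the multipartUpload helper are illustrative choices, not the answer's original code:

var AWS = require('aws-sdk');

// Every part except the last must be at least 5 MB.
var PART_SIZE = 5 * 1024 * 1024;
var MAX_RETRIES = 3;

function multipartUpload(s3, bucket, key, file, contentType, callback) {
    s3.createMultipartUpload(
        { Bucket: bucket, Key: key, ContentType: contentType },
        function(error, multipart) {
            if (error) return callback(error);

            var uploadId = multipart.UploadId;
            // At least one part, so files smaller than 5 MB still work.
            var partCount = Math.max(1, Math.ceil(file.length / PART_SIZE));
            var completedParts = [];
            var remaining = partCount;
            var aborted = false;

            function uploadOnePart(partNumber, attempt) {
                var start = (partNumber - 1) * PART_SIZE;
                var body = file.slice(start, Math.min(start + PART_SIZE, file.length));

                s3.uploadPart(
                    { Bucket: bucket, Key: key, UploadId: uploadId,
                      PartNumber: partNumber, Body: body },
                    function(error, data) {
                        if (aborted) return;
                        if (error) {
                            // Retry this part a few times before giving up.
                            if (attempt < MAX_RETRIES) return uploadOnePart(partNumber, attempt + 1);
                            aborted = true;
                            // Abort so S3 does not keep the incomplete parts around.
                            return s3.abortMultipartUpload(
                                { Bucket: bucket, Key: key, UploadId: uploadId },
                                function() { callback(error); });
                        }
                        completedParts.push({ ETag: data.ETag, PartNumber: partNumber });
                        if (--remaining === 0) {
                            // Parts can finish out of order; S3 wants them sorted by PartNumber.
                            completedParts.sort(function(a, b) { return a.PartNumber - b.PartNumber; });
                            s3.completeMultipartUpload(
                                { Bucket: bucket, Key: key, UploadId: uploadId,
                                  MultipartUpload: { Parts: completedParts } },
                                callback);
                        }
                    });
            }

            // All parts are started in parallel, which is fine for a handful of parts.
            for (var partNumber = 1; partNumber <= partCount; partNumber++) {
                uploadOnePart(partNumber, 1);
            }
        });
}

// Usage, mirroring the putObject call from the question:
multipartUpload(new AWS.S3(), BUCKET, pdf_key, file, 'application/pdf',
    function(error) {
        if (error) {
            callback(error, null);
        } else {
            callback(null, pdf_key);
        }
    });

If you also need the authenticated-read access mentioned in UPDATE 2, the createMultipartUpload params accept an ACL field (for example ACL: 'authenticated-read'), just like putObject.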

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow