Question

I am trying to upload a file to Amazon S3 from GAE.

I tried the official Amazon SDK (JetS3t, built on top of the lower-level SDK), only to find out that even if you can get it to work locally by setting permissions on the local JVM, it is not supported once you deploy to GAE, for crypto-related reasons.

Then, out of desperation, I found that some good soul forked the official low-level Amazon SDK so that it would work on GAE. This kind of works (even though I can see some strange NullPointerExceptions being thrown here and there) and the file gets uploaded ... but if the file size exceeds 5 MB I get an error from within the API:

com.google.apphosting.api.ApiProxy$RequestTooLargeException: The request to API call urlfetch.Fetch() was too large

I don't fully understand this, as the current GAE limits seem to be 32 MB on file uploads and 1 MB on request/response, while my problem occurs only when the file is around 5 MB or bigger.

I think my only alternative left is jclouds, but I am having trouble finding examples of uploading files to S3 using the BlobStore library.

Does anyone have experience or examples to share of uploading files to S3 with jclouds? And am I likely to run into the same "urlfetch.Fetch() was too large" error?

Any help appreciated.


Solution

URLFetch requests are limited to 5 MB, as documented here. The only solutions that will work are those that involve breaking a large payload up into smaller chunks. Fortunately, S3 provides a multipart upload API.
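As a rough sketch, here is what that looks like with the low-level multipart calls in the AWS SDK for Java (the GAE-friendly fork mentioned in the question should expose the same calls, assuming it tracks the upstream API). The bucket, key, and in-memory byte[] are placeholders; the point is that each part goes out as its own HTTP request, so the whole file never has to fit into a single URLFetch call:

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadResult;
import com.amazonaws.services.s3.model.PartETag;
import com.amazonaws.services.s3.model.UploadPartRequest;

public class MultipartUploadSketch {

    public static void upload(AmazonS3 s3, String bucket, String key, byte[] data) {
        // Start the multipart upload and remember its upload id.
        InitiateMultipartUploadResult init =
                s3.initiateMultipartUpload(new InitiateMultipartUploadRequest(bucket, key));

        List<PartETag> partETags = new ArrayList<PartETag>();
        int partSize = 5 * 1024 * 1024; // S3 requires every part except the last to be at least 5 MB

        for (int offset = 0, partNumber = 1; offset < data.length; offset += partSize, partNumber++) {
            int size = Math.min(partSize, data.length - offset);

            UploadPartRequest part = new UploadPartRequest()
                    .withBucketName(bucket)
                    .withKey(key)
                    .withUploadId(init.getUploadId())
                    .withPartNumber(partNumber)
                    .withInputStream(new ByteArrayInputStream(data, offset, size))
                    .withPartSize(size);

            // Each uploadPart call is a separate request; collect the ETag it returns.
            partETags.add(s3.uploadPart(part).getPartETag());
        }

        // Tell S3 to stitch the parts together into the final object.
        s3.completeMultipartUpload(
                new CompleteMultipartUploadRequest(bucket, key, init.getUploadId(), partETags));
    }
}
```

If a part fails, you would normally abort the multipart upload (the SDK has an abortMultipartUpload call) so the already-uploaded parts don't linger and keep accruing storage charges.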

OTHER TIPS

In the release notes for 1.5.0, I read: "In response to popular demand, the HTTP request and response sizes have been increased to 32 MB." So that covers request and response; URL Fetch is not mentioned.

Indeed, looking at the URL Fetch documentation, it says the maximum is 5 MB.
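Since the question specifically asked about jclouds: its BlobStore API can do the chunking for you if you request a multipart upload via PutOptions, which keeps each outgoing request well below the file size. The sketch below is only illustrative: it assumes a 1.5-style ContextBuilder (older releases used BlobStoreContextFactory instead), and the bucket name, credentials, and file path are placeholders. On App Engine you would also need jclouds' URLFetch/GAE driver module, if one is available for your version, so it doesn't try to open raw sockets.

```java
import static org.jclouds.blobstore.options.PutOptions.Builder.multipart;

import java.io.File;

import org.jclouds.ContextBuilder;
import org.jclouds.blobstore.BlobStore;
import org.jclouds.blobstore.BlobStoreContext;
import org.jclouds.blobstore.domain.Blob;

public class JcloudsS3UploadSketch {

    public static void main(String[] args) {
        File file = new File("/path/to/big-file.bin"); // placeholder

        // Build a BlobStore view onto the aws-s3 provider.
        BlobStoreContext context = ContextBuilder.newBuilder("aws-s3")
                .credentials("ACCESS_KEY", "SECRET_KEY") // placeholders
                .buildView(BlobStoreContext.class);
        try {
            BlobStore blobStore = context.getBlobStore();
            blobStore.createContainerInLocation(null, "my-bucket");

            Blob blob = blobStore.blobBuilder("big-file.bin")
                    .payload(file)
                    .contentLength(file.length())
                    .build();

            // multipart() asks jclouds to split the payload into parts
            // and drive S3's multipart upload API under the hood.
            blobStore.putBlob("my-bucket", blob, multipart());
        } finally {
            context.close();
        }
    }
}
```

Whether each generated part stays under the URLFetch cap depends on the part size jclouds picks, so it is worth testing with a file just over 5 MB before relying on it.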
