Question

I'm in a situation where I need to push image storage for a number of websites out to a service that can scale indefinitely (S3, CloudFiles, etc.). Up until this point we've been able to let our users generate custom thumbnail sizes on the fly using the Python Imaging Library (PIL) with some help from sorl-thumbnail in Django.

By moving our images to something like S3, we lose the ability to quickly create thumbnails on the fly. We could:

  1. Do it slowly by downloading the source from S3 and creating the thumbnail locally (a minimal sketch of this follows the list)
    con: slow and bandwidth-intensive
  2. Do it up front by creating a predetermined set of thumbnail sizes (à la Flickr) and pushing them all to S3
    con: limits the sizes that can be generated and stores lots of files that will never be used
  3. Let the browser resize the image using the height/width attributes on the img tag
    con: extra bandwidth is used downloading larger-than-necessary files
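
For illustration, option #1 in its simplest form looks roughly like the following. This is a minimal sketch, assuming boto3 and Pillow (the modern successors to boto and PIL); the function name, bucket, and key are placeholders:

    from io import BytesIO

    import boto3
    from PIL import Image

    s3 = boto3.client("s3")

    def thumbnail_from_s3(bucket, key, max_size):
        """Fetch a source image from S3 and resize it locally."""
        # The slow, bandwidth-heavy step: pull the full-size original.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        img = Image.open(BytesIO(body)).convert("RGB")
        # thumbnail() shrinks in place and preserves the aspect ratio.
        img.thumbnail((max_size, max_size), Image.LANCZOS)
        out = BytesIO()
        img.save(out, format="JPEG", quality=85)
        return out.getvalue()

Every call pays for a full-size GET from S3, which is exactly the cost the "con" above describes.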

At this point #3 looks to be a simple solution to the problem with few drawbacks. Some quick tests and data from this website suggest that the quality isn't as bad as expected (we could ensure the aspect ratio is maintained).

Any suggestions on other options or drawbacks we might not be taking into consideration?

Note: the images are digital photos and are only used for display on the web. Sizes would range from 50 to 1000 pixels in height/width.


Solution

I would recommend using EC2 to scale the images on demand. Since bandwidth between EC2 and S3 is free and should be fast, I think that eliminates the problems with option #1. A sketch of such a service follows.
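
As one possible shape for this: a small HTTP service running on EC2 that generates a requested size on demand and caches the result back to S3, so each size is only ever computed once. This is a sketch, not a definitive implementation; it assumes boto3, Pillow, and Flask, and the bucket names and URL scheme are hypothetical:

    from io import BytesIO

    import boto3
    from flask import Flask, Response, abort
    from PIL import Image

    app = Flask(__name__)
    s3 = boto3.client("s3")
    SOURCE_BUCKET = "my-originals"   # hypothetical bucket names
    CACHE_BUCKET = "my-thumbnails"

    @app.route("/thumb/<int:size>/<path:key>")
    def thumb(size, key):
        if not 50 <= size <= 1000:   # the size range from the question
            abort(400)
        cache_key = f"{size}/{key}"
        try:
            # Serve from the thumbnail cache if this size was made before.
            cached = s3.get_object(Bucket=CACHE_BUCKET, Key=cache_key)
            return Response(cached["Body"].read(), mimetype="image/jpeg")
        except s3.exceptions.NoSuchKey:
            pass
        # Cache miss: pull the original over the free EC2<->S3 link and resize.
        original = s3.get_object(Bucket=SOURCE_BUCKET, Key=key)["Body"].read()
        img = Image.open(BytesIO(original)).convert("RGB")
        img.thumbnail((size, size), Image.LANCZOS)  # preserves aspect ratio
        out = BytesIO()
        img.save(out, format="JPEG", quality=85)
        data = out.getvalue()
        # Store the result so subsequent requests for this size are cheap.
        s3.put_object(Bucket=CACHE_BUCKET, Key=cache_key, Body=data,
                      ContentType="image/jpeg")
        return Response(data, mimetype="image/jpeg")

A cache or CDN in front of the service would keep repeat requests from hitting the resizer at all.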

Licensed under: CC-BY-SA with attribution