Question

Does anyone know of a solution for storing data larger than 1 MB in memcached?

It would be no big deal to cut big data into 1 MB pieces before setting them and to merge those pieces after getting them from memcached, and such an algorithm could work transparently for users.

This could work along the lines of this snippet: http://www.djangosnippets.org/snippets/488/
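
For illustration, a minimal sketch of such a chunking wrapper around Django's cache API, assuming the value is a bytestring (the helper names and the 1,000,000-byte chunk size are my own choices; staying slightly under 1 MB leaves headroom for memcached's per-item key and flag overhead):

    from django.core.cache import cache

    CHUNK_SIZE = 1000 * 1000  # stay safely under memcached's 1 MB item limit

    def cache_set_big(key, data):
        # Split the value into chunks; store the chunk count under the base key.
        chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
        cache.set(key, len(chunks))
        for i, chunk in enumerate(chunks):
            cache.set('%s_%d' % (key, i), chunk)

    def cache_get_big(key):
        # Rebuild the value, returning None on any missing piece.
        count = cache.get(key)
        if count is None:
            return None
        chunks = [cache.get('%s_%d' % (key, i)) for i in range(count)]
        if any(chunk is None for chunk in chunks):
            return None
        return b''.join(chunks)

One caveat with this approach: the chunks can be evicted independently, so the getter has to treat any missing chunk as a full cache miss.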

Solution

You can ask memcached to increase the ceiling, but it's quite often a sign that you're doing something wrong.
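
For reference, memcached 1.4.2 and later let you raise that ceiling with the -I startup flag, for example:

    memcached -I 5m    # allow items up to 5 MB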

Most of the time, when we dig into the things people are trying to do with larger objects, they're backing themselves into a corner and end up asking questions like "OK, now how can I request just part of this object?"

Sometimes, there's a legitimate need for larger objects (so we support that).

OTHER TIPS

Have you checked whether gzipping your sitemap helps? For me it reduced my sitemaps to 200 KB, and now they fit perfectly into memcached. Sitemap bots have no problems with gzip these days.
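
A minimal sketch of that idea, assuming the rendered sitemap XML is available as a bytestring (the cache key and helper names are made up for the example):

    import zlib

    from django.core.cache import cache

    def cache_sitemap(xml_bytes):
        # XML compresses very well; level 9 favours size over CPU.
        cache.set('sitemap_gz', zlib.compress(xml_bytes, 9))

    def get_sitemap():
        data = cache.get('sitemap_gz')
        return zlib.decompress(data) if data is not None else None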

I had a similar problem caching long lists of QuerySet results. MyModel.objects.defer('huge_data_field1', 'huge_data_field2') helped exclude a huge amount of data from the results and solved the problem for me. Hope it helps someone else too.
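
In context that looks roughly like this (MyModel and the field names stand in for the real schema):

    from django.core.cache import cache

    # defer() leaves the big columns out of the SELECT, so the cached,
    # pickled objects stay well under memcached's item limit.
    rows = list(MyModel.objects.defer('huge_data_field1', 'huge_data_field2'))
    cache.set('mymodel_rows', rows)

One caveat: accessing a deferred field on a loaded instance later triggers an extra query for that object.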

You could set up a cron job that fetches the sitemap (generated by Django's sitemap framework) and saves it to a file, then serve that file as static media (via nginx or Apache, not via Django, of course):

curl -o /path-to-static-media/sitemap.xml http://yoursite.com/view-that-generates-sitemap/sitemap.xml
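
Wired into cron, that could look like the following (the schedule is arbitrary; the paths are from the command above):

    # crontab entry: refresh the static sitemap nightly at 03:00
    0 3 * * * curl -s -o /path-to-static-media/sitemap.xml http://yoursite.com/view-that-generates-sitemap/sitemap.xml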