Question

I have a REST API where I would like to cache the JSON responses of the index action (GET /foo) and the read action (GET /foo/1) to significantly improve performance. When a resource receives a POST or a PUT, the cache entries for both the index and the read results need to be expired so that no stale content is served.

Is this a scenario that's best handled with a reverse proxy like Squid or Varnish, or would you choose memcached?

Solution

Using a reverse proxy that sits on the HTTP layer is more transparent: you can see what's going on over the wire. The downside is that few reverse proxies support caching authenticated responses, so their efficiency may drop to zero if your resources require authentication. Reverse proxies also don't usually expire resource A (/foo) automatically when resource B (/foo/1), which from the proxy's point of view is completely unrelated, is modified. That's correct behaviour on the proxy's part, so you'd have to add that invalidation logic to your setup yourself.

Both of these problems can be solved if you use memcached, since the cache lives inside your application and isn't bound by the transparency requirement.
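To make that concrete, here is a minimal read-through/invalidate sketch using pymemcache; the load_*/save_* helpers, the key names, and the TTL are hypothetical stand-ins for whatever data layer and policy you already have:

```python
# Minimal sketch, assuming pymemcache and a memcached server on localhost:11211.
# The load_* / save_* helpers are hypothetical stand-ins for your database layer.
import json
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))

INDEX_KEY = "foo:index"     # cache key for GET /foo
ITEM_KEY = "foo:{id}"       # cache key pattern for GET /foo/<id>


def get_foo_index():
    """GET /foo: serve the cached JSON if present, otherwise build and cache it."""
    cached = cache.get(INDEX_KEY)
    if cached is not None:
        return cached.decode("utf-8")
    body = json.dumps(load_all_foos_from_db())          # hypothetical DB call
    cache.set(INDEX_KEY, body.encode("utf-8"), expire=300)  # safety TTL
    return body


def get_foo(foo_id):
    """GET /foo/<id>: same read-through pattern for a single resource."""
    key = ITEM_KEY.format(id=foo_id)
    cached = cache.get(key)
    if cached is not None:
        return cached.decode("utf-8")
    body = json.dumps(load_foo_from_db(foo_id))          # hypothetical DB call
    cache.set(key, body.encode("utf-8"), expire=300)
    return body


def update_foo(foo_id, payload):
    """POST/PUT on /foo/<id>: write to the DB, then expire both cache entries."""
    save_foo_to_db(foo_id, payload)                      # hypothetical DB call
    cache.delete(ITEM_KEY.format(id=foo_id))             # stale item entry
    cache.delete(INDEX_KEY)                              # the index lists the item, so expire it too
```

Because both reads and writes go through your own code, expiring /foo when /foo/1 changes is a one-line delete rather than a proxy configuration problem.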

OTHER TIPS

I would go for a reverse proxy like Varnish because you can implement (and test) your service without involving any cache logic, and add caching as a separate layer. You can upgrade or restart your service while Varnish keeps serving old results for GET requests (great for availability), and it's easy to set up rules in Varnish to invalidate (purge) existing cached results based on specific GET/POST actions.
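One way to wire up that invalidation is to have the service issue PURGE requests after each write. This is only a sketch: the address is an assumption, and Varnish will only honour PURGE if its vcl_recv is configured to allow it from the application host.

```python
# Sketch, assuming Varnish listens on localhost:6081 and its VCL permits PURGE
# from this client. The port and paths below are assumptions.
import requests

VARNISH = "http://localhost:6081"


def purge_foo_cache(foo_id):
    """After a POST/PUT, ask Varnish to drop the cached index and item responses."""
    for path in ("/foo", f"/foo/{foo_id}"):
        # PURGE is not a standard HTTP method; vcl_recv must explicitly allow it.
        requests.request("PURGE", VARNISH + path, timeout=2)
```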

If you want to employ distributed in-memory caching, memcached is a good solution. https://github.com/cpatni/middleman is a reverse proxy that uses memcached for caching.
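If you do go the distributed route, the client side can look like the sketch below, using pymemcache's HashClient to hash keys consistently across several memcached nodes (the node addresses are hypothetical):

```python
# Sketch of distributed memcached, assuming pymemcache and two hypothetical nodes.
from pymemcache.client.hash import HashClient

cache = HashClient([("cache1.internal", 11211), ("cache2.internal", 11211)])

# Each key is routed to one node based on its hash, so the cache scales horizontally.
cache.set("foo:index", b'[{"id": 1}]', expire=300)
print(cache.get("foo:index"))
```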
