Question

Let's say you have a fragment of the page which displays the most recent posts, and you expire it in 30 minutes. I'm using Rails here.

<% cache("recent_posts", :expires_in => 30.minutes) do %>
  ...
<% end %>

Obviously you don't need to do the database lookup to get the most recent posts if the fragment exists, so you should be able to avoid that overhead too.

What I'm doing now is something like this in the controller, which seems to work:

unless Rails.cache.exist? "views/recent_posts"
  @posts = Post.find(:all, :limit=>20, :order=>"updated_at DESC")
end

Is this the best way? Is it safe?

One thing I don't understand is why the key is "recent_posts" for the fragment but "views/recent_posts" when checking later; I came up with this after watching memcached -vv to see what it was using. Also, I don't like the duplication of manually entering "recent_posts"; it would be better to keep that in one place.

Ideas?


Solution

Evan Weaver's Interlock Plugin solves this problem.

You can also implement something like this yourself easily if you need different behavior, such as more fine-grained control. The basic idea is to wrap your controller code in a block that is only actually executed if the view needs that data:

# in FooController#show
@foo_finder = lambda{ Foo.find_slow_stuff }

# in foo/show.html.erb
<% cache 'foo_slow_stuff' do %>
  <% @foo_finder.call.each do |foo| %>
    ...
  <% end %>
<% end %>

If you're familiar with the basics of Ruby metaprogramming, it's easy enough to wrap this up in a cleaner API to your taste.
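For instance, here is a minimal sketch of one such wrapper (the `finder` helper and the `@<name>_finder` naming are purely illustrative, not from any plugin):

# in ApplicationController (illustrative helper, not a Rails API)
class ApplicationController < ActionController::Base
  private

  # Stash an expensive finder as a lambda in @<name>_finder so the view
  # only pays for the query on a cache miss.
  def finder(name, &block)
    instance_variable_set("@#{name}_finder", block)
  end
end

# in FooController
def show
  finder(:foo) { Foo.find_slow_stuff }
end

# the view then calls @foo_finder.call inside its cache block, exactly as above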

This is superior to putting the finder code directly in the view:

  • keeps the finder code where developers expect it by convention
  • keeps the view ignorant of the model name/method, allowing more view reuse

I think cache_fu might have similar functionality in one of its versions/forks, but I can't recall specifically.

The advantage you get from memcached is directly related to your cache hit rate. Take care not to waste your cache capacity and cause unnecessary misses by caching the same content multiple times. For example, don't cache a set of record objects as well as their html fragment at the same time. Generally fragment caching will offer the best performance, but it really depends on the specifics of your application.
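As a rough sketch of the kind of duplication to avoid (the key names here are just examples):

# Redundant: if the rendered fragment is already cached by the view, also
# caching the raw records spends memcached capacity on the same content twice.
Rails.cache.write("recent_posts_records",
                  Post.find(:all, :limit => 20, :order => "updated_at DESC"))
# ...while the view separately caches the rendered HTML under "views/recent_posts".
# Prefer one or the other -- usually the rendered fragment.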

OTHER TIPS

What happens if the cache expires between the time you check for it in the controller and the time it's checked during view rendering?

I'd make a new method in the model:

  class Post
    def self.recent(count)
      find(:all, :limit => count, :order => "updated_at DESC")
    end
  end

then use that in the view:

<% cache("recent_posts", :expires_in => 30.minutes) do %>
  <% Post.recent(20).each do |post| %>
     ...
  <% end %>
<% end %>

For clarity, you could also consider moving the rendering of a recent post into its own partial:

<% cache("recent_posts", :expires_in => 30.minutes) do %>
  <%= render :partial => "recent_post", :collection => Post.recent(20) %>
<% end %>
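The partial itself might look something like this (a sketch; the title attribute and the markup are assumed):

<%# app/views/posts/_recent_post.html.erb -- the collection exposes each record as `recent_post` %>
<li>
  <%= link_to h(recent_post.title), post_path(recent_post) %>
  <small>updated <%= time_ago_in_words(recent_post.updated_at) %> ago</small>
</li>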

You may also want to look into the Fragment Cache Docs, which allow you to do this:

<% cache("recent_posts", :expires_in => 30.minutes) do %>
  ...
<% end %>

And in the controller:

unless fragment_exist?("recent_posts")
  @posts = Post.find(:all, :limit=>20, :order=>"updated_at DESC")
end

Although I admit the DRY issue still rears its head, since the key name is needed in two places. I usually do this similar to how Lars suggested, but it really depends on taste; other developers I know stick with checking fragment_exist?.
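One way to get the key name down to a single place is a small constant (just one possible arrangement, not a Rails convention):

# e.g. in the Post model
class Post
  RECENT_CACHE_KEY = "recent_posts"
end

# controller
unless fragment_exist?(Post::RECENT_CACHE_KEY)
  @posts = Post.recent(20)
end

# view
<% cache(Post::RECENT_CACHE_KEY, :expires_in => 30.minutes) do %>
  ...
<% end %>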

Update:

If you look at the fragment caching source, you can see how the views/ prefix gets added for you, so you don't need to include it in the key yourself:

# File vendor/rails/actionpack/lib/action_controller/caching/fragments.rb, line 33
def fragment_cache_key(key)
  ActiveSupport::Cache.expand_cache_key(key.is_a?(Hash) ? url_for(key).split("://").last : key, :views)
end
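So, for example, from a controller (assuming no RAILS_CACHE_ID / RAILS_APP_VERSION environment variables are set, which would also be prepended to the key):

fragment_cache_key("recent_posts")  # => "views/recent_posts"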

Lars makes a really good point about there being a slight chance of failure using:

unless fragment_exist?("recent_posts")

because there is a gap between when you check the cache and when you use the cache.

The plugin that Jason mentions (Interlock) handles this very gracefully: it assumes that if you check for the existence of a fragment, you will probably also use it, and so it caches the content locally. I use Interlock for these very reasons.

Just a thought:

In ApplicationController, define:

def when_fragment_expired( name, time_options = nil )
  # idea: avoid race conditions
  # downside: needs 2 cache lookups
  # in the view we actually cache indefinitely,
  # but expire via a 2nd, time-based guard fragment written in the controller
  return if ActionController::Base.cache_store.exist?( 'fragments/' + name ) && ActionController::Base.cache_store.exist?( fragment_cache_key( name ) )

  # the time-based guard fragment uses different time options
  time_options = time_options - Time.now if time_options.is_a?( Time )

  # set an artificial fragment which expires after the given time
  ActionController::Base.cache_store.write( 'fragments/' + name, 1, :expires_in => time_options )

  # expire the real view fragment and let the caller regenerate the data
  ActionController::Base.cache_store.delete( 'views/' + name )
  yield
end

Then in any action use:

def index
  when_fragment_expired "cache_key", 5.minutes do
    @object = YourObject.expensive_operations
  end
end

and in the view:

cache "cache_key" do
view_code
end