Today we're releasing a backport gem of a new feature in Rails 5—cache keys for ActiveRecord collections. It tracks fairly closely with the implementation seen in the Rails source but has a few minor differences arising from support back to AR 3. We've certainly been missing this feature for a long time and hope it comes in handy.
Rails already has fairly advanced fragment caching available at the view layer—if you wanted to cache the rendered representation of a collection of objects, you could do something like this:
<table>
  <% cache @cached_things do %>
    <% @cached_things.each do |cached_thing| %>
      <tr>
        <td><%= cached_thing.name %></td>
        <td><%= cached_thing.description %></td>
        <td><%= cached_thing.status %></td>
      </tr>
    <% end %>
  <% end %>
</table>
With caching enabled, you'd see a fragment read/write in your console that looks something like the following:
views/cached_things/1-20160404161453044590000/cached_things/2-20160404161002789386000/fda68e69e4c5ced06fcdd5a88ebd2fe5
That key includes a few data points. There's a key for each record in the collection that includes the id and updated_at timestamp. There's also a digest of the template fragment's content, so the cache can be invalidated when the cached part of the template changes. This is pretty great, but it requires that each instance of CachedThing be initialized by ActiveRecord. As collection sizes grow, this begins to significantly cut into the benefits of caching. Implementing #cache_key at the collection level allows us to cache upstream of AR object initialization, making rendering from cache a matter of linear time.
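To make the contrast concrete, here's a minimal console sketch, using the CachedThing model from the example above (this is the idea, not the gem's actual internals): the per-record approach has to load and build every object just to read its key attributes, while a collection-level key can be derived from aggregate values alone.
# Per-record fragment keys: every row is loaded and a CachedThing is
# instantiated just to read its id and updated_at.
CachedThing.where(status: 'published').map(&:cache_key)

# Collection-level key (the idea behind this gem): only aggregate values
# are fetched, so no CachedThing objects are built at all.
scope = CachedThing.where(status: 'published')
scope.count                  # number of matching rows
scope.maximum(:updated_at)   # most recent change within the collection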
Add it to your Gemfile:
gem 'activerecord-collection_cache_key'
and run bundle. The gem will be autorequired and will augment ActiveRecord::Base with a .collection_cache_key method, as well as adding a #cache_key method to ActiveRecord::Relation.
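Here's roughly what that looks like from the console. The output is illustrative, and the collection_cache_key signature shown here (a scope plus an optional timestamp column) is an assumption that it mirrors the Rails 5 method.
# Relation-level key
CachedThing.where(status: 'published').cache_key
# => "cached_things/query-<md5 of the relation's SQL>-<count>-<max updated_at>"

# Class-level form: pass a scope and, optionally, a timestamp column
CachedThing.collection_cache_key(CachedThing.where(status: 'published'), :updated_at)
# => same format as above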
You can call #cache_key on any ActiveRecord relation to get a deterministic key for use with the Rails cache interface, and caching of a response can be implemented at the controller layer for maximum performance gains:
def index
  @collection = CachedThing.where(status: 'published')

  Rails.cache.fetch(@collection.cache_key) do
    respond_with(@collection)
  end
end
This is a simplistic example; a real-world implementation would most likely need to handle things like response format, query params, or anything else that could change the output for the same collection. You'll be able to read all about how we're using it in an upcoming post.
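In the meantime, here's a rough sketch of what folding those variables into the key might look like. The serialization and the page/sort params are purely illustrative; the point is that anything affecting the output belongs in the key alongside the collection's cache_key.
def index
  @collection = CachedThing.where(status: 'published')

  # Illustrative only: combine the collection key with everything else
  # that changes the rendered output for the same collection.
  key = [
    @collection.cache_key,
    request.format.to_sym,   # response format
    params[:page],           # hypothetical pagination param
    params[:sort]            # hypothetical ordering param
  ].compact.join('/')

  payload = Rails.cache.fetch(key) do
    @collection.to_json
  end

  render json: payload
end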
Cache keys look the same as the official Rails implementation:
#{model_name}/query-#{md5_digest}-#{collection_size}-#{timestamp}
Where the MD5 digest is a hash of the output of Relation#to_sql, and the timestamp is the most recent updated_at (or a :timestamp_column of your choice) of the matching records in the collection.
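Concretely, the segments map roughly like this (a console sketch using the same illustrative scope as before):
require 'digest/md5'

scope = CachedThing.where(status: 'published')
Digest::MD5.hexdigest(scope.to_sql)   # md5_digest: fingerprint of the relation's SQL
scope.count                           # collection_size
scope.maximum(:updated_at)            # source of the timestamp segment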
While keys from this gem follow the same format as the Rails implementation, they are not identical. There are three notable ways in which they differ:
The SQL digest for full-collection caches includes where (1 = 1) in its source. This is because in ActiveRecord 3, Model.all returns an instance of Array, not ActiveRecord::Relation; adding a where clause to unbound collections forces the correct type (see the snippet after these notes).
The collection_size integer for limited queries will be the full collection size, not the size of the limited collection. There are documented edge cases where the key for a limited collection may not change when its contents do. We've found that including the total size of the collection in the key and letting the SQL digest handle uniqueness around limits and offsets suffices to yield properly deterministic caching.
The default cache_timestamp_format for Rails 3.2 is :number rather than :nsec. This is because the cache_timestamp_format method was introduced in AR 3.2, but the :nsec formatter did not yet exist. For consistency with other keys in 3.2, we've left the default untouched there. All other versions of AR will implement an :nsec formatter and use it as the default.
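To make the first difference concrete, this is roughly the ActiveRecord 3 behaviour that forces the extra clause (console sketch, output illustrative):
# ActiveRecord 3.x only
CachedThing.all.class              # => Array -- already loaded, nothing to digest
CachedThing.where('1 = 1').class   # => ActiveRecord::Relation -- still lazy, so
                                   #    its SQL can be digested for the key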
Give the code a look, try it out, and let us know what you think!