Advice on memcached caching

Hi! I need to cache an overview page that lists the latest events
(messages, documents, etc). Next to each event I show the user who
created it (including the user’s avatar). I have tried to cache this
with the cache_key of the page that holds the contents. On the
belongs_to association of each event I added :touch => true so the page
gets updated when something changes.
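
Roughly what I have right now (a sketch; the real model and attribute
names differ, and the “page” here is itself a model):

  # app/models/event.rb
  class Event < ActiveRecord::Base
    belongs_to :page, :touch => true   # saving an event touches the page,
                                       # so the page's cache_key changes
    belongs_to :user                   # creator; I show name and avatar
  end

  <%# app/views/pages/show.html.erb %>
  <% cache @page.cache_key do %>
    <% @page.events.each do |event| %>
      <%= image_tag event.user.avatar_url %>
      <%= h event.user.name %>: <%= h event.body %>
    <% end %>
  <% end %>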

This works great until someone changes their personal information or
avatar image. So the problem is: how do I invalidate the cache when an
association gets updated?

Cheers,
Fredrik Martenson

fredd wrote:

This works great until someone changes their personal information or
avatar image. So the problem is: how do I invalidate the cache when an
association gets updated?

A page cache is a big hammer when maybe 1 attribute of 1 related record
has changed. What if you cached the rendered row for each event instead?
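
Something like this in the view, say (a sketch; the partial content and
attribute names are made up):

  <% @events.each do |event| %>
    <% cache "event-row/#{event.id}" do %>
      <%= image_tag event.user.avatar_url %>
      <%= h event.user.name %>: <%= h event.body %>
    <% end %>
  <% end %>

Each row gets its own fragment keyed by the event id, so a change to one
user or one event only ever has to invalidate the handful of rows it
actually appears in.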

Users are related to Events, no? And the User model knows what other
models it is related to, and the IDs of those related records. So when a
User receives a change to something that appears in some other model’s
cache somewhere, it could invalidate (delete) that cache.
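
A sketch of that idea, assuming the per-row keys above, Rails.cache
(2.1+), and the older *_changed? dirty-tracking API; in the Rails
versions I’ve used the fragment helpers prefix keys with "views/". A
cache Sweeper would be the other common place to put this:

  class User < ActiveRecord::Base
    has_many :events

    after_update :expire_event_row_fragments

    private

    # Only fire when a field that is actually rendered in the cached
    # rows has changed (name/avatar_url are assumptions about the schema).
    def expire_event_row_fragments
      return unless name_changed? || avatar_url_changed?
      event_ids.each do |id|
        Rails.cache.delete("views/event-row/#{id}")
      end
    end
  end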

I did have a debate with a colleague over which model should be
invalidating which cache fragment. My take is that the model whose data
is shown in a particular cache fragment is responsible for invalidating
that fragment, not the model whose view the fragment appears in, if that
makes any sense.

I have a side bar for ‘show’ views that lists models ‘related’ to the
model being shown. Determining and rendering that sidebar can be very
expensive depending on the number of related models, and instances of
those related models.

That sidebar uses fragment caching at 2 levels (rough sketch after the
list):

  1. the overall ‘relateds’ cache, which is the whole sidebar
  2. a ‘model type’ cache, which is all the related models of type X,
    which is a portion of the sidebar (each model shows its name, your User
    seems to show the name and avatar)
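
In view terms that comes out as nested cache blocks, roughly like this
(a sketch; related_types and related_of_type are stand-ins for however
you gather the related records):

  <%# Sidebar for a project: one outer fragment plus one per related type. %>
  <% cache "project.#{@project.id}.relateds" do %>
    <% @project.related_types.each do |type| %>
      <% cache "project.#{@project.id}.related.#{type.name.tableize}" do %>
        <%= render :partial => 'shared/related_group',
                   :locals  => { :records => @project.related_of_type(type) } %>
      <% end %>
    <% end %>
  <% end %>

A miss on the outer key rebuilds the sidebar but still reuses whichever
inner fragments are present, which is the behavior described below.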

Generally each model manages its own fragment caches. When data for a
model changes, the model knows what other models it is related to, and
their IDs. Given the relationship, it also knows which types of
fragment caches its data appears in, and it invalidates (deletes) those
fragments when that data changes. Not all fields trigger this behavior,
just those that appear in fragment caches.

Suppose ‘Feature’ 7 is related to ‘Project’ 13. Feature 7 appears in the:

  project.13.relateds cache fragment (the whole sidebar)
  project.13.related.features cache fragment (the features portion of
  the sidebar)

When Feature 7’s name changes, it knows it is related to Project 13, and
deletes the relateds fragment for Project 13 and the related.features
sub-fragment for Project 13.
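
In code that is something like the following (a sketch, same caveats as
the earlier one; it assumes the key scheme above and that name is the
only field shown in the fragments):

  class Feature < ActiveRecord::Base
    belongs_to :project

    after_update :expire_project_fragments

    private

    def expire_project_fragments
      return unless name_changed?   # only fields that appear in the fragments
      Rails.cache.delete("views/project.#{project_id}.relateds")
      Rails.cache.delete("views/project.#{project_id}.related.features")
    end
  end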

When project 13 needs to show its sidebar, it finds the sidebar cache
fragment missing and reconstructs it. The reconstruction finds that the
sidebar for a Project is actually composed of 9 sub-fragments (Projects
have 9 related models).

Eight of those cache fragments are found, but Project 13 reconstructs
the Features sub-fragment (Feature 7 was only 1 of 11 related Features)
and caches it, then finishes the entire sidebar and caches that as well.

The next time project 13 ‘shows’, it’ll find the full sidebar cache
fragment and just use that.

Fragment caching really helped the performance. Displaying project 13
(40 related models across 9 related types) with all fragment caches
deleted: “Completed in 1728ms (View: 164, DB: 300)”. Displaying project
13 with all cache fragments present: “Completed in 76ms (View: 66,
DB: 3)”.

And this is all file-based caching… haven’t needed memcached yet.
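
If that ever changes, it should mostly be a one-line cache store switch,
something like this in the environment config (assuming Rails 2.1+;
paths and the memcached address are just placeholders):

  # config/environments/production.rb
  config.cache_store = :file_store, "#{RAILS_ROOT}/tmp/cache"     # current setup
  # config.cache_store = :mem_cache_store, 'localhost:11211'      # if memcached becomes necessary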