He’s referring to a cache stampede: when an item in the proxy cache
expires, multiple threads simultaneously try to repopulate it, causing
load spikes and possible locking conditions. The item below from
memcached’s FAQ has a few suggestions for avoiding this with memcached.
After a quick look I didn’t see any options to avoid stampedes in
nginx’s proxy caching or in the memcached module. Does nginx have this
built in? Or when an item in the proxy cache expires, is there a rush of
requests to the back-end to refresh it?
A useful method (if it isn’t already built in) might be to block threads
that have requested the resource being refreshed until the thread doing
the refreshing completes. A configurable timeout might be helpful.