I’m caching static files in a very traditional way, I hope:
location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
    # expires max;
    expires 30m;
    access_log off;
    log_not_found on;
}
I would like to have longer caching, but sometimes there is a fast
turnaround on live sites and clients have trouble busting the cache.
These tend to be Windows-based clients. No matter what they or I do,
they continue to retrieve cached versions of CSS and PNG files.
Is there a better configuration, or anything I am not doing on my end,
to ensure they get served the fresh files?
Well, if you set the cache to expire 30 years from now, the browser
should honor that, provided no Last-Modified header is set.
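As a sketch, a long-lived variant of the block above might look like this (the 1y value and the Cache-Control header are assumptions, not something from the original config):

```
location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
    # Far-future expiry; only safe if asset URLs change when content changes
    expires    1y;
    add_header Cache-Control "public";
    access_log off;
}
```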
I have it set to 30m (30 minutes, right?). If I update, for example,
some CSS files and then do a hard refresh on a) a Mac, I get the new
CSS, or b) Windows 7, I keep getting the cached CSS files.
Is there any technique to bust this caching? It might not be a "pure
nginx" solution, but is there anything that could be supported by an
nginx config setting?
What everyone else does is make the URLs unique and then change the
URL when the asset is updated. E.g. a simple example, which Rails uses
and which arguably isn't perfect, would be:
/assets/blah.jpg?12333234
The string after the ? can be generated in various ways, e.g. an
incrementing counter, or the epoch time of the file's mtime (i.e. its
modification time in seconds).
Now you can set the expire time to 30 years, and when the asset is
updated you simply arrange for the URL to be updated in the HTML, so a
"new" image is pulled down.
Implementation left to the reader as an exercise…
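For what it's worth, a minimal Ruby sketch of the mtime-based variant described above might look like this (the helper name `busted_url` and `ASSET_ROOT` are made up for illustration, not part of any real framework API):

```ruby
# Hypothetical cache-busting helper: append the file's mtime (seconds
# since the epoch) as a query string, so the URL changes whenever the
# file is updated and far-future expires headers become safe.
ASSET_ROOT = "."  # assumed local document root

def busted_url(url_path, root: ASSET_ROOT)
  file = File.join(root, url_path)
  # If the file is missing, fall back to the plain URL.
  return url_path unless File.exist?(file)
  "#{url_path}?#{File.mtime(file).to_i}"
end
```

Rails' built-in asset pipeline does essentially this (fingerprinting in the filename rather than a query string), which avoids some proxies that refuse to cache URLs containing "?".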
Ed W