I have registered with Uptime Robot. You can have it monitor a URL; it
does this by sending a HEAD request every 5 to 10 minutes and checking
for an OK response. That request populates the fastcgi cache with an
empty response body. If I understand the wiki correctly, I cannot
exclude HEAD requests from caching (Module ngx_http_fastcgi_module).
The only (dubious) fix I can see at the moment is to add an IP
exception to the nginx config. However, that effectively leaves the
site open to abuse by anyone malicious.
Is there a way to cache just GET requests? Or only cache responses
whose body size is greater than zero? Or is there a better way?
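For example, would something along these lines do it? This is untested
on my side; the map block and the $skip_cache name are just a guess,
and the map would have to live at the http level:

# Skip the cache for anything that is not a GET.
# (Hypothetical: $skip_cache is not part of my current config.)
map $request_method $skip_cache {
    default 1;   # HEAD, POST, etc. are neither served from nor stored in the cache
    GET     0;   # only GET responses are cached
}

# Then, next to the fastcgi_cache directives:
fastcgi_cache_bypass $skip_cache;
fastcgi_no_cache     $skip_cache;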
Btw, what kind of cache invalidation are you using, if any? Is it only
time based or do you use nginx no_cache and cache_bypass directives
(Module ngx_http_fastcgi_module) as well?
It is a WordPress configuration, which relies on the
nginx-proxy-cache-purge plugin to purge pages on update.
/etc/nginx/my.conf# cat wordpress-fastcgi-cache.conf
# Segment to include in all WP installs.
# Configures php-fpm (socket) with fastcgi caching.
set $nocache "";
if ($http_cookie ~ (comment_author_.*|wordpress_logged_in.*|wp-postpass_.*)) {
    set $nocache "Y";
}
fastcgi_pass unix:/usr/local/var/run/php-fpm.sock;
fastcgi_index index.php;
fastcgi_param CONTENT_LENGTH $content_length;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
fastcgi_param PATH_INFO $fastcgi_script_name;
include fastcgi_params;
fastcgi_cache_use_stale error timeout invalid_header http_500;
fastcgi_cache_key $request_method$host$request_uri;
fastcgi_cache WORDPRESS;
fastcgi_cache_valid 200 10m;
fastcgi_ignore_headers Expires Cache-Control;
fastcgi_cache_bypass $nocache;
fastcgi_no_cache $nocache;
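The purge side is handled by a location along these lines; this is
only a sketch, and it assumes nginx is built with the third-party
ngx_cache_purge module, that the plugin requests /purge/<uri>, and
that the purge key matches the fastcgi_cache_key above:

# Sketch only: requires the third-party ngx_cache_purge module.
location ~ ^/purge(/.*) {
    allow 127.0.0.1;   # keep purging local
    deny all;
    # Purge the entry stored for GET requests of the captured URI.
    fastcgi_cache_purge WORDPRESS "GET$host$1";
}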