I set up Nginx as a caching proxy for our web servers.
Everything works fine when using a standard web browser (Firefox,
IE, Chrome): pages are correctly cached according to the rules I have
defined.
However, I noticed that non-browser tools that retrieve web pages (wget,
curl and some monitoring tools) never get cached pages and always
fetch content from the backend. In the log file,
upstream_cache_status shows MISS for them (although if I first request
the page with Firefox, wget and curl then receive the cached page).
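For reference, here is roughly how I check the cache status per request without tailing the log: nginx can expose $upstream_cache_status as a response header. This is only a minimal sketch; the location block, upstream name and cache zone name are placeholders, not my actual config:

```nginx
location / {
    proxy_pass http://backend;        # placeholder upstream
    proxy_cache my_cache;             # placeholder cache zone

    # Expose the cache status (HIT / MISS / EXPIRED / ...) to the client,
    # so `curl -I` or `wget -S` shows it directly in the response headers.
    add_header X-Cache-Status $upstream_cache_status;
}
```

With this in place, `curl -I http://…/page` prints an X-Cache-Status header matching what the access log records.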
At first I thought it was because the monitoring tool and wget
use HTTP/1.0, but curl uses HTTP/1.1 and the result is the
same.
Is there a special parameter to allow such tools to be served cached pages?