Nginx Cache not working with wget or curl

Hello,

I set up Nginx as a caching proxy for our web servers.

Everything is working fine when using a standard web browser (Firefox,
IE, Chrome), pages are correctly cached according to the rules I have
defined.

However, I noticed that non-standard tools for retrieving web pages (wget,
curl, and some monitoring tools) never get cached pages and always fetch
content from the backend. In the log file, $upstream_cache_status is MISS
for them (although if I first request the page with Firefox, wget and curl
are then served the cached page).
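For anyone reproducing this, $upstream_cache_status can be made visible per request with a custom log_format. This is just a minimal sketch; the format name `cache` and the log path are illustrative, not from my actual setup:

```nginx
# Illustrative logging config: expose the cache status of each request.
log_format cache '$remote_addr "$request" '
                 'cache=$upstream_cache_status';

access_log /var/log/nginx/cache.log cache;
```

A HIT/MISS/EXPIRED value then appears at the end of each log line.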

At first I thought it was because the monitoring tool and wget use
HTTP/1.0, but curl uses HTTP/1.1 and the result is the same.

Is there a special parameter to allow such tools to get cached pages?

Thank you for your help,

Fred

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,183034,183034#msg-183034

I found how to solve this problem: a cookie was set by the load balancer
in front of our backends.
I added Set-Cookie to proxy_ignore_headers.
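For completeness, here is a minimal sketch of the fix; the cache zone name, paths, and upstream address are illustrative, not my actual configuration:

```nginx
# Illustrative caching proxy config with the Set-Cookie fix applied.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m;

server {
    listen 80;

    location / {
        proxy_pass http://backend;
        proxy_cache my_cache;
        proxy_cache_valid 200 10m;

        # Ignore the Set-Cookie response header so responses carrying the
        # load balancer's cookie are still considered cacheable.
        proxy_ignore_headers Set-Cookie;

        # Optionally also strip the cookie from the cached response, so a
        # cached Set-Cookie is not replayed to other clients:
        # proxy_hide_header Set-Cookie;
    }
}
```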

What I don’t understand, however, is why it’s working with standard web
browsers.

Fred

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,183034,183053#msg-183053

Hello!

On Tue, Mar 15, 2011 at 08:52:59AM -0400, Fred91 wrote:

> I found how to solve this problem: a cookie was set by the load balancer
> in front of our backends.
> I added Set-Cookie to proxy_ignore_headers.
>
> What I don’t understand, however, is why it’s working with standard web
> browsers.

Most likely the standard browsers sent the previously set cookie, so your
backends didn’t try to set a new one.

Maxim D.
