Forum: NGINX Question about proxy cache when it expires

Jérôme Loyet (Guest)
on 2009-05-12 12:45
(Received via mailing list)
Hello igor,

I have a question about the cache behaviour in proxy mode.

I have nginx as a front end that proxies to an Apache back end. nginx
caches everything for M minutes.

If I have a large number of requests for the same page and the page
is cached: nginx returns the cached page ... no problem.
After M minutes, the cached page expires.
The first request coming after the expiration makes nginx ask the
backend for a refresh.
When nginx receives the fresh response from the backend, it is saved
to the cache and nginx then serves the fresh cached page.

But what happens between the start of the request to the backend and
the end of the response from the backend? (Let's assume the backend
serves the page in 5 s ... and in 5 s I can get a lot of requests for
this page.)
- Are the requests queued waiting for the backend response?
- Does every request try to refresh the cache from the backend? (In
this case, I would have multiple requests for the same page hitting
the backend ... a burst of requests could overwhelm my Apache --
that's why I'm using nginx with a cache.)
- Do the requests get served the cached page, even though it's expired,
until the backend response has been received?
- Maybe something else :)
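
For reference, the setup described above corresponds to a configuration
along these lines (a sketch only: the zone name, port, and the M-minute
lifetime, here 10m, are placeholders):

```nginx
# Sketch of the setup described above; zone name, backend port and
# the cache lifetime are placeholders, not from the original post.
proxy_cache_path /var/cache/nginx keys_zone=pages:10m;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;  # Apache back end
        proxy_cache pages;
        proxy_cache_valid 200 10m;         # "caches everything for M minutes"
    }
}
```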

Thanks for your answer.

++ jerome
Igor S. (Guest)
on 2009-05-12 18:42
(Received via mailing list)
On Tue, May 12, 2009 at 10:38:04AM +0200, Jérôme Loyet wrote:

> - Does every request try to refresh the cache from the backend? (In
> this case, I would have multiple requests for the same page hitting
> the backend ... a burst of requests could overwhelm my Apache --
> that's why I'm using nginx with a cache.)
> - Do the requests get served the cached page, even though it's expired,
> until the backend response has been received?
> - Maybe something else :)

Currently, all requests that find that a cached response has expired
are proxied to the backend. I plan to implement busy locks to pass a
single request and leave the others to wait for the response up to a
specified time.
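
The busy-lock behaviour Igor describes here later shipped in nginx as
the `proxy_cache_lock` directive. A minimal sketch of how it is used
(the zone name and timings are illustrative, not from this thread):

```nginx
proxy_cache_path /var/cache/nginx keys_zone=pages:10m;

server {
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache pages;
        proxy_cache_valid 200 5m;
        # Only one request at a time is allowed to populate a given
        # cache element; the others wait for it, up to the lock timeout.
        proxy_cache_lock on;
        proxy_cache_lock_timeout 5s;
    }
}
```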
Jérôme Loyet (Guest)
on 2009-05-12 18:57
(Received via mailing list)
OK thanks for the answer.

I'm ready to test this new feature, which could be very beneficial
to us :)

++ Jerome

2009/5/12 Igor S. <removed_email_address@domain.invalid>:
张立冰 (Guest)
on 2009-05-13 05:31
(Received via mailing list)
Igor, thanks.

2009/5/12 Igor S. <removed_email_address@domain.invalid>
"坏人" (Guest)
on 2009-05-13 11:37
(Received via mailing list)
[Translated from Chinese:]
When the cache expires, all requests are forwarded to the backend;
whether they can be handled is up to Apache.
If the number of requests is huge and generating the cache takes a
long time, say 5 seconds, I suggest you rethink your caching approach.
A few ideas for you:
1) Have PHP pre-generate the cache (the key point)
2) Use memcached to cache pages in memory
3) Use Perl to control expiry and caching, reducing backend load
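
The memcached idea above can be sketched with nginx's stock memcached
module. This is an assumed layout, not from the thread: it supposes the
PHP application stores rendered pages in memcached under keys equal to
the request URI, falling back to Apache on a miss:

```nginx
# Sketch only: assumes the backend application pre-populates memcached
# with rendered pages keyed by URI (idea 1 + idea 2 above combined).
location / {
    set $memcached_key $uri;
    memcached_pass 127.0.0.1:11211;
    default_type text/html;
    # On a cache miss (or memcached failure), fall through to Apache.
    error_page 404 502 504 = @apache;
}

location @apache {
    proxy_pass http://127.0.0.1:8080;
}
```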

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,1952,1970#msg-1970
Jérôme Loyet (Guest)
on 2009-05-13 11:45
(Received via mailing list)
this is an ENGLISH mailing list. Please use English so that everybody
here can understand what you want to say!

++ Jerome

2009/5/12 "坏人" <removed_email_address@domain.invalid>:
"坏人" (Guest)
on 2009-05-13 17:30
(Received via mailing list)
J Wrote:
-------------------------------------------------------
> this is an ENGLISH mailing list. Please use
> english so that everybody
> here can understand what you want to say !!!
>
> ++ Jerome
>
> 2009/5/12 "


[In Chinese:] I don't understand English; my apologies.

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,1952,1983#msg-1983
Jim O. (Guest)
on 2009-05-13 17:50
(Received via mailing list)
坏人 Wrote:
-------------------------------------------------------
>
>
> 我不懂英语,见谅

Translation:

I do not understand English, forgive me

My translation:

This is a troll who understands enough to answer appropriately.

I'm banning him from the forum as I have had enough. Sorry everyone for
the inconvenience.

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,1952,1985#msg-1985
Jérôme Loyet (Guest)
on 2009-05-19 19:20
(Received via mailing list)
> Currently all requests that find that a cached response has expired
> are proxied to the backend. I plan to implement busy locks to pass a
> single request and leave the others to wait for the response up to a
> specified time.
>

Hi igor,

about this feature: do you know when you plan to implement it? I
really need it. If you don't have enough time, I can look into it
myself if you briefly explain how you want to do it.

Thx
++ jerome
Igor S. (Guest)
on 2009-05-20 16:22
(Received via mailing list)
On Tue, May 19, 2009 at 05:14:11PM +0200, Jérôme Loyet wrote:

> Hi igor,
>
> about this feature: do you know when you plan to implement it? I
> really need it. If you don't have enough time, I can look into it
> myself if you briefly explain how you want to do it.

This is a complex thing that I plan to implement in 0.8.
Jérôme Loyet (Guest)
on 2009-05-20 16:42
(Received via mailing list)
OK

I'll try to look into the code to see what I can do. Do you have any
lead to guide me on this quest ? :)

2009/5/20 Igor S. <removed_email_address@domain.invalid>:
Igor S. (Guest)
on 2009-05-20 16:47
(Received via mailing list)
On Wed, May 20, 2009 at 02:32:39PM +0200, Jérôme Loyet wrote:

> OK
>
> I'll try to look into the code to see what I can do. Do you have any
> lead to guide me on this quest ? :)

This is a complex thing. It requires sending notifications from one
worker to another when a busy lock is freed.
Maxim D. (Guest)
on 2009-05-20 17:49
(Received via mailing list)
Hello!

On Wed, May 20, 2009 at 04:36:12PM +0400, Igor S. wrote:

> On Wed, May 20, 2009 at 02:32:39PM +0200, Jérôme Loyet wrote:
>
> > OK
> >
> > I'll try to look into the code to see what I can do. Do you have any
> > lead to guide me on this quest ? :)
>
> This is complex thing. It requires sending notifications from one worker
> to another when busy lock is being freed.

BTW, what about something like "in-process" busy locks?  This would
effectively limit the number of requests simultaneously sent to
backends to the number of worker processes.  At least it looks much
better than nothing, and it should be simpler.

Maxim D.
Igor S. (Guest)
on 2009-05-20 17:58
(Received via mailing list)
On Wed, May 20, 2009 at 05:42:12PM +0400, Maxim D. wrote:

> >
> > This is complex thing. It requires sending notifications from one worker
> > to another when busy lock is being freed.
>
> BTW, what about something like "in-process" busy locks?  This would
> effectively limit the number of requests simultaneously sent to
> backends to the number of worker processes.  At least it looks much
> better than nothing, and it should be simpler.

Yes, they are much simpler, but I want to do it at once.
Arvind Jayaprakash (Guest)
on 2009-06-10 21:29
(Received via mailing list)
On May 12, Igor S. wrote:
> I plan to implement busy locks to pass a single request and leave
> the others to wait for the response up to a specified time.
How about the notion of soft timeouts? Say the TTL is set to 300
seconds. We pick a value, say 10%, and decide that any request received
during the next 30 seconds still gets the stale content without any
waiting; somewhere in that window, we initiate a single request to the
backend and refresh the cache. Beyond this window, we can take the
busy-wait approach.

Squid 2.7.x has something like this.
http://www.squid-cache.org/Versions/v2/2.7/cfgman/...
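
Later nginx releases added a directive along exactly these lines:
`proxy_cache_use_stale updating` serves the stale copy while one
request refreshes an expired entry. A minimal sketch (zone name and
timings are illustrative):

```nginx
proxy_cache_path /var/cache/nginx keys_zone=pages:10m;

server {
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache pages;
        proxy_cache_valid 200 5m;
        # While one request refreshes an expired entry, everyone else
        # gets the stale cached copy instead of hitting the backend.
        proxy_cache_use_stale updating;
    }
}
```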