fastcgi_cache_use_stale "updating" - improvement suggestion

Hello All,

fastcgi_cache_use_stale is awesome, especially with the "updating"
parameter. But I have a feeling that it lacks a complementary parameter
(or a separate setting to tune the "updating" behaviour) that would
instruct nginx to quickly return a stale cached response on the first
request as well, while the fastcgi app is busy doing its hard work.
The current behaviour is to return stale cached responses only on
subsequent requests; the first response is delayed until fastcgi
finishes its job.
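
For reference, this is roughly the setup I have in mind today; the
cache path, zone name and PHP-FPM socket below are only examples:

fastcgi_cache_path /var/cache/nginx/fcgi levels=1:2
                   keys_zone=FCGI:10m inactive=60m;

server {
    listen 80;
    server_name example.com;

    location ~ \.php$ {
        fastcgi_pass unix:/run/php-fpm.sock;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

        fastcgi_cache FCGI;
        fastcgi_cache_key $scheme$request_method$host$request_uri;
        fastcgi_cache_valid 200 10m;

        # While one request is busy refreshing an expired entry, other
        # requests for the same key are answered with the stale copy.
        fastcgi_cache_use_stale updating error timeout;
    }
}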

What do you think?

Thank you.
My deepest regards.


Hello!

On Wed, Sep 10, 2014 at 10:21:18AM -0400, nanochelandro wrote:

What do you think?

As of now, nginx needs a client request to be able to request a
resource from a backend and to save it to the cache. That is, this
behaviour is an implementation detail which isn't trivial to change.


Maxim D.
http://nginx.org/

Maxim D. Wrote:

nginx needs a client request to be able to request a
resource from a backend and to save it to the cache.

I'm afraid my explanation wasn't clear enough. There's no need to make
nginx able to make requests to fastcgi on its own initiative.

How it works today:
A client makes a request. Nginx sees the cache has expired and issues
a request to fastcgi. It takes some time, and the client is patiently
waiting. Finally, after nginx gets the response from the fastcgi app,
it stores it in the cache and sends it to the client.

How it can be improved:
A client makes a request. Nginx sees the cache has expired and issues
a request to fastcgi. But nginx doesn't wait for the fastcgi response
and immediately responds to the client with the stale cache contents
(if they exist). The client is like "whoa, that was fast!". Later,
eventually, nginx gets the response from the fastcgi app and updates
the cache.
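
Purely as an illustration (the commented-out directive below does not
exist in nginx; it is only a sketch of the knob I am proposing), the
location block could look like this:

    location ~ \.php$ {
        fastcgi_pass unix:/run/php-fpm.sock;
        include fastcgi_params;

        fastcgi_cache FCGI;
        fastcgi_cache_valid 200 10m;

        # Existing behaviour: while the entry is being refreshed, only
        # other requests for the same key get the stale copy; the
        # request that triggered the refresh waits for fastcgi.
        fastcgi_cache_use_stale updating;

        # Hypothetical directive (does not exist today): answer the
        # triggering request from the stale copy too, and update the
        # cache in the background once fastcgi replies.
        #fastcgi_cache_stale_while_updating on;
    }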


Hello!

On Wed, Sep 10, 2014 at 03:27:10PM -0400, nanochelandro wrote:

How it can be improved:
A client makes a request. Nginx sees the cache has expired and issues
a request to fastcgi. But nginx doesn't wait for the fastcgi response
and immediately responds to the client with the stale cache contents
(if they exist). The client is like "whoa, that was fast!". Later,
eventually, nginx gets the response from the fastcgi app and updates
the cache.

Uhm, it looks like I wasn’t clear enough. What you suggest is
perfectly understood, thanks (and I believe there is even an
enhancement ticket in trac about this). The problem is that nginx
needs a request object (and a connection object) to get/cache a
response, and returning a stale cached response means the request
object will be used to send the cached response.


Maxim D.
http://nginx.org/