Can multiple nginx instances share a single proxy cache store?

Hi,

I am planning to deploy multiple nginx servers (10) to proxy a group of
Apache servers (20). The Apaches host about 4000 vhosts, with a total
volume of about 1 TB.
One scenario I am considering for hard-disk storage of the proxy cache
would be a single storage device, e.g. a NAS, connected to all 10 nginx
servers via Fibre Channel.

This would have the advantage of pulling items into the cache only once,
and would also avoid cache inconsistencies, which could be at least a
temporary problem if each of the 10 nginx servers had its own cache.

My question now is: would this work in theory?
Can multiple nginx instances share a single proxy cache store?
My concern is cache management: all 10 nginx instances would try to
manage the same cache directory, and I don't know enough about nginx's
cache management to tell whether that causes problems.

Strictly speaking this is a second question, but still: the alternative
would be to give each nginx local storage for the proxy cache (e.g. a
RAID 5, or even JBOD, just a bunch of disks). This would obviously be
much simpler to set up and manage, and thus more robust (the shared
storage would be a single point of failure).
Which would you recommend?

Isaac

Hello!

On Tue, Jul 31, 2012 at 03:35:25PM +0200, Isaac H. wrote:

> This would have the advantage of only pulling items into cache once,
> and would also avoid cache inconsistencies, which could at least be
> a temporal problem if all 10 nginx servers would have their own
> cache.
>
> My question now is: would this work in theory?
> Can multiple nginx instances share a single proxy cache store?

No.

> Which would you recommend?
Use local storage.
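[Not part of the original reply: a minimal sketch of the per-instance local cache Maxim recommends. The paths, zone name, sizes, and backend addresses are all illustrative, not taken from the thread.]

```nginx
# One proxy cache per nginx host, kept on local disk.
proxy_cache_path /var/cache/nginx/proxy levels=1:2
                 keys_zone=vhost_cache:100m max_size=200g inactive=7d;

upstream apache_backend {
    server 10.0.0.21:80;   # hypothetical Apache backends
    server 10.0.0.22:80;
}

server {
    listen 80;

    location / {
        proxy_pass http://apache_backend;
        proxy_cache vhost_cache;
        proxy_cache_key $scheme$host$request_uri;
        proxy_cache_valid 200 301 10m;
    }
}
```

Each nginx instance then manages only its own cache directory, which avoids the shared-management problem entirely at the cost of each instance fetching an item once.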

The main disadvantage of network storage pretending to be a local
filesystem is blocking I/O. Even with fully working AIO (in contrast
to the one available under Linux, which requires directio), there are
still blocking operations like open()/fstat(), and this will likely
result in suboptimal nginx performance even if just serving static
files from such storage.
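[For reference, the Linux AIO caveat mentioned above corresponds to directives like the following; the location and sizes are illustrative.]

```nginx
location /static/ {
    # On Linux, nginx AIO only takes effect together with O_DIRECT
    # reads, so directio must be enabled for aio to do anything.
    aio on;
    directio 512k;          # use O_DIRECT for files larger than 512k
    output_buffers 1 128k;
}
```

Even with this in place, open() and fstat() on the cached files remain synchronous, which is the blocking behavior Maxim is pointing at.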

Maxim D.

Thank you Maxim, I have made my decision now!

Isaac

Maybe you can try the srcache module:
GitHub - openresty/srcache-nginx-module: Transparent subrequest-based caching layout for arbitrary nginx locations.

This module can store and fetch cached pages from a remote memcached.
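[A minimal sketch of such a setup, assuming srcache-nginx-module and memc-nginx-module are both compiled in; the location names, cache key, and memcached address are illustrative.]

```nginx
location / {
    set $key $host$request_uri;
    srcache_fetch GET /memc $key;   # try memcached before the backend
    srcache_store PUT /memc $key;   # store the response afterwards

    proxy_pass http://apache_backend;
}

location = /memc {
    internal;

    set $memc_key $query_string;   # srcache passes the key as the query string
    set $memc_exptime 300;         # expire cached entries after 5 minutes
    memc_pass 127.0.0.1:11211;
}
```

With a shared memcached (or a memcached cluster), all 10 nginx instances would effectively share one cache without touching a common filesystem.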

Thanks.

2012/7/31 Isaac H. [email protected]: