How to make Nginx work with distributed/multiple memcached servers?

Hi,

I want to use Nginx as a caching reverse proxy to a Django website that
runs behind Nginx over uWSGI.

Is it possible for Nginx to use multiple memcached servers to cache the
dynamic pages generated by Django? If so, can you share some example
config? All the examples I’ve seen use only one local memcached
server.

And if it’s possible to use multiple memcached servers, who will be
doing the storing and invalidation of cached pages - Nginx or Django?
And if it’s Django, how do I make sure that, given the same key, both
Django and Nginx will hash to the same memcached server?

I’m new to this. Any help would be greatly appreciated. Thanks!

Andy

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,191573,191573#msg-191573

On 4/15/11 2:09 PM, “Andy” [email protected] wrote:

Is it possible for Nginx to use multiple memcached servers to cache the
dynamic pages generated by Django? If so, can you share some example
config? All the examples I’ve seen use only one local memcached
server.

FWIW, I use moxi for nginx and whatever app I’m using.

http://www.couchbase.com/downloads/moxi-server/community


Brian A.

On Sat, Apr 16, 2011 at 2:09 AM, Andy [email protected] wrote:

Hi,

I want to use Nginx as a caching reverse proxy to a Django website that
runs behind Nginx over uWSGI.

You can try out ngx_srcache + ngx_memc to do such caching.

Is it possible for Nginx to use multiple memcached servers to cache the
dynamic pages generated by Django?

Sure, see these two slide pages for an example:

http://agentzh.org/misc/slides/nginx-state-of-the-art/#25
http://agentzh.org/misc/slides/nginx-state-of-the-art/#26

If so, can you share some example
config? All the examples I’ve seen use only one local memcached
server.

See above.

And if it’s possible to use multiple memcached servers, who will be
doing the storing and invalidation of cached pages - Nginx or Django?

If ngx_srcache is used, nginx handles the caching entirely on its own.

And if it’s Django, how do I make sure that, given the same key, both
Django and Nginx will hash to the same memcached server?

If you want your Python app to access memcached as well, the
recommended approach is to pass the nginx variable holding the hashed
memcached upstream name back to your fastcgi app.

When a cache hit happens and the request does not even reach your
Python app, your server can easily do 10k ~ 20k req/sec on a simple
machine ;)

Cheers,
-agentzh

Thanks, agentzh. Our project also uses the memc and srcache modules.
The only regret is that there is no way to fetch multiple keys in a
single request. Could multi-key fetching (mget) be implemented,
perhaps as an extension module?

On Mon, Apr 18, 2011 at 1:15 PM, agentzh [email protected] wrote:

On Sat, Apr 16, 2011 at 2:09 AM, Andy [email protected] wrote:

Hi,

I want to use Nginx as a caching reverse proxy to a Django website that
runs behind Nginx over uWSGI.

You can try out ngx_srcache + ngx_memc to do such caching.

Sorry, forgot to give the links to these two nginx C modules:

http://wiki.nginx.org/NginxHttpMemcModule

http://github.com/agentzh/srcache-nginx-module

Cheers,
-agentzh

Thanks, agentzh, for the pointers. Looking forward to the mget
feature. Thanks!

2011/4/18 agentzh [email protected]

2011/4/18 [email protected]:

Thanks, agentzh. Our project also uses the memc and srcache modules.
The only regret is that there is no way to fetch multiple keys in a
single request. Could multi-key fetching (mget) be implemented,
perhaps as an extension module?

Indeed, support for the memcached mget command is a TODO item for my
ngx_memc module :) Well, I’ll work on it ;)
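Until mget support lands, an application that talks to memcached directly can emulate it by issuing individual gets and collecting the hits. A minimal sketch against a generic client exposing get() — the FakeClient below is purely illustrative, standing in for a real memcached client library with the same interface:

```python
def multi_get(client, keys):
    """Emulate memcached's mget: fetch each key with a single get()
    and return a dict containing only the keys that were found."""
    results = {}
    for key in keys:
        value = client.get(key)
        if value is not None:
            results[key] = value
    return results


class FakeClient:
    """Stand-in for a memcached client; real code would use a client
    library whose get() returns None on a miss."""

    def __init__(self, store):
        self.store = store

    def get(self, key):
        return self.store.get(key)


client = FakeClient({"a": "1", "c": "3"})
print(multi_get(client, ["a", "b", "c"]))  # {'a': '1', 'c': '3'}
```

The trade-off versus a real mget is one round-trip per key, which is exactly the overhead the protocol command avoids.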

Cheers,
-agentzh

On Wed, Apr 20, 2011 at 10:09 AM, Andy [email protected] wrote:

recommended approach is to pass the nginx variable
holding the hashed
memcached upstream name back to your fastcgi app.

How do I pass the nginx variable holding the hashed memcached upstream?
Which variable is that?

Here’s a complete nginx.conf example for this:

    http {
        upstream A {
            server 10.32.110.5:11211;
        }
        upstream B {
            server 10.32.110.16:11211;
        }
        upstream C {
            server 10.32.110.27:11211;
        }

        upstream_list my_cluster A B C;

        server {
            location = /memc {
                internal;
                set $memc_key $query_string;
                set $memc_exptime 3600; # cache for one hour

                # hash $memc_key onto an upstream backend in the
                # my_cluster upstream list, and set $backend:
                set_hashed_upstream $backend my_cluster $memc_key;

                # pass $backend to memc_pass:
                memc_pass $backend;
            }

            location /myapp {
                set $key 'my page key'; # you may want to construct a
                                        # dynamic cache key here…

                set_hashed_upstream $backend my_cluster $key;

                srcache_fetch GET /memc $key;
                srcache_store PUT /memc $key;

                fastcgi_param MEMC_BACKEND $backend;
                fastcgi_pass unix:/path/to/your/python/app.sock;
            }
        }
    }

Another way is to use Lua to pick a memcached backend from your
memcached cluster, so that you can define your own hashing
algorithm, including a custom consistent hashing one.

Here we use fastcgi_param to pass the nginx variable $backend to your
fastcgi app, so that the app can read it from its environment as
MEMC_BACKEND. Another approach is to append the $backend value to the
query string of the forwarded HTTP request.
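Note that $backend holds the upstream *name* ("A", "B", or "C" in the config above), not an address, so the app needs its own name-to-address map mirroring the upstream blocks. A minimal Python sketch of reading MEMC_BACKEND on the app side — the BACKENDS map and the fallback default are assumptions for illustration:

```python
# Map nginx upstream names to memcached addresses; this must mirror
# the upstream {} blocks in nginx.conf (names and IPs taken from the
# example config above).
BACKENDS = {
    "A": ("10.32.110.5", 11211),
    "B": ("10.32.110.16", 11211),
    "C": ("10.32.110.27", 11211),
}


def memcached_addr(environ, default="A"):
    """Return the (host, port) of the memcached server nginx hashed
    this request's key to, as passed in the MEMC_BACKEND param."""
    name = environ.get("MEMC_BACKEND", default)
    return BACKENDS[name]


print(memcached_addr({"MEMC_BACKEND": "B"}))  # ('10.32.110.16', 11211)
```

Because nginx did the hashing, the app never re-hashes the key; it just connects to whichever server nginx named, which sidesteps the "same key, same server" problem entirely.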

Cheers,
-agentzh

agentzh Wrote:

memcached upstream name back to your fastcgi app.

How do I pass the nginx variable holding the hashed memcached upstream?
Which variable is that?

Thanks.

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,191573,192476#msg-192476

On Wed, Apr 20, 2011 at 12:11 PM, agentzh [email protected] wrote:

Another way is to use Lua to pick a memcached backend from your
memcached cluster, so that you can define your own hashing
algorithm, including a custom consistent hashing one.

Here’s an example for defining custom backend routing rules in Lua (by
means of our ngx_lua module) from my slides:

http://agentzh.org/misc/slides/nginx-state-of-the-art/#67
http://agentzh.org/misc/slides/nginx-state-of-the-art/#68
http://agentzh.org/misc/slides/nginx-state-of-the-art/#69
http://agentzh.org/misc/slides/nginx-state-of-the-art/#70

Well, you can also use the arrow keys or pageup/pagedown keys to
switch pages in my (ajax-based) slides.

Here we use fastcgi_param to pass the nginx variable $backend to your
fastcgi app, so that the app can read it from its environment as
MEMC_BACKEND. Another approach is to append the $backend value to the
query string of the forwarded HTTP request.

Basically it’s as simple as

fastcgi_param QUERY_STRING "$query_string&_memc_backend=$backend";
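With the query-string approach, the app extracts the value itself. A sketch using only the standard library; the parameter name _memc_backend is taken from the line above:

```python
from urllib.parse import parse_qs


def backend_from_query(query_string, default=None):
    """Pull the _memc_backend value that nginx appended to the
    forwarded query string; fall back to default if it is absent."""
    params = parse_qs(query_string)
    values = params.get("_memc_backend")
    return values[0] if values else default


print(backend_from_query("page=2&_memc_backend=B"))  # B
```

The app would then strip _memc_backend back out before using the query string for its own routing, so the extra parameter stays invisible to application logic.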

Cheers,
-agentzh
