Running out of file descriptors under load

Hi All,
I'm running a Rails site with nginx/mongrel on a fairly well-specced
dedicated box with 4GB RAM, and during a spike both the Mongrel and nginx
logs started to complain about running out of file descriptors. I've
googled for this and not found anything particularly useful, so I wondered
if anyone here has experienced this, has any fixes, or has any tips on how
to diagnose the problem. The few posts found via Google all recommend
raising the global descriptor limit (via ulimit), but it's already set
high (about 200000) and there are no per-user limits set. I haven't set
worker_rlimit_nofile in nginx, but I'm not sure that's the problem, as the
mongrels were hitting the file descriptor limits as well.
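
One way to diagnose this is to check the limit each process actually
inherited (ulimit in your shell does not necessarily apply to daemons
started at boot) and count the descriptors each one currently holds. A
minimal sketch using /proc; the 'mongrel_rails' and 'nginx' process names
are assumptions, so adjust them to match your setup:

```shell
#!/bin/sh
# For every mongrel and nginx process, report how many file descriptors
# it has open right now against the soft limit it actually inherited.
for pid in $(pgrep -f 'mongrel_rails|nginx'); do
  limit=$(grep 'Max open files' /proc/$pid/limits | awk '{print $4}')
  used=$(ls /proc/$pid/fd | wc -l)
  echo "pid $pid: $used fds used, soft limit $limit"
done
```

Running that in a loop during a spike shows whether the processes are
genuinely approaching their per-process limit, regardless of what the
global setting says.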

Any wisdom?

Thanks,


Dan W.
http://www.danwebb.net

aim: danwrong123
skype: danwrong

What has most likely happened is that your mongrels have jammed on
something internal to your application, possibly database related. Once
this happens, mongrel will continue to accept requests, but those requests
will be queued behind the mutex that Rails places around itself. Each
connection uses up 2 fds on nginx's side and 1 on the mongrel's side.
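
Given that 2-fds-per-connection arithmetic, it's worth setting
worker_rlimit_nofile explicitly anyway so nginx isn't capped by whatever
soft limit it inherited. A sketch of the relevant nginx.conf directives;
the numbers are illustrative, not a recommendation:

```nginx
# Each proxied request can hold two descriptors (client socket plus
# upstream socket to a mongrel), so give each worker headroom of at
# least 2x worker_connections.
worker_processes  2;
worker_rlimit_nofile  8192;    # per-worker fd cap nginx sets via setrlimit

events {
    worker_connections  4096;  # each active connection may use 2 fds
}
```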

The situation will correct itself once the blockage clears, but the real
solution is to find the source of the blockage, not to increase the number
of fds.
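
To confirm that requests are piling up behind stuck mongrels rather than
nginx itself leaking descriptors, one rough check is to count socket
states toward the mongrel backend ports. The ports 8000-8002 here are an
assumption; substitute whatever your mongrel cluster listens on:

```shell
#!/bin/sh
# Count TCP connections headed for the mongrel backends, grouped by
# state. A large, growing ESTABLISHED pile toward these ports while the
# app is slow suggests the mongrels are blocked, not nginx.
netstat -tan | awk '$5 ~ /:(8000|8001|8002)$/ {print $6}' | sort | uniq -c
```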

Cheers

Dave
