Too many open files

I’m getting a “Too many open files in system” error after running for about 10 minutes. I’m in the process of testing out nginx and it’s a pretty busy server doing around 200 requests/sec. My current open file limit is 131072. Does anyone know a safe amount to set this to for a 32-bit Linux system with 6GB of RAM?

Davy C. <[email protected]…> writes:


ulimit -n actually returns 1024; I’m thinking this should be higher.

cat /proc/sys/fs/file-max returns 131072
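For anyone hitting this later: those are two different limits. ulimit -n is the per-process descriptor limit, while /proc/sys/fs/file-max is the system-wide cap, and they are raised separately. A rough sketch; the numbers and the www-data user here are placeholders, not a recommendation:

    # System-wide cap (what /proc/sys/fs/file-max reports)
    sysctl -w fs.file-max=262144

    # Per-user soft/hard limits, set in /etc/security/limits.conf:
    #   www-data  soft  nofile  32768
    #   www-data  hard  nofile  65536

    # nginx can also raise its own workers' limit in nginx.conf:
    #   worker_rlimit_nofile 32768;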

After a little more research, it looks like the php-cgi processes being proxied to by nginx may be the problem: they had a lot of files left open.

I ran lsof | wc -l and the result was 139566

After a restart of the php-cgi processes, the result was only 12045
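To see which processes are actually holding the descriptors rather than just the system-wide total, something like this works (php-cgi matches the processes above; /proc/<pid>/fd is the kernel's per-process list of open descriptors). Note that lsof also lists memory-mapped files and repeats shared entries across processes, so a bare lsof | wc -l overstates the real descriptor count somewhat.

    # Count open files per php-cgi process
    # (may need root to read another user's /proc/<pid>/fd)
    for pid in $(pgrep php-cgi); do
        echo "$pid: $(ls /proc/$pid/fd | wc -l)"
    done

    # Or inspect what a single process has open
    lsof -p <pid>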

On Thu, Sep 10, 2009 at 11:04 PM, [email protected] wrote:

That isn’t the point and you know it. But in case you don’t I’ll make it
simple for you. Help or don’t help but either way you don’t need to be an
arrogant ass.

Yes, he should’ve pointed to this instead:

http://www.catb.org/~esr/faqs/smart-questions.html

I answered my own question, so I thought I would post it here :) Looks like there was an issue with APC 3.0.18 that was causing the problems.

Further discussion is here:
http://groups.google.com/group/highload-php-ru/browse_thread/thread/33a2273b51bc41d8#
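For anyone checking their own setup: phpversion() with an extension name reports the loaded extension’s version, so this should confirm whether the affected APC build is in use (assuming, as above, that the fix is to move off 3.0.18):

    # Is APC loaded, and which version?
    php -m | grep -i apc
    php -r 'echo phpversion("apc"), "\n";'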
