aris
December 19, 2012, 8:52am
1
Hi,
On Ubuntu, I tried:
su - www-data
ulimit -Hn
200000
ulimit -Sn
100000
The values seem fine: I have worker_connections = 4000 and
worker_processes = 4, so the total is 16000, which is still within the limit.
But when my server becomes busy, the error log contains entries like
“…(24: Too many open files),…”
Any way to debug? Thanks…
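One way to debug this (a sketch for Linux, run as root): the limits printed by `ulimit` in a `su - www-data` shell are that shell's limits, not necessarily those of the running workers, which inherit theirs from the nginx master at startup. Inspecting `/proc` shows what the workers actually have:

```shell
# For each nginx worker, count its open file descriptors and print
# the open-files limit actually enforced on that process.
for pid in $(pgrep -f "nginx: worker"); do
    echo "worker $pid: $(ls /proc/$pid/fd | wc -l) open fds"
    grep "Max open files" /proc/$pid/limits
done
```

If the fd count approaches the "Max open files" value under load, the per-process limit is the bottleneck regardless of what the shell reports.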
On Dec 19, 2012, at 11:52 , howard chen wrote:
The values seem fine: I have worker_connections = 4000 and
worker_processes = 4, so the total is 16000, which is still within the limit.
But when my server becomes busy, the error log contains entries like
“…(24: Too many open files),…”
Any way to debug? Thanks…
What does “cat /proc/sys/fs/file-max” output?
–
Igor S.
On Wed, Dec 19, 2012 at 4:32 PM, Igor S. [email protected] wrote:
/proc/sys/fs/file-max
cat /proc/sys/fs/file-max
394959
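For context, `fs.file-max` is the kernel-wide ceiling on open file descriptors, separate from the per-process `ulimit` values. A quick sketch for checking both the ceiling and current usage (the 500000 below is an illustrative value, not a recommendation):

```shell
# Kernel-wide maximum number of open file descriptors.
cat /proc/sys/fs/file-max
# Current usage: three numbers - allocated, free, maximum.
cat /proc/sys/fs/file-nr
# To raise the ceiling persistently (example value; size to your workload):
#   echo "fs.file-max = 500000" | sudo tee -a /etc/sysctl.conf
#   sudo sysctl -p
```

If the first number from `file-nr` is near `file-max`, the whole system, not just nginx, is running out of descriptors.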
I’m facing the same problem here, but I found much lower settings on our
machine (a VPS running Ubuntu 12.04):
the hard limit of open files
www-data@215247:~$ ulimit -Hn
4096
the soft limit of open files
www-data@215247:~$ ulimit -Sn
1024
maximum number of file descriptors enforced on a kernel level
root@215247:~# cat /proc/sys/fs/file-max
262144
So I would hope that setting the per-user open-file limits
to 262144 (hard) and 131072 (soft) will help.
But then again, Howard Chen mentioned that he’s facing this problem
despite such high values on his system.
Any suggestions?
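For reference, per-user limits like the ones proposed above are usually set via `/etc/security/limits.conf` (a sketch; the scratch path `/tmp/limits.conf.example` is only for illustration, the real file lives in `/etc/security/`). Note that `limits.conf` applies to PAM sessions; a daemon started from an init script may need `ulimit -n` in the script, or nginx's own `worker_rlimit_nofile`, instead:

```shell
# Append limits.conf-style entries raising www-data's open-file limits
# (hard 262144, soft 131072, the values proposed above).
# Shown against a scratch copy; as root, target /etc/security/limits.conf.
LIMITS_FILE=${LIMITS_FILE:-/tmp/limits.conf.example}
cat >> "$LIMITS_FILE" <<'EOF'
www-data  hard  nofile  262144
www-data  soft  nofile  131072
EOF
```

A re-login (or service restart through PAM) is needed before the new limits take effect.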
Posted at Nginx Forum:
Thanks a lot, Jader. Your hint saved me hours of additional headache
after our site was almost unusable due to NGINX throwing tens of
thousands of “24: Too many open files” errors.
I chose to set worker_rlimit_nofile to 65536 = 2^16 for peace of mind.
It’s important to note that the worker_connections count needs to be
adapted as well, so that connections won’t now be dropped because of a
too-low value there. Since we’re running on a VPS with only a single
core, we set worker_connections to 65536 as well.
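Put together, the two changes described above correspond to these nginx.conf directives (values from this thread; size them to your own box, and keep in mind that a proxied connection can consume two descriptors, so ideally worker_rlimit_nofile exceeds worker_connections):

```nginx
worker_processes      1;      # single-core VPS, as described above
worker_rlimit_nofile  65536;  # per-worker open-file limit (2^16)

events {
    worker_connections  65536;  # raise alongside the rlimit
}
```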
From the moment I reloaded those two changes into the NGINX process, I
haven’t had any of the “too many open files” errors.
I’ve put together a short blog post on this here:
shades of orange | Setting up NGINX as Reverse Proxy for Camping.Info.