How to optimize nginx for local large file downloads

Hi,
I have some Dell 860 servers: 2 GB RAM, 20 Gbit/s network bandwidth.
Twenty large local files (50 MB-2 GB) are served from the box (previously
on lighttpd). In online testing, a single nginx server currently uses
about 400-550 Mbit/s of bandwidth. I would like to tune nginx so that
throughput goes above 1 Gbit/s.
No logging, no proxying, no FastCGI -- the server only offers client
software downloads and patch downloads.
How can I fix the low bandwidth utilization?

                            T.kk

I would be looking more at the server than at nginx: is the server
running out of any resource at all? With only 2 GB of RAM, I would
suspect you are running out of I/O, or, depending on your network card,
perhaps bottlenecked on soft interrupts.
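Two quick ways to check both suspicions on Linux (a sketch, assuming the usual /proc layout; sampling these a few seconds apart is what makes the numbers meaningful):

```shell
# Quick resource checks (Linux; /proc paths assumed present).
# 1. iowait: the 5th field of the aggregate "cpu" line in /proc/stat;
#    if it grows quickly between samples, the disks are the bottleneck.
head -1 /proc/stat
# 2. NIC soft interrupts: if the NET_RX/NET_TX counters climb steeply
#    between samples, the box is burning CPU in softirq handling.
grep -E 'NET_RX|NET_TX' /proc/softirqs
```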


From: "lienkai" [email protected]
Sent: Sunday, July 13, 2008 2:13 PM
To: [email protected]
Subject: How to optimize nginx for local large file downloads

Are you sure your hard drives can keep up the pace?
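One way to sanity-check that is a cold sequential read with dd; the sample file below is a stand-in -- in practice, read one of the real 50 MB-2 GB download files and compare the MB/s dd reports against the roughly 125 MB/s a saturated gigabit link needs:

```shell
# Rough sequential-read check with dd. SAMPLE is a stand-in: in practice,
# point the second dd at one of the real download files.
# (A warm page cache makes repeat runs measure RAM, not disk, so trust
# the first, cold run.)
SAMPLE=$(mktemp)
dd if=/dev/zero of="$SAMPLE" bs=1M count=64 2>/dev/null   # create test data
dd if="$SAMPLE" of=/dev/null bs=1M                        # sequential read
rm -f "$SAMPLE"
```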

On Sunday 13 July 2008, lienkai wrote:

> Hi,
> I have some Dell 860 servers: 2 GB RAM, 20 Gbit/s network bandwidth.
> Twenty large local files (50 MB-2 GB) are served from the box.
> In online testing, a single nginx server currently uses about
> 400-550 Mbit/s. I would like to tune nginx so that throughput goes
> above 1 Gbit/s. No logging, no proxying, no FastCGI -- only client
> software downloads and patch downloads. How can I fix the low
> bandwidth utilization?

Try bumping worker_processes, and/or disable sendfile and raise
output_buffers.
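A minimal sketch of those directives in nginx.conf -- the specific values and the path below are assumptions to tune against your own hardware, not measured recommendations:

```nginx
worker_processes  4;            # assumption: roughly one per CPU core

events {
    worker_connections  1024;
}

http {
    sendfile        off;        # fall back to read()+write()
    output_buffers  2 512k;     # larger userspace buffers when sendfile is off

    server {
        listen 80;
        location /downloads/ {
            root /var/www;      # hypothetical path to the large files
        }
    }
}
```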

Why disable sendfile?
