How much work for Windows Multiple Process support?

How much work do you think it will take to get the Windows build to
support having multiple worker processes? Is anyone already working on
this? I volunteer to help later this month if I get a little guidance
on what is required.



Posted at Nginx Forum:,226147,226147#msg-226147

You can run multiple nginx sessions by separating the config between
them; the main http section will have to use a unique port for each
session. From there on you can separate out the hosted websites and also
assign each session its own CPU. You are, however, limited to a 65,000
connection max across all sessions.
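A minimal sketch of one such instance (file names, ports, and paths are placeholders; each instance would be started separately with `nginx -c <file>` and needs its own pid file and a unique listen port):

```nginx
# instance-a.conf -- one of several independent nginx instances
worker_processes  1;
pid  logs/nginx-a.pid;            # unique pid file per instance

events {
    worker_connections  1024;
}

http {
    server {
        listen       8081;        # unique port for this instance
        server_name  site-a.example;
        root         html/site-a;
    }
}
```

A second instance would use its own copy of this file with a different pid path and listen port (e.g. 8082).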

Another, easier, way to run multiple sessions is to use one session as a
proxy to many secondary sessions, which makes routing the http port
easier; this way also scales better.

Posted at Nginx Forum:,226147,226318#msg-226318

I think we might benefit more from solving the select problem: with 8
separate workers as mentioned before you're still stuck at 1024*8, which
isn't that much more. I found an interesting discussion about select, why
it's used, and why it should not be used on Windows systems:

If select was rewritten you could run 4 worker_processes with 10,000
worker_connections each, one per CPU. Windows can handle this just as
easily as Linux does.
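For reference, the configuration that paragraph imagines would look something like the sketch below. This is hypothetical: the stock win32 build's select() cap makes these numbers unreachable today, and the event mechanism that would permit them is assumed, not existing.

```nginx
# Hypothetical win32 config, assuming select() were replaced by an
# unbounded event mechanism (e.g. I/O completion ports):
worker_processes  4;              # one worker per CPU

events {
    worker_connections  10000;    # far beyond select()'s per-set limit
}
```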

Posted at Nginx Forum:,226147,226690#msg-226690

itpp2012 Wrote:

worker_processes with 10,000 worker_connections
each, one per CPU. Windows can handle this just
as easily as Linux does.

Do you know how hard it would be to do that with nginx? I'm a C/C++
programmer; I just don't want to dive into this project if there are
going to be lots of other situations that aren't thread-safe.



Posted at Nginx Forum:,226147,226698#msg-226698

I never said it was going to be easy, just that porting for win32
requires some win32 work, as there are substantial differences between
Linux and win32 (duh). They managed to get PHP win32 thread-safe after

Set up nginx as a front-end, make a pool of nginx back-ends (just as
you would for fpm with sockets or tcp), and from there the back-end
nginx servers are load-balanced from the front-end pool just like fpm
would work; nothing magic about it.
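A minimal sketch of that chained setup (the ports, back-end count, and file name are illustrative; each back-end is a separate nginx.exe running its own config like the one above):

```nginx
# front-end.conf -- load-balances across independent nginx back-ends
http {
    upstream nginx_pool {
        server 127.0.0.1:8081;
        server 127.0.0.1:8082;
        server 127.0.0.1:8083;
        server 127.0.0.1:8084;
    }

    server {
        listen 80;
        location / {
            proxy_pass http://nginx_pool;
        }
    }
}
```

Each back-end instance would then talk to the WSGI application the same way a single nginx instance would.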

Posted at Nginx Forum:,226147,226739#msg-226739

Well, this is all for one site where I want multiple workers to speed up
response time if any particular request is slow. Can you explain how I
could forward from one nginx to say 4 other ones, and then those could
talk to my WSGI application? Thanks.

Posted at Nginx Forum:,226147,226697#msg-226697

itpp2012, I’ll take a look at having a load balancing nginx pass to
other nginx instances. I would still prefer to add some completion
ports or something similar to nginx, but that will obviously take a lot
longer than just running the chained-nginx scenario. Thanks.

Posted at Nginx Forum:,226147,226802#msg-226802