FastCGI or upstream proxy for Python?

Which would be faster/preferable for connecting nginx to Python: FastCGI
(using Flup or similar), or the upstream module proxying to an HTTP
server such as CherryPy?

Do you have an upstream server such as CherryPy already set up?

If yes, that is going to be your fastest route.

Nothing yet - I was going to use either Paste/WebOb or web.py; I’m
looking for benchmarks for the smaller frameworks. However, I’m not
averse to using something a little different if it’s preferable for
nginx. I understand Unix sockets are faster than TCP, but is FastCGI
better than proxying to an upstream?
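
For concreteness, the FastCGI side of that question would look roughly
like this with flup (a minimal sketch; the socket path, port, and the
trivial app are just placeholders):

    # Minimal FastCGI backend using flup's WSGIServer. nginx's
    # fastcgi_pass would point at the same Unix socket (or host:port).
    from flup.server.fcgi import WSGIServer

    def app(environ, start_response):
        # Trivial WSGI app, only here to show the wiring.
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [b'Hello from FastCGI\n']

    # Listen on a Unix socket (the "sockets faster than TCP" case)...
    WSGIServer(app, bindAddress='/tmp/app.sock').run()

    # ...or bind to TCP instead:
    # WSGIServer(app, bindAddress=('127.0.0.1', 9000)).run()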


In terms of speed?

I’ve never had an issue with speed for either. Unless you are pushing
tons of stuff through, I doubt you would either.

In the end, unless speed is REALLY a factor, it comes down to which
you find easier to maintain.

I find proxied setups easier to maintain. I abhor setting up and
maintaining FastCGI, so the choice for me is always easy.
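
The proxied alternative is just a plain HTTP server speaking WSGI that
nginx’s proxy_pass (upstream) points at. A minimal sketch using the
standard library’s wsgiref (development-grade only, and the address is
a placeholder; a production server slots in the same way):

    # Minimal HTTP backend for a proxied (upstream) setup. nginx would
    # proxy_pass to http://127.0.0.1:8080.
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [b'Hello from behind the proxy\n']

    httpd = make_server('127.0.0.1', 8080, app)
    httpd.serve_forever()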

Cliff W. wrote:

Spawning - not as fast as CP, but more scalable:
Spawning · PyPI

Looks interesting.

FAPWS - Very, very fast (I’ve clocked 5,000 req/s serving a simple page),
but very bleeding-edge:
http://github.com/piranha/fapws2/

I’ve looked at this before. Looks great, but as you say it’s very
bleeding-edge.

There was also one called apricot which is blazingly fast, but it only
supports GET requests and a subset of HTTP/1.0.


On Wed, 2008-10-08 at 09:44 +0100, Phillip B Oldham wrote:

Nothing yet - I was going to use either Paste/WebOb or web.py; I’m
looking for benchmarks for the smaller frameworks. However, I’m not
averse to using something a little different if it’s preferable for
nginx. I understand Unix sockets are faster than TCP, but is FastCGI
better than proxying to an upstream?

I prefer proxying as it seems simpler to me. The performance difference
isn’t significant in any case.

For proxying, Paste’s HTTP server is okay (I’ve not used web.py), but
I’d consider one of these instead:

CherryPy’s wsgiserver - fast, and it’s only a single file:

Spawning - not as fast as CP, but more scalable:

FAPWS - Very, very fast (I’ve clocked 5,000 req/s serving a simple page),
but very bleeding-edge:
http://github.com/piranha/fapws2/

All of them are WSGI compliant, so you should have no problem using
WebOb in any case.
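
As a rough sketch of that point (assuming the CherryPy 3.x import path
for its bundled wsgiserver; WebOb is just one way to write the app),
the same WSGI callable runs unchanged under any of them:

    # A small WSGI app built with WebOb, served by CherryPy's bundled
    # wsgiserver. The same `app` could be handed to Spawning, FAPWS,
    # or flup's FastCGI server instead.
    from webob import Request, Response
    from cherrypy import wsgiserver  # CherryPy 3.x layout

    def app(environ, start_response):
        req = Request(environ)
        name = req.params.get('name', 'world')
        resp = Response('Hello, %s!' % name)
        return resp(environ, start_response)

    server = wsgiserver.CherryPyWSGIServer(('127.0.0.1', 8080), app)
    try:
        server.start()
    except KeyboardInterrupt:
        server.stop()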

Regards,
Cliff