Bug report - connection breaks on slow clients in proxy mode


#1

Hello,
we use nginx as a reverse proxy in front of an Apache server, and if we use
the proxy_max_temp_file_size directive to limit the size of file buffering,
downloading larger files over a slow connection always breaks and it is
necessary to start the download again.

For example, to replicate the problem:
Use "proxy_max_temp_file_size 10M" in the proxy configuration, generate an
approximately 40MB binary file in the document root of the proxied Apache
(or probably any other webserver as well) and try to download it with the
speed limited to 100k/sec, for example with wget:
wget -t 1 --limit-rate=100k http://server/file
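
For reference, a minimal proxy configuration reproducing this might look like the following sketch (the listen port, backend address, and paths are assumptions matching the description above, not our exact config):

```nginx
http {
    server {
        listen 80;

        location / {
            # forward everything to the backend Apache
            proxy_pass http://127.0.0.1:8080;

            # cap the response buffered to a temp file on disk at 10MB;
            # larger responses trigger the behaviour described here
            proxy_max_temp_file_size 10m;
        }
    }
}
```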

The download will fail at approximately 12MB. If you download it at full
speed (10M in my case), there is no problem. If you download it
directly from the Apache server running on a different TCP port, there is
also no problem. The problem appears on the latest stable version (0.6.35)
as well as on the latest development version (0.7.33).
Feel free to ask me about more details.
Best Regards,
Tomas Hala


#2

Maxim D. wrote:

> For example, how to replicate the problem:
> [...]
> version as well as on latest development (0.7.33).

Hello,
that makes sense. It's probably a problem with understanding the meaning of
the proxy_max_temp_file_size directive. Based on the documentation
(wiki) we thought that if the file is larger than the limit, it would be
served synchronously from the start. When I strace the Apache process serving
this file, it seems that nginx first buffers the amount given by
proxy_max_temp_file_size, and only after the client reaches this point does it
start transferring synchronously. So the documentation is a little bit
misleading. But I probably understand why it is implemented like this:
it is easier to wait until the buffer fills up than to detect
the size of the served file beforehand.
Thanks for your hint.
BR Tomas Hala


#3

Hello!

On Fri, Feb 06, 2009 at 03:15:26PM +0100, Tomáš Hála wrote:

> 100k/sec, for example with wget:
> wget -t 1 --limit-rate=100k http://server/file
>
> The download will fail at approximately 12MB. If you download it at full
> speed (10M in my case), there is no problem. If you download it
> directly from the Apache server running on a different TCP port, there is
> also no problem. The problem appears on the latest stable version (0.6.35)
> as well as on the latest development version (0.7.33).
> Feel free to ask me about more details.

I've seen a similar problem caused by client timeouts in Apache:
from Apache's point of view the client downloads about 10M (+
nginx proxy memory buffers) and then stops downloading for a
relatively long time (the time needed for the client to download at
least one memory buffer from nginx).
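
For scale: draining 10MB at 100 kB/s takes roughly 100 seconds, so if the
backend's I/O timeout is set below the stall that nginx's buffering produces,
Apache will drop the connection. If client timeouts are indeed the cause here,
one possible workaround is raising the Timeout directive on the backend
Apache (the value 600 below is an illustrative assumption, not a
recommendation):

```apache
# httpd.conf on the backend Apache:
# allow the proxied connection to sit idle while the slow client
# drains nginx's buffered copy of the response
Timeout 600
```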

Maxim D.