Re: POST with huge HTTP body

Since a vanilla Mongrel will store uploaded data on disk in a tempfile,
maybe it has something to do with your Railsdav plugin?

I just tested an 80 MB upload with my Apache 2.2.6 reverse proxy and
Mongrel 1.0.1 cluster and it does save the object first to /tmp on disk
and not in memory. The memory used by the mongrel process before I
started uploading was 50 MB. While uploading it peaked at 57 MB but was
mostly stable around 50-52 MB. And after the upload was done it was back
again at 50 MB.

Maybe your $TMPDIR is not writable for the user you are running mongrel
under?
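
A quick sanity check you could run as the Mongrel user (plain Ruby,
nothing Mongrel-specific):

require 'tempfile'
require 'tmpdir'

# Tries to create a small file under $TMPDIR (usually /tmp); raises
# Errno::EACCES or similar if the directory is not writable for this user.
Tempfile.open('writable-check') { |f| f.write('ok') }
puts "tempdir #{Dir.tmpdir} is writable"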

Just before the upload was finished:
$ ls -hl /tmp/
total 160288
-rw------- 1 mongrel wheel 78.2M Oct 15 16:31 mongrel.25.0

I would start by looking at the Railsdav plugin you are using; it seems
like they changed the default from saving to disk to keeping the upload in
memory.

Cheers.

----- Original Message ----
From: Daniel B. [email protected]
To: [email protected]
Sent: Monday, October 15, 2007 4:10:06 PM
Subject: Re: [Mongrel] POST with huge HTTP body

Thanks for the replies.

The fact that Mongrel shouldn’t be the first one to get the POST/PUT data
was a good point that I hadn’t thought of. However, even when talking
directly to Mongrel, the memory consumption of the application increases
until it crashes if too much data is sent. In my case, I was sending a
500 MB file from a WebDAV client. At some point all of it was loaded into
RAM, which doesn’t work.

Are there any more productive alternatives than writing the entire thing
from scratch in C? Please? :)

/Daniel

On 10/15/07, Tim K. [email protected] wrote:

A while ago I wrote a plugin to limit uploads in Mongrel. Mongrel
does save uploads to a tempfile on disk - if the upload is bigger than
about 12 KB - using a Ruby Tempfile object (which stores in $TMPDIR,
/tmp on most systems).
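
(Roughly the shape of that decision - a simplified sketch, not Mongrel’s
actual source, and the 12 KB figure above is approximate:)

require 'stringio'
require 'tempfile'

SMALL_BODY_LIMIT = 12 * 1024  # approximate threshold, see above

# Small bodies stay in memory; larger ones spill to a tempfile in $TMPDIR.
def body_buffer(content_length)
  if content_length > SMALL_BODY_LIMIT
    Tempfile.new('mongrel')
  else
    StringIO.new
  end
end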

The request is handed over to Rails after it’s fully received by
Mongrel.

I’m not sure if this “saving to disk” works the same with chunked
uploads (uploads without a Content-Length header).

Maybe my plugin can help you:
http://slasaus.netsend.nl/articles/show/7 (warning: it works, but is
not very elegant).

On Mon, 15 Oct 2007 07:37:57 -0700 (PDT)
Tim K. [email protected] wrote:

Since a vanilla Mongrel will store uploaded data on disk in a tempfile, maybe it has something to do with your Railsdav plugin?

Yes, I believe this is true as well. Railsdav pretty much blows. My
experience with it is that none of the Windows clients work well (all
three), and it did some stupid stuff like using send_file in Rails and
processing the whole file on input.

Ehem, there’s this company called Xythos. They make a really good
WebDAV server and client. Their WebDAV server is written in Java, but
it’s not insanely architected. Instead it’s very well done, and using
Jakarta Slide you can JRuby yourself a little Ruby lib that looks like
FileUtils in about an hour. It’s also a very flexible server, and very
conforming, since most of the people who worked on the RFCs work for
Xythos.
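
(Very roughly, and only as a sketch of the idea: with the Slide client jar
on JRuby’s load path you can wrap its WebdavResource class in a
FileUtils-ish module. The jar name/path, URL format and the exact method
set below are assumptions, so adjust against Slide’s docs.)

require 'java'
require 'jakarta-slide-webdavlib.jar'  # jar name/path is an assumption

# Hypothetical FileUtils-like wrapper around Slide's WebDAV client classes.
module DavUtils
  HttpURL        = org.apache.commons.httpclient.HttpURL
  WebdavResource = org.apache.webdav.lib.WebdavResource

  # url like "http://user:pass@dav.example.com/files/"
  def self.connect(url)
    WebdavResource.new(HttpURL.new(url))
  end

  def self.mkdir(url, name)
    connect(url).mkcol_method(name)          # Slide's mkcolMethod
  end

  def self.cp(local_path, url, remote_name)
    connect(url).put_method(remote_name, java.io.File.new(local_path))
  end

  def self.rm(url, remote_name)
    connect(url).delete_method(remote_name)
  end
end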

It’s possible with Xythos to swap out nearly everything as well. For
example, if you need to use your crappy “rails
acts_as_lamely_authenticated” authentication, then you can write your
own auth filter for Xythos and it’ll go and look in your database. If
you need to do something special to the files then you can do a filter
for that (it’s got a sample that uses Clam to do AV). They also give
you all the source to their (ugly) web front end for free so you can see
how every operation is done, even the weirdo admin stuff.

It’s a good product, and if you’re dumb enough to work with WebDAV
(since obviously you like Microsoft raping you over a barrel), then go
check it out.

Good luck, I’ve seen WebDAV kill off nearly every Rails project that’s
come near it. Not sure what it is, but it’s like the black hole with a
cache of gold in the center.


Zed A. Shaw

It looks like it’s CGI::QueryExtension#read_body in ActionPack that is the
first one to cause problems. When a PUT or POST request arrives, it reads
the entire thing into memory. Even if that method is patched, there are
tons of other methods that want to mess around with the request body.
According to the documentation, the CGI class is supposed to handle
Tempfile objects, but obviously not in this case.
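
For illustration only (not Rails or Mongrel source), the two patterns side
by side: slurping the whole body into one String versus spooling it to a
Tempfile in small chunks.

require 'tempfile'

# Reads the entire body as a single String - fine for forms, fatal for 500 MB.
def slurp_body(io, content_length)
  io.read(content_length)
end

# Copies the body to a tempfile in fixed-size chunks, so memory use stays
# bounded regardless of the body size.
def spool_body(io, content_length, chunk_size = 16 * 1024)
  tmp = Tempfile.new('upload')
  remaining = content_length
  while remaining > 0 && (chunk = io.read([remaining, chunk_size].min))
    tmp.write(chunk)
    remaining -= chunk.size
  end
  tmp.rewind
  tmp
end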

Seems like the way to go is to just use Mongrel and write an HttpHandler
as a standalone server, without Rails and Railsdav, at least for a first
version.
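
For what it’s worth, a bare-bones handler along those lines could look
roughly like this (a sketch only: the storage directory, port and error
handling are made up, and Mongrel has already buffered the body to its own
tempfile before process() is called):

require 'mongrel'
require 'fileutils'

# Hypothetical standalone PUT handler: streams the already-buffered request
# body (a StringIO or Tempfile provided by Mongrel) out to a file on disk.
class DavPutHandler < Mongrel::HttpHandler
  STORAGE_DIR = '/var/uploads'  # assumption, adjust to taste

  def process(request, response)
    unless request.params['REQUEST_METHOD'] == 'PUT'
      response.start(405) do |head, out|
        head['Content-Type'] = 'text/plain'
        out.write("PUT only\n")
      end
      return
    end
    name = File.basename(request.params['PATH_INFO'].to_s)
    FileUtils.mkdir_p(STORAGE_DIR)
    File.open(File.join(STORAGE_DIR, name), 'wb') do |f|
      while (chunk = request.body.read(16 * 1024))
        f.write(chunk)
      end
    end
    response.start(201) do |head, out|
      head['Content-Type'] = 'text/plain'
      out.write("stored #{name}\n")
    end
  end
end

server = Mongrel::HttpServer.new('0.0.0.0', 3000)
server.register('/dav', DavPutHandler.new)
server.run.join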

Lots of thanks for pointing me in the right direction.

/Daniel