Sendfile and mongrel

Forgive me if this has been resolved already…

I have an app that is using sendfile to return large files to customers.
We can’t use regular Apache to handle this static content since the
files are what the customers are paying for - so it wouldn’t be cool for
anyone else to get access to them.

The issue is that the Mongrel memory footprint gets bloated when the files
are transferred, and this memory does not get returned to the system.

I’ve looked into x-sendfile, but it doesn’t appear to be compatible with
Apache 2.2.

Now the question: does the current Mongrel 1.1.1 have this same issue
with sendfile and memory bloating? (I’m currently running 1.0.1.)

Are there any other recommendations for handling this situation?

thanks

I use X_SendFile with Apache 2.2 just fine. FYI.

And wouldn’t it be possible to intercept all download requests and force
the use of a cookie/session authorization requirement? I do it on one of my
websites. The X_SendFile feature doesn’t actually expose the server
location of the file, and if you use a URL such as …

http://website/download/[filename]

/download/ is an action in one of my controllers, and it requires a valid
login session. I do a query on the filename, and if you’re not
authorized for the file, you don’t get to download it. All downloads are
stored outside the website directory, so no direct linking is possible.
Then I use the x_send_file feature to have Apache serve up the file
without going through Mongrel.

There’s also a nice x_send_file plugin that replaces “send_file” with an
“x_send_file” call, so code updates should be minimal.
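
Roughly, my download action looks something like this (a simplified
sketch only; the login_required filter, DOWNLOAD_ROOT constant and the
purchases association are illustrative names, not my actual code):

  class DownloadsController < ApplicationController
    before_filter :login_required   # whatever your auth filter is

    DOWNLOAD_ROOT = '/var/files'    # outside the web root

    def show
      # only serve files the logged-in user has actually paid for
      purchase = current_user.purchases.find_by_filename(params[:filename])
      if purchase
        x_send_file File.join(DOWNLOAD_ROOT, purchase.filename),
                    :disposition => 'attachment'
      else
        render :nothing => true, :status => 403
      end
    end
  end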

matte

Matte E. wrote:

I use X_SendFile with Apache 2.2 just fine. FYI.

I was hoping I was wrong about that.

Where do you get your x-sendfile source from?

I got it from tn123.ath.cx, but I get all kinds of errors when trying to
compile it.

That’s also where I got mine. I’m running Apache 2.2.6 on FreeBSD 6.x
and compiled it as instructed on that page.

apxs -cia mod_xsendfile.c

My linux-fu is moderate at best so I was lucky I didn’t experience any
errors compiling. Otherwise I’d be spending a day in Google finding
answers. Good luck.

matte

Before I discovered x-sendfile, I used to handle file uploads with
PHP – talking to the same database as my main Rails app. PHP doesn’t
bloat when dumping files across the wire like Ruby does.

Remember: you don’t need to use Rails / Mongrel for everything –
sometimes switching to another platform to solve a particular problem
can be the easiest, most robust solution.

But in this case: x-sendfile FTW. Put a “Deny from all” in your files
directory, and Apache will still send the files via x-sendfile but won’t
allow clients to download them directly.
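
Something like this in the vhost, roughly (paths are illustrative, and
the allow-path directive name varies between mod_xsendfile versions;
older releases use XSendFileAllowAbove instead of XSendFilePath):

  XSendFile on
  XSendFilePath /var/files

  <Directory /var/files>
    Order deny,allow
    Deny from all
  </Directory>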

-n

On Dec 12, 2007 8:27 AM, Nathan V. [email protected] wrote:

Before I discovered x-sendfile, I used to handle file uploads with
PHP – talking to the same database as my main Rails app. PHP doesn’t
bloat when dumping files across the wire like Ruby does.

Ruby doesn’t.

Observe:
swiftiply 1510 0.7 0.2 53300 21136 ? Ss Dec09 27:43
ruby swiftiply -c /etc/swiftiply.cnf

Next I retrieve a 470MB file, twice. Then look at ps again:

swiftiply 1510 0.7 0.2 53300 21136 ? Ss Dec09 27:56
ruby swiftiply -c /etc/swiftiply.cnf

No change in the memory footprint, despite running almost a gigabyte
of file data through it.

One should see essentially the same thing with Mongrel’s send_file()
method, because it reads the file in Const::CHUNK_SIZE pieces (16k).
However…if you are using send_file() from within Rails, then Rails
has to read the entire file into the response before it ever goes back
to Mongrel, and THAT is what is causing the memory bloat. Not Ruby.
Not Mongrel. Rails.

I agree with the other pieces of advice that have been given, though.
Leverage Apache’s x-sendfile since you are on Apache. Alternately,
write a dedicated mongrel handler that will deal with authentication &
file delivery outside of Rails (which also means outside of the Rails
mutex lock, which is a BIG win with regard to scalability of
downloads, at least compared to doing it with send_file from inside of
Rails).
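
For anyone curious, here is a bare-bones sketch of that kind of handler
(untested, written from memory of the Mongrel 1.x handler API; the
authorized? check is just a placeholder for real authentication, and
FILE_ROOT is an illustrative path):

  require 'mongrel'

  class DownloadHandler < Mongrel::HttpHandler
    FILE_ROOT = '/var/files'

    def process(request, response)
      name = File.basename(request.params['PATH_INFO'].to_s)
      path = File.join(FILE_ROOT, name)

      unless authorized?(request) && File.file?(path)
        response.start(403) { |head, out| out.write('denied') }
        return
      end

      stat = File.stat(path)
      response.status = 200
      response.header['Content-Type'] = 'application/octet-stream'
      response.send_status(stat.size)   # status line plus Content-Length
      response.send_header
      response.send_file(path)          # streams the file in small chunks
    end

    def authorized?(request)
      # placeholder: check a signed token, a session cookie, etc.
      true
    end
  end

You would register it in front of the Rails handler from a
mongrel_rails -S config script, something along the lines of:

  uri '/download', :handler => DownloadHandler.new, :in_front => true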

Kirk H.

One should see essentially the same thing with Mongrel’s send_file()
method, because it reads the file in Const::CHUNK_SIZE pieces (16k).
However…if you are using send_file() from within Rails, then Rails
has to read the entire file into the response before it ever goes back
to Mongrel, and THAT is what is causing the memory bloat. Not Ruby.
Not Mongrel. Rails.

The Rails Mongrel handler, actually. It buffers all the content from
Rails and sends it after the request is done. Rails supports
streaming just fine with FCGI, AFAIK. This is something I’d
like to fix (and bring the Rails handler into Rails), but I’m
not totally clear on why it was done this way. A thread for another time…

+1 for x-sendfile, or X-Accel-Redirect if you’re using nginx.
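
The nginx route works the same way: the app just sets a header and
nginx serves the file from an internal-only location. A rough sketch,
reusing the illustrative names from the controller example above (the
/protected prefix and file paths are likewise made up):

  def show
    purchase = current_user.purchases.find_by_filename(params[:filename])
    if purchase
      # nginx intercepts this header and serves the file itself from
      # the internal /protected location
      response.headers['X-Accel-Redirect'] = "/protected/#{purchase.filename}"
      response.headers['Content-Type']     = 'application/octet-stream'
      render :nothing => true
    else
      render :nothing => true, :status => 403
    end
  end

  # matching nginx config:
  #   location /protected/ {
  #     internal;
  #     alias /var/files/;
  #   }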


Rick O.
http://lighthouseapp.com
http://weblog.techno-weenie.net
http://mephistoblog.com

Matte E. wrote:

I use X_SendFile with Apache 2.2 just fine. FYI.

Matte, how do I force Apache to handle download requests?

On Thu, Sep 23, 2010 at 7:24 AM, Amit T. [email protected]
wrote:

Matte E. wrote:

I use X_SendFile with Apache 2.2 just fine. FYI.

Matte, how do I force Apache to handle download requests?

Wow. This is an ancient thread that you just hijacked.

This is 100000% an Apache configuration issue. It doesn’t have
anything to do with Mongrel, or with Ruby in the other thread on
ruby-talk that you’ve been beating Luis up on.

Kirk H.

OK, after reading the source of mod_xsendfile.c, the command should be

apxs2 -cia mod_xsendfile.c

Running plain apxs tries to include headers from Apache 1.3 and then fails.

Also, this is on Debian Etch.
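
For the record, apxs2 on Etch comes from the Apache 2.2 dev package, so
the rough sequence is something like this (package name from memory;
double-check against the MPM you’re running):

  apt-get install apache2-prefork-dev   # provides apxs2 for Apache 2.2
  apxs2 -cia mod_xsendfile.c            # compile, install and activate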

Kirk H. wrote:

On Thu, Sep 23, 2010 at 7:24 AM, Amit T. [email protected]
wrote:

Matte E. wrote:

I use X_SendFile with Apache 2.2 just fine. FYI.

Matte, how do I force Apache to handle download requests?

Wow. This is an ancient thread that you just hijacked.

This is 100000% an Apache configuration issue. It doesn’t have
anything to do with Mongrel, or with Ruby in the other thread on
ruby-talk that you’ve been beating Luis up on.

Kirk H.

Kirk, do you have any idea how to resolve this problem?
How do I configure Apache to handle these download requests?