So…apparently webrick does the following for serving pages:
loads the page into a string object
then sends that string to the requestor.
Unfortunately this means that if a very large file is being served,
that buffer string will get very large; e.g. with a 700MB file it
will fail because Ruby runs out of memory. This fact also defeats the
‘streaming’ aspect of some Ruby functions: for example, the RoR send_file
has an option to send a file ‘a chunk at a time’ to the client–however
these chunks are all conglomerated within WEBrick and then sent, so it
doesn’t stream out as hoped. I therefore see this as a
bug in WEBrick and was wondering what others thought.
bug in Webrick and was wondering what others thought.
Cheers!
-Roger
On Feb 27, 2007, at 13:36, Roger P. wrote:
So…apparently webrick does the following for serving pages:
loads the page into a string object
then sends that string to the requestor.
Unfortunately this means that if a very large file is being served,
that buffer string will get very large.
Don’t dynamically build and send large strings. Use an IO instead.
WEBrick knows the difference and responds accordingly.
Eric H. wrote:
On Feb 27, 2007, at 13:36, Roger P. wrote:
So…apparently webrick does the following for serving pages:
loads the page into a string object
then sends that string to the requestor.
Unfortunately this means that if a very large file is being served,
that buffer string will get very large.
Don’t dynamically build and send large strings. Use an IO instead.
WEBrick knows the difference and responds accordingly.
I believe that Rails must use strings, then–any ideas for that case?
Thanks!
Don’t dynamically build and send large strings. Use an IO instead.
WEBrick knows the difference and responds accordingly.
I believe that Rails must use strings, then–any ideas for that case?
Thanks!
Don’t use WEBrick for any serious hosting - sending out a 700MB file
constitutes serious hosting.
I’d say your best approach would be to set up a Mongrel cluster running
behind Apache and then either:
a) let Apache serve the file and never hit Ruby code
or
b) install mod_x_sendfile into Apache and just have your code set the
appropriate header to let Apache do the lifting.
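The header trick in (b) can be sketched in plain Ruby; the helper name and the path below are made up for illustration, not part of Rails or Apache:

```ruby
# Sketch: instead of streaming the body from Ruby, send only headers
# and let Apache's mod_x_sendfile read the file off disk itself.
# build_sendfile_headers and the path are illustrative names.
def build_sendfile_headers(path)
  {
    'X-Sendfile'          => path,
    'Content-Type'        => 'application/octet-stream',
    'Content-Disposition' => "attachment; filename=\"#{File.basename(path)}\"",
  }
end

# In a Rails controller you would copy these onto response.headers and
# then render an empty body; Apache replaces the body with the file.
```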
You could also do the same thing behind lighttpd. There are many howtos
around on lighttpd+Mongrel and Apache+Mongrel - google about.
Dan.
On Wed, Feb 28, 2007 at 12:53:18PM +0900, Mat S. wrote:
has an option to send a file ‘a chunk at a time’ to the client–however
these chunks are all conglomerated within WEBrick and then sent, so it
doesn’t stream out as hoped. I therefore see this as a
bug in WEBrick and was wondering what others thought.
Cheers!
-Roger
I doubt WEBrick was ever really intended for that sort of work. You
could try Mongrel, although it may yield the same result. Rock-solid
Ruby deployment is still something of a work in progress, I feel.
Or run your code as a FastCGI under Apache.
This doesn’t stop you from trying to read a 700MB file into a string, of
course, but it does give you the option to simply open the file, read it
a chunk at a time, and squirt it to STDOUT.
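That chunked copy can be sketched in plain Ruby; the chunk size and method name are arbitrary choices for illustration:

```ruby
# Sketch: copy a file to an output stream one fixed-size chunk at a
# time, so memory use stays at one chunk regardless of file size.
CHUNK_SIZE = 64 * 1024  # 64 KB per read; an arbitrary choice

def stream_file(path, out = $stdout)
  File.open(path, 'rb') do |f|
    # f.read(n) returns at most n bytes, or nil at end of file
    while chunk = f.read(CHUNK_SIZE)
      out.write(chunk)
    end
  end
end

# Under FastCGI, `out` would be the request's output stream rather
# than STDOUT.
```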
I believe Rails will run happily under FastCGI. You’ll just need to tell
it not to render anything if you’re generating your own HTTP headers and
body directly.
FastCGI also has the advantage of automatically making your program
thread-safe, since each instance is a separate process. The downside is that
if you’re handling (say) 5 concurrent client requests, you’ll have five
Ruby+Rails processes spawned from scratch, each with their own memory
footprint.
Regards,
Brian.
On Feb 27, 2007, at 4:36 PM, Roger P. wrote:
however
these chunks are all conglomerated within WEBrick and then sent, so it
doesn’t stream out as hoped. I therefore see this as a
bug in WEBrick and was wondering what others thought.
Cheers!
-Roger
I doubt WEBrick was ever really intended for that sort of work. You
could try Mongrel, although it may yield the same result. Rock-solid
Ruby deployment is still something of a work in progress, I feel.
-Mat