Forum: Ruby Maximum read size with Net::HTTP.get?

Announcement (2017-05-07): www.ruby-forum.com is now read-only since I unfortunately do not have the time to support and maintain the forum any more. Please see rubyonrails.org/community and ruby-lang.org/en/community for other Rails- and Ruby-related community platforms.
smorabito (Guest)
on 2005-11-18 22:54
(Received via mailing list)
Folks,

I have a Ruby application that allows users fairly arbitrary access to
URLs.  The application then opens and reads the object at that URL
using Net::HTTP.get() and attempts to parse it as an XML document.

So far so good.

But this is a Rails app open to the general public, and it would be
fairly trivial to write a CGI somewhere that just returns garbage data
forever, leaving open a pretty obvious DoS attack.

I'd like to specify a maximum number of bytes to read with
Net::HTTP.get(), so that, for example, once the process has read more
than 1 MB it would raise an exception and stop reading.  I haven't been
able to find a way to do that so far, but then I confess to being
fairly new to Ruby.

Does anyone have any ideas or pointers?
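One possible approach (a sketch, not from the original thread; the helper name `bounded_get`, the `ResponseTooLarge` error class, and the 1 MB cap are all hypothetical): instead of Net::HTTP.get, which buffers the entire body before returning, issue the request with a block and stream the body via `read_body`, counting bytes and raising once the cap is exceeded.

```ruby
require "net/http"
require "uri"

# Hypothetical error class for an over-size response.
class ResponseTooLarge < StandardError; end

# Fetch a URL, but abort as soon as more than max_bytes of the body
# have been received.  Net::HTTP#request with a block yields the
# response before the body is read, and response.read_body streams
# the body in chunks, so we can stop early.
def bounded_get(url, max_bytes = 1_000_000)
  uri = URI.parse(url)
  body = +""
  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request(Net::HTTP::Get.new(uri)) do |response|
      response.read_body do |chunk|
        body << chunk
        if body.bytesize > max_bytes
          raise ResponseTooLarge, "response exceeded #{max_bytes} bytes"
        end
      end
    end
  end
  body
end
```

Raising inside the `read_body` block propagates out of `Net::HTTP.start`, which closes the connection, so the server cannot keep feeding data. Note this caps the bytes actually read; a hostile server can still lie in its Content-Length header, which is why the check is on received data rather than on the header.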

-Seth
This topic is locked and cannot be replied to.