Forum: Ruby on Rails
Topic: Combating site suckers

Announcement (2017-05-07): This forum is now read-only, since I unfortunately no longer have the time to support and maintain it. Please see other Rails- and Ruby-related community platforms.
CSN (Guest)
on 2006-01-04 01:32
(Received via mailing list)
Does anybody have any tips for combating site suckers
- of the variety that use wget and other site
downloaders; hit sites hard; ignore robots.txt; and
even set User-agent to commonly used browser strings?
I'm using Lighty and SCGI. Back with Apache, there
were various modules that could limit total bandwidth,
but I never found anything that worked well.
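For Lighty specifically, a minimal sketch of what such limits could look like, assuming lighttpd 1.4.x with mod_evasive available; the directive names are lighttpd's connection-cap and traffic-shaping options, and the numeric limits are purely illustrative values, not recommendations:

```
# Illustrative lighttpd.conf fragment (assumes lighttpd 1.4.x).

# mod_evasive caps the number of simultaneous connections per client IP,
# which blunts aggressive parallel downloaders regardless of User-agent.
server.modules += ( "mod_evasive" )
evasive.max-conns-per-ip = 5

# Traffic shaping: cap throughput per connection and for the whole
# server (values are in kilobytes per second; 0 means unlimited).
connection.kbytes-per-second = 64
server.kbytes-per-second = 2048
```

This only slows suckers down rather than blocking them; pairing it with log analysis and firewall-level bans of repeat offenders is the usual complement.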
