Combating site suckers

Does anybody have any tips for combating site suckers of the
variety that use wget and other site downloaders, hit sites
hard, ignore robots.txt, and even set User-Agent to commonly
used browser strings? I'm using Lighty and SCGI. Back with
Apache, there were various modules that could limit total
bandwidth, but I never found anything that worked well.
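For what it's worth, lighttpd 1.4 ships built-in throttling directives and a mod_evasive module that can blunt aggressive downloaders at the server level, independent of User-Agent spoofing. A minimal sketch for lighttpd.conf (the numeric limits below are illustrative values, not recommendations):

```
# Load the per-IP connection limiter
server.modules += ( "mod_evasive" )

# Cap simultaneous connections from any single IP;
# excess requests get a 403 (site suckers open many at once)
evasive.max-conns-per-ip = 4

# Throttle each connection to 64 KB/s so one client
# can't saturate the pipe
connection.kbytes-per-second = 64

# Optional: cap the server's total outbound rate (KB/s)
server.kbytes-per-second = 512
```

This won't identify a sucker that politely opens one connection, but combined with log-watching (e.g. temporarily firewalling IPs that fetch hundreds of pages per minute) it limits the damage any single client can do.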

