Hi Steve,
I would like to warn you about abusive bots, i.e. bots
that do not obey robots.txt. Such bots can eat up your bandwidth
fast.
On one of my servers last month (October), total bandwidth usage
was 2300 GB (serving just text and images). A detailed log scan showed
that most of that bandwidth went to bots in Asian countries.
Scraping is really “hot” these days, so you will want to identify the
various abusive bots and list them in your robots.txt
file.
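An entry like the following blocks a given bot by user agent (“BadBot”
here is a placeholder, not a real crawler name):
User-agent: BadBot
Disallow: /
Of course, this only works for bots polite enough to read the file.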
Since I manage multiple domains on dedicated clusters, the solution
for me was to ban these bots’ user agents with mod_rewrite. If you
would like, I can post a copy of my full mod_rewrite banned-user-agent
list.
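To give you the idea, a minimal sketch in .htaccess looks something
like this (“BadBot” and “EvilScraper” are placeholder names, not
entries from my actual list):
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [NC]
RewriteRule .* - [F,L]
Any request whose user agent matches one of the conditions gets a 403
Forbidden before it ever touches your application.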
I recommend putting a Crawl-delay directive in your robots.txt file
(if you have a large website); otherwise bots like MSN can hit your
site hard:
User-agent: *
Crawl-delay: 17
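Note that not all crawlers honor Crawl-delay (Googlebot, for one,
ignores it), so it only helps with the bots that respect robots.txt in
the first place.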
Not that this answers your question, but I thought it might help.
Frank
Steve O. [email protected] wrote:
I’d been ignoring this error message in my log for a while:
ActionController::RoutingError (Recognition failed for “/robots.txt”):
I had never touched robots.txt, so I decided to make it a proper
robots.txt file.
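(Assuming a standard Rails setup, the error shows up because there is
no robots.txt in the public/ directory, so the request falls through
to Rails routing; dropping a static file at public/robots.txt makes it
go away.)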
I found this great article…
http://www.ilovejackdaniels.com/seo/robots-txt-file/
…where Dave explains the ins and outs of the file.
Before I changed mine, I thought I’d poll the group to see if anyone
had any good thoughts on the subject, like any Rails-specific
excludes, and whether some samples could be posted.
Mine was going to look like this:
User-agent: *
Disallow: /404.php
Thanks,
Steve
http://www.smarkets.net