Limit robots

Hello,

Is there a chance to limit robots to 1 request per second?
The config below does not work: [emerg] "limit_req" directive is not
allowed here

Thanks a lot
Markus

http {
    limit_req_zone $http_user_agent zone=useragenttrack:1m rate=1r/s;

    server {
        listen 80;

        location / {
            if ($http_user_agent ~* "[a-z]bot[^a-z]") {
                limit_req zone=useragenttrack burst=100 nodelay;
            }
        }
    }
}

Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,221162,221162#msg-221162

On 13 Jan 2012 13h36 WET, [email protected] wrote:

> Hello,
>
> Is there a chance to limit robots to 1 request per second?
> The config below does not work: [emerg] "limit_req" directive is
> not allowed here

Yes. Try:

At the http level:

limit_req_zone $http_user_agent zone=useragenttrack:1m rate=1r/s;

map $http_user_agent $is_bot {
    default 0;
    ~[a-z]bot[^a-z] 1;
}
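nginx regexes are PCRE, so for this simple pattern the map above can be sanity-checked against sample User-Agent strings with Python's re module (note the map regex as written is case-sensitive, unlike the ~* match in the original if block):

```python
import re

# Same pattern as the nginx "map" block above.
BOT_RE = re.compile(r"[a-z]bot[^a-z]")

def is_bot(user_agent: str) -> bool:
    """Mirror of the $is_bot map: True if the UA matches, else False."""
    return BOT_RE.search(user_agent) is not None

# "ebot/" matches: lowercase letter, then "bot", then a non-letter.
print(is_bot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True
# No "bot" substring at all.
print(is_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
# "robots" does not match: the character after "bot" is "s", a letter.
print(is_bot("robots"))  # False
```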

At the server level:

location / {
    error_page 418 @bots;

    if ($is_bot) {
        return 418;
    }
    ...
}

location @bots {
    limit_req zone=useragenttrack burst=100 nodelay;
}

Cf: http://wiki.nginx.org/HttpLimitReqModule
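For intuition, limit_req is a leaky-bucket limiter. Here is a rough Python model of how rate=1r/s with burst=100 nodelay behaves (my reading of the docs, not nginx's actual implementation; time is passed in explicitly to keep the model deterministic):

```python
class LeakyBucket:
    """Rough model of nginx limit_req + nodelay: accumulated excess
    drains at `rate` requests per second; up to `burst` excess requests
    pass immediately; beyond that, reject (nginx returns 503)."""

    def __init__(self, rate: float, burst: int, now: float = 0.0):
        self.rate = rate
        self.burst = burst
        self.excess = 0.0
        self.last = now

    def allow(self, now: float) -> bool:
        # Excess drains continuously at `rate` per second.
        self.excess = max(0.0, self.excess - (now - self.last) * self.rate)
        self.last = now
        if self.excess > self.burst:
            return False            # over rate + burst: reject
        self.excess += 1.0          # with nodelay, the request passes at once
        return True

bucket = LeakyBucket(rate=1.0, burst=100)
# 102 simultaneous requests: the first 101 pass (1 on-rate + 100 burst),
# the 102nd is rejected.
results = [bucket.allow(now=0.0) for _ in range(102)]
print(results.count(True))   # 101
print(results[-1])           # False
```

Without nodelay, the excess requests would instead be queued and delayed to keep the outgoing rate at 1r/s; nodelay serves them immediately and only rejects once the burst is exhausted.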

— appa

Your solution works great for rate limiting, but I need requests that
are not limited to still return a 200 status code. Any ideas?

Thanks, that worked great for me.

On 17 Jan 2012 17h15 WET, [email protected] wrote:

> Your solution works great for rate limiting, but I need requests
> that are not limited to still return a 200 status code. Any ideas?

Yes. Try replacing:

  error_page 418 @bots;

with:

  error_page 418 =200 @bots;

I.e.:

location / {
    error_page 418 =200 @bots;

    if ($is_bot) {
        return 418;
    }
    ...
}

location @bots {
    limit_req zone=useragenttrack burst=100 nodelay;
}

— appa
