Hi,
I want to block some robots, such as Google, from accessing some URLs.
Here is my config:
# Block the useless URLs that the robots keep crawling; they barge
# in all the time as if there were something there
set $test "";
if ($request_filename ~* calendar) {
    set $test "blo";
}
if ($http_user_agent ~* google\.com) {
    set $test "${test}ck1";
}
if ($test = "block1") {
    return 444;
}
This is the approach described in the nginx documentation for "AND"-ing conditions,
but it does not work. What am I doing wrong?
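For reference, a minimal sketch of how the same trick would look in a complete
server block (the server_name is a placeholder); set and if are only valid
inside a server or location context, so the snippet above needs to live inside one:

    server {
        listen 80;
        server_name example.com;  # placeholder

        # Emulate "URL matches calendar AND user agent mentions google.com"
        # by assembling a flag from two independent if blocks
        set $test "";
        if ($request_filename ~* calendar) {
            set $test "blo";
        }
        if ($http_user_agent ~* google\.com) {
            set $test "${test}ck1";
        }
        # Only when both matched does the flag read "block1"
        if ($test = "block1") {
            return 444;  # close the connection without a response
        }
    }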
Hello,
Why not use a robots.txt file? http://www.robotstxt.org/
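For example, a minimal robots.txt that asks compliant crawlers to skip the
calendar URLs (assuming they are served under /calendar) would be:

    User-agent: *
    Disallow: /calendar

It has to be served from the site root as /robots.txt.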
On Sun, Jul 18, 2010 at 22:34, toto2008 [email protected] wrote:
Hello,
Why not use a robots.txt file? http://www.robotstxt.org/
It won't block bots that don't honor it (nor would anything block bots
that present themselves as Mozilla).
--
()  ascii-rubanda kampajno - kontraŭ html-a retpoŝto
/\  ascii ribbon campaign - against html e-mail
The robotstxt.org site says:
“robots can ignore your /robots.txt. Especially malware robots that
scan the web for security vulnerabilities, and email address
harvesters used by spammers will pay no attention.”
I am experimenting with nginx and learning its rewrite features, so I am
trying to write rewrite statements. Still, I have no luck.
2010/7/19 Nuno Magalhães [email protected]:
On Thu, Jul 15, 2010 at 01:51:36AM +0300, bvidinli wrote:
}
this is the approach described in the nginx documentation for "AND"-ing conditions,
but it does not work.
location ~ calendar {
    if ($http_user_agent ~* google\.com) {
        return 444;  # close the connection without sending a response
    }
    ...
}
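One way to verify the block, assuming the server runs locally and serves a
calendar URL; Googlebot's real user-agent string contains "google.com", so
the regex above matches it:

    curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://localhost/calendar

Since return 444 makes nginx close the connection without sending anything
back, curl should report an empty reply from the server, while the same
request with a normal user agent should still get a response.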
…
--
Igor S.
http://sysoev.ru/en/