Forum: NGINX Serving an alternate robots.txt for SSL requests.

Juan Fco. Giordana (Guest)
on 2009-01-07 16:17
(Received via mailing list)
Hello list,

I'm using nginx-0.6.34 with the try_files patch applied and I'm trying
to serve an alternate robots.txt for requests on port 443 so pages under
secure connections are not shown by web crawlers.

I've tried many different approaches and couldn't get any of them to
work as I expected:

  if ($server_port = 443)
  if ($remote_port = 443)
  if ($scheme = https)
  With and without the location block.
  if () blocks inside @try_files rule.
  rewrite flags: break, last, permanent.
  All rewrite rules disabled except the one in question.

server {
     [...]
     location /robots.txt {
         if ($server_port = 443) {
             rewrite ^robots\.txt$ robots_ssl.txt last;
         }
     }
     [...]
}

Most of these approaches always returned robots.txt on both SSL and
non-SSL requests, while others returned a 404 error under SSL. None of
them served robots_ssl.txt.

Am I doing something wrong?

Thanks!
Nick Pearson (Guest)
on 2009-01-07 16:32
(Received via mailing list)
Hi Juan,

Try using two server directives -- one for http and one for https.  The
server directive chosen depends on the port that is requested.
Something like this:

server {
    listen  80;  # http
    server_name  www.yoursite.com;
    [...]
    location /robots.txt {
        break;
    }
}
server {
    listen  443;  # https
    server_name  www.yoursite.com;
    [...]
    location /robots.txt {
        rewrite (.*)  /robots_ssl.txt;
    }
}
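
For the 443 block to actually terminate SSL on nginx 0.6.x, the elided
[...] also needs the ssl directives. A minimal sketch of how that block
might look -- the certificate paths here are placeholders, not taken
from the original post:

server {
    listen  443;  # https
    server_name  www.yoursite.com;

    # SSL setup (nginx 0.6.x style); adjust paths to your own certificate.
    ssl  on;
    ssl_certificate      /etc/nginx/ssl/yoursite.crt;
    ssl_certificate_key  /etc/nginx/ssl/yoursite.key;

    location /robots.txt {
        # Internal rewrite: every request for /robots.txt is rewritten to
        # /robots_ssl.txt, and nginx then serves that file from the root.
        rewrite (.*)  /robots_ssl.txt;
    }
}

Once both blocks are in place, you can check which file each scheme
serves with something like:

    curl http://www.yoursite.com/robots.txt
    curl -k https://www.yoursite.com/robots.txt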

