Re: Serving an alternate robots.txt for SSL requests

Thank you, Nick, for your help.

I’ve followed your suggestions and it worked as expected.

I’ve changed the rewrite rule since I don’t need to capture anything
there.

server {
    listen 443;
    […]
    location = /robots.txt {
        rewrite ^ /robots_ssl.txt last;
    }
}

Does anybody know if this is possible to do within a single server
context that handles both protocols in version 0.7.*?

Thanks.


I prefer to put my vhost definitions in a separate file, so my version of
this would look something like this:

server {
    listen 80;
    include vhost.d/vhost.conf;
}

server {
    listen 443;
    include ssl.conf;

    location = /robots.txt { ... }

    include vhost.d/vhost.conf;
}
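For reference, a minimal sketch of what the shared vhost.d/vhost.conf might contain (the server_name, root, and location here are illustrative assumptions, not taken from the thread):

```nginx
# vhost.d/vhost.conf -- included by both the port-80 and port-443 servers,
# so the shared configuration lives in one place. All names/paths are examples.
server_name example.com;
root /var/www/example.com;

location / {
    index index.html;
}
```

Because both server blocks include the same file, any change to the shared configuration applies to HTTP and HTTPS at once.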

Cheers

Dave

I actually do the same. The thing is, I just don’t like that.

I tend to use the exact same configuration for both SSL and non-SSL
connections, so users can browse the site securely if they want to.

But duplicated configuration seems to lead to problems.

On Tue, Jan 13, 2009 at 09:15:07PM -0200, Juan Fco. Giordana wrote:

        rewrite ^ /robots_ssl.txt last;
    }
}

Does anybody know if this is possible to do within a single server
context that handles both protocols in version 0.7.*?

If your servers differ only in this part, then in 0.7 you can:

server {
    listen 80;
    listen 443 default ssl;

    location = /robots.txt {
        if ($server_port = 443) {              # or ($scheme = https)
            rewrite ^ /robots_ssl.txt last;    # or "break;"
        }
    }

    ...

If the servers have many differences, then it’s better to use
separate servers. In that case you do not need rewrite; just use alias:

server {
    listen 443;

    location = /robots.txt {
        alias /path/to/robots_ssl.txt;
    }
Yet another way (better than if/rewrite):

map $scheme $robots {
    default  robots.txt;
    https    robots_ssl.txt;
}

server {
    listen 80;
    listen 443 default ssl;

    location = /robots.txt {
        alias /path/to/$robots;
    }

    ...
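As a rough illustration of what the map block does here (plain Python mimicking the lookup, not nginx itself): $robots is chosen by matching $scheme against the map entries, with the `default` entry as the fallback.

```python
# Mimics the nginx "map $scheme $robots" block above:
# https maps to robots_ssl.txt; anything else (e.g. http) falls
# back to the default entry, robots.txt.
def robots_file(scheme: str) -> str:
    mapping = {"https": "robots_ssl.txt"}
    return mapping.get(scheme, "robots.txt")

print(robots_file("http"))   # robots.txt
print(robots_file("https"))  # robots_ssl.txt
```

The alias then serves /path/to/robots.txt or /path/to/robots_ssl.txt depending on which value was selected, with no if/rewrite involved.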

That is AWESOME Igor.

Cheers

Dave

Thanks again Igor,

Yes, that was exactly the first approach I tried. A couple of months ago I
was quite impressed when that feature came out, but I needed to downgrade
to 0.6 before going to production :)

I knew it was possible but didn’t remember/know whether this was
supported in 0.6.

Thanks a lot, and I hope this clarifies things for others.