Hey guys, I have run into a problem with the geo module. I have set up a
geo list containing a large number of IPs which we need whitelisted so
they can get through to the upstream. These IPs belong to search
engines. Currently the list is set up the following way…
geo $remote_addr $search {
    default 0;
    include geoip-search.conf;
}
The geoip-search.conf file contains the list of IPs in the following
format…
114.111.36.26/32 search;
114.111.36.28/32 search;
114.111.36.29/32 search;
114.111.36.30/32 search;
114.111.36.31/32 search;
114.111.36.32/32 search;
119.63.193.100/32 search;
119.63.193.101/32 search;
119.63.193.102/32 search;
119.63.193.103/32 search;
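Side note: since geo accepts any CIDR prefix length, I know runs of consecutive /32 entries could in principle be collapsed into larger blocks, which would also shrink the list. For example, the sample above could be written as:

```nginx
114.111.36.26/32  search;
114.111.36.28/30  search;   # covers .28 through .31
114.111.36.32/32  search;
119.63.193.100/30 search;   # covers .100 through .103
```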
Then, inside the configuration, we do the following, which was based on
recommendations from Igor…
if ($search = search) {
    proxy_pass http://LB_HTTP_UPSTREAM;
    break;
}
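One thing I was thinking of trying, to check whether the geo match itself is failing, is echoing the geo result back in a response header so we can hit the site from a known search-engine IP and see what $search resolves to. Just a sketch; the location and header name here are made up for debugging:

```nginx
# Debug only: expose the value that geo assigned to $search
# for the connecting client address.
location = /geo-debug {
    add_header X-Search-Match "$search";
    return 200;
}
```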
Below that we also have some security logic that checks for a cookie and
serves a different page if no cookie is present. We want the search
engine IPs to make it through to the upstream, but that no longer seems
to be happening. We had no problems in the past… Perhaps it is due to
something in 0.8.53: we upgraded to that a while ago, and some time
later we started getting complaints about Google bots not getting
through. Our list contains about 40,000 lines, covering well over
100,000 IPs. Anyone have any ideas on what could be causing this?
Posted at Nginx Forum: