Question

I want to prevent proxy use on my website. I know this is never fully reliable (e.g. with anonymous proxies), but I still want to check for the presence of the following headers:

HTTP_CLIENT_IP
HTTP_FORWARDED
HTTP_X_FORWARDED_FOR
HTTP_VIA
HTTP_PROXY_CONNECTION
HTTP_X_PROXY_ID

and ban the requesting IP address if at least one of them is present. But I am worried that Googlebot, or other search engine crawlers, might send one of these headers when crawling, and I don't want to accidentally ban Google. So the question is: can search engines, and specifically Googlebot, send any of the above headers when making a request to index the website?
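The check described above can be sketched as follows. This is a minimal illustration in Python, assuming a WSGI-style `environ` dict where headers appear under `HTTP_*` keys; the function name and example addresses are illustrative, not part of any real ban system:

```python
# Headers whose presence often indicates a proxied request
# (mirrors the list in the question).
PROXY_HEADERS = (
    "HTTP_CLIENT_IP",
    "HTTP_FORWARDED",
    "HTTP_X_FORWARDED_FOR",
    "HTTP_VIA",
    "HTTP_PROXY_CONNECTION",
    "HTTP_X_PROXY_ID",
)

def looks_like_proxy(environ):
    """Return True if any of the proxy-revealing headers is present."""
    return any(h in environ for h in PROXY_HEADERS)

# Example: a direct request vs. one routed through a proxy.
direct = {"REMOTE_ADDR": "203.0.113.7"}
proxied = {"REMOTE_ADDR": "198.51.100.9", "HTTP_VIA": "1.1 squid"}
```

Note that, as the question already concedes, this only catches proxies that announce themselves; an anonymous proxy can simply omit all of these headers.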

Thanks


Solution

I have no information about other search engines, but for Google it seems it does not: https://productforums.google.com/forum/#!topic/webmasters/zLzCWlmb3so

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow