Question

One of our SharePoint 2013 sites cannot be crawled by Googlebot. Google reports that it cannot reach robots.txt. When I look in Chrome Developer Tools, the server appears to return a 304 status code.

How can I solve this problem?

EDIT: When I request the URL without "www." (xxxxx.com/robots.txt), the server returns 200 OK. This is the robots.txt it serves:

User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: http://www.xxxxxxx.com:80/sitemap.xml
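A 304 in DevTools is not necessarily the real problem: browsers routinely send conditional requests (If-Modified-Since / If-None-Match) for cached resources, and 304 is the normal reply to those. What matters is what a first-time, unconditional request gets on each host. Here is a minimal Python sketch for that comparison (the hosts are placeholders for the masked domain, and the Googlebot User-Agent header is set only in case the server varies on it):

import urllib.error
import urllib.request

# Placeholder hosts standing in for the masked domain above.
for base in ("http://www.example.com", "http://example.com"):
    # Plain GET with no If-Modified-Since / If-None-Match headers;
    # a 304 here would point at a misbehaving server or proxy, since
    # 304 is only a valid reply to a conditional request.
    req = urllib.request.Request(
        base + "/robots.txt",
        headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(base, resp.status)
    except urllib.error.HTTPError as err:
        # urllib raises for any non-2xx status, including 304.
        print(base, err.code)
    except urllib.error.URLError as err:
        # DNS failures and dropped connections (e.g. a firewall) land here.
        print(base, "unreachable:", err.reason)

If one host times out while the other answers, the cause is at the network level rather than in the robots.txt content.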

Solution

SOLVED:

The firewall was blocking the crawler. After allowing the crawler traffic through the firewall, Googlebot could reach robots.txt.

Thanks.

OTHER TIPS

Try this:

User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: http://www.xxxxxxx.com/sitemap.xml

Note that the Sitemap directive must be a fully qualified URL; a relative value such as "sitemap.xml" is not valid under the Sitemaps protocol.
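The rules themselves can be sanity-checked locally with Python's standard-library robots.txt parser. One caveat for this local check only: urllib.robotparser applies rules in file order (first match wins), unlike Googlebot's longest-match semantics, so the redundant "Allow: /" line is dropped below, where it would otherwise mask every Disallow (the test paths are just SharePoint-style examples):

import urllib.robotparser

# The suggested rules, minus the redundant "Allow: /" (everything not
# disallowed is allowed anyway, and Python's first-match parser would
# otherwise let "Allow: /" override the Disallow lines).
rules = """\
User-agent: *
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/Pages/default.aspx"))      # True
print(rp.can_fetch("Googlebot", "/_layouts/15/start.aspx"))  # False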