Question
One of our SharePoint 2013 sites cannot be crawled by Googlebot. Google reports that it is unable to reach robots.txt. When I check in Chrome developer tools, the server appears to return a 304 status code.
How can I solve this problem?
EDIT: When I request the URL without "www." (xxxxx.com/robots.txt), the server returns 200 OK.
User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: http://www.xxxxxxx.com:80/sitemap.xml
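As a sanity check on the rules themselves (separate from the reachability problem), the file above can be fed to Python's standard-library robots.txt parser. One caveat: `urllib.robotparser` applies rules in file order rather than Google's longest-match rule, so the blanket `Allow: /` line is omitted in this sketch (allowing everything not disallowed is the default anyway):

```python
import urllib.robotparser

# The Disallow rules from the question's robots.txt.
# "Allow: /" is left out: the stdlib parser checks rules in order,
# so a leading blanket Allow would mask the Disallow lines.
RULES = [
    "User-agent: *",
    "Disallow: /_layouts/",
    "Disallow: /_vti_bin/",
    "Disallow: /_catalogs/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES)

# Googlebot falls under the "*" group, so these calls approximate
# what Google is permitted to fetch once it can reach the file.
print(rp.can_fetch("Googlebot", "/"))            # site root: allowed
print(rp.can_fetch("Googlebot", "/_layouts/x"))  # SharePoint system path: blocked
```

This only validates the rule logic; it does not reproduce the original problem, which was that the firewall blocked the crawler from fetching the file at all.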
Solution
SOLVED:
Needed to give the crawler service access through the firewall.
Thanks.
Other tips
Try this:
User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: sitemap.xml
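One caution about the suggestion above: the sitemaps protocol requires the Sitemap directive in robots.txt to be a full absolute URL, so a bare relative filename may be ignored by crawlers. Assuming the same host as in the question, the line would look like:

```
Sitemap: http://www.xxxxxxx.com/sitemap.xml
```

The `:80` port suffix from the original file is redundant for plain HTTP and can safely be dropped.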