Question
One of our SharePoint 2013 sites cannot be crawled by Googlebot; Google reports that it cannot reach robots.txt. Looking at the request in Chrome developer tools, the server returns a 304 status code.
How can I solve this problem?
EDIT: When I request the URL without the "www." prefix (xxxxx.com/robots.txt), the server returns 200 OK.
User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: http://www.xxxxxxx.com:80/sitemap.xml
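The rules above are themselves valid, so the crawl failure is about reachability rather than syntax. As a quick sanity check, Python's built-in robotparser can confirm what these rules permit. Note that Python applies the first matching rule, so the Disallow lines are listed before the blanket Allow in this sketch, and the sample paths are illustrative, not taken from the actual site:

```python
from urllib.robotparser import RobotFileParser

# The rules from the question, minus the Sitemap line (robotparser ignores
# it anyway). Python applies the first matching rule, so the Disallow lines
# come before the blanket Allow here.
rules = """\
User-agent: *
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Sample paths are illustrative, not from the actual site.
print(parser.can_fetch("Googlebot", "/Pages/default.aspx"))     # True
print(parser.can_fetch("Googlebot", "/_layouts/15/start.aspx")) # False
```

If this parses and answers as expected, the file's content is fine and the investigation can focus on why the server (or something in front of it) answers differently for the "www." host.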
Solution
SOLVED:
The crawler service needed to be allowed through the firewall.
Thanks.
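Once the firewall rule is in place, it is worth verifying that robots.txt is reachable the way a crawler would fetch it: a plain GET with Googlebot's user agent that comes back 200 with the expected body. A minimal sketch, using a local stand-in server instead of the real SharePoint host (the handler, port, and robots.txt content here are assumptions for the demo):

```python
import http.server
import threading
import urllib.request

# robots.txt content the stand-in server will return (placeholder rules).
ROBOTS = b"User-agent: *\nAllow: /\nDisallow: /_layouts/\n"

class RobotsHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # keep the demo output quiet
        pass

# Serve on an ephemeral local port; in the real check you would target the
# site's own robots.txt URL (both the "www." and bare-domain variants).
server = http.server.HTTPServer(("127.0.0.1", 0), RobotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/robots.txt" % server.server_address[1]
# Fetch with Googlebot's user agent, as the crawler would.
request = urllib.request.Request(url, headers={"User-Agent": "Googlebot"})
with urllib.request.urlopen(request) as response:
    status = response.status
    body = response.read().decode("utf-8")
server.shutdown()

print(status)              # 200 means the crawler can read the file
print("Disallow" in body)  # True: the expected rules came back
```

Pointing the same request at both hostname variants of the real site should now return 200 for each once the firewall permits the crawler through.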
Other tips
Try this (the Sitemap directive must be an absolute URL per the sitemaps protocol, so only the unnecessary :80 port is dropped):
User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: http://www.xxxxxxx.com/sitemap.xml