Question

One of our SharePoint 2013 sites cannot be crawled by Googlebot. Google reports that it is unable to reach robots.txt. When I look in Chrome developer tools, the server appears to return a 304 status code.

How can I solve this problem?

EDIT: When I request the URL without "www." (xxxxx.com/robots.txt), the server returns 200 OK.

User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/

Sitemap: http://www.xxxxxxx.com:80/sitemap.xml
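As a quick sanity check of the rules themselves, they can be parsed with Python's standard-library robots.txt parser. The paths below are hypothetical examples, and the blanket `Allow: /` is moved after the `Disallow` lines because `urllib.robotparser` applies rules in file order (first match wins), unlike Google's longest-match semantics:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the site's robots.txt, fed in directly since
# the real domain is a placeholder. Disallow lines come first so
# the order-sensitive stdlib parser evaluates them before Allow: /.
rules = """\
User-agent: *
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A regular content page (hypothetical path) should be crawlable.
print(rp.can_fetch("Googlebot", "/Pages/default.aspx"))      # True
# A SharePoint system path should be blocked.
print(rp.can_fetch("Googlebot", "/_layouts/15/start.aspx"))  # False
```

If both checks behave as expected, the rules are fine and the crawl failure lies elsewhere (e.g. the server or firewall not answering Googlebot at all, as in the accepted solution below).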
Was it helpful?

Solution

SOLVED:

The crawler needed to be granted access through the firewall.

Thanks.

Other tips

Try this:

User-agent: *
Allow: /
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
Sitemap: http://www.xxxxxxx.com/sitemap.xml
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow