Question

Below is the entire content of my robots.txt file.

User-agent: *
Disallow: /marketing/wp-admin/
Disallow: /marketing/wp-includes/

Sitemap: http://mywebsite.com/sitemap.xml.gz

It is apparently the one generated by WordPress; I haven't manually created one.

Yet when I signed up for Google Webmaster Tools today, this is the robots.txt content it reported seeing:

User-agent: *
Disallow: /

... So ALL my URLs are blocked!

In WordPress, under Settings > Reading > Search Engine Visibility, "Discourage search engines from indexing this site" is not checked; I unchecked it fairly recently. (Google Webmaster Tools says it downloaded my robots.txt file on Nov 13, 2013.)

...So why is Google still reading the old version, where all my pages are disallowed, instead of the new one?

Does it take a while? Should I just be patient?

Also, what is the ".gz" at the end of my Sitemap line? I'm using the Yoast All-in-One SEO Pack plugin, so I'm guessing the plugin added the ".gz", whatever that is.


Solution

You can ask Googlebot to recrawl your robots.txt after you've changed it. See "Ask Google to crawl a page or site" for instructions.
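If you want to verify what Googlebot would see right now, here is a minimal Python sketch (using the mywebsite.com address from your question as a placeholder) that fetches the live robots.txt and checks whether a given URL is allowed:

# Fetch the live robots.txt and test a few URLs against its rules.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://mywebsite.com/robots.txt")
rp.read()  # downloads and parses the file currently served by the site

# True means the current rules allow Googlebot to crawl that URL
print(rp.can_fetch("Googlebot", "http://mywebsite.com/"))
print(rp.can_fetch("Googlebot", "http://mywebsite.com/marketing/wp-admin/"))

If the second line prints False and the first prints True, the live file matches the WordPress-generated version you posted, and the "Disallow: /" Google reported is simply a stale copy.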

The Sitemap file tells Googlebot more about the structure of your site, and allows it to crawl more effectively. See About Sitemaps for more info.
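To give a rough idea of what a sitemap contains, the sketch below parses a small made-up sitemap and prints the page URLs it lists; a real sitemap just has more <url> entries (plus optional metadata such as <lastmod>):

import xml.etree.ElementTree as ET

# A made-up two-entry sitemap, for illustration only.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://mywebsite.com/</loc></url>
  <url><loc>http://mywebsite.com/marketing/about/</loc></url>
</urlset>"""

root = ET.fromstring(sample)
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
for loc in root.iter(ns + "loc"):  # each <loc> element holds one page URL
    print(loc.text)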

The .gz extension just tells Googlebot that the generated sitemap file is gzip-compressed.
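You can confirm this yourself: the file is ordinary gzip-compressed XML. A small sketch, again using the sitemap URL from your robots.txt (assuming the server delivers the raw .gz file):

import gzip
import urllib.request

# Download the compressed sitemap and decompress it locally.
with urllib.request.urlopen("http://mywebsite.com/sitemap.xml.gz") as resp:
    compressed = resp.read()

xml_text = gzip.decompress(compressed).decode("utf-8")
print(xml_text[:500])  # the start of the plain sitemap XML inside the .gz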
