Question

We have a global client with five country sites:

domain.com/us/
domain.com/uk/
domain.com/de/

..

We have set up sitemap.xml files inside each country folder:

domain.com/us/sitemap-us.xml
domain.com/uk/sitemap-uk.xml
domain.com/de/sitemap-de.xml

..

Now, will Google pick up these sitemaps automatically inside each folder, or do we have to specify each one in the parent robots.txt? I ask because, according to Google's help, robots.txt can only exist once, at the domain root. So even if we put a robots.txt file inside each country's folder, it's useless -- the respective sitemap won't be picked up.

How should we tell Google to pick them up?


Solution

You should gather all your sitemaps into a single sitemap index (.xml) file and submit it to Google Webmaster Tools. If you also want the sitemap to be discovered by other search engines without manual submission, name the index file sitemap.xml and place it in the site's root, e.g.: site.com/sitemap.xml
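For illustration, a sitemap index along these lines could be placed at the site root; the `<loc>` URLs below reuse the example country sitemaps from the question, and domain.com is a placeholder for your actual domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: each <sitemap> entry points to one country sitemap.
     Replace domain.com with your real domain. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://domain.com/us/sitemap-us.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://domain.com/uk/sitemap-uk.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://domain.com/de/sitemap-de.xml</loc>
  </sitemap>
</sitemapindex>
```

As an alternative, the single robots.txt at the domain root can declare each sitemap directly with `Sitemap:` lines (e.g. `Sitemap: https://domain.com/us/sitemap-us.xml`); unlike robots.txt itself, that directive may point to a sitemap in any folder.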

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow