Question

We have multiple websites served from the same Sitecore instance and the same production web server. Each website has its own primary and Google News sitemap, and up to now we have included a Sitemap directive for each in the single robots.txt file served by the .NET site.

Our SEO expert has raised the presence of different domains in the same robots.txt as a possible issue, and I can't find any documentation that definitively states it one way or the other. Thank you.


Solution

This should be fine for Google, at least. Other search engines, such as Bing, may not support it, however.

According to https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt:

sitemap: [absoluteURL]

[absoluteURL] points to a Sitemap, Sitemap Index file or equivalent URL. The URL does not have to be on the same host as the robots.txt file. Multiple sitemap entries may exist. As non-group-member records, these are not tied to any specific user-agents and may be followed by all crawlers, provided it is not disallowed.
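In other words, a single robots.txt may list sitemaps hosted on other domains. A combined file for several sites could look like this (the hostnames below are illustrative, not from the question):

```
User-agent: *
Disallow:

Sitemap: https://www.site-a.example/sitemap.xml
Sitemap: https://www.site-a.example/sitemap-news.xml
Sitemap: https://www.site-b.example/sitemap.xml
Sitemap: https://www.site-b.example/sitemap-news.xml
```

Per the quoted documentation, Google treats each Sitemap line as a non-group-member record, so the entries apply regardless of which User-agent groups precede them.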

OTHER TIPS

The best way to achieve this is to handle robots.txt from the Sitecore content tree.

We have a similar structure, delivering multiple websites from a single Sitecore instance.

I have written a blog post about exactly this scenario; please find it below.

http://darjimaulik.wordpress.com/2013/03/06/how-to-create-handler-in-sitecore/
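The linked post implements this as a C#/ASP.NET handler backed by the Sitecore content tree. The core idea, resolving a different robots.txt body per requesting host so each site advertises only its own sitemaps, can be sketched as follows. This is a minimal, language-agnostic illustration in Python, not the post's actual code; the host-to-content mapping is hypothetical.

```python
# Sketch: serve per-site robots.txt content keyed on the request's Host header.
# In Sitecore this mapping would come from the content tree; here it is a
# hard-coded, hypothetical dictionary for illustration only.

ROBOTS_BY_HOST = {
    "www.site-a.example": (
        "User-agent: *\n"
        "Sitemap: https://www.site-a.example/sitemap.xml\n"
    ),
    "www.site-b.example": (
        "User-agent: *\n"
        "Sitemap: https://www.site-b.example/sitemap.xml\n"
        "Sitemap: https://www.site-b.example/sitemap-news.xml\n"
    ),
}

# Fallback for unrecognised hosts: allow everything, advertise nothing.
DEFAULT_ROBOTS = "User-agent: *\nDisallow:\n"

def robots_for_host(host: str) -> str:
    """Return the robots.txt body for the requesting host.

    Strips any port and lowercases, so "WWW.SITE-A.EXAMPLE:443"
    matches the "www.site-a.example" entry.
    """
    hostname = host.split(":")[0].lower()
    return ROBOTS_BY_HOST.get(hostname, DEFAULT_ROBOTS)
```

The handler approach keeps one physical endpoint while each domain sees only its own Sitemap entries, which sidesteps the cross-domain question entirely.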

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow