I'd like to block all bots from crawling a subdirectory, http://www.mysite.com/admin, plus any files and folders in that directory. For example, there may be further directories inside /admin, such as http://www.mysite.com/admin/assets/img.

I'm not sure of the exact declarations to include in robots.txt to do this.

Should it be:

User-agent: *
Disallow: /admin/

Or:

User-agent: *
Disallow: /admin/*

Or:

User-agent: *
Disallow: /admin/
Disallow: /admin/*

Solution

Based on information available on the net (I can't retrieve it all, but some forums actually report the problem, as here and here, for example), I'd follow those who suggest never telling people or bots where the things we don't want them to look at live ("admin" sounds like sensitive content...). Since robots.txt is itself publicly readable, every Disallow line advertises the path to anyone who requests the file.
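To make that concrete, here is a minimal sketch of why this matters, using only Python's standard library (www.mysite.com is the placeholder domain from the question):

import urllib.request

# robots.txt is served to anyone who asks for it, so every Disallow
# line doubles as a public list of paths the site owner considers sensitive.
with urllib.request.urlopen("http://www.mysite.com/robots.txt") as resp:
    print(resp.read().decode())

So rather than relying on robots.txt to hide /admin, protect it with real access control (HTTP authentication or a login), and treat the robots.txt entry as a politeness hint for well-behaved crawlers only.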

After checking, I can confirm it's the first form you wrote. Disallow rules match by path prefix, so Disallow: /admin/ already blocks /admin/assets/img and anything else under that directory; the trailing * is redundant. Reference here
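If you want to sanity-check the rule locally, here is a quick sketch with Python's standard-library urllib.robotparser (the URLs are the question's placeholders):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# Everything under /admin/ is blocked for all user agents...
print(rp.can_fetch("*", "http://www.mysite.com/admin/"))            # False
print(rp.can_fetch("*", "http://www.mysite.com/admin/assets/img"))  # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("*", "http://www.mysite.com/index.html"))        # True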
