Question

I have a few questions about this robots.txt file.

User-agent: *
Disallow: /administrator/
Disallow: /css/
Disallow: /func/
Disallow: /images/
Disallow: /inc/
Disallow: /js/
Disallow: /login/
Disallow: /recover/
Disallow: /Scripts/
Disallow: /store/com-handler/
Disallow: /store/img/
Disallow: /store/theme/
Disallow: /store/StoreSys.swf
Disallow: config.php

This is going to block crawlers from all files inside each folder, right? Or do I have to add an asterisk at the end of each folder name?

I think this should do it, but I'm not sure whether I have to add Allow: / right after User-agent; I suppose it isn't needed.

Is anything wrong with this robots file?

PS: If someone can suggest a validation tool for local use, I would be glad.

Thanks.


Solution

It's fine as is, if I understand what you want. E.g.

/administrator/
/css/subpage

are both blocked, but

/foo 

is allowed, since Disallow values are plain path prefixes; no trailing asterisk is needed. Note that Allow is a less widely supported extension, designed only to counter a previous Disallow. You might use it if, for instance, despite your

Disallow: /images/

you decide you want a particular image allowed. So,

Allow: /images/ok_image

All other images remain blocked. See http://www.searchtools.com/robots/robots-txt.html for more info, including a list of checkers.
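
As for the PS: if you have Python installed, the standard library's urllib.robotparser is enough for a quick local check, no separate app needed. Below is a minimal sketch; the test paths are made up for illustration, and the inline rules are a trimmed-down version of your file. One caveat: CPython's parser applies rules first-match from top to bottom, while Google documents longest-match semantics, so overlapping Allow/Disallow pairs can evaluate differently than they would for Googlebot. The sketch also surfaces one nit in your file: Disallow: config.php has no leading slash, and a request path always starts with /, so a strict prefix match never fires. You probably want Disallow: /config.php.

from urllib.robotparser import RobotFileParser

# Trimmed-down version of the rules above, for illustration.
robots_txt = """\
User-agent: *
Disallow: /administrator/
Disallow: /css/
Disallow: /images/
Disallow: config.php
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Hypothetical test paths.
for path in ("/administrator/index.php", "/css/subpage", "/foo", "/config.php"):
    verdict = "allowed" if rp.can_fetch("*", path) else "blocked"
    print(path, "->", verdict)

# Expected output:
# /administrator/index.php -> blocked   (prefix match on /administrator/)
# /css/subpage -> blocked               (prefix match on /css/, no asterisk needed)
# /foo -> allowed                       (no rule matches)
# /config.php -> allowed                (the rule lacks the leading slash, so it never matches)

To test against your live file instead of an inline string, use rp.set_url("http://yoursite.example/robots.txt") followed by rp.read(); the URL here is a placeholder for your own site.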
