Question

Using robots.txt, is it possible to restrict robot access for specific query-string (parameter) values?

e.g.

http://www.url.com/default.aspx  #allow
http://www.url.com/default.aspx?id=6  #allow
http://www.url.com/default.aspx?id=7  #disallow

Solution

User-agent: *
Disallow: /default.aspx?id=7  # disallow
Disallow: /default.aspx?id=9  # disallow
Disallow: /default.aspx?id=33 # disallow

etc...

You only need to list the URLs that are disallowed; everything else is allowed by default.
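The rules above can be sanity-checked locally with Python's standard `urllib.robotparser`, which implements plain prefix matching (the paths and IDs are taken from the question):

```python
import urllib.robotparser

# The robots.txt rules from the answer above.
rules = """\
User-agent: *
Disallow: /default.aspx?id=7
Disallow: /default.aspx?id=9
Disallow: /default.aspx?id=33
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Anything not explicitly disallowed is allowed by default.
print(rp.can_fetch("*", "http://www.url.com/default.aspx"))       # True
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=6"))  # True
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=7"))  # False
```

One caveat of prefix matching: `Disallow: /default.aspx?id=7` also blocks `?id=70`, `?id=71`, and so on, so keep an eye on IDs that share a prefix.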

OTHER TIPS

You can also match on just the query variable, such as

Disallow: /default.aspx?id=*

or better still

Disallow: /?id=

Note that `*` wildcards are an extension honoured by major crawlers such as Googlebot, not part of the original robots.txt standard, and with plain prefix matching `Disallow: /?id=` only covers the site root; `Disallow: /*?id=` blocks the parameter on any path.
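To see how a wildcard-aware crawler interprets these rules, here is a minimal sketch that translates a `Disallow` path into a regex, assuming the `*` and `$` extensions supported by Googlebot (the `rule_to_regex` helper is hypothetical, not part of any library):

```python
import re

def rule_to_regex(rule):
    # Hypothetical helper: turn a robots.txt path rule into an anchored
    # regex, treating * as "any characters" and a trailing $ as end-of-URL.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"  # $ anchors the rule at the URL's end
    return re.compile("^" + pattern)

# Plain /?id= only matches a query string on the site root...
print(bool(rule_to_regex("/?id=").search("/default.aspx?id=7")))   # False
# ...while /*?id= matches an id parameter on any path.
print(bool(rule_to_regex("/*?id=").search("/default.aspx?id=7")))  # True
```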

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow