Question
I have a website that uses pretty URLs, and I need to block certain parameters from search engines using robots.txt.
My URL structure looks like this: http://example.com/vcond/Used/make/mymake/features/myfeatures
How can I use robots.txt to block URLs only when features appears as a segment of the URL? I read that you can do something like this:
Disallow: *features
And this will block bots from any URL that contains features. Is this true? I still need URLs like http://example.com/vcond/Used/make/mymake to work.
Thanks
Solution
Disallow: /*/features
Should do the trick. See https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt.
Also, see this thread: Robots.txt: Is this wildcard rule valid?
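To see why the rule behaves this way, here is a minimal Python sketch of Googlebot-style path matching, where * matches any run of characters and a trailing $ anchors the end of the path (the function name and the helper itself are illustrative, not part of any library):

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Googlebot-style robots.txt path matching: '*' matches any
    sequence of characters (including '/'), and a trailing '$'
    anchors the match to the end of the URL path. Matching is a
    prefix match from the start of the path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Translate the robots.txt pattern into a regex: '*' -> '.*',
    # everything else is matched literally.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Blocked: 'features' appears after some path segment
print(robots_match("/*/features", "/vcond/Used/make/mymake/features/myfeatures"))  # True
# Still crawlable: no 'features' segment in the path
print(robots_match("/*/features", "/vcond/Used/make/mymake"))  # False
```

Because the leading /*/ requires at least one path segment before features, plain URLs such as /vcond/Used/make/mymake are left alone, while any URL with features deeper in the path is disallowed.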