Question

I need to disallow /variable_dir_name/directory via robots.txt

I use:

Disallow: */directory
Noindex: */directory

is that correct?


Solution

The following should work in your robots.txt:

User-Agent: *
Disallow: /*/directory

Further reading from Google: Block or remove pages using a robots.txt file
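Google treats `*` in a robots.txt path as a wildcard for any sequence of characters, which is why `/*/directory` covers a variable first segment. A minimal sketch of that matching logic (my own illustration in Python, not Google's actual implementation):

```python
import re

def robots_rule_matches(rule: str, path: str) -> bool:
    """Check whether a robots.txt Disallow rule matches a URL path.

    Google-style matching: '*' matches any sequence of characters and
    '$' anchors the end of the path; otherwise the rule is a prefix match.
    """
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# The rule from the answer above:
rule = "/*/directory"
print(robots_rule_matches(rule, "/foo/directory"))      # True
print(robots_rule_matches(rule, "/foo/bar/directory"))  # True
print(robots_rule_matches(rule, "/directory"))          # False: no leading segment
```

Note that Python's built-in `urllib.robotparser` does plain prefix matching and does not understand these wildcards, so the sketch above rolls its own regex translation instead.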

OTHER TIPS

Indeed, Googlebot used to honor these unofficial directives in robots.txt:

  • Noindex
  • Nofollow
  • Crawl-delay

However, as announced on the Google Webmaster Central Blog, Google stopped supporting these directives (used in only about 0.001% of robots.txt files) in September 2019. To stay future-proof, use meta tags on your pages for these instead.

What you really should do, is the following:

  • Disallow via robots.txt and
  • Noindex already indexed documents via Google Search Console
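
The noindex part belongs in the page's HTML rather than in robots.txt, for example:

```html
<!-- In the <head> of each page that should drop out of the index -->
<meta name="robots" content="noindex">
```

Keep in mind that a crawler can only see this tag on pages it is allowed to fetch, so blocking the same URLs via robots.txt at the same time prevents Google from ever reading the noindex.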
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow