Question

On our old site we had some directories that have since been deprecated. I'm worried about an influx of 404s hurting our rankings.

For example, what was once www.mysite.com/intranet/ no longer exists on our server, but Google is (I'm guessing) still crawling its old records of that folder and getting 404s. (We're using a plugin to report 404s via RSS.)

The options I see are:

  1. Redirecting these URLs via .htaccess
  2. Disallowing them via robots.txt (confusing, because there is no such directory anymore)
  3. Removing the directories via Webmaster Tools (probably not a recommended reason for doing so)

I'd greatly appreciate it if anyone could provide some insight on how to keep Google from thinking these directories are still part of the site.


Solution

You should update your sitemap and resubmit it to the search engines; that is an important first step.
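For illustration, a trimmed sitemap.xml would simply omit every URL under the removed directory. The /about/ entry below is a hypothetical placeholder, not a path from the question:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- List only URLs that still exist; entries under /intranet/ are dropped -->
      <url>
        <loc>https://www.mysite.com/</loc>
      </url>
      <url>
        <loc>https://www.mysite.com/about/</loc>
      </url>
    </urlset>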

In addition to the sitemap, if you moved those resources to a new location in the new structure, use 301 redirects. If they simply disappeared, whatever the reason, return a 410 (Gone) status to inform crawlers that they are no longer available.
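Since you mention .htaccess, a minimal sketch of both cases with Apache's mod_alias might look like this; /old-docs/ and /new-docs/ are hypothetical paths standing in for a moved resource, while /intranet matches the removed directory from the question:

    # Case 1: resource moved to a new home -> permanent 301 redirect
    Redirect 301 /old-docs/ https://www.mysite.com/new-docs/

    # Case 2: resource removed entirely -> respond with 410 Gone
    Redirect gone /intranet

You can then verify the behavior with a HEAD request, e.g. curl -I https://www.mysite.com/intranet/, and confirm the server answers 410 instead of 404.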

There is no need to touch the robots.txt file; as you note, disallowing a directory that no longer exists would misrepresent the real structure of the site.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow