Question

I'm working on a web application that has a URL /search. We also have a set of URLs corresponding to the locations the business operates in: /search/location-1, /search/location-2, etc.

Until now, we'd filtered based on IP and displayed the same content to a user in location-1 whether they'd hit /search or /search/location-1. These pages are now being marked as duplicate content, which negatively impacts our rankings on search engines.

Can we do a dynamic 301 redirect from /search to /search/location-1 for users in location-1, from /search to /search/location-2 for users in location-2, etc.? Or is there another, better way of removing the duplicate content?


Solution

You could certainly redirect users that way, but since the redirect is keyed off the requester's IP, each search bot would only ever be sent to (and index) the location URL matching the bot's own crawling location, not the locations of your end users. Is that really what you want?
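For concreteness, here is a minimal sketch of that kind of geo-based redirect, assuming a Flask app and a hypothetical lookup_location() helper that maps an IP address to a location slug (neither is from your question; substitute whatever framework and geo-IP lookup you actually use):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

def lookup_location(ip: str) -> str:
    """Hypothetical geo-IP lookup; replace with your real IP-to-location logic."""
    return "location-1"  # placeholder

@app.route("/search")
def search():
    # Send each visitor to their location-specific search page.
    # Note: browsers cache 301s aggressively; a 302 may be safer while testing.
    slug = lookup_location(request.remote_addr)
    return redirect(f"/search/{slug}", code=301)

@app.route("/search/<slug>")
def search_by_location(slug: str):
    return f"Search results for {slug}"
```

Because the bot itself gets redirected by its own IP, /search would effectively disappear from the index in favor of whichever single location page the bot lands on.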

Alternatively, you might consider using a <meta name="robots" content="noindex"> tag in your HTML to prevent search bots from indexing pages you do not want them to index. It might make sense to let bots index /search/location-1 and /search/location-2 individually, but keep /search out of the index if you do not redirect it.
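A minimal sketch of that approach, again assuming Flask and inlining the HTML for brevity (in practice the tag would live in your page template's <head>):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/search")
def search():
    # Generic page: the noindex tag in <head> tells bots to skip it.
    return (
        '<html><head><meta name="robots" content="noindex">'
        "<title>Search</title></head>"
        "<body>Generic search page</body></html>"
    )

@app.route("/search/<slug>")
def search_by_location(slug: str):
    # Location pages carry no noindex tag, so each one stays indexable.
    return (
        f"<html><head><title>Search - {slug}</title></head>"
        f"<body>Results for {slug}</body></html>"
    )
```

This keeps the per-location pages competing in search results while removing the duplicate /search page from consideration entirely.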
