This thread has been abandoned for a while, and the circumstances are a bit different now too. The code quoted at the beginning seems to be from the agency I was working for.
Since it became known that Googlebot is essentially a kind of Chrome (and after posts like this one), there are not many approaches left for hiding links from Googlebot.
One approach still seems promising to me; maybe, if somebody else finds it promising too, it will get coded:
- Step 1: publish a non-existent image
- Step 2: write a JavaScript function that finds links with a certain marker class, e.g. `class="hidden"`, and obfuscates the URLs in their `href` attributes using a regex. By obfuscation I mean something like: replace `.` and/or `/` with `|`, `%` or `*`, or split the URL into parts with non-URL characters like `|` or `*` (see the first sketch after this list)
- Step 3: write a second JavaScript function that turns the obfuscated URLs back into real URLs
- Step 4: place both JavaScript functions into an external file and keep that file away from Google with an `X-Robots-Tag` header (see the second sketch)
- Step 5: fire `onError` for the non-existent image at least 6 seconds after `onLoad` (to be sure that Googlebot has gone away)
- Step 6: trigger the second JavaScript function through `onError`, so that the URLs become real URLs again (see the third sketch)
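A minimal sketch of Steps 2 and 3, assuming the links are marked with `class="hidden"` and picking `|` for `/` and `*` for `.` as the (arbitrary) replacement characters; the function names are made up:

```js
// Step 2: obfuscate the href of every link marked with class="hidden".
// The mapping / -> | and . -> * is an arbitrary choice for illustration.
function obfuscateLinks() {
  document.querySelectorAll('a.hidden').forEach(function (a) {
    var href = a.getAttribute('href');
    if (href) {
      a.setAttribute('href', href.replace(/\//g, '|').replace(/\./g, '*'));
    }
  });
}

// Step 3: turn the obfuscated hrefs back into real URLs.
function restoreLinks() {
  document.querySelectorAll('a.hidden').forEach(function (a) {
    var href = a.getAttribute('href');
    if (href) {
      a.setAttribute('href', href.replace(/\|/g, '/').replace(/\*/g, '.'));
    }
  });
}
```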
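For Step 4, the header could be set in the server config, for example in Apache with mod_headers (the filename `hidelinks.js` is a placeholder):

```apache
# Step 4: send X-Robots-Tag for the external JS file.
# "hidelinks.js" is a placeholder filename.
<Files "hidelinks.js">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```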
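Steps 1, 5 and 6 could be wired together roughly like this, reusing `restoreLinks()` from the first sketch (the image path and the exact delay handling are my assumptions):

```js
// Steps 1, 5 and 6: request the non-existent image only after a delay,
// so its onError handler (which restores the URLs) fires at least
// 6 seconds after onLoad, when Googlebot has presumably moved on.
window.addEventListener('load', function () {
  setTimeout(function () {
    var probe = new Image();
    probe.onerror = restoreLinks;     // Step 6: make the URLs real again
    probe.src = '/no-such-image.png'; // Step 1: non-existent image (placeholder path)
  }, 6000);                           // Step 5: at least 6 seconds after onLoad
});
```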
The approach could maybe also be reversed: the URL obfuscation could be triggered immediately through `onError`, and the de-obfuscation could be triggered through `onClick`.
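That reversed variant, again reusing the two functions from the first sketch, might look roughly like this:

```js
// Reversed variant: obfuscate immediately through the image's onError,
// restore a hidden link only when the user actually clicks it (the
// handler runs before the browser reads the href for navigation).
var probe = new Image();
probe.onerror = obfuscateLinks;   // obfuscation triggered immediately
probe.src = '/no-such-image.png'; // placeholder path, as above

document.querySelectorAll('a.hidden').forEach(function (a) {
  a.addEventListener('click', function () {
    restoreLinks();               // de-obfuscation triggered through onClick
  });
});
```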