Question

We have some links we want to hide from Google, using JavaScript to "hide" the link but still let it work for real visitors.

I was told by the SEO agency that the best method is to Base64-encode the link and decode it via JavaScript:

<a data-href="RdcDovL1N0YWdpbmc...base64...hhcmRpbmctaGVycmVuLWhlbaQtMTgyMDg3"
   href="#">Link</a>


<script>
// Note: the old <!-- --> comment wrappers are unnecessary in modern HTML.
var _dlist = document.getElementsByTagName('A');
for (var i = 0; i < _dlist.length; i++) {
    var _data = _dlist[i].getAttribute('data-href');
    if (_data !== null) { // getAttribute returns null, not the string 'null'
        var _extend = CryptoJS.enc.Base64.parse(_data);
        _dlist[i].setAttribute('href', _extend.toString(CryptoJS.enc.Latin1));
    }
}
</script>

My problem now is that I don't want to include another two files (they suggested the crypto-js library) just for the links. How far does Google go in revealing and following such links, and what is the easiest approach without loading more scripts? jQuery is available.
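For what it's worth, if plain Base64 is all that is needed, modern browsers can decode it natively with `atob()`, so no extra library is required. A sketch, assuming `data-href` holds a Base64-encoded URL as in the snippet above (the helper name `decodeDataHref` is mine):

```javascript
// Decode data-href attributes with the built-in atob(); no crypto-js needed.
function decodeDataHref(data) {
    return atob(data); // Base64 -> original string
}

// DOM wiring (browser only), guarded so the helper works in any environment.
if (typeof document !== 'undefined') {
    var anchors = document.getElementsByTagName('a');
    for (var i = 0; i < anchors.length; i++) {
        var data = anchors[i].getAttribute('data-href');
        if (data !== null) { // getAttribute returns null when the attribute is absent
            anchors[i].setAttribute('href', decodeDataHref(data));
        }
    }
}
```

Note that `atob()` handles Latin-1 strings, which matches the `CryptoJS.enc.Latin1` output in the agency's snippet.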


Solution

This is what I ended up with:

Links look like:

<a href="#" onclick="linkAction(this); return false;" 
   data-href="uggc://fgntvat.....">

Where data-href is Rot13 encoded and linkAction does:

function linkAction(e) {
    window.location = rot13($(e).data('href'));
}

…in an external JS file.
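For reference, a rot13 along the lines of the linked answer might look like this (my own sketch, not the accepted answer verbatim):

```javascript
// rot13: rotate each ASCII letter by 13 places; non-letters pass through.
// Applying it twice restores the input, so one function encodes and decodes.
function rot13(s) {
    return s.replace(/[a-zA-Z]/g, function (c) {
        var base = c <= 'Z' ? 65 : 97; // char code of 'A' or 'a'
        return String.fromCharCode((c.charCodeAt(0) - base + 13) % 26 + base);
    });
}
```

Since rot13 is its own inverse, the same tiny function serves both for generating the `data-href` values and for decoding them in `linkAction`.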

I think this is the best obfuscation without performance overhead. Let's see what the SEO agency says :-)

P.S. Rot13 taken from: Where is my one-line implementation of rot13 in JavaScript going wrong?

Other suggestions

This thread has been abandoned for a while, and circumstances are somewhat different now too. The code cited at the beginning seems to be from the agency I was working for.

After it became known that Googlebot is essentially a kind of Chrome, and given posts like this, there are not many approaches left for hiding links from Googlebot.

One approach seems promising to me; if somebody else finds it promising too, maybe it will get coded:

  • Step 1: publish a reference to a non-existing image
  • Step 2: write a JavaScript function that finds links with a certain marker class, like class="hidden", and "redesigns" the URLs in their href attributes with a regex. By redesign I mean something like: replace . and/or / with |, % or *, or split the URL into parts with non-URL characters like | or *
  • Step 3: write a second JavaScript function that redesigns the URLs back into real URLs
  • Step 4: place both functions in an external file and block that file from crawling with an X-Robots-Tag header
  • Step 5: fire onError for the non-existing image at least 6 seconds after onLoad (to be reasonably sure that Googlebot has gone away)
  • Step 6: trigger through onError the second function, which turns the masked URLs back into real URLs

The approach could perhaps also be reversed: the URL redesign could be triggered immediately through onError, and the back-redesign through onClick.
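The "redesign" in steps 2 and 3 could be as simple as a pair of regex replacements. A sketch, where the mask characters (| for / and * for .) are my arbitrary choices, not part of the original proposal:

```javascript
// Step 2 sketch: make the URL unrecognizable as a URL.
function redesignUrl(url) {
    return url.replace(/\//g, '|').replace(/\./g, '*');
}

// Step 3 sketch: turn the masked string back into a real URL.
function restoreUrl(masked) {
    return masked.replace(/\|/g, '/').replace(/\*/g, '.');
}
```

`restoreUrl` would then be the function wired to the image's onError handler (step 6), walking all class="hidden" links and rewriting their href attributes.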

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow