Question

I've got a web app which heavily uses AngularJS / AJAX and I'd like it to be crawlable by Google and other search engines. My understanding is that I need to do something special to make it work, as described here: https://developers.google.com/webmasters/ajax-crawling

Unfortunately, that looks quite nasty and I'd rather not introduce the hash tags. What I'd like to do is to serve a static page to Googlebot (based on the User-Agent), either directly or by sending it a 302 redirect. That way, the web app can be the same, and the whole Googlebot workaround is nicely isolated until it is no longer necessary.

My worry is that Google may mistakenly assume that I'm trying to trick Googlebot, while my goal is to help it. What do you guys think about this approach, and what would you recommend?


Solution 4

As of May 2014, Googlebot executes JavaScript. Check Webmaster Tools to see how Google sees your site.

http://googlewebmastercentral.blogspot.no/2014/05/understanding-web-pages-better.html

Edit: Note that this does not mean other crawlers (Bing, Facebook, etc.) will execute JavaScript. You may still need to take additional steps to ensure that these crawlers can see your site.

Other tips

I recently came upon this excellent post from yearofmoo, explaining in detail how to make your Angular app SEO-friendly. In essence, when bots see a URI with a hash-bang they know it's an AJAX page, and they will try to reach the same content by replacing the '#!' in your URI with '?_escaped_fragment_='. This alternative URI instructs bots that they should expect to find a definitive static version of the page they were accessing.

Of course, to achieve this you'd have to introduce hash-bangs into your URIs. I don't see why you are trying to avoid them. Isn't Gmail using them?

Yeah, unfortunately, if you want to be indexed you have to adhere to the scheme. :( If you're running a Ruby app, there's a gem that implements the crawling scheme for any Rack app:

gem install google_ajax_crawler

A write-up of how to use it is at http://thecodeabode.blogspot.com.au/2013/03/backbonejs-and-seo-google-ajax-crawling.html, and the source code is at https://github.com/benkitzelman/google-ajax-crawler.

Have a look at these links; they will point you in a good direction:

  • Set up your own Prerender service using Prerender.io open source code:

      https://prerender.io/
    
  • Use a different existing service such as BromBone, Seo.js or SEO4AJAX:

      http://www.brombone.com/
      http://getseojs.com/
      http://www.seo4ajax.com/
    
  • Create your own service for rendering and serving snapshots to search engines. Read this article. It will give you the big picture:

      http://scotch.io/tutorials/javascript/angularjs-seo-with-prerender-io
    
Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow