Question

I am building a sliding one-page website with the following structure:

<div id="wrapper">

  <div id="ajax_content">
    <!-- An AJAX call fires on document ready (jQuery) and loads ALL pages
         in here; then I slide to the correct page-panel identified via the
         URL (sketched in JavaScript below) -->
  </div>

  <noscript>
    <!-- Normal server-side rendering of the SINGLE page requested by the URL -->
  </noscript>

</div>
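Roughly, the loading logic looks like this. This is a simplified sketch, not my actual code: the /all-pages endpoint, the #panel-<slug> IDs, and the 'home' default are all placeholders.

// Hypothetical helper: slide the wrapper to the panel for a given slug.
function slideToPage(slug) {
  var $panel = $('#panel-' + slug);
  if ($panel.length) {
    // Animate horizontally to the requested panel's position.
    $('#wrapper').animate({ scrollLeft: $panel.position().left }, 600);
  }
}

$(document).ready(function () {
  // Load ALL page-panels into #ajax_content at once.
  $.get('/all-pages', function (html) {
    $('#ajax_content').html(html);

    // Derive the current page from the path, e.g. /about -> "about".
    var slug = window.location.pathname.replace(/^\/|\/$/g, '') || 'home';
    slideToPage(slug);
  });
});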

I am using History.js, so I don't have hashbanged URLs. All menu links to the various pages are just normal links with a data-page attribute; their default navigation is cancelled by JavaScript (returning false), as sketched below.
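The click handling is roughly as follows. Again a sketch, reusing the hypothetical slideToPage() helper from above:

// Intercept menu clicks: push a clean URL (no hashbang) and slide.
$('a[data-page]').on('click', function () {
  var page = $(this).data('page');
  History.pushState({ page: page }, document.title, this.href);
  return false; // cancel normal navigation
});

// React both to pushState calls and to the back/forward buttons.
History.Adapter.bind(window, 'statechange', function () {
  var state = History.getState();
  slideToPage(state.data.page);
});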

I want Google to index all pages separately.

The problem I am facing is that all the documentation I have found about AJAX-driven websites states that Google needs hashbangs. So if I only have normal URLs in my href attributes, Google won't perform any AJAX crawling, right?

Other question: Google will read the <noscript> tag. But since the content of the noscript differs from the content shown to users, will Google treat this approach as cloaking?

Google states:

Ensure that you provide the same content in both elements (for instance, provide the same text in the JavaScript as in the noscript tag). Including substantially different content in the alternate element may cause Google to take action on the site.
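In my case the noscript holds the server-rendered version of just the requested page, so for any given URL the visible text should match what JavaScript users see, e.g. (illustrative markup, not my real content):

<div id="ajax_content">
  <!-- filled by JS; for /about it ends up containing, among other panels: -->
  <!-- <div id="panel-about"><p>About our studio ...</p></div> -->
</div>
<noscript>
  <!-- server-rendered for /about: the same text, just this single page -->
  <div><p>About our studio ...</p></div>
</noscript>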

What do you think about this approach?


Solution

So if I only have normal URLs in my href attributes, Google won't perform any AJAX crawling, right?

Correct.

But since the content of the noscript differs from the content shown to users, will Google treat this approach as cloaking?

No. Cloaking is intentionally showing different content to search engines than to your users for the purpose of manipulating search results. Using <noscript> this way is good usability: any non-JavaScript user, which includes people who browse with JavaScript turned off, can still use your website. So what you're doing is a good thing.
