Question

I don't know how search engine bots work; the robot entries I see in my visitor records are things like Googlebot, msnbot, BingBot, etc.

So I don't know whether they have JavaScript enabled or anything.

Why do I want to know whether they have JavaScript enabled? Because I want to require JavaScript on my site. To do this, I added an 'overlay' div with a short message, which I set to 'display:none' with JavaScript on page load. If JavaScript is disabled, the overlay never disappears and nothing on the site is clickable.
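Roughly, the setup looks like the sketch below (the IDs, message text, and link are just placeholders for illustration):

    <!-- A 'JavaScript required' overlay that covers the whole page. -->
    <div id="overlay" style="position:fixed; top:0; left:0; width:100%; height:100%;
         background:#fff; z-index:9999;">
      Please enable JavaScript to use this site.
    </div>

    <!-- The actual site content sits underneath the overlay. -->
    <div id="content">
      <a href="/products.html">Products</a>
    </div>

    <script>
      // If JavaScript runs, hide the overlay once the page loads.
      // If it doesn't run, the overlay stays on top and blocks all clicks.
      window.onload = function () {
        document.getElementById('overlay').style.display = 'none';
      };
    </script>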

So can robots still crawl my site even with this JavaScript requirement? I want Google and the other good bots to be able to crawl my site.


Solution 2

Search robots can and will crawl your site, but they do not understand JavaScript, so they will simply ignore it. They should, however, have no problem crawling the page you described.
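If you want to sanity-check roughly what a non-executing crawler receives, you can fetch the raw HTML yourself without running any scripts. A minimal Node.js sketch (Node 18+, run as an ES module; the URL and the id being checked are placeholders):

    // Fetch the page the way a non-executing crawler would:
    // static HTML only, no scripts run.
    const response = await fetch('https://example.com/'); // placeholder URL
    const html = await response.text();

    // Both the overlay markup and the real content are in the raw HTML,
    // because 'display:none' is only applied when JavaScript executes.
    console.log(html.includes('id="overlay"'));
    console.log(html.includes('<a href='));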

OTHER TIPS

Yes. Since at least 2010, Googlebot has been known to understand indirect links created by (some) JavaScript, but it won't need your JavaScript to read your links, as those are plainly visible in the HTML. The same goes for people browsing with JavaScript turned off.
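To make the distinction concrete, here is a small sketch (URLs are placeholders): a plain HTML link is visible to any crawler, while a link that only exists after a script runs is not part of the static HTML.

    <!-- Visible to every crawler and to users without JavaScript: -->
    <a href="/about.html">About us</a>

    <!-- This link only exists after the script runs, so a crawler that
         does not execute JavaScript never sees it: -->
    <div id="nav"></div>
    <script>
      var link = document.createElement('a');
      link.href = '/contact.html';      // placeholder URL
      link.textContent = 'Contact';
      document.getElementById('nav').appendChild(link);
    </script>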

As far as I know, search engines can parse pages and recognize different kinds of content and behavior. They can also penalize certain ways of hiding content. Google is a very smart search engine, and I suspect it can recognize some JavaScript, but the internal behavior of their system is unknown to us.
