Question

We can change the crawl rate of bots by using robots.txt, but Googlebot doesn't honor the Crawl-delay directive, so I'm looking for an alternative.
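For reference, this is the kind of robots.txt directive I mean, which some bots respect but Googlebot ignores (the 10-second value is just an illustration):

```
User-agent: *
Crawl-delay: 10
```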

My site has some pages that change very frequently, but only slightly (say 20-30 characters a day). I don't want Googlebot to re-index them that often; I'd rather it re-indexed them about once a month. Is it possible to achieve this with a meta tag like this:

<META HTTP-EQUIV="expires" CONTENT="Wed, 26 Feb 1997 08:21:57 GMT">

Does Googlebot ignore this? Or can you recommend another solution?


Solution

I would advise you to generate an XML Sitemap.

A sitemap allows you to pass hints to search engines, namely the importance (or weight) you assign to each page and the rate at which the pages are (usually) updated.

This doesn't mean search engines will stick to it exactly. You might declare that a page updates once a year and see it crawled three times that year, or set it to daily and see it crawled only once a month.

Google on SiteMaps
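As a sketch of what generating such a sitemap might look like, here is a minimal Python example using only the standard library (the URL is a placeholder; substitute your own pages):

```python
# Minimal sketch: generate a sitemap with a monthly <changefreq> hint.
from xml.etree.ElementTree import Element, SubElement, tostring

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, changefreq="monthly"):
    # <urlset> is the root element required by the sitemap protocol.
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        # <changefreq> is only a hint; crawlers may ignore it.
        SubElement(url, "changefreq").text = changefreq
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + tostring(urlset, encoding="unicode"))

# Placeholder URL for illustration only.
print(build_sitemap(["http://www.example.com/latest-updates.html"]))
```

You would then write the result to a sitemap.xml at your site root and submit it via Google's webmaster tooling.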

OTHER TIPS

You can create a sitemap with those URLs and set the <changefreq> element to monthly. These are only hints to search engines, but it's probably your best bet.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
      <loc>http://www.example.com/latest-updates.html</loc>
      <changefreq>monthly</changefreq>
   </url>
</urlset>
Licensed under: CC-BY-SA with attribution