Robots.txt Crawl-delay:

Just came across a robots.txt today with a crawl delay in it. I hadn't heard of it before; maybe some of you have.

The purpose of a Crawl-delay in your robots.txt seems "dumb," to put it lightly. It is supposed to slow down bots crawling your site. Why would you want to do that? For argument's sake, you want the bots all over your site, as often as possible, grabbing all your fresh content and showing it in the associated search engine.
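For reference, here is roughly what the directive looks like in a robots.txt file. The user agent and the number of seconds are just made-up examples; the value is meant to be the minimum number of seconds a bot should wait between requests:

    User-agent: Slurp
    Crawl-delay: 10

    User-agent: *
    Disallow: /cgi-bin/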

The only bots most likely smart enough to even check for this are Yahoo! Slurp and Googlebot. Do you really want to restrict them at all? Do you want to say, "Hey Google, skip me and come back later when my server is not so busy"? I don't.
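To give a sense of what "checking for this" actually means on the bot's side, here is a minimal sketch of how a polite crawler could honor the directive, using Python's standard urllib.robotparser. The domain and the "MyCrawler" user agent are placeholders, not anything a real search engine uses:

    import time
    import urllib.robotparser

    # Fetch and parse the site's robots.txt (example.com is a placeholder).
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # crawl_delay() returns the Crawl-delay value for the given user agent,
    # or None if the directive is not present.
    delay = rp.crawl_delay("MyCrawler")

    if rp.can_fetch("MyCrawler", "https://www.example.com/some-page"):
        # A well-behaved bot sleeps for the requested number of seconds
        # between requests; most bots simply never look at this at all.
        time.sleep(delay or 0)
        # ... fetch the page here ...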

If you want to slow down how fast Google crawls your site, sign up for a Webmaster Tools account instead; it's free and it's from Google: http://www.google.com/webmasters/tools/. There you can see how fast they are crawling and slow them down if you want.


What do you think??