The robots.txt file, which sits in the root of your site, is where you add directions and permissions for robots. Search engine bots check this file before indexing your pages (at least, they should).
If you don't want a certain bot to index your pages, use the Disallow directive in the file. The syntax is simple.
If you want to block the search engines from indexing the contents of a folder, use a Disallow rule for that folder.
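A minimal example (the folder name here is a placeholder you would replace with your own):

```
User-agent: *
Disallow: /somefoldernamehere/
```

The trailing slash matters: with it, the rule covers the folder and everything inside it.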
If you want to prevent them from indexing a single page, use a Disallow rule for that file.
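For example (again, the file name is a placeholder):

```
User-agent: *
Disallow: /somefilehere.html
```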
Obviously, you would replace somefoldernamehere and somefilehere.html with the actual folder or file on your site.
The content above should be saved into a text file named robots.txt in your site's root. If you want a blank robots.txt, you can download here. If you want a free web-based utility to create the file for you, check out our Robots.txt Generator. It will create a valid robots.txt file.
robots.txt crawl delay
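The Crawl-delay directive asks a bot to wait a given number of seconds between requests, which can ease load on a busy server. Note that it is non-standard: Bing and Yandex honor it, while Google ignores it (crawl rate for Googlebot is managed through Search Console instead). A sketch, with the delay value chosen arbitrarily:

```
User-agent: Bingbot
Crawl-delay: 10
```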