A robots.txt file is a file placed on your server to tell the various search engine spiders not to crawl or index certain sections or pages of your website. You can use it to prevent indexing totally, to prevent certain areas of your site from being indexed, or to issue individual indexing instructions to specific search engines.
The file itself is a simple text file, which can be created in Notepad. It needs to be saved to the root directory of your website, that is, the directory where your home page or index page is.
All search engines, or at least all the important ones, now look for a robots.txt file as soon as their spiders or bots arrive on your site. So, even if you currently do not need to exclude the spiders from any part of your site, having a robots.txt file is still a good idea; it can act as a sort of invitation into your website.
There are a number of situations where you may wish to exclude spiders from some or all of your website.
The very fact that search engines are looking for them is reason enough to put one on your website. Have you looked at your site statistics recently? If your stats include a section on 'files not found', you are sure to see many entries where search engine spiders looked for, and failed to find, a robots.txt file on your website.
There is nothing difficult about creating a basic robots.txt file. It can be created using Notepad or your favorite text editor. Each entry has just two lines:
User-Agent: [Spider or Bot name]
Disallow: [Directory or File Name]
The Disallow line can be repeated for each directory or file you want to exclude, and the whole entry can be repeated for each spider or bot you want to exclude.
A few examples will make it clearer.
Exclude a file from an individual Search Engine
You have a file, example.html, in a directory called 'example' that you do not wish to be indexed by Google. You know that the spider that Google sends out is called 'Googlebot'. You would add these lines to your robots.txt file:
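Assuming example.html sits directly inside the 'example' directory, the entry would be:

User-Agent: Googlebot
Disallow: /example/example.html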
Exclude a section of your site from all spiders and bots
You are building a new section of your site in a directory called 'example' and do not wish it to be indexed before you are finished. In this case you do not need to specify each robot that you wish to exclude; you can simply use a wildcard character, '*', to exclude them all.
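Using the 'example' directory from above, the entry would be:

User-Agent: *
Disallow: /example/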
Note that there is a forward slash at the beginning and end of the directory name, indicating that you do not want any files in that directory indexed.
Allow all spiders to index everything
Once again you can use the wildcard, '*', to let all spiders know they are welcome. The second, Disallow, line you simply leave empty; an empty Disallow means you are disallowing them from nowhere.
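The entry looks like this:

User-Agent: *
Disallow: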
Allow no spiders to index any part of your site
This requires just a tiny change from the command above - be careful!
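The only difference from the allow-all entry is a single forward slash on the Disallow line, which blocks the entire site:

User-Agent: *
Disallow: /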
If you use this command while building your site, don't forget to remove it once your site is live! Use our Robots.txt Checker to view the contents of the robots.txt file for any website.