Google helps webmasters every day through its blog, and I find that kind of transparency from such a large search engine fascinating. One of its entries offers tips on giving webmasters greater control over their website listings with robots.txt. This may be SEO 101, but it exemplifies what Google is focused on: quality results for web surfers.
The key is a simple file called robots.txt that has been an industry standard for many years. It lets a site owner control how search engines access their website. With robots.txt you can control access at multiple levels: the entire site, individual directories, pages of a specific type, or even individual pages. Effective use of robots.txt gives you a lot of control over how your site is crawled, but it's not always obvious how to achieve exactly what you want. This is the first of a series of posts on how to use robots.txt to control access to your content.
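To make those levels concrete, here is a minimal sketch of a robots.txt file (placed at the root of the site); the directory and page names are hypothetical, chosen only to illustrate blocking a whole directory versus a single page:

    User-agent: *
    Disallow: /private/
    Disallow: /drafts/unfinished-page.html
    Allow: /

The User-agent line says which crawlers the rules apply to (here, all of them), and each Disallow line names a path crawlers should skip, from a whole directory down to one page.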