Joe Manna

My Perspective on Business, Social Media & Community


January 27, 2007

Google Blogs About Controlling Search Engines

Google helps webmasters every day with its blog, and I find that level of transparency fascinating for such a large search engine. One of their entries offers tips for giving webmasters greater control over their website listings with robots.txt. This may be SEO 101, but it exemplifies what Google is focused on: quality results for web surfers.

The key is a simple file called robots.txt that has been an industry standard for many years. It lets a site owner control how search engines access their website. With robots.txt you can control access at multiple levels: the entire site, individual directories, pages of a specific type, or individual pages. Effective use of robots.txt gives you a lot of control over how your site is crawled, but it's not always obvious how to achieve exactly what you want. This is the first of a series of posts on how to use robots.txt to control access to your content.
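To make those levels of control concrete, here's a minimal robots.txt sketch; the directory and page names are made up for illustration, but the `User-agent` and `Disallow` directives are the standard ones:

```text
# Rules for all crawlers
User-agent: *
# Block an entire directory (hypothetical path)
Disallow: /private/
# Block one specific page (hypothetical path)
Disallow: /drafts/unfinished-post.html

# Rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /no-google/
```

The file must live at the root of the site (e.g., example.com/robots.txt) for crawlers to find it; an empty `Disallow:` line, or no robots.txt at all, means the whole site is open to crawling.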

Last modified: July 9, 2012



Copyright © 2025 Joe Manna. Content is licensed under Creative Commons [CC BY-SA 3.0].