Web Design - Web Master - Graphic Developer

For the past several years, I have built a reputation for creating a positive return on investment for my clients.

Robot Control Code Generation Tool

The robots.txt file can be a blessing or a curse. Used incorrectly, it can wipe out your rankings; used properly, it can keep many marauding robots at bay, protecting your bandwidth and privacy. Use this free, easy-to-use tool to make sure your robots.txt file is generated correctly. It also includes a tutorial on robots.txt usage and myths.
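
To see why a single character matters, compare these two rules (shown as separate alternatives, not as one file):

User-agent: *
Disallow:          # empty value: nothing is blocked, every robot may crawl the whole site

User-agent: *
Disallow: /        # a lone slash: everything is blocked, which can wipe out your rankings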

Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank for none)
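
For example, leaving all robots allowed, asking for a 10-second crawl delay, and listing a sitemap would produce directives along these lines (the sitemap URL is only a placeholder):

User-agent: *        # applies to every robot
Disallow:            # nothing is restricted
Crawl-delay: 10      # ask robots to wait 10 seconds between fetches (not all robots honor this)
Sitemap: http://www.example.com/sitemap.xml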
     
Specific Search Robots:
  Google          googlebot
  MSN Search      msnbot
  Yahoo           yahoo-slurp
  Ask/Teoma       teoma
  Cuil            twiceler
  GigaBlast       gigabot
  Scrub The Web   scrubby
  DMOZ Checker    robozilla
  Nutch           nutch
  Alexa/Wayback   ia_archiver
  Baidu           baiduspider
  Naver           naverbot, yeti
   
Specific Special Bots:
  Google Image    googlebot-image
  Google Mobile   googlebot-mobile
  Yahoo MM        yahoo-mmcrawler
  MSN PicSearch   psbot
  SingingFish     asterias
  Yahoo Blogs     yahoo-blogs/v3.9
   
Restricted Directories: Each path is relative to the root and must end with a trailing "/".
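
For instance, refusing the Alexa/Wayback archiver listed above and keeping all other robots out of two directories (the directory names here are only examples) would generate output like this:

User-agent: ia_archiver     # Alexa/Wayback
Disallow: /                 # refuse this robot entirely

User-agent: *               # every other robot
Disallow: /cgi-bin/         # paths are relative to root and end with a trailing "/"
Disallow: /private/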

Now, copy and paste the generated text into a blank text file called "robots.txt" (don't forget the "s" on the end of "robots") and put it in your root directory. Like all the other files on your server, make sure its permissions are set so that visitors (such as search engine robots) can read it.