Developers Forum

  1. 08 Apr '06 01:36
    I'm new to developing (but I think it's great fun and I'm constantly building sites and trying out new stuff). I have a question about the robots.txt file, though.

    If you use...

    User-agent: *
    Disallow: /

    ... does this block all spiders from crawling everything? But then you won't get on any search lists... I'm confused.
  2. Subscriber ouroboros
    Digital Alchemist
    08 Apr '06 13:01
    Originally posted by Caro Kann
    I'm new to developing (but I think it's great fun and I'm constantly building sites and trying out new stuff). I have a question about the robots.txt file, though.

    If you use...

    User-agent: *
    Disallow: /

    ... does this block all spiders from crawling everything? But then you won't get on any search lists... I'm confused.
    In theory, yes, as long as the spider obeys the robots.txt file (compliance is voluntary: well-behaved crawlers honour it, but nothing enforces it). The point of the file is to stop a specific set of pages from being spidered. If you want all of your pages indexed, don't use a robots.txt file at all; it isn't required. You can also use the Disallow: directive to protect only certain directories rather than the whole site. See http://www.robotstxt.org/wc/norobots.html for details.
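    You can test how a compliant crawler would read a given robots.txt with Python's standard-library parser. A small sketch (the directory names here are made-up examples, not anyone's actual file):

    ```python
    # Parse a robots.txt and ask whether a given URL may be fetched,
    # using Python's built-in urllib.robotparser.
    from urllib.robotparser import RobotFileParser

    # Hypothetical rules: block /private/ and /tmp/ for all user agents,
    # leave everything else crawlable.
    rules = """\
    User-agent: *
    Disallow: /private/
    Disallow: /tmp/
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # Blocked by the Disallow: /private/ rule:
    print(parser.can_fetch("*", "http://example.com/private/page.html"))  # False
    # Not matched by any rule, so it stays crawlable:
    print(parser.can_fetch("*", "http://example.com/index.html"))  # True
    ```

    Note that `Disallow: /` (as in the original question) matches every path, which is why it hides the whole site from obedient spiders.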