Originally posted by Caro Kann: In theory, it should, as long as the spider obeys the robots.txt file. The point of putting one in place is to stop a certain set of pages from being spidered. If you want all your pages searched, don't use a robots.txt file at all; it isn't required. You can also specify which directories you want protected using the Disallow: directive. See http://www.robotstxt.org/wc/norobots.html for details.
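To illustrate the Disallow: directive mentioned above, here is a minimal sketch of a robots.txt that protects only specific directories (the directory names are made up for the example):

```
# Applies to all well-behaved spiders
User-agent: *
# Keep these two directories out of search engines
Disallow: /private/
Disallow: /cgi-bin/
# Everything else remains crawlable
```

Compliant crawlers will skip the listed paths but still index the rest of the site.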
I'm new to developing (but I think it's great fun and am constantly doing sites, trying out new stuff, etc.), and I have a question about the robots.txt file.
If you use...
... does this block all spiders from crawling everything? But then you won't get on any search lists... I'm confused.
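The file you are asking about was omitted from the post, but it is presumably the standard "block everything" robots.txt, which looks like this:

```
# Applies to every spider that honors robots.txt
User-agent: *
# Disallow the entire site
Disallow: /
```

Yes, with this in place compliant spiders will skip the whole site, so none of its pages will appear in those engines' search results. If you want to be indexed, either remove the file entirely or restrict Disallow: to specific directories.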