Google Search: (inurl:"robot.txt" | inurl:"robots.txt") intext:disallow filetype:txt
Wasabi rates this entry 6 out of 10.
Submitted: 2004-08-09 07:37:03
Added by: Wasabi
Webmasters wanting to exclude search engine robots from certain parts of their site often place a robots.txt file at the root of the server. This file basically tells the bot which directories are supposed to be off-limits. An attacker can easily obtain that information by simply opening the plain text file in a browser. Webmasters should *never* rely on robots.txt for real security. Google helps the attacker here by allowing a search for the "disallow" keyword.
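Since robots.txt is just a plain text file served from the site root, anyone can fetch and parse it. A minimal sketch of what such a client might look like (the function names here are illustrative, not from any particular tool):

```python
import urllib.request

def parse_disallow(robots_txt):
    """Return the paths listed in Disallow lines of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if line.lower().startswith("disallow:"):
            paths.append(line.split(":", 1)[1].strip())
    return paths

def fetch_disallow(base_url):
    """Download robots.txt from the site root and list its Disallow paths."""
    url = base_url.rstrip("/") + "/robots.txt"
    with urllib.request.urlopen(url) as resp:
        return parse_disallow(resp.read().decode("utf-8", errors="replace"))
```

For example, a robots.txt containing `Disallow: /admin/` hands that directory name straight to anyone who asks, which is exactly why the file offers no real protection.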
2004-08-10 11:33:19 (smoke63b): That’s a good one :)
2004-08-25 06:31:35 (murfie): Oops, it is an OLD one.. :(
Completely missed that..
This one is different in that it also searches robot.txt files (without the s), and that's 47 extra results. I don't think search engines will respect that filename, however..