subject: Bits and Pieces Most Custom Websites Need

Web robots, or bots, are crawlers that index a website's content. A robots.txt file placed in the site's root directory gives these bots instructions to stay out of certain areas of the site. For example, if you do not want copyrighted images indexed, you can write the robots.txt file so that it asks bots to keep out of the website's image directory. Keep in mind that robots.txt is advisory: well-behaved crawlers respect it, but it does not physically prevent anyone from copying the images.
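As a minimal sketch, assuming the images live under a directory called /images/ (the path is only an example), the robots.txt file at the root of the site could look like this:

    User-agent: *
    Disallow: /images/

The "User-agent: *" line applies the rule to all crawlers, and "Disallow: /images/" asks them not to crawl anything under that path.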