
subject: Bits and Pieces Most Custom Websites Need


Web robots, or bots, are crawlers that index websites for content. A robots.txt file gives these bots instructions to stay out of certain areas of a website. For example, if you do not want crawlers indexing your copyrighted images, you can construct the robots.txt file so that it tells bots to keep out of the site's image directory. Keep in mind that robots.txt is advisory: well-behaved crawlers honour it, but it does not technically prevent anyone from copying the files.
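As a sketch of how this works, the snippet below uses Python's standard urllib.robotparser module to show how a well-behaved crawler would interpret a robots.txt file that blocks an image directory. The `/images/` path and example.com URLs are illustrative assumptions, not taken from the post.

```python
import urllib.robotparser

# Hypothetical robots.txt content: ask all crawlers to stay out of
# the /images/ directory (directory name is an example assumption).
robots_txt = """\
User-agent: *
Disallow: /images/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler consults these rules before fetching a URL.
print(parser.can_fetch("*", "http://example.com/images/photo.jpg"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))        # True
```

Note that nothing enforces these rules; they only describe what cooperative crawlers should skip.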

Bits and Pieces Most Custom Websites Need

By: Craig Miles




welcome to loan (http://www.yloan.com/) Powered by Discuz! 5.5.0