What is robots.txt?
Robots.txt is one way of telling search engine bots which pages on your website you do not want them to visit.
Robots.txt is useful for preventing the indexing of parts of a site that the owner does not want to appear in search results.
Robots.txt is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.
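As an illustration, here is a minimal robots.txt file. It lives at the root of the site (e.g. https://example.com/robots.txt); the paths and bot name below are hypothetical examples, not recommendations for any particular site:

```
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# A rule for one specific crawler
User-agent: Googlebot
Allow: /
```

Each `User-agent` line starts a group of rules, and `Disallow`/`Allow` lines list URL path prefixes that the matching crawler should skip or may fetch.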
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. It specifies how to inform a robot which areas of the website should not be processed or scanned.
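To see how a crawler actually interprets these rules, here is a short sketch using Python's standard-library `urllib.robotparser`. The rules and URLs are made-up examples:

```python
import urllib.robotparser

# An example policy: block everything under /private/, allow the rest.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler checks each URL before fetching it.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
```

In practice a crawler would call `rp.set_url("https://example.com/robots.txt")` and `rp.read()` to fetch the live file instead of parsing an inline string. Note that robots.txt is advisory: compliant bots honor it, but it does not technically prevent access.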