Robots exclusion standard
Robots.txt is the name of the file used in the robots exclusion standard. It provides information to well-behaved web crawlers and other web robots, telling them which parts of a website the site operator does not want accessed, so that compliant robots can skip those parts.
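As a sketch of the file format, a minimal robots.txt placed at the root of a site (the path `/private/` here is purely illustrative) might look like:

```
User-agent: *
Disallow: /private/
```

The `User-agent` line names which robots the following rules apply to (`*` means all), and each `Disallow` line gives a URL path prefix that those robots are asked not to fetch.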
Note that the robots exclusion standard is advisory, not mandatory: compliance is voluntary, so marking an area of a site as out of bounds in robots.txt does not guarantee privacy, and misbehaving robots may ignore the file entirely.
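A well-behaved crawler checks the rules before fetching each URL. The sketch below uses Python's standard-library `urllib.robotparser` against hypothetical rules (the domain and paths are illustrative; a real crawler would fetch the site's actual robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for illustration; a real crawler would download
# them from https://example.com/robots.txt instead.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Disallowed path: a compliant robot skips this URL.
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # False

# Not covered by any Disallow rule: fetching is permitted.
print(parser.can_fetch("*", "https://example.com/index.html"))  # True
```

Because the standard is advisory, nothing in the protocol enforces the `False` result above; the crawler itself must choose to honor it.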