Robots.txt

Updated March 21, 2016

A plain-text file, placed at the root of a site, that gives web robots (crawlers) instructions about which pages they should crawl or ignore. It can help prevent duplicate content from being indexed by search engines, but it is not a reliable way to hide information: compliance is voluntary, and the file itself is publicly readable.
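A minimal robots.txt might look like the following sketch. The paths and domain are illustrative, not from any real site:

```
# Apply these rules to all crawlers
User-agent: *
# Block a printer-friendly duplicate of each page
Disallow: /print/
# Block an admin area (note: this does not hide it, since
# robots.txt is public and crawlers may ignore it)
Disallow: /admin/

# Optionally point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

`User-agent`, `Disallow`, and `Sitemap` are standard directives; rules under a `User-agent` line apply to crawlers matching that token, and `*` matches all of them.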