Robots.txt is a text file that tells crawlers such as Googlebot which pages on a website they are allowed to crawl and which they are not.
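
As a rough illustration, a robots.txt file placed at the root of a site (for example at https://example.com/robots.txt, a hypothetical URL) might look like the sketch below; the paths shown are made up for the example:

```
# Rules that apply to all crawlers, including Googlebot
User-agent: *
# Ask crawlers not to crawl anything under /private/ (illustrative path)
Disallow: /private/
# Everything else may be crawled
Allow: /

# Rules aimed only at Googlebot (illustrative path)
User-agent: Googlebot
Disallow: /internal-search/
```

Note that robots.txt controls crawling, not indexing: a page blocked here can still appear in search results if other sites link to it.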