Robots.txt is a file that tells search engine crawlers which parts of a website they should not access. You define the pages that should not be crawled by adding Disallow rules to robots.txt, and crawlers that respect the file will skip those pages. It also helps search engines crawl and index the rest of your content more efficiently.
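For example, a minimal robots.txt could look like the sketch below (the /admin/ and /private/ paths are placeholders; replace them with whatever sections of your own site you want to block):

# Apply these rules to every crawler
User-agent: *
# Do not crawl these sections
Disallow: /admin/
Disallow: /private/

# Optional: point crawlers at your sitemap (use your own URL)
Sitemap: https://www.example.com/sitemap.xml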
You can ask your web hosting provider to upload it to the root directory of the website through your control panel, and search engines will pick it up automatically.
If you have access to the root directory yourself, you can upload it from your end.
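If you do have direct access, the upload is just a file transfer into the web root. As a rough sketch, over SFTP/SSH it could look like this (the username, host, and /var/www/html path are assumptions; your host's details will differ):

scp robots.txt user@example.com:/var/www/html/robots.txt

Either way, you can confirm the file is live by opening https://www.example.com/robots.txt (with your own domain) in a browser.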