If Googlebot does not access the robots.txt file, what is the problem and how can it be solved?
Please make sure you have uploaded it to the correct location in your web hosting panel. robots.txt is used to control how search engines crawl the pages of a website. It tells crawlers which parts of the site should not be accessed. We can define the pages that should not be crawled by adding a Disallow directive in robots.txt, and compliant crawlers will not visit those disallowed pages. It also helps search engines index the site's content properly.
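As a minimal sketch, a robots.txt with a Disallow directive might look like this (the `/private/` and `/tmp/` paths are just hypothetical examples, not required entries):

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

The file must live at the root of the site (e.g. `https://example.com/robots.txt`), not in a subdirectory, or Googlebot will not find it.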
You can ask your web hosting provider to upload it to the root directory of the website, and Google's crawler will pick it up automatically.
If you have access, you can upload it yourself.
If you are still facing an issue, please let me know what exact error you are getting.
A robots.txt fetch error is a common error reported in Google Webmaster Tools, and many websites encounter it because of unreliable, low-quality web hosting.
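Before blaming the host, it is worth confirming that the file itself parses the way you expect. A quick local check can be done with Python's standard-library `urllib.robotparser` (the robots.txt content and the URLs below are hypothetical, for illustration only):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content to test against
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot matches the wildcard (*) group above
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

If the rules behave as intended locally but Google still reports a fetch error, the problem is usually on the server side: the file returning a non-200 status, being blocked by the firewall, or the host being intermittently unreachable.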