What is the major use of Robots.txt?



ShreyaKoushik
10-21-2016, 04:15 AM
What is the major use of Robots.txt?

alex.thomson
10-25-2016, 04:01 AM
You might be surprised to hear that one small text file, known as robots.txt, could be the downfall of your website. If you get the file wrong you could end up telling search engine robots not to crawl your site, meaning your web pages won’t appear in the search results. Therefore, it’s important that you understand the purpose of a robots.txt file and learn how to check you’re using it correctly.
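One quick way to check how a set of robots.txt rules actually behaves is Python's built-in urllib.robotparser. The rules and URLs below are made-up examples for illustration, not taken from any real site; in practice you would point set_url() at your live https://yoursite.com/robots.txt and call read().

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules; normally these come from the site's /robots.txt
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether the rules permit the crawl
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```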

raheel
10-25-2016, 08:48 AM
Robots.txt is a text file created to instruct crawlers how to crawl and index a website.

davidweb09
10-25-2016, 10:27 PM
Robots.txt controls how your site is crawled and indexed.

jayashree-marg
10-26-2016, 12:37 AM
The robots.txt file is a simple text file placed on your web server which tells web crawlers like Googlebot whether they should access a file or not.

Rajdeep Bose
10-26-2016, 01:19 AM
Robots.txt is a text file mostly used to instruct search engines which pages should be crawled and which shouldn't.

Raghavendraastr
10-26-2016, 03:54 AM
The robots.txt file controls how your website is indexed and acts as an informer for all search engines.

samant
10-27-2016, 01:37 AM
A robots.txt file is a text file that stops web crawler software, such as Googlebot, from crawling certain pages of your site.

somidiscount
10-27-2016, 01:48 AM
The major use of robots.txt is to allow and disallow crawling of your site; it is the best option for controlling how crawlers access your pages.

redspiderae
10-27-2016, 03:30 AM
The robot that Google uses to index its search engine is called Googlebot. It understands a few more instructions than other robots. In addition to "User-agent" and "Disallow", Googlebot also uses the "Allow" instruction.
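For example, "Allow" lets Googlebot fetch one page inside a folder that is otherwise blocked (the folder and file names here are hypothetical):

```
User-agent: Googlebot
Disallow: /folder1/
Allow: /folder1/myfile.html
```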

dennis123
10-27-2016, 07:19 AM
The robots exclusion protocol (REP), or robots.txt is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.

Rammadhur
10-27-2016, 07:27 AM
Robots.txt: a text file (not HTML) that gives instructions to web robots about pages the website owner does not want crawled.

vishalesskay
10-28-2016, 04:34 AM
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The "Disallow: /" rule tells a robot that it should not visit any page on the site. There are two important considerations when using /robots.txt: robots can ignore your /robots.txt (malware robots and email-address harvesters typically do), and the file itself is publicly available, so anyone can see which sections of your server you don't want robots to visit.
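Putting that together, a minimal /robots.txt that keeps all robots out of the entire site looks like this (by contrast, an empty "Disallow:" line permits everything):

```
User-agent: *
Disallow: /
```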