What is a robots.txt file?



orientation
05-26-2016, 01:16 AM
hi friends,

What is a robots.txt file?

guptaabhijit318
05-26-2016, 02:29 AM
Robots.txt is the common name of a text file that is uploaded to a Web site's root directory. Robots look for it at that fixed location, and the file is used to give directions about the Web site to Web robots and spiders.
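For example, for a site hosted at example.com (a placeholder domain), crawlers would request the file at exactly this address:

    https://example.com/robots.txt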

PradeepKumar
05-26-2016, 02:41 AM
Robots.txt is a text file that tells search bots which pages they should crawl and which they should not.

nancy07
05-26-2016, 05:39 AM
Robots.txt is the common name of a text file that is uploaded to a Web site's root directory. The robots.txt file is used to provide instructions about the Web site to Web robots and spiders. Web authors can use robots.txt to keep cooperating Web robots from accessing all or parts of a Web site that they want to keep private.

emilybp2nd
05-26-2016, 05:56 AM
Robots.txt is a file which is used to stop crawlers from fetching certain pages. You can restrict them from whatever directories you want; a sketch follows below.
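As a minimal sketch of that directory-level blocking (the directory names here are only placeholders), a robots.txt like this asks all compliant crawlers to stay out of two folders:

    User-agent: *
    Disallow: /private/
    Disallow: /old-archive/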

RH-Calvin
05-27-2016, 12:43 AM
Robots.txt is a text file placed on the website that contains instructions for search engine robots. The file lists which webpages are allowed and which are disallowed for search engine crawling.

abhay3214
05-27-2016, 03:34 AM
The robots.txt file is important; every search engine crawls the robots.txt file first, before the rest of the site.

Nexevo Technolo
05-27-2016, 09:52 AM
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit.

marksteve741
05-27-2016, 01:43 PM
Robots.txt is a file used to control search engine bots.

TinaLewis
05-31-2016, 07:18 AM
The robots.txt file is a simple text file placed on your web server which tells web crawlers like Googlebot whether they should access a file or not. It is used to provide instructions about the Web site to Web robots and spiders, and Web authors can use it to keep cooperating Web robots from accessing all or parts of a Web site that they want to keep private.
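Since rules can be scoped to a single crawler by naming its user agent, a hedged sketch of a Googlebot-only rule looks like this (Googlebot is the real user-agent token; the blocked path is just an example):

    User-agent: Googlebot
    Disallow: /drafts/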

ShreyaKoushik
12-23-2016, 01:55 AM
Use of Robots.txt - The most common usage of robots.txt is to ban crawlers from visiting private folders or content that gives them no additional information. Two other common patterns (sketched below) are:

Allowing access only to specific crawlers.
Allowing everything apart from certain patterns of URLs.
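A hedged sketch of both patterns (everything except the Googlebot token and the * wildcard is a placeholder):

    # Let only Googlebot crawl; turn every other bot away
    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /

    # Or, for any bot, allow everything apart from URLs under /search/
    # User-agent: *
    # Disallow: /search/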

SD Groupkol
12-23-2016, 02:11 AM
It is very important for search engine crawlers.

incomecracker
12-23-2016, 04:16 AM
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. ... The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.