
What Is A Robots.txt File?



ThedorisJackson
08-04-2014, 02:16 AM
Hi Friend,
Can you please tell me something about the robots.txt file?

rehal
08-09-2014, 12:49 AM
The robots exclusion protocol (REP), or robots.txt, is a text file that webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.

AvniShergill
08-09-2014, 07:26 AM
Robots.txt helps tell web crawlers which pages of your website should be crawled.

joannapaul
08-09-2014, 07:53 AM
robots.txt is a file that lets search engines know which pages of your website should be crawled and indexed.

ehostingpk
08-11-2014, 04:44 AM
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit.

johnzakaria
08-11-2014, 06:41 AM
This text file manages the access of Google's spiders (and other crawlers) to your site, so be careful with it and don't edit it unless you understand how it works.

sanjay1
08-18-2014, 04:16 AM
The robots.txt file contains instructions for search engine bots about whether and how a website or its pages should be crawled; in other words, these instructions tell bots how to crawl and index your site.

Sophiacalvin
08-18-2014, 05:54 AM
Hello

Website owners use the /robots.txt file to give guidelines about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots. The line "Disallow: /" tells robots not to visit any pages on the site.
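To illustrate the two directives mentioned above, here is a minimal sketch of a robots.txt file. The first pair of lines blocks every robot from the whole site; the commented-out pair shows the opposite (an empty Disallow permits everything):

```
# Block every robot from every page of the site
User-agent: *
Disallow: /

# By contrast, an empty Disallow value allows everything:
# User-agent: *
# Disallow:
```

The file must live at the root of the site (e.g. /robots.txt), where crawlers look for it.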

stevex
08-18-2014, 07:11 AM
robots.txt tells web robots which parts of a site to read and which parts not to read. We can control this by allowing and disallowing the parts of the site we don't want crawled.

EdwardLee
09-27-2014, 03:26 AM
The robots.txt file tells a search engine more about a website: the search engine gets an idea of which web pages to crawl and which to skip. It can be used to keep a web page out of a search engine if you don't want to show it to others, and it can also help prevent penalties (for example, for duplicate content).

Glenn Rodgers
10-02-2014, 09:45 AM
The robots.txt file tells search engine spiders not to crawl or visit certain pages. Place a single robots.txt file in your site's root directory and list in it the pages you don't want search engine crawlers to visit or fetch data from.

jannankhan1101
10-02-2014, 10:43 AM
Hello. A robots.txt file tells a search engine bot which folders and pages should be crawled and indexed and which should not. It is good practice to use one, as sometimes you will not want a search engine to index a specific web page or web folder.
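As a sketch of the folder/page distinction described above (the folder and file names here are made up for illustration):

```
User-agent: *
# Keep a whole folder out of crawlers' reach
Disallow: /private/
# Block a single page
Disallow: /drafts/old-page.html
```

Everything not matched by a Disallow rule remains crawlable by default.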

Melissa Feeney
10-08-2014, 03:09 AM
The robots.txt file helps Google and other search engines crawl and index your site. You can tell robots which pages of your website should be crawled and which should be avoided.

crushmymugshot1
10-11-2014, 02:10 AM
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.

interservermike
11-13-2014, 11:18 AM
A robots.txt file is a text file that stops web crawler software, such as Googlebot, from crawling certain pages of your site. The file is essentially a list of directives, such as Allow and Disallow, that tell web crawlers which URLs they can or cannot retrieve. So if a URL is disallowed in your robots.txt, that URL's contents typically won't appear in Google Search results (though the bare URL may still show up if other pages link to it).
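Python's standard library happens to include a parser for exactly these Allow/Disallow rules, so here is a small sketch that parses a made-up robots.txt in memory and checks which URLs a crawler may fetch (the example.com site and all paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed from a string instead of fetched from a site.
# Note the Allow line comes first so it takes effect before the broader Disallow.
rules = """\
User-agent: *
Allow: /private/public-note.html
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# No rule matches, so crawling is allowed by default
print(rp.can_fetch("*", "https://example.com/index.html"))              # True
# Matches Disallow: /private/
print(rp.can_fetch("*", "https://example.com/private/secret.html"))     # False
# Matches the more specific Allow line first
print(rp.can_fetch("*", "https://example.com/private/public-note.html"))  # True
```

The same `RobotFileParser` can also fetch a live file via `set_url(...)` and `read()`, which is how a polite crawler would normally use it.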

Read more here.....
https://support.google.com/webmasters/answer/6062608?hl=en

Adam Stark
11-13-2014, 01:11 PM
A robots.txt file is a text file that stops web crawler software, such as Googlebot, from crawling certain pages of your site. It basically acts as an Allow/Disallow switch on your site for crawlers.

rvtrainingchd
11-15-2014, 02:24 AM
Website owners use the robots.txt file to instruct web spiders how to crawl and index their site.

christopher12
11-17-2014, 02:59 AM
Robots.txt: it's a simple text file that informs search engine bots how to crawl and index our website.

Wooservers
11-17-2014, 04:16 AM
Robots.txt file is a text file that stops web crawler software, such as Googlebot, from crawling certain pages of your site.

milos87popovic
11-17-2014, 04:23 AM
The robots.txt file is very important for your site: it is a convention for advising cooperating web crawlers and other web robots about access to all or part of a website that is otherwise publicly viewable.

Jesse R. Mitche
11-18-2014, 03:06 AM
In my view, it is mostly a file that helps crawlers learn about your site. It tells a crawler which pages of your site should be indexed.

jayanta1
11-19-2014, 01:31 AM
A robots.txt file restricts access to your site by search engine robots that crawl the web. Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit.

pryidevsblog
11-19-2014, 01:40 AM
robots.txt is a file through which you can set restrictions on which parts of the site you want to allow search engine crawlers to access and which parts you don't. The disallowed part is neither shown in search results nor indexed by the search engine.

anirban09P
11-20-2014, 12:06 AM
Robots.txt is the common name of a text file that is uploaded to a website's root directory. The robots.txt file is used to provide instructions about the website to web robots and spiders. Web authors can use robots.txt to keep cooperating web robots from accessing all or parts of a website that they want to keep private.

abhijepansd
11-20-2014, 12:40 AM
The robots.txt file tells search engines whether or not to crawl a site's pages. If you don't want certain pages crawled, you can block them in the robots.txt file.


ilead
11-20-2014, 01:40 AM
A lot of definitions and information on robots.txt have already been given, so I just want to add one of my personal observations. You should include a robots.txt file even if you do not want to block any page. The reason is simple: when Google's crawler comes to crawl your web pages, it first checks robots.txt, and sometimes (not always) it postpones crawling for a while if it cannot find a robots.txt file.