View Full Version : Use of robots.txt?



bangalorewebgur
06-08-2017, 04:16 AM
What is the use of robots.txt?

shilpa
06-08-2017, 04:22 AM
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. "User-agent: *" means that the section applies to all robots, and "Disallow: /" tells the robot that it should not visit any pages on the site.
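
For reference, a complete robots.txt containing exactly those two directives looks like this; it asks every robot to stay out of the entire site:

    User-agent: *
    Disallow: /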

daikaads
06-08-2017, 07:13 AM
Robots.txt is a text file used to tell search engines not to crawl particular URLs on a website.

davidweb09
06-08-2017, 04:10 PM
Robots.txt is a text file that helps control how search engines crawl and index the pages of a site.

ITSGOLIVE
06-08-2017, 10:16 PM
robots.txt is a standard used by websites to communicate with web crawlers and other web robots.

jane1
06-09-2017, 02:07 AM
With a robots.txt file you can stop search engines from crawling any webpage of your site that you do not want indexed. It is essentially a directive file for search engines.

smithroy
06-09-2017, 02:13 AM
Robots.txt is a standard used by websites to communicate with web crawlers and other web robots. The robots.txt file is used to control what content search engines are allowed to access on your site, which is great for controlling duplicate content and for directing your crawl budget to your most important pages.
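
As a sketch of that idea (the paths here are hypothetical), a robots.txt that steers crawl budget away from duplicate printer-friendly pages and internal search results, while leaving the rest of the site crawlable, could look like:

    User-agent: *
    Disallow: /print/
    Disallow: /search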

Ultimez
06-09-2017, 02:28 AM
Uses of robots.txt (see the sketch below):
- allowing access to specific crawlers
- blocking specific folders or content
- providing a permissive default
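
A sketch putting all three uses together (the bot name and folder are hypothetical examples):

    # Allow one specific crawler full access (an empty Disallow allows everything)
    User-agent: Googlebot
    Disallow:

    # Block a specific folder for all other robots
    User-agent: *
    Disallow: /private/

A default, fully permissive robots.txt is simply the second group with an empty Disallow value. Note that a crawler follows only the most specific group matching its name, so Googlebot here would ignore the "User-agent: *" group.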

Nekurokaze
06-09-2017, 02:57 AM
Based on my knowledge, robots.txt acts like a blocker that tells a particular search engine bot (for example, Googlebot) not to crawl a particular page we don't want it to enter. It is a good way to keep a bot away from pages or links we don't want crawled. However, since it only blocks the bot while the links themselves remain publicly visible, we can't use robots.txt as a way to "conceal" a site from other people.
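
A sketch of what that looks like, telling only Googlebot to stay out of one hypothetical section while other bots remain unaffected:

    User-agent: Googlebot
    Disallow: /members/

And since the file itself is public (anyone can fetch yoursite.com/robots.txt), the listed paths are visible to everyone, which is exactly why it cannot conceal anything.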

Sojan Babu
06-13-2017, 06:46 AM
The robots.txt file is a text file for restricting bots (robots, search engine crawlers) from a website or from certain pages on it. Using a robots.txt file with a Disallow directive, we can keep bots or search engine crawler programs out of a whole website or out of certain folders and files.
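
A sketch of the Disallow directive applied both to a folder and to a single file (the names are hypothetical):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /old-page.html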

Intuz
06-13-2017, 07:31 AM
The robots.txt file is used to provide instructions about a website to web robots and search engine spiders. A website owner can use robots.txt to keep cooperating web robots from accessing all or parts of a website that they do not want crawled by search engines.

excelcare01
06-13-2017, 08:55 AM
Robots.txt is used to control which pages of a site visiting bots may access.

pawleybel
06-13-2017, 09:19 AM
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots.

jackar56
06-13-2017, 11:14 AM
Robots.txt is very important for any website to maintain and monitor, since a single stray "Disallow: /" can block crawlers from the entire site.
