
What is disallow in robots.txt file?



parmodshastri
12-04-2017, 11:11 PM
Hello Friends,

What is disallow in robots.txt file?

neelseowork
12-04-2017, 11:20 PM

Website owners use the /robots.txt file to give instructions about their site to web robots.
The "Disallow: /" directive tells a robot that it should not visit any pages on the site.
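
For illustration, here is a minimal robots.txt sketch (the paths and the directory names are made up for this example):

```text
# Block all bots from two hypothetical directories
User-agent: *
Disallow: /private/
Disallow: /tmp/

# An empty Disallow means this bot may crawl everything
User-agent: Googlebot
Disallow:
```

Each `User-agent` block applies to the bots that match it, and an empty `Disallow:` value allows everything for that bot.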

loveguruindia
12-05-2017, 12:16 AM
Disallow instructs robots not to visit the specified page or path.

24x7servermanag
12-05-2017, 12:54 AM
You can refer to this post for the same answer - http://forums.hostsearch.com/showthread.php?101281-What-is-disallow-in-robots-txt-file

Hope this will help you. :)

Carlbrewster
12-05-2017, 01:50 AM
Instead of asking here, you can Google it to get more detailed knowledge about robots.txt files.

fayeseom
12-05-2017, 01:55 AM
The robots.txt file is created to tell Google which pages are to be crawled and indexed; Disallow tells Google not to crawl and index a page.

dennis123
12-06-2017, 02:00 AM
Disallow is used to block bots from accessing and indexing certain files and/or folders on your website.
Example:
Disallow: /folder/ - This blocks bots from accessing anything inside this folder and indexing it. This is useful when you have certain pages or files on your website which you don't want to show up in the SERPs, like dummy pages or pages you are still working on.
Similarly, Allow: tells bots that they are allowed to crawl and index a path.
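
You can check how a given robots.txt is interpreted with Python's standard-library parser. This is a small sketch; the rules and URLs below are hypothetical, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt with one blocked directory (illustrative only)
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Anything under /private/ is disallowed for all bots
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
# Everything else stays crawlable
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.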

yuvashri
12-06-2017, 04:27 AM
The robots.txt file is made to tell Google which pages are to be crawled and indexed; Disallow tells Google not to crawl and index the page.

davidsmith21
12-07-2017, 01:46 AM
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The "Disallow: /" directive tells a robot that it should not visit any pages on the site.