Results 1 to 12 of 12
  1. #1
    Registered User
    Join Date
    Jul 2015
    Location
    jaipur
    Posts
    251

What is "Disallow" in a robots.txt file?

What is "Disallow" in a robots.txt file?

  2. #2
    Registered User 24x7servermanag's Avatar
    Join Date
    Jul 2017
    Location
    India
    Posts
    1,020
robots.txt controls how crawlers access a website: it tells them which parts of the site should not be visited. You can exclude pages by listing them under a Disallow directive, and well-behaved crawlers will skip those pages. A robots.txt file can also point crawlers at a sitemap, which helps them index the site's content.

The file must sit in the root directory of the website. You can ask your web hosting provider to upload it there through your control panel, and search engine crawlers will pick it up automatically.

If you have access yourself, you can upload it from your end.

  3. #3
    Registered User
    Join Date
    Sep 2017
    Posts
    1,192
Website owners use the /robots.txt file to give web robots instructions about their site; this is called the Robots Exclusion Protocol. A "Disallow: /" line tells a robot that it should not visit any pages on the site.

  4. #4
    Registered User
    Join Date
    Sep 2017
    Location
    Arizona
    Posts
    43
Disallow in robots.txt is used to stop search bots from crawling a web page or an entire website.

  5. #5
    Senior Member
    Join Date
    Jul 2017
    Location
    Surat
    Posts
    195
    Quote Originally Posted by muslimastro View Post
    What is disallow in robots.txt file?
The Disallow directive tells robots that they should not crawl or visit the pages listed under it.
robots.txt is used to give instructions to web robots.
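For example, a minimal robots.txt sketch (the directory names here are hypothetical placeholders) that asks all crawlers to skip two directories:

```
User-agent: *
Disallow: /admin/
Disallow: /private/
```

Each Disallow line blocks one path prefix; everything not listed stays crawlable.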

  6. #6
    Registered User
    Join Date
    Jun 2016
    Location
    Mumbai
    Posts
    872
Site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The "Disallow: /" tells the robot that it should not visit any pages on the site.
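You can check the effect of "Disallow: /" with Python's standard-library robots.txt parser; the example feeds the rules in directly rather than fetching a real site, and example.com is just a placeholder:

```python
# Verify that "Disallow: /" blocks every page for all crawlers.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# No URL on the site is fetchable under these rules.
print(rp.can_fetch("*", "https://example.com/any-page"))  # False
```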

  7. #7
    Registered User
    Join Date
    Dec 2017
    Posts
    42
    It is an instruction to the Search Engine to prevent (restrict) accessing of specific pages or directories.

  8. #8
    Junior Member
    Join Date
    Jan 2018
    Posts
    8
In SEO terms, Disallow is the directive in the robots.txt file that stops a crawler from visiting parts of your website. It is up to the web developer or SEO expert whether to block crawlers from the whole website or only from specific pages.
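As a sketch of that difference, again using Python's standard-library parser (example.com and the /private/ path are hypothetical): disallowing one directory leaves the rest of the site crawlable, unlike "Disallow: /".

```python
# Block only one directory, not the whole site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public.html"))        # True
```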

  9. #9
    Senior Member deepakrajput's Avatar
    Join Date
    Feb 2012
    Location
    CA
    Posts
    1,223
We use Disallow rules to keep search engine crawlers away from a webpage. Note that Disallow only blocks crawling; a disallowed page can still be indexed if other sites link to it, so a noindex meta tag is the more reliable way to keep a page out of search results.

  10. #10
    Member
    Join Date
    Sep 2017
    Posts
    90
Website owners use the /robots.txt file to provide rules about their site to web robots; this is known as the Robots Exclusion Protocol. The "Disallow: /" rule tells robots that they should not visit any pages on the site.

  12. #12
    Registered User
    Join Date
    Jul 2017
    Location
    Pompton Plains, NJ 07444
    Posts
    185
The "Disallow: /" rule in robots.txt tells a robot that it should not visit any pages on the site.
