  2. #2
    Junior Member
    Join Date
    Dec 2017
    Posts
    19
    The robots.txt file is primarily used to specify which parts of your website should, and should not, be crawled by spiders or web crawlers.

  4. #4
    Registered User
    Join Date
    Sep 2017
    Location
    Arizona
    Posts
    43
    Robots.txt allows your website to communicate with web crawlers, telling them which pages they may and may not crawl.

  6. #6
    Registered User 24x7servermanag
    Join Date
    Jul 2017
    Location
    India
    Posts
    1,020
    Robots.txt tells crawlers which parts of a website should not be accessed. You can specify the pages that should not be crawled by adding Disallow directives to robots.txt; crawlers that honor the file will skip those pages. Keeping low-value pages out of the crawl also helps search engines index the content that matters. A short example is sketched below.

    You can ask your web hosting provider to upload the file to the root directory of your website through your control panel, and search engine crawlers will pick it up automatically.

    If you have access yourself, you can upload it from your end.
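
    As a rough sketch, a minimal robots.txt might look like the following; the /admin/ and /tmp/ paths and the sitemap URL are only placeholders, not taken from any post in this thread:

        User-agent: *
        Disallow: /admin/
        Disallow: /tmp/

        Sitemap: https://www.example.com/sitemap.xml

    The User-agent line says which crawlers the group of rules applies to (* means all of them), each Disallow line names a path that compliant crawlers should skip, and the optional Sitemap line points crawlers at the sitemap.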

  7. #7
    Senior Member
    Join Date
    Jun 2013
    Location
    Forum
    Posts
    5,019
    Robots.txt is a text file that contains instructions for search engine robots. It lists which webpages are allowed and which are disallowed for search engine crawling.

  9. #9
    Senior Member
    Join Date
    Sep 2017
    Posts
    176
    Robots.txt is a text file that contains directives for web crawler robots. The file records which pages are permitted and which are refused for crawling.

  10. #10
    Registered User
    Join Date
    Oct 2017
    Posts
    162
    The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
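
    To see the standard from the crawler's side, here is a small sketch using Python's standard urllib.robotparser module; the example.com URLs and the MyCrawler user-agent string are placeholders, not taken from this thread:

        from urllib import robotparser

        # Point the parser at the site's robots.txt (example.com is a placeholder).
        rp = robotparser.RobotFileParser()
        rp.set_url("https://www.example.com/robots.txt")
        rp.read()  # fetch and parse the file

        # A well-behaved crawler asks before requesting each page.
        if rp.can_fetch("MyCrawler", "https://www.example.com/private/report.html"):
            print("Allowed to crawl this page")
        else:
            print("robots.txt disallows this page")

    Crawlers that follow the robots exclusion standard perform a check like this before fetching each URL; the file is purely advisory, so badly behaved bots can still ignore it.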

  11. #11
    Registered User
    Join Date
    Dec 2017
    Posts
    46
    Robots.txt is a text file webmasters create to instruct web robots how to crawl pages on their website.
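
    Because the rules are grouped by user-agent, one robots.txt can give different instructions to different robots. A hedged sketch (Googlebot is a real crawler name, but the /drafts/ path is just an example):

        # Block a hypothetical /drafts/ directory for all crawlers
        User-agent: *
        Disallow: /drafts/

        # Give Googlebot its own group with nothing blocked
        User-agent: Googlebot
        Disallow:

    An empty Disallow line means nothing is disallowed for that crawler; a robot uses the most specific group that matches its user-agent and ignores the rest.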
