  2. #2
    Senior Member
    Join Date
    Nov 2018
    Posts
    1,853
    A robots.txt file shows which pages or files Googlebot can or can't request from a website. Webmasters usually use it to avoid overloading the site with requests.
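    As a sketch of what that looks like in practice, a minimal robots.txt placed at the site root (e.g. example.com/robots.txt) might read as follows; the path names here are hypothetical:

    ```
    # Googlebot may crawl everything except the /private/ folder
    User-agent: Googlebot
    Disallow: /private/

    # All other bots may crawl the whole site
    User-agent: *
    Allow: /
    ```

    Each `User-agent` group applies to the named crawler; a bot obeys the most specific group that matches it.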

  3. #3
    Registered User
    Join Date
    Jul 2019
    Location
    USA
    Posts
    66
    The robots.txt file on a website is meant specifically for bots. When a bot visits a website, it looks for the robots.txt file and then operates according to the instructions in that file. Without a robots.txt file, crawlers simply assume there are no restrictions and may crawl every accessible page.


  4. #4
    Registered User
    Join Date
    Dec 2018
    Posts
    544
    The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl. It also tells web robots which pages not to crawl.

  5. #5
    Senior Member
    Join Date
    Jul 2019
    Posts
    582
    Robots.txt is a text file used to communicate with web crawlers. The file is located in the root directory of a site. It works by telling search bots which parts of the site should and shouldn't be scanned.
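    Python's standard library ships a parser for exactly this file, which makes the mechanics easy to see. A rough sketch, using made-up rules and URLs rather than any real site:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt content, for illustration only
    rules = """\
    User-agent: *
    Disallow: /admin/
    Allow: /
    """.splitlines()

    rp = RobotFileParser()
    rp.parse(rules)

    # A well-behaved crawler checks each URL before fetching it
    print(rp.can_fetch("*", "https://example.com/admin/secret"))  # False
    print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
    ```

    In real use you would call `rp.set_url("https://example.com/robots.txt")` and `rp.read()` to fetch the live file instead of parsing a string.
    
    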

  6. #6
    Senior Member
    Join Date
    Sep 2019
    Posts
    770
    Robots.txt is a text file webmasters create to instruct web crawlers (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulates how robots crawl the web.

  7. #7
    Member
    Join Date
    Sep 2018
    Posts
    74
    Robots.txt is a text file created to instruct crawlers how to crawl pages on a website; it is part of the standard that regulates how robots crawl the web.

  8. #8
    Senior Member
    Join Date
    Aug 2017
    Location
    8th Floor, Olympia National Towers, Block 3, A3 & A4, North Phase, Guindy Industrial Estate, Chennai - 600032, Tamilnadu, India.
    Posts
    486
    robots.txt is a file used to tell crawlers to skip particular unwanted URLs, files, and folders. It helps us avoid unwanted issues and keep our website SEO friendly.

  10. #10
    Registered User
    Join Date
    Nov 2019
    Posts
    2,528
    The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.

  11. #11
    Senior Member
    Join Date
    Jun 2013
    Location
    Forum
    Posts
    5,019
    Robots.txt is a text file containing instructions for search engine robots. It lists which webpages are allowed and which are disallowed for search engine crawling.
