Results 1 to 6 of 6
  1. #1
    Senior Member
    Join Date
    Sep 2021
    Location
    Bangalore, India
    Posts
    437

    What is robots.txt?

    What is robots.txt?

  2. #2
    Senior Member
    Join Date
    Nov 2021
    Location
    Bangalore
    Posts
    352
    Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website.
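    For example, a minimal robots.txt (placed at the site root, e.g. https://www.example.com/robots.txt) might look like this; the /admin/ and /tmp/ paths are just placeholders:

        User-agent: *
        Disallow: /admin/
        Disallow: /tmp/
        Allow: /

        Sitemap: https://www.example.com/sitemap.xml

    The User-agent line says which crawlers the rules below it apply to (* means all of them), and each Disallow line lists a path those crawlers are asked not to fetch.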

  3. #3
    Senior Member
    Join Date
    Aug 2021
    Location
    931 Clayton St San Francisco, CA 94117 United States
    Posts
    116
    Quote Originally Posted by laragiles View Post
    What is robots.txt?
    Hello,

    "A robots. txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, block indexing with no index or password-protect the page."

  4. #4
    Member
    Join Date
    Oct 2021
    Location
    Surat, Gujarat
    Posts
    56
    Quote Originally Posted by laragiles View Post
    What is robots.txt?
    Hello,

    "A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site. The robots.txt file controls which pages are accessed. The robots meta tag controls whether a page is indexed, but to see this tag the page. If a bot comes to your website and it doesn't have one, it will just crawl your website and index pages as it normally would. txt file is only needed if you want to have more control over what is being crawled."

  5. #5
    Member
    Join Date
    Nov 2021
    Posts
    33
    Hello Friends,

    Robots.txt is a text file used to tell search engine crawlers which parts of a website they should not crawl.

    A crawler is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index.
    The major search engines on the Web all have such a program, which is also known as a "spider" or a "bot."

    To find information on the hundreds of millions of Web pages that exist, a search engine employs special software robots, called spiders, to build lists of the words found on Web sites.

    When a spider is building its lists, the process is called Web crawling.
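    As a very rough sketch of that crawling step, here is what reading one page's words and links looks like with only Python's standard library (example.com is a placeholder; a real spider would repeat this for every link it discovers and respect robots.txt):

        from urllib.request import urlopen
        from html.parser import HTMLParser

        class WordAndLinkCollector(HTMLParser):
            """Collects visible words and outgoing links from one HTML page."""
            def __init__(self):
                super().__init__()
                self.words = []
                self.links = []

            def handle_data(self, data):
                # Text between tags becomes the "list of words" for the index
                self.words.extend(data.split())

            def handle_starttag(self, tag, attrs):
                # Anchor tags give the spider its next pages to visit
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.links.append(value)

        html = urlopen("https://www.example.com/").read().decode("utf-8", errors="ignore")
        collector = WordAndLinkCollector()
        collector.feed(html)
        print(len(collector.words), "words,", len(collector.links), "links found")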

  6. #6
    Member
    Join Date
    Aug 2021
    Posts
    68
    A robots.txt file tells search engine crawlers which URLs can be accessed on your website. This is mainly used to prevent your site from being overloaded by requests; it is not a mechanism for keeping pages out of Google.
