  2. #2
    Registered User arjupraja2 (Tampa; joined Aug 2016; 22 posts)
    There are many online tools available for this; just search Google for "create robots.txt file" and you will find plenty of generators.

  3. #3
    Creating a robots.txt file is very simple. Just open Notepad and write the code below:

    User-agent: *
    Allow: /

    Sitemap: (URL of your website's sitemap)

    If you want to block a specific search engine from crawling your website, you need to put that engine's crawler name after User-agent. Suppose I don't want my website to be crawled by Google's bot; then my robots.txt file would look something like this:

    User-agent: Googlebot
    Disallow: /

    User-agent: *
    Allow: /

    This means that all crawlers other than Googlebot can crawl the website.

    If you want to disallow a specific folder, page, or file, just add lines like these:
    Disallow: /folder/
    Disallow: /file.html
    Disallow: /image.png
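    If you want to sanity-check rules like these before uploading, Python's standard urllib.robotparser module can parse them. A minimal sketch, assuming rules like the examples above (the bot and path names are just illustrations):

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the example above: block Googlebot entirely,
# and block one folder and one file for all other crawlers.
rules = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow: /folder/
Disallow: /file.html
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/any-page.html"))        # False: blocked everywhere
print(rp.can_fetch("SomeOtherBot", "/folder/page.html"))  # False: inside blocked folder
print(rp.can_fetch("SomeOtherBot", "/about.html"))        # True: not blocked
```

    The same parser can fetch a live file via set_url() and read(), which is handy for checking the robots.txt you actually deployed.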

  4. #4
    Registered User (joined Aug 2016; 130 posts)
    The syntax for the keywords is as follows:

    User-agent: [the name of the robot the following rule applies to]

    Disallow: [the URL path you want to block]

    Allow: [the URL path of a subdirectory, within a blocked parent directory, that you want to unblock]
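    Putting the three keywords together, a rule set that blocks a directory but unblocks one subdirectory inside it could look like this (the paths here are hypothetical):

```
User-agent: *
Disallow: /private/
Allow: /private/public-docs/
```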

  5. #5
    Registered User (India; joined Aug 2015; 190 posts)
    You can easily create a robots.txt file yourself. Make a file named robots.txt and use:

    User-agent: *
    Disallow:

    (An empty Disallow line allows all crawlers to access everything; use "Disallow: /" instead if you want to block the whole site.)


  6. #6
    Senior Member (joined Jun 2013; 5,019 posts)
    Robots.txt is a text file containing instructions for search engine robots. It lists which pages of the site crawlers are allowed and disallowed to visit.

  7. #7
    Registered User (joined Jul 2016; 38 posts)
    There are lots of tools available on the internet, but we prefer to use a generator. Robots.txt is a file that tells the search engine spider which pages should or should not be crawled, by allowing or disallowing them.

  8. #8
    Registered User (UK; joined Aug 2016; 52 posts)
    The robots.txt file has a very simple format and can be created in Notepad. If you use WordPress, your robots.txt file might look like this:

    User-agent: *
    Disallow: /wp-
    Disallow: /feed/
    Disallow: /trackback/

    "User-agent: *" means that all search robots, from Google, Yahoo, MSN, and so on, should follow these rules when crawling your site.
    "Disallow: /wp-" tells search engines not to go rummaging through WordPress files whose paths start with /wp-.

    If you use Google Webmaster Tools, it also helps you create and test a robots.txt file.
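    To see that "Disallow: /wp-" really matches by path prefix, the rules above can be checked with Python's standard urllib.robotparser module. A quick sketch (the bot name and sample paths are just illustrations):

```python
from urllib.robotparser import RobotFileParser

# The WordPress-style rules quoted above
rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
""".splitlines())

# "Disallow: /wp-" covers /wp-admin/, /wp-content/, /wp-includes/, ...
# because Disallow rules match by path prefix.
for path in ("/wp-admin/", "/wp-content/uploads/logo.png", "/feed/", "/hello-world/"):
    print(path, rp.can_fetch("ExampleBot", path))
```

    Only /hello-world/ comes back as fetchable; everything starting with /wp- is caught by the single prefix rule.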

  9. #9
    Member (joined Jun 2016; 30 posts)
    Simply add this code in Notepad, save it as robots.txt, and upload it to your website's root directory.
