  1. #1
    Registered User
    Join Date
    Apr 2020
    Posts
    378

    SEs Check robots.txt and just go away?

    Help, guys! I submitted my site a month ago. I just checked my logs, and I can see that
    there have been 40 requests for the robots.txt file: the crawlers fetch that file and nothing else,
    then go away. Is this normal? Do search engines just check it, then come back later to crawl?
    Here's my robots.txt file:

    User-agent: *
    Disallow: /hid/
    Disallow: /images/
    Allow: /
    Disallow:

    Anything wrong with this?

    Note:
    The "Allow" is there to invite the engines to crawl everything.
    Anybody know if this works?

  2. #2
    Member
    Join Date
    Sep 2021
    Posts
    69
    The "Allow" should not be there; there is no "Allow" directive in the original robots.txt standard. Crawlers that don't support it will most likely just ignore that line, but in the worst case a strict parser could misread "Allow: /" and treat the "/" as a disallow-everything rule. The trailing empty "Disallow:" is redundant as well.
    I would change it to just read:

    User-agent: *
    Disallow: /hid/
    Disallow: /images/
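    If you want to sanity-check what a spec-following parser actually does with those rules, Python's standard-library urllib.robotparser can simulate a crawler's decision. This is just a sketch using the two Disallow rules from this thread; the URLs tested are made-up examples:

    ```python
    from urllib.robotparser import RobotFileParser

    # The rules under discussion: block /hid/ and /images/, allow the rest.
    rules = """\
    User-agent: *
    Disallow: /hid/
    Disallow: /images/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # can_fetch(useragent, url) returns True if the agent may crawl the URL.
    print(rp.can_fetch("*", "/index.html"))       # allowed
    print(rp.can_fetch("*", "/hid/secret.html"))  # blocked by Disallow: /hid/
    print(rp.can_fetch("*", "/images/logo.png"))  # blocked by Disallow: /images/
    ```

    With no matching Disallow rule, everything else crawls by default, so no "Allow" line is needed.
    
    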
