View Full Version : Robots.txt File



Shantanu
10-20-2018, 07:49 AM
Hello Friends

Can anyone tell me where to place the robots.txt file on our website?

Please tell me

georgeb
10-21-2018, 10:15 PM
Please look here (https://moz.com/learn/seo/robotstxt).

Jagdish.P
10-22-2018, 12:40 AM
In the root folder of your site. Make sure you use the correct syntax to block those segments of the site that should not be crawled.

Thanks

pharmasecure
10-22-2018, 12:55 AM
Keep in mind that only one file or folder can be used per Disallow line. You may add as many Disallow lines as you need. Once complete, save and upload your robots.txt file to the root directory of your site. For example, if your domain is www.mydomain.com, you will place the file at www.mydomain.com/robots.txt.
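To make the one-path-per-Disallow-line rule concrete, here is a minimal sketch of a robots.txt file; the directory names (/cgi-bin/, /tmp/) are just placeholders for sections you might want to keep out of search results:

```
# Applies to all crawlers
User-agent: *
# One file or folder per Disallow line
Disallow: /cgi-bin/
Disallow: /tmp/
```

This file would be saved as plain text and uploaded to the root directory, e.g. www.mydomain.com/robots.txt.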

MVMInfotech
10-22-2018, 01:26 AM
Simply type in your root domain, then add /robots.txt to the end of the URL. For instance, Moz's robots file is located at moz.com/robots.txt.

stellasweety
10-22-2018, 07:51 AM
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.

Rajdeep Bose
10-22-2018, 09:48 AM
"Robots.txt" is a regular text file whose name has special meaning to the majority of "honorable" robots on the web. By defining a few rules in this text file, you can instruct robots not to crawl or index certain files or directories within your site, or your site at all.

iprism
10-23-2018, 08:02 AM
When search engine crawlers come to your site, they look for a special file called robots.txt. This file tells search engine spiders which pages of your site should be indexed and which pages should be ignored.
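A well-behaved spider checks the rules before fetching each page. The check can be sketched with Python's standard-library robots.txt parser; the domain and paths below are made-up examples, not from any real site:

```python
from urllib import robotparser

# A hypothetical robots.txt, as a crawler might fetch it from
# https://www.example.com/robots.txt (illustrative content only).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The spider asks before crawling each URL.
print(parser.can_fetch("*", "https://www.example.com/index.html"))  # allowed
print(parser.can_fetch("*", "https://www.example.com/private/a"))   # blocked
```

In a real crawler you would call `parser.set_url(...)` and `parser.read()` to fetch the live file instead of parsing a string.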

Prateektechnoso
10-23-2018, 09:31 AM
The robots.txt file gives instructions to crawlers or search engine spiders about which website URLs to crawl and which to block.

RH-Calvin
10-24-2018, 03:34 AM
Robots.txt is a text file that contains instructions for search engine robots. The file lists which webpages are allowed and disallowed for search engine crawling.

fayeseom
10-25-2018, 02:15 AM
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl. It also tells web robots which pages not to crawl.

roycpo
10-27-2018, 08:30 AM
Robots.txt is a text file that contains instructions for search engine spiders. The file lists which pages to allow and disallow for Google's crawling. The robots.txt file helps you control the crawling of your website.