View Full Version : What is the use of Robots.txt?

01-29-2016, 01:31 PM
What is the use of Robots.txt?

01-30-2016, 12:20 AM
It instructs search engine crawlers how to crawl and index pages on a website. Note that it is not really a security mechanism: compliant bots obey it voluntarily, and it does not actually block access to anything.

01-30-2016, 03:40 AM
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
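For example, the exact directives quoted above form a minimal robots.txt that tells every robot to stay off the whole site. It is just a plain text file served from the site root (e.g. example.com/robots.txt):

```
User-agent: *
Disallow: /
```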

01-31-2016, 05:00 PM
Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines but generally search engines obey what they are asked not to do.
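To illustrate how an obedient crawler honors these rules, here is a minimal sketch using Python's standard-library urllib.robotparser; the rules, paths, and example.com URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt that blocks the /private/ folder.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A well-behaved bot calls can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

A bot that ignores can_fetch() can still download the disallowed page, which is why robots.txt is a request, not an access control.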

01-31-2016, 11:16 PM
Robots.txt is the first file that a well-behaved search engine bot will fetch before it crawls the other web pages of a website.

02-01-2016, 12:34 AM
Robots.txt is the common name of a text file that is uploaded to a website's root directory. The robots.txt file is used to give directions about the website to web robots and spiders.

02-01-2016, 01:27 AM
Robots.txt lets us clearly state our site's crawl permissions to search engines. If you disallow a web page, compliant search engines will not crawl that page or store its content in their database. This gives us control over which web pages can be crawled and indexed online.

02-01-2016, 01:43 AM
It can be used to block low-value content on a site from being crawled and indexed by search engines such as Google.

Black Cows
02-01-2016, 02:01 AM
Good crawlers visit and respect the instructions in the robots.txt file.
For example, if you are creating a new website in a subfolder of your existing site, and you will later move it to your root domain, you can instruct crawlers via robots.txt not to crawl those pages, to avoid a duplicate-content penalty when you later move the pages to your root folder.
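Assuming the work-in-progress site lives in a hypothetical /newsite/ subfolder, a robots.txt rule like this keeps crawlers out of it while you build:

```
User-agent: *
Disallow: /newsite/
```

Once the pages move to the root, you would remove this rule so the final URLs can be crawled normally.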

Did you get it?

If not, you can ask me more about it or PM me.

I'll be glad to help you out.

02-01-2016, 02:36 AM
The robots.txt file is a simple text file placed on your web server that tells web crawlers whether or not they should access a file.
If the robots.txt file says a spider may enter, the search engine spider then continues on to the page files.
If you have instructions for a search engine robot, the robots.txt file is where you give them.
Here is an article that will tell you the basics of Robots.txt