View Full Version : What is the difference between Robots.txt and Spider?

07-30-2016, 03:25 AM
Hello friends,

What is the difference between Robots.txt and Spider?

07-30-2016, 08:49 AM
robots.txt: It is a plain text file in which we write directives instructing web crawlers not to visit the links or webpages we do not want crawled.

Spider: It is the search engine's crawler, which visits, crawls, and indexes websites, saving them in the search engine's database. The search engine then ranks highly the websites that have good content and quality backlinks.

08-01-2016, 03:26 AM
Robots.txt is a text file containing instructions for search engine robots. It lists which webpages are allowed and disallowed for search engine crawling.
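As a minimal sketch, a robots.txt like the one below (the site, paths, and sitemap URL are made up for illustration) allows everything except one directory:

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The file lives at the root of the site (e.g. https://example.com/robots.txt), and each `User-agent` block applies to the crawlers whose name matches it.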

Spider is an automated search engine program that is responsible for reading through a webpage's source and providing that information to the search engine.

08-01-2016, 05:44 AM
robots.txt is the file that allows webmasters to disallow bots (or spiders) from crawling parts of their website.

08-01-2016, 06:18 AM
"Spider", "crawler", and "robot" are just different names for the same thing: the program that fetches pages. The word "robots" also appears in "robots.txt", which is the file where you put exclusions for pages you do not want the spiders to crawl.

08-01-2016, 06:23 AM
Spiders are also called bots or crawlers (Google's is called Googlebot). They are the agents of Google that collect data and help index it in Google's database.

Robots.txt is a file used to restrict crawler access to your data. If you don't want crawlers to crawl a webpage or a specific section of your website, you can block access to that section for Googlebot and other crawlers using the robots.txt file.
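The blocking described above can be sketched with Python's standard urllib.robotparser; the rules, site, and URLs below are hypothetical:

```python
import urllib.robotparser

# Hypothetical robots.txt blocking the /private/ section for all crawlers.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot may not fetch the blocked section, but the rest stays open.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Well-behaved crawlers check these rules before fetching a URL; robots.txt is a request, not an enforcement mechanism.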

08-01-2016, 07:00 AM
robots.txt: It is a text file in which we put the URLs we do not want the search engine crawler to crawl.

Spider: It is search engine software that crawls websites and indexes them in the search engine's database. A spider cannot interpret the content of an image, which is why we add an alt attribute to each image.
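For example, a descriptive alt attribute gives the crawler text to index in place of the image itself (the filename and text here are made up):

```html
<img src="red-running-shoes.jpg" alt="Pair of red running shoes on a white background">
```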

08-01-2016, 07:17 AM
robots.txt is a file in which we write directives (User-agent, Disallow, and so on) for specific URLs.
A spider, also called a crawler, is the program a search engine uses to crawl websites.

08-01-2016, 07:28 AM
Robots.txt is the file used to stop the search engine spider from crawling parts of a website, and a spider is the search engine's crawler.

08-01-2016, 08:03 AM
Spider - A program that downloads web pages, much as a browser does.
Robot - An automated computer program that visits websites, guided by search engine algorithms. It combines the tasks of a crawler and a spider, helping index web pages for the search engines.