
What are Spiders, Robots and Crawlers and what are their functions?



dennis123
12-06-2018, 01:11 AM
What are Spiders, Robots and Crawlers and what are their functions?

mtgfanboy
12-06-2018, 01:30 AM
Spider: A spider is a program that visits web sites and reads their pages and other information in order to create entries for a search engine index.
Robots: The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages or sections of your site they may or may not crawl.
Crawlers: A crawler is a program used by search engines to collect data from the internet. When a crawler visits a website, it picks over the entire website's content (i.e. the text) and stores it in a database.
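To illustrate the robots.txt point: a minimal robots.txt, placed at the site root (the domain here is just a placeholder), might block all robots from an admin area while allowing everything else:

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` means the rules apply to every robot; well-behaved crawlers fetch this file before crawling anything else on the site.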

nancy07
12-06-2018, 01:50 AM
Hi, they are all essentially the same thing, and they have the same function: to read a website and feed information about it to Google's database, so the site can be ranked according to the search engine's algorithms.

dombowkett
12-06-2018, 01:50 AM
Search engine spiders, robots, and crawlers are used to index websites, which helps improve their search engine rank.

shopshs
12-18-2018, 02:14 AM
Spider - A program that downloads web pages, much the way a browser does.


Robots - Automated computer programs that can visit websites, guided by search engine algorithms. A robot can combine the tasks of a crawler and a spider, helping the search engines index web pages.

Crawler – A program that automatically follows the links on a web page.
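The "follows the links" idea above can be sketched in a few lines of Python. This is a toy illustration, not a real crawler: instead of fetching over the network, it "crawls" a hypothetical in-memory site (a dict mapping made-up URLs to HTML snippets), extracting links with the standard library's html.parser and visiting each page once:

```python
from html.parser import HTMLParser

# A hypothetical three-page site: URL -> HTML content.
SITE = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/">Home</a> <a href="/about">About</a>',
}

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start):
    seen = set()        # pages already visited (the "index")
    frontier = [start]  # pages discovered but not yet visited
    while frontier:
        url = frontier.pop(0)
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(SITE[url])         # "read" the page
        frontier.extend(parser.links)  # follow its links
    return seen

print(sorted(crawl("/")))  # every page reachable from "/"
```

A real crawler works the same way in outline, but fetches pages over HTTP, respects robots.txt, and stores the page text for the search engine's index.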

kanagaseo
12-22-2018, 05:59 AM
Crawlers, robots and spiders are all the same. In layman's terms, they collect all the information from your website and store it in temporary databases. If search engines don't crawl your website, it will be difficult for your website to rank on Google.