View Full Version : What is a spider?



seoinheritx1
09-25-2012, 03:17 AM
A spider is a robot that travels the web, examining websites in order to add them to a search engine's database and rank them according to that search engine's specific ranking criteria. Once your site is indexed in the database, the spiders will periodically come back to it. That is why it is important to refresh your content regularly: the spiders will come back less often if your content always stays the same.

rudraksha
09-25-2012, 03:20 AM
It is a program that most search engines use to find what’s new on the Internet. Google’s web crawler is known as GoogleBot. There are many types of web spiders in use, but for now, we’re only interested in the Bot that actually “crawls” the web and collects documents to build a searchable index for the different search engines. The program starts at a website and follows every hyperlink on each page.
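
The "follows every hyperlink on each page" step can be sketched with Python's standard `html.parser` module. This is only an illustration: the page content and URLs below are invented, and a real spider would download the HTML over HTTP first.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, as a crawler would before following them."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A made-up page; a real spider would fetch this over HTTP.
page = '<p><a href="/about">About</a> and <a href="http://example.com/">home</a></p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', 'http://example.com/']
```

Each collected link would then be fetched and parsed the same way, which is how the crawl spreads from one page to the rest of the site.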

mikeslough5
09-25-2012, 04:12 AM
Spiders are basically crawlers of search engines.

adelaclark
09-25-2012, 04:39 AM
A lot of the content and backlinks present on a website may not actually be recognizable to the search engines, e.g. Flash-based content, content created through JavaScript, content presented as images, and so on. Generally, anything rendered on the client side may not be recognizable to search engine spiders.

JanPaul
09-25-2012, 04:41 AM
I had no idea about spiders, brother... thanks for sharing the info about them...

chugtairizwan
09-25-2012, 06:13 AM
Hey Mike! I agree with you, great answer. I'm sure a spider is a search engine's crawler; Google's spider, for example, is just another name for the Google crawler...

sabrinasai
09-25-2012, 06:28 AM
SEO Spider crawls any website and returns the number of pages on the server, the number of pages indexed by Google, link popularity, Alexa rank, and a summary report with search engine ranking stats.

maxjohn148
09-25-2012, 09:24 AM
People usually refer to search engine crawlers as spiders or robots.

shwygft
09-25-2012, 09:48 AM
Thanks for sharing the info. I learned a lot!

sea queen
09-27-2012, 12:53 AM
Very nice information. I hope you will share this type of information again.

seanbarrington
09-27-2012, 01:25 AM
SEO Spider crawls any website and returns the number of pages on the server, the number of pages indexed by Google, link popularity, Alexa rank, and a summary report with search engine ranking stats.

titleseo
09-27-2012, 02:01 AM
A spider is search engine software that crawls websites. It is built for collecting data from online sources. All spider software works automatically.

fivebucksdeals
09-27-2012, 02:27 AM
(Spiders) The main program used by search engines to retrieve web pages to include in their database. See also: Robot.

blessy_smith
09-27-2012, 02:43 AM
Very interesting post.. Thanks for it...

samdorjey
09-28-2012, 01:51 AM
A search engine spider is a program that automatically fetches web pages from a site, takes a snapshot of those pages, and then forwards them to the database.

prgetxhtml
10-12-2012, 03:57 AM
A 'spider' is a searchbot - the tool that a search engine uses to crawl a website and index its pages for the search engine results. Because it 'crawls the web' it is generally termed a spider.

webcreations
10-12-2012, 05:11 AM
A spider is a program that visits websites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a crawler or a bot.

Spiders are typically programmed to visit sites that have been submitted by their owners as new or updated. Entire sites or specific pages can be selectively visited and indexed. Spiders are called spiders because they usually visit many sites in parallel, their "legs" spanning a large area of the web.

Spiders can crawl through a site's pages in several ways. One way is to follow all the hypertext links on each page until all the pages have been read.
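
The "follow all the hypertext links until all the pages have been read" approach is essentially a graph traversal. Here is a minimal sketch in Python that crawls a tiny in-memory "web" (the page names and links are invented; a real spider would fetch each page over HTTP and extract its links from the HTML):

```python
from collections import deque

# A tiny made-up "web": each page maps to the pages it links to.
SITE = {
    "index": ["about", "products"],
    "about": ["index"],
    "products": ["index", "contact"],
    "contact": [],
}

def crawl(start):
    """Breadth-first crawl: follow every link until all reachable pages are read."""
    indexed = []            # the "search engine index" being built
    seen = {start}          # avoid visiting the same page twice
    queue = deque([start])
    while queue:
        page = queue.popleft()
        indexed.append(page)             # pretend we stored the page content
        for link in SITE.get(page, []):  # follow each hyperlink on the page
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return indexed

print(crawl("index"))  # ['index', 'about', 'products', 'contact']
```

The `seen` set is what keeps a crawler from looping forever when pages link back to each other, as `index` and `about` do here.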

SamDenial
10-12-2012, 06:45 AM
A spider (also called a bot or crawler) is a software program that fetches websites to index them, so that they can appear in search engine results pages whenever somebody searches a query related to your site.

jacksonpeter`
10-12-2012, 07:39 AM
It's a Google program named a "spider". It is developed to read and index sites so they can be returned to users according to their search queries. Thanks

adamscot
10-12-2012, 07:52 AM
A spider is search engine software that crawls websites. It is built for collecting data from online sources. All spider software works automatically.

Can you explain how it works...?

ahsanabrar
10-12-2012, 07:55 AM
A spider is a search engine crawler, which crawls your website.

otonat007
10-12-2012, 10:49 AM
Spiders is a name for the robots (automated programs) that scan the web and save the information from websites to a database.

danish00
10-13-2012, 02:13 AM
A spider is a program that visits a website's pages and saves them to its database. It benefits our sites.

robotforce
10-13-2012, 03:33 AM
A program that automatically fetches web pages. Spiders are used to feed pages to search engines. It's called a spider because it crawls over the Web; another term for these programs is web crawler.
Because most web pages contain links to other pages, a spider can start almost anywhere. As soon as it sees a link to another page, it goes off and fetches it. Large search engines, like AltaVista, have many spiders working in parallel.

nilam-shah
10-30-2014, 06:29 AM
Google's spider is called Googlebot, or a crawler; it crawls websites' content.

vcominfotech
11-03-2014, 01:17 AM
It's a program that crawls all our web pages and reports information about our website back to Google.

ajay@bhavya
11-03-2014, 01:45 AM
A Web crawler may also be called a Web spider, an ant, an automatic indexer, or (in the FOAF software context) a Web scutter. Web search engines and some other sites use Web crawling or spidering software to update their web content or indexes of others sites' web content.

rajivdevkk
11-03-2014, 01:53 AM
Those are nice details about spiders. A spider is a search engine crawler that crawls a site's pages and saves them to its database.

jaysh4922
11-03-2014, 05:24 AM
Spiders, crawlers, or Google bots are just programs the search engines use to index the content found on the World Wide Web.

Christopherdave
11-03-2014, 11:42 PM
A spider is a robot that scans websites for the search engine's database so they can be ranked according to the search engine's criteria.

acsius1
11-04-2014, 07:04 AM
Spiders belong to the phylum Arthropoda, along with insects and crustaceans. The order of spiders, Araneae, together with scorpions, harvestmen, and the large order of mites and ticks, makes up the class Arachnida.

Rajdeep Bose
11-05-2014, 07:06 AM
A spider, also known as a robot or a crawler, is actually just a program that follows, or "crawls", links throughout the Internet, grabbing content from sites and adding it to search engine indexes. Spiders can only follow links from one page to another and from one site to another.

ATSI
03-09-2015, 03:22 AM
A spider, or bot, is a search engine program that crawls, or visits, every website that is indexed, checks for any updates, and ranks the sites according to the latest Google algorithms.

SammiRose
03-09-2015, 06:24 AM
A spider is a special program the various search engines use to collect data from websites and deliver it to the search engines' data centers. You can prevent a spider from indexing certain web pages or directories through a robots.txt file.
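
The robots.txt mechanism mentioned above can be tried with Python's standard `urllib.robotparser` module. The rules below are an invented example that asks all spiders to skip one directory:

```python
from urllib.robotparser import RobotFileParser

# An invented robots.txt: every spider (User-agent: *) must skip /private/.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
```

Note that robots.txt is only a request: well-behaved spiders check it before fetching, but it is not an access control mechanism.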

om2580
03-09-2015, 07:12 AM
Very nice, useful information. Thanks for posting it with us.

oliviagrant
03-09-2015, 07:15 AM
Hello Friend,

A spider is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index.

BenWood
03-09-2015, 07:33 AM
Spiders follow hyperlinks from one web page to another and from one website to another. That is the reason why hyperlinks to your web page are so crucial. Getting hyperlinks to your web page from other sites gives the spiders more opportunities to find and re-index your web page.

sheenagoyal13
03-09-2015, 07:43 AM
A spider is a robot. It is used to examine websites in order to add them to the search engine's database; in other words, it is used to index new websites.

cmsideas
03-10-2015, 11:13 PM
A spider is a piece of software that follows links throughout the Internet, grabbing content from sites and adding it to search engine databases.