
View Full Version : What Are Search Engine Spiders?



bradedvin
03-23-2010, 05:44 AM
Search engine spiders are software programs that sift through the content of web pages and build lists of the words that appear on those pages. This process is called web crawling. The program visits page after page, following each link and recording the content of each page as it goes, much like a spider crawling across a web. That content is then added to the search engine's index.

Different search engines use different approaches when sending out their search engine spiders. Some keep track of every word on a page. Others record meta tags, titles and subtitle words. Indexing the 100 most common words on a page is another tactic used by a search engine spider.

Links to websites are what feed search engine spiders. The more often spiders see links to a website, the more often they visit it. This gives the spiders more information for indexing the site and makes the site rank higher in the search engine's results for search terms related to that site.
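The two jobs described above — collecting the links to follow next and listing the words on each page — can be sketched in a few lines of Python. This is just a toy illustration using the standard library's HTML parser; the `PageSpider` and `index_page` names are made up for this example, and a real spider would also fetch pages over the network, obey robots.txt, and track visited URLs:

```python
from html.parser import HTMLParser
from collections import Counter

class PageSpider(HTMLParser):
    """Collects outgoing links and visible words from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []          # hrefs to crawl next
        self.words = Counter()   # word -> occurrence count on this page

    def handle_starttag(self, tag, attrs):
        # Record every <a href="..."> so the spider can follow it later.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Count the plain words between tags (a crude "list of words").
        self.words.update(w.lower() for w in data.split() if w.isalpha())

def index_page(html):
    """Return (outgoing links, up to 100 most common words) for one page."""
    spider = PageSpider()
    spider.feed(html)
    return spider.links, spider.words.most_common(100)

sample = ('<html><body><p>Spiders crawl pages. Spiders index words.</p>'
          '<a href="/about">About page</a></body></html>')
links, top_words = index_page(sample)
# links -> ["/about"]; top_words starts with ("spiders", 2)
```

The `most_common(100)` call mirrors the "100 most common words" tactic mentioned above; a spider that records every word would simply keep the whole counter.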

bermuda
03-27-2010, 06:56 AM
In the past, search spiders worked largely independently of the links they found across the net, but nowadays they follow established backlinks to discover newly launched websites and to index recently set up pages.

elena1234
04-09-2010, 03:00 AM
Crawlers can also be used for automating maintenance tasks on a Web site, such as checking links or validating HTML code.
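A common maintenance task like the link checking mentioned above can be sketched as follows. This is a minimal illustration, not a production tool: the `find_broken_links` name is invented for this example, and the fetcher is injectable so the logic can be exercised without touching the network:

```python
from urllib.request import urlopen
from urllib.error import URLError

def find_broken_links(urls, fetch=None):
    """Return the subset of urls that fail to load (error or status >= 400)."""
    if fetch is None:
        # Default fetcher: return the HTTP status code for the URL.
        def fetch(url):
            with urlopen(url, timeout=10) as resp:
                return resp.status
    broken = []
    for url in urls:
        try:
            status = fetch(url)
        except (URLError, OSError):
            broken.append(url)   # unreachable host, timeout, etc.
            continue
        if status >= 400:
            broken.append(url)   # 404 Not Found, 500 Server Error, ...
    return broken
```

A site-maintenance crawler would combine this with a link extractor: crawl the site's own pages, collect every href, then report the ones that no longer resolve.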

davidemarsh01
04-13-2010, 11:07 PM
A spider, also known as a robot or a crawler, is actually just a program that follows, or "crawls", links throughout the Internet, grabbing content from sites and adding it to search engine indexes.

Spiders can only follow links from one page to another and from one site to another. That is the primary reason why links to your site are so important. Links to your website from other websites give the search engine spiders more "food" to chew on. The more times they find links to your site, the more times they will stop by and visit. Google especially relies on its spiders to create its vast index of listings.

Spiders find Web pages by following links from other Web pages, but you can also submit your Web pages directly to a search engine or directory and request a visit by their spider. In fact, it's a good idea to manually submit your site to a human-edited directory such as Yahoo; spiders from other search engines will usually find it there and add it to their databases. Submitting your URL straight to the various search engines can be useful as well, but spider-based engines will usually pick up your site whether or not you've submitted it.

chincalen
04-19-2010, 12:55 PM
Sometimes they're called spiders, other times they're called bots, and still other times they're called crawlers. Whatever the name, they all mean the same thing: programs that search through your website to index it properly.

~ServerPoint~
04-20-2010, 04:48 AM
Thanks for your information. The thread is closed.