bradedvin
03-23-2010, 05:44 AM
Search engine spiders are software programs that sift through the content of web pages and build lists of the words that appear on them. This process is called web crawling. The program visits page after page, following each link and recording the content of every page it reaches, much like a spider crawling across a web. That content is then added to the search engine's index.
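Here is a minimal sketch of that crawl loop in Python, using only the standard library. It is only an illustration of the idea, not any real search engine's crawler; the seed URL and the page limit are placeholder assumptions.

```python
# A minimal sketch of the crawl loop described above, stdlib only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_pages=10):
    """Visit page after page, following links and recording each page."""
    frontier = deque([seed])   # pages still to visit
    seen = {seed}              # avoid re-crawling the same URL
    index = {}                 # url -> raw page content
    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except Exception:
            continue           # skip pages that fail to load
        index[url] = html      # record the page's content
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:  # follow each link found on the page
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return index


# example.com is just a placeholder seed, not a recommendation
pages = crawl("https://example.com")
```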
Different search engines take different approaches when they send out their spiders. Some keep track of every word on a page. Others record only meta tags, titles, and subheadings. Indexing the 100 most common words on a page is another tactic a spider may use.
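As a rough illustration of the "100 most common words" tactic, the sketch below strips the markup from a fetched page and keeps only its most frequent words. The function name and the regex-based tag stripping are my own simplifications; real indexers parse pages far more carefully.

```python
# Keep only the top 100 terms of a page for its index entry.
import re
from collections import Counter

def top_words(html, limit=100):
    """Return the most frequent words on a page, ignoring HTML tags."""
    text = re.sub(r"<[^>]+>", " ", html)         # crude tag removal
    words = re.findall(r"[a-z]+", text.lower())  # lowercase word tokens
    return Counter(words).most_common(limit)     # [(word, count), ...]

# Example, reusing the `pages` dict from the crawl() sketch above:
# for url, html in pages.items():
#     print(url, top_words(html)[:5])
```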
Links to websites are what feed search engine spiders. The more often spiders encounter links to a website, the more often they visit it. This gives the spiders more information with which to index the site and helps it rank higher in the search engine's results for search terms related to that site.