How does a search engine index a website?



ankitasharma123
09-08-2016, 02:13 AM
Hello friends,

How does a search engine index a website?

RH-Calvin
09-08-2016, 02:27 AM
Search engines have their own algorithms for crawling and indexing webpages. First a page is crawled by robots (also called spiders), and then it is added to the search engine's index.
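To make the two stages concrete, here is a toy sketch in Python (not Google's actual pipeline, just an illustration of the idea): the crawl step fetches a page's HTML, and the index step records which words appear on which URL in a small in-memory inverted index. The example URL is only a placeholder.

from urllib.request import urlopen
from collections import defaultdict
import re

# word -> set of URLs the word appears on (a tiny inverted index)
inverted_index = defaultdict(set)

def crawl(url):
    """Stage 1: fetch the raw HTML of a page."""
    with urlopen(url) as response:
        return response.read().decode("utf-8", errors="ignore")

def index_page(url, html):
    """Stage 2: record which words appear on which page."""
    text = re.sub(r"<[^>]+>", " ", html)            # crude tag stripping
    for word in re.findall(r"[a-z0-9]+", text.lower()):
        inverted_index[word].add(url)

if __name__ == "__main__":
    page_url = "https://example.com/"               # placeholder URL
    index_page(page_url, crawl(page_url))
    print(sorted(inverted_index)[:10])              # a few of the indexed words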

pablohunt2812
09-10-2016, 05:15 AM
How search engines index your site

How do you get your site to appear in search engines?

The first question many people have is how to get their site to appear in search engines. In days gone by, webmasters would manually submit their site to all of the major directories, such as Yahoo, and wait for it to appear. Those days have largely come to an end, with search engines like Google becoming almost completely automated.

Google uses software known as spiders to crawl the web, looking for sites to add to its index, so the chances are your site is already indexed. To check whether your site is indexed in Google, you can type 'site:' followed by your website address into the Google search box (e.g. site:http://www.netxtra.net).

If your site isn’t listed in the index, it means the spiders haven’t found it yet, which can happen for a number of reasons (for example, your site was unavailable when Google tried to reach it, or its design makes it difficult to crawl).

To understand how search engines index your site, you first need to understand how spiders work.

Spiders
As you probably already know, when you type in a search the results you see are not ‘real time’. They are built from snapshots of pages taken by programs called “spiders”.

So if you perform a search for netXtra and click on the ‘cached’ link underneath the entry, you will see text at the top of the page explaining when the snapshot of that page was taken.

How do spiders work?
In order to spider your site, robots follow the links in your pages. So in the netXtra example, the spider would identify ‘about us’, ‘services’, ‘clients’ and so on. Once it had indexed the main page, it would follow these links and index those pages; once it had indexed the ‘about us’ page, it would look for further links and carry on spidering.

This means sites with Flash navigation, for example, make life hard for the spider, as it cannot follow links.

The spider looks through the content on all of the pages it finds, and uses this to help build up the Google search index.
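As a rough illustration of that link-following behaviour, below is a minimal spider sketch in Python: it starts from one page, extracts the <a href> links using the standard html.parser module, and visits each newly discovered page in turn. It is a simplification of what real crawlers do (no politeness delays, no robots.txt check), and the start URL is only a placeholder.

from urllib.request import urlopen
from urllib.parse import urljoin
from html.parser import HTMLParser
from collections import deque

class LinkCollector(HTMLParser):
    """Collects the href values of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def spider(start_url, max_pages=5):
    seen, frontier = set(), deque([start_url])
    while frontier and len(seen) < max_pages:
        url = frontier.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="ignore")
        except Exception:
            continue                                 # skip unreachable pages
        parser = LinkCollector()
        parser.feed(html)
        for link in parser.links:
            frontier.append(urljoin(url, link))      # follow discovered links
    return seen

if __name__ == "__main__":
    print(spider("https://example.com/"))            # placeholder start URL

Because the spider only discovers pages through plain HTML links, navigation it cannot parse (such as Flash menus) leaves those pages unreachable, which is the problem described above.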

How do I make my site ‘search-engine friendly’?
This is a frequently asked question, and not one with an easy answer. There are many companies that offer guaranteed positions on search engines after undertaking ‘search engine optimisation’, so many people believe there is a formula that guarantees a high ranking.

Although there are many quick-and-dirty ways to move a site up a few places, trying to con a search engine spider can result in your site being removed from the index entirely.

Some sneaky tricks that have been used to ‘improve’ ranking in the past include:

Adding hidden text or hidden links to pages specifically for search engines.
Cloaking: showing different content to search engines than to users.
Loading pages with irrelevant words.
Creating multiple pages, subdomains, or domains with substantially duplicate content.
Creating "doorway" pages just for search engines, or pages for affiliate programs with little or no original content.

The way spiders work is constantly being updated, so these quick wins never last very long and can have damaging long-term effects for your website.

The most important thing to do is to make sure your content is relevant and up to date, and to have a number of high-quality sites linking to your pages.

deepakrajput
09-10-2016, 07:10 AM
When you allow robots to access your site (for example in your robots.txt file), search engines can crawl and index it.
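To illustrate the point about allowing robots, here is a small sketch using Python's standard urllib.robotparser, which is one way to check what a site's robots.txt permits before a page is crawled; the URLs below are placeholders.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()                                  # fetch and parse robots.txt

# can_fetch() reports whether the given user agent may visit the URL;
# pages that are disallowed here are not crawled, and so not indexed.
print(rp.can_fetch("*", "https://example.com/"))
print(rp.can_fetch("*", "https://example.com/private/"))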