View Full Version : How Google finds out duplicated content?

04-06-2010, 12:43 PM
I wonder how Google finds out about duplicated content, and what should be changed on a site so that search engines index the pages properly and don't penalize them for non-original content?

04-07-2010, 03:23 AM
The Google search engine is software: it stores a complete database of the pages it crawls in its own index, which is how it can compare content across sites.

04-07-2010, 04:37 AM
Google - Duplicate content....
If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. This is called "canonicalization".
In some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.
Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a "regular" and a "printer" version of each article, and neither of these is blocked with a noindex meta tag, we'll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.
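To make the post above concrete, the canonicalization and noindex options it mentions are declared in the page's markup. The URLs below are placeholders, not anything from this thread:

```html
<!-- In the <head> of the "printer" version of an article:
     tell search engines which URL is the preferred ("regular") one,
     instead of letting them pick on their own. -->
<link rel="canonical" href="http://www.example.com/articles/my-article" />

<!-- Or keep the printer version out of the index entirely
     with a robots noindex meta tag. -->
<meta name="robots" content="noindex" />
```

Use one approach or the other per page: canonical consolidates signals onto the preferred URL, while noindex simply drops the duplicate from the index.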

04-07-2010, 08:20 AM
Yes, they do. But detection is not instant.

Lenny Debra
04-07-2010, 04:35 PM
Google's results for one unique search string showed two pages: our original page, and an exactly identical-looking page under the URL of the redirect link. We confirmed this by looking at the cached snapshot.

04-10-2010, 12:52 AM
In rare cases, Google finds out duplicate content. Duplicate content does not imply, in any way, shape, or form, spamming. ... any search engine should be able to examine your site down to its smallest detail.

04-11-2010, 03:35 AM
Google and other major search engines like unique content. When anyone first submits unique content to the web, Google keeps its own record of it. We should always try to create our own unique content.

04-12-2010, 09:10 AM
Google also looks at the backlinks within the copy. A great deal of content is stolen by automated scraper bots, which often keep the links contained within it. That is why it is good practice to leave one internal link in your blog posts/articles: if Google sees a link pointing back to a site that has the same content, chances are the content originated at the arrowhead end of the link.
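Google's actual duplicate-detection algorithm is not public, but the general idea several posts in this thread gesture at can be sketched with word shingles and Jaccard similarity: break each page's text into overlapping word sequences and measure how much the sets overlap. Everything below (function names, sample strings, the 0.5 threshold) is illustrative, not Google's method:

```python
def shingles(text, k=3):
    """Break text into the set of overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts: |intersection| / |union| of shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original  = "Google tries hard to index and show pages with distinct information"
scraped   = "Google tries hard to index and show pages with unique information"
unrelated = "The quick brown fox jumps over the lazy dog every single day"

# A near-duplicate (one word changed) still scores high;
# unrelated text shares no shingles and scores near zero.
print(jaccard(original, scraped) > 0.5)    # True
print(jaccard(original, unrelated) < 0.1)  # True
```

A tiny change like swapping one word only disturbs the few shingles that contain it, which is why shingling is robust to light rewording in a way that exact hashing is not.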