Duplicate content is one of the biggest SEO myths around. The idea is that if different pages of a website have content very similar to each other, this will incur some kind of penalty from Google that will cause the site to fall down the search engine rankings. However, this isn’t true, and a recent statement from Google’s John Mueller confirms it.
Speaking at Google Search Central SEO office-hours, Mueller pointed out that when Google finds duplicate content, there’s no “negative signal” associated with it. That means it’s not going to get penalized just because it’s duplicated, although there can still be problems with posting too many pages with very similar content.
While this is great news for businesses - especially ecommerce stores with large numbers of similar pages - it doesn’t mean you can sit back and continue as you are. If you want to make sure your SEO efforts are as effective as possible, you may need to make some changes.
What does this mean for your SEO?
For the most part, businesses don’t need to worry about posting duplicate content. The vast majority of this is non-malicious, and is generally either items in an online store, pages in discussion forums or ‘printer-only’ versions of pages. If that sounds like your business, then you don’t need to worry.
It should go without saying that maliciously duplicating content - especially in a way that provides a poor user experience - should be avoided. It's likely to be flagged as spam, as in one example where a PR firm sent out hundreds of press releases at once that all contained the same content, taken straight from the homepage of the site it was promoting.
However, for most of us this simply means there's no penalty to fear from duplicated content. When building sites, we can focus on providing the best user experience, even if that means repeating words, phrases or larger chunks of text.
Duplicate content still carries risks
That said, there are still problems that duplicate content can cause. For example, if a search engine like Google receives a query for which multiple pages with the same content would be valid results, it'll generally return only one of them: whichever the algorithm judges to be the original. That can leave the other pages missing from search results altogether.
Moz also points out that several metrics Google uses to rank pages - such as trust, domain authority or link equity - could end up split between different pages, or excluded from some pages altogether. This, in turn, can have a dramatic impact on search ranking performance and therefore should be considered when creating or duplicating content.
There's a solution to all this: canonicalization. Put simply, this involves telling Google (and other search engines) which of the duplicate pages you consider the main one. Google will then consolidate its ranking signals onto that page, preventing many of the problems with duplicate content and preserving your SEO ranking.
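In practice, canonicalization is most often done by adding a `rel="canonical"` link element to the `<head>` of each duplicate page, pointing at the preferred URL. A minimal sketch (the domain and paths here are illustrative, not from any real site):

```html
<!-- Placed in the <head> of a duplicate or variant page,
     e.g. a sorted or "printer-only" version of a product listing:
     https://example.com/shop/widgets?sort=price -->

<!-- Tells search engines that the preferred version of this
     content lives at the URL below, so ranking signals are
     consolidated onto that single page -->
<link rel="canonical" href="https://example.com/shop/widgets" />
```

For pages where you can't edit the HTML (such as PDFs), the same hint can be sent as an HTTP `Link` response header, and a permanent (301) redirect is an even stronger signal where one URL can simply replace another.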