The short answer is yes. We have written extensively on the subject before: SEO Obviousness: Duplicate content sucks, Duplicate content sin #1: Pagination, Duplicate content sin #2: Default page linking, SEO worst practices: The content duplication toilet bowl of death, and 5 SEO Strategies We Swear Aren’t Going Anywhere.
So we have done our due diligence warning you about the dangers of duplicate content. But there is another side to the story. We’ve seen clients fear the duplicate content plague so much that they go to extreme lengths to avoid it, and end up preventing their sites from performing optimally.
Let’s take Portent client www.RealTruck.com for example. This large e-commerce site in the competitive aftermarket truck accessories market had taken several measures to prevent their site from falling prey to the problem of duplicate content. They had implemented noindex, nofollow attributes on filtered pages and internal links, as well as canonical tags on sub-category and product pages pointing back to the main category pages.
The client was worried about pages competing with, or cannibalizing, each other. For example, a sub-category page such as Front-Mount Snow Plows would compete with the main Snow Plows category page. But because of these measures, the site simply wasn’t ranking for long-tail keyword terms like “front-mount snow plows,” or searches for those terms were landing on the wrong pages.
We recommended that they remove the noindex, nofollow attributes on filtered pages and internal links. By opening up the floodgates to sub-category and filtered pages, the main category pages didn’t lose anything; instead, the site won rankings for additional keywords.
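For illustration only (this is a generic sketch with made-up markup, not RealTruck’s actual code), the change amounts to dropping a meta robots directive like this from the filtered pages:

```html
<!-- Hypothetical filtered sub-category page, e.g. front-mount snow plows -->

<!-- Before: this directive kept the page out of the index and told crawlers
     not to follow its links, so it could never rank for long-tail terms -->
<meta name="robots" content="noindex, nofollow">

<!-- After: remove the tag entirely, or state the default explicitly -->
<meta name="robots" content="index, follow">
```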
We also recommended removing the canonical tags on sub-category and product pages that pointed to main category pages, and using correct pagination markup for categories that span multiple pages. With correct pagination, the pages in a series reference each other and, through that chain, the first page, while each page remains indexable. Pointing every page’s canonical tag at the first page, however, tells search engines that these pages are the same and prevents all but the first page from ranking, which is usually not what you want.
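As a rough sketch of the difference (the URLs here are hypothetical examples, not RealTruck’s actual structure), the page &lt;head&gt; of page two in a category might look like this:

```html
<!-- Hypothetical paginated category URL: /snow-plows/?page=2 -->

<!-- Problematic setup: canonicalizing every page to page one tells search
     engines the pages are identical, so only page one can rank -->
<link rel="canonical" href="http://www.example.com/snow-plows/">

<!-- Correct pagination: each page canonicals to itself and declares its
     place in the series, chaining back to the first page via rel="prev" -->
<link rel="canonical" href="http://www.example.com/snow-plows/?page=2">
<link rel="prev" href="http://www.example.com/snow-plows/">
<link rel="next" href="http://www.example.com/snow-plows/?page=3">
```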
Once these recommendations were implemented, their top 20 keyword rankings nearly doubled in less than a year. Organic unique visits increased 102% between August 2013 and August 2014 and have continued to grow. Organic revenue increased 113% during that same time frame.
For RealTruck.com, much of their success comes from the fact that they are dedicated to search engine optimization. When we make a recommendation, their team wholeheartedly commits to implementing it. They’ve spent considerable time ensuring that their pages are unique and do not have duplicate content. Here are some of the additional tactics they’ve taken to make sure every page of their site provides unique value:
These tactics dramatically improved search engine optimization and, even more importantly, provided a better user experience. Each page is not a cookie-cutter duplicate of the last; it’s crafted and personalized based on the filters selected.
In case you didn’t already know, Portent won Best SEO Campaign of 2014 at the US Search Awards for this campaign.
Duplicate content is a huge can of worms in itself, but the real problem here was an over-extension of the definition. There’s a difference between real duplicate content and fake duplicate content.
See if you can spot the duplicate page here:
Duck image provided by Florentijn Hofman (http://commons.wikimedia.org/wiki/File:Rubber_Duck_Florentijn_Hofman_Hong_Kong_2013d.jpg)
I tried to make it painfully obvious, but just in case you didn’t notice, Page B is definitely the duplicate. Printer-friendly versions of pages, by definition, contain no content that the original page doesn’t already have, so they should be noindexed or given a canonical tag pointing to the original page. Page C, however, even though it has a similar image and layout to Page A, has some unique content, so it should be allowed to stand on its own.
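For a printer-friendly page, either approach below works. This is a generic sketch with a made-up URL, not markup from any particular site:

```html
<!-- Hypothetical printer-friendly version at /rubber-duck/print/ -->

<!-- Option 1: point the canonical at the original page -->
<link rel="canonical" href="http://www.example.com/rubber-duck/">

<!-- Option 2: keep the print version out of the index altogether -->
<meta name="robots" content="noindex">
```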
Many people worry about duplicate content, but Google has never said that it would directly cause a penalty.
Google doesn’t treat duplicate content as spam by default. You have to do spammy things with that duplicate content, like set up an auto-generated site based off an RSS feed, in order for Google to take action against your site.
The problem with duplicate content is not that it will get you a slap on the wrist from Google; it’s that it confuses the hell out of the search engines and forces them to choose which one of your duplicate pages is the best to show in search results. And sometimes they’ll choose the wrong one.
If you need help strategizing how best to address duplicate content issues on your site, check out our SEO services.