Duplicate Content

Enough already! There is so much misinformation in the marketplace that I wanted to address the fear that Jason Potash may have created (perhaps unintentionally) when he pitched his newest ArticleAnnouncer product a few weeks ago.

I think Jason introduced the fear to show how his product's 3 rotating resource boxes, built into the database design, solve the perceived problem of Google discounting sites for duplicate content (the idea being that changing 10-20% of your article's content prevents it from being considered substantially duplicate in the eyes of the major search engines). There may be some truth to the fear, but let's look into this issue further…
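Before going further, it helps to know what "substantially duplicate" might mean mechanically. The engines don't publish their methods or thresholds, but near-duplicate detection is commonly described in terms of overlapping word sequences (shingles). Here is a minimal sketch; the 4-word shingle size and the 0.8 threshold are my own illustrative assumptions, not anything Google has stated:

```python
# Minimal sketch of shingle-based near-duplicate detection. The 4-word
# shingle size and the 0.8 "substantially duplicate" threshold are
# illustrative assumptions, not published Google values.

def shingles(text, size=4):
    """Return the set of overlapping word n-grams in a document."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(doc_a, doc_b):
    """Jaccard similarity of the two shingle sets, from 0.0 to 1.0."""
    a, b = shingles(doc_a), shingles(doc_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Same 200-word article with two different resource boxes appended.
article = " ".join(f"word{i}" for i in range(200))
copy_a = article + " resource box one by author alice"
copy_b = article + " resource box two by author bob"

score = similarity(copy_a, copy_b)
print(round(score, 2))                 # ~0.96 for these inputs
if score > 0.8:                        # assumed threshold
    print("would likely be treated as substantially duplicate")
```

Notice that swapping a resource box changes only a handful of shingles on a 200-word article, so the similarity score barely moves. Whether a 10-20% rewrite clears whatever threshold an engine actually uses is anyone's guess, which is why I want to look at what Google actually says.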

Here is an excerpt from Google’s guidelines:

Google says, “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.” Source.

The TRUTH: Google is already discounting duplicate content across multiple sites, and for good reason. Do you really want to see 20 copies of the same article when you go to Google to find something specific to your needs at the moment? I didn't think so. The strongest, most authoritative sites show up at the top of the SERPs (search results), just like they should.

The TRUTH: Don’t create substantially duplicate content on sites that you own or control.

The TRUTH: Syndicating your articles on sites like ours helps you create traffic (sometimes without any help from search engines), and you NEED this type of traffic because you never want to be fully dependent on any one search engine for the bulk of your traffic, right?

One of the things I love about syndicating your articles or blog entries is that you reach people who would not otherwise have found your site…many times even without search engines.

Don’t get me wrong, we love what the major search engines can do for our listed authors, but I don’t know of any author who has been delisted from a major search engine for syndicating their articles across the Internet, even when done in a big way.

IF the major search engines penalized sites for syndicating articles, they would also have to penalize sites for syndicating their BLOG entries, and yet we find the exact opposite happening! …meaning blogs get rapid indexing attention…and the same goes for newspapers around the world that all reprint the same news stories.

Agree/Disagree? What do you think…


D. M. Giolitto writes:

So you’re saying that:

– an author who dupes articles under different pen names on a single website is wasting his or her time?

– and the reason is that Google can identify the duped content and refuse to index it?

But are you also saying that it’s still OKAY to submit an article to several DIFFERENT syndication sites because Google doesn’t penalize that and the exposure helps the author?

(I’m hazy on the last bit).


Comment provided July 11, 2005 at 8:52 PM


Chris Knight writes:

The bottom line is that article marketing as you know it is OK, and there is no downside to syndicating articles by submitting them to sites like ours and others.

In other words, the perceived downside does not exist in reality, and the other authoritative forums/blogs that I researched on this issue are all saying the same thing: this is not a real issue.

Carry on. :-)

Comment provided July 11, 2005 at 9:06 PM


Joel writes:

The duplicate content penalty applies to:

1) Content duplicated across a single site, rooted in the days when search engine spammers would take a single sales letter and then make 1000 copies of it using find-and-replace to insert a different keyword into each version (see the sketch after this list).

2) “Mirror” sites that duplicate ALL or nearly all the content on a site, rooted in the days of slow servers and poor international connections when this was an important way of dealing with huge amounts of traffic. It also helps guard against site theft.
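Here is a minimal sketch of that find-and-replace trick, just to make it concrete (the template and keyword list are invented for illustration):

```python
# Sketch of the old find-and-replace spam trick: one sales letter stamped
# into many "different" pages that vary only by keyword. The template and
# keyword list are invented for illustration.

TEMPLATE = """<html><head><title>Buy {kw} Online</title></head>
<body><h1>The Best {kw} Deals Anywhere</h1>
<p>Looking for {kw}? Act now, because this {kw} offer will not last!</p>
</body></html>"""

keywords = ["blue widgets", "red widgets", "widget polish"]  # imagine 1,000 of these

for kw in keywords:
    page = TEMPLATE.format(kw=kw)
    filename = kw.replace(" ", "-") + ".html"
    with open(filename, "w") as f:      # one near-identical page per keyword
        f.write(page)
```

Every page generated this way shares almost all of its text with every other one, which is exactly the within-site pattern the duplicate filter was built to catch.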

This is the proof I always use: if reprint content carried a duplicate content penalty, newspaper websites would not have such high PR, since they all carry articles reprinted from the newswires.

Comment provided August 4, 2005 at 7:47 PM


Kevin writes:

PR has nothing to do with duplicate content. I can use duplicate content all day long and still get a high PR. In fact, content, period, has nothing to do with PR.
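That matches the algorithm as originally published: PageRank takes only the link graph as input, never the page text. A minimal power-iteration sketch (the 0.85 damping factor is from the original paper; the three-page link graph is made up for illustration):

```python
# Minimal power-iteration PageRank over a toy link graph. The input is
# only "who links to whom"; page text never enters the computation.

links = {                      # page -> pages it links to (invented example)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85                                    # value from the original paper
pr = {page: 1.0 / len(links) for page in links}   # start uniform

for _ in range(50):                               # iterate toward convergence
    new_pr = {}
    for page in links:
        # Each page shares its current score equally among its outlinks.
        inbound = sum(pr[p] / len(links[p]) for p in links if page in links[p])
        new_pr[page] = (1 - damping) / len(links) + damping * inbound
    pr = new_pr

print(pr)   # C ends up with the most PageRank in this toy graph
```

Swap every page's text for duplicates and nothing in that computation changes, which is the point.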

Comment provided August 8, 2005 at 3:39 PM


Curt Landberg writes:

I believe that Google has found a way to penalize sites that use duplicate copy, and that it is still beneficial for authors to submit articles for distribution.

Comment provided January 24, 2006 at 11:58 AM

