Duplicate Content
Enough already! There is so much misinformation in the marketplace that I wanted to address the fear that Jason Potash may have created (perhaps unintentionally) when he pitched his newest ArticleAnnouncer product a few weeks ago.
I think Jason introduced the fear to show how his product's three rotating resource boxes, built into its database design, solve the perceived problem of Google discounting sites for duplicate content (the idea being that changing 10-20% of your article's content prevents it from being considered substantially duplicate in the eyes of the major search engines). There may be some truth to the fear, but let's look into this issue further…
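For readers who like to see the mechanics, here is a minimal sketch of what "rotating resource boxes" amounts to. To be clear, the article text, the three boxes, and the helper names below are all my own invention for illustration; this is not Jason's actual code or database design.

```python
import difflib
import itertools

# Hypothetical example: rotate among three resource boxes so that each
# syndicated copy of the same article differs slightly from the others.

ARTICLE_BODY = (
    "Article marketing works best when each syndicated copy carries a "
    "resource box that points readers back to the author's own site."
)

RESOURCE_BOXES = [
    "About the author: Jane Doe writes about article marketing at example.com.",
    "Jane Doe publishes weekly marketing tips; read more at example.com.",
    "Visit example.com for Jane Doe's full archive of marketing articles.",
]

def build_copy(submission_index: int) -> str:
    """Attach one of the rotating resource boxes, cycling per submission."""
    box = RESOURCE_BOXES[submission_index % len(RESOURCE_BOXES)]
    return f"{ARTICLE_BODY}\n\n{box}"

# Compare the rotated copies pairwise to see how much text actually differs;
# the premise is that enough variation keeps two copies from looking
# "substantially duplicate" to a search engine.
copies = [build_copy(i) for i in range(3)]
for a, b in itertools.combinations(range(3), 2):
    ratio = difflib.SequenceMatcher(None, copies[a], copies[b]).ratio()
    print(f"copy {a} vs copy {b}: {ratio:.0%} similar")
```

Notice that because only the resource box changes, the pairwise similarity stays quite high, which is exactly why the 10-20% figure is a matter of debate rather than a guarantee.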
Here is an excerpt from Google’s guidelines:
Google says, “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.”
The TRUTH: Google already discounts duplicate content across multiple sites, and for good reason. Do you really want to see 20 copies of the same article when you go to Google to find something of specific interest to your needs at the moment? I didn’t think so. The strongest, most authoritative sites show up at the top of the SERPs (search results), just as they should.
The TRUTH: Don’t create substantially duplicate content on sites that you own or control.
The TRUTH: Syndicating your articles on sites like ours helps you create traffic (sometimes without any help from search engines), and you NEED this type of traffic because you never want to be fully dependent on any one search engine for the bulk of your traffic, right?
One of the things I love about syndicating your articles or blog entries is that you reach people who would not otherwise have found your site, many times even without search engines.
Don’t get me wrong, we love what the major search engines can do for our listed authors, but I don’t know of any author who has been delisted from a major search engine for syndicating their articles across the Internet, even when done in a big way.
IF the major search engines penalized sites for syndicating articles, they would also have to penalize sites for syndicating their BLOG entries, and yet we find the exact opposite happening! Blogs get rapid indexing attention, and the same goes for newspapers around the world that all reprint the same news stories.
Agree/Disagree? What do you think…
So you’re saying that:
– an author who dupes articles under different pen names on a single website is wasting his or her time?
– and the reason is that Google can identify the duped content and refuse to index it?
But are you also saying that it’s still OKAY to submit an article to several DIFFERENT syndication sites because Google doesn’t penalize that and the exposure helps the author?
(I’m hazy on the last bit).
Thanks.