Keyword Density Rejection Case Study

Keyword density and keyphrase density in articles are topics we don’t like discussing because we’d prefer that our members remain blissfully unaware of all this nonsense. The temptation to optimize your articles for some numeric keyword density metric would, in our opinion, produce a less authentic, less valuable, and less reader-centric article.

Occasionally a member will produce an article with excessive keywords that we reject, and the member will feel it’s unfair because they weren’t aiming for any keyword density metric. Our position is that math trumps human review in these cases, meaning we’ve had to put a hard numeric limit on what we’ll accept.

Our published numeric limit is a keyword density of no more than 1%. That means you should not repeat a keyword or keyphrase more than once per 100 words. There is a gray area where we’ll accept up to double that (2%) if other conditions are met: a high-quality, unique article by an author with a clean history of submitting high-value content over time.

EzineArticles expert author Steve Weber has given us permission to use one of his unpublished articles as a case study. Below you will see the internal tool that we use to see what the human eye might not see when reviewing content for this issue:

How Important is Google Page Rank Anyway?

Many people misunderstand the issue of Google Page Rank. For those who don’t know, it is a ranking Google gives to every individual page it indexes. It varies from page to page; your home page may have a PR of 4, but a secondary page may have a PR of 2… or even vice versa sometimes.

You can see the page rank of every page you visit by installing the Google Toolbar. After installing it, you must manually enable the Page Rank feature in the bar. Be aware that the toolbar is only updated by Google every 3 or 4 months. Therefore, the PR displayed may not be the same PR Google actually has given your site at any given time.

PR for a given page is based upon how many backlinks there are to that page. Additionally, the quality of those backlinks plays a significant role in the rankings. Even a site’s own internal linking has a positive effect on PR. In most cases a dozen or so directory type backlinks, along with a good internal linking strategy between a site’s own pages will eventually result in a PR of 1 or 2.

All things being equal, a higher PR is better than a lower PR. However, a page with a higher PR will not necessarily ensure it ranks higher in a Google search than a page with lower PR. In fact, it happens all the time that a low PR page will outrank a higher PR page for a given keyword.

How can this happen? It’s all about relevance. If a page is optimized better for a given keyword, then chances are it will rank higher than a page which is not optimized as well even if the poorly optimized page has higher PR. This can be especially true when long tail keywords are concerned. Again, it’s all about how relevant a given page is for a given keyword…according to Google. Page Rank is only one factor in the search result rankings.

A side note about PR:

The rankings are based on a logarithmic scale. Take, for example, a page which has a PR of 3. Moving that page to a PR of 4 means increasing the page’s “link juice” by a factor of 10. Each single-step increase in PR represents a tenfold increase in backlink quality and/or quantity. This factor is why it is very difficult and rare for pages to ever obtain PRs of 7 and higher.
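Taking the stated tenfold-per-step scale at face value, PR behaves like a base-10 logarithm of “link juice.” The sketch below only illustrates that arithmetic; the function name, the baseline, and the notion that this is how Google computes anything are assumptions for illustration.

```python
import math

def pr_from_link_juice(juice, base_juice=1.0):
    """Illustrative only: if each +1 step of PR means 10x the 'link
    juice', PR grows like log10 of juice over a PR-0 baseline."""
    if juice < base_juice:
        return 0
    # Small epsilon guards against floating-point log10 landing just
    # under an integer for exact powers of ten.
    return math.floor(math.log10(juice / base_juice) + 1e-9)
```

Under this scale, climbing from PR 3 to PR 7 would take 10^4 = 10,000 times the backlink quality/quantity, which is why high PRs are so rare.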

The moral of the Page Rank story is not to worry a lot about it. Long tail keywords are most important for smaller/newer sites. PR only plays a small role in determining the success of a well optimized niche site.

Article Stats:

  • Body Word Count: 452 words
  • High Density Keywords:
    • page occurs 24 times (5.3%)
    • pr occurs 19 times (4.2%)
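For the curious, a check like the one above is straightforward to sketch. The following Python is an illustration only: the tokenization, the stop-word list, and the 1% default threshold are my assumptions, not the actual EzineArticles tool.

```python
import re
from collections import Counter

# Minimal stop-word list; a real tool would presumably filter many more.
STOP_WORDS = {"a", "an", "and", "in", "is", "it", "of", "or", "the", "to"}

def keyword_density(text, threshold=0.01):
    """Return {word: density} for words whose share of the total
    word count exceeds `threshold` (1% by default)."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return {w: c / total for w, c in counts.items() if c / total > threshold}
```

On the stats above, “page” at 24 occurrences in a 452-word body works out to 24/452 ≈ 5.3%, and “pr” to 19/452 ≈ 4.2%, both well past a 1% line.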

I can see why Steve is frustrated with us for not accepting this article. I believe he was honestly not aiming for a keyword or keyphrase density metric, even though he is an SEO expert and fully understands keyword density as a concept.

Essentially, his article exceeds our published external guidelines by 4-5 times, and our internal “not-so-public until now” guidelines by a little more than double the allowed metric.

At this time, his unpublished article above is a casualty of our system designed to prevent keyword abuse.

Side Note: We only compute this metric on the article body. However, if the keywords and keyphrases that exceed our guidelines are also used in the Article Title and Resource Box (they ALWAYS are in articles that are correctly rejected), this may count against your article when it’s on the edge between acceptance and rejection.

Suggestion: Be keyword intelligent in your Article TITLE, and forget about doing any keyword density metric in the article body.

What are your thoughts on this issue?


Kathryn Merrow writes:

Hello Chris,

Would Steve be able to edit and resubmit his article with fewer keywords? Or, once an article is rejected for this (or any) reason, is it not allowed to be edited?

While I saw instances where PR or Page could be eliminated, I couldn’t come up with a suitable word or phrase to replace Page.


Comment provided January 8, 2009 at 9:55 AM



GREAT info!

I think that rejecting an article that uses keywords too often has a second value to the author. Beyond all of the above info, it just doesn’t read well. Eyes gloss over words that are overused, and an article, book, or whatever becomes boring and redundant. I am reminded at this moment of how viewers laughed at Sarah Palin’s overuse of the name ‘Charlie’ during her interview with Charles Gibson. While everyone understands the value of repeating someone’s name back to them, there is a point where it becomes obnoxious.


Comment provided January 8, 2009 at 10:03 AM


Steve Weber writes:

Correct, I did not try to optimize for the word “page”…why would I?

I wrote the article completely off the top of my head with no thought whatsoever of optimizing. I offered this article to my members in another format…I was writing for their benefit to learn about page rank.

Reading the article, what word could you substitute for “page”? Also, where could I have not used the word “page”?

Also, if I didn’t write “PR”, I would have to write “Page Rank”… so I am still screwed with the word “page”.

The rejection email I received stated “nothing over 1% density will be allowed.” I know for a fact of others who have submitted articles lately who have a keyword density higher than that.

1%….?? I just don’t get that.

So is this rule purely subjective on EA’s part? They enforce it sometimes and sometimes not?

Comment provided January 8, 2009 at 10:19 AM


Mark Claysob writes:

I have also been a “victim” of this and I can say that it has helped me enormously. Often, on re-reading an article that has been rejected, I can see that it simply does not scan as well and by just removing the offending words (and replacing with more appropriate ones) the article is that much better for it.

I would like to add that I have begun editing my articles before they are sent – something that a good author should be doing anyway. The rejection of articles by EzineArticles in this way just solidifies my opinion that it is quality content that is sought.

Comment provided January 8, 2009 at 10:43 AM


Jim McDowell writes:

All I can say is Wow! I had not thought about this issue and have had no personal problems with it, but I can see what a major problem this could become. Again I must say Wow! I will be watching how this turns out.

Comment provided January 8, 2009 at 10:53 AM



Steve.. It is so funny because honestly, in ‘your’ case, I am stumped too as to how I would rewrite it.

Good luck with it! LOL! You really do have an interesting article!

Comment provided January 8, 2009 at 10:59 AM


michael cardus writes:

As an author this has happened to, I must admit that I accept the keyword density math.
Once, three of my articles were denied for excessive keywords. I read them, and what EzineArticles did was what a good editor would do: told me there was an issue, and I fixed it. I rewrote the articles and they were accepted.
Perhaps EzineArticles should be thanked for keeping its standards high. It makes all of us look better.

Comment provided January 8, 2009 at 11:39 AM


mike Krutza writes:

Steve’s comments make sense. I found his article very useful and instructive.

Comment provided January 8, 2009 at 12:30 PM



The truly high standards with EzineArticles are spurring me on to be a better writer — and giving me a great incentive to write more and increase my articles. I LOVE THIS. What a fantastic way that Chris and his team are serving !!!!!!! Now, back to my next article……………

Comment provided January 8, 2009 at 12:44 PM




Yes, you can edit an article if it has been rejected. In fact, we encourage it.


Of course our review process is partially subjective. :-) Without it, we would not need the human element of the review and would auto-deny based on a flat percentage alone. By doing this, we would lose a percentage of good quality articles which is why we don’t do that. We have in place continuous training sessions that teach our editors what to look for when making decisions like this.


We currently have 2,403 articles in problem status for keyword density concerns. Of those, I feel that ~70-80% of these can be fixed and resubmitted for another review.

Comment provided January 8, 2009 at 1:11 PM


Calvin Loh writes:

Overall, I’m happy with EzineArticles, though I need to write more articles.

Perhaps you could check for key phrases (with 2 or more words) instead of keywords. I don’t think anyone actually does SEO for a single word.

I’ve been tripped by this issue a few times. I vary my key phrases, but some key phrases share common words.

BTW, do you still show the offending keywords/phrases when the article is rejected? The last time one of my articles was rejected due to excessive keywords (one or two months ago), the offending keywords were not highlighted and I had to guess.

Comment provided January 8, 2009 at 1:26 PM


Lance Winslow writes:

This has happened to me before too, accidentally, as I never worry about keyword density, having made up for any of those potential gains with content quantity.

Comment provided January 8, 2009 at 1:44 PM


Bruce Point writes:

We think that you have gone to extremes in your keyword policing. You act like the Gestapo. There really is nothing wrong with this article. We would not have written it this way, but that is the writer’s right, to write poorly if he or she chooses.

In the past there were a number of people who would waste their time keyword spamming an article to the point that it was worthless. EzineArticles corrected the problem about 2 years ago.

Now you have reached the point of the ridiculous. Using a mathematical model to police article keywords is plainly stupid and unethical editorial misconduct at best.

Frequently you cite our members for keywords that they are not even trying to compete for. Just like the word “page” in the above article. Just plain stupid policy.

We do not need more cops and laws in our lives; we need fewer. We need the freedom to express ourselves freely without a lot of needless BS rules.

Stop micro managing the article content or you will lose the best authors.

EzineArticles is a great place to publish but you might kill the goose that lays the golden egg if you continue to over regulate content.

Our guild already publishes a large number of articles on other directories as a result of some of EzineArticles’ rules. We also realize that you couldn’t care less.

We would suggest that you return to the level of common sense that was prevalent in the past.

Bruce Point
Bruce Point Partners LLC

Comment provided January 8, 2009 at 1:59 PM


Peter Cutforth writes:

I reckon you’ve opened up a can of worms Chris LOL!

I’ve coincidentally just had my first article rejected for this reason, so it’s a very useful discussion, and thanks for letting us know the benchmark.

I’m one of those authors that the purists within the EzineArticles community would seemingly regard as “evil” :-) because I do pay attention to getting the article to rank well in Google.

I researched the top 10 most viewed articles in EzineArticles, and with 2 (political) exceptions, they ALL were ranked in the top 5 in Google for the keyword phrase the article was written for, and that’s why they have had so many views. To me it makes much more sense to spend a few minutes researching, and write your article with a (long tail) phrase in mind.

The fact that you have a max of 1% I think is a good thing, because my understanding is Google doesn’t like much more than that anyway.

In a typical 400-word article, if you use the phrase in the title and a couple of times throughout the article, that’s all that’s needed.

Problems arise when you naturally use a single word, or smaller phrase more often in the natural body of the article, as in the PR example above… (Could Steve use the word “webpage”? Would that trigger the count for “page”?)

I think it also comes down to the article marketing strategy you use. If you are going for volume, and relying on organic traffic from within EzineArticles, and ezine distributions, then KW phrases are not important, but to me this is a “shotgun” method. You’d be better off doing some research and using the wonderful authority EzineArticles has in Google for your article’s benefit.

By the way Chris, whilst I can relate to some of Bruce Point’s frustration, I also think that the very fact that EzineArticles has such “tight” standards, is one of the reasons it is so well treated by Google particularly. The last thing we would want is for EzineArticles to suffer a “Google slap” like Squidoo did some time ago.

Comment provided January 8, 2009 at 5:22 PM


Lance Winslow writes:


Ha ha ha, well that is funny.

I think 2% is a better idea for specific words. Therefore a 400-word article could have a word used 8 times, of course negating things like “the” and such.

Phrases could be 1%. You shouldn’t ever be over that: keyword combinations and phrases should never be used more than once per hundred words.
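Lance’s proposed split (a 2% ceiling for single words, 1% for phrases) is easy to express mechanically. A sketch, with an assumed stop-word list and two-word sequences standing in for “keyphrases”:

```python
import re
from collections import Counter

STOP_WORDS = {"a", "an", "and", "in", "is", "it", "of", "or", "the", "to"}

def within_limits(text, word_limit=0.02, phrase_limit=0.01):
    """True if no single word exceeds `word_limit` of the body and no
    two-word phrase exceeds `phrase_limit`."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    word_counts = Counter(w for w in words if w not in STOP_WORDS)
    phrase_counts = Counter(zip(words, words[1:]))
    worst_word = max(word_counts.values(), default=0) / total
    worst_phrase = max(phrase_counts.values(), default=0) / total
    return worst_word <= word_limit and worst_phrase <= phrase_limit
```

A 400-word article could then use a single word up to 8 times and a two-word phrase up to 4 times before tripping the check.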

Comment provided January 8, 2009 at 5:32 PM


Jenny Thompson writes:

Wow! So I’m learning about blogging, and there is so much info out there on what works and what doesn’t that one never knows what is correct and what is BS. But I did print out your article as a good example of what NOT to do. ;-)
Thank you.

Comment provided January 8, 2009 at 9:39 PM


Shirley Bass writes:

While I understood Steve’s article, I can understand why using a keyword or phrase once every 100 words is appropriate. Any more than that is called ‘spamming,’ isn’t it?

I have one word that I use in my articles that I do not consider to be a keyword and like the word ‘page,’ I may use it too often. I will keep an eye on that word.

Chris, I do not like this topic either, I think it is stifling.

To me, Google has preset the rules on this matter.

Hope it all gets resolved soon.

Shirley Bass

Comment provided January 8, 2009 at 10:59 PM


Calvin Loh writes:

Flawed or not, I can live with EZA’s rules. It is still the best performing article directory for me.

Even now, EzineArticles sends me much more traffic than GoArticles, which has no editors and all articles are automatically approved. Up to the middle of last year, I could still get 50 or 60 article views every month, but now I’m lucky to get 5 or 6 per month.

I still get a lot of article views from Article Dashboard, where articles require approval, but Statcounter tells me I’m not getting any traffic from them.

Overall, if EZA’s rules continue to get good traffic for my website, I’m happy.

Comment provided January 9, 2009 at 1:14 AM



It’s a big challenge balancing ‘quality’ with ‘keyword overuse.’ The article has quality, but keyword overuse. The problem is how to replace those necessary words.

Someone would do a huge service if he/she edited this article using the maximum permissible 2% keyword use and posted a sample here.

This often debated topic is always interesting and not stifling, Shirley. We can go on learning from it.

Comment provided January 9, 2009 at 1:34 AM


Mel Menzies writes:

As the author of a number of books, I do understand the importance of editing out (among other things) repetitious words. However, I can quite see the difficulty in this particular article. When you’re writing about something as tightly focused as page ranking, there are few, if any, synonyms which can usefully be employed. A web page, for instance, can’t be called a “folio” or a “sheet” as it can in the print world.

Personally, I find headings and sub-headings useful, and always try to make the first three words count. Sometimes that means turning your syntax around as in my latest blog: Manuscript Submission Guidelines (long tail key phrase) “How To Write A Publishing Proposal (key phrase) For Your Book” is better than beginning with the phrase “How To Write etc. etc.” as those three words are not key words or phrases.

I’m far from perfect with my submissions to EzineArticles. But I do applaud your high standards, Chris.

Comment provided January 9, 2009 at 6:26 AM


Des writes:

I would be interested in having access to the same tool that ezine uses to detect “keyword overpopulation”.

Like the spell check tool on submission.

I am sure that with a tool like that, there would be far fewer instances of articles being rejected, since I would be able to edit my submissions beforehand.

Thanks for letting me submit to ezine!


Comment provided January 9, 2009 at 6:39 AM


Steve Weber writes:

That’s a good idea. I’d like to see a rewrite of that article where “page” was only used 1 or 2% of the time.

So far, no moderator or commenter has suggested how to trim the word “page” or “pr” by 80% in an article on that topic.

I’m not even going to try rewriting it. I think it would look so awkward jumping around the word “page” that the article would be embarrassing.

Comment provided January 9, 2009 at 8:07 AM




Actually, thousands of authors submit excessively keyword-loaded article vomit monthly… hence the need for a hard line in the sand to be drawn somewhere.

Yes, we now show the highlighted words & keyphrases after an article is rejected. Just click VIEW the unpublished article and you’ll see the same example above minus the actual percentage calculation. We don’t provide the calculation because the rejection % varies on purpose.

Bruce Point,

Authors who accidentally or purposely write highly dense keyword loaded articles are not golden eggs. We consider these articles as liabilities that test the trust we have with the overall market.


Correct, we have systems in place to not penalize for stop words like “a,” “the,” “or,” “and,” and so forth…


The only reason we haven’t released the tool yet is because we don’t want to encourage you to write close to the edge of the cliff… It’s our belief that the quality of your articles would go down if you tried to meet some keyword or keyphrase density metric, because you’d have taken your eye off the most important ball: writing for the end reader with your ideal client in mind.


I’m with you that this article may not be one that can be saved…which is why I appreciated you allowing us to use it as a case study for public discussion.

Out of the 2k articles that are in keyword abuse status, there are casualties of the system and this may be a perfect example of an article that just can’t be rewritten to work.

Comment provided January 9, 2009 at 9:09 AM


Paul Darby writes:

I had the same situation as Steve back in March 2008.

Explaining the difference between types of heart disease without using the word “heart” more than 2% of the time is nigh on impossible.

As I was given no guidelines on the acceptable percentage, I ended up making 3 or 4 re-submissions before the article was finally accepted.

This resulted in me getting slapped by EzineArticles in the form of being told that I would have to submit 25 articles to achieve Platinum status. This was despite all my articles being totally original content and the article in question obviously only used the higher % for clarity, not for keyword density.

I was so disillusioned and deflated (not by having my article rejected, but by the penalty imposed) that I have not submitted any further articles to EzineArticles.

Chris, you often comment on the fact that most members submit very few articles. I wonder how many other new authors have reacted in a similar manner to myself, whether there is any correlation with members stopping writing, and whether there is a different way for EzineArticles to deal with an innocent infringement of EA’s policies?

Comment provided January 9, 2009 at 12:35 PM




My hope in posting this case study was that a member would make a suggestion for how, specifically, we could deter willful, excessive keyword spamming while still being able to make the rare exception when the issue warrants it.

So far, I’ve seen no solutions offered that balance the risks against the potential for future gain of more submissions.

Said another way: We’d rather lose highly efficient keyword loaded articles (even if they are innocent in intent while guilty in metric) because it’s our belief that a greater good for all members concerned is being met with that intention.

Comment provided January 9, 2009 at 3:12 PM


Lance Winslow writes:

Chris, I would re-invite you to re-read my comment above about “individual words” and “phrases.” It is a solution that should cut the number of articles rejected for excessive keywords by a third, and it helps everyone without hurting quality.

Comment provided January 9, 2009 at 4:05 PM


Edward writes:

I know you’ve written about this before, Chris, but the title is where keywords should be first and foremost. I’m sure savvy authors already know this.

I used to write articles with keywords in mind. Now, I may sprinkle a few here and there, but as long as I’ve got the title “down,” no worries.

Google will still index and rank in the top 20 if the keywords aren’t too competitive. Can’t ask for much more than that.

Comment provided January 9, 2009 at 5:11 PM




So your suggestion in COMMENT #15 is to put a hard limit on 2% for keywords and 1% on key phrases.

Ok, I’ll take that idea back to our developers to run some more reports to see what that would mean to this issue.

I can tell you that from the human read of the article, key phrases that are excessive do jump out as obvious moreso than keywords that are repeated.

Even so, applying this 2%/1% idea against Steve’s article would still have rejected it, as he’s at 5.3% for “page” and 4.2% for “pr”… My point being that this solution wouldn’t help cases like his.

Essentially, you are advocating for us to be MORE restrictive than we already are with the 1% target and 2% hard limit standards, right?


Comment provided January 9, 2009 at 5:18 PM


Lance Winslow writes:

Yes, it would have rejected this example, but it certainly could reduce the number of keyword-rejected articles caught in the software rule matrix. My thinking is that YES, you have to prevent the abuse, but it’s probably best to mitigate it so that accidental cases don’t enter the fray too often. Some will; there is almost no way to prevent this.

Although, I did consider one way to do it. If an author has had less than 1% of their articles rejected, waive the rule; if they historically have more than 1%, enforce it. This allows the most dedicated writers to not get caught in an unnecessary loop of conflict when they are not the problem.

It might also help those authors who end up hiring out their articles to ghost writers. I write my own articles, but realize many article marketers do not. Often those who pay others to write their articles tell them which keywords to use and to make them keyword rich. Well, that’s fine; they just cannot post them here. Then maybe they will instruct their ghost writers to chill out, or they can go elsewhere. Which I think is what is happening due to this rule – GOOD.

Of course, alienating article authors who are not intentionally trying to GAME the search engines is something to be avoided and thus, requires slight tweaking of the rules.

And although there seems to be some spirited comments here and much debate. Really, this should be the least of anyone’s worries.

Look, I have what, 15,000 articles here? I have very rarely triggered this rule; it’s only happened to me a couple of times, and when it has, I simply re-worded the article, re-submitted, and it was no problem. It’s just not a big deal. But if we want a superior system, some trial and error and tweaking will need to be done to prevent the abuse.

The last thing anyone wants here is to have EzineArticles somehow devalued in the search engine rankings; that would just ruin it for everyone.

Again, I am just thinking out loud, No complaints here.

Comment provided January 9, 2009 at 6:01 PM


Steve Weber writes:


Why not have a script, upon preview at submission, show and specify exactly which keyword densities could be a problem (and maybe other issues while the script is at it)?

For example, aWeber scans my autoresponder submissions and tells me exactly what may be seen as spam by ISPs.

Why not do the same thing for keywords. It could save both EzineArticles and the authors a lot of back and forth grief.

Wouldn’t this actually help to streamline the screening of articles?

Many authors would fix the problem right then and save you some work.


Comment provided January 9, 2009 at 8:09 PM



I think you should write an article that purposely sends the keyword checker into orbit. I will bet it might even get published….

Comment provided January 9, 2009 at 8:52 PM


Kevin Leland writes:

For a good HUMAN editor to read an eight-hundred-word article would take less than 5 minutes.

2000 articles, incarcerated for a crime they may not have committed? Accused by a robot? C’mon!

Until these robots get to the brink of artificial intelligence, the human eye will be required. I stopped submitting to EzineArticles when the robot accused me of the crime of obscenity, because of the words I used to talk about a vasectomy, in a light-hearted and interesting way: Testicles=Balls

Human math tells me that 24 people could process those articles in a day! The news tells me that humans need jobs!

When is EzineArticles going to give some work to the same humans that they take work from? Is the goal to teach robots to write articles? C’mon Chris! Let your success trickle down to humans, not machines! Hire some of your good freelance writers to be proofreaders.

BTW: “Human” 4% “robot” 2.6%

Comment provided January 10, 2009 at 7:13 AM


Mel Menzies writes:

Whoops! Kevin Leland writes “the robot accused me of the crime of obscenity, because of the words I used to talk about a vasectomy, in a light-hearted and interesting way: Testicles=Balls”

I’ve just submitted an article on laughter, and used the words “bottoms, wee-wees, willies and poo-poohs” to describe the humour of my four year old twin grandchildren. Does that mean that my article will be rejected? Worse – will they, too, be accused of obscenity? Mel

Comment provided January 10, 2009 at 7:53 AM


Kevin Leland writes:

Right Mel! In all fairness to EzineArticles, I think they are on the right track about holding up a clean, unspamadelik standard. But I argue that using these robots without more human oversight will eventually get them nothing more than QUANTITY, as quality, comedy, and interest get flushed down the toilet, leaving them sitting on a towering mountain of poopey-doop!

EZA brags about how “quality” keeps them ranked high in SEO, but keep in mind that “quantity” of content has a huge bearing on the search engines too. I predict that soon all that will change. The article that can give good info and at the same time give a reader a chuckle will far outrank an article that turns on the robot’s green light, but bores a reader to tears.

Keep encouraging that humor in your grand-kids, before we become a world full of dry mechanical information disseminated by unfeeling Mr. Spock’s!

Comment provided January 10, 2009 at 8:42 AM


Mel Menzies writes:

Thanks for your encouragement, Kevin. You gave me a good laugh anyway – and that has to be good for my dopamine and stress levels. Keep up the good work! Mel

Comment provided January 10, 2009 at 8:55 AM


Shirley Bass writes:


I saw your Twitter complaints about EzineArticles and felt it was an inappropriate place to complain and make trouble. In my eyes, it was ridiculous.

Why didn’t you take your troubles straight to the source? Was it easier to try and sway other people away from EA?

I see that you advertise yourself as having extraordinary skills in Internet marketing. Do you teach your followers to spam the system? That’s what your article does, whether that’s what you intended or not. In my opinion, with your qualifications, you should have known better as an Internet Marketer.

Looks to me, your complaint may ruin it for the rest of us. There always has to be one in the group…

If the shoe fits…wear it!

You have the right to choose where your articles will be submitted. If you don’t like the rules, then find a place where your work will be appropriate, but don’t expect all of us to agree with your standards.

Shirley Bass

Comment provided January 10, 2009 at 11:25 AM



Chris, Jeff and Lance always say write, write, write!

I think the less we write, the more protective and sensitive we get about what is rejected. I also think that the less we write, the better we THINK we are. It takes rapidity to get better at anything at all. Steve’s article was rejected because it does not meet the criteria of EzineArticles, due to his excessive use of one word. Upon reading other things he has written, I am majorly impressed. I am also impressed with him because he allowed the hits to come about his rejected article. Wow! Now that takes guts! Hats off, Steve!

Recently a new acquaintance peeked into my studio, where I must have over 100 paintings stacked. She questioned it and asked how many I planned on painting. Well, “DA!” right? I plan on painting and writing till I stop breathing; that is how we get better at anything.

Hugs when you need ’em to everyone at EzineArticles and a Happy New Year!

Comment provided January 11, 2009 at 12:14 PM



Ohh darn!

“rapidity” was the wrong word to use, of course, LOL! I meant to infer something that we are obsessive enough about to throw ourselves into. Humm.. cannot think of the word!

Darn again- do you know how many times I have written things on this blog I ‘wished’ I hadn’t? Once I even accused someone of doing pin up art and kind of slammed him about that! And he didn’t even ‘paint!’ Remember that Ed? So sorry still! Tee hee.. we all do that right? Yup!

Comment provided January 11, 2009 at 12:22 PM



Ohhh! Sorry Chris! Just thought of the word! DA!!


Comment provided January 11, 2009 at 12:26 PM


Shirley Bass writes:

Okay Kathy, You are good…

Steve, I apologize if I were too harsh. I just felt, that if you had come to any of the blog forums on EzineArticles, it would have been better than advertising your disappointments on Twitter. I would imagine your issue would have been quickly dealt with here.

Secondly, we are all busy people and of course, I can only speak for myself, but I don’t want to use another tool to check my work. Writing the article is a big enough step for me…

Chris, I apologize to you and EzineArticles, if I interrupted the flow of this blog. I am so grateful for the service EzineArticles provides that it upsets me, when the service is ‘put on the block’ to sway disapproval by many individuals, who may not be authors here at EzineArticles.

I just didn’t like the way it was presented!

Shirley Bass

Comment provided January 12, 2009 at 9:23 AM




Thanks… and remember my insert about messing up, we all do.

Hugs all day!

Comment provided January 12, 2009 at 9:32 AM


James Evers writes:

There is a thin line between commercial and clinical in the discussion of keywords. The article in question supports the frequent use of the terms to provide clarity in the presentation. Had the keywords in question been the name of a product, a website, or even a person, the ruling would be more applicable.

Comment provided January 12, 2009 at 9:45 AM


Shirley Bass writes:

True enough Kathy! Shirley…

Comment provided January 12, 2009 at 10:28 AM



Why go on debating over a dead article? Nobody can ever rewrite it or resubmit it. Better to concentrate on producing new articles, using keywords in the quantities everybody now knows.

Only joking!

Comment provided January 12, 2009 at 11:15 AM


Marcelo writes:

Hi there, I found this article very useful and instructive. Balancing quality against keyword overuse is a big challenge. The article has quality, but also keyword overuse. The problem is how to replace those necessary words.


Comment provided January 21, 2009 at 4:08 PM


Bill Urell writes:

I have been taught by two different SEO strategists, independently and at different times, that the keyword is important in three places: at the beginning of your title, at the beginning of your description (the blurb Google picks up in listings), and at the very end of your article.

All else is about readability and really not worth your time to tweak. There are lots of free tools out there that will calculate keyword density if you are set on it. Just run your article through one now that the guidelines are known.

Bill Urell

Comment provided January 22, 2009 at 2:26 PM
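Bill’s advice about free density tools is easy to try yourself. Below is a minimal sketch of such a calculator in Python; the word-splitting rule and the function name are illustrative assumptions, not the formula EzineArticles’ internal tool actually uses:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single keyword as a percentage of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

sample = "Page rank matters. Google assigns page rank to every page it indexes."
print(keyword_density(sample, "page"))  # 3 of 12 words -> 25.0
```

Under a 1% floor like the one described in this post, a 1,000-word article could repeat its keyword only about 10 times before tripping the filter.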


nnamdi agha writes:

I have once been a victim of overusing keywords.

I had no intention of doing that while writing the article and I did not bother to find the keyword density. I had to reduce the size of the article before it was accepted.

I am now learning to use other words that appear similar to the ones I have in mind, but in most cases the original meaning and intent of the article is somewhat distorted.

Comment provided January 26, 2009 at 8:16 AM


Dennis Bandy writes:

Thanks for letting us in on the rules…no one wants to re-write an article due to something that could have been easily avoided.

Chris, a suggestion that’s off topic – something occurred to me while reading the comments. How about spell check for comments?

Comment provided January 26, 2009 at 9:00 AM




Have you considered using FireFox for your browser?

It has built-in spell check!


We do fix comments from time to time that have spelling errors.

Comment provided January 26, 2009 at 9:44 AM


Chris Ralph writes:

I am sorry to come to this discussion a little late. I side with those who think that a hard standard is excessive and leads to worse prose rather than better. I’d like to point out a painfully obvious fact:

Mr. Knight’s blog post on Keyword spamming violates his own EzineArticles standards on keyword spamming!

Yep, his blog post contains the term “Keyword” 13 times in a 418-word post! That’s 3.1% and a violation of the “hard standard.” So, Chris: are you spamming on your own blog while telling folks not to spam?

The evidence certainly says yes!

Kind of silly, huh?

If you are writing on “topic X” and there are many different terms that could be used to describe it, you will probably be OK and will not need to violate the standard. But if your topic, like “page rank,” is a unique term and the English language does not have many other words for it, then, by the failure of the English language, you are toast in the eyes of the Ezine “big brother.”

I do agree that excessive stuffing of a document with desired words does create lower quality works, but the Ezine staff are not robots and should be allowed to use personal judgement in the matter rather than ironclad rules.

Comment provided February 3, 2009 at 8:58 PM



Chris R,

Let’s look at *INTENT* here.

Members who keyword stuff on purpose are doing so to game the search engines. It’s in our best collective interest not to let those types of articles pass.

I wrote this article to educate and couldn’t care less whether the search engines index this blog entry or not.

I will concede that legitimate members with no intention of gaming the search engines may have a negative experience with us because of our hard floor on keyword & keyphrase use… and we’ve made peace internally with the fact that we can’t please everyone’s needs, nor are we going to try to please everyone.

The consequence of that last statement is that we’ll lose a percentage of members that write highly keyword efficient (on purpose or not on purpose) articles.

It would be easier to agree with you, Chris R., if there weren’t such an extreme number of articles being submitted with the intent of gaming the search engines that we have to deflect with both automated and human intervention.

Comment provided February 4, 2009 at 8:22 AM


Kevin Leland writes:

Very well put! I think you just won me over again with that comment. I left mad because my articles got shot down for obscenity (believe me, this content was PG-13 at worst). Now I see that it isn’t EA’s fault that the robots they need to send out to clean up all the garbage are not advanced enough to discern what a human being, organic intelligence, could. For now I’m going to be patient and try to fit within the guidelines you lay out for us, even if I have to shuffle around some words; I can always edit later… Soon the robots will be able to distinguish between the garbage and the stuff that might just kind of look like it. What about my suggestion about more human editors? Could there be a way to set that up so everyone concerned (writers, proofreaders, editors) would benefit?

Comment provided February 4, 2009 at 10:13 AM




Sorry, we’ve made the decision to NEVER hire freelance editors since day one.

We already have more than 30 full-time Editors in-house and are hiring more this year.

Comment provided February 4, 2009 at 1:15 PM


--- writes:

The main reason PageRank was introduced was to rank pages based on content. Nowadays people worry about the PageRank and not the content, which makes the concept obsolete.

We shouldn’t worry too much about page rank. Make the content perfect and the ranks will follow!

MODERATORS NOTE: Yes, we agree. Focus on producing high quality unique original content and forget about aiming for pagerank. It’ll happen naturally over time when it’s earned.

Comment provided April 18, 2009 at 11:09 AM


Will writes:

“Side Note: We only compute this metric on the article body, but if the keywords and keyphrases that exceed our guidelines are also used in the Article Title and Resource Box (They ALWAYS are in articles that are correctly rejected) this may count against your article to get accepted if it’s on the edge of acceptance or rejection.”

Keywords used in the Title and Resource box may count against you?

What article ‘marketing’ course did you study? Of course the keyword is in the Title and in the Resource Box as anchor text. Those are two of the main criteria for having an article rank in Google and link to your site. Unless, of course, you don’t care whether anybody FINDS your article in Google.

There was a whole section in the EzineArticles guidelines about putting your keyword in the front of the Title as the first words, just so it gets noticed more easily.

EA has gone overboard with its keyword density rules and opted out of letting its editors use journalistic common sense to understand that some micro-niche market topics have only one or two, or maybe even no, synonyms that accurately reflect the main keyword. And for that we have to throw away an original article with good content that somebody put their time and energy into developing.

Thank you for listening.


Comment provided April 30, 2009 at 4:25 PM


Kevin writes:

I was glad to listen, Wil. They are way off the point, and very contradictory in more ways than just keyword use. This is why I write for instead… Look for me over there; search kdelik or Kevin Leland.

I just did an article “Why I won’t pay the subscription fee at EzineArticles.”

You might find it interesting. Your comments would be welcome. I write about online writing and SEO a lot too. You sound as if you may have a lot of common sense advice to offer everyone on this topic. You should check it out.

Comment provided May 1, 2009 at 3:50 AM


Martin - Security. writes:

I just don’t get it.

Oh I understand the need not to stuff an article with keywords or keyword phrases alright. But . . .

If you create an article on the topic of ‘toy dogs’, for example, you can abide by the guideline of 1% for that keyword phrase without much problem, I should think.

However, I would also think that it would be extremely difficult not to exceed the limit for the single word ‘dogs.’

The article would not be optimized for ‘dogs’ and would not rank at the engines for that, but it could at least stand a chance of ranking for its intended keyword phrase, something that both EzineArticles and the author would welcome. But that could never be, because the article would be rejected for overuse of the word ‘dogs.’

Yes, canine, pooch etc. can be substituted but only so often.

As other authors have pointed out, nobody optimizes articles or web pages for a single keyword these days.

Comment provided July 12, 2009 at 12:43 PM


Im so annoy with this keyword density writes:

I’m so annoyed with this 2% keyword density. Most of the time I have to look for synonyms of my keyword just to pass this 2% keyword density requirement.

Comment provided February 25, 2010 at 9:43 AM


Dear annoyed person,

You haven’t got your facts quite right.

It’s a 2% KEYPHRASE limit, not 2% keyword limit. A keyphrase is 2 or more words.

Our KEYWORD density limits are currently 4-6%.

We determined that most SEO spammers target keyphrase abuse primarily and therefore we are tighter on keyphrase abuse than we are on single keyword abuse.
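Since a keyphrase is two or more words, checking it against the 2% limit means sliding a window across the text rather than counting single tokens. Here is a minimal sketch of that check; the tokenizing rule and example are my assumptions for illustration, not the site’s actual internal tool:

```python
import re

def keyphrase_density(text: str, phrase: str) -> float:
    """Occurrences of a multi-word phrase as a percentage of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    # Slide an n-word window over the text, counting exact matches.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits / len(words)

body = "Toy dogs are popular. Many toy dogs live in cities where toy dogs thrive."
density = keyphrase_density(body, "toy dogs")
print(f"{density:.1f}")            # 3 hits in 14 words -> 21.4
print("over 2% limit:", density > 2.0)
```

Note that with this kind of counting, the single word “dogs” will always score at least as high as the phrase “toy dogs,” which is exactly the tension later commenters raise.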


J Davis writes:

I agree with Steve. This policy is nonsensical and arbitrary. Most of us write articles without any thought of keyword stuffing, and like Steve I am not willing to rewrite my articles to meet some moron’s arbitrary rules. I have had articles kicked back for the most nitpicky and petty reasons. If I adhered to these ‘guidelines,’ my articles would make no sense at all.

Comment provided June 23, 2010 at 2:17 PM


Jarrod writes:


I appreciate the clarification you made in comment #58. However, when referring back to the original case study article on page rank, I get confused.

The reason I get confused is because the stats you gave for the case study article are as follows:

* page occurs 24 times (5.3%)
* pr occurs 19 times (4.2%)

Yet, in your comment #58 you said you are less strict with single-word keywords and allow a single keyword density limit of 4% to 6%.

In the case study article it does look like the density of the words *page* and *pr* falls between 4% and 6% for each keyword. So, why the rejection?

Perhaps, I’m interpreting your comment about the 4% to 6% range incorrectly. Would you mind addressing my point of confusion?


Comment provided July 24, 2010 at 6:16 PM


Wendy writes:

I’d like clarification on that too. If one word is allowed to be repeated at a 4-6% density, that is a big difference from what is noted in the example.

I just spent an hour rewriting a client’s articles that had a 3.1% keyword density for the word “alarms,” even though the keyword phrase the article was written around, which included “alarms,” had a density below 1%. I liked the article better before rewriting it, because the synonyms of “alarms” just don’t read the same in the case of this article.

Clarification would be GREATLY appreciated.



This example shows more than one keyword being abused (“PR” and “page”), which is a problem. We are more lenient when one word is used at 4-6% than when several words are each used at 4-6%, which makes reading the article clearly nearly impossible.

Does this make sense? If you have examples, we can review them with you privately.



I would maintain a keyword density of 3-5% while doing SEO on my website.

Comment provided January 12, 2011 at 3:12 AM


Mack Trucks writes:

It’s a measure of keywords as a percentage of indexable text words. Keyword density is not constant; it can vary depending on which search engine analyzes the page. Keyword density is an important factor in how a page gets ranked in various search engines!

Comment provided February 17, 2011 at 12:32 AM



Things are tougher at EzineArticles…

I just submitted a 1.8% KD article and hope to get it approved.

Still waiting… I’ll be back here to confirm whether it’s true that you can go beyond the “regulated” keyword density of 1%.

Comment provided January 4, 2012 at 7:35 AM


Ender Berett writes:

SEO and keyword density can be very tricky to figure out. How much is too little or too much? I’m glad that someone set a standard. That will help me in the future.

Ender Berett

Comment provided February 5, 2014 at 2:57 PM




