Using your keyword in the body of the text multiple times to increase your keyword density is important to indicate to Google the subject matter of your article. However, you must not overdo it; otherwise, Google may think you are spamming it to manipulate the search results. Too many keywords and your web page will suffer a penalty.
The big question is: what is normal? Well, instead of using a fixed percentage, Google now calculates the density of a keyword or keyword phrase for a particular query by referencing other web pages on that topic that rank highly. As long as you have roughly the same keyword density as the sites that already rank highly, you should be OK.
This reference to what is considered normal is called the TF-IDF (Term Frequency–Inverse Document Frequency) of the page. We will look at this in more detail in a separate article (coming soon).
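As a rough illustration only (my own sketch of the textbook formula, not Google's actual implementation), a basic TF-IDF score weighs how often a term appears in one document against how rare it is across a set of documents. The corpus below is hypothetical:

```python
import math

def tf_idf(term, doc, corpus):
    """Basic TF-IDF: term frequency in one document, weighted by
    how rare the term is across the whole corpus."""
    words = doc.lower().split()
    tf = words.count(term.lower()) / len(words)  # term frequency
    docs_with_term = sum(
        1 for d in corpus if term.lower() in d.lower().split()
    )
    # Inverse document frequency; the +1 is a common smoothing variant.
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

# Hypothetical three-document corpus
corpus = [
    "keyword density is an old seo metric",
    "write naturally and avoid keyword stuffing",
    "search engines weigh terms by rarity across documents",
]
print(tf_idf("density", corpus[0], corpus))  # rare term -> positive score
```

A term that appears in only one document scores higher than a term spread across every document, which is the intuition behind comparing your page against the pages that already rank.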
Despite the shift to TF-IDF, SEO tools still push keywords into the title, content, meta descriptions, image alt tags, and so on. Moz, for example, wants to see a mention of the keyword in each of these, although their tool does highlight keyword stuffing too. SEMrush, I think, does it slightly better, as it provides statistics such as the TF-IDF of the top results for your keyword for comparison purposes.
There are, of course, many other tools, but it is essential to use an SEO tool that doesn’t use a fixed density.
As John Mueller has said, things have moved on from there.
Will higher keyword density improve your Google Rankings?
We took an in-depth look at what the Googlers and experts had to say:
Moz on-page optimization tool encourages the use of keywords.
- When optimizing a page for a specific keyword in the Moz On-Page Grader, the tool encourages you to use your target keyword at least once in the Title, Body, Meta Description, URL, and Image Alt Tag. They also alert you to keyword stuffing.
If you use keywords too many times in the document text, search engines may tag your page for keyword stuffing (a form of search engine spam), which can hurt your rankings in the search engines, as well as appear spam-like in the search results to potential visitors.
One of the leading indicators of low-quality main content is keyword stuffing.
- In the Quality Raters Guidelines, Google has set out the type of characteristics that denote the lowest quality of webpage or content. One of the main indicators that they are looking for is keyword stuffing.
For the absolute lowest quality page, the keyword stuffing is going to be more akin to gibberish. Think of a block of text that is not meant to be read by a person. An extreme example is referenced in the Raters Guide and can be found here.
While this doesn’t shed much light on whether the use of keywords is a ranking factor, it does reaffirm Matt Cutts’ statement that keyword stuffing can cause a penalty.
Keyword Density, in general, is something I wouldn’t focus on […] Search engines have kind of moved on from there.
- In an English Google Webmaster Central office-hours hangout, John Mueller said:
Keyword density, in general, is something I wouldn't focus on. Make sure your content is written in a natural way. Humans, when they view your website, they're not going to count the number of occurrences of each individual word.
And search engines have kind of moved on from there over the years as well. So they're not going to be swayed by someone who just has the same keyword on their page 20 times because they think that this, kind of, helps search engines understand what this page is about.
Essentially, we look at the content. We try to understand it, as we would with normal text. And if we see that things like keyword stuffing are happening on a page, then we'll try to ignore that, and just focus on the rest of the page that we can find.
John Mueller, Google Webmaster Trends Analyst, was asked in an English Google Webmaster Central office-hours hangout about keyword density.
In particular, he was asked whether using exact match anchor text in navigation links, or in headers can be bad for SEO even when a site has unique content, and the density of keywords is natural. Google confirmed that you should use anchor text that makes sense for your users.
He then went on to talk about keyword density, and how things have moved on from there. I think it is safe to take his comments as meaning Google now uses TF-IDF, although he doesn’t explicitly say this.
You can view the entire conversation in the video below:
I would love it if people could stop obsessing about keyword density.
- Matt Cutts, in response to a question about keyword density, confirmed that there is no hard and fast rule; it varies by keyword and topic. If you keyword stuff, it could hurt your position in the rankings.
I would love it if people could stop obsessing about keyword density […] It’s going to vary by area […] there’s not a hard and fast rule.
Matt Cutts, the previous head of Webspam at Google, was asked the following question:
What is the ideal keyword density: 0.7%, 7%, or 77%? Or is it some other number?
Here are some of the most important points from Matt Cutts’ reply:
A lot of people think there’s some recipe and you can just follow that, like, you know, baking cookies, and if you follow it to the letter you’ll rank number one, and that’s just not the way it works.
So if you think that you can just say, okay, I’m going to have 14.5% keyword density, or seven percent, or seventy-seven percent, and that will mean I’ll rank number one, that’s really not the case.
If you continue to repeat stuff over and over again, then you’re in danger of getting into keyword stuffing.
If you’re an experienced SEO, someone’s just, like, trying to get the same phrase on the page as many times as possible, because it just looks fake, and that’s the sort of area, in that niche, where we try to say, okay, rather than helping, let’s make that hurt a little bit.
So I would love it if people could stop obsessing about keyword density. You know, it’s going to vary; it’s going to vary by area; it’s going to vary based on, you know, what other sites are ranking. There’s not a hard and fast rule, and anybody who tells you that there is a hard and fast rule…
The points made by Matt Cutts are interesting. Even as far back as 2011, Google seemed to be looking beyond a set keyword density; the acceptable level varies based on the density used by other sites that are already ranking. Furthermore, if you add too many keywords, Google will penalize your site.
You need to write normally. This is easier said than done, though.
Frequently Asked Questions
There are a number of formulas around the web that can be used to calculate keyword density.
Let me take a very simple formula first:
- (Number of keywords / Total number of words on page) x 100
However, this does not do the concept justice. Google is quite a bit smarter than that, and indeed, has even moved way beyond this next formula with TF-IDF.
- Keyword Density = (Nkr / (Tkn - (Nkr x (Nwp - 1)))) x 100
- Nkr = how many times you repeated a specific key-phrase
- Nwp = number of words in your key-phrase
- Tkn = total words in the analyzed text
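Both formulas above are straightforward to compute. The sketch below is purely illustrative (the function names are my own):

```python
def simple_density(keyword_count, total_words):
    # (Number of keywords / Total number of words on page) x 100
    return keyword_count / total_words * 100

def phrase_density(nkr, nwp, tkn):
    # Keyword Density = (Nkr / (Tkn - (Nkr x (Nwp - 1)))) x 100
    # nkr: how many times the key-phrase is repeated
    # nwp: number of words in the key-phrase
    # tkn: total words in the analysed text
    return nkr / (tkn - nkr * (nwp - 1)) * 100

# A two-word phrase repeated 5 times in a 500-word page:
print(simple_density(5, 500))     # 1.0
print(phrase_density(5, 2, 500))  # ~1.01
```

The second formula discounts the extra words a multi-word phrase occupies, so its result is slightly higher than the naive version for the same page.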
But as I have indicated in the article, discussing keyword density in this way is now irrelevant. Google has moved on.
There are many keyword research tools you can use to help decide on your primary and related keywords. Many will also check whether your keyword or phrase is optimized for Google, offer up semantic keywords or particular keyword phrases, and overall check your content quality.
I recommend the following:
- [SEMrush (7-day free trial)](/go/semrush/)
- Cognitive SEO
- Ahrefs (keyword research only)
There are many more SEO tools, but these are the main ones. If you can only afford one tool, I recommend SEMrush as you can undertake a detailed site audit as well as keyword research and optimization. It also provides the most detailed information to help you get the ideal keyword density.
Jonathan Griffin Editor, SEO Consultant, & Developer.
Jonathan Griffin is The Webmaster's Editor & CEO, managing day-to-day editorial operations across all our publications. Jonathan writes about Development, Hosting, and SEO topics for The Webmaster and The Search Review with more than nine years of experience. Jonathan also manages his own SEO consultancy, offering SEO developer services. He is an expert on site-structure, strategy, Schema, AMP, and technical SEO. You can find Jonathan on Twitter as @thewebmastercom.