21st Century Keyword Density
Tom Schmitz Oct 27 2011
I am not a fan of using keyword density for SEO. It’s been many a long winter since search engines cared about how often a word or phrase appears on a page relative to the total number of words.
Today, Google and Bing care far more about natural language patterns. This means the optimal keyword density for any search query changes depending on the words and the context. Which do you think Google will rank higher: the web page that mentions “baseball bat” at a steady x% keyword density but never mentions bases, balls, dugouts, mounds, pitches, hits, and other baseball terms, or the page that does?
Go to minute 36 of this video: http://www.uwtv.org/video/player.aspx?dwrid=3898. That’s Google engineer Jeff Dean in 2005 sharing with University of Washington students how Google search works and where it is headed. It gets really interesting after minute 38. Remember, this video was made in 2005. If natural language word clusters were important to Google six years ago, and knowing Google makes over 500 changes and upgrades to its search algorithms every year, it’s a safe bet that Google has advanced this technique and incorporated it into its search results. During search conferences, Google representatives have pretty much confirmed this. SEOmoz even developed the Latent Dirichlet Allocation (LDA) score to measure this, with some success. Oh, and Ben Hendrickson, the person who invented the LDA score? Google recruited him.
If you’re unsure, run your own study. Perform any keyword search on Google, then run the top 10 organic results through a keyword density tool. Let me know what happens.
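If you don’t have a keyword density tool handy, the metric itself is trivial to compute. Here’s a minimal Python sketch using one common convention (occurrences of the phrase × words in the phrase, divided by total words, as a percentage) — the function name, sample text, and exact formula are my illustration, not something prescribed by any search engine:

```python
import re


def keyword_density(text, keyword):
    """Percent of words in `text` accounted for by `keyword` (a word or phrase).

    Uses a common convention: (hits * phrase_length) / total_words * 100.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Slide a window over the word list and count exact phrase matches.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)


sample = ("The baseball bat cracked. A good baseball bat sends the ball "
          "over the fence, past the dugout and the pitcher's mound.")
print(round(keyword_density(sample, "baseball bat"), 1))
```

Run this across the top 10 results for a query and you’ll likely see the densities scatter widely — which is the point: the ranking pages share a vocabulary cluster, not a magic percentage.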