So you’ve optimized your Web content for your niche keyword terms, made sure you use those terms in the meta title, the meta keywords, the meta description, the H tags, the top of the page, the bottom of the page, and scattered throughout the Web page. You’ve spent weeks getting links with your keywords in the anchor text. You’ve checked your keyword rankings, blogged and pinged, tagged and pinged, posted to forums, and prayed, “Please Google, list my site as #1 for my main keywords!”
It used to be that you could rank pretty well with on-page search engine optimization alone. Sprinkle the right keywords in the right places in your Web content and bam! You were at the top of the search engine results.
Then it started to get more difficult. You had to get many other sites to link to you with your keywords in the anchor text (the link text). And if the sites or pages linking to yours were on the same theme, so much the better.
But then the spam sites took over — millions of junk blogs (splogs) stuffed full of scraped keyword-rich Web content that often wasn’t even readable for humans. It got so that the top search engine results for any keyword were often these junk sites. Imagine if you were Google. Wouldn’t you start to worry that people would stop using your search engine if it couldn’t deliver the Web content they were really looking for?
The new Google patent indicates that Google is now using (or intends to use) new algorithms to try to weed out the black hat SEO spam sites. Latent semantic indexing (LSI) is one way of doing that. For any given topic, you can calculate which other words and even ideas you would expect to see in the content on the Web page, and even across the whole Web site. You can even calculate how often they should appear, which phrases should appear, and so on.
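To make that concrete, here is a toy sketch of the idea (an illustration only, not Google's actual algorithm; the corpus, the word lists, and the `related_terms` helper are all invented for this example). A truncated SVD of a term-document count matrix places each word in a low-dimensional "concept" space, where words that co-occur around the same topic end up pointing in the same direction:

```python
import numpy as np

# Tiny invented corpus: two documents about cars, two about cooking.
docs = [
    "car engine wheel brake",
    "wheel car tire drive",
    "recipe oven flour bake",
    "bake flour recipe dish",
]

# Build a term-document count matrix (rows = words, columns = documents).
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1

# Latent semantic indexing: truncated SVD of the term-document matrix.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                          # keep only the two strongest "concepts"
term_vecs = U[:, :k] * s[:k]   # each row: one word in concept space

def related_terms(word, top=3):
    """Words whose concept-space vectors point the same way as `word`'s."""
    v = term_vecs[index[word]]
    sims = term_vecs @ v / (
        np.linalg.norm(term_vecs, axis=1) * np.linalg.norm(v) + 1e-12
    )
    order = np.argsort(-sims)
    return [vocab[i] for i in order if vocab[i] != word][:top]

print(related_terms("engine"))  # car-themed words, not cooking words
```

Even though the math never sees meanings, the car-document words cluster together in concept space: asked for words related to "engine", the sketch returns other car terms (even "tire", which never shares a document with "engine") rather than cooking terms. That is the sense in which an LSI-style analysis can predict which words it "expects" to find alongside your keyword.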
If a keyword appears many times on a page, but an LSI analysis indicates the page is not really about that topic, it is not going to rank high in Google’s search engine results any more. And if the keyword-rich links leading to a page suggest the Web content should be about something other than what LSI indicates it is actually about, it is not going to rank high either.
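Here is the flip side as a sketch (again a toy illustration with an invented corpus and invented helper names, not Google's ranking code): fold a page's word counts into the same kind of concept space and compare it to the topic the page claims to be about. A page stuffed with the keyword but padded with off-topic filler scores lower than a page written naturally about the topic:

```python
import numpy as np

# Invented background corpus: what "car" and "cooking" content normally look like.
corpus = [
    "car engine wheel brake tire",
    "engine wheel car road drive",
    "recipe oven flour bake dish",
    "bake flour recipe oven meal",
]
vocab = sorted({w for d in corpus for w in d.split()})
idx = {w: i for i, w in enumerate(vocab)}

def vec(text):
    """Raw term-count vector for a page (words outside the vocabulary are ignored)."""
    v = np.zeros(len(vocab))
    for w in text.split():
        if w in idx:
            v[idx[w]] += 1
    return v

# Concept space from a truncated SVD of the term-document matrix.
A = np.column_stack([vec(d) for d in corpus])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2

def concept(v):
    """Fold a page's term vector into the k-dimensional concept space."""
    return U[:, :k].T @ v

def sim(a, b):
    """Cosine similarity between two concept-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

topic = concept(vec("car engine wheel"))  # what a "car" query expects to see

natural = "car engine wheel brake drive road tire"
stuffed = "engine engine engine recipe oven flour bake engine engine"

print(sim(concept(vec(natural)), topic))  # close to 1: the page fits the topic
print(sim(concept(vec(stuffed)), topic))  # noticeably lower: stuffing shows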
This may sound overly simplistic, but maybe you should actually just write normal Web content about the topic! After all, that’s really what Google wants to see and it’s really what the people who surf to your Web site want to see. Maybe you should give it to them, if you’re not already. Instead of trying to trick Google, provide great Web content that is naturally written about the topic. Without even trying, the words that Google would expect to be on the page with your keywords will just naturally be there. After all, they are comparing your page to their analysis of other real content on the topic.