Push the panic button! Google are going to start cracking down on sites with too much SEO, at least that’s what spokesman extraordinaire Matt Cutts claimed at the SXSW conference last week.
But hold on a minute: what constitutes too much SEO? Is there a maximum keyword density or number of inbound links? Is this the end of the line for optimisation as we know it? Of course not.
Cutts essentially explained that Google were looking to eradicate spammy techniques that have (apparently) long since been outlawed but still manage to deliver results. As in any good horror film, just because something appears to be dead doesn’t mean it can never return. Search Engine Optimisation is much the same.
Chasing a keyword density of 10%+ probably should have died five years ago. In fact, as soon as PageRank became clever enough to determine authority from link profiles, it ought to have gone the way of the floppy disk. Apparently, this is one of the practices Matt Cutts identified as something Google will now seek to punish.
About time too.
If you’re pumping in keywords to artificially inflate your rankings then you deserve to be penalised. More worrying, for me at least, is that this is almost an admission of failure on Google’s part. They are essentially accepting that some sites may still be benefiting from a practice that offers no quality or benefit to users. Of course those sites should be given the boot until they can clean up their act. For further clarification/confusion, here’s what Matt had to say on the subject last year.
What about links, though? Cutts suggests that if you “exchange way too many links or go well beyond what you normally expect” you can expect a penalty under this new update. Again, this is a pretty vague statement. Does it cover inbound, outbound and reciprocal linking? How many links are too many: 1,000, 10,000 or 100,000? There are hundreds of thousands of sites with entirely manufactured link profiles, so will they all be penalised? Is there a specific over-optimisation threshold?
Will anchor text also be used as a signal? Logic dictates that if you’re receiving thousands of links with the same anchor text (and it isn’t your business name) then there’s a good chance that a fair percentage of those links are unnatural. After all, why would someone choose to link to a specific page using a particular term that may have no contextual relevance to their own page content?
Anchor text is still a decent method of building keyword relevance, but it is also behind much of the spam online. Popular blogs are littered with low-value comments left by users with keyword-optimised names. This doesn’t help Google, it doesn’t help the blogs, and it shouldn’t benefit the linking site. However, it still works as a technique and so it is still employed.
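To make the idea concrete, here is a minimal sketch in Python, using a made-up link profile and an arbitrary 30% threshold (neither of which is a figure Google has published), of how you might spot an anchor-text distribution that looks manufactured:

```python
from collections import Counter


def flag_suspicious_anchors(anchors, threshold=0.30):
    """Return anchor texts that account for more than `threshold` of a link profile.

    `anchors` is a list of anchor-text strings, one entry per inbound link.
    The 30% default is purely illustrative, not a known Google figure.
    """
    counts = Counter(anchor.strip().lower() for anchor in anchors)
    total = sum(counts.values())
    return [
        (text, count / total)
        for text, count in counts.most_common()
        if count / total > threshold
    ]


# Hypothetical link profile: a brand name mixed with a heavily repeated money keyword.
inbound_anchors = (
    ["cheap widgets online"] * 700
    + ["Acme Widgets"] * 200
    + ["click here"] * 60
    + ["www.example.com"] * 40
)

for text, share in flag_suspicious_anchors(inbound_anchors):
    print(f"'{text}' accounts for {share:.0%} of inbound links")
```

In a natural profile, branded terms and bare URLs tend to dominate; when a single money keyword accounts for the bulk of your links, it stands out a mile.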
Keywords should appear in titles and within the text of a page, but not to the extent that a reader can visually track their usage line by line. Essentially, they should blend into your content and give users a clearer idea of what your site is offering. The trouble is, for all their efforts to promote quality content, Google still can’t accurately determine what is good and what is bad. Can they afford to banish every site with a keyword density of 10%+, or will they settle on a higher, more conservative figure? Alternatively, will there be a more sophisticated method for detecting unnatural keyword usage? Again, there are more questions than answers.
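For anyone wanting to sanity-check their own copy, this is roughly what a crude keyword density calculation looks like. It is a sketch using hypothetical page copy, not a reflection of how Google actually measures keyword usage:

```python
import re


def keyword_density(text, keyword):
    """Return the percentage of words in `text` that exactly match `keyword`.

    Deliberately crude: single-word exact matches only, with no stemming,
    phrase handling or awareness of HTML structure.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)


# Hypothetical, deliberately over-stuffed page copy.
page_copy = (
    "Cheap widgets from the widget experts. Our widgets are the best "
    "widgets, and widget fans love our widget deals on widgets."
)

print(f"{keyword_density(page_copy, 'widget'):.1f}% of the words are 'widget'")
```

Even a rough figure like this makes the problem visible: if one term accounts for more than one word in ten, a human reader will notice long before any algorithm does.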
Who should be worried?
If you’ve been following Google’s guidelines for the past five years or so, then you should have nothing to worry about. However, if you’re still employing outdated techniques, it might be time to hit the brakes and mend your ways.
That said, I do understand the counterargument to all of this. After all, if you have been benefiting from spammy techniques in spite of Google’s many past warnings, why would you deviate from a winning formula? Just because they say they’re going to come down hard on ‘over-optimisation’, there’s no guarantee that you will be affected.
This has always been the tension between SEO and Google. Some think the two work against each other, some believe the opposite; most, however, are battle-hardened enough to know that the search engines aren’t always able to practise what they preach. For instance, Google talk about cracking down on sites buying links; if you took that as gospel, you’d assume there was no more link buying or selling going on. That just isn’t the case. It still goes on, and people are still getting decent results and income from it.
I would still urge caution though. Even if you get away with over-optimisation now, Google are effectively setting out their stall to tackle it more and more in the coming months and years. As their ranking factors grow and the Googlebot’s intelligence develops, they will be better positioned to take action. Whether they are quite there yet is anybody’s guess. However, we will probably find out more in the coming weeks when this update takes effect.
So how would you define over-optimisation? Is this all just a storm in a teacup that has been blown out of proportion by some SEOs? What identifiers would you use to determine whether a site is over-optimised (keywords, reciprocal links, anchor text etc.)? As always, your comments are most welcome.