As a Copywriter within the Online/Digital/Inbound Marketing industry, I’ve always been taught to write with freedom…and keywords. There’s always been a dual aspect to writing copy, just as there is with many other SEO disciplines – work on quality, but don’t forget what you’re targeting.
A Brief History
For as long as people have been optimising websites, there have been keywords. These magical terms have informed search engines as to what a page is targeting and allowed them to make a decision on where they should rank. In days gone by, or in what we can now safely refer to as the pre-Google age, these were the predominant indicators of authority and relevance.
Over the years, the relevance of keywords as a ranking factor has ebbed and flowed. However, even when people started to tone down their keyword density and remove hidden text on pages, these little words remained integral.
Take anchor text as an example. Why would a site just use a dull URL strand or company name when they could bolt on a target keyword and further highlight the relevance of a page? This was done on off-site content, directories, business profiles and even within the site copy itself. Why? Because it worked and, to a certain extent, it still does.
Spanner in the Works
As mentioned, the search engines have been keen to dissuade site owners and Copywriters from overusing keywords for years. The party line is to “write for humans, not Google”, but whilst this may be true from a usability point of view, sites full of poor quality, keyword stuffed content would frequently win the race for rankings.
Then we have our good old friend – anchor text. It’s always been a bit of a nonsense (why would a link naturally contain a two word term that exactly matched the keyword of the target page?), but the simple fact is that it got results – good ones too.
However, Google weren’t happy. Their listings were full of low value articles, affiliate sites and other nonsense. This unhappiness was echoed and exacerbated by users who were dissatisfied with what they found. So along came an angry Panda update that worked to eradicate those sites that were littering results pages everywhere. Or at least that was the theory.
Article sites were hardest hit; their churned-out content and AdSense-filled pages were wiped from search results left, right and centre.
But this was by no means a comprehensive expulsion of all the bottom feeders and over-optimised nonsense. It was a start, but even after a dozen or so updates, results still haven't been cleansed entirely. The Panda needed backup. Content wasn't the only factor ruining search results; the way in which pages were being optimised was just as much to blame.
So, in waddled Google’s newest weapon – the Penguin update.
Anchor Text, Over-optimisation and Keywords
In their usual subtle manner, Google sent out Matt Cutts to politely explain that sites shouldn’t be over-optimised, should consider reducing keyword-based anchor text and certainly shouldn’t have paid links. Cue uproar and, a couple of months later, a major algorithm update.
Suddenly, after years of flagrant violations of Google's rules (which they had always struggled to enforce), a swathe of sites plunged into obscurity. Owners and SEOs were given a message explaining that they had "unnatural links" or similar, and that they would be able to apply for reconsideration once those links were removed. Ouch.
However, even with the Penguin and Panda working together, only a small percentage of sites were affected. The Penguin update, for example, only actually impacted around 3% of search queries. Whilst these may be key verticals in industries that are renowned for underhanded techniques, the other 97% of searches were almost entirely unaffected.
Despite this, one of the key issues to come out of the Penguin update (as well as various mentions of pushing towards semantic search) was the effectiveness of keyword targeting. Suddenly Google were openly admitting that they were trying to move away from the conventions that many of us had become accustomed to.
They were hitting doorway pages, punishing excessive anchor text and paying little heed to the keyword targeting of pages. Suddenly (which I say in the loosest possible sense), social signals and brand/author authority were coming into their own. Weak links were being ignored and a whole host of code violations were finally being caught out.
So where are we now?
For many sites, the Panda and Penguin updates were entirely inconsequential. Rankings may have fluctuated slightly, but largely it was little more than a minor bump in the road. We hear (and, in some cases, experience) the horror stories of sites being entirely wiped out, but the damage was by no means universal.
Sure, plenty of site owners will think twice about how many keyword-optimised anchor text links they create or whether they choose to buy links. Clearly Google are getting better at spotting these things, so it is wise to tread with caution (as it always has been). However, the search engines haven’t been purged of poor results entirely. Plenty of affiliates and low value, keyword-optimised sites are profiting.
This creates a bit of a problem for Google and SEOs. We always talk about building an identity, working on ‘inbound marketing’ (*shudder*) and developing a social profile to truly dominate search rankings. Conversely, the advice we provide is to stop creating nonsense landing pages to target keywords, never buy links and be more creative with anchor text (if indeed you are going to use it at all).
But not everybody is playing by the rulebook and, quite often, they are the ones that are winning.
Context, content and authority might be the ranking factors of tomorrow, but in the here and now, plenty of people are still getting thousands of visitors from breaking/bending the rules and over-optimising sites. Whilst this continues, keywords will never be dead.
What about the future?
The theory behind the future of search is that it will be a much more organic process. Rather than people using keyword-based searches, they'll just ask questions and be provided with answers. These will be based on sites with proven authority, or that have been shared by people within your social circle, by influential industry figures, or even by the search engines themselves.
In such a world, keywords and most links would be entirely redundant. The search engines would know what you do, where you’re based and whether you’re best placed to provide the answer/product/service being sought. But when will that actually happen?
Should you abandon pages and techniques that are still working to focus on what may or may not happen in the future? Keywords might be considered ridiculous in a few years; however, the notion that they would pale into insignificance might be similarly mocked. We are entirely at the mercy of Google. Whilst they might be pointing towards a new dawn of search, there’s still plenty of evidence that this isn’t about to happen here and now.
That’s scant consolation for those that did get hit by the updates of course. Whether it’s for paid links or low-grade content, these penalties were largely justified. However, they have taken years to enforce; all of which means it’s difficult to predict when the next step forward will come.
Should you abandon what is working?
Personally, I've not been overly concerned about keywords for a while and won't be changing that opinion any time soon. Obsessing over them is illogical, other than where they are used as a natural part of on-site content. Writing copy to target keywords, rather than creating content that happens to include keywords, is a major problem. So having a page for "SEO Southampton", "SEO Hampshire", "SEO UK", "SEO London" and so on is just a waste of server space.
Anchor text is much the same. It stands out like a sore thumb and shouldn’t really pass on relevance – even though it does. Links should be used to add context or to divert readers to appropriate sources, not as a method of justifying writing content or passing on relevance. That flies in the face of content marketing to a certain extent, but links can still go back to your page, just using a more logical path or term.
By all means build for a social future, create a brand identity and author relevance, but there’s no point in abandoning what is working for you in the here and now. Okay, this may prove to be short sighted, particularly if updates to Penguin roll in soon. However, if you’re number one for a phrase, why should you go back and change 70% of your inbound anchor text to alternative terms?
Some might risk it, others will simply be happy to enjoy the good times whilst they last. Of course you can always vary your techniques moving forwards and strong links will usually provide a tonic to any potential slap on the wrist for low value content. It’s a choice that marketing agencies and site owners have to make, independently or together, and it may be easier in some situations than others.
So keywords are alive and may be for some time to come. Whether they will survive as more semantic-based search factors are implemented is open to debate, so feel free to leave your thoughts below.
Group Of King Penguins On The Beach via BigStock