Last week Google finally announced that they would be undertaking the biggest algorithm overhaul in their relatively brief history. Forget Panda, Venice or the Brand update, this is a potential game-changer beyond anything we have ever seen. But what is it?
Semantic search has always been the theoretical target of all search engines. The ability to understand a user’s intention and provide bespoke results and information based on a single query is what Google, Bing, Yahoo and the rest have all been working towards. Now, it would appear, the theory could finally be turned into practice.
What does this mean?
The biggest flaw with algorithm-based search is that the engines could only ever deliver a list of websites in order of perceived relevance. Whilst the formulas have become more advanced and detection of wrongdoing improved, results are still far from perfect. Poor sites with low value content continue to thrive, making a mockery of their processes.
The obvious solution has always been for the search engines to simply bypass websites entirely and provide their own answer wherever possible. However, in order to do this, they would have to be able to understand the query and answer it accordingly. Unfortunately, this would take a near-human understanding of wording and semantics, making it next to impossible – until, it would appear, now.
Semantic search shouldn’t simply help search engines answer queries, though; it should ensure that users are directed to the sites that deliver the most relevant content too. With the data and knowledge base that Google now has, this should be much easier. By drilling down into user data and social activity, they can see who is delivering what their users are seeking. Whilst big brands may always gain some form of precedence, it could potentially open the door for smaller sites even further.
What is the impact likely to be?
Some are clearly concerned about the potential ramifications of Google skimming information from websites and serving it up in search results. Click-through rates are likely to suffer and advertising revenue could be reduced for a number of sites. This is as unfortunate as it is, perhaps, unavoidable.
Essentially the search engines want to act like an extremely knowledgeable person. If they know the answer, they’ll give it to you. If not, they’ll find a reliable source (or two) and provide you with a response on this basis.
So for instance, if I ask “what are the ingredients for a mojito” the current results look as follows:
Google has understood the correlation between ‘ingredients’ and ‘recipe’, providing a range of results which provide reasonable responses. However, in a world of semantic search the answer would probably simply be:
Followed swiftly by the most relevant sites.
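The answer-box behaviour described above can be sketched in a few lines. This is purely illustrative – the query keys, answers and site URLs are invented, and Google's real system obviously works nothing like a dictionary lookup – but it captures the split between a direct answer and a fallback list of ranked results:

```python
# Toy illustration of "semantic" answer-box behaviour. All data here is
# hypothetical; it simply shows the two-tier response the article describes.

KNOWLEDGE = {
    "mojito ingredients": "White rum, sugar, lime juice, soda water, mint.",
}

RANKED_SITES = {
    "mojito ingredients": [
        "example-cocktails.com/mojito",
        "example-recipes.com/drinks/mojito",
    ],
}

def search(query):
    """Return (direct_answer, sites); direct_answer is None when unknown."""
    key = query.lower().strip()
    answer = KNOWLEDGE.get(key)        # the "semantic" direct answer, if any
    sites = RANKED_SITES.get(key, [])  # traditional ranked results follow
    return answer, sites

answer, sites = search("Mojito ingredients")
```

When the engine "knows" the answer it leads with it; when it doesn't, the user gets the familiar ranked list and nothing is lost.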
For the user, this is perfect. The information is available straightaway, negating any requirement to investigate further. With more searches taking place on mobile devices, being able to access such answers without trawling through sites, which may or may not be tailored for their reduced resolution, can be hugely advantageous.
However, as mentioned, site owners may be less enamoured with any such developments.
The writing has been on the wall for quite a while now, though. Back in 2009, when they started to experiment with personalised search, we got the first tantalising look at what a truly intelligent algorithm could deliver. Since then they’ve fiddled with local results, giving greater prominence to businesses within their Places network, kicked out low-quality, high-advertising-density domains with the Panda update and even started to integrate more answers within their results – as I highlighted last month.
Essentially these have all been stepping stones towards something much bigger. Semantic search would now appear to be the answer for the thousands of those questioning what is going on at Google.
Anybody who wants to consider Plus a failure ought to look at the data that it is now feeding directly into the algorithm. As well as influencing personal results, it will inevitably have an impact on how organic listings are ranked. They tried with Wave a while ago, but have now cracked a way of dragging more data out of users – including emails and personal messages.
Knowledge is Power
As humans, we learn through gaining an understanding of a subject. However, we also know where to look for answers if we need them as well as sharing knowledge when required. If Google can crack this level of understanding, it can deliver results that will blow Bing and the others out of the water. However, to achieve it they have to take chances and risk alienating users who are keen to protect their privacy.
However, the more information we all pump into the data-sponge that is Google, the more it understands. As I attempted to explore in my recent post – What Would Happen if Google Ignored Links? – existing optimisation techniques could, conceivably at least, become obsolete in a socially-obsessed search world. If search engines can deliver results based on their understanding of a query, using thousands of data sources, what use would links be – apart from a rudimentary indicator of authority? Remember, it wasn’t all that long ago that the removal of Meta keywords as a ranking factor would have been unthinkable.
Schema.org offered a clue, +1 and Plus were major nods and even rel=”author” and associated tags were basically screaming that a future based on human understanding of queries rather than advanced mathematics was an inevitability. Whether or not they can deliver, though, is an entirely different question. If results are confused, users could quickly start looking for alternative search engines to deliver the algorithm-based results that we have all become accustomed to.
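Schema.org is the most concrete of those clues: it gives crawlers structured facts to draw on instead of prose to interpret. As a rough sketch, a recipe page for the mojito example might describe itself with schema.org vocabulary like this – the property names (`@context`, `@type`, `recipeIngredient`) are genuine schema.org terms, while the values are invented for illustration:

```python
import json

# Hypothetical schema.org Recipe data; built as a Python dict and
# serialised to JSON-LD, one of the formats used to embed structured data.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Mojito",
    "recipeIngredient": [
        "50ml white rum",
        "2 tsp sugar",
        "Juice of 1 lime",
        "Soda water",
        "Fresh mint leaves",
    ],
}

json_ld = json.dumps(recipe, indent=2)
# Embedded in a page inside a <script type="application/ld+json"> tag,
# this lets an engine read the ingredients without parsing the copy.
```

A page marked up this way hands the search engine exactly the kind of machine-readable answer the semantic results above depend on.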
With changes being made to results within months and updates planned for the next couple of years, Google is likely to be a very different kind of platform in 2014. For better or worse, we are lurching towards semantic search at a rate of knots. It’s not a time to panic or throw in the towel; however, the way in which we develop and optimise content online is going to change. By how much? Well, that’s anybody’s guess.
Remember, Google rarely do things by accident. There have been plenty of markers, all of which pointed towards a major move towards semantic search. Whilst algorithm updates have made a huge impact on rankings in the past, this latest announcement could result in a complete overhaul of how search visibility is apportioned. The big G wants to be a one-stop, community-based reference for all information. Sites that fail to meet their new standards could be lost in the wilderness, whilst others may find their content being scraped. Whether it’s a grim vision of the future or the dawning of a brave new age for search is certainly up for debate.
So what do you make of semantic search? Is it a whole lot of bluster over not a lot? Will it change the way we interact and use search engines? How will online marketing methods have to change to adapt? Are you happy or upset to hear Google are finally pushing out semantic results? Your views, as always, are most welcome.