Semantic search, the core idea of web 3.0, is all about understanding the searcher and the terms they use in order to provide answers, not just results. Could Google personalisation, along with other factors, be ushering in this new era of the Internet and of search?
The Google algorithm has long been dedicated to refining the quality of the average user’s search results. Of course, the SEO industry has for some time been able to push websites up the rankings for appropriate key phrases, using increasingly subtle techniques so as not to upset the algorithm as it becomes less susceptible to ‘gaming’.
Google are now all about quality. They have made it abundantly clear that they want their SERPs to be full of only the most relevant sites – ones that visitors will use, enjoy and revisit; Google are no longer interested in people who invest more time in their SEO than they do in their site or business. As if to prove the point, along came Vince, followed by announcements about Caffeine and now personalised search.
Personalised search isn’t in itself a wholly new thing. If you log in to your Google account and search whilst still logged in, Google will keep a record of where you went and give preference to those sites in future related searches. Now, however, they are going one step further: personalised search settings and data are appearing on SERPs everywhere, regardless of whether the user is signed in or not.
Semantic search is still very much a theory. We are in the grip of web 2.0, with companies – particularly those in search – and Internet innovators like Tim Berners-Lee desperately trying to crack the web 3.0 code. Microsoft call their Bing search engine the ‘decision engine’, an allusion to their future aspiration of providing a user-orientated semantic search: being able to understand the search strings that are entered and automatically provide exactly what a user is looking for.
But for a semantic Internet to work, there needs to be a core engine capable of digesting all the knowledge, information and data spinning around online and translating it into meaningful, intuitive results; not easy. Through universally available personalised search, Google is tapping into the mindset of each searcher and providing them with the SERPs that, in effect, they have been manufacturing for themselves. But these are still just website search results, often surrounded by live search, news, local results and images. As yet, neither Google nor Bing has been able to give you the actual answers themselves – they are just a means to an end.
One thing I found interesting, aside from the aforementioned personalised search, was the inclusion of Wolfram Alpha in Bing [See: Bing to Ring the Changes]. Okay, so Bing is only the third most popular search engine and Wolfram Alpha is something of a niche informational hub, a data aggregator that can provide semi-intelligent results; this isn’t a merger of superpowers. But adopting the Wolfram Alpha ‘computational knowledge engine’ gives far greater credence to Bing’s claim to be an ‘answer engine’ – pushing it further towards semantic search and web 3.0.
As highlighted in my recent (snappily titled) blog post, ‘What to Look Out For in the World of Search in 2010’, web 3.0 is likely to start becoming more of a reality in the coming months. A fully intelligent semantic Internet is not here yet. Whoever masters it first, assuming that it isn’t Google, could well be the one to finally topple the search giant. The step from algorithm to aggregator, as found in real-time search, could well help pave the way [See: Google Goes Real-time].
Semantic search is about intelligence: decoding the audience’s search intentions. To achieve it, we need to move away from algorithm-based mathematics and towards computational understanding of language. This will of course alter how we do SEO in the future, but for every wall placed in the way of optimisation, there is always a way of getting over, round or under it to ensure online visibility.