With Google facing an antitrust investigation in Europe, the question of search neutrality has once again raised its ugly head. But can search engines really be regulated?
Google’s alleged fiddling of search rankings for Microsoft-linked companies Foundem and Ciao! has landed the company in some pretty hot water. With a European antitrust case pending, the issue of search neutrality and the monopolising of the search/advertising market is in the spotlight.
On their European Public Policy blog, Google have been quick to defend their actions. Yesterday, under the title ‘This stuff is tough’, Amit Singhal (a Google Fellow) explained just how complex it is to build and maintain a search engine. This was in response not only to the pending court action, but also to an op-ed piece in the New York Times by Adam Raff, co-founder of Foundem.
The article in question is entitled ‘Search, but You May Not Find’. In it, Raff explores the idea that search should be a regulated industry. Nonsense. Search engine technicians, as Singhal argues, tend to be some of the greatest minds in their field. They are used to programming complex algorithms in a continuous effort to improve the quality of search results. How could that ever be regulated?
A government official, no matter how technically minded, is unlikely to have the knowledge needed to understand the calculations involved in maintaining an algorithm. The algorithm itself is constantly evolving, with slight tweaks needed to ensure that issues are dealt with swiftly. Whilst you could monitor human overrides, of which Google claim there are few, ham-handed intervention in the mathematical process behind rankings would be a disaster.
Everybody wants a level playing field. Search neutrality would be nirvana for webmasters: each site in its rightful place. But that just isn’t possible, at least not at the moment. Conspiracy theorists will always tell you that they have been surreptitiously dumped from the rankings; they might even be right. But even-handedness in the world of search can’t be easy to maintain; that’s why so many people invest so heavily in their SEO.
Search engines, whether Bing, Yahoo! or Google, don’t like to be gamed. This is why they provide guidelines for what they like and what they don’t. This feeds down to SEOs and becomes the basis for future optimisation. If you choose to flout these rules, as in the case of Foundem (Chris Lake at Econsultancy has been following this story closely over the past few months; for a comprehensive round-up I recommend reading ‘Foundem vs Google redux: it was a penalty! And search neutrality is at stake, dammit!’), you have to expect that you won’t rank well. In short, Foundem’s site is an SEO car crash. No original content. No original idea. If you don’t play by the rules, you will be punished; it’s as simple as that.
Now, whether there was anything underhanded in all this, we will probably never know. But is it justification for an antitrust suit or calls for search engine regulation? Hardly. If Google have been fiddling their results, and I’m certainly not going to suggest that they have, are they the only ones to have done so?
It’s hard to avoid the fact that some of their own properties do get preferential treatment within the search engine, but is that much of a surprise? Whilst translation services, analytics providers and many others besides might have cause for umbrage, I think there should also be a little acceptance that if you go directly up against Google, you might not win. They are the largest, most visited and (most importantly) strongest site in the world. When they link to a sub-domain, you can be sure that it will have a pretty strong ranking as a consequence. That is probably why Foundem was so keen to write for the New York Times, rather than a lower-ranked site of more relevance; that strong link could have done wonders. Well, it would have done, had one been included.
If there is to be any regulation, it should be done within the industry. If Microsoft and Google weren’t concentrating on antitrust cases, they could instead be working together to improve search results. Why fill the coffers of the European Union when they could keep disputes in-house? I’m not suggesting a fully co-operative search engine, just an entente cordiale: a ‘we won’t fiddle if you don’t’-style agreement.
Okay, so that probably isn’t going to happen in a million years, but individual government legislation is even less likely, certainly in terms of its chances of success. This whole process smacks of sour grapes, whether or not they have a valid point. As Amit Singhal openly admits on the Google blog, “Search is nowhere near a solved problem… The science is probably just about at the point where we’re crawling. Soon we’ll walk.”
Search is incomplete. It isn’t perfect. It’s just that Google are developing at a faster pace than anybody else. Inevitably some will feel hard done by, but I am sure the majority (at least those who have worked hard on their SEO) are reasonably accepting of where they stand. It is a game, and tactics play a part, but everything has to be done within the confines of the rules; in this case, those rules are set out by the search engines. As with any game, there will be winners and losers – it just seems some are more gracious than others.