A recent study by Slingshot SEO analyses the click-through rate (CTR) for the top ten positions in Google, revealing findings that differ in interesting ways from previous studies.
‘Mission ImposSERPble: Establishing Google Click-Through Rates’ was conducted to build on previous work from the likes of Optify and Enquiro. Slingshot found that the number one position in Google’s SERPs yielded 18.2% of clicks, a healthy figure, but somewhat smaller than previous studies had reported.
Slingshot’s study was conducted over a six-month period, during which they examined 324 keywords and the sites that were clicked on as a result of searches. Drawing on more than 170,000 user visits, it is one of the largest studies of comparative CTR to date.
Using a range of tools, Slingshot SEO examined positions 1 to 10 in Google’s SERPs for the 324 keywords. Both the keywords and the websites needed to hold a stable rank in order to provide a fair result. From there they were able to calculate the CTR for each position.
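The calculation itself is straightforward: for each ranking position, CTR is the share of searches that resulted in a click on the page holding that position. A minimal sketch of that arithmetic is below; the keyword figures are purely illustrative, not Slingshot’s actual data.

```python
def ctr(clicks: int, searches: int) -> float:
    """Click-through rate as a percentage of total searches."""
    return 100.0 * clicks / searches

# Hypothetical figures for one keyword whose pages held stable ranks:
# position in the SERP -> clicks recorded over the study period.
observed = {1: 182, 2: 100, 3: 72}
searches = 1000  # total searches recorded for the keyword

for position, clicks in observed.items():
    print(f"position {position}: CTR {ctr(clicks, searches):.2f}%")
```

With these made-up numbers, position one works out at 18.2%, matching the headline figure only by construction.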
Whilst the top spot revealed a CTR of 18.2%, positions two and three delivered CTRs of 10.05% and 7.22% respectively. These three positions alone account for a substantial share of clicks, especially to a site’s home page: the number one spot saw 35% of clicks land on the home page, while positions two and three saw 19% and 14% respectively.
This is an important finding, as it confirms that getting as close to the top as possible, ideally with the home page, matters most when it comes to ranking in search engines.
However, what this report does indicate is that CTR isn’t as high as it once was, or as high as other reports have suggested. The reasons why are unclear, but Rob Young at Search Engine Watch has compiled a number of potential causes.
From seasonality to long-tail keywords, and from user perceptions to the data sets used, Young examined a variety of reasons for the lower CTR on Google’s top spots. The most likely explanation, however, is that methodological differences between the studies have skewed the results. Slingshot themselves admitted that the Google AdWords tool they used tends to round off and overstate search-volume figures, which in turn makes the calculated CTRs understated.
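The direction of that bias is worth spelling out: search volume is the denominator in the CTR calculation, so an overstated volume figure drags the computed CTR down. A quick illustration with invented numbers:

```python
# Illustration of the bias described above: the same number of clicks
# divided by an inflated search-volume figure yields a lower CTR.
clicks = 182
true_searches = 1000       # actual searches (hypothetical)
reported_searches = 1200   # hypothetical overstated tool-reported volume

true_ctr = 100 * clicks / true_searches          # 18.2%
reported_ctr = 100 * clicks / reported_searches  # roughly 15.2%
print(f"true CTR: {true_ctr:.1f}%, computed from reported volume: {reported_ctr:.1f}%")
```

So a study leaning on rounded-up volume data would systematically report lower CTRs than one measuring real visits, which is consistent with the gap between Slingshot’s figures and earlier work.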
What this shows is how variable the results can be when different analytical tools are used. On top of that, many other variables can be factored in, as Young acknowledges. Google itself has made massive changes to its search engine and SERPs: algorithm updates, a new user interface, the rise of mobile search and social media [See: Google+: The Long Awaited Social Network], as well as SERPs that integrate a number of Google’s own products such as videos, images and places.
Could the results of Slingshot’s study accurately reflect the changes Google has made and the resulting value of Google’s top spot? Let’s not get carried away: Google is still the dominant search engine and still accounts for significant traffic and CTR. But with so many variables to consider, different studies are likely to yield different results; it’s the nature of the beast.
Search engine technology is evolving, and so is the digital marketing industry. The more experienced professionals amongst you may remember the days of gleefully stuffing keywords into your copy to boost your rankings, blindly spamming strangers to join your email lists and easily securing media coverage for your thinly veiled advertisements.
Site speed is an important area of website optimisation that SEO professionals are becoming increasingly concerned about.