A long-trusted servant of mine, Screaming Frog, had itself a little upgrade back in June. A number of changes were made, but the standout features that caught my attention were the new ‘Custom Extraction’ and, OMG… GOOGLE ANALYTICS INTEGRATION!
SEOs, digital marketers and all other Holistic Inbound Internet Botherers rejoiced at the news, for at last there would be a means to combine Google Analytics data with HTML elements in a more convenient format than tinkering with the inner workings of the Google Analytics interface.
In this post, we’ll get set up and then touch on what we can do with the results of the crawl.
Point of order: crawl responsibly, folks. There are things to bear in mind when crawling a website, as you are sending a lot of requests to a server in a very short space of time.
1. Before you start a crawl, you need to enable access to the GA API. Go to Configuration –> API Access –> Google Analytics and a box will appear:
2. Click ‘Connect to New Account’ and use the email address linked to your Google Analytics account:
3. Now you can pick the Google Analytics account, property and view for the domain you’re looking to crawl. Do this from the drop-down menus and don’t hit ‘OK’ just yet:
4. Now pick your date range. Sadly, there isn’t a comparison function here; that would be even sweeter:
5. Let’s choose some metrics to look at. The current limit is 20, but a smaller set of metrics means a smaller API call that is quicker to execute, which is why you’ll see a default set of 10:
Once you’re happy with your set-up (and have added a domain to crawl!), hit the Start button and go get a coffee.
On your return, that shouty frog, as he is known at Koozai HQ, will have returned a crawl of all the HTML pages and the console will look a little like this:
Note that the Export drop-down has three pre-defined segments ready to export:
• Sessions Above 0
• Bounce Rate Above 70%
• No GA Data
But we can do even better than that. By switching to the Internal tab and then selecting the HTML option from the Export drop-down, we can see the Google Analytics data alongside the other HTML elements extracted from the crawl:
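If you prefer scripting to Excel, the same export can be loaded with a few lines of Python. Here is a minimal sketch using the standard library; the sample data and column names (‘Address’, ‘GA Sessions’ and so on) are assumptions for illustration, so check them against the header row of your own CSV:

```python
import csv
import io

# Illustrative stand-in for a Screaming Frog "Internal: HTML" export.
# Column names here are assumptions -- check your own export's header row.
sample = """Address,Title 1,GA Sessions,GA Bounce Rate
https://example.com/,Home,1200,45.2
https://example.com/blog/,Blog,300,82.1
https://example.com/contact/,Contact,0,0.0
"""

# Swap the StringIO for open("internal_html.csv", newline="") on a real file.
rows = list(csv.DictReader(io.StringIO(sample)))

# Replicate the "No GA Data" segment: pages with zero sessions in the range.
no_ga = [r["Address"] for r in rows if int(r["GA Sessions"]) == 0]
print(no_ga)  # pages worth checking for tracking gaps or thin content
```

From here the rows are plain dictionaries, so any of the segments in the Export drop-down can be rebuilt with a one-line filter.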
I open up the CSV export in Excel and get tinkering. First of all, I hide the following columns, mainly because they aren’t needed:
• Meta Keyword
• Meta Keywords 1 Length
• The tag lengths
• Meta Refresh 1
• Response Time
• Last Modified
What’s left shows me where there are HTML improvements to make, alongside the various GA metrics you may want to look at.
You may have different preferences or scenarios, but these choices work for me when it comes to combining GA data with the key elements of a page that should be, or have been, optimised and should have their performance checked.
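The same tidy-up can be scripted, which is handy if you run this export regularly. A sketch with the standard library follows; the exact header names in the drop set are assumptions, so match them to your own export before relying on it:

```python
import csv
import io

# Columns to drop, mirroring the Excel tidy-up above.
# Exact header names are assumptions -- match them to your own export.
DROP = {"Meta Keywords 1", "Meta Keywords 1 Length", "Title 1 Length",
        "Meta Description 1 Length", "Meta Refresh 1",
        "Response Time", "Last Modified"}

# Illustrative sample; use open("internal_html.csv", newline="") for real data.
sample = """Address,Title 1,Title 1 Length,Response Time,GA Sessions
https://example.com/,Home,4,0.21,1200
"""

reader = csv.DictReader(io.StringIO(sample))
kept = [c for c in reader.fieldnames if c not in DROP]

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=kept, extrasaction="ignore")
writer.writeheader()
for row in reader:
    writer.writerow(row)

trimmed = out.getvalue()  # slimmed-down CSV, ready for Excel or further scripting
print(trimmed)
```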
Now, on to practical uses. I’m going to look at a few ways to use this data to analyse your content.
When you first start working on a project, you will typically run a crawl of your site as well as pick through the various Google Analytics reports to build up a picture of the site you’re looking at.
Now you can add the GA data and specify as large a date range as possible to see metadata such as title tags and header status alongside pageview or ecommerce data.
For an SEO focus, if you set up the API access to pull Organic sessions specifically, you can pinpoint performance measures that cover SEO and conversion analysis:
In your export, add a filter to each column and freeze panes so your URL column remains in view, then get busy isolating pages by a common theme (a subfolder, for instance).
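Grouping by subfolder is also a one-liner away in Python. This sketch buckets URLs by their first path segment; the URLs are made up for illustration:

```python
from collections import defaultdict
from urllib.parse import urlparse

# Group crawled URLs by their first path segment -- a quick way to isolate
# a subfolder such as /blog/ for a focused look. URLs are illustrative.
urls = [
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
    "https://example.com/products/widget",
]

by_folder = defaultdict(list)
for u in urls:
    segments = urlparse(u).path.strip("/").split("/")
    by_folder[segments[0]].append(u)

print({folder: len(pages) for folder, pages in by_folder.items()})
```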
From here, you may look for pages that stand out. These can then be investigated further and improved from a usability, conversion or search engine perspective. Here are some suggestions:
• High bounce rates and low conversions – Is the meta optimised? What is the word count? When was the last content audit?
• Pages duplicating meta or URLs with/without trailing ‘/’ – Which has had the most sessions? This could point you in the right direction of which URL to canonicalise.
• For a range of products, are there any conversion or session outliers that significantly outperform the rest? What can be replicated or what needs removing?
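The first of those checks translates directly into a filter. Below is a hedged sketch of flagging high-bounce, low-conversion pages; the GA column names and the 70%/1% thresholds are assumptions to tune against your own data:

```python
# Flag pages matching the first suggestion: high bounce, low conversions.
# GA column names and thresholds are assumptions -- tune them to your data.
rows = [
    {"Address": "/blog/old-post", "GA Bounce Rate": 85.0,
     "GA Goal Conversion Rate All": 0.2, "GA Sessions": 400},
    {"Address": "/products/widget", "GA Bounce Rate": 40.0,
     "GA Goal Conversion Rate All": 3.1, "GA Sessions": 900},
]

flagged = [r["Address"] for r in rows
           if r["GA Bounce Rate"] > 70
           and r["GA Goal Conversion Rate All"] < 1.0]
print(flagged)  # candidates for a content audit
```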
Screaming Frog has long been one of the most popular tools among SEOs and the like. Ideally, you should use as few tools as possible, but a one-stop SEO shop doesn’t yet exist. That said, with Google Analytics integration, Screaming Frog is now a step closer.