Get in, pour coffee, log on, Tweet something glib, then fire up Screaming Frog: that seems to be my routine of late. The reason Screaming Frog features is that it's just sooooo useful. Now, no one tool can lay claim to ticking all the boxes for every piece of SEO work, so don't sacrifice all the others, but you can cover so much SEO ground with Screaming Frog that it merits a post of its own.
In this post I am focusing on activities that fall into the technical SEO and on-page SEO realms.
Crawling a site is a necessity regardless of your tool of choice. That said, it's worth understanding how crawling can naively be misused. Excessive and/or repeated crawls can eat up a domain's bandwidth, potentially slowing it down for legitimate users, so take care and use it responsibly.
First step is to suss out what you want to look at: the whole domain or a portion of it. To crawl an entire site, simply enter your domain and hit ‘Start’. If you’re looking to include subdomains such as blog.domain.com you’ll need to amend the settings in the Spider Configuration menu. You’ll find this here:
Configuration –> Spider –> Select ‘Crawl All Subdomains’
If you’re only interested in looking at a particular section of the site you can limit the crawl using the Inclusion setting found in the Configuration menu again:
Configuration –> Include
Now you can add the URLs you specifically want to crawl. Nifty use of regex can make this even easier. To see how regex can make life loads easier in lots of other ways, check out https://www.rexegg.com.
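To illustrate how an Include pattern behaves, here's a minimal Python sketch. The domain, paths and pattern are all made up for the example; Screaming Frog matches its Include/Exclude regexes against each URL it discovers in much this way.

```python
import re

# Hypothetical Include pattern in the style Screaming Frog expects:
# a regex matched against each discovered URL.
include_pattern = re.compile(r"https://www\.example\.com/blog/.*")

urls = [
    "https://www.example.com/blog/technical-seo-tips",
    "https://www.example.com/shop/widgets",
]

# Only URLs matching the pattern would be crawled.
crawlable = [u for u in urls if include_pattern.match(u)]
print(crawlable)  # ['https://www.example.com/blog/technical-seo-tips']
```

The Exclude setting works the same way in reverse: matching URLs are dropped from the crawl instead of kept.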
Conversely if there is a section or chunk of content that you’re not interested in, select ‘Exclude’ and follow the same steps. Once the crawl is complete your eyes are likely to be drawn to the tabbed console sections:
These break down what Screaming Frog was able to crawl and return from your site; it then breaks it out into neat sections defined by common features. It's easy at this point to let the tool take over your attention by clicking on different tabs and getting absorbed in what you find. We're not going to lose ourselves here; we want to get a sense of the state of our site.
An ideal place to start is to find the quick wins; find where the site trips itself and users up.
Broken links are a fact of life. If your site is updated frequently you’ll find links that need updating or pages that have disappeared.
Roll up your sleeves and head to Bulk Export –> Response Codes –> Client Error (4xx) Inlinks and you'll export a list of broken links along with the pages they are found on, so you can fix them or redirect users to more appropriate pages.
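If you want to turn that export into a per-page to-do list, a few lines of Python will group the broken destinations by the page linking to them. The URLs, and the assumption of 'Source' and 'Destination' columns, are invented for this sketch, so check them against your actual export headers.

```python
import csv
import io
from collections import defaultdict

# Hypothetical rows in the shape of a 'Client Error (4xx) Inlinks'
# export: the page linking out (Source) and the broken URL (Destination).
export_csv = io.StringIO(
    "Source,Destination,Status Code\n"
    "https://example.com/about,https://example.com/old-team,404\n"
    "https://example.com/blog/post-1,https://example.com/old-team,404\n"
    "https://example.com/blog/post-1,https://example.com/gone,410\n"
)

# Group the broken destinations by the page they appear on,
# giving a fix list per page.
fixes = defaultdict(list)
for row in csv.DictReader(export_csv):
    fixes[row["Source"]].append(row["Destination"])

for page, broken in sorted(fixes.items()):
    print(f"{page}: {len(broken)} broken link(s)")
```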
Similarly, server errors can be investigated following the same path (but selecting Server Error (5xx) Inlinks).
On aged domains redirect chains can be common, again due to content changing and redirects being implemented. This time though, a page gets redirected to another page that also has a redirect in place. This is not a catastrophe in itself, as Google has said Googlebot will follow a handful of redirect hops (around five) before abandoning that crawl attempt, but a redirect chain of any length is worth resolving.
To do this export the Redirect Chains report, Reports –> Redirect Chains:
You will want to redirect the original link in need of changing, as well as the destination URL that has subsequently been redirected. The report will show you:
Where more than one Redirect URL exists, you have a chain of redirects to sort out. So in a chain of 2 redirects, the original link as well as the subsequent redirect should be sent the way of the final destination. In the same export you'll see any Redirect Loops. These occur when a chain of redirects eventually points back to a URL already in the chain, so it never resolves. Not the best way to manage a page intended for ranking.
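The chain-versus-loop logic is easy to sketch in Python: follow each redirect until you hit a URL with no further redirect (the final destination) or a URL you've already visited (a loop). The URLs below are hypothetical stand-ins for what you'd distil from the Redirect Chains export.

```python
# Hypothetical redirect map distilled from a Redirect Chains export:
# each source URL mapped to the address it redirects to.
redirects = {
    "/old-page": "/interim-page",
    "/interim-page": "/final-page",
    "/loop-a": "/loop-b",
    "/loop-b": "/loop-a",
}

def resolve(url, redirects):
    """Follow redirects to a final destination; return None on a loop."""
    seen = set()
    while url in redirects:
        if url in seen:
            return None  # redirect loop, never resolves
        seen.add(url)
        url = redirects[url]
    return url

print(resolve("/old-page", redirects))  # /final-page
print(resolve("/loop-a", redirects))    # None
```

Every URL that resolves to a final destination other than its immediate target is part of a chain, and both it and the interim hop should be pointed straight at that final destination.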
You can also use Screaming Frog to highlight site structure needs. Within the main console, the column on the far right has 3 tabs: Overview, Site Structure and Response Times. Select Site Structure and you'll see a bar graph indicating the amount of content found one click deep, two clicks deep etc…
At a glance, you can judge if too much content sits a click or two too deep for either your user or a search engine. Content depth at a URL level can be found when looking through a full export and is labelled ‘Level’. You can pinpoint URLs that may need repositioning within your site architecture.
Using the full export from Screaming Frog, once you’ve applied some smart filtering you can highlight pages in need of attention by looking for low Word Count as well as clunky URL structures that are lengthy, use underscores or other inappropriate characters.
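Here's one way that filtering might look in Python against the full export. The thresholds and the sample rows are assumptions for illustration, not recommendations, so tune them to your own site; real exports have many more columns, but 'Address' and 'Word Count' are the ones filtered on here.

```python
import csv
import io

# A hypothetical slice of the full internal export.
export_csv = io.StringIO(
    "Address,Word Count\n"
    "https://example.com/a-tidy-url,850\n"
    "https://example.com/this_one_uses_underscores_and_rambles,120\n"
)

THIN_CONTENT = 300   # assumed word-count threshold, tune to taste
MAX_URL_LENGTH = 75  # likewise an assumption

# Flag pages that are thin, have an unwieldy URL, or use underscores.
flagged = [
    row["Address"]
    for row in csv.DictReader(export_csv)
    if int(row["Word Count"]) < THIN_CONTENT
    or len(row["Address"]) > MAX_URL_LENGTH
    or "_" in row["Address"]
]
print(flagged)
```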
Analysing your domain’s on-page health is vital and something that should be maintained, not left until worrying dips in Google Analytics data prompt a post-mortem. Screaming Frog allows quick but thorough analysis by again grouping your content using common features.
For instance, selecting Page Titles you can export a list of URLs in need of TLC for being too long, too short, duplicated or not even there in the first place.
This exercise can be carried out again, this time in the Meta Description tab. You can also generate the same report using Reports –> SERP Summary.
The extra-handy pixel width measurement brings into view another feature of Screaming Frog that can be used to make your life easier. See List Mode for more on that.
British Cycling supremo Sir Dave Brailsford talks of ‘The Aggregation of Marginal Gains’, a means of achieving significant gains through many, many slight improvements.
Using Screaming Frog’s data pull specifically for images, you can see which images have less than optimal file names as well as missing alt tags. Pick out the images with chunky file sizes that may be slowing your site down. Simple, small changes that, as part of a whole glut of changes made as a result of an SEO health check, can compound to help achieve success.
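A sketch of that image triage in Python, again using made-up rows and an assumed size budget (check your own export's column names before reusing this):

```python
import csv
import io

# Hypothetical rows in the shape of an images export: file address,
# size in bytes, and any alt text found.
images_csv = io.StringIO(
    "Address,Size (Bytes),Alt Text\n"
    "https://example.com/img/hero.jpg,482000,\n"
    "https://example.com/img/team.jpg,95000,The team at HQ\n"
)

MAX_BYTES = 100_000  # assumed per-image budget

# Collect each image's issues: missing alt text and/or oversized file.
flagged = {}
for row in csv.DictReader(images_csv):
    issues = []
    if not row["Alt Text"]:
        issues.append("missing alt")
    if int(row["Size (Bytes)"]) > MAX_BYTES:
        issues.append("oversized")
    if issues:
        flagged[row["Address"]] = issues

print(flagged)
```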
By default Screaming Frog crawls the domain you enter in Spider mode. Changing the mode to List (Mode –> List) allows you to supply a list of URLs you specifically want to crawl.
This is ideal if you are looking to review work you may have carried out in Excel on a set of URLs. You’ll get the full benefit of the technical features mentioned in the first part of this post as well as seeing how new titles appear in the SERP, the pixel count, and the ability to change device type to review how your new shiny snippet will appear.
Having gone through all of that, you can turn to your competitors and perform some pretty powerful research into their strengths and weaknesses.
You never know, your fiercest competitor may still be using the meta keywords tag: a clear signal of what a page is targeting and a very handy keyword research shortcut. Check out their page titles too. Is there any content that can be “repurposed” or even improved upon?
Used to complement tools that address the many facets of auditing site performance (link diagnostics tools, web analytics platforms etc…), Screaming Frog offers up so much guidance on improving a site. So remember to use it with care, but use the bejeesus out of it!