Microsoft SEO Toolkit Review

by Chris Simmance on 15th April 2013

There are a lot of good SEO tools out there and I use loads of them. Like many SEO professionals I use tools daily, and I will often pick one tool over another depending on the needs at the time. Recently I came across the Microsoft SEO Toolkit and have been blown away by how thorough it is as an SEO tool! In this post I will run through some of my favourite features and the reasons why everyone should download it for their sites.

I have been playing with this tool since the beginning of the year and (like most of us) I haven’t read the instructions, so this review is based on all the cool stuff I have found by clicking away and learning as I go. If I have missed any useful features, please let me know as I would love to use them too!

So, to the tool!

What does it do?

Really the question should be ‘What doesn’t it do?’, because it is so comprehensive that it can take days (depending on the site size) to trawl through all the data it gleans from a site. Below is a shortened version of the key features from Microsoft’s site.

  • Configurable numbers of requests to ensure you don’t overload the site’s processor or bandwidth.
  • Robots.txt read-or-ignore settings to control where the tool crawls your site (see the sketch after this list).
  • The ability to obey or ignore ‘noindex’ and ‘nofollow’ metatags, allowing analysis from a user’s perspective rather than a search engine’s.
  • Changeable pre-set limits for analysis: the maximum number of URLs to download, and the maximum number of kilobytes to download per URL.
  • Options to include content from set directories or from the entire site and its subdomains.
  • A detailed summary of website analysis results, viewable through a rich dashboard or via CSV export.
  • A Query Builder interface allowing you to build custom reports for a site.
  • Detailed route analysis showing the unique routes search engines can take to reach your content.
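
To make the robots options above concrete, here is a minimal Python sketch of the two checks they toggle: whether a URL may be fetched under robots.txt, and whether a page’s robots metatag would exclude it from analysis. This is my own illustration rather than the toolkit’s code (its internals aren’t public), and the URL and markup are made up:

    # Illustrative only: the two robots checks a crawl can obey or ignore.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # hypothetical site
    rp.read()

    url = "https://www.example.com/private/page.html"
    print("robots.txt allows fetching:", rp.can_fetch("*", url))

    # Crawling like a search engine means honouring the robots metatag;
    # analysing 'from a user perspective' means ignoring it instead.
    html = '<meta name="robots" content="noindex, nofollow">'  # made-up markup
    ignore_robots_metatags = True
    if ignore_robots_metatags or "noindex" not in html.lower():
        print("page would be included in the analysis")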

Basically, the tool does a deep crawl of your site as if it were a search engine and returns the data in a way that lets you choose how to read it depending on your needs. Because it returns so much data, there are likely to be areas you will choose to ignore, either for ROI reasons or because they are intentional, but if you want to run a full, comprehensive site audit this is the best place to start.
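
To give a rough feel for what that means in practice, here is a simplified breadth-first crawl in Python. It is only a sketch under assumptions, not the toolkit’s actual code: the start URL and both limits are placeholders, and a real crawler does far more (robots rules, content types, politeness):

    # Illustrative breadth-first crawl with the kinds of limits the tool exposes.
    # Not the toolkit's code; the start URL and both limits are placeholders.
    from collections import deque
    from urllib.parse import urljoin, urlparse
    import re
    import urllib.request

    START_URL = "https://www.example.com/"  # placeholder
    MAX_URLS = 500                          # maximum number of URLs to download
    MAX_KB_PER_URL = 512                    # maximum kilobytes to download per URL

    seen, queue, fetched = {START_URL}, deque([START_URL]), 0
    while queue and fetched < MAX_URLS:
        url = queue.popleft()
        fetched += 1
        try:
            with urllib.request.urlopen(url) as resp:
                body = resp.read(MAX_KB_PER_URL * 1024).decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to load
        # Stay on the same host, like the 'entire site' scope option
        for href in re.findall(r'href="([^"]+)"', body):
            link = urljoin(url, href)
            if urlparse(link).netloc == urlparse(START_URL).netloc and link not in seen:
                seen.add(link)
                queue.append(link)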

The Main Categories

However you decide to look at the data from the crawl, it is broken down into the following categories:

Violations: This category contains the bulk of the information you are going to need to get the most out of the tool. I would recommend either filtering it or changing the ordering of the columns, as some violations aren’t going to provide as much return on time invested as others. The category is broken down into the following sub-categories:

- SEO
This one is pretty self-explanatory: on the whole it contains issues that are likely to affect a site’s SEO.

- Content
Most of these violations centre on invalid code markup and, depending on your time and level of access to the site’s code, may not be high on the agenda for resolving.

- Standards
Most of these are page structure violations, such as multiple H1s.

- Performance
Violations that are likely to affect the site’s performance, for example large blocks of inline code or resources served from multiple URLs.

The other main navigable categories contain more generalised information from a site crawl and are good to use if you want a quick snapshot of potential issues.

Content: This category contains all the information on a site’s content violations, such as duplicate titles, descriptions and keywords, as well as server code summaries and pages with broken links.
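
As a quick illustration of how you might work with that data, here is a Python sketch that groups pages from the CSV export by title to surface duplicates. The column names (‘URL’ and ‘Title’) are assumptions on my part, so check them against the headers in your own export:

    # Group pages from the CSV export by title to surface duplicates.
    # 'URL' and 'Title' column names are assumptions; check your own export.
    import csv
    from collections import defaultdict

    pages_by_title = defaultdict(list)
    with open("site-export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            pages_by_title[row["Title"].strip().lower()].append(row["URL"])

    for title, urls in pages_by_title.items():
        if len(urls) > 1:
            print(f"Duplicate title '{title}' appears on {len(urls)} pages")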

Performance: This category contains all the information relating to a site’s performance violations, broken down into sub-categories covering slow pages and pages with many resources, both of which may affect performance.

Links: This category also contains sub-categories, based on the site’s internal linking: you can see the pages with the most links, any page redirects, and links in the site’s code that are blocked by robots.txt.
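
Redirect data like this becomes easy to reason about once you have it as a source-to-target mapping. Here is a small sketch that walks chains of redirects; the mapping below is made up, and in practice you would build it from the export:

    # Walk chains in a {source: target} redirect mapping; data is made up.
    redirects = {
        "/old-page": "/newer-page",
        "/newer-page": "/final-page",  # a redirect pointing at another redirect
    }

    def chain(url):
        hops = [url]
        while hops[-1] in redirects and redirects[hops[-1]] not in hops:
            hops.append(redirects[hops[-1]])
        return hops

    for start in redirects:
        path = chain(start)
        if len(path) > 2:
            print("Redirect chain:", " -> ".join(path))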

How do I use it?

There are a couple of ways you can use the report once it is generated, and what you are looking for or how much time you have may dictate the way you use the tool. There are two approaches you can take.

Using The Tool’s UI

Making use of the tool’s interface to navigate through the errors and violations is pretty straightforward, and you can segment the data easily into relevant categories.

Each of the categories is easy to read, and you can drill down further into individual issues to see where the actual problem is; in some cases the tool also gives advice on the best-practice solution. Personally I find this way of viewing the data very good for a cursory, top-level look at a site, but if you want to drill down properly I prefer the other method.

Using The Tool’s CSV Export

This is my favourite way to read the data, as you can filter it in Excel however you like and move each category into different tabs to structure the information as you see fit. I have found lots of site issues that I wouldn’t have noticed with some other tools, such as links that redirect to already-redirected links, or pages with large inline scripts.

Using this view you can segment based on priority or the site’s needs. For example, on a site where making the changes wouldn’t provide a good ROI, I have moved the ‘Invalid mark-up’ errors into a low-priority tab to be tackled once the higher-importance elements are fixed.
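
If you would rather script this than click through Excel, the same segmentation can be done in Python with pandas. This is only a sketch, and the file and column names are assumptions to adjust against your own export:

    # Split an exported violations CSV into high and low priority sheets.
    # Needs openpyxl (or xlsxwriter) installed for the Excel output.
    import pandas as pd

    df = pd.read_csv("violations-export.csv")  # hypothetical export file name
    low = df["Violation"].str.contains("mark-up|markup", case=False, na=False)

    with pd.ExcelWriter("prioritised-audit.xlsx") as writer:
        df[~low].to_excel(writer, sheet_name="High priority", index=False)
        df[low].to_excel(writer, sheet_name="Low priority", index=False)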

Ok so HOW do I use it?

Below I will show you how to run your first report and how to drill into the information or export it to a CSV so that you can jump right in and make good use of this fantastic tool. Just follow the screenshots and you can’t fail! :)

The Initial Interface – This is what you will see before setting up a crawl and every time you start the tool. There will also be a short list of the most recent crawls.

[Screenshot: Initial Interface]

Setting Up A Crawl – Complete the main information and, if you need to, adjust the more detailed settings such as the number of URLs to crawl.

[Screenshot: Crawl Setup]

Crawl Overview – This displays an overall summary of the errors and violations that a crawl has found.

[Screenshot: Report Overview]

Crawl Violations Drill-down (Category Level) – Clicking the Violations tab on the left of the overview display shows a full run-down of all the violations individually.

[Screenshot: Interface Drilldown]

Crawl Violations Drill-down (Violation Level) – Double-clicking any one of the violations shows more detail about the exact issue and, in most cases, how to resolve it. In this view you can also see the other violations grouped under the same or a similar issue.

[Screenshot: Interface Violation Drilldown]

Export To CSV – This menu allows you to export all of the data or just a portion of it. I always export everything so I can use the filters and formatting in Excel to get the information I need.

[Screenshot: Export To CSV]

You can download the tool directly from the Microsoft site, which also has more information on getting started with it.

I hope that after reading this you all go straight to the Microsoft site, get the tool and start using it on your sites. As you can tell, I’m a fan.

If you have any questions about elements of the tool, or have a different opinion, please feel free to add a comment below.

Chris Simmance

Chris has worked in the travel industry for the last 8 years, much of that working overseas in ski resorts, so he has a fantastic understanding of thriving in competitive sectors. His last project was social media management and website development for a leading travel company.

9 Comments

  • Joydeep Deb 16th April 2013

    Hi Chris, I will download and try the Microsoft SEO Toolkit and see what extra data/reports I get compared with Google Webmaster Tools, Screaming Frog, etc.

    But it’s definitely worth a try :)

    Thanks,
    Joydeep

  • Scott Pittman 16th April 2013

    Just downloaded this on the back of your review Chris. Great little tool! Just plug in your url and it goes off and generates a huuuge report of all sorts of useful stuff. Good find!

  • Donna 17th April 2013

    Hi Chris

    I will download this great little tool. It is a must for all webmasters and marketers. Keep up the good work.

  • Chris Simmance 17th April 2013

    Thanks for the comments guys. It’s such a useful tool and I use it daily. It would be interesting to know whether you use the export or the tool’s UI to sort through all the violations.

    Let me know! :-)

    • Martin Hall 17th September 2013

      The UI doesn’t really offer much. I’d recommend always running an export, and playing around with it in Excel for a bit. Plus, then you can easily set up a workbook and include other crawls, analytics, etc. to cross-check data.

  • Adam Martin 17th April 2013

    Thanks for this Chris! Really interesting to read!
    I am going to download this tool now thanks to your recommendation!

    I will also recommend to friends!

    Thank you!

  • Taylor Selway 17th April 2013

    I’ve been meaning to give this a go. I definitely will now. Like you say, another tool to add to the collection.

  • Charlie Arehart 5th November 2013

    Chris, thanks for the review. Here’s a question about the tool that I can’t find answered anywhere: when running it against a site with a few hundred pages/links, it can overwhelm the visited server – more of a problem if the pages being served are not static HTML but dynamic pages, such as PHP, .NET, ColdFusion, etc.

    Have you noticed this? I realize you point out that we can limit the number of pages to visit, but it would be more helpful if there were simply a way to throttle its page request frequency, so that it visits no more than one page per second. I’d trade a slower analysis for not overwhelming the site I’m analyzing.
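
    To be clear, the kind of throttle I mean is as simple as this little Python sketch (purely illustrative, with made-up URLs; as far as I can tell it is not a feature of the tool itself):

        # Illustrative only: fetch no more than one page per second.
        import time
        import urllib.request

        for url in ["https://www.example.com/a", "https://www.example.com/b"]:
            urllib.request.urlopen(url).read()
            time.sleep(1)  # simple client-side throttle between requests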

    In fact, this is important to the analysis: I noticed that my analysis of my own site found thousands of “violations”, but it turned out nearly every one was the result of a failure of that particular page to render, because the site was overwhelmed by the tool making so many visits in a short time. (In some cases, where it might have said a page was missing a description or something, I was actually seeing an error page from the server indicating that too many pages were running at once from this client.)

    Even if you (or other readers) don’t know of a way to throttle the tool, I want to leave that as a point of consideration should anyone else run it.

    Thanks for any thoughts.

  • Graham 6th August 2014

    Just downloaded it today after finding it.
    Not sure why I never used it before. Now I have a HUGE list of stuff to work on.

    Once this tool is 100% happy with the site, I will see how it helps rankings (if at all).

    Mental note of today’s date. Let’s see what happens in 30 days from now!
