2012 was a year of change for the SEO industry and it already feels as though 2013 will be every bit as turbulent. It has never been more important to ensure your site’s SEO is healthy.
This guide is intended as a comprehensive resource for carrying out a review of your site’s health and SEO performance. It is also designed to ensure that your site is not just healthy now, but will hold up against the promise of more Google algorithm updates and be protected against manual penalties. You can also download my free PDF checklist to use alongside your own SEO health review.
So many sites are suffering the wrath of Google updates and manual penalties, and Search Engine Roundtable even went as far as claiming that you may never recover from Google’s Penguin. Many site owners are battling against the tide to climb back up the SERPs and reclaim the top ranking positions of yesteryear, while some have indeed reached the difficult conclusion that they cannot recover and are starting again with fresh sites and domains.
Don’t wait until it’s too late. Be proactive, not reactive, with your SEO strategy.
The Google Penguin algorithm update has undeniably transformed the way we “build” links, completely invalidating link building techniques that SEOs relied upon to build website authority for many years. Tactics such as submitting to link directories and negotiating placement of site-wide blogroll links went from being quick wins to causes of ranking penalties almost overnight. Many sites have lost favour with Google over the last 18 months and are struggling relentlessly to restore what has been lost.
In addition to the damage caused by this major algorithmic update, the introduction of manual penalties early last year has added further concern for webmasters and SEOs. Google’s dedicated anti-web-spam teams are manually reviewing sites across the web to identify anyone in breach of its Webmaster Guidelines. Depending on the severity of the infringement, sites have received anything from an Unnatural Links warning in Webmaster Tools to complete de-indexing from Google.
Despite all this, Google is far from finished in its fight against web-spam. For every site punished for its unnatural link profile, there are still several more out there getting away with the same crimes. It is a common frustration of webmasters whose sites have been “hit” by Google for their backlinks that their competitor’s sites are still ranking well with similar or even worse links.
However, the risk of ranking drops and de-indexing is far from over. Many sites with toxic link profiles that have so far gone unnoticed will not be so lucky after the release of Penguin 4 (also dubbed Penguin 2.0 by Matt Cutts), a new and more powerful Penguin algorithm update promised for later this year.
It has never been more critical that websites clean up their backlink profiles proactively, ahead of the certain storm. It is no longer enough to sit and hope that you will be safe, or to plead ignorance of poor quality backlinks. Google holds websites responsible for their own backlinks, and with the introduction of the Disavow tool there is no longer any excuse.
Before you can begin to cleanse your links, you must have comprehensive data and an in-depth understanding of your backlink profile. To gain that insight, use as many of the following link analysis tools as possible to build a broad picture of your backlink profile:
Assemble the data from as many tools as possible by exporting data into CSV spreadsheets and compiling in Microsoft Excel. No single tool will locate all of your site’s backlinks, so in order to get as big a picture as possible, bring all the information together into a single view and remove the duplicates.
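As a sketch of the merge-and-dedupe step, the following Python script combines several CSV exports and removes duplicate URLs. The function name and the "Source URL" column are assumptions; different tools label their export columns differently, so adjust the column name to match your exports.

```python
import csv
import glob

def merge_backlink_exports(pattern, url_column="Source URL"):
    """Merge backlink CSV exports from several tools and drop duplicate URLs.

    `pattern` is a glob such as "exports/*.csv"; `url_column` is the column
    holding the linking page's URL (varies by tool).
    """
    seen = set()
    merged = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Normalise lightly so trivial variants count as duplicates.
                url = row.get(url_column, "").strip().lower().rstrip("/")
                if url and url not in seen:
                    seen.add(url)
                    merged.append(row)
    return merged
```

The same result is achievable in Excel with Remove Duplicates; a script simply makes the process repeatable when you re-run the audit later.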
With this broad spectrum of link data, start your manual analysis. This can take hours, days or even weeks depending on how many backlinks you have. The important thing is that you know what you are looking for.
In order to decide whether a link is good or bad, view it on the page and first consider how it might have got there. Did you or someone at your business pay the hosting site for the link? Was it submitted purely to generate “link juice”? If so, it already doesn’t seem like a “natural” link. Mark it for removal or disavow.
Next, look at the quality of the hosting sites. Look at the site’s PageRank and any other outbound links on the same page. Think about whether the referring site looks and feels credible and relevant to your industry. If it looks like a bad neighbourhood to be in, you don’t want your link there. Add it to the pile.
Link Detox and Link Risk help to analyse the quality of your links for you, giving ratings on the quality or risk of each link. This information should help speed up the analysis process for sites with large backlink profiles, but be sure to manually review the results, as no link analysis tool can be 100% accurate.
As a rule of thumb when it comes to backlink analysis; if in doubt, throw it out. If you have to put too much thought into whether or not the link is providing genuine value to your link profile, it probably isn’t. The more links you analyse, the better your understanding will be of good and bad links. This will help you clean your link profile over time.
Once you have a list of dodgy links you want removed, it’s time to get in touch with the referring sites. This can be a huge and daunting task and will certainly take time if you have a lot of backlinks. What’s more, the success rate (i.e. how many links are actually removed by the referring sites vs. how many removal requests you send) is genuinely rubbish. Nonetheless, it’s a necessary step in strengthening a link profile, just as it is in recovering one.
Collect the contact details for the referring sites by looking for “contact” or “about us” pages on the site. If they do not have these sorts of pages, try looking in the page footer or even in the source code for an email address. As a last resort, try to obtain an email address by running a WhoIs lookup against the domain. Nine times out of ten you will find an email address or contact form in one of these ways. Record these details in your spreadsheet of link removals and construct a polite message to send to your contact requesting removal of the link to your site.
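If you are gathering contact details at scale, a short script can pull candidate email addresses out of a page's HTML before you fall back to a WhoIs lookup. A rough sketch (the regex is a simplification and will not catch every valid address, but it covers the common cases):

```python
import re

# Simplified pattern for common email address formats.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def find_emails(html):
    """Return unique email addresses found in raw HTML, preserving order."""
    found = []
    for match in EMAIL_RE.findall(html):
        if match not in found:
            found.append(match)
    return found
```

Run it against the contact page, the footer, and the raw page source, then record whatever it finds in your link removal spreadsheet.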
Log every effort you make to get your links removed. Google expects you to work hard to remove harmful backlinks and will want to see proof of your efforts should you ever need to submit a reconsideration request.
For those links that could not be removed, due to a lack of contact details or no action from the site when contacted, prepare a disavow document to submit to Google. This tells them which links you do not want counted when they crawl your site and evaluate your backlinks. It tells Google you have tried to remove the links, knowing that they aren’t natural, and have been unsuccessful.
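For reference, the disavow file Google accepts is a plain text document: lines beginning with # are comments, domain: lines disavow every link from a domain, and bare URLs disavow individual pages. A minimal example (domains and dates are hypothetical):

```
# Contacted site owner on 01/03/2013 and again on 15/03/2013; no response
domain:spammydirectory.example

# Individual unnatural link on an otherwise reasonable site
http://blog.example.net/2012/06/paid-links-post/
```

Using the comment lines to log your removal attempts helps show Google the effort you have made.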
By going through this process and keeping your link profile clean from unnatural, spammy links, your site is much more likely to survive another Penguin update or manual review.
For more information on carrying out a comprehensive backlink review, please see my post on how to recover from a Google ranking penalty from earlier this year.
Unfortunately it is not just the Penguin causing Google ranking drops and making hard work for websites. The Google Panda algorithm update was first announced in 2011 and has since seen numerous tweaks and developments. Panda is such an important part of Google’s policy that it has recently been worked into their algorithm, meaning Panda checks are completed as part of the standard Google crawling and indexing process, not only when Panda is updated.
Where Penguin targets backlinks, Panda targets content. Where Penguin attempts to eliminate spam from search results, Panda ranks sites based on a complex evaluation of sites’ content quality and value.
In order to rank above your competitors, content is of increasing importance thanks to Panda and there are a number of fundamental health checks you should be regularly carrying out.
When considering the quality of the content on your site, Google reviews a number of key factors. In order to perform well in search it is essential that your site complies with certain content-based guidelines and provides real value to the user.
The first thing you should check is the length of the content on each of your pages. If you have pages that are thin on quality content, you should develop them or get rid of them completely. As a general rule of thumb, each page should have at least 250 words, with slightly more (300+) on your homepage or other key pages that you want to rank well.
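A quick way to audit this at scale is to count the visible words on each page. A minimal Python sketch, assuming you already have each page's HTML and applying the 250-word rule of thumb above (the function names are my own):

```python
import re

def visible_word_count(html):
    """Rough word count for a page: strip scripts, styles and tags, then count."""
    # Remove script/style blocks entirely, then any remaining tags.
    text = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    return len(re.findall(r"\b\w+\b", text))

def is_thin(html, minimum=250):
    """Flag pages falling below the word-count rule of thumb."""
    return visible_word_count(html) < minimum
```

This is a blunt instrument; it tells you which pages to look at, not whether the content is any good.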
Content length is important, but so is quality, and part of that is uniqueness. If you have two pages with similar content, consider getting rid of one and developing the other. As always, if you do remove any pages, don’t forget to apply a 301 redirect to point them to a relevant live page.
One way to check for duplicate pages on your own site is to use the hash value in Screaming Frog. The tool calculates this code from each page’s content, meaning duplicate pages will have the same hash value. After running your site through the tool and exporting the results, you can use the hash value to see whether any pages are direct duplicates.
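This check can be scripted against the export. A minimal sketch, assuming a Screaming Frog internal export CSV with "Address" and "Hash" columns (the file name and function name are illustrative):

```python
import csv
from collections import defaultdict

def find_duplicate_pages(export_path):
    """Group URLs from a crawl export by their Hash column.

    Returns only the groups containing more than one URL, i.e. pages
    whose content hashed to the same value.
    """
    groups = defaultdict(list)
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("Hash"):
                groups[row["Hash"]].append(row["Address"])
    return {h: urls for h, urls in groups.items() if len(urls) > 1}
```

Each group it returns is a set of URLs serving identical content, which are candidates for consolidation and a 301 redirect.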
In 2012 Google released numerous other updates to its algorithm, one of which targeted ads “above the fold”, meaning the part of a page that is visible without scrolling down. Google does not want to see that prime space filled with adverts; it expects your main page content to be given pride of place there to serve the needs of your users. Check your pages to ensure you do not need to scroll to find the main textual content, and avoid placing adverts above the fold wherever possible.
Check your site thoroughly to ensure you do not stuff your pages with keywords or excessively bold keywords as an optimisation tactic. Google is intelligent enough now to work out what your page is about without these sorts of techniques, and using them excessively will raise alarm bells at Google’s spam patrol. Try to use your target keyword two or three times on the page. Most importantly, only use keywords where relevant and appropriate; if the copy doesn’t read well and looks as though keywords have been thrown in for SEO, Google is getting ever better at spotting it.
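As a rough self-check, a short script can count how often a target phrase appears in your page copy and flag pages exceeding the two-to-three guideline above. A minimal sketch (the function names are my own, and the threshold is a rule of thumb, not a Google-published limit):

```python
import re

def keyword_count(text, keyword):
    """Count whole-phrase occurrences of a keyword, case-insensitively."""
    pattern = r"\b" + re.escape(keyword) + r"\b"
    return len(re.findall(pattern, text, flags=re.IGNORECASE))

def flag_stuffing(text, keyword, maximum=3):
    """Flag copy that uses the target keyword more often than the guideline."""
    return keyword_count(text, keyword) > maximum
```

A flagged page isn't automatically spammy; read it aloud and judge whether the keyword use feels natural.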
Google’s Panda updates don’t only consider the content within your own website when evaluating the quality of your site’s content. In addition, Google will review your content in comparison to pages all around the web specifically to identify instances of duplicate or scraped content.
Google’s aim is to provide users with quality search results that meet their needs and the intention of their search. If two sites each have a page with identical or similar textual content, Google will not want to return both pages in relevant search results. The Panda algorithm update will try to establish which site provides the best content or had the duplicate content first and return their pages in search. However, it cannot always establish this accurately and may choose to display the plagiarised content instead of the original instance, or even devalue both sites.
To ensure that the content on your pages is not devalued due to scraped content, it is essential that you carry out regular checks for duplicate content across the web.
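To sketch how such a check works in principle, here is a minimal Python example using the standard library's difflib to score how similar two blocks of page copy are. This is an illustrative heuristic only, not how Google or any commercial duplicate-content checker actually compares pages:

```python
from difflib import SequenceMatcher

def content_similarity(text_a, text_b):
    """Return a 0..1 similarity ratio between two blocks of page copy."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()
```

A ratio near 1.0 means the two texts are close to identical and worth a manual look; short texts produce noisy scores, so treat the number as a prompt for review rather than a verdict.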
One great tool for this is Copyscape, a cheap and effective service which analyses your page and looks for matching content across the web. For around $0.05 per page, you can check your site’s pages for duplicate or plagiarised content.
If Copyscape identifies duplicate content on any of your pages you should manually review each instance and consider why it was flagged. Is it just a case of your business using the same “about us” blurb on your site as on your LinkedIn page? Or is it that another company in your industry has stolen content that you wrote for your site and adapted it to use on their own?
Consider how each instance can be resolved. If the duplicate content is on profile pages or sister sites that you can control, it should be easy enough to revise the text on those profiles or sites and make it unique from your main website’s copy. However if another company has used content from your site, or perhaps you used content from another site when you first built your site, it is probable that you will not be able to get the external source to change their content. In these instances, you should revise the content on your own pages to create completely unique content for your users and for search engines.
One often overlooked area of SEO is competition analysis, and in particular the analysis of your competitors’ backlink profiles and link building activity. Understanding this provides you with two main benefits: firstly, it shows you how strong your competitors’ link profiles are and how quickly they are growing; secondly, it can reveal link opportunities for your own site.
When it comes to your SEO health checks, it is the first of these benefits that is of interest. If you understand how strong your competitors’ link profiles are, you are in a better position to predict and forecast how much work you may need to carry out on your own link profile in order to compete or stay on top. In addition, by carrying out checks periodically you can establish your competitors’ “link velocity”: the speed at which they are developing new backlinks. For example, if a competitor has 2,000 backlinks from 200 domains in January and 2,500 links from 250 domains in March, you can establish the trends and patterns of their link behaviour and work out what you might need to do in order to compete.
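The link velocity figure is simple arithmetic between two snapshots. A sketch in Python (the function name is my own), using the example figures above:

```python
def link_velocity(count_start, count_end, months):
    """Average new links gained per month between two snapshots."""
    return (count_end - count_start) / months

# Example from above: 2,000 links in January, 2,500 in March,
# i.e. 500 new links over two months = 250 new links per month.
velocity = link_velocity(2000, 2500, 2)
```

Tracking the same figure for your own site lets you compare growth rates directly rather than raw totals.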
To establish these sorts of statistics, consider using the same sorts of tools that you would use for your own backlink analysis. In addition, try the Site Strength Indicator Tool from Search Engine News which gives you a snapshot of information on any site you check.
For a more in-depth competitor analysis, please see an older (but still relevant) post from the Koozai blog on SEO Competitor Analysis.
Lastly, you need to review the technical aspects of your site to ensure your website’s performance is not restricted by technicalities and that you operate fully within Google Webmaster Guidelines.
Please feel free to download our free whitepaper on completing a Technical SEO Audit to use as a guide for reviewing the technical aspects of your site.
To make sure you cover all the bases I’ve put together a free PDF that can help you keep track of everything. Download it here.
I hope you’ve found this guide useful, but please do let me know if you can think of any other health checks I might have missed. Share your experience with Google’s manual penalties or algorithm updates and let us know if you’ve managed to recover! Please feel free to comment below if you have any questions or comments or contact Koozai for more information on our Site Audit and SEO management services.