2012 was a year of change for the SEO industry and it already feels as though 2013 will be every bit as turbulent. It has never been more important to ensure your site’s SEO is healthy.
This guide is intended as a comprehensive resource for carrying out a review of your site’s health and SEO performance. It is also designed to ensure that your site is not just healthy now, but will hold up against the promise of more Google algorithm updates and be protected against manual penalties.
So many sites are suffering the wrath of Google updates and manual penalties, and Search Engine Roundtable even went so far as to claim that you may never recover from Google’s Penguin. Many site owners are battling against the tide to climb back up the SERPs and reclaim the top ranking positions of yesteryear, while some have indeed reached the difficult conclusion that they cannot recover and are starting again with fresh sites and domains.
Don’t wait until it’s too late. Be proactive, not reactive, with your SEO strategy.
The Google Penguin algorithm update has undeniably transformed the way we “build” links, completely invalidating link building techniques that SEOs relied upon for years to build website authority. Tactics such as submitting to link directories and negotiating the placement of site-wide blogroll links went from quick wins to causes of ranking penalties almost overnight. Many sites have lost favour with Google over the last 18 months and are struggling relentlessly to restore what has been lost.
In addition to the damage caused by this major algorithmic update, the introduction of manual penalties early last year has added further concern for webmasters and SEOs. Google’s dedicated anti-web-spam teams manually review sites across the web to identify anyone in breach of its Webmaster Guidelines. Depending on the severity of the infringement, sites have received anything from an Unnatural Links warning in Webmaster Tools to complete de-indexing from Google.
Despite all this, Google is far from finished in its fight against web-spam. For every site punished for its unnatural link profile, several more are still out there getting away with the same crimes. It is a common frustration among webmasters whose sites have been “hit” by Google for their backlinks that their competitors’ sites still rank well with similar or even worse links.
However, the risk of ranking drops and de-indexing is far from over. Many sites with toxic link profiles that have so far gone unnoticed will not be so lucky when Penguin 4 (also dubbed Penguin 2.0 by Matt Cutts), a new and more powerful iteration of the algorithm, is released later this year.
It has never been more critical that websites clean up their backlink profiles proactively, ahead of the coming storm. It is no longer enough to sit and hope that you will be safe, or to plead ignorance of poor quality backlinks. Google holds websites responsible for their own backlinks, and with the introduction of the Disavow tool there is no longer any excuse.
Before you can begin to cleanse your links, you must have comprehensive data and an in-depth understanding of your backlink profile. To gain this insight, use as many link analysis tools as possible, from Google Webmaster Tools to dedicated link checkers such as Link Detox and Link Risk (both covered below), to build the broadest possible picture of your backlink profile.
Assemble the data from as many tools as possible by exporting data into CSV spreadsheets and compiling in Microsoft Excel. No single tool will locate all of your site’s backlinks, so in order to get as big a picture as possible, bring all the information together into a single view and remove the duplicates.
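If you are comfortable with a little scripting, the merge and de-duplication can be automated rather than done by hand in Excel. The sketch below is a minimal Python example; the exports/ folder, output file name and column headings are all assumptions, so adjust them to match what your chosen tools actually export.

```python
import csv
import glob

# Column headings vary by tool; these names are assumptions, so adjust
# them to match the exports you are actually working with.
URL_COLUMNS = ("Source URL", "Referring Page URL", "URL From")

def extract_url(row):
    """Return the linking URL from a row, whichever column the tool used."""
    for col in URL_COLUMNS:
        if row.get(col):
            return row[col].strip().lower()
    return None

seen = set()
merged = []

# Assumes each tool's export has been saved into an exports/ folder.
for path in glob.glob("exports/*.csv"):
    with open(path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            url = extract_url(row)
            if url and url not in seen:  # skip duplicates across tools
                seen.add(url)
                merged.append({"link_url": url, "source_file": path})

with open("all_backlinks.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["link_url", "source_file"])
    writer.writeheader()
    writer.writerows(merged)

print(f"{len(merged)} unique backlinks written to all_backlinks.csv")
```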
With this broad spectrum of link data, start your manual analysis. This can take hours, days or even weeks depending on how many backlinks you have. The important thing is that you know what you are looking for.
In order to decide whether a link is good or bad, view it on the page and first consider how it might have got there. Did you or someone at your business pay the hosting site for the link to be there? Was it submitted purely to generate “link juice”? If so, it already doesn’t seem like a “natural” link. Mark it for removal or disavowal.
Next, look at the quality of the hosting sites. Look at the site’s PageRank and any other outbound links on the same page. Think about whether the referring site looks and feels credible and relevant to your industry. If it looks like a bad neighbourhood to be in, you don’t want your link there. Add it to the pile.
Link Detox and Link Risk can analyse the quality of your links for you, rating the quality or risk of each link. This should help speed up the analysis process for sites with large backlink profiles, but be sure to manually review the results as no link analysis tool is 100% accurate.
As a rule of thumb when it comes to backlink analysis: if in doubt, throw it out. If you have to put too much thought into whether or not a link is providing genuine value to your link profile, it probably isn’t. The more links you analyse, the better your understanding of good and bad links will become, which will help you keep your link profile clean over time.
Once you have a list of dodgy links you want removed, it’s time to get in touch with the referring sites. This can be a huge and daunting task, and it will certainly take time if you have a lot of backlinks. What’s more, the success rate (i.e. how many links are actually removed by the referring sites versus how many removal requests you send) is genuinely rubbish. Nonetheless, it’s a necessary step in strengthening a link profile, just as it is in recovering one.
Collect the contact details for the referring sites by looking for “contact” or “about us” pages on the site. If they do not have these sorts of pages, try looking in the page footer or even in the source code for an email address. As a last resort, try to obtain an email address by running a WhoIs lookup against the domain. Nine times out of ten you will find an email address or contact form in one of these ways. Record these details in your spreadsheet of link removals and construct a polite message to send to your contact requesting removal of the link to your site.
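If you have a long list of domains to contact, the WhoIs step can also be scripted. Below is a minimal sketch that assumes the standard whois command-line tool is installed on your machine; bear in mind that privacy-protected records will often return a proxy address rather than the site owner’s real email.

```python
import re
import subprocess

def whois_emails(domain):
    """Run a WHOIS lookup via the standard `whois` command-line tool
    and pull out any email addresses found in the record."""
    result = subprocess.run(["whois", domain], capture_output=True, text=True)
    return sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", result.stdout)))

# Example with a placeholder domain:
print(whois_emails("example.com"))
```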
Log every effort you make to get your links removed. Google expects you to work hard to remove harmful backlinks and will want to see proof of your efforts should you ever need to submit a reconsideration request.
For those links that could not be removed, due to a lack of contact details or no action from the site when contacted, prepare a disavow document to submit to Google. This tells them which links you do not want counted when they crawl your site and evaluate your backlinks. It tells Google you have tried to remove the links, knowing that they aren’t natural, and have been unsuccessful.
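For reference, the disavow file itself is just a plain text file: one entry per line, with lines starting with # treated as comments. You can disavow an individual URL, or use the domain: prefix to disavow every link from a domain. The entries below are placeholders only; the comment lines are a good place to note your removal attempts.

```
# Contacted the site owner on 01/03/2013 and 15/03/2013, no response
http://spammy-directory.example.com/links/page1.html
# Could not find any contact details for this domain
domain:low-quality-blog.example.net
```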
By going through this process and keeping your link profile clean from unnatural, spammy links, your site is much more likely to survive another Penguin update or manual review.
For more information on carrying out a comprehensive backlink review, please see my post on how to recover from a Google ranking penalty from earlier this year.
Unfortunately it is not just the Penguin causing Google ranking drops and making hard work for websites. The Google Panda algorithm update was first announced in 2011 and has since seen numerous tweaks and developments. Panda is now such an integral part of Google’s approach that it has recently been incorporated into the core algorithm, meaning Panda checks run as part of the standard Google crawling and indexing process, not only when Panda itself is updated.
Where Penguin targets backlinks, Panda targets content. Where Penguin attempts to eliminate spam from search results, Panda ranks sites based on a complex evaluation of sites’ content quality and value.
In order to rank above your competitors, content is of increasing importance thanks to Panda and there are a number of fundamental health checks you should be regularly carrying out.
When considering the quality of the content on your site, Google reviews a number of key factors. In order to perform well in search it is essential that your site complies with certain content-based guidelines and provides real value to the user.
The first thing you should check is the length of the content on each of your pages. If you have pages that are thin on quality content, you should develop them or get rid of them completely. As a general rule of thumb, each page should have at least 250 words, with slightly more (300+) on your homepage or other key pages that you want to rank well.
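Checking word counts by hand is fine for a small site, but it can be scripted for larger ones. The sketch below is a rough Python example using the requests and BeautifulSoup libraries; the URL list is a placeholder, and note that the count includes navigation and footer text, so treat the numbers as a guide rather than gospel.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder list of pages to audit; replace with your own URLs.
PAGES = [
    "http://www.example.com/",
    "http://www.example.com/services/",
]

def word_count(url):
    """Fetch a page and count the words in its visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):  # drop non-visible content
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

for url in PAGES:
    count = word_count(url)
    flag = "THIN" if count < 250 else "OK"  # the 250-word rule of thumb above
    print(f"{flag:<4} {count:>5} words  {url}")
```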
Content length is important but equally so is quality and part of that metric is in its uniqueness. If you have two pages with similar content you should consider getting rid of one and developing the other. As always, if you do remove any pages, don’t forget to apply a 301 redirect to point them to a relevant live page.
One way to check for duplicate pages on your own site is to use the hash value in Screaming Frog. The tool assigns this code to each of your pages, calculated from the page content, meaning duplicate pages will have the same hash code. After running your site through the tool and exporting the results, you can use the hash value to see if any pages are direct duplicates:
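If your crawl is large, scanning the export by eye soon becomes impractical. This short Python sketch groups URLs by their hash value; it assumes the export has been saved as internal_all.csv with “Address” and “Hash” columns, so check the exact header names in your own export.

```python
import csv
from collections import defaultdict

# Assumed file name and column headings; match these to your actual export.
groups = defaultdict(list)
with open("internal_all.csv", newline="", encoding="utf-8-sig") as f:
    for row in csv.DictReader(f):
        if row.get("Hash"):
            groups[row["Hash"]].append(row["Address"])

for hash_value, urls in groups.items():
    if len(urls) > 1:  # identical hash means identical page content
        print(f"Duplicate set ({hash_value}):")
        for url in urls:
            print(f"  {url}")
```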
In 2012 Google released numerous other updates to its algorithm, one of which targeted ads “above the fold”, meaning the part of a page that appears at the top and is visible without scrolling down. Google does not want to see that prime space filled with adverts; it expects your main page content to be given pride of place there to serve the needs of your users. Check your pages to ensure that you do not need to scroll to find the main textual content, and avoid placing adverts above the fold wherever possible.
Check your site thoroughly to ensure you do not stuff your pages with keywords or excessively bold keywords as an optimisation tactic. Google is intelligent enough now to work out what your page is about without these sorts of techniques, and using them excessively will raise alarm bells with the Google spam patrol. Try to use your target keyword two or three times on the page. Most importantly, only use keywords where relevant and appropriate; if the copy doesn’t read well and looks as though keywords have been thrown in for SEO, Google is getting ever smarter at spotting it.
Google’s Panda updates don’t only consider the content within your own website when evaluating the quality of your site’s content. In addition, Google will review your content in comparison to pages all around the web specifically to identify instances of duplicate or scraped content.
Google’s aim is to provide users with quality search results that meet their needs and the intention of their search. If two sites each have a page with identical or similar textual content, Google will not want to return both pages in relevant search results. The Panda algorithm will try to establish which site provides the better content, or which had the content first, and return that site’s pages in search. However, it cannot always establish this accurately and may choose to display the plagiarised content instead of the original, or even devalue both sites.
To ensure that the content on your pages is not devalued due to scraped content, it is essential that you carry out regular checks for duplicate content across the web.
One great tool for this is Copyscape, a cheap and effective service which analyses a page and searches for matching content across the web. For around $0.05 per page, you can check your site’s pages for duplicated or plagiarised content.
If Copyscape identifies duplicate content on any of your pages you should manually review each instance and consider why it was flagged. Is it just a case of your business using the same “about us” blurb on your site as on your LinkedIn page? Or is it that another company in your industry has stolen content that you wrote for your site and adapted it to use on their own?
Consider how each instance can be resolved. If the duplicate content is on profile pages or sister sites that you can control, it should be easy enough to revise the text on those profiles or sites and make it unique from your main website’s copy. However if another company has used content from your site, or perhaps you used content from another site when you first built your site, it is probable that you will not be able to get the external source to change their content. In these instances, you should revise the content on your own pages to create completely unique content for your users and for search engines.
One often overlooked area of SEO is competitor analysis, in particular the analysis of your competitors’ backlink profiles and link building activity. Understanding this provides you with two main benefits:
When it comes to your SEO health checks, it is point 1 that is of interest. If you understand how strong your competitors’ link profiles are, you are in a better position to predict and forecast how much work you may need to carry out on your own link profile in order to compete or stay on top. In addition, by carrying out checks periodically you can establish your competitors’ “link velocity”: the speed at which they are developing new backlinks. For example, if a competitor has 2,000 backlinks from 200 domains in January and 2,500 links from 250 domains in March, you can establish the trends and patterns of their link behaviour and work out what you might need to do in order to compete.
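The arithmetic behind link velocity is simple enough to jot down in a few lines. A minimal sketch using the hypothetical January and March figures from the example above:

```python
# Hypothetical snapshots of a competitor's backlink counts.
snapshots = {
    "January": {"links": 2000, "domains": 200},
    "March":   {"links": 2500, "domains": 250},
}

months = 2  # January to March
link_velocity = (snapshots["March"]["links"] - snapshots["January"]["links"]) / months
domain_velocity = (snapshots["March"]["domains"] - snapshots["January"]["domains"]) / months

print(f"Link velocity:   {link_velocity:.0f} new links per month")
print(f"Domain velocity: {domain_velocity:.0f} new referring domains per month")
```

In this example the competitor is gaining roughly 250 links from 25 new domains per month, which gives you a benchmark for your own link development.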
To establish these sorts of statistics, consider using the same sorts of tools that you would use for your own backlink analysis. In addition, try the Site Strength Indicator Tool from Search Engine News which gives you a snapshot of information on any site you check.
For a more in-depth competitor analysis, please see an older (but still relevant) post from the Koozai blog on SEO Competitor Analysis.
Lastly, you need to review the technical aspects of your site to ensure your website’s performance is not restricted by technicalities and that you operate fully within Google Webmaster Guidelines.
You can download our free whitepaper on completing a Technical SEO Audit to use as a guide for reviewing the technical aspects of your site.
I hope you’ve found this guide useful, but please do let me know if you can think of any other health checks I might have missed. Share your experience with Google’s manual penalties or algorithm updates and let us know if you’ve managed to recover! Please feel free to comment below if you have any questions or comments or contact Koozai for more information on our Site Audit and SEO management services.
Heart & Health Image from Bigstock
Panda Image from Bigstock
Emma,
Thank you for the great write up. You did an excellent job covering the basics of the penalty removal process, whether it be manual or algorithmic.
I would suggest adding Ahrefs as a backlink checker as well. I’m not sure if you’ve had much time to use their backlink checker but it’s absolutely amazing. It’s easily the best backlink checker on the market currently and their pricing is very competitive.
I would also suggest adding a section for link removal services, there are many services these days and a few good ones.
Thanks for the feedback James. I hear good things about Ahrefs but it’s not one I’ve used a great deal personally.
I’m finding Link Risk is a good tool for managing a link analysis project.
Hi Emma,
Thanks for the tips and info, I’ve made some notes :) I would also add site speed/bounce rate and responsive design (mobile ready), as they both impact performance and results.
In terms of Panda and Penguin, a great tool I like to use is http://www.panguintool.com; it allows you to see if you have been affected by any algorithm changes.
Cheers
Danny
Thanks Danny. Yes, the Panguin tool is really good for algorithm checks, and Fruition is another good one (fruition.net) although there are limits on the number of sites you can track for free. Definitely worth a look though as it tells you the likelihood that each update had an effect, whether it was positive or negative and to what degree.
Site speed is definitely important and would be covered on the technical side of things, which I haven’t delved into too much as the whitepaper says it all! And having a mobile-ready site is increasingly important too, with Google’s focus starting to shift more into the mobile market.
Thanks for your feedback!
Hi Emma,
Thanks for that, I will sign up and check it out :) Our old website was mobile friendly but had too many bugs, so we had to redesign our site due to mobile being more important now.
Looking forward to more of your posts Emma.
No problem at all, thanks for your feedback Danny.
Thank you so much for this article and the checklist provided. I found you via Google+ and will be sharing.
Thanks Diana, glad to hear it :-) nice to know you found it useful.
Great write up.
What do you think about a blog that has been hit really hard by Penguin 1.0 or 2.0? Should it:
1) Try to solve it and recover
2) Move on (minimal chance to recover); start on a new domain and point links that are under your control to your new domain instead of the “old” one
Looking forward to hearing your opinion on this one.
Thanks,
Paul
Hi Paul, nice question!
My personal opinion would be to invest some time into trying to recover. Although some sources suggest there is no recovery from a serious Penguin hit, it isn’t known how long it might take to really regain previous successes or how hard those sites actually worked at it. I have definitely seen examples where things started to pick up after a couple of months and have continued to climb, albeit not yet to previous levels.
On the other hand, it does depend on your industry somewhat as to whether this is a viable option. For example, if a payday loan site is hit hard, it would almost certainly be better for them to start again as the lost revenue adds up quickly. However, if an authoritative information-based site with an existing following gets hit, it would make more sense for them to try and recover before starting from scratch.
Nice post Emma thank you. I had heard big things about the new penguin update. Quite surprised that it wasn’t that big a deal! Maybe we were just lucky!
Hi Sharon. I agree with you on this; the fear surrounding Penguin 2.0 was strong after the effects of the first version, but the update wasn’t nearly as hard-hitting as many of us thought. However, it seems to be far from over. Matt Cutts has recently suggested there is more to come: https://www.linkresearchtools.com/case-studies/penguin-2-0-analysis-cheapoair/
I found this post from your comment over at SERoundTable and there’s some really useful info here. The question I would ask though is, is this really the future of SEO or just a bump in the road?
So a webmaster spends days/weeks/months removing bad links. Great. But in the time they’ve spent removing all those bad links, who’s to say a new bunch of links that Google doesn’t like hasn’t appeared in the meantime? In other words, cleaning out bad links isn’t a one-off process but an ongoing one.
What’s to stop me or anyone else buying a bunch of Fiverr gigs putting up tens of thousands of Scrapebox backlinks to competitors’ sites? Heck, it’s probably a lot less work than getting legitimate backlinks to my own sites. It just seems to me Google are rewarding the less legitimate marketers.
My guess is that a lot of webmasters will be looking more and more at non-Google sources of traffic.
Hi Robert, thanks for your feedback.
I absolutely think that backlink analysis, and potentially removing unnatural backlinks, will be an ongoing process to some degree. Having said that, Google’s best practice guidelines have never really changed much; Google has just got better (supposedly, at least) at telling the good from the bad.
Regarding negative SEO, the only way to try and combat it is to stay on top of your backlinks. The minute you see a spike that could be the result of harmful activity, get onto disavowing them. This is currently all we can do!
Agree that webmasters will be looking for other sources of traffic, but many will be increasing their AdWords spend to compensate: another Google win.
Thanks for your comment!
Thanks for putting this post together Emma, it is a brilliant resource for site owners to refer back to over the coming months. This will be especially helpful given the fact that Google have highlighted some of the things we can expect with their forthcoming updates.
Great summary over on Search Engine Land – https://searchengineland.com/googles-matt-cutts-black-hat-link-spammers-less-likely-to-show-up-in-search-results-after-summer-159185
Great post Emma, loads of handy hints!
We wrote our top tips for a 2013 SEO strategy back in December; things have changed loads since then and probably will continue to in the next few months.
These were our recommendations: https://www.superdream.co.uk/2013-seo-strategy/
Thanks Clare. I agree things will almost certainly keep changing as Google continues to stir the algorithm pot! Thanks for sharing this, some good tips there.