Today I am speaking at the On the Edge conference in Birmingham on the subject of Future Proofing SEO. This post contains my slides and a summary of the presentation.
2013 has seen Google release fresh Panda and Penguin algorithm updates. These updates have shaken up the world of search and forced digital marketers to refine their approach to SEO.
With more algorithm updates imminent how can you protect your rankings? This post and the below slides cover future proofing tips for on-page, link analysis, content marketing, and other elements of your SEO strategy.
Panda updates largely devalue mediocre content and are now a continuous part of Google’s algorithm. This year alone has seen a number of significant updates including:
Panda #24 (January 22, 2013): Google announced the first Panda update of 2013, claiming an impact on 1.2% of queries.
Panda #25 (March 14, 2013): the last update before Panda was integrated into the core algorithm.
Panda Dance (June 11, 2013): suggestions that Panda was still updating monthly.
Panda Recovery (July 18, 2013): Google confirmed an algorithmic Panda update.
2013 saw another significant Penguin update. If you have an unnatural link profile, the writing’s on the wall for the sustainability of your rankings.
Penguin 2.0 (May 22, 2013): after months of panic, Penguin 2.0 arrived.
With more updates on the way, prevention is certainly better than cure. You can prevent ranking drops by practising sustainable SEO.
Sustainable SEO can be broken down into five main areas:
Let’s take a look at some of these key points below:
With the explosion of content marketing to gain social signals and backlinks, people often forget about the all-important foundation of SEO – your own website. You can make huge gains on competitors by making sure your on-page SEO is tip-top.
Meta
Obviously title tags are a hugely important ranking factor. To make your Meta future proof, it’s important to stick to the character guidelines (65 characters for a Meta title and 155 for a Meta description). Avoid short or uninformative Meta data, and make sure you don’t duplicate any titles or descriptions. Duplicate Meta can devalue your page.
To make sure you are on top of Meta issues regularly check the ‘HTML Improvements’ section in Google Webmaster Tools.
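If you would rather script this check than eyeball it, the short sketch below shows one way to audit titles and descriptions against those character guidelines. It’s my own illustration rather than anything from the presentation: it assumes Python with the requests and beautifulsoup4 packages installed, and the URL list is purely hypothetical.

```python
# A rough sketch of an automated Meta audit (my illustration, not part of
# the original presentation). Assumes the requests and beautifulsoup4
# packages; the URL list is hypothetical.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

URLS = [
    "http://www.example.com/",
    "http://www.example.com/services/",
]

TITLE_LIMIT = 65         # character guideline for Meta titles
DESCRIPTION_LIMIT = 155  # character guideline for Meta descriptions

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in URLS:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    tag = soup.find("meta", attrs={"name": "description"})
    description = (tag.get("content") or "").strip() if tag else ""

    if len(title) > TITLE_LIMIT:
        print(f"Title too long ({len(title)} chars): {url}")
    if len(description) > DESCRIPTION_LIMIT:
        print(f"Description too long ({len(description)} chars): {url}")

    titles[title].append(url)
    descriptions[description].append(url)

# Flag any title or description shared by more than one page
for text, pages in list(titles.items()) + list(descriptions.items()):
    if text and len(pages) > 1:
        print(f"Duplicated on {len(pages)} pages: {text!r}")
```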
Headers
Appropriate H1 headers are obviously a ranking factor. Only include the keyword if it fits the page and works for the user. Shoehorning keywords into a header will only work against you.
A technically sound website will give Google the confidence to deliver traffic to your website. A sustainable SEO campaign must include the following points:
Other Technical SEO factors include:
A responsive design will automatically reshape your website to fit a mobile or tablet screen. With one in three searches taking place on mobile, responsive designs are a must for anyone who wants to compete on premium terms. A good mobile experience will also have a positive impact on your bounce rate and conversions.
If you want to see how mobile friendly your site is, you can run it through Google’s free tool at Howtogomo.com.
Rich Snippets are a great way of giving search engines additional information on your product or service. If you visit Schema.org/docs/schemas.html you will find a range of ways to markup your content including:
Using the review rating can increase CTR which in turn will benefit your overall SEO strategy.
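If you want a quick way to see which of your pages already carry Schema.org markup, a few lines of script will do it. The sketch below is my own example (not from the presentation); it assumes the requests and beautifulsoup4 packages, the URL is hypothetical, and it only looks for microdata-style itemprop attributes.

```python
# A minimal check for Schema.org microdata on a page (my illustration,
# not from the original post). Assumes requests and beautifulsoup4;
# the URL is hypothetical.
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/product/"
soup = BeautifulSoup(requests.get(url).text, "html.parser")

# Any element carrying a Schema.org itemprop attribute
marked_up = soup.find_all(attrs={"itemprop": True})
if not marked_up:
    print("No Schema.org microdata found on this page.")
for element in marked_up:
    print(element.get("itemprop"), "->", element.get_text(strip=True)[:60])
```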
If you don’t have a blog you should make this a priority. Sites without a blog can look stale whereas a freshly updated blog can help you:
A general guideline is to split your blog content into three areas:
Other blogging factors to consider include:
You can link your blog posts to a Google+ account to build up credible authorship. Instructions can be found here. This will display your picture in the search results to increase CTR. It is believed the more authority your authorship has, the more credibility Google will give your content.
Once you have optimised the on-page and technical elements of your site, it’s time to make sure your domain has quality copy. We have all heard the term ‘Content is King’, but there are still millions of sites at risk because of thin content.
Always ask yourself:
It’s essential that your copy is good quality. If the content is created for search engines and not users then you are at risk.
Google have started sending out ‘thin content’ messages to sites with little or no added value.
To avoid this message make sure you:
Other examples of shallow pages include:
A video explaining when to remove dead content can be found here.
If you get hit by a Panda update, nine times out of ten there will be some form of duplication involved. Avoid a duplicate content penalty at all costs. The following MUST be 100% unique:
There are a range of solutions to get around duplicate content. The four most common include:
For more information on avoiding a duplicate content penalty please read this blog post.
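To make one of the most common fixes concrete, the sketch below spot-checks whether a set of URLs declare a rel="canonical" tag. It’s my own illustration rather than the post’s method; it assumes the requests and beautifulsoup4 packages, and the URLs are hypothetical.

```python
# Spot-check rel="canonical" tags across a few URL variants (my sketch,
# not from the original post). Assumes requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "http://www.example.com/page/",
    "http://www.example.com/page/?sort=price",
]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if canonical:
        print(url, "-> canonical:", canonical.get("href"))
    else:
        print(url, "-> no canonical tag set")
```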
There are a number of free tools to help you stay on top of duplicate issues, and I suggest checking regularly for peace of mind (particularly if you are in a competitive sector).
Screaming Frog is an excellent tool for exporting your website’s content. It takes two minutes to run and you can export anything from Meta data to actual content. By adding filters to your exported spreadsheet you can quickly organise pages with thin or duplicated content. You can also double check character counts for your Meta titles and descriptions.
You can check for duplicate pages on your own site by using the hash value. The tool assigns this value to each of your pages, calculated from the page content, meaning duplicate pages will share the same hash.
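The same idea is easy to reproduce in a few lines of script. The sketch below (my illustration, not part of Screaming Frog) checksums the raw HTML of each page so that any two URLs returning identical content are flagged. It assumes the requests package, the URLs are hypothetical, and note that hashing the full HTML only catches exact duplicates, as a one-character difference produces a different hash.

```python
# Hash-based duplicate detection, the same principle as Screaming Frog's
# hash value (my sketch, not the tool's code). Assumes requests;
# the URLs are hypothetical.
import hashlib

import requests

URLS = [
    "http://www.example.com/page-a/",
    "http://www.example.com/page-b/",
]

seen = {}  # hash value -> first URL seen with that content
for url in URLS:
    digest = hashlib.md5(requests.get(url).content).hexdigest()
    if digest in seen:
        print(f"Duplicate content: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```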
A simple but effective free tool. Link Panguin to your Google Analytics account and it will overlay algorithm updates on your organic search traffic. This will help you tie any duplication issues to the type of algorithm update that caused a drop.
Copyscape is a free tool, but I would recommend upgrading to the premium version. The premium tool allows you to paste in a group of URLs and run a batch search for internal and external duplicate content issues. The tool will highlight the exact duplicate text in screenshots.
Backlinks are still very much a ranking factor but it’s important you take responsibility for all inbound sources linking to your domain. If the link is there for no other reason than SEO, then it’s unnatural.
Backlink analysis is now an essential part of modern day SEO. Links from the following sources are deemed unnatural and can have a negative impact on your rankings:
To future proof your site it’s important you carry out a regular detox. I like to call this process ‘link weeding’. Link weeding can be done by gathering all your link data using the following tools:
Google Webmaster Tools is the best place to start as you can download your ‘latest links’. The tool now provides broader & more diverse link data to help you review your backlinks.
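Once you have downloaded the ‘latest links’ export, a short script gives a useful first pass over it. The sketch below is my own illustration (not from the presentation); it assumes the export is a CSV with the linking URL in the first column, and the filename is hypothetical.

```python
# First pass over a 'latest links' download (my sketch). Assumes a CSV
# with the linking URL in the first column; the filename is hypothetical.
import csv
from collections import Counter
from urllib.parse import urlparse

linking_domains = Counter()

with open("latest_links.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    for row in reader:
        if row:
            linking_domains[urlparse(row[0]).netloc] += 1

# Domains sending an unusually large number of links deserve a closer look
for domain, count in linking_domains.most_common(20):
    print(f"{count:>5}  {domain}")
```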
If you suspect that manual action may have been taken against your website, you can use the Manual Action Viewer in Google Webmaster Tools, which shows information about actions taken by the manual webspam team. This will provide information on what is directly affecting your site’s ranking in search results.
If you see a message saying “no manual webspam actions found” then there is no cause for concern. If you do have a message, you could be in the 2% of sites that have had a manual penalty. The manual action could be related to spammy content or unnatural backlinks (see example below):
If you have unnatural links or spammy content to address you can do this on the Manual Actions page by clicking “Request a Review”.
To spot unnatural link patterns review the following factors:
If you spot unnatural links you can detox your profile by doing the following in this priority order:
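Whichever factors you review, the anchor text distribution of your profile is one check that is quick to automate: a profile dominated by exact-match commercial anchors is a classic unnatural signal. The sketch below is my own example (not from the slides) and assumes a backlink export saved as a CSV with an ‘anchor’ column; the filename and column name are hypothetical.

```python
# Anchor text distribution check (my sketch, not the post's method).
# Assumes a backlink export as a CSV with an 'anchor' column;
# the filename and column name are hypothetical.
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor"].strip().lower()] += 1

# Show the ten most common anchors and their share of the profile
total = sum(anchors.values())
for anchor, count in anchors.most_common(10):
    print(f"{count / total:6.1%}  {anchor!r}")
```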
If you receive an unnatural link warning you will need to submit a reconsideration request. It’s important you gather as much data as possible in your request to show you have worked to remove your unnatural links. Outreach emails and lists of removed or disavowed link sources in a Google Doc are recommended. The tone of the letter is also incredibly important. For a cover letter template and more information on submitting a reconsideration request, check out our free whitepaper.
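For the links you cannot get removed, Google’s disavow file uses a simple plain-text format: one full URL or domain: entry per line, with # marking comments. The sketch below (my own example, not from the whitepaper) writes such a file from a hypothetical list of failed removals.

```python
# Build a disavow file in Google's plain-text format (my illustration).
# The list of failed removals and the date in the comment are hypothetical.
failed_removals = [
    "http://spammy-directory.example/listing/123",
    "domain:paid-links.example",
]

with open("disavow.txt", "w") as f:
    f.write("# Removal requested with no response as of 01/08/2013\n")
    for entry in failed_removals:
        f.write(entry + "\n")
```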
Once you have fixed duplicate issues and detoxed your link profile the best way to future proof your site is to create superb content for your target audience. Your content can be promoted externally, on your blog or through social media.
Content Marketing can take on many forms including:
Market content to web properties that can deliver referral traffic and conversions. This will also deliver:
Technology also plays a key role:
Content marketing is becoming more technology based to engage the user; competitions, polls and interactive features, for example, are an excellent way to increase social shares. By adding an interactive element you are able to make the experience more personal, and if you want to stay ahead of the curve then interactive content is certainly something to consider. More results and examples can be found here.
So there you have it. There are no shortcuts to future proofing your SEO, but if you adopt the tactics in this post you will be able to achieve sustainable results.
To survive future algorithm updates you need to stay on top of duplicate issues and regularly detox your backlink profile. Complement this activity with quality content marketing and you will reap the rewards of sustainable rankings.
If you have any questions please leave them below.
This is one of the most comprehensive posts on sustainable SEO practices. One of the most important aspects to concentrate on is user experience. If you have a responsive website with good content, people will spend time on it, share it, like it, and the SEO part will be just a side effect. It will come up high in Google as a result of user participation.
Excellent overview article except for one piece of well-intended advice:
“Make sure your pages have at least 250+ words”
In this thread:
https://productforums.google.com/forum/#!topic/webmasters/pBwxvKPf2gM/discussion,
John Mueller, a Google representative, talked about how long Google thinks a page should be. I’ll let John speak for himself:
“Rest assured, Googlebot doesn’t just count words on a page or in an article, even short articles can be very useful & compelling to users. For example, we also crawl and index tweets, which are at most 140 characters long. That said, if you have users who love your site and engage with it regularly, allowing them to share comments on your articles is also a great way to bring additional information onto the page. Sometimes a short article can trigger a longer discussion — and sometimes users are looking for discussions like that in search. That said, one recommendation that I’d like to add is to make sure that your content is really unique (not just rewritten, autogenerated, etc) and of high-quality.”
Thanks for your comment, Christopher.
I agree short articles can certainly be very useful & compelling to users, and the content will still get indexed.
However, I still think longer content will be more relevant and work better for you if you are trying to rank for a competitive term.
Google will crawl short pieces of text or Tweets, but in my experience I have rarely seen them out-rank 250+ words of unique content (on a competitive term).
They are obviously spot on when they say focus on high quality copy.