by Oliver Ewbank on 27th November 2013
Hi there. Today I’m going to talk to you about future-proofing your SEO, which in essence means protecting your rankings from future algorithm updates.
So first things first, I’m going to start with metadata. So this is hugely important to get right and to stay on top of if you are going to future-proof your SEO. So you want to avoid duplicate meta titles, duplicate meta descriptions, short meta descriptions and meta descriptions that are too long.
So a really good place to start is to log into Google Webmaster Tools and go to the HTML Improvement section. Here it will give you lots of hints and tips on how to improve your metadata. So it will tell you if they’re too long or too short or non-informative.
Some good guidelines to stick to: ideally, a meta title should be no longer than 65 characters, and a meta description no longer than 155 characters. If you log into Google Webmaster Tools regularly and check that section of the tool, you'll be able to create metadata that's really valuable to users and search engines.
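Those length guidelines are easy to check in bulk. Here's a minimal sketch (not an official tool) that validates titles and descriptions against the 65/155-character limits mentioned above, plus a duplicate-title check; the 70-character lower bound for an "informative" description is my own illustrative assumption.

```python
MAX_TITLE = 65         # guideline: meta title no longer than 65 characters
MAX_DESCRIPTION = 155  # guideline: meta description no longer than 155 characters
MIN_DESCRIPTION = 70   # assumed lower bound for an "informative" description

def check_metadata(title, description):
    """Return a list of human-readable warnings for one page's metadata."""
    warnings = []
    if len(title) > MAX_TITLE:
        warnings.append(f"Title too long ({len(title)} > {MAX_TITLE} chars)")
    if len(description) > MAX_DESCRIPTION:
        warnings.append(f"Description too long ({len(description)} > {MAX_DESCRIPTION} chars)")
    elif len(description) < MIN_DESCRIPTION:
        warnings.append(f"Description too short ({len(description)} < {MIN_DESCRIPTION} chars)")
    return warnings

def find_duplicate_titles(pages):
    """pages: dict of url -> title. Returns titles used on more than one URL."""
    seen = {}
    for url, title in pages.items():
        seen.setdefault(title, []).append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}
```

Run this over a crawl export and you get the same classes of issue Webmaster Tools flags: too long, too short, duplicated.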
Next I’m going to talk about the technical side of SEO, which is hugely important for future-proofing your website, because Google are not going to want to send users to a website that’s not technically sound. So make sure that there are no broken links. Minimise redirects where you can. Think about the user: make sure you have a dedicated 404 page, make sure that pages are loading quickly, and try to stay on top of all these technical issues.
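The broken-link and redirect checks boil down to triaging HTTP status codes from a crawl. A rough sketch, assuming your crawler (Screaming Frog exports similar data) gives you `(url, status_code)` pairs:

```python
def triage_crawl(results):
    """results: list of (url, status_code) pairs from a site crawl.
    Returns (broken, redirected): URLs returning 4xx/5xx errors to fix,
    and URLs returning 3xx redirects to minimise."""
    broken = [url for url, status in results if status >= 400]
    redirected = [url for url, status in results if 300 <= status < 400]
    return broken, redirected
```

Anything in `broken` needs fixing or a proper 404 page; anything in `redirected` is worth reviewing so you're not chaining redirects unnecessarily.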
The more you can think about the user, the better. So ideally, have a responsively designed website. This is good from a mobile point of view, because it will reduce bounce rates and help mobile users as well.
It’s really important to iron out these technical issues. Also think about having an on-site HTML sitemap, keeping your robots.txt up to date, and making sure all the relevant pages are in the XML sitemap too. So it’s really important to stay on top of the technical side of things.
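For reference, a minimal robots.txt that also points crawlers at your XML sitemap can look like this (the domain and paths here are placeholders, not a recommendation for your site):

```text
# robots.txt - illustrative example only; example.com is a placeholder
User-agent: *
Disallow: /admin/      # keep private sections out of the crawl

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is a standard way to help search engines find the XML sitemap without relying solely on a Webmaster Tools submission.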
Next point, number 3, I’ve put down 250, which stands for 250 words. Ideally, each page on your website will have at least 250 words of good, credible, unique content. Google do sometimes put out thin content warnings, so to avoid this, make sure every page, whether it’s a service page, a product page or an about us page, has at least 250 words; that will help make sure the page deserves to rank. I can’t stress how important that is if you don’t want to get caught up in any thin content warnings or any kind of Panda update. Also, it’s very important to try and keep your key content above the fold when you can.
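The 250-word check is trivial to automate across a site. A sketch, assuming you can extract each page's body copy as plain text:

```python
THIN_CONTENT_THRESHOLD = 250  # minimum words of unique content per page

def is_thin(page_text):
    """True if the page's body copy falls under the 250-word guideline."""
    return len(page_text.split()) < THIN_CONTENT_THRESHOLD
```

A simple whitespace split is a rough word count, but it's close enough to flag pages that need beefing up before they attract a thin-content problem.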
So the next thing I put down is blog. It’s hugely important, if you can, to have a blog on your website, because you can update it regularly. It stops your site from looking stale. You can push out really good content on social media platforms and really just show users and search engines that it’s a regularly updated website delivering good, valuable content to users.
Next to that, I’ve put author. If you can link blog content to a credible author, that will give the content more credibility. So Google Authorship is where you can link content to a Google+ profile. So if someone related to your website or industry has a really credible Google+ profile with lots of circles and lots of followers, if this is then attributed to the content on the blog, it will show users and search engines that it’s really valuable. You’re unlikely to get penalised from a content point of view if you have a credible author attached to the content. So hugely important there. Authorship can also help you from a click rate point of view, because it will have the person’s picture next to the content when it’s listed in a search engine like Google.
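At the time, linking content to an author worked by adding a `rel="author"` link pointing at the writer's Google+ profile, something like this (the profile URL below is a placeholder, not a real profile):

```html
<!-- Google Authorship markup: link the page to the author's
     Google+ profile. The profile URL is a placeholder. -->
<link rel="author" href="https://plus.google.com/100000000000000000000"/>
```

With the Google+ profile linking back to the site as a contributor, Google could then show the author's picture next to the listing in search results.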
Next point I put down is to detox links. It’s hugely important from a link point of view to detox them as regularly as possible, because Penguin updates and link spam updates can really harm you from a future-proofing point of view. Another good place to start is Google Webmaster Tools: go in and you can export the latest links that have been built to your website. You might find links you didn’t know about, and you can disavow them, so they’re discounted from being attributed to your site and you show they’re not there from an intentional SEO point of view.
So I would detox your links as regularly as possible. It could be weekly or monthly. You can do this by using a number of tools, like Link Risk or Majestic SEO or Link Detox or Open Site Explorer. Gather all your data together. The way to spot unnatural links is to sort by anchor text, so you can see which ones use too much exact match anchor text. You could sort by PageRank: is it too good to be true? But basically, if the link’s only there for an SEO reason and not for referral traffic or any other reason, it could be deemed unnatural. So it’s hugely important to stay on top of unnatural links. Do this regularly, check Google Webmaster Tools, and you can really start to show Google your intentions.
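The anchor-text check can be sketched in a few lines. This assumes your export is a list of `(anchor_text, source_url)` pairs, and the 20% threshold is purely illustrative, not a number Google publishes:

```python
from collections import Counter

def anchor_text_profile(links):
    """links: list of (anchor_text, source_url) pairs from a link export.
    Returns each anchor text's share of the overall link profile."""
    counts = Counter(anchor for anchor, _ in links)
    total = len(links)
    return {anchor: count / total for anchor, count in counts.items()}

def flag_overused_anchors(links, max_share=0.2):
    """Anchors making up more than `max_share` of all links look unnatural.
    The 0.2 default is an illustrative assumption; tune it for your site."""
    return [anchor for anchor, share in anchor_text_profile(links).items()
            if share > max_share]
```

In practice you'd whitelist brand-name anchors first, since a brand dominating your profile is natural; it's commercial exact-match phrases clustering together that's the red flag.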
When you do spot unnatural links, you can disavow them. You can try and remove them completely. If you still want the traffic coming through, you can nofollow the link. Or if it’s more for an advertising campaign, you can noindex the page it’s pointing to, just to show Google your intentions. So if you’re disavowing links that you don’t want, it future-proofs your link profile against any future link spam updates. So something I’d definitely suggest doing.
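The disavow file itself is just a plain text upload to Google's Disavow Links tool: one URL per line, `domain:` to disavow an entire domain, and `#` for comments. For example (the domains here are placeholders):

```text
# Disavow file - domains below are placeholders
# Contacted site owner on several occasions, link not removed
domain:spammy-directory.example
http://link-farm.example/widgets/page.html
```

Disavowing a whole domain is usually safer than listing individual URLs when the entire site is low quality, since link farms tend to duplicate the same link across many pages.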
The next thing I put is content. This is the icing on the cake from a future-proofing point of view, because if you stay on top of all the other features and you’re creating good content for your users, it shows you’re a credible source within your industry. Content marketing can take many forms: it could be an infographic, a video like this one, a press release, an FAQ page, a glossary, or a guest post on a blog. So there are loads of different ways you can do content marketing. But if you’re creating content for your users, getting good referral traffic and promoting it through social media, you’re showing Google and users that you’re a credible source, and you’re not going to get penalised if you show this authority with content marketing.
Last but not least, I’ve put down tools. Tools are an excellent way to stay on top of future-proofing your SEO campaign. Google Webmaster Tools will help you spot technical issues, the latest links to your website, and meta issues. So it’s a really good tool.
Another good tool that’s worth using is one called Panguin. You can link this up to your Google Analytics account, which will show you where traffic drops in relation to an algorithm update. So another really useful tool there. Obviously, the link tools I talked about before, Majestic SEO, Link Detox, Link Risk, Open Site Explorer are really good for checking your links. So if you can use tools regularly, you can stay on top of future-proofing your website.
From a content point of view, Screaming Frog will let you know how many characters and how many words are on the page. So Screaming Frog is a free tool you can use there. To check duplicate content, you could use CopyScape, which checks whether your content’s been duplicated anywhere else on the web. So another good tool there.
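Under the hood, duplicate-content detection of the kind CopyScape does at scale can be approximated by comparing word "shingles" (overlapping runs of words) between two texts. A very rough sketch, with the 5-word shingle size as an illustrative choice:

```python
def shingles(text, size=5):
    """Return the set of overlapping `size`-word runs in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap_ratio(a, b, size=5):
    """Fraction of a's shingles that also appear in b (0.0 to 1.0).
    High values suggest a has been copied from, or into, b."""
    sa, sb = shingles(a, size), shingles(b, size)
    if not sa:
        return 0.0
    return len(sa & sb) / len(sa)
```

A ratio near 1.0 means one text is essentially a copy of the other; anything substantially above zero between your page and a stranger's page is worth investigating.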
But just to summarise, if you do all of this together, you will be able to future-proof your campaign as much as possible. So metadata: make sure it’s not duplicated, and make sure it’s not too long or too short. Iron out any technical issues, like broken links, and make sure you have a proper 404 page. Stay on top of any technical issues like that.
Make sure you have at least 250 words on the page. That will stop any thin content warnings or stop you getting caught up in Panda updates. If you have a blog, link to a credible author. This will show that the content is credible, and you’re less likely to get held back from a penalty point of view.
I can’t stress this one enough: detox your links regularly. If you’re showing Google your intentions, you’ve got to take responsibility for your backlinks. So show Google which links you built yourself and which ones you’ve picked up and want to discount. Disavow links, and remove them where possible. Basically, show Google the ones you’re proud of and flag the ones that are only there from an SEO point of view. If you’re giving Google these indicators on your link profile, you’re showing your intentions, and you’re less likely to get caught up in an update like Penguin or a link spam update.
Last but not least, obviously, content marketing. Once you’ve done all the others, if you’re creating good content, it makes you credible. You’re getting social signals for that content, and referral traffic. So if you’re creating content for your industry, it shows you’re an authority and, overall, you’re just less likely to get hit by any updates. And by using all these tools, you can check these things regularly.
So metadata, stay on top of it; iron out technical issues; produce good content that’s a decent length; and detox your links as much as possible. If you do manage to do these regularly and check them weekly or monthly, you’ll be able to future-proof your website, protect your rankings, and you’re less likely to get penalised in a month’s or a year’s time.
So thank you for watching. If you do want any more information on how to future-proof your SEO, you can visit Koozai.com.