It’s that time of year when the world of search is basted with a heavy brush of predictions ready for the next 12 months. Curious marketers gather round to share these stories, and recount them to others. If there’s one thing I have learned during 2014 it’s that no one can be certain what 2015 holds for anyone involved in SEO.
However, in my humble opinion, there are a few areas that stand out as requiring extra special attention over the coming months. One of these is technical SEO. Sounds scary, huh? It’s not that bad. Here are a few things to look at to get you started…
I prattle on about site speed all the time with my clients. It’s a funny one as there are various opinions on the importance of site speed as a ranking factor. I still see it as not only an important ranking factor, but even more importantly, a fundamental element of a good user experience.
We have such short attention spans these days, not to mention barely a nanosecond of patience when searching online. If your site is too slow, people will leave for your competitors within seconds, regardless of where you rank.
Here are a few speed-related facts, as relayed by SEJ:
If you’re building a new site and factoring in SEO as you should be, try not to use too much Flash. Flash is poorly indexed, so try not to use it for content or especially any form of navigation. You may find some SEO/designer clashes here, but there is always a solution for two such important elements.
Google have recently started a global roll out of Flash warnings on mobile devices.
This is potentially a traffic and conversion killer for your site. It’s time to get on the HTML5 train if you haven’t already. Small bits of Flash are OK, but avoid it in SEO-critical areas: spiders cannot read Flash, so the content may as well not be there as far as the search engines are concerned. HTML5 is SEO-friendly, and Google et al can read and index content written in HTML5.
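As a sketch of what that migration can look like, a Flash video embed can usually be replaced with the native HTML5 video element, which search engines can parse (the file names here are placeholders):

```html
<!-- Instead of an <object>/<embed> Flash player... -->
<video controls width="640">
  <!-- Hypothetical file names; supply your own encodes -->
  <source src="product-demo.mp4" type="video/mp4">
  <source src="product-demo.webm" type="video/webm">
  <!-- This fallback text is readable by spiders and older browsers -->
  <p>Your browser does not support HTML5 video.
     <a href="product-demo.mp4">Download the video</a> instead.</p>
</video>
```

Note that the fallback text and link inside the element give crawlers something to index even where the video itself cannot play.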
The below link favours neither, but it’s a great little tool to compare them by building a Flash and HTML5 version of the same game. You play the game in both, and decide which version you prefer.
It is a similar story for iframes, so apply the same set of rules.
Always ‘on the list’ and waiting to be started, mobile optimisation or a responsive site is fast becoming a must.
Recently, for the first time, mobile traffic overtook desktop traffic. We all knew it would happen, and now it is here. The majority of users will use more than one device to search, consider and purchase, and most of them will do so within a single 24-hour period.
To use my most hated cliché: you need to ‘be where your users are’. This doesn’t just mean social platforms and remarketing; you need to be usable on mobile. Even if your site isn’t designed to convert on mobile, users need to be able to navigate it and find the information that keeps them on the path to conversion, instead of getting frustrated and seeking out an alternative.
Equally, Google see having a non-mobile optimised site as providing a lower quality experience to the user, and they don’t like that. If you want to see your rankings improve, you need a mobile optimised site. Google have a great little guide in their Developer section about creating mobile friendly websites which you can visit here.
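If you are going responsive, the basics are a viewport meta tag plus CSS media queries. A minimal sketch (the breakpoint value and class names are just examples):

```html
<!-- In the <head>: tell mobile browsers not to render at desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Example breakpoint: simplify the layout on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
    .content { width: 100%; }
  }
</style>
```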
Meta keywords are something many seem to find hard to let go of. They are like an SEO security blanket: once a great way to spam a site to the edge of reason, now a cheap, horrible call-out to Google, shouting your spammy intentions like the teacher’s pet in the playground. Get rid of them, all of them, before they start telling tales on you to Big G.
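For the avoidance of doubt, this is the tag to hunt down and delete from your templates (the content shown is an invented example of the kind of stuffing it tends to attract):

```html
<!-- Remove this from your <head>; it does nothing useful any more -->
<meta name="keywords" content="cheap widgets, best widgets, buy widgets online">
```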
Structured data is essentially data about data, or information about information. Google and the other major search engines got together a while back to agree on common vocabularies and formats they could all recognise for marking up this kind of data, including Schema.org, microformats, microdata and RDFa.
Initially no one really took notice of these tags, possibly not seeing the point at the time. However, with the advent of Penguin and Panda, another round of ‘content is king’ shrieks, Google gave us a gentle prod and reminded us that these tags were available for us to use, to clarify certain information.
They even provided a mark up helper and testing tool, both of which you can find in Webmaster Tools. Just log in then click this link.
Have a look at Schema.org too, it provides a great library of schemas to cover a wide range of information you may want to add to your site.
<meta itemprop='productID' content='isbn:123-456-789'/>
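Building on the itemprop example above, a minimal product marked up with Schema.org microdata might look like this (the name, price and ISBN are illustrative only):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example SEO Handbook</span>
  <meta itemprop="productID" content="isbn:123-456-789">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="9.99">£9.99</span>
    <meta itemprop="priceCurrency" content="GBP">
  </div>
</div>
```

Run anything like this through Google’s testing tool before going live, as a misplaced itemscope can silently break the whole block.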
You might be surprised how many sites have canonicalisation issues, and how much duplicated content is out there!
Websites that can be accessed at both www and non-www URLs, or in other duplicated formats, can damage their ranking ability, and it’s not great housekeeping either.
When a single page can be accessed via any one of multiple URLs, a search engine assumes there are multiple unique pages. What it finds when it accesses those pages is quite the opposite: page upon page of duplicated content, a whole site’s worth in most cases. This rings alarm bells for spiders, which not only see huge amounts of duplicated content, but have no way of telling which version they should rank.
Worst of all, you are splitting your site’s authority, creating two or more weak sites instead of one strong, authoritative site.
There are a couple of ways to sort a canonical issue out.
If you want http://www.thisisacanonicalsite.com/example/ to be the preferred URL out of all those being indexed, you can tell search engines this by marking up the page with the rel="canonical" tag as follows:
Add a <link> element with the attribute rel="canonical" to the <head> section of these pages:
<link rel="canonical" href="http://www.thisisacanonicalsite.com/example/" />
This indicates the preferred URL to use to access the example post, so that the search results will be more likely to show users that URL structure.
By picking your preferred URL and adding it to your XML sitemap, search engines are more likely to use that one, but it is not guaranteed.
Of all the different ways to access your page, you could pick the strongest and redirect the others to it.
For example, if your page is accessible both with and without the www prefix, you could use a 301 redirect to channel the weaker URLs to the strongest one, passing on a little authority at the same time.
As Google themselves put it: “While we encourage you to use any of these methods, none of them are required. If you don’t indicate a canonical URL, we’ll identify what we think is the best version or URL.”
Personally, I recommend using 301 redirects, and never a 302 for this sort of thing. A 302 is designed to be temporary, whereas a 301 is a permanent redirect.
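On an Apache server, a non-www to www 301 redirect is typically handled in .htaccess along these lines (a sketch assuming mod_rewrite is enabled; swap in your own domain):

```apache
RewriteEngine On
# 301-redirect non-www requests to the www version, preserving the path
RewriteCond %{HTTP_HOST} ^thisisacanonicalsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.thisisacanonicalsite.com/$1 [R=301,L]
```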
The best tool to use when structuring your site is common sense. No, really. What is good for the user is good for the search engine. Keep site architecture simple, follow logical paths in as few steps as possible, with clean URLs, and nothing to stop spiders in their tracks.
Don’t cram objects onto a page until the page and/or URL no longer makes sense; make more pages instead.
Making a site quick and easy to use helps create a good user experience, and a good search engine spider experience.
Bit of a basic one, but still often overlooked: make sure your site has an up-to-date XML sitemap.
The sitemap.xml is used to guide search engine robots through the site and tell them what the site is about and how it is organised. This helps search engines better place you in user searches. By maintaining a good site structure and minimising broken links, web crawlers should be able to discover most or all of your site. A poorly organised or maintained site will not be crawled as intelligently, and it is likely that fewer pages will be indexed. You can find out more about XML sitemaps here, and create your own here.
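A minimal sitemap.xml looks like this (the URL and date are placeholders, and the optional changefreq element is only a hint to crawlers):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.thisisacanonicalsite.com/example/</loc>
    <lastmod>2014-12-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```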
This file tells search engine robots which parts of the site to crawl and which to exclude, distinguishing between what should be visible to the public and what is only for the site’s managers. A search engine robot will crawl the entire website except, supposedly, the pages it is told to avoid via the robots.txt file. If, somehow, you don’t have one, you can find out more here.
You probably do have one, but when was the last time you checked or adjusted it to accommodate changes to your site? The robots.txt file is another bit of housekeeping to remember to stay on top of as your site grows and changes.
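A typical robots.txt is only a few lines. For example, to keep all crawlers out of an admin and checkout area while pointing them at your sitemap (the paths here are illustrative):

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Sitemap: http://www.thisisacanonicalsite.com/sitemap.xml
```

Remember that robots.txt is advisory; well-behaved crawlers respect it, but it is not a security mechanism.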
Help Google et al to work out more about your site by correctly applying alt attributes and good image descriptions. Remember, search engines can’t ‘see’ your images; you need to give them text to work with, as an example relayed by Boostability illustrates. With a descriptive alt attribute in place, search engines see that text instead of nothing at all.
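To sketch the difference (the file names and wording are made up): without alt text a spider has only the file name to go on, while with it there is a usable description:

```html
<!-- Poor: nothing for a search engine to work with -->
<img src="img_0123.jpg">

<!-- Better: a concise, accurate description of the image -->
<img src="red-leather-satchel.jpg" alt="Red leather satchel with brass buckles">
```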
Do this throughout your site and keep attributes and descriptions updated.
There is so much more to the technical side of SEO, but this checklist should be a good starting point for getting your house in order. What other technical aspects do you think will be important in 2015? Let me know in the comments section below.
Images by BigStock Images.