
Laura Phillips

The Importance of Technical SEO in 2015

16th Dec 2014 SEO 9 minutes to read



It’s that time of year when the world of search is basted with a heavy brush of predictions ready for the next 12 months. Curious marketers gather round to share these stories, and recount them to others. If there’s one thing I have learned during 2014 it’s that no one can be certain what 2015 holds for anyone involved in SEO.

However, in my humble opinion, there are a few areas that are standing out as requiring extra special attention over the coming months. One of these is technical SEO. Sounds scary, huh? It’s not that bad. Here’s a few bits to look at to get you started…

Site Speed

I prattle on about site speed all the time with my clients. It’s a funny one as there are various opinions on the importance of site speed as a ranking factor. I still see it as not only an important ranking factor, but even more importantly, a fundamental element of a good user experience.

We have such short attention spans these days, not to mention barely a nano-second of patience when searching online. If your site is too slow people will leave and go to your competitors, within seconds, regardless of where you rank.

Here are a few speed-related facts, as relayed by SEJ:

  • According to a case study from Radware, 51 percent of online shoppers in the U.S. claimed that if a site is too slow they will not complete a purchase.
  • Radware also discovered in another study that the demand for loading speed has increased over time. For example, in 2010 a page that took six seconds to load saw a 40 percent drop in conversions. In 2014? That same loading time suffered a 50 percent drop.
  • Research has found that 47 percent of web users expect a website to load in under two seconds.
  • During peak traffic times, 75 percent of consumers are willing to visit competitor sites instead of dealing with a slow-loading page.
  • Besides making visitors happy, having a website that loads quickly is good for business. In fact, Strange Loop has stated that just “a one second delay can cost you 7 percent of sales.”

Flash

If you’re building a new site and factoring in SEO as you should be, try not to use too much Flash. Flash is poorly indexed, so avoid using it for content, and especially for any form of navigation. You may find some SEO/designer clashes here, but there is always a solution for two such important elements.

Google have recently started a global roll out of Flash warnings on mobile devices.

This is potentially a traffic and conversion killer for your site. It’s time to get on the HTML5 train if you haven’t already. Bits of Flash are OK, but avoid it in SEO-critical areas: spiders cannot read Flash, so as far as the search engines are concerned the content may as well not be there. HTML5 is SEO-friendly, and Google et al can read and index content written in HTML5.

The below link favours neither, but it’s a great little tool to compare them by building a Flash and HTML5 version of the same game. You play the game in both, and decide which version you prefer.

Flash vs HTML5

It is a similar story for iFrames, so apply the same set of rules.

Mobile Optimisation

Always ‘on the list’ and waiting to be started, mobile optimisation or a responsive site is fast becoming a must.

Recently, for the first time, mobile traffic overtook desktop traffic. We all knew it would happen, but now it is here. The majority of users will use more than one device to search, consider and purchase, often within a single 24-hour period.

To use my most hated cliché: you need to ‘be where your users are’. This doesn’t just mean being on social platforms or running remarketing; you need to be usable on mobile. Even if your site is specifically not designed to convert on mobile, users need to be able to use it and find information that keeps them on the path to conversion, instead of getting frustrated and seeking out an alternative.

Equally, Google see having a non-mobile optimised site as providing a lower quality experience to the user, and they don’t like that. If you want to see your rankings improve, you need a mobile optimised site. Google have a great little guide in their Developer section about creating mobile friendly websites which you can visit here.
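One quick, low-effort check: responsive pages almost always declare a viewport meta tag. Without it, mobile browsers render the page zoomed out at desktop width, no matter how good the rest of your styling is. A minimal example looks like this:

```html
<!-- Placed in the <head>: tells mobile browsers to render the page
     at the device's width rather than a zoomed-out desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

This tag alone doesn’t make a site responsive, but its absence is a reliable sign that a site hasn’t been optimised for mobile.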

Meta Keywords


Something many seem to find hard to let go of, Meta keywords are like an SEO security blanket. Once a great way to spam a site to the edge of reason, they are now a cheap, horrible call-out to Google, shouting your spammy intentions like the teacher’s pet in the playground. Get rid of them, all of them, before they start telling tales on you to Big G.

Structured Data

Structured data is essentially data about data: markup that tells search engines what a piece of content means, not just what it says. Google and the other major search engines got together a while back to create a common ‘language’ they could all recognise for this kind of data. The result is the Schema.org vocabulary, which can be expressed in formats such as microdata, microformats and RDFa.

Initially no one really took notice of these tags, possibly not seeing the point at the time. However, with the advent of Penguin and Panda, another round of ‘content is king’ shrieks, Google gave us a gentle prod and reminded us that these tags were available for us to use, to clarify certain information.

They even provided a mark up helper and testing tool, both of which you can find in Webmaster Tools. Just log in then click this link.

Have a look at Schema.org too, it provides a great library of schemas to cover a wide range of information you may want to add to your site.

For example: <meta itemprop='productID' content='isbn:123-456-789'/>.
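To see how these properties hang together, here is a minimal, hypothetical product marked up with Schema.org microdata (the names and values are illustrative only):

```html
<!-- Hypothetical example: a product with a nested offer,
     using Schema.org's Product and Offer types -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <meta itemprop="productID" content="isbn:123-456-789"/>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <meta itemprop="priceCurrency" content="GBP"/>
    <span itemprop="price">19.99</span>
  </div>
</div>
```

The itemscope/itemtype attributes declare what kind of thing is being described, and each itemprop labels one piece of information within it.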

Canonical Sites & Duplicate Pages

You might be surprised how many sites have canonicalisation issues, and how much duplicated content is out there!

Websites which can be accessed using both www and non-www URL addresses, or on other duplicated formats, can damage their ranking abilities, plus it’s not great housekeeping.

When a single page can be accessed via any one of multiple URLs, a search engine assumes that there are multiple unique pages. What it finds when it accesses those pages is quite the opposite: page upon page of duplicated content, a whole site’s worth in most cases. This rings alarm bells for spiders, who see not only huge amounts of duplicated content, but have no way of telling which version they should rank.

Worst of all, you are splitting your site’s authority, creating two or more weak sites instead of one strong, authoritative site.

There are a couple of ways to sort out canonicalisation.

Canonical Tag

If you want http://www.thisisacanonicalsite.com/example/ to be the preferred URL out of all those being indexed, you can tell search engines this by marking up the page with the rel="canonical" tag as follows:

Add a <link> element with the attribute rel="canonical" to the <head> section of these pages:

<link rel="canonical" href="http://www.thisisacanonicalsite.com/example/" />

This indicates the preferred URL to use to access the example post, so that the search results will be more likely to show users that URL structure.

XML Sitemap

By picking your preferred URL and adding it to your XML sitemap, search engines are more likely to use that one, but it is not guaranteed.

301 Redirects

Of all the different ways to access your page, you could pick the strongest and redirect the others to it.

For example, if your page is accessible via:

http://www.thisisacanonicalsite.com/example/

http://thisisacanonicalsite.com/example/

http://www.thisisacanonicalsite.com/example

…you could use a 301 redirect to channel the weaker versions to the strongest one, passing on a little authority at the same time.
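If your site runs on Apache, one common way to set this up is with mod_rewrite rules in an .htaccess file. This is a sketch only, assuming mod_rewrite is enabled and using the example domain above:

```apache
# Sketch: 301-redirect non-www requests to the www version
# (assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^thisisacanonicalsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.thisisacanonicalsite.com/$1 [R=301,L]
```

The R=301 flag is what makes the redirect permanent; without it, Apache defaults to a 302.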

A Note from Google on Canonical URLs

“While we encourage you to use any of these methods, none of them are required. If you don’t indicate a canonical URL, we’ll identify what we think is the best version or URL.”

  • Don’t use the robots.txt file for canonicalization purposes.
  • Don’t use the URL removal tool for canonicalization: it removes all versions of a URL from search.
  • Don’t specify different URLs as canonical for the same page (e.g. one URL in a sitemap and a different URL for that same page using rel="canonical").

Personally, I recommend using 301 redirects; never use a 302 for this sort of thing. A 302 is designed to be temporary, whereas a 301 is a permanent redirect.

Site Structure, Dynamic URLs/Session IDs

The best tool to use when structuring your site is common sense. No, really. What is good for the user is good for the search engine. Keep site architecture simple, follow logical paths in as few steps as possible, with clean URLs, and nothing to stop spiders in their tracks.

Don’t cram objects onto a page until the page and/or URL no longer makes sense; make more pages instead.

Making a site quick and easy to use helps create a good user experience, and a good search engine spider experience.

XML Sitemap

Bit of a basic one, but still often overlooked, make sure your site has an up to date XML Sitemap.

The sitemap.xml file is used to guide search engine robots through the site, telling them what the site is about and how it is organised. This helps search engines better place you in user searches. By maintaining a good site structure and minimising broken links, web crawlers should be able to discover most or all of your site. A poorly organised or maintained site will not be crawled as intelligently, and it is likely that fewer pages will be indexed. You can find out more about XML sitemaps here, and create your own here.
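For reference, a minimal sitemap.xml looks like the following (the URL and dates are illustrative, reusing the example domain from earlier):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page; lastmod and changefreq
       are optional hints to crawlers -->
  <url>
    <loc>http://www.thisisacanonicalsite.com/example/</loc>
    <lastmod>2014-12-16</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Only include the canonical version of each URL here, for the reasons covered in the canonicalisation section above.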

Robots.txt File

This file tells search engine robots which parts of the site to crawl and which parts to exclude, distinguishing between what should be visible to the public and what is only for the managers of the site. A search engine robot will crawl the entire website except, supposedly, the pages it is told to avoid via the robots.txt file. If, somehow, you don’t have one, you can find out more here.

You probably do have one, but when was the last time you checked or adjusted it to accommodate changes to your site? The robots.txt file is another bit of housekeeping to remember to stay on top of as your site grows and changes.
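As a refresher, robots.txt is just a plain text file in the site root. A minimal, hypothetical example (the disallowed path is illustrative):

```
# Applies to all crawlers; keep a hypothetical admin area out of the index
User-agent: *
Disallow: /admin/

# Point crawlers at the XML sitemap
Sitemap: http://www.thisisacanonicalsite.com/sitemap.xml
```

Remember that robots.txt is advisory, not a security measure: well-behaved crawlers respect it, but it won’t keep a page private.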

Images

Help Google et al to work out more about your site by correctly applying alt attributes and good image descriptions. Remember, search engines can’t ‘see’ your images, you need to give them text to work with, such as the below example from Boostability:

This code:

<img src="userphoto/mono-cat.jpg" alt="Funny Animal Picture - Monorail Cat" width="68" height="76" class="photo" />

Means search engines see this instead of nothing at all:

Alt Attributes

Do this throughout your site and keep attributes and descriptions updated.

 

There is so much more to the technical side of SEO, but this checklist should be a good starting point for getting your house in order. What other technical aspects do you think will be important in 2015? Let me know in the comments section below.

For more information on technical SEO, speak to one of our experts today.

Image credits:

Images by BigStock Images.

About the author

Laura Phillips

Laura has experience of SEO, PPC and Social Media both in-house and within an agency environment. Having worked across a variety of industries from travel to law, and retail to education she is always looking for new and innovative ways to improve the search and social visibility of her clients across various platforms.
