
10 Easy SEO Mistakes


We all learn from our mistakes, so here are ten common SEO mistakes you may have come across and that we can all learn from.

1. Building a site in Flash

It looks nice, but search engines can’t read it, so make sure your site doesn’t rely entirely on Flash. If you do use Flash, try to have an HTML alternative version of your site too.

2. Relying on JavaScript

Don’t rely too heavily on JavaScript, as search engines can’t read this either. If you have a menu that uses JavaScript, it is important to include an alternative navigation, such as a sitemap or a row of sitelinks, that includes all the key pages and is accessible from every page. This will help users navigate your site if they have JavaScript disabled, as well as link all your pages internally so that search engines can find them.
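
As a rough sketch of the alternative navigation idea (the page names here are just placeholders), a plain HTML block like this can be crawled even when a JavaScript menu can’t:

<!-- Plain HTML fallback navigation, crawlable without JavaScript -->
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/contact/">Contact</a></li>
    <li><a href="/sitemap/">Sitemap</a></li>
  </ul>
</nav>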

3. Canonical issues

When the same page can be accessed via multiple URLs, it appears as duplicate content in the eyes of the search engines. The most common occurrences of this are:

www.example.com

example.com/

www.example.com/index.html

example.com/home.asp

It often happens because of a CMS setting, or simply a developer not being aware of the issue. Depending on how your website is set up, there are several ways to resolve canonical issues.
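
One common fix (shown here as a sketch, using the same example domain as above) is to tell search engines which version of the page is the preferred one with a rel="canonical" link in the page’s head, often combined with 301 redirects where your server set-up allows:

<!-- Placed in the <head> of every duplicate version of the page -->
<link rel="canonical" href="https://www.example.com/" />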

4. Mis-using robots.txt

It’s amazing what one tiny / can do. Many people search for a standard robots.txt file which looks like this:

User-agent: *
Disallow:

But it often gets confused with:

User-agent: *
Disallow: /

which will block your entire site from being visited by any robots. Robots.txt should only be used if there is a need, for example to direct a search engine to your sitemap, or to stop it crawling admin pages or forms. If there is no real reason to use a robots.txt file for your site, don’t just put one on there anyway.
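
For example, a robots.txt file that points crawlers at your sitemap and keeps them out of an admin area might look something like this (the paths shown are placeholders, not taken from any real site):

User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml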

5. Mis-using noindex tags

There might be a time you need to use a noindex tag, for example if you don’t have access to put a robots.txt file on the root of your domain. As with a robots.txt file, the danger here is that you can accidentally block your entire site from search engines. If you put the tag on the wrong page, or on a page template which is then used for other pages, it too can block important parts of your site from being indexed.
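
For reference, a minimal sketch of the tag itself, which should sit in the <head> of only the pages you want kept out of the index:

<!-- Keeps this page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />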

6. Choosing a CMS with limited SEO functionality

They are great for clients but can make things difficult for SEOs. Some CMSs have awkward limitations which mean they don’t allow heading tags, alt tags or other core SEO elements to be edited. As mentioned in the canonical issues section, some even create duplicate content pages. If a client is planning a website redesign, try to research the CMS before they commit to it, or recommend one you already know works well.

7. Letting your development site get indexed

Web developers often build new sites in a live environment, which effectively means they are live web pages that can be indexed by Google. If the development site has the same content that will eventually feature on the main site, this could cause serious duplicate content issues. To avoid this, ask whether the developer can add password protection to the development site, which should ensure it doesn’t get indexed accidentally. If this isn’t possible, use a robots.txt file to block all robots from accessing any part of the development site. Just make sure the developer doesn’t then move that robots.txt file to the main site when the changes go live!

8. Putting keywords and headings in images

It might look nice to have your headings as images, but search engines can’t read them, so try not to do this. If you have to, include the same heading in standard text near the top of the page so it can still be given an h1 tag. If images have to be used for titles, make sure you optimise them in all the usual ways.
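
A rough sketch of that approach (the filename and wording are invented for illustration): keep the heading as real text in an h1 tag and give the image a descriptive alt attribute:

<!-- Real text heading that search engines can read -->
<h1>Winter sale on garden furniture</h1>
<!-- Decorative banner image with a descriptive alt attribute -->
<img src="winter-sale-banner.png" alt="Winter sale on garden furniture banner" />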

9. Spelling mistakes!

Yes, it sounds silly, but if you are trying to rank for a term, you need to make sure you have spelt it correctly in the first place. Having alternative spellings on a page can help if they fit in naturally, for example where a word can sometimes be hyphenated or split into two words.

10. Not thinking far enough ahead

Many businesses rely on the seasonality of events or occasions. It can take time for pages to rank, so it really helps if your content is live before other sites publish content on the subject. For example, if you are writing about your local Christmas fair, contact the community, gather information about it, and write it up a couple of months in advance. This may seem a bit keen, but it will help you rank for search terms related to the event before other people write about it. People are also more likely to link to you if you are the only source of information on the subject at the time.

Responses

  1. Danzo

    In #2, I think you mean “javascript” not Java.

    “Java is to JavaScript as ham is to hamster.”- Jeremy Keith

    Frankly, I stopped reading at that point. Anyone who confuses Javascript with Java simply cannot be an authority on anything web related.

    1. Mike Essex

      Thanks for the feedback Danzo. We’ve updated the text now.

  2. LinkMasters4All

    For a site to be successful, it should be simple and easy to navigate. I too personally don’t like Flash Sites.
