
SMX London 2011 – Day 2 Recap


by Samantha Noble on 19th May 2011

After the closing sessions on day 1, I was looking forward to getting stuck back into day 2 of the SMX London conference. The first session of the day covered a topic I really enjoy and find interesting: Search Analytics and Competitive Intelligence. The premise was that there are so many tools and so much data available to us that it can often become overwhelming and unusable.

David Sottimano from Distilled and John Straw, the founder of Linkdex, took two different approaches to engaging the audience during this session.
David was up first; it was the first time I had seen him present and I was very impressed. He had some really good tips and came up with unique ideas I had not heard anywhere before.

One thing that I hear very frequently in the office is that clients give you a list of who they believe their competitors are, but there is a big difference between competition online and offline. This was the key starting point of David’s presentation: if a client names competitors who are really only offline competitors, you must tell them and reset the list to their actual online competitors. This can take some explanation, but it is worth it, trust me :)

The key takeaway from this entire session was the Competitor Discovery Tool which David has compiled using Google Docs – It is available for you to download here – http://dis.tl/smx-london

Basically, you enter the keywords you or your client want to rank for and it will surface all the competitors in that space, along with how many pages they have ranking in the SERPs across all queries. You can then scan all of a competitor’s ranking URLs to see which content drives the majority of their rankings (blog, articles, FAQs etc.) and replicate the method.

A calculation for gauging the importance of a competitor’s URLs is below:

(number of position-1 rankings × 40% weight) + (number of position-5 rankings × 8% weight) = your competitor domination score

You then go back to your client or boss and explain why they are not ranking and what they need to do to compete. Genius tool, try it out!
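As a minimal sketch, David’s domination calculation looks like this in Python. Only the two weights mentioned in the session (40% for position 1, 8% for position 5) are used; treating every other position as zero weight is my own assumption for illustration.

```python
# Sketch of the competitor domination score. Only the position-1 (40%)
# and position-5 (8%) weights were given in the session; any other
# ranking position is assumed to score zero here.

POSITION_WEIGHTS = {1: 0.40, 5: 0.08}

def domination_score(ranking_positions):
    """ranking_positions: SERP positions a competitor holds across
    your target keywords, e.g. [1, 1, 5, 3]."""
    return sum(POSITION_WEIGHTS.get(pos, 0.0) for pos in ranking_positions)

# Two #1 rankings and three #5 rankings:
print(domination_score([1, 1, 5, 5, 5]))
```

Comparing scores across competitors for the same keyword set shows who dominates the space.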

John Straw, founder of Linkdex, one of the newest link analysis tools, was up next. We have been using the tool at Koozai for a few months now and can definitely vouch for it as one of the neatest tools in the industry at the moment.

One of the questions John asked at the start of his presentation was ‘How many of you are analysing competitors but do not have the time to action the data?’ Approximately 80% of the attendees put their hands up.

If you do fall into that category, I can strongly recommend Linkdex as a tool.

 

The second session of the day was ‘Keyword Research Ninja Tactics’ and the panel consisted of Richard Baxter from SEOgadget, Christine Churchill from KeyRelevance, Lasse Clarke Storgaard from MediaCom and Kevin Gibbons from SEOptimise.

Richard was up first, and the key issue he set out to address is that keyword research produces so much jumbled data that it becomes hard to manage and decipher.

He recommended grouping the keywords into categories so that you can easily add filters in Excel to uncover core phrases to target. The example below is how Richard categorised keyword research for a car dealer:

  • Action – Buy, Find
  • Condition – Used, New
  • Brand – Audi, BMW
  • Model – 1 series, TT
  • Colour – Blue, Red
  • Location – London, Southampton

This data can then be graphed to identify which search groups to go after. A number of Excel formulas were given away during the session, and a link to the spreadsheet that helps you pull together all your keyword research data will be added to this post in 7 days’ time.
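The same categorisation-and-filter idea can be sketched outside Excel too. Here is a hedged Python version using the car-dealer categories from the list above (the term lists are just the examples given, not a complete taxonomy):

```python
# Sketch of Richard's keyword-categorisation approach: tag each keyword
# with the categories it contains, so it can be filtered like an Excel
# column filter. Term lists mirror the car-dealer example above.

CATEGORIES = {
    "action":    {"buy", "find"},
    "condition": {"used", "new"},
    "brand":     {"audi", "bmw"},
    "model":     {"1 series", "tt"},
    "colour":    {"blue", "red"},
    "location":  {"london", "southampton"},
}

def categorise(keyword):
    """Return the categories (and matched terms) present in a keyword."""
    kw = keyword.lower()
    return {cat: sorted(t for t in terms if t in kw)
            for cat, terms in CATEGORIES.items()
            if any(t in kw for t in terms)}

print(categorise("buy used BMW London"))
```

With every keyword tagged this way, filtering for, say, all "buy + location" phrases becomes a one-line list comprehension.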

SEOgadget have made a great keyword tool which you can trial free for 14 days and which is well worth a look – https://tools.seogadget.co.uk/

Christine Churchill, the President of KeyRelevance spoke about all the different keyword research tools that are out there and I have selected some of her favourites to share here.

AdWords Keyword Tool – A good source of data, but it changed last September and now only pulls in data from Google rather than from other channels as it did before. Christine made a good point about using the tool for keyword research: always segment out mobile traffic to get accurate traffic estimates for the web.

Google Trends – Use to compare top terms to see which is most active to prioritise

Google Insights – Should always be used to understand seasonality of a product or service before you begin to optimise

Keyword Discovery – Includes misspellings and seasonal data. It pulls in a lot of data from other sources and displays it in one screen, which can be very useful.

Google Suggest/Instant – Shows the top 10 keywords that are being searched for. Use on the standard web results BUT also search on Google Images and Shopping for different results.

Ubersuggest – Basically does the above Google Suggest searching for you. Type in a phrase and it will pull in all the Google Suggest results for that phrase, plus every variation created by appending the letters A–Z after it.
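The A–Z expansion Ubersuggest automates is trivial to reproduce yourself. A small sketch (the seed phrase is a made-up example; actually querying the suggest endpoint is left out):

```python
# Sketch of the Ubersuggest-style expansion: take a seed phrase and
# generate the a-z suffixed variants you would feed to Google Suggest.
import string

seed = "car insurance"  # hypothetical seed phrase
queries = [seed] + [f"{seed} {letter}" for letter in string.ascii_lowercase]
print(queries[:3])  # ['car insurance', 'car insurance a', 'car insurance b']
```

Each generated query would then be sent to the suggest service and the returned suggestions collected into your keyword list.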

YouTube Suggest – Data is pulled from the YouTube database, which is huge and should not be forgotten. Search in the same way as recommended for Google Suggest. YouTube also has a good keyword tool.

Google Wonder Wheel – Provides you with a visual representation of your keywords

SEOmoz Keyword Difficulty – See how difficult it would be to go after a keyword before investing too much time

Lasse Clarke compared keyword research to shopping at a local supermarket – an unusual analogy, but it makes perfect sense:

Shoppers looking for products on the top shelf – RESEARCHERS
Shoppers looking for products on the middle shelf – CONSIDERERS
Shoppers looking for products on the bottom shelf – PURCHASERS

These groups of people can then be broken down into percentages:

  • Researchers – 60% – 80%
  • Considerers – 10% – 15%
  • Purchasers – 6% – 8%

Use this theory when you are doing keyword research to segment the keywords based on the searcher’s intent.

Kevin Gibbons from SEOptimise gave a fantastic presentation on targeting long-tail phrases. We have all read so much about targeting the long tail, but this presentation gave away great tips and advice that anyone could action.

The first statistic Kevin opened with was that 20% – 25% of Google queries have never been searched for before. That is a huge amount when you consider how many searches are done each day, and it highlights the potential reach if you target the long tail. Here are the ten tips he gave away:

  1. Find common search trends using Google Trends and ask the common questions: Who, What, Where, When and How.
  2. Answer FAQs within your niche. Use Google Suggest to see what questions are being asked and answer each one by writing a blog post on your site.
  3. Pick out popular themed keywords – Richard Baxter’s categorisation recommendation will help with this.
  4. Use the Custom Segments in Google Analytics to analyse the long-tail phrases your site is already attracting. He suggests that you should only look at keywords of four words or more.
  5. Use PPC and look at your Impression Share data to see the potential of the phrases. Remember to use broad and phrase match rather than exact.
  6. Use a variety of tools to verify results. Kevin recommends Wordtracker Strategiser and Experian Hitwise.
  7. Estimate average click-through rates for each ranking position using Google Webmaster Tools. Sum up the impressions and clicks for each position, or look for reports on Econsultancy or SEOmoz that research organic CTR by position.
  8. Use AdWords Keyword Tool and export to Excel to predict traffic values.
  9. Filter your keywords into themed groups, as also mentioned in Richard’s presentation.
  10. And the main tip from Kevin: ‘Don’t overthink it and go too long-tail, as you will miss out significantly’
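Tip 7 above (summing impressions and clicks per position to estimate organic CTR) can be sketched in a few lines of Python. The rows below stand in for a Google Webmaster Tools query export and are entirely made-up illustrative figures:

```python
# Sketch of tip 7: estimate average organic CTR per ranking position by
# aggregating clicks and impressions from a Webmaster Tools export.
from collections import defaultdict

# (position, impressions, clicks) per query -- hypothetical sample data
rows = [
    (1, 1000, 320), (1, 500, 180), (2, 800, 120), (3, 600, 54),
]

totals = defaultdict(lambda: [0, 0])  # position -> [impressions, clicks]
for position, impressions, clicks in rows:
    totals[position][0] += impressions
    totals[position][1] += clicks

ctr_by_position = {pos: clicks / impressions
                   for pos, (impressions, clicks) in sorted(totals.items())}
print(ctr_by_position)
```

Multiplying these per-position CTRs by a keyword’s estimated search volume then gives a rough traffic forecast for each potential ranking.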

 

After lunch, we were greeted with a panel of four from various areas of digital marketing to tackle the subject of dealing with Google penalties and suspension. Panel speakers were Craig Danuloff, Mikkel deMib Svendsen, Craig Macdonald and Michael Wyszomierski.

Michael is a Project Manager at Google and it was very interesting to hear this subject discussed by Google themselves. The key takeaway for me was about understanding whether you have been penalised or suspended; Google are putting more and more effort into informing webmasters about what has happened to their site. They are tackling this by sending messages to the website’s Google Webmaster Tools account to let you know what you have done wrong.

If you have not been removed from the index altogether then it is likely that you have just been filtered from some of the results. Things to look into to fix this:

  1. Check Google Webmaster Tools to see when the site was last crawled
  2. Visit and ask questions on the Google Webmaster Tools help forums
  3. Check your cache date
  4. Check for duplicate content and fix it (NB – no penalty is currently given for duplicate content, but it should still be corrected)

If you have been penalised by Google you will need to submit a reconsideration request. You will know you have been penalised by the message you see in Webmaster Tools. If the message mentions ‘violation’, make sure that you:

  1. Read it
  2. Go to the Webmaster Tools forum if you don’t understand it
  3. Only file a Reconsideration Request AFTER you have cleaned everything up

Some tips for submitting a Reconsideration Request:

  • Check that it is not a previous violation issue
  • Do not try to hide any information, be totally honest
  • Do not submit multiple requests at the same time
  • Tell Google about the violation and what you have done to fix it
  • If you have been unable to remove all the bad links, send a list of the sites you could not get links removed from and explain that you have tried
  • Be patient after submitting a request; they are checked manually

Mikkel was up next giving examples of some of the reasons why you may feel like you have been penalised.

Penalisation is not the same as filtering and you need to be clear on which is affecting your site. Only around 1% of sites suffer manual or automatic penalisation; the rest are affected by filtering.

Keep up to speed on algorithm changes or updates and ensure that you are continuing to optimise for people rather than search engines! Changes to your website can cause drops in rankings whilst Google re-indexes the site, so be patient and don’t panic.

The audience were also provided with some other questions to ask if rankings and traffic drop:

  • Has your web server slowed down?
  • Did you make changes to your site?
  • Did you add, amend or remove any code?
  • Have you changed your URL structure?
  • Have your page templates been updated?
  • Any change in your publishing system?
  • Has your bounce rate or other quality factors changed?
  • Did your site get hacked? Scarily, this is becoming more and more common

Craig Macdonald from Covario followed Mikkel and reminded us of what we should be doing to ensure we stay ‘friends’ with Google.

  • Clean up your content
  • Check your site for content duplication regularly
  • Clean up your link profile
  • Try to get your keyword in the URL
  • H1s still matter and should not be forgotten
  • When link building, continue to think quality not quantity
  • Page load time remains important and should be reviewed constantly

The final panellist for this session was Craig Danuloff from ClickEquations Inc and he had a different take on the subject and covered Paid Search penalisation, mainly Quality Score. The presentation was focused on how Quality Score can damage an account and the different ways to improve it. Some pointers are:

  • Split keywords into more targeted ad groups
  • Continuously optimise ad copy and creative
  • Ensure ad text matches the keywords
  • Send traffic to a relevant landing page
  • Add a privacy policy to your site
  • Populate landing pages with targeted Meta tags

Social Signals & Search was up next, and this was by far one of my top sessions of the two days – loads of insightful information and theory behind the data. Four people were on the panel again: Bas van den Beld from State of Search, Cedric Chambaz from Microsoft, Marcus Taylor from SEOptimise and Jim Yu from BrightEdge.

Bas was first and his presentation skills and content were fantastic. The whole audience was engaged, and I really think everyone took good pointers away from his presentation.

He talked mainly about how we should constantly be looking at ways to connect search and social together. Taking Google as an example, they are now indexing all sorts of social elements in their results:

  • Blogs
  • Flickr
  • Twitter
  • Facebook
  • Etc

Search engines are building on the whole social search experience by using data pulled from mobile search, personal search and local search. Bas had a theory that Google introduced the +1 feature not to compete with Facebook or to figure out which sites are popular and which aren’t, but to gather more data on searchers. When you hit the +1 button, they know what sort of sites and web pages you like and which you do not.

A few other things Bas covered that not everyone may be aware of:
Google News – you can now tell it what sort of news you want. Again, the search engine will use this data to build up your social profile
Twitter Profiles – no longer being indexed. If you search for a name, it is the tweets that appear rather than the whole profile
Google Profiles – private profiles will become redundant by 31st July 2011, after which all profiles will be public

Marcus Taylor from SEOptimise followed Bas; this was his first time presenting at a conference. The feedback he got on Twitter throughout the session was awesome and should boost his confidence for next time. A natural presenter :)

He shared the results of a test he ran to find out how important Facebook likes are for search. He took two unindexed domains, one two weeks old and the other two years old. Both had zero backlinks and no ability to ping Google (a setting common in WordPress). He then progressively added likes to each domain.

The Result
0 visits from Googlebot
0 pages indexed in Google

The conclusion is that you cannot ‘directly’ influence rankings with Facebook likes, BUT Facebook can drive a lot of quality traffic. That traffic can then produce links, which do impact rankings. So Facebook likes are important in an indirect way.

Jim Yu from BrightEdge covered the importance of integrating social media on websites. His three must-haves are:

  1. Social Bookmarking option
  2. Links to main social profiles
  3. Add the ‘share like’ button on content

One tip I thought was very good and actionable is to build a parallel social media structure and interlink each profile with one another. Take Cadbury, for example: they have their main Facebook page plus lots of other pages for their different chocolate products.

Cedric Chambaz from Microsoft showed some of the things they are doing to integrate Search and Social together.

www.bing.com/social returns live search results showing what is being talked about on Facebook and Twitter right now

www.bing.com/maps can be used to search for a brand and see what people are saying about you and where they are located

His closing point was that everything you do in Social will eventually appear in the Search results AND what happens in Search can influence your Social Media performance.

Well, that is my recap of the presentations I attended on day 2 of the SMX London conference. I am looking forward to going back next year, catching up with everyone I met this year and seeing how far the digital marketing space has moved on in a year.


Samantha Noble

Samantha Noble is the Marketing Director at Koozai; having worked within the marketing industry for over nine years, Sam has a plethora of marketing knowledge. With a strong understanding of digital marketing techniques, Sam will be covering all aspects of search and the industry in general.
