
Anna Lewis

Review of SMX London Presentations – Part 1 #smxlondon 2012

16th May 2012 | News, Events


I’ve just spent the last two days at SMX London where I’ve learnt a lot, given a presentation and met lots of new faces. Here I’m going to summarise key points I’ve learnt from the sessions I attended.

The event started with a keynote from Amit Singhal, a Google Fellow, who discussed how he came to be at Google, how he rewrote the whole ranking algorithm when he joined and what Google is moving towards now.

I think his biggest takeaway was his explanation of Google working to understand ‘things’, not just keywords. A thing was described as being an entity, whether it’s a person, place, event or popular topic. This understanding moves the Googlebot away from being a robot and closer to being intelligent.

He also explained how Search plus Your World (SPYW) aims to show personal and private content without anyone else seeing it. This is the first step towards indexing everything you have in your universe, and it gives you control to remove personalisation. As search marketers we worry about personalisation, but Amit explained that personalisation will not take over the results: the user has control and diversity is still always there.

Other good things Amit said include:

Results should be ordered by relevance; personalisation is harder to order and test

As screen sizes change, the design will also have to change

Google have a large pool of human raters to test their updates to design

The search engine that can measure best what users like can provide the best results

Moving from link graph to knowledge graph

Direct answers are becoming more common, replacing pages of results

Search is hampered if you don’t know what things are; Google has to understand what words mean, not just see them as keywords

The New Periodic Table of SEO

This session saw talks from Marcus Tober and Marcus Tandler, with extra insights from Dave Naylor.

Marcus Tober from Searchmetrics provided some very clear graphs of data showing the correlation of ranking factors. Highlights included:

Social signals have a high correlation to rankings, as does the number of backlinks

Ads could be a problem for rankings, including AdSense, affiliate ads, etc.

Everyone is connected; social media knows about an earthquake or plane crash before anything else does.

Google+ has a very high correlation because of personal +1s.

Social is very important, it correlates well with high rankings

This doesn’t mean a tweet is a strong ranking factor, but it helps

Search engines do not look at the volume of tweets etc.; they look for quality and user signals

Social signals are user signals

Google know when you bounce

Why is social a signal? Because shares are recommendations. Links, the historically prominent ranking factor, were manipulated, so they are not such a strong signal and social is now being taken into account more.

Backlinks are still a strong factor, but quality matters

Volume counts for something; position 1 needs an average of 2,000+ links

Positions two and three have more links containing the target keyword than other positions

The proportion of nofollow links is higher for sites ranking in the top three positions, probably because it is more natural to have them

The same applies to anchor text made up of stop words and non-keywords – brand names, “read more”, sentences, random phrases; the more of these you have, the more likely you are to be ranking in the top three positions.

Become a brand: on-page relevance is decreasing and you don’t need to put your keyword in the title any more. The ranking sites now fit the query but are not the most optimised sites

Marcus Tober’s Conclusion 

Social factors are important

Analyse what your competitors are sharing

Back links are still a key factor

SEO is not dead, it’s just that the parameters change

Become a brand – he says it’s been the most important factor for him over the last year

Make sure the user can find your useful content

Have a USP – why should you rank above your competitor?

Ads could be a handicap

Email marcus@searchmetrics.com for the white paper

Marcus Tandler spoke about anchor text:

Google doesn’t want to rank the site with the best SEO; instead it wants to rank the site that is the best result for the user

Having over 65 percent exact-match keyword anchor text is bad

He said that exact-match anchor text is dead; however, the example was quite extreme: 90k links and the brand term wasn’t one of the top ten anchors. This is a very unnatural link profile and the site suffered.

Being a brand isn’t enough, you need the brand links to reflect it! A brand should have more anchors than commercial keywords.

Links are still king.

Google knows traffic, dwell time and bounce rates due to the volume of sites using Chrome, AdSense, Analytics and the toolbar. Over 75% of the top million sites have something of Google’s on their site.

Google wants to know which sites get a lot of direct or similar traffic – which sites do you use without needing a search engine? That must mean the site is good enough for the user, so a high volume of this may be a strong signal.

Links that are hard to spot are likely to get clicked less; links in the middle of an article may get clicked more, so they are a bigger ranking factor and will count for more.

Google knows which sites visitors of a website also visit and may use this data.

Dave Naylor’s most important point was that Negative SEO has always been around but is now a bigger problem than it was in the past.

Life in a (Not provided) World

Scott Krager and Duran Inci presented ideas about what to do now that Google Analytics is hiding keyword data behind (not provided).

Scott asked why the data is being hidden:

Is it for privacy? Not really, because you can buy it

Is it for profit? It screws with AdSense dynamic advertising based on queries

Is it for PR? One day someone might complain

Does it matter why?

Always track goals and conversion rate, these can be more important than keywords.

Even for (not provided) traffic, be honest about the traffic and the conversions it brings.

Assume that keyword data is all going away

His prediction is that all organic search will be encrypted one day.

Look at the URL report to see the percentage of organic traffic that each URL receives; this might actually be more insightful.

Make sure you look at the percentage change by date.
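
To illustrate that idea (my own sketch, not something shown in the session), here’s a rough Python/pandas version of that kind of URL-level report. The CSV file names and column names are hypothetical and assume you’ve exported the organic landing-page report for two date ranges.

```python
# Rough sketch of the URL-level view described above: the share of organic
# traffic each landing page gets, and how that share changes between two
# date ranges. The CSVs and their columns ("Landing Page", "Visits") are
# hypothetical stand-ins for GA exports filtered to organic traffic.
import pandas as pd

def organic_share(path):
    df = pd.read_csv(path)
    df["Share %"] = 100 * df["Visits"] / df["Visits"].sum()
    return df.set_index("Landing Page")["Share %"]

this_period = organic_share("organic_landing_pages_this_period.csv")
last_period = organic_share("organic_landing_pages_last_period.csv")

report = pd.DataFrame({"This period %": this_period,
                       "Last period %": last_period})
report["Change (pts)"] = report["This period %"] - report["Last period %"]
print(report.sort_values("Change (pts)").head(20))
```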

So, how do you get data back?

Find keywords in WMT, at page level

Compare to page-level keywords in Google Analytics (there’s a rough sketch of this after the list)

GWT may show more queries, including ones that are in the (not provided) bucket

In AdWords, look at the impression share of each keyword in its own ad group

Only test keywords on exact match

Keyword research with a budget is best!

Try to get at least a week’s worth of data

Think of rankings as weather stations: you see the change there before you see the impact in Google Analytics.

Build better reports, including conversions and URL-level data, to take the issue away.
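
As a rough sketch of the WMT-versus-Analytics comparison mentioned above (my own illustration, not something presented): assuming you’ve exported the Webmaster Tools page-level query data and the GA organic landing-page report to CSV, with hypothetical file and column names, something like this joins them at page level and shows where the hidden visits sit.

```python
# Sketch of comparing WMT page-level query data with GA page-level organic
# traffic. Both inputs are hypothetical CSV exports:
#   wmt_top_pages_queries.csv    -> Page, Query, Impressions, Clicks
#   ga_organic_landing_pages.csv -> Landing Page, Visits
# (In practice the URL formats may need normalising before joining.)
import pandas as pd

wmt = pd.read_csv("wmt_top_pages_queries.csv")
ga = pd.read_csv("ga_organic_landing_pages.csv")

# Total WMT clicks per page vs GA organic visits per page: the gap gives a
# feel for how many of each page's visits are hidden behind (not provided).
wmt_clicks = wmt.groupby("Page")["Clicks"].sum()
merged = ga.set_index("Landing Page").join(wmt_clicks, how="left").fillna(0)
merged["Unaccounted visits"] = merged["Visits"] - merged["Clicks"]

# The top WMT queries for each page then hint at which keywords those
# unaccounted visits probably came from.
top_queries = (wmt.sort_values("Clicks", ascending=False)
                  .groupby("Page").head(5))

print(merged.sort_values("Unaccounted visits", ascending=False).head(10))
print(top_queries.head(20))
```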

Duran Inci showed examples of how much (not provided) accounts for; it seems to be up to 50%, but each vertical is different.

Duran explained several methods for using scripts and spreadsheets by seo.com and Lunametrics.

A spreadsheet can be used to estimate the percentage of brand and non-brand keywords in (not provided), based on the split seen in the keywords that are still provided.
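
As a back-of-the-envelope illustration of that estimation (my own example with made-up figures, not from the talk): take the brand/non-brand split of the keywords that are still provided and apply the same proportion to the (not provided) bucket.

```python
# Made-up figures purely to illustrate the estimation described above.
provided_brand_visits = 1200      # visits from provided keywords containing the brand
provided_nonbrand_visits = 2800   # visits from provided non-brand keywords
not_provided_visits = 3500        # visits hidden behind (not provided)

brand_share = provided_brand_visits / (provided_brand_visits + provided_nonbrand_visits)

estimated_brand = not_provided_visits * brand_share           # ~1050 visits
estimated_nonbrand = not_provided_visits * (1 - brand_share)  # ~2450 visits

print(f"Estimated brand visits in (not provided): {estimated_brand:.0f}")
print(f"Estimated non-brand visits in (not provided): {estimated_nonbrand:.0f}")
```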

There is also a Python script for PC users that downloads search data, which can be useful if a little techy.

Find performing and non-performing terms at landing-page level. What kind of content is working?

Download the search query report from Google, review trends and cross-reference with WMT

Duran also made SEO recommendations such as writing power articles for content marketing.

Pierre Far chipped in during the questions: apparently many people are analysing WMT data wrong. He thinks people are not using the filters when they should be (i.e. web, images, desktop, etc.). You need to make sure you’re comparing the right data, as the data in WMT is right and should match Google Analytics, bar the standard reporting differences.

98 percent of data is accurate in WMT.

Advertisers apparently have more right to the data, particularly in real time; SEOs apparently don’t need it. This makes me think that Google does not like SEOs and wants to discourage them. As an SEO I like to describe myself as an online marketer for websites – why should I be denied this data when people serving ads are allowed it?

Complying and coping in the new world of a regulated global marketing environment

Andy Atkins-Krueger explained some great points:

The ASA are going to have to be the ones enforcing the laws for website advertising etc

Not all EU countries are implementing it the same way

Germany is in a grey zone and is unlikely to be strict

Italy has implemented but not enforced opt-in

France has strict laws, the toughest in Europe; the Netherlands is also tough

Spain is less moderate

UK has had messy implementation!

The ICO says that Analytics should not be a problem

People doing nothing will be punished more than those trying to implement something

You are unlikely to be penalised if your cookies are not intrusive, said the ICO – which is lucky, as 90% of Analytics data disappeared on the ICO’s own website due to users not opting in when they implemented their solution.

Who will take action on enforcement for international sites?!

If someone complains you’re likely to be in a spot of bother!

Ignorance of the law is no excuse

Three strikes and you’re out under the UK Digital Economy Act relating to copyright infringements

Demonstrate that you have taken action for the cookie law and you should be fine.

Anthony Haney said that the ICO recommends using splash pages and pop-ups – the things users hate!

My thought is that you should try to work out a nice-looking solution that won’t screw your data and that has a good opt-in rate – split test it, etc.

Check your cookies on a regular basis anyway, regardless of this law. It’s good to know what’s on your site.

Get people in your company involved ASAP as you reeeally need to get the ball rolling soon.

Watch what others are doing

A privacy policy in the footer is a step on the way to compliance.

You have to use cookies to remember your decision not to use cookies, IRONY!

No clear policy on how the law will be policed but it is likely to be four stages:

Information notice

Undertaking

Enforcement notice

Monetary penalty

So you should get warnings before penalty.

How do you get around implementing such a limiting solution?

Consider your user perspective. How much are they going to want to opt in?

Craig McDonald, product marketing at Microsoft, explained that intention tracking has been going on in direct marketing for decades through credit card tracking. This info is then sold to companies related to the users’ activity so those companies can advertise to relevant users.

The intention is not stalking; instead, it’s to create relevant advertising.

So why is the standard different in online activity?

82 percent of digital marketers think it’s a bad idea

80 percent of consumers think it’s a bad idea

Most consumers did not know what it was

Consumers assume it’s malware

Politicians are reacting to a non issue

There is a fine line between tracking and surveillance!

 

In marketing you’re always under pressure to understand intent as much as possible – you wouldn’t want to show a steak restaurant to a vegetarian! Cookies are there to give consumers what they’re looking for; it’s such a shame they’re being given such a bad name!


About the author

Anna Lewis

Our resident analytics specialist is Anna Lewis. Anna is unbelievably attuned to anything analytical and can fill you in on all the latest news, tips and advice to get ahead in this evolving market.

