Recapping live from BrightonSEO 2018 with round-ups of the talks that have taken place so far. We’ve got key takeaways from the search marketing conference, as well as the speakers’ slide decks.
If you’re a BrightonSEO fan (and who isn’t?), here’s the rundown of today’s talks:
If you’re looking for how to get buy-in for your SEO strategies, then this talk is for you. Steph explores how to stay on top of big websites with multiple stakeholders.
If you have inherited a toolset you are not happy with, don’t be afraid to request a new one. You will often have to create a business case but it will be worth it.
If it can be automated, automate it. Work with developers and data analysts to see if any of your “grunt work” can be sped up.
Educate others in the business on SEO by tying the SEO goals back to the overall business objectives, or your colleagues’ specific departmental objectives. “Use the universal language of £££, not Schema”.
Ex-Googler Kaspar Szymanski explained Google penalties for all to understand. The talk covered how penalties are issued, how to recover and, perhaps most importantly, how to prevent Google penalties from happening.
Every Google penalty can be fixed and lifted.
Use as many tools as possible to evaluate your backlink profile, and begin link removal for high volumes of negative backlinks; chances are there aren’t humans at the end of these links to communicate with about removal.
It doesn’t matter who built the links that got you penalised; whether it was your predecessor or someone who didn’t know what they were doing, take responsibility, as Google won’t take this into account. Leave emotions out of reconsideration requests.
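A practical aside from us rather than from Kaspar’s talk: when there’s no human at the other end to contact, Google’s disavow file is the standard way to neutralise bad links at scale. A minimal sketch in Python, using made-up domains, that generates one:

```python
# Sketch: build a Google disavow file from links you could not get
# removed manually. All domains and URLs below are placeholders.
bad_domains = ["spammy-links.example", "paid-directory.example"]
bad_urls = ["http://blog.example/low-quality-post"]

lines = ["# Disavow file generated during reconsideration clean-up"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow entire domains
lines += bad_urls                              # or individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```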
Former Google engineer Fili Wiese gives us a talk on how to optimise your website to allow search bots to crawl it more efficiently.
The usual key term when optimising for search bots is “crawl budget”, but we should really reframe this as crawl prioritisation. When Google first discovers your website, it crawls it from the top down (homepage first, following the normal structure). At some point it stops doing that and the crawl becomes random: the URL scheduler has kicked in and determines which URL gets crawled and when. Not every URL gets crawled equally or as often – some URLs get crawled a lot, some don’t. Links (internal and external) can be really useful for helping Googlebot to prioritise.
Things that can make Googlebot deprioritise your URLs include duplicate content, low-quality content and incorrect server signals (e.g. a soft 404 will make Google trust your server signals less). You can spot-check for soft 404s with a quick request like the sketch below.
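A minimal sketch, assuming the requests library is installed and using a placeholder domain:

```python
# Soft-404 spot check: a deliberately nonexistent URL should return
# a real 404, not a 200. The domain below is a placeholder.
import requests

resp = requests.get(
    "https://www.example.com/this-page-should-not-exist-xyz",
    allow_redirects=False,
)
if resp.status_code == 200:
    print("Possible soft 404: a missing page returned 200")
else:
    print(f"Server answered {resp.status_code}, as expected for a missing page")
```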
We are currently in the interim period as Google switches from crawling with its desktop user agent to its mobile user agent. Keep an eye on your server logs to identify your individual changeover date; a log-parsing sketch follows these takeaways.
Mobile and desktop crawlers are very different. Don’t disregard this changeover if you have a responsive site.
Migrate. Plan. And mitigate any risks. See it as an opportunity to free up budget to increase and improve revenue.
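On the server-log point above, here’s our own rough sketch rather than anything Fili showed, assuming a common-log-format access log at a typical nginx path. It tallies desktop versus smartphone Googlebot hits per day so you can see the changeover happen:

```python
# Sketch: track desktop vs smartphone Googlebot hits per day in an
# access log. The log path and user-agent substrings are assumptions;
# adapt them to your own log format.
from collections import Counter

desktop, mobile = Counter(), Counter()
with open("/var/log/nginx/access.log") as log:  # assumed log location
    for line in log:
        if "Googlebot" not in line or "[" not in line:
            continue
        date = line.split("[", 1)[1][:11]  # e.g. "28/Sep/2018" in common log format
        if "Android" in line and "Mobile" in line:
            mobile[date] += 1   # smartphone Googlebot UA mentions Android + Mobile
        else:
            desktop[date] += 1  # treat other Googlebot hits as desktop

for date in sorted(set(desktop) | set(mobile)):
    print(date, "desktop:", desktop[date], "mobile:", mobile[date])
```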
Alexis K Sanders delivers a rapid-fire introduction to structured data using Schema.org for non-techy marketers, attempting to give anyone the confidence to take on any structured data markup challenge in 20 minutes!
Benefits: enhanced SERP results, e.g. review stars, pricing information, event listings. All of these either give you slightly more SERP real estate or help your listing stand out from the others. A study of different eCommerce sites shows that URLs with rich results have a higher click-through rate. There is also an opportunity to improve relevance, as Schema helps Google to understand content. Schema.org has also introduced a specification called “speakable”, which may help with voice search in the future – something many people are wondering how to optimise for.
For the search engines there isn’t a big difference: Microdata is embedded in the HTML, whereas JSON-LD is easier to implement through JavaScript. However, Microdata is better supported at this time. Even though Bing claims to support JSON-LD, analysis shows they don’t support it well yet, although it’s something they are working on.
For the advanced users: use Schema.org itself. It works like a dictionary of every single itemtype you can use, showing what each itemtype does, how widely it is used online, the different properties you can assign, and how to nest the different data types within your markup.
Watch out for curly quotes instead of straight quotes, and mind your commas. Use Google’s Structured Data Testing Tool to check your work. Also, make sure you aren’t violating Google’s policies, or you could end up with a manual penalty and all your hard work will be for nought.
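To make that concrete, here’s a minimal sketch from us (not from Alexis’s deck) that generates a JSON-LD Product snippet in Python; json.dumps only ever emits straight quotes, which sidesteps the curly-quote pitfall. All product details are invented placeholders:

```python
# Sketch: generate a JSON-LD Product snippet programmatically.
# Product, AggregateRating and Offer are real Schema.org types;
# the values below are invented placeholders.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
    "offers": {
        "@type": "Offer",
        "priceCurrency": "GBP",
        "price": "19.99",
    },
}

snippet = ('<script type="application/ld+json">\n'
           + json.dumps(product, indent=2)
           + "\n</script>")
print(snippet)  # paste into the page <head>, then verify with the testing tool
```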
Take the time to look through the mobile site in context. We sit behind desktops all day. Sit down and navigate through websites on mobile.
Mobile crawlers are more tolerant of hidden content. HTML is preferred: JavaScript is rendered, but we want to make it easy for the crawler.
Generally, make sure the technical SEO provisions made on desktop are all visible on mobile. Don’t take anything for granted.
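One way to sanity-check that parity (our sketch, not something from the talk): fetch a page with desktop and mobile user-agent strings and compare an element such as the canonical tag. The URL and UA strings below are placeholders:

```python
# Sketch: compare the canonical tag served to desktop vs mobile
# user agents. URL and UA strings are placeholder assumptions.
import re
import requests

URL = "https://www.example.com/"
AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (Linux; Android 8.0; Pixel 2) Mobile",
}

for name, ua in AGENTS.items():
    html = requests.get(URL, headers={"User-Agent": ua}).text
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.I)
    print(name, "->", match.group(0) if match else "no canonical found")
```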
Bill recommends https://sitebulb.com for detailed crawl analysis, and for understanding where XML sitemaps may be causing index bloat.
Bill has built an indexation tester (a big deal, as automating this usually requires proxy servers): https://bit.ly/index-tester
Find sites that used to link to legacy versions of your site using the Wayback Machine: https://bit.ly/wayback-linking
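We haven’t seen inside Bill’s tool, but as a generic starting point, the Wayback Machine’s public CDX API will list archived URLs for a domain, which you can then cross-reference against your backlink data. A sketch with a placeholder domain:

```python
# Sketch: pull archived URLs for a domain from the Wayback Machine's
# public CDX API, as a starting point for finding legacy URLs that
# old backlinks may still point at. The domain is a placeholder.
import requests

resp = requests.get(
    "http://web.archive.org/cdx/search/cdx",
    params={
        "url": "example.com",
        "matchType": "domain",   # include subdomains
        "collapse": "urlkey",    # one row per unique URL
        "fl": "original,timestamp",
        "output": "json",
        "limit": "50",
    },
)
rows = resp.json()
for original, timestamp in rows[1:]:  # first row is the header row
    print(timestamp, original)
```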
As marketers, we need to establish trust between consumers/businesses and brands in an environment where people don’t trust anybody. Here’s how:
“The web of apps”: it’s important that we optimise our apps to rank highly in search. Play around with your app categories, make sure you get your keyword into the title, e.g. “{name of fitness app} – fitness”, and include long-tail keywords.
The ASO industry is reminiscent of SEO 10–12 years ago; there are a lot of questions, as we’re unsure what the future holds for apps.
This includes app indexation.
Sam Marsden from DeepCrawl gives us an overview of his new process for conducting content audits, combining crawl data with other data sources. The process is designed to help us optimise crawl budget, avoid penalties and make sure our most important pages don’t get lost in the noise, as well as assessing on-site engagement and search performance.
Sam starts off by talking about DeepCrawl’s recent site redesign, which included moving to a new CMS. As part of this, Sam decided a content audit was in order: discover the full inventory of content, then attach performance data to that content to decide what to do with it. He came up with a much more crawl-centred approach than most online guides suggest. Cloud crawlers like DeepCrawl help because you aren’t limited by scale – you can crawl millions of URLs in one crawl and then bring other data sources, such as Majestic link data or Search Console data, into that crawl. Sam believes crawl data needs to be at the centre of your audit, rather than being just another source.
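As a rough illustration of that merge step (our sketch; the file and column names are assumptions, not DeepCrawl’s actual export format):

```python
# Sketch: put crawl data at the centre and join other sources onto it.
# File names and column names below are assumptions, not real exports.
import pandas as pd

crawl = pd.read_csv("crawl_export.csv")    # one row per crawled URL
gsc = pd.read_csv("search_console.csv")    # clicks/impressions per URL
links = pd.read_csv("majestic_links.csv")  # backlink metrics per URL

audit = (crawl
         .merge(gsc, on="url", how="left")    # keep every crawled URL
         .merge(links, on="url", how="left"))

# Flag indexable pages with no recorded search clicks as audit candidates
no_clicks = audit[(audit["indexable"]) & (audit["clicks"].fillna(0) == 0)]
print(len(no_clicks), "indexable URLs with no search clicks")
```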
What are we going to do now that we have the data? 4 key questions when looking at relationships between different pieces of data:
1. What is and isn’t performing well on my website?
Now that Sam has given us some of the details of the process, he explains how to automate it: schedule crawls; use DeepCrawl’s Zapier integration to trigger crawls through different means, e.g. a calendar invite; create automated rules; and pull your data into dashboards for continuous monitoring.
In this session we were taught how to pick – and, more importantly, win – your SEO battles, as well as how to provide tangible results for your projects if you ever need to build a case for SEO in-house.
Rachel suggests that perhaps the acronym should now stand for Search Experience Optimisation, as SEO is about so much more than optimising for search engines, User Experience is fundamental too.
Use the Barracuda Digital ‘Panguin Tool’ to map Google algorithm updates onto your organic performance statistics, and easily see whether your site has been affected by an algorithm change.
Branded3 have created a tool that calculates the estimated reduction in revenue that poor site speed can lead to: https://www.branded3.com/site-speed-conversion-rate/
View Danny’s own write-up here
Many of us marketers focus on words, copy and technicalities to sell our products to prospects. However, addressing their emotional needs has proven much more effective at improving CTR and other success metrics.
How to optimise your online presence to meet your audience’s emotional needs:
Identify and analyse your audience’s emotional goals using the limbic map of the brain
For example, those looking for an apple tart recipe are actually looking for love and praise from their friends and family. Those looking for cheap perfume are looking to identify with celebrities and smell irresistible. Those looking for free PowerPoint templates are actually looking for templates to inspire and impress their audience.
Use lots of implicit signals, such as large images and colours, and aspirational headings and layouts. “This is the person you will become by buying this product.”
Sadly, our team of live bloggers weren’t able to personally view all of the talks at BrightonSEO (there were only five of us, and we haven’t quite mastered the art of cloning / astral projection); however, all the remaining slide decks from the talks we didn’t see are available to view here:
Check out our other blogs from this morning’s talks to stay up to date with all things BrightonSEO 2018.
Sign up now and get our free monthly email. It’s filled with our favourite pieces of news from the industry: SEO, PPC, social media and more. And don’t forget – it’s free, so why haven’t you signed up already?
Call us on 0330 353 0300, email info@koozai.com or fill out our Contact Form.
What do you think?