As the dust begins to settle from Google’s latest algorithm bombshell, we’re beginning to see a clearer picture of the damage caused and why. Whilst it’s too early to gauge the full effects, the inevitable consequence of sites losing rankings due to poor content is that those with decent on-site copy should prosper.
Sites reportedly affected are those containing low quality content, including article sites, hubs and scraped blogs [See: Google Algorithm Update: Quality Content is King]. Whilst on the face of it this sounds like bad news, there’s no reason why you can’t turn this update to your favour, particularly if you’re prepared to invest in your on-page content.
Whilst there are a variety of SEO tactics at your disposal that will still help improve search engine rankings, in light of the Farmer/Panda algorithm update a lot of attention should be given to content. With some sites reportedly seeing rankings drops of as much as 90%, there is a massive opportunity for you to capitalise and push your site forward in their stead.
Okay, so what should you do? Well, now is the time to get writing. Identify keywords related to your services and look at ways in which you can target them within your site, by generating valuable content that complements your services.
1. Improve Individual Pages
1.1. Spelling and Grammar – Start at the beginning and get all of your content grammatically correct. Whilst Google may not be a grammar pedant, you need to make sure that the copy you’re producing is of a decent standard so as to benefit rankings and visitor satisfaction.
1.2. Sentence and Paragraph Structure – Make sure these are short and sharp. Sentences should be brief – around ten words. Paragraphs too should be brief – between one and four sentences long. You should also add lists and bullet points where necessary – this will break up the information making it aesthetically pleasing and search engine friendly.
1.3. Originality and Relevance – Every page you write must contain original and relevant content. Search engines frown upon duplicate content. Relevance is essential too – make sure the pages you write are pertinent to the products and services you offer. If something looks out of context, it probably is; this will confuse search engines and harm your position in the SERPs.
1.4. Keywords – When you write web pages, you should use relevant keywords to fully optimise your site. Do not ram your site with keywords here, there and everywhere – keyword density is taken into consideration and search engines will become suspicious if a page is overrun with similar words.
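To make the keyword density point concrete, here is a minimal sketch (not from the original article) of how you might sanity-check a page’s copy before publishing. The sample copy, the keyword and the notion of what counts as “too high” are all hypothetical; real search engines use far more sophisticated signals.

```python
import re


def keyword_density(text, keyword):
    """Return the keyword's share of total words as a percentage.

    Handles single-word keywords only; a rough sketch, not an SEO tool.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)


# Deliberately over-stuffed sample copy (hypothetical)
copy = ("Our widgets are durable widgets. Buy widgets today "
        "and see why widgets top every widgets comparison.")

density = keyword_density(copy, "widgets")
print(f"{density:.2f}%")  # about 31% – a page like this would look stuffed
```

A quick check like this won’t tell you the “right” density – there isn’t one – but it will flag copy where a single term is clearly being rammed in here, there and everywhere.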
The tips above should be applied across a variety of different pages on your website. Having a broad range of relevant, unique content will work wonders.
2. Blogs
Writing a blog that is updated every day (or a couple of times a day) is a great idea. Having frequently updated content keeps your website fresh and active. Blogs are crucial to search marketing as they provide another way to be found by search engines and can allow you to build a strong internal network of links. Provided that it is unique and relevant, it will go a long way to helping you get noticed.
3. FAQs
Okay, so an FAQ page exists primarily to answer visitors’ queries, which is crucial to the user experience. However, there is no harm in optimising the FAQs with keywords and anchor text linking to various pages within your site. Linking to and from the FAQs page will boost that page’s strength. You can also look for questions that are being asked (and not answered) online through a little cunning keyword research.
4. User Guides
Anything from ‘how-to’ guides to buyers’ guides is a great way to bolster your site, both in terms of the number of optimised pages and the value for visitors – in the same way that FAQs are. First and foremost this is about improving the user experience, making the site as user friendly as it can be. It also provides opportunities to add keywords and anchor text linking to various pages on your website – ultimately strengthening each page in the process.
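As a small illustration of the anchor text idea (the URL and wording here are hypothetical), a link within a guide works hardest when its visible text describes the destination rather than saying “click here”:

```html
<!-- descriptive, keyword-rich anchor text pointing at a related page -->
<a href="/guides/widget-buying-guide">widget buying guide</a>

<!-- weaker: the anchor text tells search engines nothing -->
<a href="/guides/widget-buying-guide">click here</a>
```

Both links go to the same page; only the first passes any meaningful signal about what that page is for.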
5. Site Map
A sitemap is essential for large websites and worthwhile even on small ones. It makes it easier for search engines to crawl and index your pages. A site that is well indexed has a greater chance of appearing high in the SERPs.
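For reference, an XML sitemap follows the sitemaps.org protocol; a minimal example (with hypothetical URLs and dates) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml in your site root and submit it through Google Webmaster Tools; only `<loc>` is required for each URL, the other tags are optional hints.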
In light of significant changes to Google’s algorithm, it’s crucial that web developers and webmasters take note of the message being sent. Even if you have lost some link value during the update, there’s no need to throw in the towel – in fact you should be doing quite the reverse. By focusing your attention on on-site content, you may be able to position yourself to benefit from others’ misfortune and give your SEO a shot in the arm.
One thing I’m not hearing a lot about is the effect on link juice. Many sites may have links on “farm” sites, and as their authority and rank decrease, so will their link juice. In reality, the number of websites affected by this change is much higher than 11.8%, IMO.
My blog has been scraped heavily throughout the years. People have been syndicating my work without MY PERMISSION. What then? I am not going to rewrite all my work — I thought all along Google would know what to do with these thieves. What is your recommendation for a heavily scraped blog and all those sites whose content has been stolen? It is downright impossible to rewrite my content — it is unique to me. I share my stories and thoughts in my blog and it gets republished elsewhere by idiots who cheat and steal. I want to sue for lost income on this one. I can DMCA and hire a lawyer if need be, but let me know what my options are.
Thanks for your comment Joe. Scrapers are the bane of any blogger; in fact a lot of our content ends up on plenty of other sites. The first thing that you need to do is find the culprits. Take note of their URL and the specific pages that have been scraped, then report them to Google through Webmaster Tools – https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1 (choosing the ‘Duplicate Site or Pages’ option). Unfortunately that won’t always produce immediate results, but it will at least alert Google to the problem.
Google should be able to identify the original source (based on the time it was indexed), and as scrapers tend to be low PR, low quality sites, they often won’t be able to compete on rankings – so it’s still worthwhile continuing to produce content. It is frustrating and certainly puts a lot of people off, but hopefully these latest algorithm changes will help to reduce the amount of scraping going on, simply because it won’t provide any value to sites that steal content.
Isn’t it about time to start thinking outside the box?
1. not using Google for a full day
2. uninstalling Google Analytics (this alone achieves two wonderful things: site becomes faster, Google stops receiving any hits from your website’s users)
3. making your website run faster (remove dynamic PHP/ASP where it isn’t needed, and hire someone to improve slow queries)
4. submit your newest pages to gigablast, yandex, taobao, etc