As Search Engine Land announced in early June, the Google Panda refresh is imminent. In case you aren’t sure what I’m referring to, Google carries out a variety of algorithm updates that combat certain activities on the Internet. There are four main Google algorithm updates, referred to as Hummingbird, Penguin, Pigeon and, of course, Panda.
So with the Google Panda refresh on the way, what can you do to safeguard your site from being hit by the latest updates?
Before we go any further, it’s important to establish what Panda actually is.
Panda essentially looks for thin, low-quality content and spammy on-page keyword techniques, such as keyword stuffing and hidden content.
In contrast, Penguin is after spammy, unnatural or unethical link-building techniques. We often refer to this as ‘black hat’ link building.
When any of these Google algorithm updates aren’t respected and the rules aren’t followed, Google can hit a website with either a manual penalty or a warning. You can read more about Google’s manual penalties in a previous Koozai blog post.
The main steps we recommend taking to ensure that your site is as safe as it possibly can be prior to the Panda refresh are as follows:

- Check for external duplicate content
- Check for internal duplicate content
- Identify and fix thin content
- Review your above-the-fold content
- Remove spammy content and keyword stuffing
- Add missing image alt text
Let’s look at each of these areas in more detail.
One of the best tools to check for external duplicate content is Copyscape. This tool is easy to use and comes in both free and paid versions. It’s an extremely quick way to help you detect any scraped content that features on other websites.
The most effective way to use Copyscape is by combining it with Google Analytics to find the pages on your site with the most page views. I usually work on 200 URLs at a time, and if it’s a larger site I’ll aim to check another 200 URLs each month.
The tool works via a colour-coded system, which uses a scale ranging from yellow to red to indicate the level of risk in accordance with any external duplicate content located.
You can then click into each of the colours and see how many occurrences of duplicate content there are by viewing the amount of URLs in each section. Copyscape then shows you the percentage of words on the page that have been scraped by another site (or, in some cases, inadvertently by your site).
When external duplicate content is found, you can either re-write it on your site, or if it’s been scraped you can contact the site responsible and request that the content is removed.
Internal duplicate content is content within your site that isn’t unique.
Unique content is one of the most important aspects of a website as it improves user experience and site visibility in search engines too.
A good tool to use for detecting internal duplicate content issues is Siteliner. This tool identifies exact matches throughout your website and highlights the percentage of duplicate content found, even down to the individual page this appears on.
Similarly to Copyscape, it colour-codes content by how much unique and duplicate content is on your site. Siteliner can also tell you how the amount of duplicate content compares to other similar sites too.
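Siteliner’s exact method isn’t public, but the underlying idea, spotting identical passages repeated across pages, can be sketched in a few lines. The page URLs and paragraphs below are made-up examples for illustration; a real crawl would feed in your site’s actual content.

```python
import hashlib
from collections import defaultdict

def normalise(paragraph):
    """Lower-case and collapse whitespace so trivial differences don't hide matches."""
    return " ".join(paragraph.lower().split())

def find_internal_duplicates(pages):
    """Return groups of pages that share at least one identical paragraph."""
    seen = defaultdict(set)
    for url, paragraphs in pages.items():
        for para in paragraphs:
            digest = hashlib.sha1(normalise(para).encode()).hexdigest()
            seen[digest].add(url)
    return [sorted(urls) for urls in seen.values() if len(urls) > 1]

# Hypothetical pages: the boilerplate sentence is repeated on two URLs
pages = {
    "/about": ["We are a digital agency.", "Founded in 2006."],
    "/team": ["We are a   Digital agency.", "Meet the team."],
}
print(find_internal_duplicates(pages))  # [['/about', '/team']]
```

Hashing normalised paragraphs keeps memory low on large sites, since only a digest per paragraph is stored rather than the full text.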
Upon finding internal duplicate content issues, your options to fix this include:
It’s always important to ensure that every page on your website contains a minimum of 250 words.
The more text there is on a page, the more likely it is that Google will understand what the page is offering.
Screaming Frog can be used to check for thin content by detailing the word count on each page. You can check each of your site pages, and if a page contains fewer than 250 words, it can be marked to indicate that the word count should be increased.
If pages do not meet the recommendations of these content guidelines, consider rewriting them or adding more relevant content to them.
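For sites where running Screaming Frog isn’t practical, the same thin-content check can be approximated with a short script. This is a rough sketch, not the tool’s method: the HTML snippets, URLs and 250-word threshold below are assumptions for demonstration.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html):
    """Count the visible words on a page."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.parts).split())

def flag_thin_pages(pages, minimum=250):
    """Return URLs whose visible word count falls below the minimum."""
    return [url for url, html in pages.items() if word_count(html) < minimum]

# Hypothetical pages: one comfortably over the threshold, one well under
pages = {
    "/services": "<html><body><p>" + ("quality content " * 150) + "</p></body></html>",
    "/contact": "<html><body><p>Call us today.</p></body></html>",
}
print(flag_thin_pages(pages))  # ['/contact']
```

In practice you would fetch each URL from your sitemap and feed the responses into `flag_thin_pages`, then review the flagged pages by hand before rewriting.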
‘Above the fold’ refers to the portion of a page that is visible as soon as users land on your site, before they scroll. I’ll use Koozai’s services page as an example:
The image above displays what users will see when clicking on this page, without having to scroll down using the right-hand scrollbar.
Ideally you want all of your content to appear ‘above the fold’, but you have to consider the user here too. If having all of the content above the fold compromises the design of the page too much, the main thing to keep in this section is the H1 header tag.
In the example above, you can see that Koozai’s H1 tag, ‘Digital Marketing Services’, is there for users to see straight away. Making the H1 tag visible is important: this is relevant to both the user and the search engine, as it helps them understand what the page is about.
If there is lots of unnecessary content below the fold or if the page contains a large number of adverts, this sends negative trust signals to users and search engines so bear this in mind.
Spammy content is any unnatural content on a website. If content is there for any reason other than to enhance user experience, it may be considered as spammy.
The main spammy content that you can check for is keyword stuffing. While keywords are used for relevance, we recommend using two to three keywords spread out across a page of content – but this has to be natural.
The term ‘keyword stuffing’ refers to the unnatural and excessive packing of keywords into any given page. This technique is unethical. Search engines recognise this use of keywords as a deliberate attempt to manipulate a site’s rankings for particular search terms.
You can check for keyword stuffing with a number of tools, or simply by searching for each keyword in the text and counting how many times it appears. If a keyword appears more than three times, including within the title or H1 header tag, further work needs to be done.
If you’ve found that your site is using these techniques in order to attempt to rank better, then you can rewrite your content in a more natural and ethical way with the user’s experience in mind. When your website is recrawled by search engines, they will pick up on this and notice that the content now meets their guidelines.
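The manual count described above is easy to automate. As a rough sketch (the keyword list, page text and three-occurrence limit are illustrative assumptions, not a fixed rule from Google), a script can flag any keyword that appears suspiciously often:

```python
import re

def keyword_occurrences(text, keyword):
    """Count whole-phrase, case-insensitive occurrences of a keyword."""
    pattern = r"\b" + re.escape(keyword.lower()) + r"\b"
    return len(re.findall(pattern, text.lower()))

def flag_stuffed_keywords(text, keywords, limit=3):
    """Return the keywords that appear more often than the chosen limit."""
    counts = {kw: keyword_occurrences(text, kw) for kw in keywords}
    return {kw: n for kw, n in counts.items() if n > limit}

# Hypothetical over-optimised page copy
page_text = (
    "Cheap shoes for everyone. Our cheap shoes are the best cheap shoes. "
    "Buy cheap shoes online and get cheap shoes delivered."
)
print(flag_stuffed_keywords(page_text, ["cheap shoes", "delivery"]))
# {'cheap shoes': 5}
```

The `\b` word boundaries stop partial matches (so ‘delivery’ doesn’t match ‘delivered’), which keeps the counts honest.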
Image alt text is used to describe the images used throughout a website. It is important to include this on your site, primarily for search engines. When a search engine crawls your site, it can often struggle to understand images. This is where the alt text comes into play, making the image easier to interpret and providing context for each one used.
By describing your images, you make it easier for search engines to provide users with the relevant information during an image search. Furthermore, if, for whatever reason, an image fails to load on your page, the alt text will be shown in place of it, highlighting to the user what exactly should be visible.
Keywords are important here, especially when they relate to the content of a page, where image alt text could help build keyword relevance for search engines. Missing image alt text can be identified in Screaming Frog and can be easily added to your website in your CMS, such as WordPress.
It’s important to add image alt text because if images on your site are missing these attributes, your HTML won’t validate (the alt attribute is required on img elements) and you aren’t following Google’s image guidelines.
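If you’d rather audit this yourself than run a crawler, a short script can list the images that are missing alt text. This is a minimal sketch: the HTML snippet and file names are made-up examples.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Records img tags whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_startendtag(self, tag, attrs):
        # Treat self-closing <img ... /> the same as <img ...>
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

def images_missing_alt(html):
    """Return the src of every image with missing or empty alt text."""
    checker = AltChecker()
    checker.feed(html)
    return checker.missing

# Hypothetical page markup: one image described, two not
html = (
    '<img src="logo.png" alt="Koozai logo">'
    '<img src="banner.jpg">'
    '<img src="icon.svg" alt="">'
)
print(images_missing_alt(html))  # ['banner.jpg', 'icon.svg']
```

Note that an empty `alt=""` is flagged here too; that’s a deliberate choice for an audit, even though empty alt text is legitimate for purely decorative images.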
If you ensure these checks are performed regularly, especially before the refresh of the Panda update, you will help to increase your search rankings, bring more traffic to your site and encourage visitors to stay on your pages for longer.
Please leave me a comment if there are any other tools you use or checks that you’ve undertaken in preparation for the Panda refresh. If you’d prefer to contact me via Twitter, I can be found at @Sally_Newm.