Sites are always evolving, so there will always be changes to make to the background elements that keep them working properly for search engines and users. Most of these changes are simple and rarely cause problems. Others, if not properly implemented, can cause errors or even kill traffic and rankings until they are resolved.
In this post I will highlight some of the changes that require the most care. As I said, they can be simple changes, but if they aren't made properly they can cause you a few issues and a bit of a panic when you look at the traffic data.
1. Incorrectly Implemented .htaccess file
Adding an .htaccess file to the root of your server has many benefits. It can be used to redirect URLs, block the IP addresses or domains of unwanted traffic (like bots), change the 404 error page that is displayed to the user, and even control caching for better page speed.
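As a rough sketch, an .htaccess file covering those features might look something like the one below (this assumes Apache 2.4; the paths, IP address and cache times are placeholders, not recommendations):

```apache
# Permanently redirect an old URL to its replacement
Redirect 301 /old-page.html /new-page.html

# Block a specific unwanted IP address (Apache 2.4 syntax)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.42
</RequireAll>

# Serve a custom 404 page instead of the server default
ErrorDocument 404 /custom-404.html

# Cache images for a week to improve page speed
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 week"
</IfModule>
```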
“What could go wrong?”
How long do you have?… There are too many potential problems to list in one blog post, so I'll mention two of the worst case scenarios that come to mind.
The first is that, through incorrectly writing the code, you bring down your entire site, leaving it inaccessible to the outside world. Not even the cool 404 page you made will show up! All your users will see is a 500 Internal Server Error, and your traffic (and perhaps business) will suffer as a result.
Another severe issue that poor implementation of the .htaccess file can cause is a really simple one. There have been occasions where people have copied an .htaccess template and forgotten to amend the URLs within it. This has caused them to inadvertently redirect their entire site to the website of the random guy who wrote the template, or even in some cases to 'www.inserturlhere.com'. Simple to fix, but again it could cause some measurable drops in traffic!
2. Robots.txt Inadvertent Blocking
When a search engine visits your site, the first thing it should do is take a look at your robots.txt file; this tells it which parts of your site to ignore and which parts to head to first… like the sitemap!
When correctly written, the robots.txt file should look similar to the one shown below.
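Here is a minimal example, using placeholder paths and a placeholder domain:

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: http://www.example.com/sitemap.xml
```

The `Disallow` lines keep crawlers out of sections you don't want indexed, while the `Sitemap` line points them straight at the pages you do.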
If, however, the file is incorrectly written, you could accidentally block important pages from being indexed by search engines, or send the crawler to the wrong sitemap, making the crawl it completes provide erroneous results. It shouldn't need much explaining what kind of damage blocking a page or pages from the index can do to their rankings in search engines: if a search engine doesn't index them, it won't rank them.
3. Incorrect or Poor Redirects
Back in the realms of the .htaccess file is the redirect. Permanent or temporary, redirects should be done properly to avoid causing usability problems, traffic drops and even ranking issues. Done correctly, they let you fix those pesky 404 errors and redirect old URLs to the shiny new ones you created. The problem comes when they aren't properly written and an infinite redirect loop is created. This is bad! Not only does it generate needless network traffic, it is really annoying for users, who never reach the page they clicked through to; and that's before you get to the worry of ranking drops…
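To illustrate with placeholder paths, the first rule below is a clean permanent redirect; the commented-out pair underneath is a sketch of how a loop happens when two rules chase each other:

```apache
# Correct: a single 301 from the old URL to the new one
Redirect 301 /old-page/ /new-page/

# Broken (do NOT do this): these two rules send the browser
# back and forth forever, until it gives up with a
# "too many redirects" error
# Redirect 301 /page-a/ /page-b/
# Redirect 301 /page-b/ /page-a/
```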
4. Mis-implementation of the Canonical tag
Canonicalisation has been around for a while now and it is still tripping some people up when it comes to implementation. There have been cases where incorrect implementation has resulted in a totally different URL ranking in search engines from the one that was intended to be protected. Ideally the canonical tag should be added to any page where you don't want search engines thinking that your duplicated content on another page is an attempt at 'gaming' the rankings, as in the old days of wild west SEO.
For example, say I have the following sites with duplicated content (for some reason): www.myawesomesite.com/ and cool.myawesomesite.com/
The second URL is not my main focus, but rather than take it offline I decide to tell search engines that the first URL is the 'canonical' one by adding the tag to the duplicated site, not the canonical one. Get this the wrong way round and you may find that the wrong page is ranked over the one you want, or a totally different page is ranked over your site! Bad for traffic and bad for SEO.
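Using the example URLs above, the tag would sit in the `<head>` of the duplicate page and look like this:

```html
<!-- Placed on the duplicate page (cool.myawesomesite.com),
     pointing search engines at the preferred URL -->
<link rel="canonical" href="http://www.myawesomesite.com/" />
```

Note the direction: the duplicate declares the canonical, never the other way round.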
5. Sitemap Changes
When you make changes to the site by creating new pages or removing old ones, it is important to update the corresponding URLs in the sitemap.xml file. Issues arise when you keep old, redirected links in the file, or leave in pages that are disallowed in the robots.txt file. These can end up producing 404 errors or confusing search engine indexing, as the crawler may look for a page listed in the sitemap only to be redirected, or find a robots-disallowed page in the sitemap.
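A bare-bones sitemap.xml, with a placeholder URL and date, looks like this; the point is that only live, indexable pages belong in it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only live, indexable URLs belong here: no redirected,
       removed, or robots.txt-disallowed pages -->
  <url>
    <loc>http://www.myawesomesite.com/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```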
It's always best practice to resubmit the sitemap in Google Webmaster Tools whenever a change is made, to tell Google that there is something new to crawl.
So ultimately there will be changes to make to your site as it develops and evolves into what you want it to be; but as you make them, think about the potential harm that an error in their implementation could cause. Take a moment to double check before confirming them!
Do you have any horror stories to tell or any funny examples of these sorts of changes going wrong? Let us know!…Go on!
Copyright © 2006 - 2015, Koozai Ltd