Website Migration and New Site Whitepaper

Another Kick-Ass Digital Marketing Guide



    About This Whitepaper

    Redesigning and restructuring a website can be a great way to freshen up your brand and increase user conversions. It isn’t without risk, though. From domain names to redirects, title tags to content, there are many things that can go wrong and damage a site, ending in lost traffic and decreased conversions. Following the instructions below will help sustain your rankings and maintain your current traffic levels.


    Initial Considerations

    Domain Name

    We would suggest sticking to the same domain name on the new site, as well as keeping as much under one domain as possible. This is so that any historic links that have been built will still be pointing to the relevant domain.
    You might be moving to a preferred domain or adding subdomains for logins and other areas, so a change is sometimes unavoidable.
    Keeping everything on one domain helps focus your efforts in terms of marketing, hosting, upkeep, etc.


    You can check your preferred domain availability on websites such as




    Content Management System

    This might be dictated by developers and/or sales methods, but there are many more options than in previous years. Even if different departments want different CMSs, there will be options for integration and communication between them.
    If you have tech support in-house this is less important, but ideally, you should be able to do the following through your CMS:

    • Implement and change Meta descriptions and Meta titles independently so that it does not affect other features on the page such as headings
    • Implement text changes easily
    • Implement and change internal links easily
    • Implement Alt tags on images easily
    • Implement and change H1 and H2 heading tags around text easily and independently (so that it does not affect other features on the page such as navigation or layout)
    • Place permanent 301 redirects on pages of the website
    • Enable an XML Sitemap to be uploaded and replaced
    • Enable a robots.txt file to be uploaded and replaced

    Staging Websites

    If a staging website is being used for the build of the new website, you need to make sure that this staging instance isn’t indexed.

    This is essential, as an indexed staging site would compete with the live site in search results and risks duplicate-content issues.

    This can be done through various means:

    Password Protection

    Blocking crawlers and users from getting to the site through a login is the most effective way to stop bots and unwanted users from accessing your site.
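As a sketch, on an Apache server (an assumption here) the staging site can be put behind HTTP basic authentication with an .htaccess file; the file paths and realm name below are illustrative:

```apache
# Require a login for the whole staging site (illustrative paths)
AuthType Basic
AuthName "Staging - authorised users only"
AuthUserFile /var/www/.htpasswd-staging
Require valid-user
```

The password file itself is created with Apache's `htpasswd` utility, and other servers (Nginx, IIS) have equivalent mechanisms.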





    Robots.txt

    Putting a directive in your robots.txt file (on the root of the staging domain) can block crawlers from accessing the site.

    Note that this technique means users will still be able to see the pages, and it isn’t a 100% guarantee of blocking crawlers.
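A staging robots.txt that asks all crawlers to stay away would look like this (remembering that, as noted above, it is advisory only):

```text
User-agent: *
Disallow: /
```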

    NoIndex Tags

    These are page-level tags and aren’t always obeyed by crawlers. They are more effort to implement than the two methods above, so treat them as a last resort if needed.
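A noindex directive is a standard robots meta tag placed in the head of every staging page, for example:

```html
<!-- In the <head> of every staging page -->
<meta name="robots" content="noindex, nofollow">
```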

    Site Build

    Your site will need to be built in a language that is easily read and crawled, and there should ideally be an HTML alternative. Some new sites rely heavily on JavaScript, iframes and other technologies. These can mostly be read by crawlers, but not as effectively as HTML.


    Redirects & Structure

    301 Redirects

    301 redirects need to be put in place to ensure that users and crawlers can find the new equivalent from the old pages. The redirects should be mapped out so that each old URL goes to the most appropriate new URL. For many pages this will be straightforward, as you’ll have direct equivalents for top pages such as services and categories.

    For other pages that might not have a direct equivalent, you can either redirect to the level above (such as a product to its parent category), or to the homepage.
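On an Apache server, for instance, the mapped redirects might be implemented along these lines (the URLs here are purely illustrative):

```apache
# Direct equivalent for a renamed top-level page
Redirect 301 /our-service-offering/ /services/

# A retired product redirected to its parent category
Redirect 301 /products/old-widget/ /products/widgets/
```

Other platforms offer the same capability through server configuration or CMS redirect plugins; the important thing is the one-to-one mapping, not the tool.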

    URL Structure

    You should also make sure that the new pages and new site structure are sensible, whilst being concise and informative. For example, “/services/” rather than “/our-service-offering/”. This helps users when looking at the URLs, as well as giving crawlers clear keywords to understand the structure.


    Canonicals

    Canonical tags are put in place to ensure that duplicate content isn’t an issue.
    The most common example is on category pages when you apply filters and sorting. This is vital functionality for users, but for crawlers it represents a duplication of pages and can confuse what should be indexed. In this scenario, the canonical URL should be the version without the query string and appended parameters.
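For example, a filtered category page can point crawlers back at the clean URL with a canonical link in its head (the domain below is illustrative):

```html
<!-- On /shoes/?sort=price&colour=black, point crawlers at the clean URL -->
<link rel="canonical" href="https://www.example.com/shoes/">
```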



    On-Page Optimisations

    Page Titles

    The page titles are one of the main parts of a page’s optimisation. If you’re building like-for-like pages then you can move these across with minimal adjustments. However, if you’re changing site structures or simply adding and removing pages, then you’ll need to see what traffic is going where and then optimise accordingly.

    This can become a complex task, often better suited to SEO experts, but you can get good information by looking through the search terms and landing pages to see where users land and for what terms.

    Meta Descriptions

    This is a similar situation to the page titles, in that they should be moved across where relevant, or updated as needed. It’s also important to update branding, telephone numbers, etc. as these things often go hand in hand with migrations.

    Header Tags

    The header tag structure will often change with migrations and redesigns. Despite this, it’s important to keep the heading hierarchy intact, or to improve on it.

    You probably have H1s, which should stay relevant and descriptive rather than being cut down to the bare bones. For H2s and beyond, a migration is a good time to assess these and separate content out on pages as needed.
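As a simple sketch of the hierarchy (the page topic here is made up), each page keeps one descriptive H1 with H2s breaking the content into sections beneath it:

```html
<h1>Running Shoes</h1>           <!-- one descriptive H1 per page -->

<h2>Road Running Shoes</h2>      <!-- H2s separate the content out -->
<p>…</p>

<h2>Trail Running Shoes</h2>
<p>…</p>
```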


    Content & Design


    Header Menus

    Keeping internal linking to your main pages high and ensuring users can navigate easily is best practice in terms of SEO and for user journeys. Many redesigns result in reduced header menus or those focusing on style over substance. This can have a detrimental effect on your user experience and overall optimisation.

    When redesigning and creating a new header menu, make sure that it offers easy access to your top pages and at least matches the linking – or preferably improves upon it.

    Footer Links

    This is similar to the top menu in that it helps increase internal linking to pages, but it should also give useful links to users as this is the go-to place for contact, delivery info, terms and conditions, etc.

    Sharing & Social

    It’s still highly recommended to include some form of social sharing icons on your site.

    This doesn’t have to be a pervasive thing across the whole site, but instead it can be on your ‘end’ pages, such as products, courses, informational pages, etc. Social icons also work well on your website’s footer.




    HTML Sitemaps

    HTML sitemaps are often seen as old school, but they’re still useful for both users and crawlers. They help with accessibility, give a backwards-compatible alternative to JavaScript menus and the like, and offer crawlers easy access around the site. Automated HTML sitemaps are a simple solution to orphaned pages and deeper, harder-to-reach content.


    File/Code Upload

    XML Sitemaps

    Carrying across the sitemap.xml document is an important step in ensuring your migrated site can be crawled quickly and effectively once live. This would ideally be updated automatically when pages are added or removed, but creating a new one periodically is the next best thing. Be careful if this is a manual task, as you’ll need to ensure the URLs are up-to-date and correct. Once live, this can be submitted in Google Search Console to get crawled ASAP.
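A minimal sitemap.xml follows the sitemaps.org protocol; the domain and dates below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```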

    Search Console Verification

    As mentioned above, getting crawled quickly after the migration is key to retaining traffic. You should already have a Google Search Console property; if not, you’ll need to create one, and Google’s own documentation explains how to set it up.
    Depending on your hosting and DNS settings you may need to re-verify your domain property in Search Console. This is a fairly simple task, but if you don’t know where to find the relevant records, it’s best to hand it over to a developer.

    Google Analytics Codes

    Whether you’re using Universal Analytics, GA4, Google Tag Manager or another analytics platform, you should ensure it is tracking and ready for launch.
    The best way to do this is to set up a different property for the staging site and ensure everything you want to track is working on there. Then at launch you can switch the tracking codes over to give the least possible interruption in your data collection.
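For GA4, for example, the standard gtag.js snippet goes in the head of every page; G-XXXXXXX below is a placeholder for your own measurement ID:

```html
<!-- Google tag (gtag.js) - replace G-XXXXXXX with your measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>
```

Swapping the staging property’s ID for the live one at launch is then a single-value change.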


    Robots.txt

    This file has already been mentioned as a way to block crawlers and traffic on staging sites. Once the site is ready to launch and switches to live, you should ensure that the robots.txt is changed to the live version too.
    Make sure that anything which was blocked purely for staging is removed so that crawlers can access the site. This is a simple thing to change, but it can have drastic effects on your site. We have seen live sites blocked by their robots.txt file far too many times, so stay vigilant!
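A typical live robots.txt allows all crawlers and points them at the XML sitemap (the domain is illustrative):

```text
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```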


    Technical Considerations

    Image Tags

    When reusing the same images, make sure you carry across the alt text. If you’re uploading new imagery, copy over the relevant descriptions where images are simply replaced, or write new alt text for new images across the site.
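Alt text sits in the image tag itself and should describe what the image shows, for example:

```html
<img src="/images/red-running-shoe.jpg"
     alt="Side view of a red running shoe with a white sole">
```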

    Site Speeds

    There are several aspects to site speed, and it’s a known ranking factor, so you should aim for as fast a site as possible.

    • Use Gzip compression on your site to shrink files before they’re sent to users
    • Use up-to-date web formats for images and videos
    • Compress images and other files before upload to ensure they can be delivered to users quickly
    • Minify your code where possible; plugins can automate this task
    • Update your server settings to increase cache times in general
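On an Apache server, for example, the compression and caching points above can be sketched in configuration like this (the file types and cache times are illustrative):

```apache
# Compress text-based responses (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Cache static assets for a month (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/webp "access plus 1 month"
  ExpiresByType text/css  "access plus 1 month"
</IfModule>
```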

    Custom 404 Pages

    A custom 404 page is recommended on all sites, helping you handle all errors as effectively as possible.
    This ensures that any non-existent URLs will resolve to a 404 page that you control, and it can help you direct users back to where they need to be. Having a good menu navigation can help users back after hitting a 404 page, but you can also utilise other pieces of code to send users back to recent pages, products, etc.
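On Apache, for instance, pointing all errors of a given type at a custom page is a single directive (the page path is illustrative):

```apache
# Serve a custom page (with your normal navigation) for missing URLs
ErrorDocument 404 /404.html
```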



    Thanks for reading this whitepaper; hopefully it will help you plan a migration and cover a wide range of eventualities. By implementing the above, you should be in a good position to maintain current rankings and see a steady continuation of traffic post-migration.
    If you need further assistance, you can get in touch with us and discuss the specifics of your migration further.


    Legal Notice

    This guide is not affiliated with Google. By reading this e-book, you agree to the following terms and conditions:
    Under no circumstances should this e-book be sold, copied or reproduced in any way except when you have received written permission. As with any business, your results may vary and will be based on your background, dedication, desire and motivation. Any testimonials and examples used are exceptional results, which do not apply to the average purchaser and are not intended to represent or guarantee that anyone will achieve the same or similar results. You may also experience unknown or unforeseeable risks which can reduce results. We are not responsible for your actions. The material contained in this report is strictly confidential.

    © The Koozai Group Ltd 2023