With this in mind, here is a list of four pages that every website should have for SEO and PPC.
As always, we’d love to hear your thoughts, so if you have anything you’d like to add, please let us know in the comments section.
This simple .txt document allows you to give instructions to web robots about your site. Using this file, you can prevent certain pages on the site from being indexed as well as direct the search engines to other pages on the site, such as your sitemap.xml file. However, it is important to remember that robots do not have to comply with the instructions set out in the file, so malware robots and email harvesters will most likely ignore your robots.txt file.
It is fairly straightforward to create one of these files: simply open Notepad or an equivalent programme, such as NoteTab Pro, and structure the file like so:
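As a sketch of the kind of file described here, a minimal robots.txt that allows full indexing and points search engines at a sitemap might look like this (the example.com URL is a placeholder for your own domain):

```
User-agent: *
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```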
The Allow: / reference simply tells search engines to index every page on the site, whilst the reference to sitemap.xml directs them to that file as quickly as possible, allowing your site to be indexed promptly. When saving the file, ensure it is named robots.txt and arrange for it to be uploaded to the root of the site so that it can be found at http://www.example.com/robots.txt.
If you want to test that the file you have uploaded is working correctly, I would recommend utilising the Crawler Access section within Google Webmaster Tools, located in Site Configuration -> Crawler Access -> Blocked URLs (robots.txt). If you scroll to the bottom of the page, there is a section for testing. If Google returns a 200 (Success) for the file, you know it has been uploaded correctly.
The above covers merely the basics of what you can do with the file. You can also exclude pages from being indexed; to find out more about this topic, visit http://www.robotstxt.org/.
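To illustrate the exclusion mentioned above, pages are blocked with the Disallow directive; the path below is purely illustrative:

```
User-agent: *
Disallow: /private-page/
```

Any robot that honours the file will skip that path when crawling, though, as noted earlier, compliance is voluntary.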
Simply put, this is a list of your site’s URLs, presented so that they can be found quickly and easily by the search engines. Search engines such as Google and Bing do not spend an indefinite period of time crawling every site, so to ensure all of your content is indexed, it is more efficient to direct them to this file, from which they can visit each page on your site.
When uploading your sitemap.xml file, ensure it is named sitemap.xml and upload it to the root of the site so that it can be found at http://www.example.com/sitemap.xml. To ensure Google is aware of the file, it is recommended that you utilise the Sitemaps section located on the dashboard of your Google Webmaster Tools account. If you go to the Add/Test Sitemap section, you can submit the file to Google to ensure it is used the next time your site gets indexed.
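For reference, a bare-bones sitemap.xml follows the sitemaps.org protocol; the URLs below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
  </url>
</urlset>
```

Each page gets its own url entry; most content management systems can generate and update this file automatically.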
4. Thank You
Finally, a Thank You page is key to tracking conversions on the site and helps you calculate any return on your investment. Put simply, this page is what is returned to the user once they fill out a form on the site. By tracking how often this page is displayed, you can begin to understand how many conversions can be attributed to SEO and PPC activity.
To set this page up, I would recommend building it at http://www.example.com/thank-you and setting your forms up to direct users to this page once the form is submitted. I would also exclude the page from search engines via your robots.txt file and ensure it is not included in the sitemap.xml file. Once the page is live, you can set up conversion tracking in AdWords and Analytics.
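Assuming the thank-you page lives at the example URL suggested above, the robots.txt exclusion would be a single Disallow line:

```
User-agent: *
Disallow: /thank-you
```

This keeps the page out of search results, so the only people who reach it (and trigger your conversion tracking) are those who have actually submitted a form.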
Copyright © 2006 - 2014, Koozai Ltd