Technical SEO is an incredibly important but often neglected step in the SEO process. In most cases, if there are problems with your technical SEO, your other SEO efforts will have far less impact.

As a result, it’s crucial that you at least have a basic understanding of technical SEO when delving into any other forms of SEO.

To the average marketer or website owner, technical SEO may sound quite scary or rather boring, but in reality most technical SEO improvements can be made in an afternoon and could solve months’ worth of traffic problems.

In this post, I’ll teach you the basics of technical SEO, alongside best practices and common problems while hopefully managing to keep you awake at the same time! Hopefully you will come away from reading this able to do your own technical audit.

What is Technical SEO?

When looking at SEO as a whole, I like to split it up into three main pillars: on-page SEO, off-page SEO and technical SEO.

The first pillar is on-page SEO. This relates to the content on your website and how it can be made more relevant to what a user may be trying to search for. Think of this as SEO that can be affected by you and the changes you make to the content on your website.

The second pillar is off-page SEO. This is the process of gaining links from other websites (often known as ‘link building’) in order to improve the trust of your website. Think of this as SEO that can’t always be affected by you and will improve over time as and when you gain backlinks to your website.

Lastly, we get to the final pillar, the holy grail: technical SEO. As mentioned earlier, this pillar is often neglected because the average marketer either doesn’t understand what technical SEO is, or has a basic understanding and thinks it’s too complicated to do anything about. Simply put, I like to think of technical SEO as the more technical aspects of a website: problems that the average marketer wouldn’t be able to identify or fix. These are technical issues because they have nothing to do with the actual content on a website.

Technical SEO best practices and common issues

Now that you have a slightly better understanding of what technical SEO actually is, I’ll take you through a number of best practices and common issues that you can cross-check with your own website in order to improve its performance and ultimately, how it ranks on Google.

Below is a list of each section in this guide. Either work your way through each of the sections one-by-one, or use this menu to skip straight to a particular section.

Add an SSL certificate to your website to make it HTTPS enabled

One of the most important best practices over the last few years is to make your website more secure by enabling HTTPS with an SSL certificate. The easiest way to spot if a website has an SSL certificate is to check to see if there’s a padlock icon to the left of a website’s URL in Google Chrome. Check your browser now or take a look at the examples below:

HTTP Example

An example of a URL without an SSL certificate

HTTPS Example

An example of a URL with an SSL certificate

When an SSL certificate is installed onto your website’s server, your website will become accessible via https:// as opposed to http://. Put simply, this indicates that any information transferred between your website and server (form completions, usernames, passwords etc) is encrypted and therefore more secure. The more secure your website is for your users, the more trusted your website will be by Google and other search engines.

If you are one of the lucky ones and your website is already HTTPS enabled, great! If not, determine which CMS (content management system) your website has been created in. Nowadays, in paid-for CMSs like Wix and Squarespace, HTTPS is built in and can be toggled on and off. With WordPress or other CMSs, you should contact your hosting provider and ask them to enable HTTPS for you.


Before moving onto the next check, it’s worth pointing out some common issues that can occur when HTTPS is not set up correctly:

  • Ensure that your website is set up to redirect to the HTTPS version. I have seen cases where no redirect has been put in place and two versions of a website have existed (an HTTP version and an HTTPS version), meaning Google would index the website as an exact duplicate, which is really bad for SEO!


  • In some cases, HTTPS is enabled on a website but the website still doesn’t show as secure. This often happens when HTTPS is enabled but there are links to HTTP versions of images in the code. Luckily this is a fairly easy fix: use Chrome developer tools to view the source code of your website, search for any media files that reference HTTP, and change these within your CMS to HTTPS.
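To show the idea behind that mixed-content check, here is a minimal sketch (purely illustrative, using a hypothetical regex-based helper rather than a real CMS feature) that finds insecure media references in a snippet of HTML and rewrites them to HTTPS:

```python
import re

# Match src/href attributes that point at http:// media files.
# A rough illustration only; real pages may need a proper HTML parser.
MEDIA_ATTR = re.compile(r'(src|href)="http://([^"]+\.(?:png|jpe?g|gif|svg|css|js))"')

def find_insecure_refs(html: str) -> list:
    """Return the insecure (http://) media URLs found in the markup."""
    return ["http://" + m.group(2) for m in MEDIA_ATTR.finditer(html)]

def upgrade_refs(html: str) -> str:
    """Rewrite http:// media references to https://."""
    return MEDIA_ATTR.sub(r'\1="https://\2"', html)

page = '<img src="http://example.com/logo.png"> <a href="https://example.com/">Home</a>'
print(find_insecure_refs(page))   # the insecure image reference
print(upgrade_refs(page))         # same markup with the image now on https
```

In practice you would make the equivalent change inside your CMS, but the logic is the same: find every media reference that still starts with http:// and switch it to https://.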

404 Pages

When we talk about 404 pages as a technical SEO issue, I’m talking about pages that Google has indexed or users are still visiting, but that no longer exist and therefore now return a 404. This commonly occurs either when a page has been deleted but Google is still referencing it in its search results, or when a URL has been changed and people are still being linked to the old URL.

The number of 404 pages your website has will depend on the size of the website. Think of it this way: the more 404 pages your website has indexed, the more likely it is that a user lands on a 404 page rather than an actual page on your website. If Google sees traffic on your website landing on 404 pages, it’s going to rank your website lower than another website with fewer 404 pages, as a user is more likely to find what they are looking for on the other website.

In order to check if your website has any 404 pages, sign in to Google Search Console and navigate to Crawl → Crawl Errors (see image below).

Google Search Console crawl errors tab

This view will show you a list of all of the 404 pages that Google has found, as well as the dates on which they were found. If the list is large, consider downloading the list as a CSV file and add redirect URLs in a column to the left of 404 pages and get the developer of your website to set up the redirects. If the list is relatively small, consider setting the redirects up one by one within your CMS.

Once your redirects are set up, head back into Google Search Console and mark the 404 pages as fixed. Over the next few days Google will try to index those pages again and if redirects are found to be in place, 404 pages will no longer be indexed. If the redirects aren’t in place properly those pesky 404 pages will appear back in the list again alongside a new detected date.
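If you go down the CSV route, the redirects themselves usually come down to one line each. As a hedged sketch (assuming an Apache-style server; the exact syntax depends on your hosting stack, and the paths below are placeholders), you could generate the redirect rules straight from your two-column mapping:

```python
import csv
import io

def redirect_rules(csv_text: str) -> list:
    """Turn a two-column CSV of (old path, new path) into Apache-style
    'Redirect 301' lines. Illustrative only; adapt to your own server."""
    rules = []
    for old, new in csv.reader(io.StringIO(csv_text)):
        rules.append(f"Redirect 301 {old.strip()} {new.strip()}")
    return rules

mapping = "/old-blog-post,/blog/new-post\n/services-2017,/services"
for rule in redirect_rules(mapping):
    print(rule)
```

Hand a list like this to your developer and each 404 URL gets a permanent redirect to its replacement page.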

Robots.txt and Sitemap.xml files

I have probably used the term ‘average marketer’ far too often in this guide, but again, robots.txt and sitemap.xml files are another aspect of technical SEO that I would not expect the average marketer to know of, let alone understand how they can affect your website in Google search results.

In order to explain what these are, I like to give some context. To generate a web index and, in turn, search results, search engines crawl each and every website using what are known as bots or spiders. When a bot first visits a website, it will read the robots.txt file. A robots.txt file is a plain text file that can be used to set rules about pages or elements of your website that you do not wish to be crawled. After checking this file and adhering to its rules, the bots will then find the sitemap.xml file, if your website has one. Think of this as a map for your website: while we navigate through websites using menus and links, bots use the sitemap.xml file as a map to visit the major pages of your website and, in turn, eventually crawl every page that hasn’t been disallowed in the robots.txt file.

An example of the journey a Google Bot takes when crawling a website

To see if your website has a robots.txt file, try going to your website and typing /robots.txt at the end of the URL, e.g. www.example.com/robots.txt. Likewise, to test for a sitemap.xml file, type /sitemap.xml at the end of your website’s URL, e.g. www.example.com/sitemap.xml.

But back to how these can affect the SEO of your website. While not having a robots.txt or sitemap.xml file won’t negatively impact the SEO of your website, having them will speed up the process via which your website is crawled or indexed, meaning that if you make on-page SEO changes, you will see much faster results.

If you don’t have either of these, they can be generated through your CMS or by contacting the developer of your website – or, in some cases, your hosting provider. In terms of best practice, I would always advise adding a link from your robots.txt file to your sitemap.xml file, and then submitting both to Google via Google Search Console. That way, when bots or web crawlers visit your robots.txt file (as they will always do first), they have a link straight to your sitemap without having to search for it. It’s also worth checking your robots.txt file for pages that shouldn’t be disallowed, as there’s always the chance that a key page of your website is accidentally blocked from being crawled and therefore won’t appear in search results.
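As an illustration, a minimal robots.txt following that advice might look like this (the disallowed paths and the sitemap URL are placeholders for your own):

```txt
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line at the bottom is the link mentioned above: it points every crawler that reads robots.txt straight at your sitemap.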

Image Hosting and Optimisation

I know what you’re thinking. How do images link with Technical SEO? Wouldn’t images be classified under on-page SEO? You’re right! The process of adding images to your website to improve your rankings would be considered on-page SEO; but, once added, there are still technical SEO checks that need to be made.

The first check is to have a look at the images on your website and ensure that they are hosted through your CMS. It may sound stupid, but you’d be surprised how often this occurs, as lazy content creators and website developers decide to link to an image that is already being hosted elsewhere rather than uploading it to your own CMS. Believe it or not, we had an experience of this with a client last year: over the course of a couple of years and a number of new website builds, one developer after another linked many of the images on the website to past dev sites, creating a trail of image links spread across 4 different websites. As a result, until this was discovered and fixed, the website performed poorly in search results because it couldn’t be trusted. There’s also the chance that such images will suddenly break if they are removed from the website that is hosting them.


An example of a broken image on a web page

The next check related to images is optimisation. It’s easy to upload an image to your CMS without worrying about its size, on the assumption that in most cases it will just scale to the size you need. While this is correct, it can have a detrimental effect on the speed and performance of your website, as users have to load images far larger than necessary. To give you a rather extreme example, take your average high-quality stock image, which could be around 5000 x 3000px in size. If you add that to a slider section on your website that is actually only 1920 x 400px, then you are loading a much larger image than you need to, especially on mobile. In order to check if any images on your website could be optimised, type your website URL into Google’s PageSpeed Insights tool and make a list of any that it highlights in the image optimisation section. Either reduce the size of these images or see if your CMS offers any sort of image compression, such as a plugin on WordPress. A very easy fix is to convert any non-transparent .png images on your website to .jpg, as they will have a smaller file size.
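To see why oversized images waste bandwidth, it helps to do the maths: an image only ever needs to be large enough to cover the slot it is displayed in. A small sketch (pure illustration, not a real CMS function) that works out the minimum dimensions needed to cover a display area while keeping the aspect ratio:

```python
import math

def cover_size(image, slot):
    """Smallest dimensions that fully cover `slot` while keeping `image`'s
    aspect ratio. Both arguments are (width, height) tuples in pixels."""
    iw, ih = image
    sw, sh = slot
    scale = max(sw / iw, sh / ih)   # scale by whichever dimension is tighter
    return (math.ceil(iw * scale), math.ceil(ih * scale))

# The 5000 x 3000px stock photo in a 1920 x 400px slider slot:
print(cover_size((5000, 3000), (1920, 400)))   # (1920, 1152)
```

In other words, even for a full-width slider, that stock photo could be resized down to roughly 1920 x 1152px before upload with no visible loss, a fraction of the original file size.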

Google’s PageSpeed Insights tool

Trailing slashes

You may or may not realise but something as small as trailing slashes can wreak havoc with your website’s SEO. Some websites will have a trailing slash at the end of their URLs and some won’t – what matters is that you don’t have both! Much like setting up your SSL certificate and forgetting to redirect the HTTP version of your website to the HTTPS, having two versions of your website with and without trailing slashes will be viewed by Google as duplicate content. An exact duplicate of your website in fact.

Checking for trailing slash problems is simple. Go to an inner page on your website and check the URL:

  • If there is already a trailing slash, remove the trailing slash and hit enter. It should redirect back to the trailing-slash version. If it doesn’t – you have a trailing slash problem on your hands!


  • If there isn’t a trailing slash, add a trailing slash and hit enter. Again, this should redirect to the original URL without the trailing slash. If it doesn’t redirect then likewise, you have a problem!


Unless you have made major structural changes to your website, it’s unlikely that you will have a trailing slash problem; your URLs will just redirect. If you do have a problem, take a look at adding a 301 redirect, or bring it to the attention of your web developer and get them to take a look for you.
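The fix is a site-wide 301 redirect enforcing one policy. As a rough sketch of the logic (written in Python purely for illustration; in practice this lives in your server or CMS configuration), here is a normaliser that decides whether a requested URL needs redirecting:

```python
from urllib.parse import urlsplit, urlunsplit

def enforce_trailing_slash(url: str, want_slash: bool = True):
    """Return the 301 target if `url` breaks the trailing-slash policy,
    or None if it is already canonical. The bare root URL is left alone."""
    parts = urlsplit(url)
    path = parts.path
    if path in ("", "/"):
        return None
    if path.endswith("/") == want_slash:
        return None
    new_path = path + "/" if want_slash else path.rstrip("/")
    return urlunsplit((parts.scheme, parts.netloc, new_path, parts.query, parts.fragment))

print(enforce_trailing_slash("https://www.example.com/about"))    # needs a redirect
print(enforce_trailing_slash("https://www.example.com/about/"))   # None, already canonical
```

Whichever policy you choose, the point is that every non-canonical URL answers with a single permanent redirect rather than serving a duplicate page.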

www. or no www.

Much like trailing slashes, choosing to have www. or no www. at the start of your website’s URL is all down to personal preference and won’t have an impact on SEO. What will have an impact is having both.

Follow the same checks as you would with the trailing slashes, testing to see if your website uses www. or not and whether a redirect is in place. If a redirect is not in place and both versions exist, then again, you have a massive duplicate content problem which will severely affect the performance of your website in Google search results.

Although I have never seen this issue on any of the websites I have built or worked on, I know that this can be fixed by accessing the domain options within cPanel (or the equivalent) associated with your hosting. Alternatively, contact your hosting provider and ask them to take a look.
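Whichever version you pick, the logic behind the fix is the same as for trailing slashes: every request on the non-canonical host gets a 301 to the canonical one. A small illustrative sketch (a hypothetical helper, not a real cPanel setting):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_host_redirect(url: str, canonical_host: str):
    """Return the 301 target if `url` is on the wrong host, else None."""
    parts = urlsplit(url)
    if parts.netloc == canonical_host:
        return None
    return urlunsplit((parts.scheme, canonical_host, parts.path, parts.query, parts.fragment))

print(canonical_host_redirect("https://example.com/contact", "www.example.com"))
# → https://www.example.com/contact
```

Note that only the host changes; the path and query string are preserved, so users and search engines always land on the equivalent page.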

Broken links throughout website

A broken link is a link on your website that goes to a 404 page or a page that no longer exists. Broken links can be split into two categories: broken links to pages on your own website, and broken links to pages on other websites. These can be found by either testing all of the links on your website or using a broken link checker tool.

With broken links to pages on your website, these can be fixed by editing the content within your CMS and changing the link to the correct URL or a new URL. Usually as soon as these are fixed, you won’t have to worry about these again unless you change the URL structure of your website.

With broken links to pages on another website, these have to be checked more often. Broken links like these often occur in blog posts where a content creator links to a page on another website but that website changes the URL of that page or removes it entirely. As a result, it is worth checking the links within your blog posts every couple of months to ensure that all links are working.

Although broken links won’t have a huge impact on SEO, Google has been known to take broken links into account when ranking pages if a large amount of traffic is being sent to 404 pages. There’s also nothing worse than a user stumbling across a broken link on your website. It may cause them to leave and never come back!
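If you crawl your site and record an HTTP status for every link, separating the two categories above is straightforward. A hedged sketch (the statuses here are a hand-made example, not real crawl data):

```python
from urllib.parse import urlsplit

def broken_links(statuses: dict, own_host: str):
    """Split links returning a 404 into internal and external lists.
    `statuses` maps each link URL to the HTTP status code it returned."""
    internal, external = [], []
    for url, status in statuses.items():
        if status == 404:
            (internal if urlsplit(url).netloc == own_host else external).append(url)
    return internal, external

statuses = {
    "https://www.example.com/old-page": 404,
    "https://www.example.com/blog": 200,
    "https://partner-site.com/moved-resource": 404,
}
internal, external = broken_links(statuses, "www.example.com")
print(internal)   # fix these once in your CMS
print(external)   # re-check these every couple of months
```

The split matters because the follow-up is different: internal broken links get fixed once, while external ones need periodic re-checking.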

Mobile Friendly

Since Google released its mobile-friendly ranking algorithm in 2015, ensuring that your website is mobile friendly has been almost as important as any other SEO process. As much as 52% of global web traffic originates from mobile devices, so if your website isn’t mobile-friendly, chances are you’re missing out on a lot of potential traffic.

Nowadays, most website themes in all the popular CMS systems will be mobile friendly, but don’t just assume that your website is fine! The best way to check that your website is mobile friendly is to test it for yourself on multiple different mobile devices. Pretend to be a user and navigate around the pages on your website, testing functionality and links, making a list of issues as you go.

Although Google will class your website as mobile-friendly if your website scales down to mobile size, there may still be UX (user experience) issues that will require fixing. Almost every website I have ever built using a mobile-friendly CMS has needed slight mobile tweaks in order to make sure that all of the content is visible and functionality is working as intended. Mobile UX issues won’t be picked up by Google when it comes to ranking websites, but what will be noted is bounce rate and session duration on mobile and therefore this can indirectly affect your rankings.
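One quick check worth doing while you’re testing: a mobile-friendly theme will almost always include a viewport meta tag in the page’s head. If it’s missing, the page won’t scale down to mobile screens at all:

```html
<!-- In the <head> of every page: tells browsers to match the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

View your page source and search for “viewport”; if this tag isn’t there, raise it with your developer before worrying about finer UX tweaks.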

Canonical URLs

If you aren’t familiar with canonical URLs, and more importantly the rel=”canonical” tag, then think of it as a simple way to tell Google which version of a page to take into account when indexing your website.

To give you some context, an example I always use to explain the purpose of rel=”canonical” is e-commerce websites. Go onto any large e-commerce website, select a product category, scroll to the bottom and try to load more products. You’ll notice that the URL changes but the category remains the same. This is where rel=”canonical” comes into play. The rel=”canonical” tag can be used to tell Google that although there are 8 pages of products, the first page is the only one that should be ranking. This, in turn, reduces the risk of duplicate content, as the last thing you want is for Google to be ranking 8 of the same product page.
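In the markup, the tag itself is a single line in the head of each paginated page, pointing back at the page you want indexed (the URL below is a placeholder):

```html
<!-- On /category/shoes?page=3, point search engines back at page 1 -->
<link rel="canonical" href="https://www.example.com/category/shoes">
```

Most CMSs and SEO plugins can generate this automatically, but it’s worth viewing the source of a paginated page to confirm the tag is actually there and pointing where you expect.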

Website Speed

Nobody likes a slow website and it’s commonly known that nearly half of all web users now expect a site to load in 2 seconds or less, often completely abandoning a website if it hasn’t loaded within 3 seconds. Furthermore, Google now takes website speed into account when ranking pages, so even small improvements to the speed of your website can result in improved rankings.

There are many ways to test your website’s speed but my favourite tool is Pingdom’s website speed test tool. Type in your website URL, start the test and see how long it takes for your website to load!


Pingdom’s Website Speed Test tool

This is the point at which I would usually go into more detail about each improvement that could be made, but as Pingdom already does that for you with performance insights, I’ll just list a couple of common suggestions:

  • Leverage browser caching – all modern browsers use some form of caching to save copies of web pages so that when you visit a website again, the browser can load elements of a page from the cache as opposed to the host server of the website.


  • Ensure that images are being optimised or compressed. As mentioned earlier, there’s nothing worse than trying to load a super large image on mobile. It’s just not necessary!


  • Reconsider your hosting provider. Although hosting providers like GoDaddy are cheap and reliable enough for a small business, the result of paying a little more and switching to a hosting provider that uses Amazon servers could be huge.


  • Minify your code files – this is something that can be done by the developer of your website or through the use of plugins/tools within your CMS. Minification essentially reduces the size of your code files, much like using a zip folder to reduce the size of multiple files at once.

It’s worth noting that Pingdom’s speed test tool isn’t 100% accurate as you’ll notice that if you do a couple of tests in a row the load time will be different every time; it is, however, a great estimation.
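To make the first suggestion concrete, “leverage browser caching” usually means sending cache headers for your static files. A hedged sketch for an Apache server using mod_expires (assuming your host has the module enabled; the lifetimes are placeholders and other servers use different syntax):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

With headers like these in place, returning visitors load your images, styles and scripts from their browser cache instead of your server, which is exactly what the speed tools are checking for.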

Schema Markup

Schema, sometimes referred to as schema.org or Schema markup, is a vocabulary of tags that can be added to your website’s HTML in order to improve how effectively search engines can read your web pages and represent them in search engine results pages.

An example of some Schema Markup code

In a similar way that image alt tags provide search engines with a description of an image, Schema can be used to highlight different content types to search engines in order to help them better understand the content. For example, if you type “cinema times” into Google, more often than not you will be shown the showtimes of films at your local cinema above the search results. That’s just one example of how Schema has been used to tell Google exactly what the content on a web page is.

In order to start tagging the content on your own website Google offers a nice Structured Data Markup Helper tool in which you can generate Schema markup for a number of different content categories that often appear on websites: Articles, Book Reviews, Events, Job Postings, Local Businesses, Movies, Products, Restaurants, Software Applications and TV Episodes.

The most common use of Schema would be for products and services. By using the Schema for products on your e-commerce website you can give your products a number of properties: e.g. Product Review Rating, Brand, Product Colour, Item Condition, Logo, Material, Model, Product Name, Product Description and many more.
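For example, a product marked up using the JSON-LD format might look like this (all of the values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Running Shoe",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "color": "Blue",
  "description": "A lightweight running shoe for everyday training.",
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "87" }
}
```

A block like this sits in a script tag on the product page and maps each property (name, brand, colour, rating and so on) directly to the Schema vocabulary, so search engines don’t have to guess what the page content means.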

Although there’s no conclusive evidence as to whether Schema markup improves rankings, what we do know is that search results where content has been marked up using Schema have better click-through rates.


An example of marked up content appearing in Google Search results

Boss Digital

When we first opened in 2010 we started as a pure SEO agency, specialising in technical SEO, on-page optimisation and link building. SEO has changed enormously in recent years, but the fundamentals of solid technical optimisation are as important as ever. This is one of our first steps, as it ensures the content we then publish on the website is able to achieve the maximum possible reach.

If you are interested in technical SEO as a service, please email, give us a call on 01628 601713 or use the form below.


Are you interested in our services? Contact us using one of the methods below.