Getting under the bonnet - technical SEO

SEO part 6 - getting under the bonnet with technical optimisation

Transcription

Now, I'm not technical, so this is going to be more about what to do as opposed to how to do it, but I just want to run through some of the essential elements of your SEO plan from a technical perspective.

So the first is speed. Speed is absolutely everything. It's been important for a long time from an SEO point of view, but it's something Google has spoken about more than ever in the last few years as mobile usage has grown and grown. If people can't access a webpage quickly then Google is never going to prioritise it in the listings. So that's the first thing.

The second is having SSL certification. Google said a couple of years ago that if a website doesn't have SSL certification then they wouldn't know if they could trust it, and its rankings might suffer. I would add that without SSL certification, very often people will be entirely blocked out of websites, so it's a disaster from a user experience and brand point of view too.

The third thing is structured data. You may have heard of something called schema, which is a form of structured data that enables Google to make sense of what a piece of content is - is it a recipe, a cinema time, a review, etc - and can therefore give you the credit you deserve for your beautifully designed and richly packed landing pages. If you create all this great content but don't have the appropriate structured data in place, Google won't necessarily be able to understand what each thing means. Missing structured data will also have an impact on click-through rate from the search engine results page, because a lot of that structured data can get pulled through to the SERPs - you might see cinema times, event information or review details pulled through directly - which can have a really significant impact on click-through rate.

Then there is the general technical maintenance of a website. I think sometimes we can be guilty of approaching SEO as a set-and-forget exercise, but there needs to be a company policy for how you're going to monitor this on an ongoing basis. For some websites that might need to be daily, while for others it could be monthly or quarterly, but there needs to be some kind of policy in place that says that every x period you will use whatever diagnostic tools you have at your disposal - which to begin with may just be Search Console - to assess the health and performance of the site and create a list of tasks required to fix the inevitable issues that arise with any site that is constantly growing and evolving.

See you next time.


On-page optimisation essentials, SEO

SEO part 5 - On-page optimisation essentials in 2 minutes

Transcription

The days of obsessing over keyword density and trying to stuff keywords and phrases into every last sentence and header are thankfully long gone. However, we do still need to help the search engines and ensure that they can actually understand what it is that we want each page to be targeting, so things like title tags and image alt tags are still really, really important.

So, starting with the title tag, which is the first thing that Google will see when it arrives on that page - typically you'll have one or two key phrases at the start followed by the brand, and you want to contain it to about 70 characters or so. Then with the image alt tags, if we can include keywords within these, fantastic. They don't immediately appear to the user, so we don't have to worry too much about that, and it's another really great opportunity to build that relevance for the search engines. And the final thing is the URL structure. As long as we're not making the URL too long, including some keywords is really sensible because, as much as anything else, if people link to that page using that URL then it's a really nice, natural way of including some keywords within the anchor text.
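To make the 70-character guideline concrete, here's a small Python sketch that assembles a title tag from key phrases and a brand and trims it to fit. The helper name and the separator are my own choices for illustration, not any standard:

```python
def build_title(phrases, brand, max_len=70):
    """Join key phrases and the brand into a title tag, trimming to max_len."""
    title = " | ".join(phrases + [brand])
    if len(title) <= max_len:
        return title
    # Keep the brand on the end; trim the phrase portion to fit.
    room = max_len - len(" | " + brand)
    return " | ".join(phrases)[:room].rstrip() + " | " + brand

print(build_title(["Tax Advisory Services London"], "Boss Digital"))
# → Tax Advisory Services London | Boss Digital
```

In practice most CMSs and SEO plugins handle this for you; the point is simply that the key phrase leads and the brand stays on the end.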

Those are the places where we can be quite pushy with the keywords. Where we need to rein it in somewhat is with the content that's actually visible to the user, so with headers and with the body copy. I'm definitely not saying don't include keywords - in fact, more often than not, some level of keyword inclusion is quite important because it reassures the user that they're on the right page and that this page is going to cater to the search query that just brought them there. However, we must ensure we don't step over that line and start to provide a spammy user experience. It's very similar with the meta description. The meta description doesn't directly affect rankings, but it does of course hugely impact likely click-through rates, which arguably in turn affect rankings, so it is very important from a search engine perspective. Once again it's really important that we include keywords, because that's going to reassure the searcher that this page is relevant, and the search engines will often bold that text, so it really draws the user's attention to the meta description. It's well worth doing, but it's just so important that we're always prioritising the key messages of the brand.
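As a rough illustration of those meta description checks, here's a quick Python sketch. The 155-character limit is a common rule of thumb, not an official Google figure, and the function name is invented for this example:

```python
def check_meta_description(description, keyword, max_len=155):
    """Return a list of warnings for a draft meta description."""
    warnings = []
    if len(description) > max_len:
        warnings.append(f"too long ({len(description)} chars, aim for <= {max_len})")
    if keyword.lower() not in description.lower():
        warnings.append(f"missing keyword: {keyword!r}")
    return warnings

print(check_meta_description("Expert tax advice in London.", "tax advice"))
# → []
```

An empty list means the draft passes both checks; otherwise each warning tells you what to fix before the brand messaging review.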


SEO Landing Page

SEO Part 4 - Creating Killer Landing Pages

Transcription

There was a time with SEO when we wouldn't worry too much about the content of landing pages from a user's point of view. As long as the page ticked certain technical boxes and provided all the right keywords then that was fine, but those days are long gone. Now one of the most important things we can do is put ourselves in the shoes of the user and consider the almost infinite array of expectations and intents they might have whenever they make a given search query.

So, for example, let's imagine that we run an accountancy firm and we want to construct the perfect landing page targeting tax advisory services in London. Now, what we have to do, as I say, is put ourselves in the place of that audience and think about all those different things that might be going through their minds when they make that search query. For example, are they hoping to find technical information? Are they hoping to find case studies and testimonials from people or businesses just like them who have purchased those services, to see what sort of experience they had? Are they hoping to find local address information and a London telephone number? Perhaps they are hoping to see links to relevant resources so that they can research the subject matter before making that enquiry.

We have to try and tick as many of these boxes as we possibly can and by doing so not only are we going to achieve far better rankings but we're also going to provide a much better user experience that inevitably leads to a higher conversion rate, so this could not be more important.

The only other suggestion I'd make is to avoid any excessive duplication of content. Some duplication is inevitable because, from a brand perspective, consistency is key. Each of these pages could be an individual's first encounter with your brand, so there are certain messages that you're always going to want to communicate, which means that some duplication is fine. But if you find that a page has in excess of 20-30% duplicate content then you've got a problem, and if you have enough pages with that level of duplication then at some point the entire website is going to get penalised, so that's the other really important consideration to make.

So I'd really encourage you to invest significant time researching the competition to see the different types of content that they're providing within those key landing pages, and ensure that you are ticking as many of those boxes in the richest and most unique way that you possibly can.


technical SEO

The Ultimate Guide to Technical SEO

Technical SEO is an incredibly important but often neglected step in the SEO process. In most cases, if there are problems with your technical SEO, then it’s very likely that the effects of your other SEO methods will have much less of an impact.

As a result, it’s crucial that you at least have a basic understanding of technical SEO when delving into any other forms of SEO.

To the average marketer or website owner, Technical SEO may sound quite scary or rather boring, but in reality, most technical SEO improvements can be made in an afternoon and could solve months’ worth of traffic problems.

In this post, I’ll teach you the basics of technical SEO, alongside best practices and common problems while hopefully managing to keep you awake at the same time! Hopefully you will come away from reading this able to do your own technical audit.

What is Technical SEO?

When looking at SEO as a whole I like to split it up into three main pillars: On-Page SEO, Off-page SEO and Technical SEO.

The first pillar is On-Page SEO. This is related to content on your website and how it can be made more relevant to what a user may be trying to search for. Think of this as SEO that can be affected by you and the changes you make to content on your website.

The second pillar is Off-page SEO. This is the process of gaining links from other websites (often known as ‘link building’) in order to improve the trust of your website. Think of this as SEO that can’t always be affected by you and will improve over time as and when you gain backlinks to your website.

Lastly we get to the final pillar, the holy grail: Technical SEO. As mentioned earlier, this pillar is often neglected because the average marketer either doesn’t understand what technical SEO is, or has a basic understanding and thinks it’s too complicated to do anything about. Simply put, I like to think of technical SEO as the more technical aspects of a website - problems that the average marketer wouldn’t be able to identify or fix. These are technical issues because they have nothing to do with the actual content on a website.

Technical SEO best practices and common issues

Now that you have a slightly better understanding of what technical SEO actually is, I’ll take you through a number of best practices and common issues that you can cross-check with your own website in order to improve its performance and ultimately, how it ranks on Google.

Each of the following best practices has its own section. Either work your way through them one by one, or skip straight to whichever is most relevant to you.

Add an SSL certificate to your website to make it HTTPS enabled

One of the most important best practices over the last few years is to make your website more secure by enabling HTTPS with an SSL certificate. The easiest way to spot if a website has an SSL certificate is to check to see if there’s a padlock icon to the left of a website’s URL in Google Chrome. Check your browser now or take a look at the examples below:

HTTP Example

An example of a URL without an SSL certificate

HTTPS Example

An example of a URL with an SSL certificate

When an SSL certificate is installed onto your website’s server, your website will become accessible via https://www.yourdomain.co.uk as opposed to http://www.yourdomain.co.uk. Put simply, this indicates that any information transferred between your website and server (form completions, usernames, passwords etc) is encrypted and therefore more secure. The more secure your website is for your users, the more trusted your website will be by Google and other search engines.
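If your website happens to run on Apache, the usual way to force the HTTPS version is a short rewrite rule in the .htaccess file. This is only a sketch - the exact approach varies by server and host, and your hosting provider can often set it up for you:

```apache
# Force HTTPS for all requests (assumes mod_rewrite is enabled)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The R=301 flag makes the redirect permanent, which also tells search engines to update their index to the HTTPS URLs.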

If you are one of the lucky ones and your website is already HTTPS enabled, great! If not, determine which CMS (content management system) your website has been created in. Nowadays, in paid-for CMSs like Wix and Squarespace, HTTPS is built in and can be toggled on and off. With WordPress or other CMSs, you should contact your hosting provider and ask them to enable HTTPS for you.


Before moving onto the next check, it’s worth pointing out some common issues that can occur when HTTPS is not set up correctly:

  • Ensure that your website is set up to redirect to the HTTPS version. I have seen cases where no redirect has been put in place and two versions of a website have existed - an HTTP version and an HTTPS version - in which case Google would index the website as an exact duplicate, which is really bad for SEO!

 

  • In some cases, HTTPS is enabled on a website but the website is still not showing up as secure. This often happens when HTTPS is enabled but there are links to HTTP versions of an image in the code. Luckily this is a fairly easy fix – simply use Chrome developer tools to view the source code of your website, search for any media files that reference HTTP, and change these within your CMS to HTTPS.
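Searching the source by hand works, but the same mixed-content check can be scripted. Here's a rough Python standard-library sketch that flags any src attribute still pointing at an HTTP resource - a real audit tool would also cover stylesheets, scripts and inline styles:

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collect http:// resource references that would flag an HTTPS page as insecure."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            # Loaded resources (src attributes) are what trigger mixed-content warnings.
            if name == "src" and value and value.startswith("http://"):
                self.insecure.append((tag, value))

def find_mixed_content(html):
    finder = MixedContentFinder()
    finder.feed(html)
    return finder.insecure

page = '<img src="http://example.com/logo.png"><img src="https://example.com/ok.png">'
print(find_mixed_content(page))
# → [('img', 'http://example.com/logo.png')]
```

Each tuple names the offending tag and the insecure URL, giving you a to-do list of media to update in the CMS.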

404 Pages

When we talk about 404 pages as a technical SEO issue, I’m talking about website pages that Google has indexed or users are still visiting, but which no longer exist and therefore now return a 404 error. This commonly occurs either when a page has been deleted but Google is still referencing that page in its search results, or when a URL has been changed and people are still being linked to the old URL.

The number of 404 pages that your website has will depend on the size of the website. Think of it this way: the more 404 pages your website has indexed, the more likely it is that a user lands on a 404 page rather than an actual page on your website. If Google sees traffic on your website landing on 404 pages, it’s going to rank your website lower than another website with fewer 404 pages, as a user is more likely to find what they are looking for on the other website.

In order to check if your website has any 404 pages, sign in to Google Search Console and navigate to Crawl → Crawl Errors (see image below).

Google Search Console crawl errors tab

This view will show you a list of all of the 404 pages that Google has found, as well as the dates on which they were found. If the list is large, consider downloading it as a CSV file, adding the redirect destination for each 404 page in an adjacent column, and getting the developer of your website to set up the redirects. If the list is relatively small, consider setting the redirects up one by one within your CMS.
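On an Apache server, those redirects typically end up as 301 rules in the .htaccess file. The paths below are invented placeholders purely for illustration:

```apache
# Permanently redirect old (now 404) URLs to their closest live equivalents
Redirect 301 /old-services-page /services
Redirect 301 /blog/old-post-url /blog/new-post-url
```

One rule per retired URL; the browser and Googlebot both follow the redirect, so the 404 disappears from both the user journey and the index.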

Once your redirects are set up, head back into Google Search Console and mark the 404 pages as fixed. Over the next few days Google will try to index those pages again and if redirects are found to be in place, 404 pages will no longer be indexed. If the redirects aren’t in place properly those pesky 404 pages will appear back in the list again alongside a new detected date.

Robots.txt and Sitemap.xml files

I have probably used the term ‘average marketer’ far too often in this guide, but again, robots.txt and sitemap.xml files are another aspect of technical SEO that I would not expect the average marketer to know of, let alone understand how they can affect your website in Google search results.

In order to explain what these are, I like to give some context. To generate a web index and, in turn, search results, search engines will crawl each and every website using what are known as bots or spiders. When a bot first visits a website, it will read the robots.txt file. A robots.txt file is a plain text file that can be used to set rules about pages or elements of your website that you do not wish to be crawled. After checking this file and adhering to the rules, the bots will then find the sitemap.xml file, if your website has one. Think of this as a map for your website: while we navigate through websites using menus and links, bots use the sitemap.xml file as a map to visit the major pages of your website and, in turn, eventually crawl every page that hasn’t been disallowed in the robots.txt file.

An example of the journey a Google Bot takes when crawling a website

To see if your website has a robots.txt file, try typing /robots.txt at the end of your website’s URL, e.g. www.yourdomain.co.uk/robots.txt. Likewise, to test for a sitemap.xml file, type /sitemap.xml at the end of your website’s URL, e.g. www.yourdomain.co.uk/sitemap.xml.

But back to how these can affect the SEO of your website. While not having a robots.txt or sitemap.xml file won’t negatively impact the SEO of your website, having them will speed up the process by which your website is crawled and indexed, meaning that if you make on-page SEO changes, you will see much faster results.

If you don’t have either of these, they can be generated through your CMS or by contacting the developer of your website – or, in some cases, your hosting provider. In terms of best practice, I would always advise adding a link from your robots.txt file to your sitemap.xml file; both can then also be submitted to Google via Google Search Console. That way, when bots or web crawlers visit your robots.txt file (as they will always do first), they have a link straight to your sitemap without having to search for it. It’s also worth checking for pages that shouldn’t be disallowed, as there’s always the chance that a key page of your website is not being crawled by search engines and so wouldn’t appear in search results.
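A minimal robots.txt following that advice might look like this. The disallowed paths are hypothetical examples - the User-agent rule and the Sitemap line are the standard pieces:

```txt
User-agent: *
Disallow: /admin/
Disallow: /thank-you/

Sitemap: https://www.yourdomain.co.uk/sitemap.xml
```

The wildcard User-agent applies the rules to all crawlers, and the Sitemap line gives every bot a direct pointer to your sitemap.xml.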

Image Hosting and Optimisation

I know what you’re thinking. How do images link with Technical SEO? Wouldn’t images be classified under on-page SEO? You’re right! The process of adding images to your website to improve your rankings would be considered on-page SEO; but, once added, there are still technical SEO checks that need to be made.

The first check is to have a look at the images on your website and ensure that they are hosted through your CMS. It may sound obvious, but you’d be surprised how often this occurs, as lazy content creators and website developers decide to link to an image that is already being hosted elsewhere rather than uploading it to your own CMS. Believe it or not, we had an experience of this with a client last year – over the course of a couple of years and a number of new website builds, one developer after another linked many of the images on the website to past dev sites, creating a trail of image links spread across four different websites. As a result, until this was discovered and fixed, the website performed poorly in search results because it couldn’t be trusted. There’s also the chance that such images will suddenly become broken if they are removed from the website hosting them.

broken-image-example

An example of a broken image on a web page

The next check related to images is optimisation. It’s easy to upload an image to your CMS without worrying about its size, on the assumption that in most cases it will just scale to the size you need. While this is correct, it can have a detrimental effect on the speed and performance of your website, as users have to load images far larger than necessary. To give you a rather extreme example, take your average high-quality stock image, which could be around 5000 x 3000px in size. If you add that to a slider section on your website that is actually only 1920 x 400px, then you are loading a much larger image than you need to, especially on mobile. In order to check if any images on your website could be optimised, type your website URL into Google’s PageSpeed Insights tool and make a list of any that it highlights in the image optimisation section. Either reduce the size of these images or see if your CMS offers any sort of image compression, such as a plugin on WordPress. A very easy fix is to convert any non-transparent .png images on your website to .jpg, as they will have a smaller file size.
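To put the slider example into numbers, here's a quick back-of-the-envelope calculation. File size only scales roughly with pixel count, so treat the result as an estimate rather than a precise saving:

```python
def pixel_ratio(orig_w, orig_h, target_w, target_h):
    """Roughly how many times more pixels the original carries than the slot needs."""
    return (orig_w * orig_h) / (target_w * target_h)

# The stock-image example from above: a 5000x3000 image served into a 1920x400 slot.
print(round(pixel_ratio(5000, 3000, 1920, 400), 1))
# → 19.5
```

In other words, the stock image carries roughly twenty times the pixels the slider slot actually needs, which is bandwidth your mobile visitors pay for.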

Google’s PageSpeed Insights tool

Trailing slashes

You may not realise it, but something as small as trailing slashes can wreak havoc with your website’s SEO. Some websites will have a trailing slash at the end of their URLs and some won’t – what matters is that you don’t have both! Much like setting up your SSL certificate and forgetting to redirect the HTTP version of your website to the HTTPS version, having two versions of your website, with and without trailing slashes, will be viewed by Google as duplicate content. An exact duplicate of your website, in fact.

Checking for trailing slash problems is simple. Go to an inner page on your website and check the URL:

  • If there is already a trailing slash, remove it and hit enter. The URL should redirect back to the trailing-slash version. If it doesn’t – you have a trailing slash problem on your hands!

 

  • If there isn’t a trailing slash, add one and hit enter. Again, this should redirect to the original URL without the trailing slash. If it doesn’t redirect then, likewise, you have a problem!

 

Unless you have made major structural changes to your website, it’s unlikely that you will have a trailing slash problem and your URLs will just redirect. But if you do, take a look at adding a 301 redirect, or bring this to the attention of your web developer and get them to take a look for you.
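If you want to see what a consistent trailing-slash convention looks like in code, here's a small Python sketch of the canonical form such a redirect should enforce. The rule of skipping paths that contain a file extension is my own simplification:

```python
from urllib.parse import urlsplit, urlunsplit

def enforce_trailing_slash(url):
    """Return the canonical trailing-slash form of a page URL.

    File-like paths such as /sitemap.xml are left untouched.
    """
    parts = urlsplit(url)
    path = parts.path
    if path and not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    return urlunsplit(parts._replace(path=path))

print(enforce_trailing_slash("https://www.yourdomain.co.uk/services"))
# → https://www.yourdomain.co.uk/services/
```

Whichever convention you pick, the key point is that one form is canonical and the other always 301-redirects to it.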

www. or no www.

Much like trailing slashes, choosing to have www. or no www. at the start of your website’s URL is all down to personal preference and won’t have an impact on SEO. What will have an impact is having both.

Follow the same checks as you would with the trailing slashes, testing to see if your website uses www. or not and whether a redirect is in place. If a redirect is not in place and both versions exist, then again, you have a massive duplicate content problem which will severely affect the performance of your website in Google search results.

Although I have never seen this issue on any of the websites I have built or worked on, I know that it can be fixed by accessing the domain options within cPanel (or the equivalent) associated with your hosting. Alternatively, contact your hosting provider and ask them to take a look.
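For reference, on an Apache server the www. redirect is much the same shape as the HTTPS one. This is a sketch only, with yourdomain.co.uk standing in for your real domain:

```apache
# Send the bare domain to the www. version (swap the two to prefer the bare domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.yourdomain.co.uk/$1 [L,R=301]
```

As with the trailing slash, the choice of version is a preference; the 301 is what stops Google seeing two copies of the site.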

Broken links throughout website

A broken link is a link on your website that points to a 404 page - in other words, a page that no longer exists. Broken links can be split into two categories: broken links to pages on your website, and broken links to pages on another website. These can be found either by testing all of the links on your website or by using a broken link checker tool like http://www.brokenlinkcheck.com/.

With broken links to pages on your website, these can be fixed by editing the content within your CMS and changing the link to the correct URL or a new URL. Usually as soon as these are fixed, you won’t have to worry about these again unless you change the URL structure of your website.

With broken links to pages on another website, these have to be checked more often. Broken links like these often occur in blog posts where a content creator links to a page on another website but that website changes the URL of that page or removes it entirely. As a result, it is worth checking the links within your blog posts every couple of months to ensure that all links are working.
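The first half of a checker like that - gathering every link on a page and splitting internal from external - can be sketched with the Python standard library. Actually requesting each URL to test for a 404 is deliberately left out so the example stays offline; the class name and example URLs are invented:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

class LinkCollector(HTMLParser):
    """Gather every <a href> on a page, resolved to an absolute URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base, href)  # resolve relative links against the page URL
        same_site = urlsplit(absolute).netloc == urlsplit(self.base).netloc
        (self.internal if same_site else self.external).append(absolute)

collector = LinkCollector("https://www.yourdomain.co.uk/blog/")
collector.feed('<a href="/about">About</a> <a href="https://other.com/page">Ref</a>')
print(collector.internal, collector.external)
```

Each collected URL could then be fetched and any 4xx response reported - internal ones you fix once, external ones you re-check every couple of months, as above.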

Although broken links won’t have a huge impact on SEO, Google has been known to take them into account when ranking pages if a large amount of traffic is being sent to 404 pages. There’s also nothing worse than a user stumbling across a broken link on your website. It may cause them to leave and never come back!

Mobile Friendly

Since Google released its mobile-friendly ranking algorithm in 2015, ensuring that your website is mobile friendly has become almost as important as any other SEO process. As much as 52% of global web traffic originates from mobile devices, so if your website isn’t mobile-friendly, chances are you’re missing out on a lot of potential traffic.

Nowadays, most website themes in all the popular CMS systems will be mobile friendly, but don’t just assume that your website is fine! The best way to check that your website is mobile friendly is to test it for yourself on multiple different mobile devices. Pretend to be a user and navigate around the pages on your website, testing functionality and links, making a list of issues as you go.

Although Google will class your website as mobile-friendly if your website scales down to mobile size, there may still be UX (user experience) issues that will require fixing. Almost every website I have ever built using a mobile-friendly CMS has needed slight mobile tweaks in order to make sure that all of the content is visible and functionality is working as intended. Mobile UX issues won’t be picked up by Google when it comes to ranking websites, but what will be noted is bounce rate and session duration on mobile and therefore this can indirectly affect your rankings.

Canonical URLs

If you aren’t familiar with canonical URLs, and more importantly the rel=”canonical” tag, then think of it as a simple way to tell Google which version of a page to take into account when indexing your website.

To give you some context, an example I always use to explain the purpose of rel=”canonical” is e-commerce websites. Go onto any large e-commerce website, select a product category, scroll to the bottom and load more products. You’ll notice that the URL changes but the category remains the same. This is where rel=”canonical” comes into play. The rel=”canonical” tag can be used to tell Google that although there are, say, 8 pages of products, the first page is the only one that should be ranking. This, in turn, reduces the risk of duplicate content, as the last thing you want is for Google to be ranking 8 versions of the same category page.
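In the page source, the tag itself is a single line in the <head> of each paginated page - here with a hypothetical category URL standing in for a real one:

```html
<!-- On pages 2 onwards of the paginated category, point search engines at page 1 -->
<link rel="canonical" href="https://www.yourdomain.co.uk/category/widgets/" />
```

Every variant of the category URL carries the same canonical, so however a crawler arrives, it is told which single version to index.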

Website Speed

Nobody likes a slow website and it’s commonly known that nearly half of all web users now expect a site to load in 2 seconds or less, often completely abandoning a website if it hasn’t loaded within 3 seconds. Furthermore, Google now takes website speed into account when ranking pages, so even small improvements to the speed of your website can result in improved rankings.

There are many ways to test your website’s speed but my favourite tool is Pingdom’s website speed test tool. Type in your website URL, start the test and see how long it takes for your website to load!

pingdom-speed-test

Pingdom’s Website Speed Test tool

This is the point at which I would usually go into more detail about each improvement that could be made, but as Pingdom already does that for you with performance insights, I’ll just list a couple of common suggestions:

  • Leverage browser caching – all modern browsers use some form of caching to save copies of web pages so that when you visit a website again, the browser can load elements of a page from the cache as opposed to the host server of the website.

 

  • Ensure that images are being optimised or compressed. As mentioned earlier, there’s nothing worse than trying to load a super large image on mobile. It’s just not necessary!

 

  • Reconsider your hosting provider. Although hosting providers like GoDaddy are cheap and reliable enough for a small business, the result of paying a little more and switching to a hosting provider that uses Amazon servers could be huge.

 

  • Minify your code files – this is something that can be done by the developer of your website or through the use of plugins/tools within your CMS. Minification essentially reduces the size of your code files, much like using a zip folder to reduce the size of multiple files at once.
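To give a feel for what the minification in that last suggestion actually does, here's a deliberately crude Python sketch for CSS. Real minifiers are far more careful about strings, selectors and edge cases, so this is illustration only:

```python
import re

def minify_css(css):
    """Crude CSS minifier: strip comments and collapse whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # trim space around punctuation
    return css.strip()

print(minify_css("body {\n  color: red;  /* brand colour */\n}"))
# → body{color:red;}
```

The browser reads both versions identically; the minified one is simply fewer bytes to download, which is the whole point.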

It’s worth noting that Pingdom’s speed test tool isn’t 100% accurate – you’ll notice that if you run a couple of tests in a row, the load time will be different each time – but it is a great estimation.

Schema Markup

Schema, sometimes referred to as Schema.org or Schema markup, is a vocabulary of HTML tags that can be added to content on your website in order to improve how effectively search engines can read your webpages and represent them in search engine results pages.

An example of some Schema Markup code

In a similar way that image alt tags provide search engines with a description of an image, Schema can be used to highlight different content types to search engines in order to help them better understand the content. For example, if you type “cinema times” into Google, more often than not you will be shown the showtimes of films at your local cinema above the search results. That’s just one example of how Schema has been used to tell Google exactly what the content on a web page is.

In order to start tagging the content on your own website Google offers a nice Structured Data Markup Helper tool in which you can generate Schema markup for a number of different content categories that often appear on websites: Articles, Book Reviews, Events, Job Postings, Local Businesses, Movies, Products, Restaurants, Software Applications and TV Episodes.

The most common use of Schema is for products and services. By using the Product schema on your e-commerce website you can give your products a number of properties, e.g. Product Review Rating, Brand, Product Colour, Item Condition, Logo, Material, Model, Product Name, Product Description and many more.
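Marked up as JSON-LD (one of the formats Google accepts for Schema), a product with a few of those properties might look like this - the product details are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trainer",
  "description": "Lightweight running shoe for road running.",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "color": "Blue",
  "itemCondition": "https://schema.org/NewCondition",
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "128" }
}
</script>
```

The block sits anywhere in the page's HTML and is invisible to users; it's the aggregateRating portion that can surface as review stars in the search results.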

Although there’s no conclusive evidence as to whether Schema markup improves rankings, what we do know is that search results in which content has been marked up using Schema have better click-through rates.

cinema-times-schema

An example of marked up content appearing in Google Search results

Boss Digital

When we first opened in 2010 we started as a pure SEO agency, specialising in technical SEO, on-page optimisation and link building. SEO has changed enormously in recent years, but the fundamentals of solid technical optimisation are as important as ever. This is one of our first steps, as it ensures the content we then publish on the website is able to achieve the maximum possible reach.

If you are interested in technical SEO as a service, please email hello@boss-digital.co.uk, give us a call on 01628 601713 or use the form below.

CONTACT US

Are you interested in our services? Contact us using one of the methods below.




Website architecture, SEO

SEO Part 3: Designing the perfect website architecture for both SEO and UX

Transcription

Before we write a single line of code for a new website, we need to be really clear on where we’re trying to take this in the long term. We need a really clear sitemap that might not all be created before the site goes live but certainly provides a clear sense of direction for how this is going to evolve over time.

There are two distinct elements to this. The first is from a user experience point of view: what do we want people to do on the website, and how are we going to get them to do it? The second is from an SEO perspective. Now, to some extent these two things overlap, and with every year that goes by the degree to which they overlap has become greater and greater - we used to have all sorts of battles between SEO experts and UX experts.

Thankfully those times are changing and there’s an ever-increasing overlap, so I’ll just give a very quick example. Let’s imagine we run a gym chain. The home page will naturally target certain keywords, and if we run a number of different locations then presumably we’re going to have a number of pages that each target those different local terms. Perhaps we might have a page for Chiswick that targets terms like “gym in chiswick” or “ladies gym in chiswick” or “budget gym in chiswick”. However, there are almost certainly going to be terms that it doesn’t cater to in any great detail. For example, maybe someone in Chiswick is searching for “yoga classes”. Perhaps the gym does offer yoga classes, but it might be 1% of what the gym does, and therefore cluttering up the user experience with details of the yoga classes doesn’t necessarily make sense. On the other hand, there’s no harm in getting that traffic - maybe we offer really good yoga classes and it’s something we want to be doing more of in the future - so we still want to be getting the attention of those people making that search query; we just don’t want it interfering with the primary user experience. So in that case we might create a separate landing page targeting yoga classes in Chiswick, with really rich content about the instructor, the different types of yoga taught, and other information that’s going to add to the UX, but this is a separate page and not something that forms part of the primary user experience.

Now as you can imagine, doing this for some businesses can result in vast sitemaps. Some businesses have hundreds of different products and services or locations served, and with all the different variables and combinations of those things, you can end up with sitemaps that are thousands of pages large. So it’s not always practical to think that they’re all going to be created before the site goes live, but it is important that you have a sense of where you’re going to take this in the long term.
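On the technical side, the pages that do exist are typically declared to search engines through an XML sitemap file. A minimal sketch, using hypothetical URLs for the gym example above (the domain and paths are illustrative, not a real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Primary local landing page -->
  <url>
    <loc>https://www.example-gym.co.uk/gyms/chiswick/</loc>
    <changefreq>monthly</changefreq>
  </url>
  <!-- Secondary SEO landing page, outside the main navigation -->
  <url>
    <loc>https://www.example-gym.co.uk/gyms/chiswick/yoga-classes/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Note that the sitemap protocol caps a single file at 50,000 URLs, so the very large sites described above split theirs into multiple files referenced from a sitemap index.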

You may also want to consider searches that represent people earlier on in the sales funnel, so rather than just focusing on those who are ready to buy, we might also want to target people who are earlier on in their decision making process. For example, rather than someone searching for “gym chiswick”, it might be more a case of targeting people searching for educational material on nutrition or exercise generally. This is really important as we want to be capturing this data and building these relationships. Now with pages like this we may decide that they belong more appropriately within a blog or resources section, and we don’t necessarily need to create them before the site goes live - rather, they’re going to form part of a broader content strategy. Again, it’s just so important, as with all of this, that we have a really clear sense of where we’re trying to take this architecture in the long term so that we don’t create any obstacles for ourselves in the short term.


Choosing your domain, SEO

SEO Part 2: Choosing the right domain for both SEO and brand

Transcription

Now depending on what stage you’re at in the process of developing your website, you may or may not already have a domain. If you don’t have a domain and it’s a decision you’re looking to take, you’re probably thinking that it could have quite significant implications from an SEO point of view, and historically you would have been right. Picking the right, keyword-rich domain was considered one of the most important steps in developing an SEO strategy.

Not least because when people link to the domain they’re going to be including keywords in a very natural, organic way, so it definitely has its advantages. However, those benefits are vastly outweighed by the potential downsides if it’s not brandable. So, to give you an example, let’s imagine you’re looking to launch a brand called tvsonline.com. You could imagine people talking about it as a real brand - they could trust it, and it could even have offline shops. However, tvs-online-uk-cheap.info - forget it. Nobody’s trusting that, so even if it benefits from some very short term artificial ranking inflation, it’s not going to be sustainable. People are not going to trust it as a brand, and Google is going to see that people don’t trust it as a brand.

That’s the first consideration. You have to prioritise the sense of brand. Can you imagine this becoming a brand?

Secondly, we have to consider the extension. Now, as a general rule, if you can get a .com, fantastic, but that’s not always easy. If your audience is local - UK based, for example - then a .co.uk can work really well. A .net can work well. A .org can be really nice if you’re looking to develop a sense of community. I’m usually reluctant to suggest others, but again it comes down to the brand. For example, .london could work really well for certain brands. You wouldn’t necessarily expect any short term SEO benefit - you’d need to build those brand signals - but certainly it can work for particular businesses.

The final consideration I’ll add is the inclusion or exclusion of hyphens. Within the domaining world there tends to be a bit of snobbery towards hyphenated domains, and if you can get the non-hyphenated version then that’s probably best. But not necessarily. Sometimes a hyphen can help break up the words and improve the legibility of the domain, so again it comes down to how you feel about it and whether or not you can imagine it becoming a real brand. And to be frank, hyphenated domains are easier to find, so you may not have a lot of choice.

So, those are all the considerations. Ultimately, it comes down, as I keep saying, to this one thing - brand.


First rule of SEO

SEO Part 1: The First Rule of SEO...

Transcription

The first thing to get in your mind when starting an SEO strategy is to forget all about SEO. That probably sounds like an odd thing to say, but the reason is simple. The days of very tactical, very technical SEO campaigns that are geared towards identifying the shortcomings of Google’s algorithm and using them to your advantage - those days are long gone. If that’s your objective, to sneak under the radar, employ some dodgy tactics and artificially inflate your rankings, then forget it. You may enjoy some short term wins, but it’s never going to be sustainable.

Google has one objective in mind and that’s to provide the best content from the most trusted brands for any given search query and it is bloody good at doing it. So you need to be thinking about all the different things that contribute to that - you need to be thinking about the business you’re representing, the audience you’re trying to engage with, the content that’s going to satisfy their search intent, you need to be thinking about social media, and mobile UX and a thousand other things, but first of all you need to begin by adopting that mentality - this is all about building the best content and the most trusted brand.


The Marketing Genius Archive 2

The Marketing Genius Archive #2 - Marshall McLuhan, The Medium Is The Message

Transcription

Over 50 years ago, Marshall McLuhan wrote that the medium is the message, by which he meant that the way in which we consume information is more important than the information itself. So the introduction of the telephone was more important than any individual message that passed through it, likewise for radio, television and of course social media.

I have mixed feelings about this. On the one hand, I believe that in the world of social media, businesses are often guilty of putting the platform first without any sense of the content they want to create. They jump onto the latest channel out of fear of falling behind the competition. And I believe that great content is great content, and if you get that right, then to some extent the channel will take care of itself.

However, this is of course a slight oversimplification, and I, like many marketers, would probably benefit from occasionally reminding myself of McLuhan’s mantra. The channel we use not only determines who sees the content, but also the format in which it needs to be presented, the duration for which they will consume that content, and the weight and gravitas that they attach to it - an ad in a magazine, for example, is likely to make a far greater impact than an ad at the bottom of a YouTube video. But perhaps most importantly, it determines the objective of the content and the Key Performance Indicator that will be used to monitor that objective. If the goal is to maximise brand reach, then radio or Instagram may be the most appropriate channels, whereas if the goal is to capture email addresses and nurture the relationship over time, then a blog or resources section is likely to be more effective.

So, I probably have to concede that McLuhan was onto something, but I still maintain that it is those brands that master their message and their content that will be around for the next platform to emerge, and the next 50 after that.

See you next time.


Slowing down to speed up - the importance of laying the right foundations before launching a content marketing campaign

Transcription

When new projects launch there tends to be a rush to get content out and ads running. Key decision makers want to see stuff happening. They want to see signs of early progress.

The trouble is that this activity they are so keen to see is there to feed into something else, so if that something else isn’t correctly set up then you’re not going to see the full benefit of the work you produce. Certain things need to exist first. It’s a bit like when you see someone stick spoilers on an old Fiat Punto. You sort of feel like they’ve missed the point.

The most common examples of this that we see online include:
- A lack of a coherent brand visual identity
- Poorly designed websites that aren’t going to convert
- Weak technical or on-page optimisation that means even as you grow the authority of the domain, the site still won’t bring in targeted traffic
- The lack of a strong email capture as this is often an important secondary objective of a website
- Goal and event tracking that enables you to make sense of what’s happening on the site
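On that last point, goal and event tracking is most commonly done in the gtag.js style used by Google Analytics. Here is a minimal sketch: on a real page Google’s snippet defines the `gtag` function, but it is stubbed below so the example is self-contained, and the event name and parameters are purely illustrative.

```javascript
// On a live page, Google's gtag.js snippet defines dataLayer and gtag();
// they are stubbed here so the sketch runs on its own.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Record a hypothetical goal: a completed email sign-up from the footer form.
gtag('event', 'sign_up', { method: 'newsletter_footer' });

// The recorded events can now be inspected (in reality, sent to Analytics).
console.log(dataLayer.length); // 1
```

Once events like this are flowing, they can be marked as conversions in Analytics, which is what turns raw traffic numbers into the goal tracking described above.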

Until these foundations are in place, forget the content calendars, press releases and media plans, as the return you will get from them will be just a fraction of what it could be.

This is a lot easier said than done, as these decision makers are by their very nature highly impatient - they want to see things happening yesterday. All I would suggest to help alleviate this pressure is to try to get a direct response campaign up and running as quickly as possible. That could be AdWords PPC, Facebook direct response, or an offline lead generation campaign - something that will drive results early so that you can then justify to that decision maker the need for serious investment in the brand, the user experience, and all of the optimisation surrounding the website, so that moving forwards they get the greatest possible return for every pound spent.

See you next time.

Dan


Red Light Marketing - Proven to get you traffic

Red Light Marketing - Proven to get you traffic

Are you struggling to bring people to your site? Is your engagement low? Are your Google Analytics results the stuff of nightmares?

Here at Boss Digital, we understand that successful marketing is all about visibility. So this April 1st, we’re launching our new ‘red light’ plan - proven to get you traffic. Our specially designed program, which has been in development for months, is available in three foolishly good packages:

Rush Hour

Designed to bring as many people to your site as possible in a short time period, this special marketing algorithm condenses all your daily traffic into a one-hour time period between 8-9am or 5-6pm for maximum exposure.

Gridlock

What use is getting people to your site unless you can keep them there? Improve your bounce rate with our exclusive line of code which traps visitors to your site on specific pages for unspecified lengths of time.

Tailback

Good websites have a streamlined navigation map to help visitors move easily between different pages. Great websites employ our new tailback feature, which redirects visitors back to the original landing page they arrived on for an extra click-through boost.

Want to find out more about what our red light marketing schemes can do for you? Contact us at afools@boss-digital.co.uk