Anyone can do an SEO health check and ring the changes, says Nick Capehorn, copywriter, Theme Group
There are many areas of SEO that warrant bringing in someone with expertise – quality link building, sound keyword research, good content – but a quick website health check can be performed by just about anyone. Here’s how you can find and address some key issues that could be holding your SEO back.
Due to the complex nature of SEO, you may well be hamstrung by issues invisible to the casual observer. So before you do anything, turn off your JavaScript. How you do this depends on your browser and computer type; a Google search for ‘turning off JavaScript’ will show you how.
JavaScript is used to pull images, Flash and other elements that aren’t within the main code on to the page; while it’s on, you could be seeing content that’s stored elsewhere and may not be accessible to search engines. Anything that vanishes when JavaScript is turned off will probably go unseen by search engines too – and you won’t be getting any SEO benefit from it.
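If you’d rather script the check, here is a minimal sketch in Python – it uses only the standard library, and the URL is a placeholder you’d swap for one of your own pages. It simply fetches the raw HTML a crawler is served, before any JavaScript runs:

```python
# Fetch the raw HTML of a page - what a crawler receives before any
# JavaScript executes. Content visible in the browser but missing from
# this output is being injected by JavaScript.
import urllib.request

url = "http://www.yoursite.co.uk/"  # placeholder: use one of your own pages

request = urllib.request.Request(url, headers={"User-Agent": "health-check"})
with urllib.request.urlopen(request) as response:
    raw_html = response.read().decode("utf-8", errors="replace")

# Search this output for your key content (headings, product copy, links).
# Anything missing here is probably invisible to search engines as well.
print(raw_html)
```

Searching the printed HTML for a heading or paragraph you can see in the browser is a quick way to confirm whether it’s part of the page proper or injected later.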
Avoid canonicalisation issues
Check how many domain name variations your site appears on. Try loading your website as:
- www.yoursite.co.uk/index.html (or /default.asp if you’re on IIS servers)
- yoursite.co.uk/index.html (or /default.asp)
- www.yoursite.co.uk
- yoursite.co.uk
Three of those four versions should redirect to your favoured way of writing your URL. If your site appears at all four versions, with no redirecting taking place, you have what are known as ‘canonicalisation’ issues – leading to:
- Possible duplicate content issues (search engines think you have four identical pages).
- Watering down of links – if you have four inbound links and each uses a different variation, they count as one link to each of four separate pages, rather than four links to one homepage.
Fortunately, setting up a ‘301 redirect’ from the other variations to your favoured URL will solve this. It forwards visitors and links on to the same address and passes on 80-90 per cent of the link juice.
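You can script this check too. The sketch below (Python, standard library only; the domain is a placeholder) requests each variation without following redirects and reports what comes back – ideally three of the four return a 301 pointing at the fourth:

```python
# Check which URL variations redirect to your favoured address.
# Ideally three of the four report a 301 pointing at the fourth.
import urllib.error
import urllib.request

VARIATIONS = [  # placeholders: substitute your own domain
    "http://www.yoursite.co.uk/index.html",
    "http://yoursite.co.uk/index.html",
    "http://www.yoursite.co.uk/",
    "http://yoursite.co.uk/",
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects - we want to inspect them

opener = urllib.request.build_opener(NoRedirect)
for url in VARIATIONS:
    try:
        response = opener.open(url)
        print(f"{url} -> {response.status} (no redirect: possible canonicalisation issue)")
    except urllib.error.HTTPError as err:  # 301s/302s surface here when not followed
        print(f"{url} -> {err.code}, redirects to {err.headers.get('Location')}")
```

A 302 (temporary) redirect in that output is worth fixing too, as only a 301 reliably passes link value on to the destination.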
Check your robots.txt file
This file, which can usually be seen by adding ‘/robots.txt’ to the end of your domain, shows any specific commands your site is giving search engines. The ones that most often cause SEO issues are ‘Disallow’ (which blocks crawlers from a page or folder) and ‘noindex’ (which tells engines not to index a page, and is more commonly set as a robots meta tag in the page itself). These are usually used to avoid duplicate content problems – a site with two versions of a page, depending on text size for example, would only want one version indexed, so the duplicate would be marked noindex.
This can cause trouble when key pages are accidentally listed as noindex, as search engines will ignore them. Your web team should be able to amend this.
Note: pages are indexable by default; only those explicitly marked noindex (or blocked in robots.txt) are excluded.
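Checking whether your key pages are blocked can also be scripted. This minimal sketch (Python, standard library; the site and page paths are placeholders) reads your robots.txt and tests each page against its crawl rules – note it checks Disallow rules only, so noindex meta tags in page HTML need a separate look:

```python
# Read robots.txt and test whether key pages are blocked from crawling.
import urllib.robotparser

SITE = "http://www.yoursite.co.uk"            # placeholder domain
KEY_PAGES = ["/", "/products/", "/contact/"]  # placeholder paths to key pages

parser = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
parser.read()

for path in KEY_PAGES:
    allowed = parser.can_fetch("*", SITE + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED by robots.txt'}")
```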
Is your content on-point?
This assumes you have conducted thorough research to find your market’s popular terms. If you haven’t, you can get some ideas on search volumes by using Google’s keyword tool or Insights for Search. Check the following:
- How many key phrases is each page targeting? Search engines like relevant content, so assign just one or two focused terms per page, with your highest volume/most competitive terms on the top level/most linked to pages.
- Have you optimised your title tags (the text that appears at the top of your web browser, and that usually forms the link text in the results pages)? This tag tells the search engines what your page is about – include relevant keywords only. Extra terms dilute the relevance, and the tag gets cropped at around 70 characters anyway. Suggested style is keyword: company (‘Blue Widgets: Blue Widgets Inc’), unless you have a powerful brand worth mentioning first. A quick title-tag checker follows this list.
- Is your content relevant? Check it’s relevant to the term you’re targeting, and answers a user query – be honest.
- Are you writing for humans? Keyword stuffing is long gone. Using the keyword in the title tag, the header, and two or three times throughout the copy will tell search engines what that page is about. It’s more important to write compelling content that people will want to link to.
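As promised, here is a rough title-tag checker in Python (standard library only; the URL and target keyword are placeholders). It pulls the title tag, checks its length against the roughly 70-character display limit, and counts how often the keyword appears in the page copy:

```python
# Pull a page's title tag, check its length, and count keyword mentions.
import re
import urllib.request

url = "http://www.yoursite.co.uk/"  # placeholder page
keyword = "blue widgets"            # placeholder target phrase

with urllib.request.urlopen(url) as response:
    html = response.read().decode("utf-8", errors="replace").lower()

match = re.search(r"<title[^>]*>(.*?)</title>", html, re.DOTALL)
title = match.group(1).strip() if match else ""
print(f"Title ({len(title)} chars; cropped after ~70): {title!r}")
print(f"Keyword in title: {keyword in title}")

# Crudely strip tags and count keyword mentions in the remaining copy -
# two or three is plenty; many more starts to look like stuffing.
text = re.sub(r"<[^>]+>", " ", html)
print(f"Keyword mentions in copy: {text.count(keyword)}")
```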
Who’s linking to you?
Thanks to Google updates punishing spammy links and over-optimised anchor text, a link audit is more vital than ever. Although it’s best to use a dedicated tool such as opensiteexplorer.org, a (very) rough idea can be gleaned by searching Google for link:yoursitehere.co.uk (URL without the www.), or by looking at your Google Webmaster Tools account.
Things to look out for
- An unnatural level of exact match anchor text (the text that makes up the link). Sites should have a mix of keyword anchor text, company name, and even simply ‘click here’ – see the sketch after this list for a rough way to measure the mix.
- Spammy-looking sites. If listing your site in directories, choose respected niche sites or popular local business sites – steer clear of those telling you a listing will boost your SEO, and of poor quality article sites. Quality beats quantity.
- Lastly, although not strictly SEO, it’s worth checking search engines and social media platforms for negative mentions of your business.
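For the anchor-text check, here is a rough sketch in Python – the keywords, brand names and anchor list are entirely hypothetical; in practice you would paste in anchors exported from your backlink tool. It classifies each anchor and reports the mix:

```python
# Classify backlink anchor texts and report the overall mix.
from collections import Counter

EXACT_MATCH = {"blue widgets", "buy blue widgets"}  # placeholder target keywords
BRAND = {"blue widgets inc", "bluewidgets.co.uk"}   # placeholder brand names

anchors = [  # hypothetical sample - paste in your exported anchor texts
    "blue widgets", "click here", "Blue Widgets Inc",
    "blue widgets", "buy blue widgets", "this site", "blue widgets",
]

def classify(anchor: str) -> str:
    anchor = anchor.lower().strip()
    if anchor in EXACT_MATCH:
        return "exact match"
    if anchor in BRAND:
        return "brand"
    return "generic/other"

mix = Counter(classify(a) for a in anchors)
for label, count in mix.most_common():
    print(f"{label}: {count}/{len(anchors)} ({100 * count // len(anchors)}%)")
```

If exact-match anchors dominate the output, that is exactly the pattern recent Google updates have been punishing.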