Get SEO basics right

Understanding SEO is vital to executing it correctly. Neil Erlam, CEO of Netbiz, offers some basic technical advice

Search Engine Optimisation (SEO) helps your company improve its search engine rankings. It is one of the most effective ways of getting your website seen online without forking out for adverts, but how is it done?

Google publishes a list of best practices for webmasters to follow when optimising for its search engine, and following these rules will stand you in good stead. Below are further tips and advice; you may not be able to action all of them yourself, but it pays to understand what you should be asking your technical team to do.

Web design
Your website’s design and content are extremely important, so create a useful, information-rich site that is not overloaded with links and has pages that clearly and accurately describe your content. Try to think about the words users would type into search engines to find your page, and make sure your site actually includes those words within its content.
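For instance, if the phrase your customers actually search for is ‘affordable web design’ (a hypothetical example), make sure it appears in the places crawlers give weight to, such as the page title and main heading:

    <!-- Hypothetical page: the phrase users search for appears in the title and heading -->
    <head>
      <title>Affordable web design | Example Ltd</title>
    </head>
    <body>
      <h1>Affordable web design for small businesses</h1>
    </body>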

Your site should have a clear structure, so create an obvious page hierarchy with text links; every page should be reachable from at least one static text link. A site map can be a useful tool for users, pointing out the most important parts of the website, and if the map has an extremely large number of links, consider breaking it up into multiple pages.
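A human-readable site map can be as simple as a page of organised links. A minimal sketch, with hypothetical page names and URLs:

    <!-- sitemap.html: a simple site map for visitors (all links are placeholders) -->
    <h1>Site map</h1>
    <ul>
      <li><a href="/services/">Services</a>
        <ul>
          <li><a href="/services/web-design/">Web design</a></li>
          <li><a href="/services/seo/">SEO</a></li>
        </ul>
      </li>
      <li><a href="/about/">About us</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>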

Getting the right design, layout and content for your website is not the only way to improve its rankings; there are also a number of technical guidelines that can help improve your online visibility.

Spiders and bots
Before a search engine can provide you with a link to a file or document, it must find the requested information. In order to sift through the hundreds of millions of web pages that currently exist, the search engine uses software robots called spiders to crawl through the pages and compile lists of the words used in them.

Examining your site in a text browser such as Lynx allows you to view it without graphics and focus only on the text content. Most search engine spiders see your site much as Lynx does, so if features such as JavaScript, cookies, session IDs, DHTML or Flash stop you from seeing your site in a text browser, then search engine spiders may have trouble crawling it.
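Assuming Lynx is installed, you can dump a page as plain text from the command line and read it roughly as a crawler would (the URL is a placeholder):

    # Render the page as plain text; links appear as numbered references
    lynx -dump https://www.example.com/ | less

If important content or navigation is missing from this output, a text-only crawler may well be missing it too.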

Allow these search bots to crawl your site without session IDs or arguments that track their path through it. While these techniques are useful for tracking individual user behaviour, the access pattern of bots is entirely different, and using them may result in incomplete indexing of your site.
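For example, a URL that embeds a session ID (the parameter name and value here are hypothetical) changes on every visit, so a crawler may treat each variant as a separate page, while the clean equivalent stays stable:

    https://www.example.com/products?sessionid=8f3a2c91   (varies per visit)
    https://www.example.com/products                      (stable and crawlable)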

Make sure your web server supports the If-Modified-Since HTTP header. This allows your server to tell Google whether your content has changed since it last crawled your site, which in turn saves you bandwidth and overhead; a sketch of the exchange follows below.

Make use of the robots.txt file on your web server. It tells bots which directories can or cannot be crawled, which is useful for keeping search results pages or other auto-generated pages that add little value for users out of the index.
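With If-Modified-Since, the crawler sends the date of its last visit; if nothing has changed since then, the server can reply 304 Not Modified and skip sending the page body. The host, path and date here are placeholders:

    GET /services/ HTTP/1.1
    Host: www.example.com
    If-Modified-Since: Mon, 04 Jul 2011 09:30:00 GMT

    HTTP/1.1 304 Not Modified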
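A minimal robots.txt, assuming your internal search results live under a /search/ directory (the paths are hypothetical; adapt them to your own site):

    User-agent: *
    Disallow: /search/
    Disallow: /tmp/

Note that robots.txt only keeps well-behaved crawlers out of those directories; it is not a security mechanism.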

Try to use text instead of images to display important names, content or links, as the Google crawler doesn’t recognise text contained in images. If you must use images for textual content, use the ‘ALT’ attribute to include a few words of descriptive text.
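For example (the file name and wording are placeholders):

    <!-- Descriptive alt text gives the crawler the words the image alone cannot -->
    <img src="netbiz-logo.png" alt="Netbiz logo">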
