8 common misconceptions about conversion optimisation
The terms “conversion rate optimisation” (CRO), “testing” and “marketing optimisation” will often mean different things within different organisations. It’s no surprise then that many misconceptions exist around what optimisation actually involves. Here, we help marketers get to grips with CRO, in all its forms, by dispelling those common myths.
1. ‘‘Conversion rate optimisation is just about improving conversion rates’’
In many ways, conversion rate optimisation is a victim of its own name, with businesses often under the illusion that it’s solely about improving conversion rates. While that is an important part of the strategy, it’s vital that companies take a holistic view.
A retailer may, for example, focus purely on driving sales. That may well produce a surge in conversions – but if your returns rate has doubled in the process, you could actually be reducing profit.
Similarly, while businesses may focus on one aspect of conversion rate optimisation, they often embark on a three- or six-month project in the belief that it will give them a ‘quick boost’. Yet any optimisation programme should be a long-term strategy. Compare conversion optimisation to other marketing channels and activities such as SEO – do you consider those long-term projects? For as long as you continue to drive traffic to your site, you should be optimising that traffic to ensure you generate the greatest ROI.
Adding to this are constant changes in consumer behaviour. There will always be fluctuating factors, from changes to your target audience, messaging and technology, to external factors such as shifts in the market. Organisations will find that what was once a winning test may need to be re-evaluated and tweaked on an ongoing basis.
A good optimisation methodology should be used to drive business growth, not just conversion rates. This can be achieved by testing larger-scale business hypotheses – such as a data-driven website redesign, new marketing messaging, or a new product or service – before investments are made, in order to reduce risk.
2. ‘‘It doesn’t work: we ran a few tests but our conversion rate went down!’’
The majority of tests should deliver positive results, provided the hypotheses are based on insight drawn from a range of data sources such as user research and analytics, and apply persuasion and psychology principles to change behaviour. Even so, not every test will yield the expected results first time.
That’s not to say a losing test is wasted effort. Testing is an important part of the process precisely because it catches would-be mistakes, and even negative results provide critical insight that businesses can harness moving forward.
Understand why the test didn’t succeed, then feed those findings into the next hypothesis to turn a negative result into a positive one. Not every test is a winner, but if it teaches you something, it has value.
3. ‘‘Why would testing yield better results than listening to our UX team or following ecommerce best practice?’’
The decisions you make about your website should be based on fact rather than opinion. That means collecting and analysing data, understanding what motivates your customers, and testing concepts. Ecommerce best practice doesn’t evolve with changing customer habits or external factors; it is a common denominator that may or may not apply to your situation. In short, the nature of best practice means it’s reactive rather than proactive. Businesses that invest in testing will lead the way, rather than trying to follow what may have worked for others.
4. ‘‘We’ve already done testing on our other website so don’t need to do it on this one’’
Unfortunately, it isn’t that straightforward. Every website is different: different users, different products, a different interface and a different proposition. This means the results from one site may not be valid on another; instead, the hypotheses will need to be reassessed. That said, you’re likely to have a head start on your second site, with some partially validated hypotheses ready to research and test.
5. ‘‘We have a new site coming up – it’s better to wait until this is launched before doing CRO’’
A conversion optimisation programme will show you which areas of your website need improvement before you redesign them. Without testing, a redesign is a subjective project, relying on instinct rather than data. Through testing, you can try different variations in the early stages of the redesign to see what really works and which areas customers engage with. It’s also a cost-saving exercise: by avoiding the need to build unnecessary elements, you save budget and make your redesign a speedier, more agile process.
This is particularly important when you consider that a website redesign can take a year or more to come to fruition. That’s twelve months in which you’ve lost the revenue uplifts you could have been generating on your existing site – uplifts that could more than pay for the impending redesign. A timeline that long also adds risk, with consumer opinion and expectation often shifting away from what you’d originally planned to deliver.
6. ‘‘All you need to carry out testing is the tool’’
There is now a multitude of tools on the market that organisations can use to start testing. But the tool is only part of the equation: it is the means by which you test, but it does not improve the quality of your hypotheses or generate results by itself. Testing is one cog in the optimisation process, yet many businesses focus solely on delivering a set number of tests. Instead, focus on quality, so you deliver the greatest number of successful tests that generate profit for your business.
Optimisation is about more than testing: following a process that produces the strongest hypotheses to test is crucial. Below is the process we follow to deliver the best results for the businesses we work with at PRWD.
PRWD Growth Methodology™
7. ‘‘We’re planning on doing this in-house so don’t require external help’’
It’s fantastic that you understand the importance of optimisation for your business – you’re clearly already ahead of the curve. You might also be surprised that many of the clients we work with have an optimisation team in-house. They work with us because they recognise the limits of what they can do internally. It could be a gap in a particular expertise (e.g. little or no capability to carry out user research, a lack of copywriting expertise, or an incomplete understanding of psychology and persuasion techniques), or it can simply be that they want an expert to validate the work being done in-house and offer an impartial view.
8. ‘‘There is no room in our development schedule for this’’
The majority of tests can be done via testing tools, meaning no in-house development is needed. For more innovative tests, where a developer is required, brands can look to the expertise of a conversion optimisation specialist.
Ultimately, optimisation will become your developers’ favourite thing. Up to 15% of IT projects are abandoned, and at least half of a programmer’s time on a project is spent on avoidable rework – a source of headaches and frustration for your development team. Through optimisation and testing, little upfront work is needed, and only elements proven to work are put into your development schedule, cutting down on later changes. If your backlog is excessive, winning variants can also be served from the testing tool, so you reap the rewards straight away until your developers have time to deploy the changes to the live site.