Picture 1968. The Tet Offensive was turning the tide in Vietnam. Martin Luther King Jr. was assassinated at the Lorraine Motel in Memphis, and Richard Nixon won the race to the White House.
International Business Machines Corporation (IBM) was at the bleeding edge of computing technology – employing 240,000 people and working with NASA to put man on the Moon.
The company’s latest mainframe – the System/360 Model 91 – could run 16.6 million instructions per second, 500 times more than the previous model launched only three years earlier.
The nature of computing was changing forever. Complex tasks were now delivered in minutes and seconds instead of days and hours.
But how long is too long?
In Poughkeepsie, New York, an IBM employee named Robert Miller was researching the impact response times had on human behaviour.
In a paper published for a conference of the American Federation of Information Processing Societies (AFIPS), he theorised that the “degree of complexity of problems that can be solved by a human is dependent on how much information (and in what form) he can hold in short-term memory”.
Miller noted that human memory is never passive – and that “spontaneous noise from within the thinking system, as well as distractions from outside, can interfere with short-term memory contents, and of course, these effects rapidly increase when the individual has an awareness of waiting.”
He concluded that “tasks which humans can and will perform with machine communications will seriously change their character if response delays are greater than two seconds”.
Forty years later, in the New York Times bestseller SuperFreakonomics, Steven Levitt and Stephen Dubner referenced Miller’s work and gave his observation a memorable name: cognitive drift.
Based on experiences they observed in healthcare, they shortened the timeframe, advancing that cognitive drift occurs “if more than one second elapses between clicking a computer mouse and seeing new data on the screen”.
The message: people quickly get distracted. If computing responses are delayed by even a second, we often move on to the next task.
This matters today, given increased mobile usage and a browsing experience that is often poor – even in developed countries. In a recent study by OpenSignal, the UK ranked 41st out of 88 countries tested for 4G speed. The US ranked even lower, at 62nd.
Waiting for pages to load is a common occurrence, and it has a major impact on the success of lead generation campaigns.
The cost of cognitive drift to B2B marketers
As any demand marketer will know, lead generation campaigns are only as strong as their weakest link – particularly when paid media is involved.
Some of these weak links are well-known to marketers. For example, if you ask for too much data on a lead capture form, you will limit conversion – no matter how strong your content, or how accessible your browsing experience.
The impact of other campaign variables is less understood, and our experiences with clients suggested that landing page load time was one such variable to investigate.
To research the issue, we analysed 159,429 user interactions from lead generation campaigns – captured through our in-house Performance Benchmark System (PBX). We tracked the number of people clicking an ad, and then measured how many waited for the destination page to load.
We presented this data at B2B Marketing Ignite in July, and the results were significant.
Landing pages with load times under two seconds experienced an average drop-off rate of 39.24%. For every 10 clicks an advertiser paid for, an average of four failed to make it to the intended destination. When load time increased to more than four seconds, the drop-off rate leapt to 73.56%. In other words, for every $1 spent on advertising, 74 cents was wasted.
This difference translated to a 2.3x increase in the leads generated from every dollar invested – just by reducing the wait for users.
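The arithmetic behind that 2.3x figure can be reproduced directly. The drop-off rates below are from our data above; the assumption that on-page conversion stays constant once a visitor arrives is ours, made for illustration:

```python
# Drop-off rates from the PBX data above.
fast_drop = 0.3924   # pages loading in under two seconds
slow_drop = 0.7356   # pages taking more than four seconds

# Share of paid clicks that actually reach the landing page.
fast_arrivals = 1 - fast_drop   # ~0.61, i.e. ~39 cents of every ad dollar wasted
slow_arrivals = 1 - slow_drop   # ~0.26, i.e. ~74 cents of every ad dollar wasted

# Assuming on-page conversion is unchanged, leads per dollar scale with
# the arrival rate, so the uplift from a fast page is simply the ratio.
uplift = fast_arrivals / slow_arrivals
print(f"Uplift from fast pages: {uplift:.1f}x")  # prints: Uplift from fast pages: 2.3x
```

In other words, the 2.3x uplift follows purely from more of your paid clicks surviving the page load – no change to the page content itself is assumed.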
Test your landing pages to understand if you have a problem. A host of free tools exist for this, with Pingdom being one of the simplest and easiest to use. If you are technically savvy or have a developer to hand, Google’s Lighthouse tool provides deeper insight.
3 steps to improve landing page load times
If you spot an issue, work through your landing pages methodically to identify areas for improvement.
1. Landing page content should be concise and to-the-point
Aim to convey your message efficiently and to deliver against a single objective. Resist the temptation to add content that isn’t strictly necessary. Doing so risks diluting the impact of your message and adds weight to your page – slowing it down.
2. Consider technical ways to make your page as light as possible
Make sure images are compressed, and use streaming services for video instead of hosting it on your own site. Check that your codebase has been ‘minified’. Yes, minification is a thing: removing all unnecessary characters from code without changing its functionality.
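To illustrate what minification does, here is a toy Python sketch that strips comments and whitespace from a CSS snippet. Treat it as a demonstration only – production minifiers such as cssnano or csso handle many edge cases this ignores (strings, calc expressions, and so on):

```python
import re

def minify_css(css: str) -> str:
    """Toy CSS minifier: strips comments and collapses whitespace.
    A sketch only -- real minifiers handle far more edge cases."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* ... */ comments
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # trim space around punctuation
    return css.strip()

bloated = """
/* hero banner */
.hero {
    color: #333 ;
    margin : 0 auto ;
}
"""
print(minify_css(bloated))  # prints: .hero{color:#333;margin:0 auto;}
```

Every byte saved this way is a byte your visitor doesn’t wait for – which is why minification is standard practice in production build pipelines.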
3. Host pages locally to your audience
Removing distance between servers and your audience is a great way to speed up a web request. If you’re unsure of where your audience is located, look to your CRM and web analytics for insight.
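To see why distance matters, a rough lower bound on round-trip time can be estimated from the speed of light in optical fibre (roughly 200,000 km/s, about two-thirds of c). Real requests add routing hops, TLS handshakes and server processing on top, so these figures are a floor, not a prediction:

```python
# Back-of-the-envelope round-trip time from server distance alone.
# Assumes ~200,000 km/s signal speed in fibre; real-world latency is higher.
FIBRE_KM_PER_MS = 200.0  # kilometres travelled per millisecond

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round trip (there and back) in milliseconds."""
    return 2 * distance_km / FIBRE_KM_PER_MS

print(f"Same city (50 km):          {min_rtt_ms(50):.1f} ms")     # 0.5 ms
print(f"Cross-US (4,000 km):        {min_rtt_ms(4000):.1f} ms")   # 40.0 ms
print(f"London-Sydney (17,000 km):  {min_rtt_ms(17000):.1f} ms")  # 170.0 ms
```

A page that makes, say, 20 sequential round trips to a server 17,000 km away spends over three seconds in transit alone – already past the two-second threshold Miller identified, before a single byte of content is rendered.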
You may also consider specialist platforms, which take care of these points for you. Most marketing automation platforms offer at least some of this functionality, and there is a growing suite of platforms dedicated to hosting optimised landing pages.
Take note, though: this space moves quickly. Load times can provide a competitive edge, but you need to stay on top of the latest trends.
If you can’t keep up, find a partner that can help.
The bigger picture: data highlights weak links in your campaigns
While fixing load time represents low-hanging fruit for many marketers, don’t forget that landing pages are just one link in the chain for a lead generation campaign. Data is your ally in understanding performance across this user journey.
• Start by collecting every data point your campaigns generate
You don’t know which data will be valuable until it is analysed, so we strongly recommend collecting everything.
• Use dashboard reporting to surface key metrics, and to make sense of the millions of data points modern campaigns can generate
Use data visualisation to investigate hypotheses you may hold, and to highlight weak links in a clear and logical way.
• You should also benchmark performance against previous campaigns and against your peers or the wider industry
It’s critical you understand what good performance looks like to help identify areas to improve.
• Ensure a clear and constant focus on the outcomes that matter to you
Misaligned incentives in the ad industry mean the odds are stacked against marketers. Ad platforms and media agencies are incentivised to spend your money, and rarely to generate the outcomes your business needs. Keeping a clear focus on outcomes will help identify the weak links that matter to you, and ensure you maximise ROI.
• Don’t rest on your laurels
This process should be ongoing; you must optimise relentlessly. Build a culture within your team to constantly test, adapt, and correct strategy. What works in one campaign may not work in the next. Technology moves quickly, and it is on you to keep up.
The moonshots are out there
We live in a time where marginal gains are embraced, often rightly, but it can feel as though there are few big steps left to take – that our focus should always be on incremental improvement.
Back in 1968, moonshots were being embraced quite literally, and that is something marketers of today can learn from.
Our data on landing pages suggests there is still low-hanging fruit out there, and areas where small changes can have a huge, transformative effect on campaign ROI.
Data must be your ally on this journey – helping you identify, and quickly act, on weak links in your campaigns. Without it, you are operating blind.