The inconvenient truth of A/B testing

By: John McGarvey

Date: 7 October 2014

Should you A/B test your website? Why guess about how to improve your website when you can measure exactly which changes have the greatest impact?

That’s the basic idea behind A/B testing. Because tracking what people do on your website is easy, you can make a change and then precisely measure what impact it has on your sales or conversion rate.

A/B testing is also called ‘split testing’. Learn the basics on Marketing Donut.

How does A/B testing work?

With A/B testing, you use an A/B testing tool to split your website traffic in two. Half of visitors see your original page design. The others see an edited version.

As visitors interact with your website, you track how they behave and measure what they buy.

Over time, you can see which of the two versions is generating more sales. In A/B testing terms, you can see which is the winner. Typical A/B tests might aim to determine:

  • What types of imagery work best on a landing page.
  • What key benefit you should use in your headline.
  • What button text generates more clicks.

Once you’ve proved which of your two versions is better, you can roll it out to all visitors — and move on to your next A/B test.

The awkward truth about A/B testing

In recent years, the popularity of A/B testing has grown enormously. Tools like Google Content Experiments, Optimizely and Unbounce help you run A/B tests even if you don’t have much technical knowledge.

When running A/B tests is so easy, it’s tempting to get carried away. Why bother with market research when you can just try out two options and let the results show — unarguably — which is right?

Well, if you’re a small company that wants to test everything, you’re going to run into a problem pretty quickly. Your website probably doesn’t have enough visitors.

In A/B testing, statistical significance is crucial

An A/B test is only truly useful if you have confidence in the result. You need to know, for sure, that the element you’ve changed is responsible for the improved conversion rate.

Here’s an example. Imagine you’re testing two variants of a landing page.

Half your visitors see the original version, which is converting at 1%. The remaining visitors see a new version that’s converting at 2%.

At first glance, it looks like the new page has won. Doubling conversion rate is an impressive result — it could double your revenue, too.

But what if these were the full numbers behind that test?

  • The original version was seen by 100 visitors. One of those visitors clicked ‘Buy now’, giving a conversion rate of 1%.
  • The new version was also seen by 100 visitors. Two of those visitors clicked ‘Buy now’, giving a conversion rate of 2%.

Suddenly, things look different. The difference between pages is a single sale. If just one more person chooses to buy from the original page, the score will be even.

For many small business websites, these aren’t unrealistic traffic levels for a single page. However, you need many more visitors to be confident the improvement you’re seeing isn’t down to chance or seasonal factors.
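The article doesn’t prescribe a specific statistical test, but one common way to check whether a difference like this could be down to chance is a pooled two-proportion z-test. Here’s a minimal sketch in Python, using the article’s figures of 1 sale and 2 sales from 100 visitors each (the function name is illustrative, not from any particular tool):

```python
import math

def two_proportion_p_value(x1, n1, x2, n2):
    """Two-sided p-value for the difference between two conversion rates,
    using a pooled two-proportion z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # erfc(z / sqrt(2)) equals 2 * (1 - Phi(z)): the two-sided p-value
    return math.erfc(z / math.sqrt(2))

# The article's example: 1 sale in 100 vs 2 sales in 100
p = two_proportion_p_value(1, 100, 2, 100)
print(f"p-value: {p:.2f}")  # far above 0.05: the 'win' could easily be chance
```

With these numbers the p-value comes out well above the conventional 0.05 threshold, confirming the article’s point: a single extra sale is nowhere near enough evidence that the new page is genuinely better.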

A/B testing may take longer than you think

Most A/B testing tools will give you an indication of how much confidence you can have in the results of your test. Typically, you’ll want a confidence rating of more than 95% before using a test’s outcome to make a decision.

And that takes time. As a general benchmark, you’ll need at least 100 sales or conversions via each page variant before confidence in the result is that high.

If your conversion rate is 2%, that equates to 5,000 visitors to each page. But if those pages only receive a few hundred visitors a week, you’ll be waiting a while for results you can trust.
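The arithmetic behind that estimate is straightforward. This short sketch works out the visitors needed per variant, plus how long the test might run at a hypothetical traffic level of a few hundred visits a week (the 300/week figure is an assumption for illustration, not from the article):

```python
import math

def visitors_needed(target_conversions, conversion_rate):
    """Visitors required per variant to collect a target number of conversions."""
    return math.ceil(target_conversions / conversion_rate)

per_variant = visitors_needed(100, 0.02)  # 100 conversions at a 2% rate
weekly_traffic = 300                      # hypothetical small-site traffic
weeks = math.ceil(2 * per_variant / weekly_traffic)

print(per_variant)  # 5000 visitors per variant
print(weeks)        # 34 weeks to run both variants at 300 visits/week
```

At that traffic level the test would take the best part of a year, which is why the article warns that small sites can rarely ‘test everything’.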

Have realistic expectations of A/B tests

Sites with massive traffic, like Google, have the ability to test 41 different shades of blue to see which performs better. But your average small business website simply doesn’t have enough traffic to run such detailed tests.

This doesn’t mean A/B testing is a waste of time. It can be a really powerful way to improve your website. But it’s important you go into any project with realistic expectations of how long it will take to get meaningful results.

Online marketing gurus often talk of ‘testing your way to success’. But more often, visitor numbers mean it takes time for a single A/B test to provide conclusive results.

And in practical terms, that means A/B testing is best used as one of many tools to improve your website experience.
