What if my A/B test generates a negative result and decreases conversions? Understandably, this scenario is frightening to many marketers. But the truth is that negative results aren’t the end of the world. In fact, they can teach you just as much as positive ones. In some cases, the insights you get from a treatment that tanked are just what the LPO Doctor ordered. Here’s a case study where a negative result led to a winning version of a B2C landing page that increased conversions by 48.69%.

Background information:

Client: OK A.M.B.A, one of the major oil and energy companies in Scandinavia.

Product: Heating oil. OK has a great offer where you can save up to DKK 1,150 ($200) the first time you buy heating oil. However, the offer itself is quite complicated, as you have to choose a number of add-ons in order to get the full discount.

Landing page: PPC/SEO landing page designed to push potential clients to the first step of the checkout flow.

Optimization goals: Increase click-through to the checkout flow (primary); increase the number of sales (secondary).

Restrictions: We could not tweak the checkout flow, which is rather complicated and has a significant impact on the number of completed sales.

The original landing page (control):

The control version was very copy-heavy and lacked visual support for the offer. In fact, the only graphic element was an image of a little girl washing her hands.

As mentioned, the offer is quite complex, and it takes quite a bit of information to understand the setup. The control version had quite a few links to sub-pages that explained more about the different aspects of the offer.

Step 1 of the checkout flow:

When you click the main call-to-action, you land on step 1 of the checkout flow, where you can add and remove different add-ons and calculate your final discount. The complexity of the flow naturally has a significant influence on the number of completed orders. As mentioned, we could not tweak the checkout flow.

Our variant (Treatment A):

We mainly focused on making the page more visually appealing and easier to interact with. We used a more relevant image and chose to show the calculation behind the discounted price, so potential customers could gain a greater understanding of the offer right off the bat.

We also chose to add more details about the add-ons, rather than sending visitors off to a number of sub-pages.

The first test – Control vs. Treatment A:

We were super psyched about our treatment, and expectations were sky high when we launched the test. However, the results spoke for themselves, and there was no doubt that our treatment totally tanked.

Our beautiful variant underperformed the control by 30.27% measured on CTR to the checkout flow. It also underperformed on sales; however, we did not reach statistical significance on this conversion goal, as we deemed it too risky to let the test run longer than absolutely necessary.
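For readers who want to see what “statistical significance” looks like under the hood, here is a minimal sketch in Python of a two-proportion z-test – a standard way to judge whether a CTR gap like this is real rather than noise. The visitor and click counts below are invented for illustration only; the case study does not disclose the actual sample sizes.

# Minimal sketch: two-proportion z-test for a difference in CTR.
# The counts are hypothetical; the case study does not report sample sizes.
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(clicks_a, visitors_a, clicks_b, visitors_b):
    """Return the z statistic and two-sided p-value for the CTR difference."""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    return z, 2 * norm.sf(abs(z))

# Hypothetical numbers: control CTR ~14.9%, Treatment A CTR ~10.4% (a ~30% relative drop).
z, p = two_proportion_ztest(clicks_a=745, visitors_a=5000,   # control
                            clicks_b=520, visitors_b=5000)   # Treatment A
print(f"z = {z:.2f}, p = {p:.2g}")  # p < 0.05 means the drop is statistically significant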

Follow-up experiments designed to isolate friction areas: 

We were very surprised by the results, and naturally, this wasn’t the result we’d hoped for.

But being experienced testers, we had already prepared the client for the fact that optimization is a scientific process – not magic – and that it sometimes takes several tests in order to get the necessary insights to achieve a significant lift.

In order to get new insights, we did something that many might view as, well… insane. We took all the PPC traffic and sent it to our treatment – the loser.

Doing so allowed us to run a number of smaller follow-up experiments to isolate friction areas and elements that directly influenced prospects’ decision-making processes. During these experiments, we learned a lot and isolated a number of friction points – marked by blue circles on the image.
The main learning was that the visualization of the calculation actually backfired badly. The button copy and the image also had a measurable negative impact.

Treatment B:

We took all the learnings from the previous tests and combined them into Treatment B.

Control vs. Treatment B:

We held our breath, crossed our fingers, and launched a new test pitting Treatment B against the Control version.
When the test reached statistical significance, we were happy to conclude that Treatment B outperformed the Control by 29.5% measured on CTR to step 1 of the checkout flow, and by 48.69% measured on sales.
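As a side note, this is how relative lift figures like the 29.5% and 48.69% above are calculated – a quick sketch in Python with placeholder conversion rates, since the article does not disclose the underlying numbers.

# Relative lift of a treatment over the control, expressed as a percentage.
def relative_lift(control_rate, treatment_rate):
    return (treatment_rate - control_rate) / control_rate * 100

# Placeholder rates chosen only to reproduce lifts of roughly the reported size.
print(f"CTR lift:   {relative_lift(0.149, 0.193):.1f}%")   # ~29.5%
print(f"Sales lift: {relative_lift(0.030, 0.0446):.1f}%")  # ~48.7%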

Main Takeaways – What you can learn from this case study

“The goal of a test is not to get a lift, but rather to get a learning…” Dr. Flint McGlaughlin, MECLABS

Landing page optimization is a scientific process, and the primary goal is to get a learning. As long as you understand what happened in the test and the results give you new insights, it essentially doesn’t matter whether the initial test results are positive or negative.

Of course, hitting a home run on the first swing is easier on the ego. But when you approach optimization as a scientific process – not a one-off opportunity to swing for the fences – you’ll see that stopping at a few bases along the way is often what it takes to win the game.

One might be inclined to view the first test – Control vs. Treatment A – as a bad test. But in fact, it wasn’t a bad test at all. It was an important first step towards the winning version that created a dramatic lift in conversions.

Moreover, this case study is a good example of why the only way to be sure that your optimization efforts are in fact improving your website’s performance is to put them to the test.

Had we blindly trusted our experience and not tested this landing page, we would actually have sold the client a page that performed significantly worse than the one they already had. You might want to think about that the next time someone offers to optimize your website without mentioning the word test.

Ben Gheliuc

Founder @BeemDigital

Ben is an avid digital marketer who loves geeking out on marketing campaigns, PPC, and SEO. With over 6 years of industry experience, he's been able to stay ahead of the curve on exactly what works to consistently deliver ROI.

You'll find him battling Darth Vader dolls with his kids one hour and diagnosing CRO for an enterprise company the next.
