How I Got 37% Conversion in My First Ever A/B Test

A/B testing is surrounded by myths. You might have noticed the dozens of methods out there claiming they can magically skyrocket your conversion rate. Truth be told, there is no magical one-size-fits-all solution. If there were one that worked every time for every product, we wouldn't really need A/B testing. So don't expect to nail an A/B test just because you've read the magical guides you found on the internet (disclaimer: this is not one of them).

As a growth-hacking intern, I ran my first ever A/B test. It was a struggle, but an exciting one. Little did I know how hard it was going to be when I started, or that I'd end up boosting our landing page's conversion rate to 37%. I knew nothing about the principles of A/B testing, what to do and what to avoid. But after messing things up twice, I finally nailed it.

I am here to share my three A/B test trials with you, so you don't have to make the same stupid mistakes I did.

Trial 1 – Changing the tagline only

Screenshot: ShotBot's landing page started off like this.

A little background on the product I was working on: it was called ShotBot. ShotBot helps developers generate and submit screenshots to iTunes Connect conveniently, since the process can otherwise be quite painful.

The tagline of the benchmark version was ‘Generate, submit screenshots to iTunes Connect fast’. At first, I suspected the tagline was too ambiguous for developers, and that this was what had led to the unsatisfactory conversion rate. So I introduced two challengers to the A/B test simultaneously:

  • ‘Generate, submit screenshots to iTunes Connect fast’ (Original)
  • ‘In 3 steps, Generate and submit screenshots to iTunes Connect’ (Challenger 1)
  • ‘Save you 15 minutes in preparing screenshots for iTunes Connect’ (Challenger 2)

The intention behind both challengers was to quantify the benefit. It turned out to be ineffective: both challengers achieved only around a 26%-27% conversion rate, which was no better, if not worse, than the benchmark.
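A quick aside for anyone who wants to sanity-check numbers like these: before calling a challenger better or worse, it's worth asking whether the gap is statistically meaningful at all. Below is a minimal two-proportion z-test sketch in Python; the visitor counts in it are hypothetical, since I'm not quoting our actual traffic figures here.

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical visitor counts: the real traffic numbers aren't in this post.
z, p = two_proportion_ztest(conv_a=84, n_a=300,   # benchmark: 28%
                            conv_b=81, n_b=300)   # challenger: 27%
print(f"z = {z:.2f}, p = {p:.3f}")  # a large p means the gap may just be noise
```

With a gap of only a point or two at this kind of sample size, the p-value comes out far too large to conclude anything either way.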

I then realised I had made a handful of common beginner's mistakes in my first trial, which I will cover in detail in a separate blog post.

Trial 2 – Changing Videos to GIFs

Screenshot: the ShotBot demo GIF. Adding GIFs like this, despite the effort, turned out to be useless.

After all the painful lessons from my first trial, I became aware of the necessary procedures and principles of A/B testing.

The next hypothesis I set was ‘developers need to know what they’re going to get to convert’.

In the baseline version, a video was embedded to demonstrate the functionality to visitors. I thought the play button was too much of a hurdle, so why not just use GIFs instead?

I did exactly that. It turned out that the conversion rate for both versions was identical, at around 28%. But what really intrigued me was that most users still clicked the download button at the top, despite all the information I gave them further down the page. Needless to say, they did not look through the entire page; the extra information was irrelevant to them. This was when I started to think that visitors might be quite impulsive: they weren't going to scroll down either way, they just wanted to download. All that information I showed them was meaningless.
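Looking back, a dead-even result like this can also mean the test simply didn't get enough traffic to detect a small lift. A rough power calculation tells you how many visitors each variant needs before a given improvement becomes detectable. The sketch below uses the standard two-proportion sample-size formula, nothing ShotBot- or Optimizely-specific, and the rates in it are just example numbers.

```python
from math import sqrt, ceil

def visitors_per_variant(p_base, p_target, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect a lift from
    p_base to p_target (95% confidence, 80% power by default)."""
    p_bar = (p_base + p_target) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p_base * (1 - p_base)
                                 + p_target * (1 - p_target))) ** 2
    return ceil(numerator / (p_base - p_target) ** 2)

# Example: detecting a lift from 28% to 31% conversion.
print(visitors_per_variant(0.28, 0.31))  # roughly 3,600 visitors per variant
```

In other words, with only a few hundred visitors per variant, only a fairly dramatic lift will ever show up as significant.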

So I set up my next test focusing on the content above the fold, the part of the page visitors see without scrolling.

Trial 3 – Adding a user testimonial

Screenshot: a block with a user testimonial added at the top of the page.

The hypothesis I set for the last trial was that ‘developers need social proof from other developers to convert’. Following the insight from the last test, the testing area focused on the top section of the site. I therefore reached out to our users to collect testimonials about how ShotBot had helped them.

In the challenger version, I added a testimonial from one of our users, saying ‘I no longer spend hours just to upload localized screenshots one by one.’

The effect? The challenger's conversion rate shot up to 37%. Our internal target was 40%, but we decided 37% was close enough to move on to other tasks.
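For what it's worth, running the z-test sketch from Trial 1 on a jump of this size (again with hypothetical visitor counts) shows the difference is significant at the usual 5% level, unlike the one- or two-point gaps from the earlier trials.

```python
# Reusing two_proportion_ztest() from the Trial 1 sketch; counts are hypothetical.
z, p = two_proportion_ztest(conv_a=84, n_a=300,    # baseline: 28%
                            conv_b=111, n_b=300)   # testimonial version: 37%
print(f"z = {z:.2f}, p = {p:.4f}")  # p is around 0.02 here, below the 0.05 cutoff
```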

Screenshot: the landing page statistics showing the 37% conversion rate. (From Optimizely)

I was lucky to get to 37% in such a short series of tests. Check out my next post here for preparation tips and watch-outs that will help you start your first A/B test.


If you find this post interesting, subscribe to our newsletter to get notified about our future posts!

Written by:

Dennis the intern. Doing all sorts of growth hacking, content marketing and data-driven goodies at Oursky. He loves cats too.

Find him at dennistam@oursky.com | LinkedIn | Twitter
