Like any other revolutionary technique in online marketing, A/B testing has become something any company with an online presence has to try in order to remain competitive. A/B testing is currently the handiest option for marketers working on conversion rate optimization.
Image via Marcin Wichary
But how is A/B testing working out for you?
Have you ever considered that A/B testing is a precise methodology with strict rules?
If you haven’t tried A/B testing yet, read this post to the end to learn which traps to avoid. If you have already run some tests, I encourage you to stay with me until the end to see whether any of the “unrevealed” aspects I mention here has slipped through your fingers.
Before jumping into the things you need to avoid, let’s take a brief look at the advantages of A/B testing:
- it reveals the impact of a change in the website’s design or copywriting on key performance indicators such as conversion rate or revenue
- it gives you insights into your website visitors’ behavior
- the test involves real people without interrupting their journey
- its results are scientific (more on the confidence level later)
- it’s cheaper to use A/B testing software than to hire a team of developers and designers
- it’s risk-free if you respect the rules
From my point of view and personal experience, these are the traps one could fall into.
- Believing everything you hear
Although others’ positive results have always inspired and encouraged me to apply their “success recipe”, I’ve noticed that it takes time to understand the full potential of a marketing technique. You need to start trying it and see whether it works for you. Even while you’re still gathering advice from experts on blogs, you should start trying an A/B testing platform right away. Sorry to disappoint you, but there is no other way.
None of these information providers will run the test for you, nor will they assist you in setting it up. Instead of relying on a single source, take time to read both the pros and cons of using A/B testing software. Do not forget that the A/B testing methodology involves strict rules and a rigorous approach to interpreting the results and implementing the winning version.
- Getting too excited
If you’ve decided to use A/B testing to improve the performance of your website, congrats! You’re doing the right thing: your decisions are based on data, and you’re always looking to improve your outcomes. The excitement of having a tool that lets you test your ideas without wondering how they may affect the website’s KPIs is overwhelming. I totally understand it; I’ve experienced it too.
If you want to dive into setting up a test on your website immediately, you can. But you will lose money. Without a list of hypotheses built with the customer in mind, you won’t know where to start or which pages to include in the test. A/B testing is about knowing, not guessing. You have probably heard that before, but the trick is to understand what it means.
You need to find out which pages need optimization by gathering information from the people who visit your site and buy from it. Your customers’ feedback is the key to a healthy list of hypotheses to test.
My recommendation is to create a survey on your website and ask visitors about the barriers they have encountered, what they would like to see on the page, and which element stopped them from becoming customers. You will then know which pages need optimization and which elements to change when you create and test the variation.
- Testing more than one variable
The alternative to A/B testing is multivariate testing, which changes a combination of elements on a page. But be warned: modifying too many variables at once in a single test won’t tell you which variable had a positive impact on your KPIs. You therefore risk seeing a drop in conversions after you implement the winning version.
There is no rush. Instead of ruining a test by including too many variables at once, test each variable independently. If you’re still not convinced, revisit the “Getting too excited” trap above and remember that it’s not about your desire to see your website changed, but about what your customers demand from it.
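Keeping tests independent also depends on how visitors are assigned to variations. One common approach (which most testing tools implement for you; this sketch and its function names are only illustrative) is deterministic bucketing, so the same visitor always sees the same variation of a given test, and separate tests split traffic independently:

```python
import hashlib

def assign_variant(visitor_id, test_name, variants=("control", "variation")):
    """Deterministically bucket a visitor: the same visitor + test name
    always maps to the same variant, and different test names produce
    independent splits, so overlapping tests don't share a bias."""
    digest = hashlib.sha256(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always lands in the same bucket for this test:
print(assign_variant("visitor-42", "headline-test"))
```

Because the hash depends on both the visitor and the test name, a visitor who sees the control in one test can still land in the variation of another, which keeps each single-variable test clean.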
- Expecting huge outcomes from small changes
If you’re optimizing a subscription-based site, I recommend trying one or both of the following alternatives:
- transform your homepage into a sales page, keeping in mind the essentials of a high-converting page: emphasizing the benefits of the product, including testimonials, repeating CTA buttons, displaying trust certificates, using videos, etc. Then you can split test one element at a time and see the impact on conversion rate
- build landing pages for each persona. Supposing your product can be used by HR specialists, marketers, and consultants, build a separate landing page for each of them. Then you can test variations of each landing page and find out what convinces each audience to convert.
Image via Sean MacEntee
As you can see, conversion rate optimization goes hand in hand with A/B testing and landing pages. It would be a mistake to treat A/B testing as a “one size fits all” solution for your online business.
- Stopping the test too early
A statistical significance of 95% in your A/B testing software doesn’t mean you have to stop the test right away. Learn from others’ experience with the A/B testing methodology and be grateful that you know how to avoid losing money.
Even if you see a confidence level of 95%, it doesn’t mean the variation will continue to have a positive impact on the website’s KPIs. Therefore, don’t stop the test until it has reached the required amount of traffic. To help you with this sensitive aspect of the method, try this calculator. It helps you predict how many visitors you need for your test results to be accurate.
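To give you an idea of what such a calculator does under the hood, here is a minimal sketch of the standard sample-size formula for a two-proportion z-test (assuming a two-sided test; the numbers and function name are illustrative, not from any particular tool):

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a change in conversion
    rate from p1 to p2 with significance alpha and the given power."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # e.g. 1.96 for 95% confidence
    z_beta = z.inv_cdf(power)            # e.g. 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 3% to a 4% conversion rate takes thousands
# of visitors per variant -- small changes need a lot of traffic:
print(sample_size_per_variant(0.03, 0.04))
```

Note how the required sample size explodes as the expected lift shrinks, which is exactly why stopping a test early, before the planned traffic is reached, produces unreliable winners.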
- Testing the impact on irrelevant KPIs
Measuring the impact of a change on page views and click-through rates just doesn’t make sense. An A/B test is meant to show you the impact of a change on metrics that matter, such as conversion rate and revenue. Focusing on the wrong KPIs will only trick you into thinking you’re on the right track because you see people clicking a button.
- Running several tests at the same time
This trap comes down to excitement again. If you think you’ll get greater results by running many tests over the same period, you may be wrong. A/B testing is like anything else in life: you take action, and then you learn how to improve. If you run multiple overlapping tests, they contaminate each other and you end up with nothing. You have no idea why you don’t see an increase in conversion rates, and you probably conclude that A/B testing isn’t for you.
One thing at a time.
Make sure to go through this checklist next time you run an A/B test:
1. ask for customers’ feedback: create surveys
2. make the list of hypotheses based on the customer’s answers
3. take the first hypothesis from the list and plan your test; find out how to select hypotheses for A/B testing
4. set up the test with your A/B testing software – consider time (at least 14 days) and predict the number of visitors
5. monitor your test daily, checking for anomalies
6. stop the test if you’ve reached the statistical confidence and the required number of visitors
7. implement the winning version
8. enjoy results
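For step 6, the statistical confidence most tools report boils down to a two-proportion z-test on the conversion counts. Here is a rough sketch of that check (illustrative only, not a replacement for your testing tool’s built-in statistics):

```python
from statistics import NormalDist

def significance(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided two-proportion z-test comparing the conversion rates
    of control (A) and variation (B); returns the z-score and p-value."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = (p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# 3.0% vs 3.6% conversion over 10,000 visitors each:
z, p = significance(300, 10000, 360, 10000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at 95% when p < 0.05
```

A p-value below 0.05 corresponds to the 95% confidence threshold mentioned earlier, but remember: declare a winner only once both the confidence and the planned visitor count from step 4 have been reached.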