Why A/B testing is almost never the solution to your problem
When developing products or improving conversions it’s quite common to go for A/B testing. That sometimes makes sense, but in most cases it won’t be the solution to the problem you have.
Let me explain why.
When is A/B testing useful?
A/B testing is good for just one case.
You have a product or website with lots of traffic that is already working really well, and you want a small part of it to work even better.
But you don’t want to ruin the good thing you already have. So you come up with an improvement you really believe in, and then you A/B test it to make sure you don’t make things worse.
With a lot of users/visitors/viewers/customers you get a result. If that result shows a statistically significant improvement, you implement the change.
That’s it. In almost any other case, A/B testing is probably just a waste of time.
When you shouldn’t A/B test
If you know that what you already have isn’t working, there is no point in comparing it with something else.
If you don’t have enough traffic/viewers/customers to make the A/B test conclusive and actionable, there’s no point in doing it either.
What are you going to do if 457 people preferred one thing and 521 people preferred the other during two weeks of split testing? Neither is obviously better, so the result doesn’t help you make the decision.
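Whether a split like that is “significant” depends on how many visitors actually saw each variant. Here is a minimal sketch of a two-proportion z-test, assuming (hypothetically) that each variant was shown to 10,000 visitors; it uses only Python’s standard library:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical denominators: 10,000 visitors per variant over two weeks
z, p = two_proportion_z(457, 10_000, 521, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up denominators the p-value lands right around the conventional 0.05 threshold: a fragile, borderline result, not the clear answer you’d want before shipping a change.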
A/B testing is about helping you decide between two versions of something. So first of all you need to have two good options to choose from. Otherwise you don’t need help.
And then you want a clear answer, because an unclear one won’t help you at all. So you need statistical significance, preferably within a short period of time.
If you want to make a big change or pivot, there is really no point in A/B testing. Just test the pivot. You probably don’t believe much in what you already have, so why compare against it? The A/B test won’t tell you why the new thing works better or worse anyway, because the changes are too big to isolate.
Do you really want an orange button?
A/B testing works best when comparing small differences. Usually small differences won’t make a huge impact on your overall goals, but they can increase conversions – for example, changing the main message or the copy on the CTA.
But you should also ask yourself if the tests you are doing to optimize things are in your best interest in the long run.
For example if you are A/B testing your CTA color until you have a clear winner you will probably end up with an orange or red button. But do you really want an orange button?
It might be clearer and give you a 5.4% increase in conversions. But you ruin the branding, and are you sure that 5.4% is worth it in the long run? Will those customers even be good customers, or are they the ones that shouldn’t have clicked the button in the first place?
Highway to A/B testing hell
A lot of social media platforms use A/B testing to incrementally improve their product. They sure have the traffic and they probably have two good options to pick from.
But they can also run into problems with A/B testing, because they test small changes and gradually morph the product into something that improves their KPIs short term. What they don’t see is how it can ruin the metrics that matter long term.
I think Instagram is a good example of this. They are clearly running a lot of experiments in the app, changing things and moving stuff around. It might temporarily improve some KPIs and help the team doing the tests with their report – “Good job everyone!” – but slowly they ruin the overall user experience and make the product lose focus. What is Instagram even about now? Reels? Images? Stories? Videos? Your friends? Influencers? Likes? Commenting? Live video? Who knows.
So, in conclusion: yes, A/B testing can sometimes be good. But the next time someone is unsure whether to implement a feature or change and suggests an A/B test, think carefully about whether that is really the answer to your problem.