A/B testing is the best way to find out how you can mold your website to optimize conversions. However, it does not guarantee better numbers for your business. If you go about A/B testing incorrectly, your efforts could amount to beating the air—leaving you frustrated, tired, and with nothing to show for all your work.
Is your A/B testing working for you or against you? Are you making any of the following mistakes?
Targeting Instant Gratification Instead of Long-Term Results
This first mistake is a common one among number-lovers. They ignore human nature and the context of the data and narrow their focus to the numbers immediately in front of them.
Imagine you tweak your sales page so it makes grandiose claims about your product that aren't technically lies, but that some people would see as taking liberties with the truth. Sensational promotion will likely boost conversions in the short term, and your A/B tests will say the hard-hitting copy is the way to go—but the lift will be short-lived.
People value integrity in a business. If you develop a reputation for playing fast and loose with the truth, your conversion rates will take a nosedive in the long run.
An easy way to avoid this mistake is by applying the good old-fashioned golden rule of putting yourself in the shoes of your website visitors before you effect any big changes.
Believing Every Positive is a True Positive
The A/B test says it’s a hit! You did everything right with the test. You collected a significant amount of data and you have 95 percent confidence that making a certain change to your website will boost conversions. However, after you implement the change, you find the results are nothing like what you expected.
Instead of scratching your head and wondering what is wrong with your website visitors, accept that the test yielded a false positive. Run another test to confirm your suspicions and start hypothesizing about what more you can do to optimize your website.
False positives are particularly likely to crop up when your A/B testing takes in more letters of the alphabet. If you have tests running for eight or nine different website changes at once, with one in particular looking promising, don't rely on that data. Narrow the field of testing and see if it still looks like a beneficial change.
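The math behind that warning is simple to sketch. Assuming independent tests at the usual 5 percent significance level (an illustrative assumption, not a figure from this article), the chance that at least one variant is a false positive climbs quickly as you add variants—and a standard Bonferroni correction shows how much stricter each test would need to be:

```python
# Why multi-variant tests breed false positives: at a 5% significance
# level, the chance that AT LEAST ONE of k independent tests fires by
# pure luck grows with k. Numbers are illustrative assumptions.

ALPHA = 0.05

def family_wise_error(k, alpha=ALPHA):
    """Chance that at least one of k independent tests is a false positive."""
    return 1 - (1 - alpha) ** k

def bonferroni_alpha(k, alpha=ALPHA):
    """Stricter per-test threshold that keeps the overall error near alpha."""
    return alpha / k

for k in (1, 4, 9):
    print(f"{k} variants: {family_wise_error(k):.0%} chance of a false positive, "
          f"per-test alpha would need to be {bonferroni_alpha(k):.4f}")
```

With nine variants running at once, the chance of at least one fluke "winner" is roughly 37 percent—which is why narrowing the field and retesting matters.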
Targeting Significance Instead of a Preset Number of Data Points
Some marketing professionals will say it’s okay to stop A/B testing once the data reaches significance. However, doing so can increase your chances of a false positive. Before you start a test, decide how many data points you want to collect and run the test until you reach that number—not before, even if the data has already reached significance.
Also, keep in mind that you can’t trust data that excludes other factors, like the day of the week and the season. Run your test for at least a full week and consider if any events (Christmas, seasonal weather, etc.) could be impacting the sales data.
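To put a number on "decide before you start," the standard two-proportion sample-size formula works as a back-of-the-envelope check. The baseline rate, target lift, and power below are illustrative assumptions, not figures from this article:

```python
# A sketch of fixing your sample size BEFORE the test starts, using the
# standard two-proportion formula. z-scores assume 95% confidence (1.96)
# and 80% power (0.84); baseline and lift are made-up example values.

def required_visitors_per_variant(baseline, lift, z_alpha=1.96, z_beta=0.84):
    """Visitors needed in each variant to detect `lift` over `baseline`."""
    p1 = baseline
    p2 = baseline + lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (lift ** 2)
    return int(n) + 1  # always round up

# e.g. a 5% baseline conversion rate, hoping to detect a 1-point lift
print(required_visitors_per_variant(0.05, 0.01))
```

Whatever number this gives you, run the test until every variant reaches it—even if the dashboard flashes "significant" earlier.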
Wasting Time on Testing Tiny Changes
While it’s true that sometimes a small tweak can yield amazing results in split testing, it usually won’t. Thinking that changing a word in a headline or the color of your website’s buttons will act as a magic potion is like a brick-and-mortar store owner believing that posting his business hours on the left side of the door instead of the right will bring in more customers.
Focus most of your tests on changes that an average consumer would have an easy time spotting, even if they couldn’t see the two versions side by side.
If you test tiny changes and see significant data in the results, it’s more likely to be a false positive than if you had tested a larger change.
Lumping Mobile and Desktop Results Together
Everyone is familiar with the frustration that comes along with trying to navigate a desktop-designed website on a mobile device. It’s awkward and sometimes not worth the effort. Therefore, if you only have a desktop site and you put all visitor data, including access from computers and mobile devices, together, you are skewing your results big time. Pew Internet Research found “34 percent of cell internet users go online mostly using their phones, and not using some other device such as a desktop or laptop computer.”
Even if you do have a separate mobile-optimized website or you use responsive web design (RWD), resist the urge to throw all the conversion data into the same basket. The ways in which people use the Internet on their smartphones differ from how they typically use it on their desktop computers.
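Segmenting takes only a few lines once your visit log records the device. A minimal sketch, using made-up data, of how a pooled conversion rate can hide a large gap between segments:

```python
# A minimal sketch of segmenting conversion data by device. The visit
# log below is hypothetical example data, not real analytics output.

from collections import defaultdict

# Each record: (device, converted?)
visits = [
    ("desktop", True), ("desktop", True), ("desktop", False), ("desktop", False),
    ("mobile", False), ("mobile", False), ("mobile", False), ("mobile", True),
]

def conversion_rates(records):
    """Return the pooled conversion rate and a per-device breakdown."""
    totals, wins = defaultdict(int), defaultdict(int)
    for device, converted in records:
        totals[device] += 1
        wins[device] += converted  # True counts as 1
    overall = sum(wins.values()) / sum(totals.values())
    by_device = {d: wins[d] / totals[d] for d in totals}
    return overall, by_device

overall, by_device = conversion_rates(visits)
print(f"pooled: {overall:.0%}")  # the single number hides the device gap
for device, rate in sorted(by_device.items()):
    print(f"{device}: {rate:.0%}")
```

Here the pooled rate looks middling, while the breakdown shows desktop converting at twice the mobile rate—exactly the signal that lumping the data together throws away.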
Resting on Your Laurels
Remember, you want to optimize your website, not just make it mildly effective. Optimizing requires a constant quest for improvement and, consequently, constant A/B testing, which isn’t to say you should run tests solely for the sake of running tests. Always make sure every change you test is well reasoned and designed to help you reach a specific conversion goal.
Your tests won’t always yield eye-popping results. However, a small improvement in conversions is still an improvement, one that contributes toward a healthy bottom line for your business.
Keep in mind that what works varies depending on time of year, the economy, and hot news items. If something didn’t work for you two years ago, but you think might have a greater impact now, don’t be afraid to test it out again in current conditions. Maybe two years ago you were ahead of your time.
Successful A/B testing takes a delicate balance of statistical know-how, strategy, patience, humility, and understanding of human nature. Avoiding the above mistakes can save you time, effort, and frustration as you strive to boost conversions and take your business to the top.
Contact us today to increase your conversion rate . . .
We put our money where our mouth is – click here to contact us today and learn how we can drastically increase your website conversion rate for no money up front, and no fees until we get you an increase!
By Jon Correll