Guide to A/B testing — what it is, how it works, examples, and tools
Whether it’s your landing pages, product pages, or the user experience on your brand’s app, you surely want to increase conversions and engagement. To do that, you need to know what works for your audience and what doesn’t — and that’s where A/B testing comes in.
A/B testing provides the data necessary to make the right decisions about seemingly small but important details. You probably know you should be A/B testing everything from your website to your email marketing efforts, but maybe you haven’t done it in a while — or maybe your team tried A/B testing in the past and it wasn’t useful. Or maybe you’ve never done it and don’t know where to start.
A/B testing is crucial in today’s competitive digital marketing landscape, but it has evolved over the years. It’s time for your brand to recommit to split testing.
In this article, you’ll learn how A/B testing works, how to do it effectively, and why it’s so beneficial, as well as what to do with the results you get and how to do better A/B tests for your business. With the right tools, A/B testing becomes so second nature that you’ll always be running some kind of split test to improve engagement and you’ll feel more confident about your marketing strategies.
This guide to A/B testing will cover the following:
- What is A/B testing?
- How it works
- What you should test
- How to do A/B testing
- How to analyze the data
- A/B testing examples
- Split-testing tools
- Statistical significance calculator
- How to get started with A/B testing
What is A/B testing?
A/B testing is an experiment in which marketers split their audience, show each group a different version of a single variable in a campaign, and measure which version performs better. Also called split testing or bucket testing, A/B testing can be used for emails, web pages, product designs, apps, and more.
A/B testing is a conversion rate optimization (CRO) technique that businesses use to boost conversions. After analytics, A/B testing is the second most-used tool for CRO.
Whether it’s conversion rates on a landing page or email click-throughs, A/B or split testing allows you to take a scientific approach to marketing and design. Over time, A/B testing will help you better understand your customers so you can make more effective choices too.
Types of testing
A/B testing is a CRO technique in its own right, but it comes in a few varieties. Depending on the information you need, you might want to use one of these types of A/B testing to get better insights into your shoppers.
With split testing, you’re able to make data-driven decisions. Because you only test one element at a time, you can prove that a single change encourages more audience engagement.
Split URL testing
A standard A/B test directs website traffic to the same URL but shows different designs to different visitors. Split URL testing works similarly, except it displays a page with a different URL to each group of users. This is a back-end change where you adjust the URL for each variation.
You might direct 50% of your traffic to www.example.com/freebies and the other 50% to www.example.com/resources. The unique URLs make it easier to monitor your A/B test results in an analytics tool.
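Under the hood, split-testing tools often bucket visitors deterministically so each person sees the same variation on every visit. Here is a minimal sketch of that idea in Python, reusing the two example URLs above (the function name and experiment label are illustrative, not taken from any specific tool):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "freebies-vs-resources") -> str:
    """Deterministically bucket a visitor into one of two URL variants.

    Hashing the visitor ID together with the experiment name keeps each
    visitor in the same bucket on repeat visits, preserving a clean
    50/50 split across the audience.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    if bucket < 50:
        return "www.example.com/freebies"
    return "www.example.com/resources"
```

Because the assignment is a pure function of the visitor ID, any system that knows the ID (server, edge, analytics pipeline) can recompute which variant a visitor saw.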
Multivariate testing
In a typical A/B test, you’re testing a single variable across several variations. For example, if you want to test your website forms, you would create three versions of the form and keep the other variables on each page the same. This way, you’re comparing apples to apples every time.
Multivariate testing allows you to test multiple variables on multiple pages all at once. If you want to test CTAs, images, headers, and more, you don’t have to run separate A/B tests for each option — and that saves you a lot of time. All you need to do is run a multivariate test and you’ll get all of your answers from one campaign.
However, multivariate testing is much more complex than a typical A/B test because you’re testing so many variables. For that reason, it’s best to wait to do multivariate testing until you have a few A/B tests under your belt first.
Multi-page testing
A/B testing typically looks at changes made to one aspect of your digital marketing. But that’s not representative of the entire customer journey. What shoppers see before they reach your website, app, or email can influence how they interact with your content. Sometimes you need to test the entire experience leading up to a landing page, and that’s where multi-page testing comes in.
With multi-page testing you can test every stage of the funnel, giving shoppers a different version of your content to see what sticks. Multi-page testing is less myopic than A/B testing because it looks at the net effect of the customer journey (rather than a single page). However, it requires a lot of content creation and planning, so executing this type of test can take a lot of time.
Benefits of A/B testing
Although there are different types of testing available to your brand, A/B testing is still the best place to start. That’s because A/B testing helps you better understand your customers, quantify what works, improve ROI, increase time on site, and evaluate your efforts.
Give customers what they want
Your business exists to solve customers’ biggest problems. But you need to know if you actually are. For example, maybe you think you should create a customer loyalty program, but your shoppers actually want free shipping. You won’t know what engages your customers best unless you run a test first. In this way, split testing helps you understand what actually motivates your shoppers to take action — and how you can convince them to choose your brand over everyone else.
Quantify what works
With A/B testing, there’s no guesswork. Your graphic designer doesn’t need to wonder which layouts or colors your audience likes. And your app developer knows exactly what type of experience your users expect from your branded app. A/B testing software allows you to run tests on a constant basis, generating helpful data that you can use to quantify what works. If you’re trying to run a more data-driven business, A/B testing is a must.
Improve ROI
You’re already investing in email, social media, your app, products, and a website. You want to get results from all of your hard work. A/B testing helps brands enjoy increased engagement and all the return on investment (ROI) that comes with it. In fact, 60% of businesses that do A/B testing say it’s highly valuable for increasing conversion rates. Furthermore, 70% of brands increased landing page sales with A/B testing, and some businesses reported as much as a 50% increase in average revenue per user.
Increase time on site
The more time users spend on your site, the more interested they’ll be in your solution. If people bounce from your site within a few seconds, however, they aren’t interacting with your brand. That translates into lost conversions and fewer sales. But with split testing, you know which elements people want to see. You can give your audience exactly what they need to stay on your website for as long as possible.
Increased time on site is also a positive indicator to search engines. If you want to rank higher in search engines like Google, increasing your time on site can improve your organic search traffic.
Know what moves the needle
Less than 50% of businesses use split testing, but it’s one of the most powerful tools you have to prove that your approach is effective. With A/B testing, you can learn what works for your business. Because you test just one variable at a time, you can demonstrate which changes are most impactful. That means you’ll spend less time on experiences that don’t work and focus your resources on what shoppers want to see.
How does A/B testing work?
Most marketers A/B test their landing pages, emails, and pay-per-click (PPC) ads, but you can split test just about anything.
For example, you can use A/B testing to discover how:
- Decreasing the number of form fields in the checkout process increases purchases.
- Adding testimonials to a product page increases conversions.
- Your email click-through rates increase when you add a video thumbnail to the email body copy.
Just 7% of brands say it’s hard to do A/B testing. So the good news is that with the right tools, split testing is a pretty easy process. Every organization is different, but A/B testing usually follows these steps:
- Create a hypothesis. Proper A/B testing should follow the scientific method. To start your experiment, you need to have a theory. Your hypothesis may or may not work out, but it’s an assumption to test. For example, your hypothesis might be, “Making the CTA button orange will increase click-throughs” or “Adding emojis to our email subject lines will increase open rates.”
- Choose your versions and variables. Next, you’ll choose a variable that fits with your hypothesis and create two versions of it. All of the other elements of your email, page, or app should be identical — just one variable must be different. You’ll need to create at least two versions. Just keep in mind that tracking performance across a handful of versions might make it more difficult to understand your results.
- Use split-testing software. You can’t do A/B testing manually. You’ll need to use split-testing software to serve different versions of a page, app, or email to your audience. The software will measure how people engage with the different versions and report on the data. At the end of the experiment, you can see which version got the most positive results.
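To make the measurement step concrete, here is a small sketch of what the software’s reporting boils down to: tallying impressions and conversions per version and comparing conversion rates. The event data below is made up purely for illustration:

```python
from collections import Counter

# Hypothetical event log a split-testing tool might collect:
# (variant shown, whether the visitor converted).
events = [
    ("A", True), ("A", False), ("A", False), ("A", False),
    ("B", True), ("B", True), ("B", False), ("B", False),
]

impressions = Counter(variant for variant, _ in events)
conversions = Counter(variant for variant, converted in events if converted)

for variant in sorted(impressions):
    rate = conversions[variant] / impressions[variant] * 100
    print(f"Version {variant}: {conversions[variant]}/{impressions[variant]} = {rate:.0f}% conversion")
```

Real platforms layer segmentation and significance testing on top of this, but the core comparison is exactly this per-variant rate.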
What should you test?
With A/B testing, you can test just about any element of a page, app, or email. But here are some of the most important things to test.
Titles and headlines
Page titles and headlines give users an idea of what to expect from your content. Search engines also scan your titles and headlines to see what your content is about, so split testing could improve your SEO and ranking over time too.
Calls to action
Calls to action (CTAs) tell your audience what you want them to do next. Since CTAs help you transform leads into paying customers, it’s no wonder why one-third of brands that do A/B testing always test their CTAs. In fact, CTAs are the most common variable that brands split test.
If users aren’t biting on your current CTAs, split test a few options and see if you get better results. For example, you might find that “Learn more” is too weak a CTA. Instead, replace it with stronger options like these:
- Shop now
- Get the free guide
- Send message
- Subscribe now
- Join our mission
- Donate here
- Sign up
- Try for free
- Get started
There are so many CTAs you can try out, so drop in a few options to see what works. Just make sure the CTAs are relevant. If you don’t want people to “Learn more” and actually want them to start shopping, say that. Transparency builds trust, so be upfront about what the next step is.
Landing pages
You rely on landing pages to generate leads, bring in more sales, and promote your products. You put a lot of thought into your landing pages, so use A/B testing to measure their effectiveness.
The average landing page conversion rate is 26%. But with split testing, some businesses report seeing conversion rates in excess of 70%. Even if you don’t convert 70% of your traffic, you can boost landing page performance by split testing these variables:
- Social proof
- Multimedia like photos and videos
Images
The images you use on your website, app, landing pages, and emails tell users what they can expect from you. You can say a lot with a well-placed image, so A/B test your images to see what sticks.
That might mean:
- Trying out different color palettes
- Testing lifestyle photos versus product photos
- Testing a male versus a female subject
- Testing realistic versus cartoon images
Whatever you do, make sure the images are always high quality. Low-quality images will lose your shoppers’ trust and hurt your conversion rates. Your images should be at least 300 dots per inch (DPI).
Forms
Forms are necessary for collecting information from your customers, but a lot can go wrong. More than 80% of people who start to fill out a form will leave, but A/B testing can help you identify ways to improve your conversions.
You can test:
- The length of the form
- Adding or removing a progress bar
- Adding or removing example text in each form field
- The copy used on the “Submit” button
- Color and design elements like font
Navigation menus
That’s right — you can even split test your navigation menu. This is especially helpful if you want to evaluate your home page or landing page performance. Try A/B testing:
- Where you place navigation elements
- Which items you include in the navigation
- How many items are in the navigation menu
- Whether you should have a navigation menu at all
Email subject lines
You need solid subject lines if you want people to open your emails and take action. But you need to determine which subject lines persuade people to open an email — and which options will send you straight to their trash folder.
Aside from CTAs, email subject lines are some of the most common variables to split test. Nearly 60% of brands report split testing their email subject lines. Many email service providers (ESPs) offer this feature out of the box, so split testing subject lines is extremely easy.
Many brands adjust these variables to improve their email performance:
- Adding the recipient’s name or business name
- Asking questions
- Adding emojis
- Creating urgency (“Sale ends in two hours”) or scarcity (“Only two slots left”)
Body copy
No matter how you slice it, body copy is the meat of your app, landing page, or email content. The good news is that A/B testing allows you to assess any piece of copy.
Split test your product descriptions, website home page taglines, and ad copy to see what resonates most with your audience. Play with the tone (whether people prefer casual or formal), formatting (whether people like headers, bullets, or bold text), length (whether they prefer longer or shorter copy), and more.
Page layout
A/B testing page layout is an option that often falls by the wayside, but how you lay out a page can have a big effect on the user experience. Split test where you place elements on your website and see if it makes a difference. Test things like:
- Social proof
- Carousel versus fixed images
- Long versus short pages
- Font sizes
- Color schemes
How to do A/B testing
A/B testing isn’t complicated, but you do need to follow a careful strategy to do it correctly. You want to keep your data as clean as possible to make the best decisions. Follow these seven steps to conduct better A/B tests.
1. Test one element at a time
First things first — meet with your team and decide which elements you want to test. It’s tempting to create two totally unique landing pages and compare them against each other, but that will only make it harder to determine which variables had an impact on conversion rates. With traditional A/B testing, you only want to test one element at a time. This way, you know which changes actually led to positive (or negative) results.
2. Establish a goal
Once you decide on the variable you plan to test, get clear on your goals. What do you want to get out of this split test? Conversions are the most common objective, but your team might also want to:
- Increase form submissions
- Lower bounce rates
- Decrease abandoned carts
Once you know your goal, you’ll need a baseline for comparison. Where are you right now? For example, if you want to lower bounce rates, you need to know what your current bounce rates are. This baseline will help you prove that the changes you made actually benefited your brand. So if your bounce rate is 40% right now and your A/B test gets it down to 20%, that’s a sign that you’re on the right track.
3. Create a variant
Now it’s time to create a variant. If you’re new to A/B testing, stick with two versions to start: a control page and a single variant. For example, if you want to compare CTAs, test two otherwise identical pages with different CTAs and see which one performs better.
We recommend starting small because each additional variant makes the experiment more complex and makes it harder to reach statistically significant results.
4. Split sample groups, if necessary
This isn’t always necessary, but sometimes you’ll need to watch your audience demographics during a split test. For example, if you’re split testing emails, you need to make sure you send the split test to two similar audiences. If you send them to a segment of longtime customers and a segment of new leads, you’re going to have skewed results because these two audiences are so different.
If you only want to see how a certain segment reacts to your split test, ensure that your test is only reaching people in that audience.
5. Decide on a schedule
You shouldn’t run A/B tests in perpetuity. They’re an experiment, and all experiments need a start and end date so you can gather data and analyze the results.
Your split test needs to be long enough for you to gather meaningful data. Two weeks will usually give you enough time, but if you have a particularly large audience or a lot of web traffic, you may be able to test within a shorter window than that.
6. Analyze and act on results
When the A/B test ends, sit down with your team to analyze the results. Your split testing software will pull all of the analytics for you so you can track the performance differences between your two versions.
It’s important to track clicks, shares, and other engagement metrics. But what matters most is whether you achieved the initial goal you set at the beginning of the A/B test.
It’s okay if you didn’t. Unexpected things occur in experiments all the time, so whether you met your goal or not, use this time to dig into the analytics and see what happened. A/B testing is a science, and science will often surprise you. Let the numbers speak for themselves — follow the data and make changes to your approach based on the research.
7. Rinse and repeat
A/B testing isn’t something you do on a one-off basis. Unfortunately, 57% of brands stop split testing once they get their desired results, but there’s so much more for you to uncover about your customers. A/B testing should be a regular part of your marketing team’s CRO strategy.
Everybody wants to see a lift in conversions across all of their campaigns, so don’t lose momentum after the first split test. From here, it’s a matter of rinsing and repeating. With more tests under your belt, you’ll gather even more data and have a better understanding of your audience’s needs.
How to analyze A/B testing data
A/B testing is critical for improving your conversion rates and your marketing approach as a whole. But it creates a lot of data for your team to sift through, so you need to know the best way to glean insights from that split test data.
If you’re using A/B testing software, it will do a lot of the calculations for you. Even then, your team needs to interpret what these calculations mean. You don’t need a statistician to do that, either. Follow these tips to find actionable insights from your split test experiments.
Consider your sample size
If you gathered data from 1,000 visits to your website, you’re going to get lots of information. But if you only got 10 visits to your landing page, that isn’t a great sample size. If you have a small sample size, take your findings with a grain of salt. After all, with only 10 people in a sample, the actions of one person account for 10% of your findings, which will skew the data.
You don’t want to take action on incomplete or incorrect data, so be mindful of your sample size. If you didn’t get enough visitors this time, try to boost traffic for the next experiment so you get a bigger sample size.
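As a back-of-the-envelope check before you run a test, a widely used rule of thumb estimates how many visitors each variant needs. This is a planning sketch only, not a substitute for a proper power calculation:

```python
import math

def visitors_needed(baseline_rate: float, min_detectable_diff: float) -> int:
    """Rough sample size per variant (~80% power at 95% confidence).

    Uses the common rule of thumb n ≈ 16 * p * (1 - p) / d², where p is
    the baseline conversion rate and d is the smallest absolute lift
    worth detecting. Treat the result as a planning estimate.
    """
    p, d = baseline_rate, min_detectable_diff
    return math.ceil(16 * p * (1 - p) / d ** 2)

# Planning example: detecting a 2-point lift on a 10% baseline.
print(visitors_needed(0.10, 0.02))
```

The takeaway: the smaller the lift you want to detect, the more traffic you need, which is why tiny differences on small samples shouldn’t drive decisions.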
Check for statistical significance
If you see a 1% difference between landing page A and landing page B, that gap likely isn’t statistically significant. Generally speaking, small differences in performance mean that your data isn’t definitive enough to guide your decisions either way. How big a difference needs to be depends on your sample size: the smaller the gap, the more visitors you need before you can confidently call one version superior.
Revisit your goals
Ask yourself a few questions. What was your goal for this particular A/B test? Did you achieve it? Which variation led to the best results?
If you didn’t achieve your goal, that’s important information too. For example, maybe you thought changing the CTA button from black to red would increase conversions but the opposite happened. You tested your assumption and it didn’t make a difference, but that’s still significant — now you know that a black button performs better.
Segment your results
As long as you use a robust A/B testing platform, you can look at the split test data to see how it performed across your different audience segments. Look at the data as a whole, and then break it down based on the following:
- Buying history
- Device type
- Buyer persona
You might find that your email performed better for desktop users than mobile users. Or that a specific buyer persona was more likely to convert on a CTA than your other personas.
Check for anomalies or errors
Nobody likes to admit it, but sometimes problems can happen in an experiment. Maybe you accidentally ran your campaign for too long — or not long enough. Perhaps you unintentionally tested a variable that you meant to save for a different split test.
These actions can muddy the waters and skew your data. Problems happen, but make sure your A/B test results are actually valid before you treat them as gospel.
A/B testing examples
A/B testing differs from company to company. Regardless of your customers or industry, split testing can help you improve the customer experience. See how these five brands got better results with split testing.
BBVA
BBVA Bank wanted to become a leader in digital banking, so it decided to optimize its mobile and web experiences. The brand used Adobe Target to create more than 1,000 split tests. This not only grew BBVA’s customer base by 20%, but it also encouraged more people to use online banking. Today, the brand has a 50/50 split between online and traditional customer communications.
Nissan
In-person interactions were on the decline for Nissan, so the brand wanted a deeper understanding of who it was selling to. Nissan used Adobe Target to better understand its sales funnel, focusing specifically on content that drove the most sales.
The brand did A/B and multivariate testing to evaluate design elements like button shape, positioning, and body text. As a result, Nissan decreased bounce rates and increased conversions. It also saw its email open and click rates double.
AAA
AAA wanted to create a digital experience that was completely influenced by member feedback. It partnered with Adobe to boost new memberships by capturing customer feedback and using it to attract new signups.
AAA conducted 450 real-time A/B tests on its website over 18 months. It tested variations of its membership landing page, from images to CTAs to membership benefits. As a result, AAA saw a 45% increase in memberships initiated online. Split testing also helped AAA rank first nationwide in digital satisfaction ratings, when it was previously ranked 14th. It also had an 11 times greater ROI on the budget spent on new digital experiences.
Save the Children
A/B testing isn’t just for enterprises. Nonprofit organizations can use it to boost donations too. In response to COVID-19, Save the Children switched to digital fundraising and content. Before the pandemic, the nonprofit did very little testing — but after performing a few A/B tests, it better understood its donors’ needs.
Save the Children tested a donation live feed, which displayed recent donations as social proof. It also A/B tested components of its website content. As a result, Save the Children increased conversions by 85%. It increased revenue per visitor (RPV) by 25% and raised £1.5 million for Ukraine in just two weeks.
Swisscom
Swisscom, the Swiss telecom company, wanted to better meet the needs of its customers and become more competitive in the digital space. With Adobe Target, Swisscom tested content and banners to create real-time experiences.
The goal was to measure customer behavior, so it used Auto-Allocate to automate A/B testing in real time. Swisscom’s new approach saved time and made personalization possible — and the brand saw a 40% uplift thanks to split testing.
Split-testing tools
Split testing is a must-have for any data-driven team. However, it’s not something you can do manually. You need the right tools on your side to do A/B testing correctly. Every business has its own tech stack, but if you need guidance, these three tools will help you jump-start your A/B testing.
1. Google Analytics
Your business likely already uses Google Analytics, which is a free analytics tool. It’s great for gaining basic insight into how your website performs. It’s definitely not as robust as paid options, but if you’re a small organization or you just want proof of concept, Google Analytics can function as an A/B testing tool.
To set it up, create or sign in to your Google Optimize account, then go to “Create Experiment” to run your A/B test. You can add variants and even weight certain variants to display more often than others. Be sure to add your objectives so Google knows your goal. Add start and end dates — when the campaign finishes, Google will notify you so you can check the results.
2. Email service provider tools
Google Analytics can handle basic website A/B tests, but if you want to split test your emails, you’ll need to use split-testing tools through your ESP. Many of these platforms offer A/B testing as part of your subscription, although the exact setup depends on your ESP.
Most major email service providers offer some form of basic split testing.
3. Adobe Target
If you want more advanced split testing options — including the ability to do multivariate testing — go with a robust A/B testing platform like Adobe Target. Adobe Target is a personalization solution that delivers the right experiences to every single customer at scale. Using a combination of A/B and multivariate testing, it works as a powerful optimization engine across screens and channels.
Adobe Target will tell you which experiences fuel better customer outcomes. Use the drag-and-drop interface to easily test different variations, define your audience, and select success metrics. The platform will walk you through the entire process one step at a time, so it’s incredibly intuitive.
Statistical significance calculator
Keep in mind that A/B testing is only useful if the findings are statistically significant. This means that you can attribute the results to an actual difference between the versions you tested — and not because of an error or a small sample size.
Many split testing platforms will calculate statistical significance for you — but if they don’t, you’ll need to do it yourself.
For starters, you’ll need to calculate the conversion rate for each variant with this formula:
Conversion rate = (conversions ÷ total visitors) × 100
For example, if you had 90 conversions and 1,000 views:
Conversion rate = (90 ÷ 1,000) × 100 = 9%
Next, you’ll need to look at the percentage difference between the two conversion rates with this formula:
Percentage difference = ((B − A) ÷ A) × 100
For example, if Test A converted at 9% and Test B converted at 12%:
Percentage difference = ((12 − 9) ÷ 9) × 100 ≈ 33%
This means that Test B converted 33% better than Test A.
A sizable relative difference like this is a good sign, but on its own it doesn’t prove statistical significance, because significance also depends on how many visitors saw each variant. Most teams test at a 95% confidence level, meaning there’s less than a 5% probability that the result is due to chance. You can run the math by hand, consult your A/B testing platform, or use online calculators from Neil Patel or SurveyMonkey.
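If your platform doesn’t report significance for you, a standard two-proportion z-test is one common way to compute it yourself. This is a sketch under the usual large-sample assumptions (each variant should have a healthy number of conversions):

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates.

    Returns (z, p_value). A p-value below 0.05 is the conventional bar
    for statistical significance at a 95% confidence level.
    """
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    # Normal-distribution tail probability via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# The 9% vs. 12% example, assuming 1,000 views per variant:
z, p = two_proportion_z_test(90, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that the same 9% vs. 12% rates with only 100 views per variant would not clear the significance bar, which is why sample size matters as much as the size of the lift.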
Getting started with A/B testing
Smart split testing helps brands understand their customers, improve the digital experience, and stay one step ahead of industry trends. Adobe Target helps both enterprises and growing brands with better A/B testing that gets results.
Built for marketers, Adobe Target makes it easy to perform testing, measurement, and optimization — all without writing code. With the power of unified customer profiles and machine learning from Adobe Sensei, Adobe Target personalizes experiences based on historical behavior rather than the latest transaction.
A/B testing might require some research, but it should be part of every data-driven organization’s playbook — especially in these competitive times. When it’s time to start A/B testing, find the right tool for the job. From there, review which elements you want to test. It’s as easy as plugging the elements into your A/B testing platform, analyzing the results, and making the changes going forward.