A/B Testing — What it is, examples, and best practices.

Adobe Communications Team

07-24-2025

Comparing two different user experiences to make an informed decision is the essence of A/B testing in the digital world. Whether it’s landing pages, email campaigns, or mobile app experiences, A/B testing provides the data necessary to move beyond guesswork and make decisions that resonate with customers and drive business goals.

However, the conversation around A/B testing has matured significantly. It is now a strategic discipline that helps businesses understand customers deeply. The core function of A/B testing is to transform conversations from “we think this will work” to “we know this works.”

What is A/B testing?

A/B testing is an experiment in which marketers split their audience and create multiple versions of a particular variable in a campaign to test the effectiveness of that variable and determine which version performs better. A/B tests can be used for emails, web pages, product designs, apps, and more. A/B testing is a conversion rate optimization (CRO) technique that businesses use to boost conversions.

Whether it’s conversion rates on a landing page or email click-throughs, A/B or split testing allows you to take a scientific approach to marketing and design. Over time, A/B testing will help you better understand your customers, enabling you to make more effective choices.

The fundamental value proposition of A/B testing is its ability to replace subjective decision-making with objective, quantitative data. Every digital experience is composed of countless variables (headlines, images, calls to action, layouts, user flows), and each one represents a hypothesis about what will best serve the user and the business. A/B testing is the process of rigorously testing these hypotheses in a controlled environment. This enables teams to validate their ideas, learn from both successes and failures, and continually improve the customer experience based on actual user behavior.

This data-driven approach fosters a culture of humility and learning, where even the most strongly held opinions can be challenged and refined by empirical evidence. As organizations mature, this culture of experimentation becomes a powerful engine for growth, ensuring that resources are invested in changes that demonstrably move the needle on the metrics that matter most.

Quantify impact.

While increasing conversion rates is a common and important goal of A/B testing, mature experimentation programs look beyond this single metric to optimize for more strategic business outcomes. A pivotal concept in this advanced approach is the primary metric, also called a North Star Metric (NSM). An NSM is a single metric (or a small set of metrics) that encapsulates the core value a product delivers to its customers and serves as a leading indicator of long-term revenue and success.

For example, a media streaming service might define its NSM as "time spent listening," because this metric reflects deep user engagement and predicts subscription retention. A/B tests in these organizations are designed not only to increase sign-ups but also to drive behaviors that directly contribute to the NSM.

Beyond the NSM, a robust testing program will track a portfolio of key business metrics to gain a holistic understanding of impact.

By aligning A/B testing goals with these high-level business metrics, organizations ensure that their optimization efforts are directly contributing to sustainable growth and profitability.

Test.

The true power of A/B testing is unlocked when it transitions from a series of one-off projects into a continuous, programmatic business function. A single successful test can provide a valuable lift. Still, an ongoing culture of experimentation creates a deep, data-backed understanding of customer behavior that other brands cannot easily replicate. This organizational learning is the ultimate strategic asset derived from A/B testing.

Building this culture requires a shift in mindset across the organization. It means viewing every new feature, campaign, or design change as a testable hypothesis. It involves celebrating the learnings from failed tests as much as the wins from successful ones, as both contribute to a more innovative, more customer-centric organization. Ultimately, a mature experimentation program transforms a company's decision-making DNA, moving it from being reactive and opinion-led to proactive and data-driven. This transformation is far more valuable than any single increase in conversion rate.

How to A/B test.

A/B testing isn’t complicated, but you do need to follow a careful strategy to do it correctly. You want to keep your data as clean as possible to make the best decisions. Follow these seven steps to conduct better A/B tests.

1. Test one element at a time.

First things first: meet with your team and decide which elements you want to test. It’s tempting to create two unique landing pages and compare them against each other, but that will only make it harder to determine which variables had an impact on conversion rates. With traditional A/B testing, you only want to test one element at a time, so you know exactly which change led to positive (or negative) results.

2. Establish a goal.

Once you decide on the variable you plan to test, clarify your goals. What do you want to get out of this A/B test? Conversions are the most common objective, but your team might also want to lower bounce rates, boost email click-through rates, or increase sign-ups.

Once you know your goal, you’ll need a baseline for comparison. Where are you right now? For example, to lower bounce rates, you need to know your current bounce rate. This baseline will help you demonstrate that the changes you made had a positive impact on your brand. So, if your bounce rate is 40% right now and your A/B test reduces it to 20%, that’s a sign that you’re on the right track.
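As a quick sketch of that arithmetic, using the hypothetical bounce rates from the example above, it helps to report both the absolute and the relative change against your baseline:

    # Compare a test result against the baseline recorded before the test.
    baseline_bounce_rate = 0.40  # bounce rate before the change
    variant_bounce_rate = 0.20   # bounce rate observed during the A/B test

    absolute_change = variant_bounce_rate - baseline_bounce_rate
    relative_change = absolute_change / baseline_bounce_rate

    print(f"Absolute change: {absolute_change:+.0%}")  # -20% (percentage points)
    print(f"Relative change: {relative_change:+.0%}")  # -50%, i.e., cut in half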

3. Create a variant.

Now it’s time to create a variant. If you’re new to A/B testing, start with a control page and a single variant. For example, if you want to compare CTAs, test two otherwise identical pages with different CTAs and see which one performs better.

We recommend keeping your first test to two versions because adding more variants complicates the experiment and makes it harder to reach statistically significant results.

4. Split sample groups, if necessary.

This isn’t always necessary, but sometimes you’ll need to pay attention to your audience’s demographics during a split test. For example, if you’re split-testing emails, you need to send the two versions to two similar audiences. If you send one version to a segment of long-term customers and the other to a segment of new leads, you’ll get skewed results because these two audiences are so different.

If you only want to see how a particular segment reacts to your split test, ensure that your test is only reaching people in that audience.
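If your list lives in a spreadsheet or database, you can automate this kind of balanced split. Here is a minimal sketch in Python using pandas, assuming a hypothetical subscriber table with a "segment" column. It samples half of each segment so both test groups end up with the same mix of customer types:

    import pandas as pd

    # Hypothetical email list with a 'segment' column distinguishing
    # long-term customers from new leads.
    subscribers = pd.DataFrame({
        "email": [f"user{i}@example.com" for i in range(1000)],
        "segment": ["long_term" if i % 3 == 0 else "new_lead" for i in range(1000)],
    })

    # Stratified 50/50 split: sample half of *each* segment for group A,
    # so both groups contain the same mix of customer types.
    group_a = subscribers.groupby("segment").sample(frac=0.5, random_state=42)
    group_b = subscribers.drop(group_a.index)

    # Sanity check: both groups should have nearly identical segment mixes.
    print(group_a["segment"].value_counts(normalize=True))
    print(group_b["segment"].value_counts(normalize=True))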

5. Decide on a schedule.

You shouldn’t run A/B tests forever. They’re an experiment, and all experiments need a start and end date so you can gather data and analyze the results.

Your split test needs to be long enough for you to gather meaningful data. Two weeks will usually give you enough time, but if you have a large audience or a lot of web traffic, you may be able to test within a shorter window than that.
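If you want a rough estimate of how long your test needs to run, the standard two-proportion sample-size approximation is a reasonable starting point. The sketch below (all numbers hypothetical) estimates the visitors needed per variant to detect a 10% relative lift on a 5% baseline conversion rate, which at 2,000 visitors per variant per day works out to roughly two weeks:

    from scipy.stats import norm

    def sample_size_per_variant(baseline_rate, relative_lift, alpha=0.05, power=0.80):
        """Approximate visitors needed per variant to detect a relative lift
        in a conversion rate (standard two-proportion formula)."""
        p1 = baseline_rate
        p2 = baseline_rate * (1 + relative_lift)
        z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
        z_beta = norm.ppf(power)           # desired statistical power
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

    # Hypothetical inputs: 5% baseline conversion rate, 10% relative lift.
    n = sample_size_per_variant(0.05, 0.10)
    visitors_per_variant_per_day = 2000
    print(f"~{n:,.0f} visitors per variant")  # ~31,000
    print(f"~{n / visitors_per_variant_per_day:.0f} days at that traffic level")  # ~16 days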

6. Analyze and act on results.

When the A/B test ends, sit down with your team to analyze the results. Your split testing software will automatically collect all the analytics for you, allowing you to track the performance differences between your two versions.

It’s crucial to track clicks, shares, and other engagement metrics. But what matters most is whether you achieved the initial goal you set at the beginning of the A/B test.

It’s okay if you didn’t. Unexpected things occur in experiments all the time, so whether you met your goal or not, use this time to dig into the analytics and see what happened. A/B testing is a science, and science will often surprise you. Let the numbers speak for themselves: follow the data and adjust your approach accordingly.
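When you compare the two versions, a quick statistical check helps confirm that the difference isn’t just noise. Here is a minimal sketch using the statsmodels library, with made-up conversion counts:

    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical results: conversions and total visitors for control (A) and variant (B).
    conversions = [310, 370]
    visitors = [6000, 6100]

    # Two-sided z-test for a difference between two conversion rates.
    z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)

    print(f"Control rate: {conversions[0] / visitors[0]:.2%}")  # 5.17%
    print(f"Variant rate: {conversions[1] / visitors[1]:.2%}")  # 6.07%
    print(f"p-value: {p_value:.3f}")  # below 0.05 is conventionally 'significant'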

7. Repeat.

A/B testing isn’t something you do on a one-off basis. A/B testing should be a regular part of your marketing team’s CRO strategy.

Everybody wants to see a lift in conversions across all their campaigns, so don’t lose momentum after the first split test. From here, it’s a matter of rinsing and repeating. With more tests under your belt, you’ll gather even more data and have a better understanding of your audience’s needs.

What should you test?

With A/B testing, you can test just about any element of a page, app, or email. But here are some of the most important things to test.

CTAs.

Calls to action (CTAs) inform your audience of the next step they should take. CTAs are the most common variable that brands split test.

If users aren’t engaging with your current CTAs, consider split testing a few options to see if you can achieve better results. For example, you might find that “Learn more” is too weak of a CTA. Instead, replace it with a stronger, more specific CTA such as “Get started,” “Start your free trial,” or “Download the guide.”

There are numerous CTAs you can try out, so consider including a few options to see what works. Just make sure the CTAs are relevant. Transparency builds trust, so be upfront about what the next step is.

Landing pages.

You rely on landing pages to generate leads, bring in more sales, and promote your products. You put a lot of thought into your landing pages, so use A/B testing to measure their effectiveness. You can boost landing page performance by split testing variables like your headline, hero image, CTA, and lead capture form.

Images.

The images you use on your website, app, landing pages, and emails tell users what they can expect from you. You can say a lot with a well-placed image, so A/B test your images to see what sticks.

That might mean comparing photography against illustrations, trying different hero images, or moving an image to a different spot on the page.

Forms.

Forms are necessary for collecting information from your customers, but a lot can go wrong. A/B testing can help you identify ways to improve your conversions.

You can test the number of form fields, the field labels, the button copy, and the overall form layout.

Navigation.

Split testing your navigation is beneficial if you want to evaluate the performance of your home page or landing page. Try A/B testing your menu placement, link labels, and the order of your navigation items.

Copy.

No matter how you slice it, body copy is the meat of your app, landing page, or email content. The good news is that A/B testing allows you to assess any piece of copy.

Split test your product descriptions, website home page taglines, and ad copy to see what resonates most with your audience. Experiment with the tone (whether people prefer casual or formal language), formatting (such as using headers, bullets, or bold text), length (whether they prefer longer or shorter copy), and more.

Page layout.

A/B testing page layout is an option that often falls by the wayside, but how you lay out a page can have a significant effect on the user experience. Split test where you place elements on your website and see if it makes a difference. Test things like the position of your CTAs, the order of your page sections, and where images and testimonials sit on the page.

Types of A/B tests.

Split URL.

With split testing, you’re able to make data-driven decisions. Because you only test one element at a time, you can prove that a single change encourages more audience engagement.

An A/B test typically directs website traffic to the same URL but displays a different design to different visitors. Split URL testing works similarly, except it shows your users a page with a different URL. This is a back-end change where you adjust the URL for each variation.

You might direct 50% of your traffic to www.example.com/freebies and the other 50% to www.example.com/resources. The unique URLs make it easier to monitor your A/B test results in an analytics tool.
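Under the hood, the routing itself can be as simple as hashing a visitor ID into a stable bucket. A minimal sketch (using the example URLs above and a hypothetical visitor ID):

    import hashlib

    VARIANT_URLS = {
        "A": "https://www.example.com/freebies",
        "B": "https://www.example.com/resources",
    }

    def assign_variant(user_id: str) -> str:
        """Deterministically bucket a visitor so they always see the same URL."""
        digest = hashlib.sha256(user_id.encode()).hexdigest()
        bucket = int(digest, 16) % 100       # stable number from 0 to 99
        return "A" if bucket < 50 else "B"   # 50/50 split

    # A returning visitor is always routed to the same variant.
    print(VARIANT_URLS[assign_variant("visitor-1234")])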

When to use: Split URL testing is the appropriate choice for more radical or exploratory changes, such as complete page redesigns, new page templates, or an alternative checkout flow.

Multivariate (MVT).

In a typical A/B test, you’re testing a single variable across several variations. For example, if you want to test your website forms, you would create three versions of the form and keep the other variables on each page the same. This way, you’re comparing apples to apples every time.

Multivariate testing allows you to test multiple variables, and every combination of their variations, all at once. If you want to test CTAs, images, headers, and more, you don’t have to run a separate A/B test for each option, and that saves you a lot of time. All you need to do is run a multivariate test, and you’ll get all your answers from one campaign.

However, multivariate testing is much more complex than a typical A/B test because you’re testing so many variables. For that reason, it’s best to wait to do multivariate testing until you have a few A/B tests under your belt first.

When to use: MVT is an advanced technique that should be reserved for mature testing programs with substantial website traffic. Because traffic is divided among many variations, a large sample size is required to achieve statistically significant results for each combination. It is best used to refine and polish a page layout after A/B testing has already determined the most effective general design and messaging.
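To see why the traffic requirement grows so quickly, count the combinations. Even a modest test of three hypothetical page elements produces a dozen variations, each needing its own statistically meaningful sample:

    from itertools import product

    # Hypothetical elements under test on a single page.
    headlines = ["Save time", "Work smarter", "Do more with less"]
    images = ["photo", "illustration"]
    ctas = ["Get started", "Try it free"]

    combinations = list(product(headlines, images, ctas))
    print(f"{len(combinations)} page variations to test")  # 3 x 2 x 2 = 12

    # Every combination competes for its own share of the traffic,
    # which is why MVT demands far more visitors than a simple A/B test.
    for headline, image, cta in combinations[:3]:
        print(headline, "|", image, "|", cta)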

Multi-page.

A/B testing typically examines changes made to a single aspect of your digital marketing. But that’s not representative of the entire customer journey. What shoppers see before they reach your website, app, or email can influence how they interact with your content. Sometimes you need to test the entire experience leading up to a landing page, and that’s where multi-page testing comes in.

With multi-page testing, you can test every stage of the funnel, providing shoppers with different versions of your content to see what resonates. Multi-page testing is less myopic than A/B testing because it examines the overall effect of the customer journey (rather than a single page). However, it requires a significant amount of content creation and planning, making the execution of this type of test time-consuming.

When to use: This approach is ideal for testing hypotheses related to user journey consistency, such as changes to branding, messaging, persistent navigation, or sitewide promotional banners. Setting up and executing a multi-page test is significantly more complex than a single-page test, requiring careful planning and a robust testing tool.

A/B/n.

A/B/n testing is a straightforward extension of classic A/B testing, where "n" represents an unknown or variable number of challengers being tested against the control. Instead of just comparing A vs. B, a team could compare A vs. B vs. C vs. D.

When to use: This method is useful when there are multiple viable variations for a single hypothesis. For example, if a team wants to test four different colors for a CTA button, an A/B/n test is more efficient than running a series of separate A/B tests. However, like MVT, it requires more traffic than a simple A/B test because the audience is split among more variations, diluting the sample size for each one.
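One wrinkle with A/B/n tests is that every extra challenger adds another comparison against the control, which inflates the odds of a false positive. A common, conservative adjustment, shown here as a simple sketch, is the Bonferroni correction:

    # With several challengers compared against one control, divide the
    # significance threshold by the number of comparisons (Bonferroni).
    alpha = 0.05
    n_comparisons = 3  # A vs. B, A vs. C, A vs. D

    adjusted_alpha = alpha / n_comparisons
    print(f"Per-comparison threshold: {adjusted_alpha:.4f}")  # 0.0167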

How does A/B testing work?

You can use A/B testing to discover whether decreasing the number of form fields in the checkout process or adding testimonials to a product page boosts conversions, or whether adding a video thumbnail to the email body copy increases your email click-through rates.

With the right tools, A/B testing is straightforward and usually follows the steps outlined above: pick one element to test, set a goal and a baseline, create a variant, split your audience, run the test on a set schedule, analyze the results, and repeat.

A/B test with Adobe Target.

A/B testing enables brands to understand their customers, enhance their digital experience, and stay ahead of industry trends. Adobe Target helps both enterprises and growing brands achieve better A/B testing results.

Built for marketers, Adobe Target makes it easy to perform testing, measurement, and optimization without writing code. Adobe Target personalizes experiences based on historical behavior rather than the latest transaction.

A/B testing might require some research, but it should be part of every data-driven organization’s playbook, especially in these competitive times. When it’s time to start A/B testing, find the right tool for the job. From there, review which elements you want to test. It’s as easy as plugging those elements into your A/B testing platform, analyzing the results, and adjusting for future changes.

Use Adobe Target to deliver an optimized customer experience today. Get a free demo to see it in action.
