A/B testing campaigns test different versions of a single email to see how small changes affect your results. Choose what you want to test, like the subject line or content, and compare results to find out what does and doesn't work for your audience.
In this article, you'll learn about A/B testing campaigns.
Things to know
Depending on your plan, you may not have access to A/B Testing Campaigns. To find out what features are included in each plan, check out our pricing page.
If you have a Premium plan, you’ll have access to our more advanced Multivariate testing. To learn more, check out About Multivariate Campaigns.
When we talk about A/B testing campaigns, we use some terminology that's a little different from how we talk about other tools and tasks in Mailchimp.
Variable
The element of your campaign that you want to test. With an A/B testing campaign, you can test one of four variables: subject line, From name, content, or send time. Each version of the variable is called a variation.
Combination
Each variation of your campaign that is created from your chosen variables. If you want to test three different From names, we'll create three different combinations of your campaign. Combinations sent in the test phase are called test combinations.
Test Phase
The period of time after the combinations are sent, during which we compare the results. Data collected during the test phase can be used to determine the campaign's winning combination automatically or manually.
Winner or Winning Combination
The campaign that performs the best. This may be automatically determined by click rate, open rate, or total revenue, or manually chosen based on the reporting data you find the most valuable.
How A/B testing campaigns work
Set up the A/B testing campaign
You'll choose a single variable type—subject line, From name, content, or send time—and create up to three variations. We'll generate all possible combinations and send them to different sets of recipients, so no one receives more than one combination of your campaign.
The combinations that recipients receive are chosen at random and tracked solely for the purpose of choosing a winner, so you won't be able to see which combination went to a specific person.
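The random, non-overlapping assignment described above can be sketched in a few lines of Python. This is a hypothetical illustration of the logic, not Mailchimp's actual implementation; the function name and example addresses are made up:

```python
import random

def assign_combinations(recipients, num_combinations, seed=None):
    """Shuffle recipients and deal them into disjoint groups, one
    group per test combination, so no one receives more than one
    version of the campaign."""
    rng = random.Random(seed)
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    # Deal recipients round-robin into one group per combination.
    return [shuffled[i::num_combinations] for i in range(num_combinations)]

# Example: three From-name variations across six recipients.
groups = assign_combinations(
    ["a@x.com", "b@x.com", "c@x.com", "d@x.com", "e@x.com", "f@x.com"], 3
)
```

Because every recipient lands in exactly one group, the groups are disjoint and together cover the whole audience.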
Choose winner criteria
Send the combinations to all your recipients at once if you have a small audience or segment, or if you're testing send time. If you're testing another variable with a larger audience or segment, send your test combinations to a percentage of your recipients, then send the winning combination to the rest.
To choose the winner, use one of these options.
Automatic: Open Rate, Click Rate, or Total Revenue
Use these options to send the winning campaign to your remaining recipients after a set amount of time. The winner can be determined by the highest open or click rate, or total revenue if your online store is connected to your account.
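The percentage-split flow with an automatic winner can be sketched in Python. This is a hypothetical illustration of the logic, not Mailchimp's actual implementation; the function names and metric keys are assumptions made for the example:

```python
def split_audience(recipients, test_fraction):
    """Reserve a fraction of the audience for the test phase; the
    remainder later receives the winning combination."""
    cutoff = int(len(recipients) * test_fraction)
    return recipients[:cutoff], recipients[cutoff:]

def pick_winner(combination_results, metric):
    """Automatically choose the combination with the highest value of
    the chosen metric ('open_rate', 'click_rate', or 'total_revenue')."""
    return max(combination_results, key=lambda combo: combo[metric])

# Example: send the test combinations to 50% of a 1,000-person audience.
test_group, remainder = split_audience(list(range(1000)), 0.5)
results = [
    {"name": "Subject A", "open_rate": 0.21},
    {"name": "Subject B", "open_rate": 0.27},
]
winner = pick_winner(results, "open_rate")
# After the test phase, the winner is sent to `remainder`.
```

With the manual option, you would skip `pick_winner` and choose the combination yourself from the report data.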
Manual: Report Statistics
Use this option to choose a winner yourself based on reporting data or other factors that you find to be the most valuable.
Variables you can test
Subject line
Try different phrasing or sales offers to see what gets the most attention.
From name
See if your recipients are more responsive to emails coming from a person's name or from the name of your company or organization. You'll provide the From name and From email address you want to use for each combination.
Content
Create different versions of your content to see what gets a better response. Use this variable to test small content changes or completely different templates.
When you test content, you may want to better understand the efficacy of calls to action, links, or buttons. Use our link comparison tool in the campaign report to see how your links performed in each combination.
Send time
Learn when your recipients are most likely to open your campaigns. Since this option tests specific days and times, you must send your combinations to all your recipients at once, because the winning combination can't be sent at a time that has already passed. Instead, use this data to inform when to send or schedule future campaigns.
A/B testing campaign ideas
Here are some common ways Mailchimp users learn from A/B Testing Campaigns.
What day of the week gets better open rates?
Does a subject line with an incentive or a teaser work best?
Does including your company name in your subject line increase engagement?
Is it better to use your name as the from name, or your company's name?
Does the time of day a campaign is sent affect the click rate?
Are recipients more likely to click a linked image or linked text?
Do recipients prefer a campaign that contains a GIF or one with static images?
Have a question?
Paid users can log in to access email and chat support.