Introducing More Powerful A/B Testing

Way back in 2007, we launched A/B testing—a feature that has helped Mailchimp’s 8 million users learn how different subject lines, From names, and delivery times affect their subscribers’ engagement rates. In this week’s release, we’re excited to introduce a brand-new A/B testing experience, complete with a redesigned interface, native content testing, and the ability to compare 3 different versions of a campaign instead of just 2. Let’s take a look at these powerful new changes.

New A/B testing interface

If you’ve used our A/B testing features in the past, you’ll notice a few differences right away—A/B Split Campaigns have been renamed A/B Testing in the Create Campaign drop-down menu, and the A/B Split step in the Campaign Builder navigation has been replaced by Variables.

Here, you’ll decide which variable you’d like to test and how many different combinations of that particular variable to compare. The familiar A/B testing variables—Subject line, From name, and Send time—have now been joined by Content (more on that in a moment), and up to 3 combinations of a single variable can now be tested at once.

The Variables step is also where you’ll determine how you’d like to test the combinations, what percentage of your list you’d like to test, and which metric (open rate, click rate, or manual selection) will be used to determine the winner. The Summary table on the right side of the page will update as you make changes, displaying the number of combinations you’ve chosen to test, along with how many subscribers are in each of the testing and winning segments, when applicable.
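To make the segment math concrete, here’s a minimal sketch of how a test pool might be divided. It’s an illustration only, not Mailchimp’s actual implementation, and the list size, test percentage, and combination count are hypothetical values.

```python
# Illustrative sketch of the segment math shown in the Summary table.
# Not Mailchimp's implementation; the numbers below are hypothetical.

def split_test_segments(list_size, test_percentage, combinations):
    """Return subscribers per test combination and the size of the
    remaining segment that receives the winning campaign."""
    test_pool = list_size * test_percentage // 100
    per_combination = test_pool // combinations
    winner_segment = list_size - per_combination * combinations
    return per_combination, winner_segment

# Example: test 3 combinations on 50% of a 10,000-subscriber list.
per_combo, winner = split_test_segments(10_000, 50, 3)
print(per_combo, winner)  # 1666 subscribers per combination, 5002 receive the winner
```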

Native content testing

A/B testing has always been a great way to get insight into the preferences and engagement tendencies of your customers, but one thing that had been missing was the ability to A/B test different types of content or design elements. In the past, a limited amount of content testing could be done by placing campaign text, images, or links within a *|GROUP:X|* merge tag, but testing completely different designs required building separate campaigns.

In this release, we’ve simplified the process, eliminating the need for a special merge tag and making content testing native to A/B testing campaigns. Now, you can easily create and test 2 (or 3) different versions of a campaign. All elements of the campaign are fair game, too. Test different text blocks, images, links, calls to action, design elements, or even completely different templates to identify the most effective combination for your audience.

After creating the variations of your campaign, you’ll write a brief description for each so you can quickly distinguish between them as you review your A/B reports.

Comprehensive A/B reports

Speaking of A/B reports, they’ve also been redesigned to be more comprehensive and easier to understand. On the Test Results page, you’ll find the aggregate results for your entire test, along with a breakdown of results for each combination, so you can easily review the performance of each variable. And, if you’d like an even more detailed look at your data, the full campaign report for each combination is just a click away.
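To illustrate how a winner might be chosen from those per-combination results, here’s a small sketch that ranks combinations by open rate or click rate. The field names and figures are hypothetical and don’t reflect Mailchimp’s report schema; manual selection is left out because it’s a human judgment call.

```python
# Hypothetical sketch of choosing a winning combination from per-combination
# results. Field names and figures are illustrative, not Mailchimp's report schema.

results = [
    {"combination": "A", "sent": 1666, "opens": 540, "clicks": 120},
    {"combination": "B", "sent": 1666, "opens": 610, "clicks": 95},
    {"combination": "C", "sent": 1666, "opens": 498, "clicks": 150},
]

def pick_winner(rows, metric="open_rate"):
    """Rank combinations by open rate or click rate and return the best one."""
    def rate(row):
        count = row["opens"] if metric == "open_rate" else row["clicks"]
        return count / row["sent"]
    return max(rows, key=rate)

print(pick_winner(results, "open_rate")["combination"])   # B (highest open rate)
print(pick_winner(results, "click_rate")["combination"])  # C (highest click rate)
```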

A/B reports also feature a brand-new Links tab, with a Content Comparison tool that will provide more insight into the behavior of the links within your campaign. When testing content, this tool will help you quickly review the location and performance of links used within each combination. If your testing combinations have the same links throughout, this section of the report will allow you to identify which link drove the most clicks.

For a full walkthrough of all of our new A/B testing features, visit our Knowledge Base.
