How to Practice Effective A/B Testing for Emails


Chris Donald, Director of InboxArmy

Wednesday, June 24, 2020

A/B testing is a vital element of sending interesting and engaging emails. Without it, you risk making the same mistakes over and over again.


More often than not, marketers either overlook A/B testing or fail to use its maximum potential. In this article, we’ll dig deeper into A/B testing and how to use it effectively.

What is A/B testing?

A/B testing, also known as split testing, is a process for determining which of two variations of an email performs better.

This involves setting up two variations of a single campaign. You send version 1 to one portion of your subscribers and version 2 to another.
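The split itself should be random so neither group is biased toward more engaged subscribers. Here is a minimal sketch in Python of how that split might look, assuming your subscribers are simply a list of email addresses (the function name and the fixed seed are illustrative choices, not part of any particular email platform's API):

```python
import random

def split_for_ab_test(subscribers, seed=42):
    """Randomly split a subscriber list into two equal-sized test groups."""
    shuffled = subscribers[:]               # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)   # fixed seed makes the split reproducible
    midpoint = len(shuffled) // 2
    # Group A receives version 1 of the email, group B receives version 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_a, group_b = split_for_ab_test(
    [f"user{i}@example.com" for i in range(1000)]
)
```

In practice, most email service providers handle this split for you; the point is that assignment to each version should be random, not based on sign-up date or activity.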

For example, Swiggy, an Indian food delivery platform, sent out the same email with different subject lines to figure out which works best. One of the subject lines included emojis, allowing them to test whether the use of emojis influenced the open rate or not.

Swiggy split tested their email subject line to determine whether emojis influence the open rate

Which elements should you test?

Before you begin, decide what you want to test. It can be tempting to test everything, but you should only test one element at a time to get accurate results.

Here are some elements you can test:

  • Subject line
  • Personalization
  • Email copy
  • Layout of the email
  • Use of visuals
  • Call-to-action
  • Testimonials to include
  • Type of promotion
  • ‘From’ name
  • Use of emojis

Two case studies to better understand A/B testing

Here are two examples of how A/B testing can be used to improve performance.

uSell: testing subject lines

uSell, a brand that helps users sell their used electronic devices, A/B tested the subject lines of its drip campaign. As part of this process, they sent an email encouraging customers to ship their devices in free of cost. The objective of the campaign was to get as many customers as possible to take action. To improve open rates, they tested two subject-line variants:

  • Your old device is saying, only 1 week left
  • We still haven’t received your [product_name]

The first version saw a 24.5% improvement in open rate and a 7.5% increase in send-in rate.

The second variant scored a smaller 14.3% improvement in open rate, but increased the number of users sending in their devices by 8.6%.
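Before acting on results like these, it's worth checking that the difference between the two variants isn't just noise. A standard way to do this is a two-proportion z-test on the raw open counts. The sketch below uses hypothetical counts, not uSell's actual data, purely to show the calculation:

```python
import math

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """z-statistic for the difference between two observed open rates."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)  # pooled rate under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Hypothetical counts for illustration only:
z = two_proportion_z(opens_a=300, sent_a=1000, opens_b=250, sent_b=1000)
# As a rule of thumb, |z| > 1.96 indicates significance at the 5% level (two-sided)
```

If the sample in each group is too small, even a large percentage lift can fail this check, which is a signal to keep the test running longer.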

Unbounce: testing CTAs

A study by Unbounce revealed that by replacing “My” with “Your” in the CTA copy, there was a 24.91% drop in the conversion rate.

Split testing call to action copy

In other words, writing in a first-person narrative taps into the 'fear of missing out' and encourages the subscriber to take quick action.

Actionable CTA copy also converts better: variation in the tone of the CTA button significantly influences click-through rate and conversions. For instance, 'Get Started Now' works a lot better than 'Try Now'.

MarketingExperiments highlights and ranks different copy variations based on their click-through rate performance

5 mistakes to avoid while A/B testing

To help test your email campaigns better, here are some mistakes to avoid:

1. You don’t have a solid hypothesis

A valid hypothesis is the foundation of effective A/B testing. Identify a plausible reason for your current email performance and design the test around improving it. Before you start, be clear about your motive for running the test and the end goals you wish to achieve.

2. You compare results with an obsolete campaign

Compare results against recent email campaigns to eliminate the bias introduced by time. Don't compare the results of emails sent in January with those sent during Cyber Week.

3. You miss out on list segmentation before testing

A/B testing should be conducted between subscribers who have similar interests and preferences. Not segmenting your email list might give you unreliable results.

4. You change the metrics in the middle of the test

It’s not advisable to change the metrics or the sample of your testing midway. If you think you need to change the parameters, you should start the test again.

5. There’s no consistency in the test campaigns

The two variants shouldn't differ in too many ways at once. For instance, '50% OFF: Sale is live' and 'Grab your 50% OFF now' are good variants to test, because only the framing of the same offer changes.

Wrapping up

Most email service providers offer an easy-to-understand interface for carrying out A/B testing. Once you're familiar with executing A/B tests, you'll see your email campaigns performing better and yielding greater ROI. A/B testing is a powerful way to turn clicks into customers and deliver a better user experience.

Chris Donald

Director of InboxArmy

Chris is the Director of InboxArmy, a firm that specializes in providing email marketing services from production to deployment. He has worked directly with Fortune 500 companies, retail giants, nonprofits, SMBs and government bodies in all facets of their email marketing services and marketing automation programs for almost 2 decades. Chris’s success track record covers building email programs at competitive prices and using data-driven strategies to turn around underperforming accounts.

