If you’ve been following this blog for a while, you know that we’re firm believers in testing everything you can to continually refine your marketing strategy. A/B testing your emails is one of the simplest things you can do to improve engagement, and ultimately, the ROI on the emails you send out.
What is A/B Testing?
A/B testing, also called split testing, is a controlled experiment between two variants, A and B. In email marketing, you can use this form of testing to compare two versions of a single campaign and identify which performs better on opens and clicks.
Choosing What to Test
Many elements of your email campaign can affect engagement. What you test may depend on the results you're looking for (e.g., opens vs. click-throughs). Below are some examples of things you might test.
- Subject Line: “Read Our Latest Blog Post!” vs. “A/B Testing Your Way to Better Emails”
- Email Layout: Two-column vs. three-column, or button placement
- Preview Text: “Sale ends tonight!” vs. “Don’t miss this sale!”
- Images: Objects vs. People
- Email Length or Headline: Short and sweet vs. Long and detailed
- Offer: Free Shipping vs. 30% off
- Call to Action: “Shop Now” vs. “Buy Now” or “Book an Appointment” vs. “Schedule a Call”
Setting Up an A/B Testing Process
It’s important to test just one piece of your campaign at a time to get accurate results. If you test multiple elements at once, you have no way of knowing which one caused the change.
Like any good scientific experiment, you want a control to compare against. To do that, everything in your A and B campaigns should be exactly the same except for your test element. For example, if you test just the Call to Action and one email gets more clicks than the other, you’ll know the CTA caused the difference.
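To make the comparison fair, the two groups should also be assigned at random, not by signup date or alphabetically. Here's a minimal sketch of that idea in Python; the function name `split_list` and the subscriber-list structure are hypothetical, not from any particular email tool:

```python
import random

def split_list(subscribers, test_fraction=0.2, seed=42):
    """Randomly split a subscriber list into equal A and B test groups.

    test_fraction is the share of the full list used for the test;
    the remainder is held back to receive the winning version later.
    """
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = subscribers[:]  # copy so the original list is untouched
    rng.shuffle(shuffled)
    half = int(len(shuffled) * test_fraction) // 2
    group_a = shuffled[:half]
    group_b = shuffled[half:half * 2]
    holdout = shuffled[half * 2:]  # gets the winning campaign later
    return group_a, group_b, holdout
```

Because the split is random, any difference in opens or clicks between group A and group B can be attributed to the element you changed rather than to who happened to land in which group.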
Keep notes on these tests over time, and you should be able to come up with some general rules of thumb that work for your particular audience. For instance, you may find that images with action in them lead to more conversions, or that subject lines with emojis lead to higher open rates. Eventually, you’ll be making all your campaign decisions based on actual data that tells you what your subscribers want.
If we had to pick an order for testing these variables, we’d suggest working from the top down: subject line first, then headline, then main image, and so on. That way, you can increase your open rate first. Then, once more people are opening your message, you can focus on testing how to get your subscribers to engage. Eventually you can extend this approach to testing elements on your website as well.
The whole point is to use A/B testing to work toward the perfect email campaign.
A/B Testing Tools
In the past, you had to run A/B tests like this manually, splitting lists yourself and working through the data to figure out the winning campaign. Thankfully, A/B testing features are now standard in most mainstream email marketing platforms.
These tools make the process simple and easy to understand, even if you’re still not quite sure how it all works. For example, Mailchimp’s A/B testing feature walks you step by step through selecting your test element (subject line, From name, email content, send time, etc.), what percentage of your list to test, and how long to wait before identifying a winner. Then, it’ll send the winning campaign to the rest of your list automatically!
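Under the hood, the winner-selection step boils down to comparing the two variants' rates on your chosen metric. Here's a hedged sketch of that logic in Python; the function names and the `min_lift` threshold are illustrative assumptions, not how any specific tool implements it:

```python
import math

def z_statistic(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-statistic for the difference in open rates.

    A larger absolute value means the observed difference is less
    likely to be random chance (|z| > 1.96 is roughly 95% confidence).
    """
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

def pick_winner(opens_a, sent_a, opens_b, sent_b, min_lift=0.0):
    """Return 'A', 'B', or 'tie' based on raw open rate.

    min_lift is the smallest rate difference worth acting on; real
    tools also check significance (see z_statistic) before declaring
    a winner.
    """
    rate_a = opens_a / sent_a
    rate_b = opens_b / sent_b
    if abs(rate_a - rate_b) <= min_lift:
        return "tie"
    return "A" if rate_a > rate_b else "B"
```

For example, if variant A got 150 opens out of 1,000 sends (15%) and variant B got 120 out of 1,000 (12%), `pick_winner` returns `"A"`, and the remaining subscribers would receive version A.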
If you see the benefit of A/B testing but just can’t seem to find the time or willpower to set up these kinds of campaigns, we can help! Get in touch with us today!