Testing, testing: The case for A/B email testing

We’re all aware that email marketing is one of the most cost-efficient and effective ways to reach your target audience. But if you’re not conducting A/B tests, you could be shortchanging your open rates, click-through rates and conversions. Creating two different versions (A and B) of your email and sending each to a different half of your database lets you easily see what resonates with your target audience and what doesn’t.
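
If your email platform doesn’t handle the split for you, a random 50/50 split is easy to do yourself. Here’s a minimal sketch in Python; the `split_list` helper and the addresses are purely illustrative, not tied to any particular email tool:

```python
import random

def split_list(subscribers, seed=42):
    """Randomly split a subscriber list into two halves for an A/B test."""
    shuffled = list(subscribers)            # copy so the original list stays intact
    random.Random(seed).shuffle(shuffled)   # fixed seed makes the split repeatable
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_a, group_b = split_list([
    "ann@example.com", "bob@example.com",
    "cam@example.com", "dee@example.com",
])
```

The random shuffle matters: splitting alphabetically or by signup date could bake a hidden bias into one of the groups.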

Treat it like science
A/B testing provides an objective way of evaluating message effectiveness in what is traditionally an intuitive creative process. As such, it’s important to approach your tests like a true science experiment. First, send out a well-executed eblast to serve as your benchmark, against which you’ll measure the results of your A/B test.

The next step is to create two new versions of the eblast that differ by a single change, so you can see how that change impacts the results. For example, you may want to experiment with two different subject lines, the placement of a button, the call to action, or simply the background color. What’s important to remember here is that for a true A/B test, you should only test one change at a time. That way you can clearly determine which change is influencing the target’s behavior.
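
To make the one-change rule concrete, here’s a hypothetical sketch: the two variants below are identical except for the subject line, so any difference in results can be attributed to that single element. The field names are made up for illustration:

```python
benchmark = {
    "subject": "Spring sale starts today",
    "cta_text": "Shop now",
    "background_color": "#ffffff",
}

# Variant A keeps the benchmark creative untouched;
# variant B changes ONLY the subject line.
variant_a = dict(benchmark)
variant_b = dict(benchmark, subject="Our biggest spring sale yet")
```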

See what pulls
Once you have these two new and different versions, deploy each to one half of your database and see how the results compare to your benchmark figures. Look at how many people opened each email, clicked through and converted on the call to action. This analysis will help you identify fallout points, which in turn will shape where and what you should focus on in subsequent A/B tests. From there, you can continue to modify the message or design in your quest for even better results. But remember, only one change at a time.
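
When you compare the two groups, some gap between the numbers is inevitable just from chance. One common way to judge whether a difference in open or click rates is meaningful is a two-proportion z-test; the sketch below is a generic statistical example with made-up counts, not a feature of any email platform:

```python
from math import erf, sqrt

def two_proportion_z_test(hits_a, sent_a, hits_b, sent_b):
    """Two-sided z-test: is the difference between two rates likely real?"""
    p_a, p_b = hits_a / sent_a, hits_b / sent_b
    pooled = (hits_a + hits_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return z, p_value

# Hypothetical results: 5,000 sends per variant
z, p = two_proportion_z_test(hits_a=900, sent_a=5000, hits_b=1020, sent_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real difference
```

The same test works at any step of the funnel: opens, clicks or conversions.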

Broader application
A/B testing isn’t just for eblasts. You can apply the same techniques to landing pages, web banners, paid search ads and more.

The bottom line
A/B testing makes sense for a number of reasons. It allows you to prove a point that may not have been supported by empirical data in the past, e.g., that this headline will pull better than that one. Plus, it adds a level of accountability to the process, which may, eventually, help you build a bigger interactive marketing budget. And who couldn’t use that?
