Get Better Results with A/B Split Testing


There are many factors that can determine the success of an email marketing campaign: the subject line, the day and time it is sent, the product images, and the calls-to-action used. The only way to know what works best for your campaigns is to experiment with these elements. The most common way to do this is a method called A/B split testing.

Essentially, this involves testing variations of a specific element in a campaign to see which is the most effective in terms of user reaction. Two variations of the same email campaign get sent to a portion of the database, and whichever one receives the most opens or clicks gets sent to the remaining portion.

It’s used to determine which specific type of subject line or email layout works better for your database. For example, your particular database might react more positively to a subject line that says “Buy one get one free” rather than one that says “Special offer now on”. Alternatively, if you have several promotions running, you might want to test the order in which you lay out the promotions in the body of the email. This could have a big impact on your click through and conversion rates.

How A/B split testing works

TotalSend allows you to choose what size portion of your database you want to test with. Depending on the overall size of your database, you can use a percentage that will give you plenty of responses and still have a large amount left over to send the more successful email campaign to.

Starting with 20% of your database is ideal: 10% gets one version of your email campaign and the other 10% gets the alternative version, and the better-performing version is then automatically sent to the remaining 80% of your database.

Furthermore, depending on which element of the campaign you are testing, you can also select whether the winning email is determined by opens or by clicks.
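The mechanics described above can be sketched in a few lines of Python. This is purely an illustration of the splitting logic, not TotalSend's actual implementation or API; all names here are hypothetical:

```python
import random

def ab_split_test(subscribers, test_fraction=0.2):
    """Split off a random test portion, divide it between versions A and B,
    and return both test groups plus the remainder for the winning send.
    Hypothetical sketch - not TotalSend's real API."""
    shuffled = subscribers[:]        # copy so the original list is untouched
    random.shuffle(shuffled)         # test portions are selected at random
    test_size = int(len(shuffled) * test_fraction)
    group_a = shuffled[:test_size // 2]           # e.g. 10% gets version A
    group_b = shuffled[test_size // 2:test_size]  # e.g. 10% gets version B
    remainder = shuffled[test_size:]              # e.g. 80% gets the winner
    return group_a, group_b, remainder

# With 1,000 subscribers and the default 20% test portion:
a, b, rest = ab_split_test(list(range(1000)))
# len(a) == 100, len(b) == 100, len(rest) == 800
```

After the test send, whichever group's variant scores more opens (or clicks, depending on your chosen metric) would be the version sent to `rest`.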

A/B split testing ideas

Here are some of the most commonly used elements for A/B split testing:

  • Subject line – this is often the first thing a recipient on your database will see of your email. With A/B split testing, you can see which action words your database responds to more positively.
  • Content – this includes everything from headlines and text to offers and graphics. Play around with different ways to put across the information, or use new content you haven’t previously used – but remember: only one major change at a time.
  • Design – this refers to the template design itself, the order of the information within your email, the structure of your offers and how they correlate to the wording, as well as the placement of links, buttons and calls-to-action. It’s important to make links and buttons stand out so that readers will know where to click in order to convert. If you aren’t sure which of your special offers or news items will be of most interest to your audience, try variations of the order you rank them in.
  • From name and address – the most common practice is to use the company name, but consider instead the name of the manager, the owner of the business or whoever runs the email campaigns, to give it a personal touch – then use A/B split testing to see which one your audience responds to better.
  • Time of day – if the email lands in the recipient’s inbox at a time when they are at their computer and not too busy to look at the message, you’ll likely see increased engagement. Try out various times of day to see when you get the most responses. A useful starting point is to consider the type of email addresses your list contains – are they mostly company or free webmail addresses? Go even further by looking at country-specific domains such as .co.uk and .co.za to find out where your subscribers are mostly based, and optimise your send times accordingly.
  • Day of the week – this links quite strongly to time of day. You can use A/B split testing to work out whether people are more likely to read and react to your email on a Tuesday as opposed to a Friday. Considering the type of audience in your database, you can probably get an idea of what they’re likely to be doing on different days of the week.
  • Personalisation – should you call the recipient by their first name, or by title and surname? Depending on the situation, certain salutations might or might not be appropriate. Check whether you get better results with the more personal touch of “Hi Bob” or a more formal approach like “Dear Dr. Smith”.

Once this send-out is complete, the results will show which version performed best – a clear indicator of the effectiveness of the element in question. To understand exactly why one version of your email performed best, certain metrics apply to the particular element you’re testing:

  • Open rate – if you’re testing the subject line or from name, the open rate will tell you which option works best.
  • Click-through rate (CTR) – this helps you determine which of your layouts and designs gets better engagement.
  • Conversions on the landing page – this is important too, as it can illustrate which wording or offer works best in your campaigns, and whether the email is getting the correct message across and enticing people to convert once they’ve arrived on the landing page.

It’s important to note that the A/B split testing takes portions of your subscriber base selected at random, so you may want to run a few A/B split tests to get clearer reporting results.

It’s also vital that you only test one thing at a time. If you send out an email with a different subject line as well as changes to the body content, you won’t get a clear indication of what is causing the different open, click-through and conversion results. By changing one element at a time, you can measure its exact impact on your database.

A/B split testing is a powerful tool that can help you gain confidence in what you are doing. Once you’re certain the results are driving engagement positively, you can optimise with clarity and certainty that you’re making the right changes.