Split Testing: How to A/B Test Cold Sales Email Templates

By Sujan Patel

Cold email templates save you the time you’d otherwise spend writing out individual messages by hand. But all templates are not created equal.

How can you tell whether the messages you’re sending are as effective as they could be?

The answer is A/B testing – a process that pits two versions of the same message against each other to determine which variation drives more of the outcomes you want (a simple way to score the winner is sketched after the list below). In the case of cold sales email in particular, A/B testing can help you get:

  • More opens
  • More link click-throughs
  • More replies
  • More conversions
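
Conceptually, deciding which variation "wins" comes down to comparing two rates. As a rough illustration only – the counts and the function below are hypothetical, not figures from this article or the API of any particular tool – here is how that comparison might look in Python using a standard two-proportion z-test:

```python
# Minimal sketch: compare reply (or open/click) rates of two email template
# variants with a pooled two-proportion z-test. All numbers are made up.
from math import erf, sqrt

def two_proportion_z_test(successes_a, total_a, successes_b, total_b):
    """Return (lift of B over A, two-sided p-value)."""
    rate_a = successes_a / total_a
    rate_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return rate_b - rate_a, p_value

# Hypothetical example: template A got 40 replies from 500 sends,
# template B got 62 replies from 500 sends.
lift, p = two_proportion_z_test(40, 500, 62, 500)
print(f"Reply-rate lift for B: {lift:.1%}, p-value: {p:.3f}")
```

If the p-value is small (commonly below 0.05) and the sample is reasonably large, you can be more confident the winning variation isn't winning by chance.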

Sounds great, right? Yet, despite the clear benefits, nearly 70% of email marketers surveyed by YesWare don’t use A/B split testing in their campaigns.

YesWare respondents provided three main reasons for not using A/B testing:

  • They don’t understand its benefits
  • They don’t know what to test
  • They don’t have the tools to test

Let’s break down each of these issues and explore the solutions that will help anyone using cold email in their sales process take advantage of this powerful technique.

The Benefits of A/B Split Testing

To be 100% clear, if you’re sending cold sales emails, you can benefit from A/B split testing. Just take a look at the following case studies:

  • HubSpot used A/B testing to determine whether their audience responded better to emails that featured the company or a real person as the sender. Sending from a real person won, driving 0.53% more opens, 0.23% more clicks and 131 more leads.
  • An A/B test for Money Dashboard found that focusing on their business (with the subject line “Please put us out of our misery”) – rather than on recipients themselves – resulted in a 104.5% increase in opens for inactive subscribers and a 103.3% increase in clicks for active subscribers.
  • Wishpond A/B tested the impact of adding “You” to the subject lines of emails intended to boost sign-ups to their VIP demo. Ultimately, they found the subject line “Social Media Stats You Need to Know for 2014” resulted in 11% more opens than “Social Media Stats for 2014.”

While these aren’t exclusively examples of A/B testing on cold sales emails, they don’t need to be. What these – and the hundreds of other case studies published online – demonstrate is that split testing your sales emails can drive performance gains, no matter what you’re selling or what industry you operate in.

What to Split Test

Having dispensed with the argument that A/B split testing isn’t important, let’s look at the most common response to YesWare’s survey: “It’s valuable, but I don’t know what to test.”

Determining what to test in your cold emails can be challenging – but not necessarily for the reasons you might expect. Far from having nothing to test, marketers face a seemingly endless number of options, and that abundance can make moving forward more paralyzing than having nothing to test at all.
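
Whichever element you eventually choose, the mechanics of a fair split test are straightforward: change one variable, assign recipients to each variant at random, and keep everything else identical. The sketch below is purely illustrative – the function name, addresses, and subject lines are hypothetical placeholders, not part of this article or any specific sending tool.

```python
# Minimal, illustrative sketch of a single-variable split test setup:
# shuffle the prospect list and assign half to each subject-line variant,
# keeping the rest of the email identical across both groups.
import random

def split_test_assignments(prospects, variant_a, variant_b, seed=42):
    """Randomly assign each prospect to one subject-line variant."""
    shuffled = prospects[:]
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return (
        [(email, variant_a) for email in shuffled[:midpoint]],
        [(email, variant_b) for email in shuffled[midpoint:]],
    )

# Hypothetical usage with placeholder addresses and subject lines.
group_a, group_b = split_test_assignments(
    ["a@example.com", "b@example.com", "c@example.com", "d@example.com"],
    "Quick question about {company}",
    "Ideas for {company}'s Q3 pipeline",
)
```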

That said, just because you can test everything doesn’t mean you should. Instead, I recommend starting with tests on …
