Published on May 7th, 2012 | by My Truong
The Results Are in! Takeaways from VR Email Tests: Offer Edition
We’ve given you quite a few ideas lately on the different components of your email campaigns that you could test (CTAs, personalization, etc.), but we had yet to give you any tips on how you could mix up your copy/content itself to generate better results. Time to remedy that!
Tale of the Tape
We run sales promotions on a regular basis here at VerticalResponse, but it’s proven tricky to find our customers’ sweet spot: which offer appeals to them most and actually leads them to pull the trigger on a purchase.
Looking back at historical data, we discovered, somewhat surprisingly, that two very different offers had been the most successful so far: we’ll call them A and B, with A presenting a lower barrier to entry and B sitting on the higher end of the spectrum but offering a bigger reward. For example, offer A would be “buy one, get one free,” while B would be something like “buy $100 worth of supplies, and get a year’s worth of free service.” We had also run various offers in between, but for some reason people seemed to respond better to these extremes.
And the Winner is…
We chose to conduct an A/B test, splitting our mailing list into two segments: one would receive offer A, the other offer B. Both groups were sent two waves of email, the first announcing the sale and the second a reminder about a week later, shortly before the sale ended. The from label, subject line, pre-header text and imagery were essentially identical save for the offer itself (obviously). Here are the combined results:
– Offer A: 8,456 recipients, 9.69% open rate, 0.36% CTR, 10 conversions, $334.61 revenue
– Offer B: 8,627 recipients, 10.21% open rate, 0.34% CTR, 6 conversions, $104.55 revenue
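If you want to go one level deeper than the raw stats above, a quick sketch in Python shows how you might derive comparison metrics (conversion rate, revenue per recipient, average order value) from these numbers — the figures are the ones reported in this post, and the metric names are just illustrative choices:

```python
# Numbers reported in the post for each offer.
offers = {
    "A": {"recipients": 8456, "conversions": 10, "revenue": 334.61},
    "B": {"recipients": 8627, "conversions": 6, "revenue": 104.55},
}

for name, o in offers.items():
    conv_rate = o["conversions"] / o["recipients"] * 100   # conversions per 100 recipients
    rev_per_recipient = o["revenue"] / o["recipients"]     # revenue earned per email sent
    avg_order = o["revenue"] / o["conversions"]            # average purchase size
    print(f"Offer {name}: {conv_rate:.3f}% conversion rate, "
          f"${rev_per_recipient:.4f} revenue per recipient, "
          f"${avg_order:.2f} average order value")
```

Revenue per recipient is a handy single number for comparing two sends of different sizes; here it comes out roughly three times higher for offer A than for offer B.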
So while the open and click rates were broadly similar, offer A generated two-thirds more conversions and over triple the revenue of offer B, despite its lower dollar amount. Version A by knockout!
Offer A was clearly more attractive than offer B to this particular audience, which seems logical given its lower barrier to entry. What’s more surprising is how much more revenue it generated, especially given the gap between the two offers, and how it led customers to buy in larger quantities than even offer B called for.
As always, further fine-tuning will be required. For instance, we could take a look at whether these results apply to all channels, the possible impact of seasonality, the age of the accounts that purchased, etc.
What have you been testing lately?
© 2012 – 2013, My Truong. All rights reserved.