Column by VerticalResponse CEO/Founder Janine Popick, Inc.com "Basement to Boardroom" April 23, 2012
For me, measurability is very important when it comes to marketing. I keep a close eye on how much money we're spending on our marketing efforts and make sure that we recoup our spend, and then some, in return. If the math isn't right, we adjust. I also want to know which leads aren't converting into customers and why, so that we can optimize our performance.
To accomplish this, we do a ton of testing at my email marketing company, VerticalResponse. Because we provide self-service online marketing tools for SMBs, most of our advertising and marketing is done online and via email. This gives us plenty of opportunities for testing, tracking, and data collection. Interestingly, many of our tests end with surprising results, which shows you should never assume anything! We tell our customers that they should always be testing their email campaigns and fine-tuning their efforts, and we definitely practice what we preach.
Here's a glimpse at three tests we've done with our own email marketing campaigns. You might find some takeaways that could be useful in your own marketing efforts.
In this test we had an identical offer and message in the email, but we changed the text within the call-to-action (CTA) button. Half of our test recipients got a button that said "Download the Guide" and the other half got "Get the Guide."
We originally assumed that the word "download" would receive fewer clicks, since it sounded more formal. Good thing we ran the test: the button with the word "download" actually outperformed "get," resulting in 2 percent more clicks and 43 percent more revenue!
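The mechanics of a split test like this are simple: randomly divide the list in half, send each half a different variant, and compare click rates. Here's a minimal sketch in Python; the recipient counts and click numbers are made up for illustration, not VerticalResponse's actual data:

```python
import random

def ab_split(recipients, seed=42):
    """Randomly split a recipient list into two equal test groups (hypothetical helper)."""
    shuffled = recipients[:]
    random.Random(seed).shuffle(shuffled)  # fixed seed so the split is reproducible
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

def click_rate(clicks, sends):
    """Fraction of recipients who clicked."""
    return clicks / sends

# Made-up numbers: 10,000 recipients, 5,000 per variant.
group_a, group_b = ab_split(list(range(10_000)))
rate_download = click_rate(510, len(group_a))  # variant A: "Download the Guide"
rate_get = click_rate(500, len(group_b))       # variant B: "Get the Guide"
print(f"Download: {rate_download:.1%}, Get: {rate_get:.1%}")
```

Randomizing the split (rather than, say, splitting alphabetically) keeps the two groups comparable, so any difference in clicks can be attributed to the button text rather than to who landed in which group.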
We put together a list of customers who had signed up for a free VerticalResponse trial but hadn't activated their account. Two emails were sent out. In version A, customers were offered a discount if they activated their account and "liked" us on Facebook. In version B, no offer or incentive was included and customers were simply encouraged to activate their account.
Surprisingly, version B, with no offer or incentive, received 32 percent more opens and clicks. Perhaps version A required too many actions that weren't worth the incentive? What if we offered 75 percent off?
Usually, you'd find the CTA after some sort of description of what the email is about. One day we thought: Why not make the CTA the first thing the recipient sees when the email is opened, above the so-called fold?
In version A, the CTA ("Save $30 Now!") sat at the bottom; in version B, the CTA was on top.
By tagging each banner image with a "referred-by" code, we were able to tell the two versions apart in our reporting tool. Our results showed that both versions accumulated the same number of clicks, but version B had a higher click-through rate (the number of clicks divided by the number of opens) and brought in more than 33 percent more revenue.
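Click-through rate here is just clicks divided by opens, so two versions with identical click counts can still have very different CTRs if one was opened less often. A quick sketch with made-up numbers (not the actual campaign figures) shows the arithmetic:

```python
def click_through_rate(clicks, opens):
    """CTR as defined above: clicks divided by opens."""
    return clicks / opens

# Hypothetical figures: both versions get 400 clicks, but version B had fewer opens.
ctr_a = click_through_rate(400, 2000)  # version A
ctr_b = click_through_rate(400, 1600)  # version B
print(f"A: {ctr_a:.0%}, B: {ctr_b:.0%}")
```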
As you can see, testing is an integral part of virtually any marketing campaign if you want to continuously improve the quality of your communications and boost conversions. So try it out; it may do wonders for your revenue!
Have you run any tests on your marketing efforts? Did you find anything surprising?