We all have things we habitually do that are "good" for us, but offer no immediate benefit. For example, I had the bright idea to join a boot camp class between 5:30 and 6:30 a.m. every Monday, Wednesday and Friday. I am two weeks in, and the only immediate "benefit" has been a pulled muscle or random ache. But alas, I march on, knowing that the results will come!
Sometimes, when the alarm rings at the crack of dawn, I consider not going (much like the relationship you likely have with your email test plans). But then I remember that the immediate benefits may seem minuscule, but the long-term gain can be huge.
I know it’s tough to plan a test; it takes a lot of work to generate a hypothesis, create assets and content accordingly, structure the test, send it, and measure and analyze the results. But each little insight you gain adds up to a better understanding of your subscribers and their preferences. If you’re committed to testing (and you should be), there is always something you can test, even in the simplest of campaigns.
Easy Things to Test
Subject Lines—Every email has one. Try a different approach on a small percentage of your list (a quick way to carve out that slice is sketched after this list). I know email marketers like to get creative, but would a simple “Free Shipping Through Tuesday” get a higher open rate? Test it.
Sending Time—Think you know the best time of day to send? It may have changed, or it may be different when the sun’s out (I know it is for me). It may even be different for offer emails versus content emails. Test, and see what you find.
Audience—Even if you send to your entire database in one batch, there is an opportunity to create a segment within that audience that can tell you something about them and how they engage with your email. Recent subscribers vs. older subscribers, men vs. women, email clickers vs. non-clickers: all are segments you can leverage to garner a greater understanding.
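If it helps to make the "small percentage of your list" idea concrete, here is a minimal sketch of a random split in Python. The subscribers.csv file, its columns, and the 10% test slice are assumptions for illustration; most email platforms can also do this split for you.

```python
# Minimal sketch: carve a random 10% test slice out of a subscriber list.
# The CSV filename and the 10% split are assumptions; adjust to whatever
# your email platform exports.
import csv
import random

TEST_FRACTION = 0.10  # send the alternate subject line to this share of the list

with open("subscribers.csv", newline="") as f:
    subscribers = list(csv.DictReader(f))

random.seed(42)        # fixed seed so the split is reproducible
random.shuffle(subscribers)

cutoff = int(len(subscribers) * TEST_FRACTION)
test_group = subscribers[:cutoff]      # gets "Free Shipping Through Tuesday"
control_group = subscribers[cutoff:]   # gets your usual subject line

print(f"{len(test_group)} in test, {len(control_group)} in control")
```

Because the shuffle is seeded, you can rerun the script and get the same two groups, which makes it easier to match opens back to the right subject line.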
A Little Tougher
Call to Action—Test its location in the email, the language, even the color or size of the text or image. Remember to test one thing at a time so you isolate what is causing the difference (a quick way to check whether that difference is more than noise is sketched after this list).
Content—Does a certain style get more clicks or conversions than another? Are more words better? More images?
Personalization—Just because we have tools that can insert a subscriber’s name, profile picture, or most recent purchases into an email doesn’t mean that technology will result in better stats. Or does it? Find out.
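Once a single-variable test like the call-to-action comparison has run, the hard part is deciding whether the lift is real. Here is a minimal sketch of a two-proportion z-test in Python; the click and send counts are made-up placeholders, so substitute the numbers from your own report.

```python
# Minimal sketch: is the difference in click rate between two call-to-action
# variants bigger than chance? The counts below are made-up placeholders.
from math import sqrt, erf

def click_rate_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test; returns the z score and two-sided p-value."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# Variant A: button above the fold; Variant B: text link at the bottom.
z, p = click_rate_z_test(clicks_a=240, sends_a=5000, clicks_b=190, sends_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # small p suggests the lift isn't just noise
```

As a rough rule of thumb, a p-value under 0.05 suggests the difference between the two variants is unlikely to be chance alone.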
Dedicated Testing
Layout—Have you kept the same template for four years? Try a new one with the same content (for a few campaigns to minimize the newness factor) and see how it does. Have you been ordered to change your template? Test a few alternatives to see what resonates best with your subscribers.
Entire Campaigns—Say you are about to launch a new welcome campaign for new subscribers. Will it really result in more engagement? Launch the welcome campaign, but create a control group of about 20% of your new subscribers who will receive the previous version (a quick way to bucket them is sketched below). Run it for a month, and see which does better. This type of test is especially compelling for higher-ups who like to see that an investment in a new welcome campaign was worth it. You probably know it will be, but test it and prove it. You can do the same thing with abandoned-cart campaigns, re-engagement campaigns, and even cross-channel campaigns.
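For the 20% holdout described above, here is a minimal sketch of one way to bucket new subscribers as they sign up. The assign_welcome_version helper and the hashing approach are assumptions for illustration; many email platforms have a built-in control-group or holdout feature that does the same thing.

```python
# Minimal sketch: hold back roughly 20% of new subscribers as a control group
# that keeps getting the old welcome campaign. The helper name and the 20%
# holdout are illustrative assumptions, not a standard API.
import hashlib

HOLDOUT = 0.20  # share of new subscribers who get the previous welcome series

def assign_welcome_version(email: str) -> str:
    """Deterministically bucket a subscriber so repeat signups land in the same group."""
    digest = hashlib.sha256(email.lower().encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash to a value in [0, 1]
    return "old_welcome" if bucket < HOLDOUT else "new_welcome"

# Example: tag a few signups as they come in, then compare engagement after a month.
for email in ["ann@example.com", "bob@example.com", "cam@example.com"]:
    print(email, "->", assign_welcome_version(email))
```

Hashing the email address (rather than drawing a random number each time) keeps a subscriber in the same group even if the assignment runs more than once.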
Keep in mind that insight from testing comes slowly. A small lift in an open rate or conversion rate may seem insignificant, but each little lift adds up. Also keep in mind that the people you’re testing are, in fact, people. They change their minds. Just because you’ve tested before and found a winner doesn’t mean that method will win if tested again.
This is just the start of what you can consider testing. I'm sure you're boiling over with new ideas as my tired mind and achy body sit at my desk, contemplating whether I really am going to get up at 5 a.m. tomorrow morning to get my a$$ kicked by the drill sergeant. What else have you tested in your campaigns? What else would you like to test?