Friday, January 15, 2010

Learn to Love Testing

We all know how important it is to make the most of the money we have and spend it wisely. That said, you may not think this is the right time to test new creatives to see what works best for you and your readers. But testing new creatives is the first step toward improving your email marketing skills and creating more productive campaigns. Ultimately, this should (hopefully!) increase your revenue - and that prospect alone should be enough to persuade you that testing is very, very important.

Where do I get started? First, identify and set your goals - do you want to increase performance? If so, which aspect of it? Clicks should be your primary focus (assuming, of course, you're using the right ESP to get your mail delivered). Understand the needs of your business and the needs of your clients. Recession or no recession, your readers don't have hours of free time during the day to read content-heavy emails, so keep your content short and to the point.

A/B split campaigns are a great way to test the effectiveness of one campaign against another. Here's how you do it -
    - Ensure you've got a list big enough to evenly split for testing and to get a realistic picture of which campaign had a better performance

    - Create 2-3 (or more) versions of the same creative/offer using different verbiage or images (I'm sure you've heard this before, but I'll say it again -- when creating campaigns, don't forget the 40% to 60% text-to-HTML ratio)

    - Choose the subject lines next - keep in mind that your subject line is the identifier of your mailing; it's what prompts readers to open (or not open) your email

    - Make sure to run Delivery Testing prior to sending out the campaign. This will help you understand any flags that rendering or delivery may cause prior to scheduling the mailing. If you need further clarification, please feel free to contact us at (888) 732-8090 and we'll be happy to assist you in understanding the process
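The even-split step above can be sketched in a few lines of code. This is a minimal illustration, not KobeMail's actual implementation - the function name `ab_split` and the example addresses are made up for demonstration:

```python
import random

def ab_split(recipients, n_variants=2, seed=42):
    """Shuffle the list, then deal it into evenly sized test groups."""
    pool = list(recipients)
    random.Random(seed).shuffle(pool)  # fixed seed keeps the split reproducible
    # Deal shuffled names round-robin into n_variants groups
    return [pool[i::n_variants] for i in range(n_variants)]

emails = [f"user{i}@example.com" for i in range(100)]
group_a, group_b = ab_split(emails, 2)
print(len(group_a), len(group_b))  # 50 50
```

Shuffling before splitting matters: if your list is sorted by signup date or domain, a naive first-half/second-half split would compare unlike audiences rather than unlike creatives.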
Another very effective tool for measuring and testing your creatives is an Nth sampling campaign. It extracts a small sample of users from your overall list and lets you define the interval at which names are selected. For example, an every-10th-name selection from a 100-user list will bring back 10 users for your testing. KobeMail allows you to create not only Nth sampling campaigns but a combination of Nth sampling and A/B split testing as well. Once you identify the best-performing subject line and creative, a follow-up campaign can be sent to the rest of the users (excluding the 10 who have already seen the creative).
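The every-Nth-name selection described above amounts to simple list slicing. A minimal sketch (the helper name `nth_sample` and the sample addresses are hypothetical, not a KobeMail feature name):

```python
def nth_sample(recipients, n):
    """Select every nth name from the list (the nth, 2nth, 3nth, ...)."""
    return recipients[n - 1::n]

users = [f"user{i}@example.com" for i in range(1, 101)]
sample = nth_sample(users, 10)          # 10 users out of 100
remainder = [u for u in users if u not in set(sample)]  # the 90 for follow-up
print(len(sample), len(remainder))  # 10 90
```

The `remainder` list is the audience for the follow-up send once the winning creative is identified, mirroring the exclusion described above.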

After the campaign has been successfully deployed, analyze and compare the reporting for the different splits. Don't just measure the outcome of the split campaign - compare the reporting with that of the previous campaign as well. KobeMail clients can easily run a campaign comparison report along with purchase tracking (if purchase tracking is set up). You should be looking at the following for your campaigns:
    - Opens - this will give you an idea of which subject line/from name works better. Don't read too much into this metric, though: an open is only recorded when the user downloads images. In text-only campaigns, opens are close to impossible to determine unless the user clicks a URL in the email

    - Clicks - this will give you a better understanding of the content your readers respond to most favorably. Refer to a previous blog post on calls to action for more info

    - Compare the conversion rates from previous campaigns you sent out for the same product/offer - see which campaign had a better performance

    - Check to see if a certain campaign increased traffic to your website

    - Look at the percentage of received mailings - did one campaign seem to get blocked more? Did it have more images or a risky subject line?

    - If you're selling a product/service and have purchase tracking, compare the revenue across the campaigns. Did any particular offer work better than another? Did placing a certain link at the top or bottom of your creative generate better revenue? Did readers buy more when you placed links on the left versus the right?
Comparing the results of test campaigns will help you better understand your reader base and identify the changes that could benefit your future mailings.
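Putting the reporting checklist above into numbers is straightforward: normalize each split's opens, clicks, and revenue by the number of emails sent so the splits are comparable. This sketch uses made-up stats purely for illustration - the function name and figures are not from any real campaign:

```python
def campaign_report(stats):
    """Turn raw campaign counts into per-email rates for comparison."""
    sent = stats["sent"]
    return {
        "open_rate": stats["opens"] / sent,
        "click_rate": stats["clicks"] / sent,
        "revenue_per_email": stats["revenue"] / sent,
    }

# Hypothetical results for two splits of 50 recipients each
split_a = {"sent": 50, "opens": 15, "clicks": 5, "revenue": 120.0}
split_b = {"sent": 50, "opens": 20, "clicks": 9, "revenue": 210.0}

for name, stats in [("A", split_a), ("B", split_b)]:
    r = campaign_report(stats)
    print(f"{name}: open {r['open_rate']:.0%}, "
          f"click {r['click_rate']:.0%}, "
          f"${r['revenue_per_email']:.2f}/email")
```

Comparing rates rather than raw counts is what makes the comparison fair when splits (or past campaigns) were sent to lists of different sizes.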

Author: Roopal Sharma
Editor: Courtney Dillsworth
