Showing posts with label a/b testing.

QR Code Campaigns

QR codes have been around for several years, having gotten their start in Japan. These days, the codes have become an inexpensive way for marketers to track the progress and ROI of their mixed-media campaigns across print (direct mail, magazines, BRCs, business cards, coupons), TV, POS (e.g., in-store displays, billboards), mobile, email, and web marketing. As long as you have a mobile device with a built-in camera that can download a QR code reader app, you and your customers are good to go.
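Tracking each medium separately really just means giving each medium its own destination URL behind its code. As a rough sketch (the helper name and campaign values below are made up for illustration), a few lines of Python can stamp standard UTM parameters onto a landing-page URL so every scan is attributable to the channel that drove it:

```python
from urllib.parse import urlencode

def tracking_url(base, source, medium, campaign):
    """Build a campaign-tagged URL to encode into a QR code or short link."""
    params = {
        "utm_source": source,      # e.g. "postcard", "facebook", "banner"
        "utm_medium": medium,      # e.g. "print", "social", "display"
        "utm_campaign": campaign,  # one name shared across the media mix
    }
    return base + "?" + urlencode(params)

# One URL (and thus one QR code) per channel, so scans roll up by medium
# in whatever web analytics tool reads the UTM tags.
print(tracking_url("https://example.com/signup", "postcard", "print", "salad-club"))
```

The same idea is how a campaign like Dole's can use "separate codes" for direct mail, Facebook, and banner ads: same landing page, different tags.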

Here's a simple QR generator. Learn more about its developers here.

Some notable marketing campaigns using QR codes:

Dole Salad Mobile Club
  • Campaign specs: 40k direct mail postcards with a QR code and short code for texting, Price Chopper database of users with an affinity for pre-made salads, separate codes used for Facebook and online banner ads 
  • A/B Testing: different coupon redemption requirements for current versus time-lapsed participants
  • Goal: Increase mobile subscriptions among Price Chopper shoppers with an affinity for pre-made salads
  • Consumer reward for signing up: discount coupon, holiday recipes, sweepstakes entry to win a $500 Price Chopper gift card
  • Next steps: national campaign via Valassis and News America mail programs
Calvin Klein Jeans
  • Campaign specs: three billboards (downtown NY, Sunset Blvd in LA), 40-second commercial that users can share with Facebook/Twitter networks
  • Goal: Introduce CK's 2010 Fall Jeans
  • Consumer reward for signing up: a shareable mobile/web commercial 
MyToys.de
  • Campaign specs: POS posters in public areas with themed and colored Lego codes directing users to MyToys.de's website
  • Goal: Increase web traffic and online store revenue for LEGO product line
  • Results: 49% increase in inbound web traffic; twice as many LEGO boxes sold compared to non-QR-coded marketing campaigns
  • Watch their campaign wrap-up video
TATmobile

A/B testing, is it worth the effort?

Industry: Banking

This year we've run a few A/B tests of subject lines alone for B2B e-mail campaigns. The "standard" way we split the campaign audience was to halve the dataset: the first half of the e-mail addresses received Subject Line A, the second half received Subject Line B. This got me thinking that maybe customers will read just about anything we send them, and that if our content is unclear they'll hit reply and/or call their salesperson. It also occurred to me that the subset with higher campaign readership might simply reflect how clean that part of the dataset was.
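That last worry is exactly what a first-half/second-half split can't rule out: if the file is sorted in any meaningful way (oldest addresses first, say), the two halves aren't comparable. A minimal sketch of a randomized alternative, assuming the list fits in memory:

```python
import random

def random_split(addresses, seed=42):
    """Split a mailing list into two equal-sized random halves.

    Unlike taking the first and second half of the file as-is,
    shuffling first removes ordering bias (e.g. older, staler
    addresses clustered at the top of the dataset).
    """
    pool = list(addresses)
    random.Random(seed).shuffle(pool)  # fixed seed -> reproducible split
    mid = len(pool) // 2
    return pool[:mid], pool[mid:]

group_a, group_b = random_split(f"user{i}@example.com" for i in range(10))
```

With a fixed seed the split is reproducible, so the same two groups can be pulled again for follow-up metrics.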

The customer population used for all three broadcasts is the same dataset.

March
E-mail Focus: Product

Test A: NINA/No Doc - $750K to 80% LTV
Test B: NINA/No Doc - Non-Traditional Solutions

May
E-mail Focus: Product

Test A: New 1-Month LIBOR for the Building Season
Test B: New Construction Solutions

July
E-mail Focus: Technology Enhancement

Test A: Private Label Marketing A Click Away
Test B: Let Us Market for You

The test results are as follows:

Month   Test  E-mails sent  1-wk opens  1-wk open rate  3-wk opens  3-wk open rate
March   A     21,019        9,809       47%             10,450      50%
March   B     21,022        9,286       44%             9,840       47%
May     A     22,024        7,766       35%             8,152       37%
May     B     22,028        9,563       43%             10,153      46%
July    A     22,600        8,698       38%             9,580       42%
July    B     22,604        9,468       42%             10,330      46%

You might infer that a 3-point difference in open rate isn't meaningful. In market research, where our audience sizes were much smaller, a swing of at least 8-10 points was noteworthy, but not 3 points. Strictly speaking, at this population size even a 3-point gap is statistically detectable; it just isn't a difference anyone here would act on. Maybe if we did consumer marketing and broadcast to millions of addresses per month this would matter more, but I digress. July is about the same, with a 4-point gap that isn't especially compelling either.
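For what it's worth, the usual way to check this is a two-proportion z-test. A plain-Python sketch, plugging in only the March 3-week counts from the results above:

```python
from math import sqrt, erf

def two_prop_z(opens_a, n_a, opens_b, n_b):
    """Two-proportion z-test for open rates; returns (z, two-sided p-value)."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)      # combined open rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # standard normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# March, 3-week opens: Test A vs Test B
z, p = two_prop_z(10450, 21019, 9840, 21022)
print(f"z = {z:.2f}, p = {p:.2g}")
```

At roughly 21,000 addresses per arm the test flags even a 3-point gap as significant at the 95% level, which is exactly why a statistically detectable difference and a difference worth acting on are two different things.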

So what happened with the May campaign? Perhaps it isn't data related at all. Spring through fall is when a lot of home construction and remodeling goes on, and a product with good rates for only its first month isn't as appealing as, say, a more general umbrella of solutions offered to a customer.

One thought in the aftermath of the May campaign was that dataset B simply responded more favorably to campaign broadcasts. I might be inclined to believe that, except that for the July A/B campaign we swapped the halves: the second half of the population received campaign A and the first half received campaign B. As the results show, the audience didn't care one way or the other.

Analysis: Inconclusive. The only good way to measure e-mail campaign success is to tie the data to quantitative business metrics: how much loan volume a broadcast campaign generated, how much of the media mix was used to promote a particular product, how many calls the campaign drove. ROI isn't necessarily a good metric anymore, especially when your company broadcasts in-house. It costs us almost nothing to broadcast, aside from the time the business units spend putting the campaign content together and the time I spend formatting it into HTML/AOL campaign templates and pulling the data for broadcasting.

So, is it worth the effort? No. But that isn't going to keep us from testing this process with future campaigns.

A/B Testing for E-mail Campaigns

A/B testing, also called an A/B split, is the simplest method of testing elements within your e-mails or on your Web site. The target audience is divided into two groups: one group receives the original version of whatever you are testing, and the other sees an alternative version in which only one element has changed. Then the results are tracked. With e-mail marketing, commonly tested components include bonus gifts, coupons, P.S. messages, guarantees, opening and closing sentences, the from field, calls to action, opening greetings, type styles, layout elements, and graphic images. This form of testing is used in both business and consumer-driven campaigns.
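One common way to do the division deterministically (a sketch, not necessarily how any particular broadcast tool works) is to hash each address, so a given address always lands in the same group no matter how the list file happens to be sorted:

```python
import hashlib

def ab_group(email, salt="subjectline-test-1"):
    """Assign an e-mail address to group A or B deterministically.

    Hashing the address (plus a per-test salt) keeps the assignment
    stable across re-sends; changing the salt for the next test gives
    a fresh split that is independent of the previous one.
    """
    digest = hashlib.sha256((salt + email.lower()).encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

groups = [ab_group(f"user{i}@example.com") for i in range(1000)]
```

Because the hash is effectively random but repeatable, the two groups come out close to 50/50 without anyone having to physically split the file.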

My company did its first A/B e-mail campaign test last week. The only element that changed was the e-mail subject line. We used:

Test A: NINA/No Doc - Non-Traditional Solutions
Test B: NINA/No Doc - $750K to 80% LTV

On average, our audience open rate for product e-mails is about 45%; that's not great and could use some improvement. The distribution list was split 50/50. After 5 calendar days there's no discernible difference between the subject lines, and the combined open rate is 42%. Having never worked in the mortgage industry before, I find that much of the e-mail content we broadcast doesn't make much sense to me and is so banking-specific that I can only hope our customers are getting value from it.