Know Your S#*!: Maximize Web Conversion with A/B Testing

January 10, 2011

Long before the web existed, savvy direct marketers understood how to identify the copy, graphics, even the paper that resulted in maximum return for marketing expense. They relied on experience and personal observation, but that wasn’t all. Successful marketers test their materials! They test each element of marketing packages, alone and in combination.

Testing makes the difference between businesses that thrive and businesses that go under.

You may have heard web or traditional marketers refer to “A/B,” “split” or “multivariate” tests. All of these are names for a formal approach to trying alternative materials with real, live prospects and observing the results. Here’s how it is done:

Prepare two versions of a landing page (this technique is equally valid for testing email design and copy). Ideally the two will differ in only one element, such as two different images in the same spot, two alternative versions of copy in otherwise identical layouts, or perhaps “Buy Now” buttons in two different colors. The advantage of varying only one element is that after the test, you will have a clear understanding of what element affected conversion and how.

It is also common to use A/B testing to compare two quite different alternatives. There is nothing wrong with this, but while you will end up knowing which design has the best conversion rate, you won’t know which elements made the difference.

Often, one of the two alternatives will be the page you are currently using. A page, piece of copy, or an email that is your current standard may be referred to as the “control.” The alternative, which includes some new approach, is the “test.”

Serve each alternative design to a randomly selected sample of visitors. (You’ll need web tools capable of doing this. Ask your web team whether you have the proper setup for A/B and multivariate testing. If they don’t seem to know what you are talking about, it’s time to get them some training.) “Randomly selected” means that each visitor has an equal chance of viewing either Version A or Version B. If you serve Version A to the first 1,000 visitors in a row, then Version B to the next 1,000, that’s not random, and a non-random sample can lead you to a misleading conclusion.
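As a minimal sketch of what random assignment means in practice (the function names and the experiment label here are illustrative, not from any particular testing tool): each visitor gets an independent coin flip, and in real deployments the assignment is usually made "sticky" so a returning visitor keeps seeing the same version.

```python
import hashlib
import random

def assign_version(visitor_id):
    # Independent 50/50 coin flip per visitor -- unlike serving
    # Version A to the first 1,000 visitors and B to the next 1,000,
    # this does not confound the test with time-of-day or
    # traffic-source effects.
    return "A" if random.random() < 0.5 else "B"

def sticky_assignment(visitor_id, experiment="landing-page-test"):
    # Hashing a stable visitor ID gives a random-looking but
    # repeatable assignment, so the same visitor always sees the
    # same version of the page.
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

In practice the sticky assignment would be stored in a cookie or keyed to a login, but the hashing approach above works even when no server-side state is available.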

Track conversions resulting from each of the alternative views. (Again, you must have the proper tools.) You will need to know how many views occurred for each version of the page, and how many conversions. If you can obtain more detailed data, such as purchase size, items selected, or time to complete the purchase, you can perform even more sophisticated and useful research. For example, if one landing page resulted in a moderate number of sales, but the purchase size was large, that might be more desirable than an alternative which yielded many more sales, but for lower amounts.
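The bookkeeping the paragraph above describes can be sketched in a few lines (the data structures and function names are illustrative; a real setup would log views and purchases server-side or in an analytics tool):

```python
# Per-version tallies of views, conversions, and purchase revenue.
counts = {
    "A": {"views": 0, "conversions": 0, "revenue": 0.0},
    "B": {"views": 0, "conversions": 0, "revenue": 0.0},
}

def record_view(version):
    counts[version]["views"] += 1

def record_conversion(version, purchase_amount):
    counts[version]["conversions"] += 1
    counts[version]["revenue"] += purchase_amount

def summarize(version):
    # Conversion rate alone can mislead: a page with fewer sales but
    # larger orders may be the better performer, so report average
    # order value alongside the rate.
    c = counts[version]
    rate = c["conversions"] / c["views"] if c["views"] else 0.0
    avg_order = c["revenue"] / c["conversions"] if c["conversions"] else 0.0
    return {"conversion_rate": rate, "avg_order_value": avg_order}
```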

Use proper statistical analysis methods to compare the results of the alternatives in your test. Statistical methods provide a consistent and effective way to weed out small differences in results which are merely due to natural variability from visitor to visitor. Some web tools have this capability built in, but if not, you can enter data into an appropriate statistical analysis product. In some cases, it may even be practical to crack open your old statistics textbook, get the formulas and do the calculations by hand, but that’s not usually the best long-term solution.

Even if you have terrific web tools that do all the analysis for you, you’ll have more confidence discussing test results if you familiarize yourself with the mechanics of statistical analysis. A good review for this kind of testing is to read the basics on the “chi-square test of independence.” This topic is covered in all introductory statistics books. You can use the book you kept from your school days, or stop by the library and pick up any statistics book that appeals to you. This method is so widely used that you will have your choice of many books that include it.

Alternative designs can result in remarkably different conversion rates. Experienced direct marketers know that while they may have good ideas of what type of copy and layout to use, they must test to determine what works best for particular offerings and markets.

Testing provides information about what works. To realize improved results from testing, you must take action. When you find that a test alternative performs better than your control, replace the control with the alternative! Then track conversions to confirm that the results are consistent with your expectations. Continue to create new alternatives, and perform tests to see if you can obtain incremental improvements in returns. Optimum sales depend on an ongoing program of testing alternatives and revising marketing materials.

Nothing compares to the experience of testing and improving returns in your own business. But if testing is new to you and you’d like to see some concrete examples and get ideas for what you might test yourself, here are some sources for ideas.

Marketing Experiments – a research lab which shares many real life examples.

http://www.marketingexperiments.com/

10 Factors to Test that Could Increase the Conversion Rate of your Landing Pages – an article which describes many of the landing page elements that can affect conversion rates.

http://www.wilsonweb.com/conversion/sumantra-landing-pages.htm

Hidden Secrets of the Amazon Shopping Cart – a case study by Bryan Eisenberg which discusses the evolution of the Amazon.com shopping cart.

http://www.grokdotcom.com/2008/02/26/amazon-shopping-cart/


©2011 Meta S. Brown