Web Analytics + Customer Voice = 20/20 Vision.

by Andrew Lucyszyn on November 10, 2010

In this age when customers tweet their every impulse, and exquisite multivariate tests can generate thousands of permutations, there is no shortage of data inputs that can drive a business’s reactions to, well, nearly everything.

Web analytics balances a number of different disciplines, but unlike a #16 seed in the NCAA basketball tournament, it’s never one-and-done. A recent web analytics engagement at SIGMA put the spotlight on an outcome where the results of multivariate testing conflicted with what customers told us afterward.

Our client’s website existed primarily to drive traffic to the sign-up form for its membership service club, and the client was dissatisfied with the site’s conversions. In this case, a conversion meant that a site user completed the form and paid for membership online. In what should have been a simple A-to-B-to-C sequence of pages, lots of A’s never became B’s, lots of B’s never became C’s, and our client wanted to know why.
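As a side note for the analytically inclined, the diagnosis itself is not exotic. Here is a minimal sketch in Python of the kind of funnel math involved, using entirely hypothetical page names and visit counts:

    # Step-to-step funnel drop-off. Page names and visit counts here
    # are invented purely for illustration.
    funnel = [
        ("A: landing page", 10000),
        ("B: sign-up form", 3200),
        ("C: payment complete", 400),
    ]

    for (step, visits), (next_step, next_visits) in zip(funnel, funnel[1:]):
        rate = next_visits / visits
        print(f"{step} -> {next_step}: {rate:.1%} continued, "
              f"{1 - rate:.1%} dropped off")

Numbers like these tell you where prospects are leaking out; they say nothing about why, which is where the rest of this story comes in.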

A combination of web analytics data and a wealth of voice-of-customer data gave a very clear picture of the situation.

  • The website was pushing people into being A’s who wanted to be R’s or J’s or anything but an A. You can’t turn an A into a C when there was never an A in the first place.
  • The website was pushing people into being A’s who may have wanted to be A’s eventually – even later in the same session. But they weren’t ready to be A’s at just that moment, and they certainly weren’t ready to be B’s.

We recommended changes and new testing programs, and the response to one key element was “but we tested that and it didn’t make any difference.” That reaction is understandable, since one usually performs testing to determine “a winner” among page versions.

While we didn’t have every aspect of the tested variations available, we did have what came in the aftermath of the testing: a stream of consistently negative “I was trying to join, but…” customer feedback, equating to many, many missed opportunities and frustrated prospects. This client was fortunate to have a long history of voice-of-customer information from a pop-up survey flashed in front of everyone who abandoned its sign-up form. Analyzed in context, those responses matched the discoveries from the client’s web analytics data perfectly, but the client hadn’t acted on them, “because we tested it.”
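To make that concrete: the first pass over that kind of survey data is often no more sophisticated than tallying coded abandonment reasons and reading the counts alongside the funnel numbers. A toy sketch, with invented response categories:

    from collections import Counter

    # Hypothetical abandonment-survey responses, already coded into
    # categories; real free-text answers would need coding first.
    responses = [
        "not ready yet", "just browsing", "form too long",
        "not ready yet", "price unclear", "form too long",
        "just browsing", "not ready yet", "form too long",
    ]

    for reason, count in Counter(responses).most_common():
        print(f"{reason}: {count} ({count / len(responses):.0%})")

When the top categories line up with the funnel’s worst drop-off points, as they did here, the two data sources are telling the same story.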

When the steady drumbeat of customer complaints rolls in day after day, addressing the same issue in alternating tones of confusion and snark, then clearly something needs to be done, and it raises the question of whether there was ever really a “winner” among the testing options in the first place.
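For those who want to pressure-test that question with arithmetic: a common way to check whether a variation really beat the control is a two-proportion z-test on visitors and conversions. A minimal sketch with made-up numbers (a generic statistical check, not the client’s actual data or tooling):

    from math import erf, sqrt

    # Hypothetical test results: (visitors, conversions) per variation.
    control = (5000, 150)
    variant = (5000, 165)

    def two_proportion_z(a, b):
        """Two-sided z-test for a difference in conversion rates."""
        (n1, x1), (n2, x2) = a, b
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p2 - p1) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF
        return z, p_value

    z, p = two_proportion_z(control, variant)
    print(f"z = {z:.2f}, p = {p:.3f}")  # here p is about 0.39: no real winner

A “winner” with a p-value like that is just noise wearing a trophy, which is exactly what a flat test result paired with loud customer complaints should make you suspect.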

Unlike that #16 seed, you get the option of a do-over. Change the control group. Change the whole look and feel of the process. Change something, because the customers are clearly telling you something isn’t working.

Nothing, whether it’s test results or metrics, is set in stone, even in the simplest of business outcomes. Metrics that seemed important six months ago lose their mojo, and tests you thought told you all the answers last year can be called into question today.

Keep testing, keep trying new things, lather, rinse, and repeat—especially when your customers are shouting at you that your tests left them responding “none of the above.”

About the Author:

Andrew Lucyszyn is a Web Marketing Analyst with SIGMA Marketing Group.

Interested in learning more about SIGMA’s web analytics offering?  Subscribe to the Fifth Gear Analytics newsletter.
