Monday, 5 October 2015

A/B Testing Plot Twists We Didn’t See Coming

The last turn of events you want in your A/B testing program is an M. Night Shyamalan-esque plot twist where a key measure plummets after an initial success.

This is a common risk for most testing programs if you judge a test’s success by what you immediately want a visitor to do next. Do more of them take that next step? Excellent! Winner! Celebrations all around! But did you check whether the right people took all the right subsequent steps? For example, a variant may lead more people to the next step in your funnel, but does that improvement carry through the entire funnel? Oftentimes a variant wins in one step of the funnel but actually decreases signups. It’s happened to us plenty of times.
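To make that full-funnel check concrete, here is a minimal sketch of comparing variants at every funnel step rather than only the first click. The event names, visitor counts, and data layout are hypothetical illustrations, not our actual instrumentation:

from collections import defaultdict

# Hypothetical event log: (user_id, variant, funnel_step).
# In practice this would be an export from your analytics tool.
events = [
    ("u1", "A", "clicked_cta"), ("u1", "A", "signed_up"),
    ("u2", "B", "clicked_cta"),
    ("u3", "B", "clicked_cta"), ("u3", "B", "signed_up"),
    ("u3", "B", "active_trial"),
]

FUNNEL = ["clicked_cta", "signed_up", "active_trial"]
VISITORS = {"A": 1000, "B": 1000}  # assumed visitors bucketed per variant

# Count the unique users who reached each step, per variant.
reached = defaultdict(set)
for user, variant, step in events:
    reached[(variant, step)].add(user)

# Judge the test at *every* step, not just the first one.
for step in FUNNEL:
    for variant in ("A", "B"):
        rate = len(reached[(variant, step)]) / VISITORS[variant]
        print(f"variant {variant}, {step}: {rate:.1%}")

A variant that wins on clicked_cta but loses on active_trial is exactly the plot twist this post is about.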

Here at Kissmetrics, we recently ran a test on the homepage, with the not-uncommon problem that we don’t love our main call-to-action. It asks for a lot of trust right away, it’s unclear what you get from a Google-based login, and so on. But it keeps winning, each and every time, against all kinds of other formulations. (If you have ideas about how it makes visitors feel, or what might work better, please suggest away in the comments.) Here it is:

[Image: Kissmetrics homepage tagline and CTA]

This “Log In With Google” CTA has beaten every alternative we’ve tried against it.

But then we tested it on the blog, and got a much different result.

We tested “Try Kissmetrics” as the CTA instead. Much more indicative of the offering, we told ourselves, anticipating great results. And it produced a nearly 19% increase over the previous CTA:

[Image: A/B test report of “Signed Up” conversions on the Kissmetrics blog]

Completed trial signups were up 42%, and active trial use increased 74%. Time to roll it out across the site, right?

We’ll call this plot twist the Usual Suspects: if the test won in one area of the site, apply it everywhere and it should still work. But that’s not what happened.

When we tested again on the Kissmetrics homepage, it won – but only on the surface, like pulling the name Kobayashi from the bottom of a coffee mug.

In terms of getting visitors to the next action, the test seemed to do very well: an improvement of more than 85% on clicks, with 99% statistical significance.

[Image: Optimizely test data for the Kissmetrics homepage CTA]
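For context on what a claim like 99% certainty means here, this is a minimal sketch of the two-proportion z-test that sits behind most A/B significance numbers. The visitor and click counts are made up purely to mirror an ~85% relative lift, not our real traffic:

from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up numbers: 1.2% vs 2.2% click rate, roughly an 85% relative lift
z, p = two_proportion_z_test(conv_a=120, n_a=10_000, conv_b=223, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.01 reads as "99% certainty"

The catch, as the rest of this post shows, is that significance on clicks says nothing about what happens further down the funnel.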

Sounds like you’re ready to leave the Planet of the Apes, right? Only, no. You’re still on Earth, and the Statue of Liberty is right there in the sand.

In this case, we ended up with fewer completed trial signups (down 2%), fewer active trial users (down 25%), and less revenue:

[Image: Kissmetrics report showing completed trial signups down 2%]

25.24% fewer active users:

[Image: Kissmetrics report showing 25.24% fewer active trial users]
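A quick back-of-the-envelope check shows how sharp the hidden drop-off was: if clicks rose 85% while completed signups fell 2%, the click-to-signup rate fell by nearly half. The numbers below are normalized for illustration, not our actual traffic:

# Normalized to 100 baseline clicks; actual traffic figures not published
baseline_clicks, variant_clicks = 100, 185    # +85% clicks
baseline_signups, variant_signups = 10, 9.8   # -2% completed signups

baseline_rate = baseline_signups / baseline_clicks
variant_rate = variant_signups / variant_clicks
print(f"click-to-signup rate: {baseline_rate:.1%} -> {variant_rate:.1%}")
print(f"relative drop: {1 - variant_rate / baseline_rate:.0%}")  # ~47%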

Not a winner. So we’ll keep trying to find the right match, or accept that we might have found what works for visitors, even if it’s not an internal favorite.

And regardless of your love for the element under test, it’s always better when you know what’s in the box, and when talking with dead people turns out to be okay. Plot twists are much better in movies than in your sales funnel.

Essentially, our conclusion is that in A/B testing the customer is always right, so we need to live with the CTA on our homepage; it has withstood and won so much testing. At least we solved a problem for blog readers, who visit us to learn about marketing techniques and are better understood as a separate audience.

About the Author: Maura Ginty is the VP of Marketing at Kissmetrics.



