7 Deadly Sins of A/B Testing

January 9, 2013

Have you ever noticed how the 7 deadly sins are applied to everything these days? Ever wonder why?

Turns out people just like the idea. I thought it was kind of spammy and that nobody would be interested, but as it turns out, the data tells a different story. That's not the story I'm going to tell today, though.

Note: These are very real warnings about common pitfalls in A/B testing, though they're only loosely based on the 7 deadly sins.

Sloth

[Photo: a baby sloth, Puerto Viejo, Costa Rica]

The first mistake most people make when A/B testing is not A/B testing.

A/B testing is essentially the scientific method, which works crazy well. Like, we're talking seriously powerful shit.

If you're not running A/B tests right now, do yourself a favor: set up at least one A/B test and check it every Monday this month.
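To make that concrete, here's a minimal sketch (in Python) of the bucketing half of an A/B test. The visitor IDs and variation names are made up for illustration; the point is just that hashing the visitor ID gives each person a stable assignment.

    # Minimal sketch: deterministically bucket visitors into variations.
    # Hashing the visitor ID means the same person always sees the same
    # version, without storing any assignment state.
    import hashlib

    VARIATIONS = ["A", "B"]  # hypothetical variation names

    def assign_variation(visitor_id: str) -> str:
        digest = hashlib.md5(visitor_id.encode()).hexdigest()
        return VARIATIONS[int(digest, 16) % len(VARIATIONS)]

    print(assign_variation("visitor-42"))  # same input -> same variation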

Lust

When you first start an A/B test it's easy to get so excited about the method that you consider early results to be a validation of your original hypothesis. Don't fall into this trap. Always wait for statistical significance.
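What does "wait for statistical significance" mean in practice? Here's a minimal sketch using a two-proportion z-test; the conversion counts are hypothetical. Notice that B looks ~30% better than A, and yet the test says the difference could still be noise.

    # Minimal sketch: two-proportion z-test for an A/B test.
    # Conversion counts below are made up for illustration.
    from math import sqrt
    from statistics import NormalDist

    def ab_p_value(conv_a, n_a, conv_b, n_b):
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return 2 * (1 - NormalDist().cdf(abs(z)))       # two-tailed p-value

    # 4.8% vs 6.3% conversion -- looks like a big win...
    print(ab_p_value(48, 1000, 63, 1000))  # ~0.14: not significant at 0.05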

Gluttony

You might think of more and more variations you'd like to test as the A/B test proceeds. Don't add them midway. Variations that weren't in rotation for half of the experiment aren't being compared on equal footing: traffic tends to shift over time (weekdays vs. weekends, a marketing push, a press mention), so a late addition only sees part of the picture.

Wait for statistically significant results, then run a second A/B test if you think one of your new variations may outperform the winner.

If you have 50 variations to test and just a trickle of traffic… maybe A/B testing isn't the right tool for the job (see: Wrath).

Greed

Always place your users above all else. Some changes that increase profit may hurt your users or degrade the product. Weigh these changes very carefully. You set out to make the world a better place; don't lose sight of that.

If you're having trouble making these kinds of decisions, you probably haven't found your One Metric That Matters. Hint: It's usually not profit.

Envy

Don't waste time obsessing about other people's A/B test results. Focus on coming up with a reasonable hypothesis about something that might improve your business and test it yourself.

Statistical significance on your own tests is the ultimate measure of your success. If you spend your time chasing "40% improvement by changing the color of a button" stories from some blog post you read, you'll probably gloss over much more productive uses of your time.

Pride

This doesn't mean you should ignore the outside world, though. You might feel like you have to come up with all the variations yourself; don't let your pride get in the way of using other people's good ideas.

Watch your competition closely, look for things they're doing right, and A/B test those things to see if they'll work for you.

Picasso said it best: "Good artists copy, great artists steal".

Wrath

I'll be honest, I don't have anything for wrath… But I'll give you a piece of unrelated free advice: Learn about other tools and know when A/B testing is the right tool for the job (and when it's not).

For instance, A/B testing relies on exposing a certain number of people to each variation and measuring how many take some action that you want. This is A/B testing 101.

If you have 50 variations, and 5% of visitors do the action (on average), it's going to take a lot of visitors before you even start to have an idea about how the variations stack up.
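To put a rough number on "a lot": using the standard two-proportion sample-size formula with the 5% baseline above (the 10% relative lift, 95% confidence, and 80% power are assumptions on my part), each variation needs on the order of 30,000 visitors, so 50 variations means well over a million visitors before the experiment can tell them apart.

    # Rough sketch: visitors needed per variation to detect a lift,
    # via the standard two-proportion sample-size formula.
    from math import sqrt, ceil
    from statistics import NormalDist

    def visitors_per_variation(p_base, rel_lift, alpha=0.05, power=0.8):
        p_new = p_base * (1 + rel_lift)
        z_a = NormalDist().inv_cdf(1 - alpha / 2)       # e.g. 1.96
        z_b = NormalDist().inv_cdf(power)               # e.g. 0.84
        p_bar = (p_base + p_new) / 2
        num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
               + z_b * sqrt(p_base * (1 - p_base) + p_new * (1 - p_new))) ** 2
        return ceil(num / (p_new - p_base) ** 2)

    n = visitors_per_variation(0.05, 0.10)   # 5% baseline, 10% relative lift
    print(n)        # ~31,000 visitors per variation
    print(n * 50)   # ~1.6 million visitors for 50 variations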

So I'll chalk this rule up to, "Don't hate on the alternatives. Use the right tool for the job." ;)



