by Ben Elowitz

The advertiser’s dilemma is well known, and well summarized (by John Wanamaker): “Half the money I spend on advertising is wasted; the trouble is I don’t know which half.”

Most media companies are in the same boat. Think of all those network executives who, each fall, commission scripts for some 70 new TV shows. Fewer than eight of those scripts actually become series, and only two of those ever survive their debut season. It’s not much different in the movie industry—five films (of the hundreds released) generated more than a third of last year’s box office totals—or on the Web. At Wetpaint Entertainment, 10 percent of our articles drive 70 percent of our traffic.

The point is, we produce far more content than we can find an audience for. When it comes to creation and distribution, most media companies operate in “ready, fire, aim!” mode. We make new works that we hope will sing, and we measure their impact on audiences long after it’s too late to use that feedback to shape or direct the product.

But if you knew exactly what your audience would respond to before picking up your creative pencil, you could make more popular content and multiply your audience, your relevance, and your success. To figure out just what excites your audience, you could turn to traditional forms of research: surveys, focus groups, and the like. But we’ve found that nothing is as reliable, actionable, or immediate as live testing, and A/B testing in particular.

Other Internet businesses (e-commerce sites in particular) have been using A/B testing to massively increase conversion rates for years. Now the empirical science of A/B testing is finally relevant for media too. It turns out that A/B testing can help you make great content decisions that compound: by continuously testing and adjusting your content type, packaging, and timing, you can improve results (e.g., traffic, Likes) by five times, 10 times, or more. This kind of evidence-based programming is a potentially powerful tool. But you have to do it right. Here’s what I’ve seen working:
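
To make the mechanics concrete, here is a minimal sketch in Python of the kind of headline test this implies: two variants of the same story shown to comparable audiences, compared with a standard two-proportion z-test. The variant counts are hypothetical, and this is not Wetpaint’s actual tooling, just the statistical core of any A/B test.

```python
import math

def ab_test(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is variant B's click rate genuinely higher?"""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF.
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return p_a, p_b, p_value

# Hypothetical counts: two headlines for the same article.
p_a, p_b, p = ab_test(clicks_a=120, views_a=4000, clicks_b=165, views_b=4000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  one-sided p = {p:.3f}")
# A: 3.0%  B: 4.1%  one-sided p = 0.003
```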

1. Don’t do an A/B test. Do 100.
Audience insights are valuable one by one, but only to a certain extent. Most A/B tests will come up empty: only about 25 percent of your hypotheses will be borne out. Don’t fight these limitations. Work with them. The power is in the compounding. One test may generate an insight that gives you a five percent lift, which is great, but not earth-shattering. The real game-changer comes when you get maniacal about testing. Compound that five percent lift just 15 times and you’ve doubled your total audience. Fifteen insights may sound like a lot, but make every article, video, and tweet part of an A/B test, and pretty soon you’ll be generating hundreds of data points per week.
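
The arithmetic behind that claim is worth spelling out. A quick check, using the five percent lift and 25 percent win rate cited above:

```python
# Each winning test compounds multiplicatively, so fifteen 5% lifts
# slightly more than double your audience.
lift, wins = 1.05, 15
print(f"{lift ** wins:.2f}x audience")    # 2.08x audience

# At a ~25% hit rate, banking 15 wins means running on the order of
# 60 tests, which is why volume matters.
print(f"{wins / 0.25:.0f} tests needed")  # 60 tests needed
```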

2. Insight isn’t valuable. Action is. Create an action machine.
The most sophisticated intel in the world isn’t worth a dime unless it spurs action: informed, continuous action. Test the things that lend themselves to action, the things you control. For most media companies, that means timing and frequency. For each of your distribution channels, find out what times of day and which days of the week juice your content the most, and, most importantly, how many new posts a day is too many. It’s different for every audience, and you had better act accordingly. Jersey Shore fans have an appetite for 10 (yes, 10!) raucous tidbits a day about their favorite guidos and guidettes. Once you know that, what matters is getting 10 new tidbits into your line-up every day.
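
As a toy sketch of what testing timing can look like in practice, here is the aggregation step: group past posts by the hour they went out and rank the hours by average clicks. The data and field layout are hypothetical stand-ins for whatever your analytics export actually looks like.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical (post time, clicks) pairs pulled from an analytics export.
posts = [
    ("2012-10-01 08:00", 310), ("2012-10-01 12:00", 540),
    ("2012-10-02 08:00", 290), ("2012-10-02 20:00", 475),
    ("2012-10-03 12:00", 610), ("2012-10-03 20:00", 430),
]

clicks_by_hour = defaultdict(list)
for stamp, clicks in posts:
    hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
    clicks_by_hour[hour].append(clicks)

# Best posting hours first, ranked by average clicks per post.
for hour, counts in sorted(clicks_by_hour.items(),
                           key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"{hour:02d}:00  avg clicks: {sum(counts) / len(counts):.0f}")
```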

3. Institutionalize results.
Some companies do all the right things—to a point. They run the tests, do the analysis, even make some changes based on the analysis. But then they stop short of institutionalizing the changes and the process that could make their site more successful. Capture your insights in a programming playbook, then make your team memorize it. Or better yet, put it on stone tablets (or in a Google doc) where everyone can reference the rules and action plans that will get you the best results.

4. Take A/B testing offline too.
Digital media is fundamentally testable in a way that offline media isn’t. But digital test results can cross platforms. Many of the audience insights unearthed by digital A/B testing can be fed back into print or broadcast properties to enhance their performance too. It may be too late to re-shoot episodes of Revenge based on audience feedback about Daniel’s dark turn this fall. But you sure can use that feedback to edit next month’s promotional spots and maximize tune-in.

5. Beware small ball.
A/B testing is a tactic. And a good one. It can help you tune your product to your audience. It can inform business decisions, even drive them. But however relentless and precise, A/B testing is no substitute for strategic vision. Don’t let the allure of real-time testing lock you into a kind of small-ball incrementalism, where the emphasis on empirical data becomes the enemy of big ideas and giant leaps. That’s the tyranny of testing, and it can lead to a culture of tweak rather than transformation. Every company needs both. Strike a balance between information and inspiration.

6. Test ‘til you drop. (Or until your audience stops changing.)
The digital world is mutable. Facebook changes its algorithms. Users are fickle. You can either ride the next wave or get swamped by it. Keep on testing—and be assured that the results will keep changing along with your audience.

A/B testing and other forms of digital data collection and analysis won’t create the brilliant content that ultimately builds your brands. But these tools can help you direct and promote that content to multiply its success with your audience.
