
Earlier this year, we built a tool that lets us A/B test article titles.

We were tired of guessing whether to publish an article with the title “The Rise of Kimchi Diplomacy” or “The Campaign to Make You Eat Kimchi.” So now we can run a statistical test on two titles to see which one people click more. We built this tool for ourselves, but we made it a premium feature of Content Tracker, our content analytics software, so other people can use it too.

You’ll never believe what happened next. Okay, you might. Because we’re about to tell you.

Since we've been A/B testing titles for the past five months, we thought we'd look at the data to see whether it has helped. Did A/B testing get us more readers?

Over this time period, we published 68 articles and A/B tested 40% of them. The typical reason we didn't test a headline was that we didn't have enough time (a test takes about 3 to 6 hours). It's possible that we tended to A/B test better articles, although that doesn't seem to be the case.

So, did articles we A/B tested perform better than ones we didn’t? Let’s review three metrics: pageviews, social shares, and press mentions.

Let’s start with pageviews. On average, articles where we ran an A/B test got about 34% more pageviews than untested articles. A few extremely successful articles could be skewing that average, so we looked at the median improvement too: the median A/B tested article got 28% more pageviews.

[Bar chart: pageviews, A/B tested vs. untested articles. Data via Priceonomics Content Tracker.]

When it comes to pageviews, our data indicates that A/B testing improves the median article’s performance by more than 25%. 
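If you want to run the same comparison on your own analytics numbers, it's just the mean and the median of each group's pageviews. Here's a minimal sketch with hypothetical pageview counts (not our actual data) that shows why we report both figures:

```python
from statistics import mean, median

# Hypothetical pageview counts per article (our real dataset has 68 articles)
tested_views   = [4200, 3900, 7100, 2800, 51000, 3600]   # A/B tested titles
untested_views = [3100, 2700, 2500, 4000, 3300, 2900]    # untested titles

# The mean gets pulled up by the occasional runaway hit (51,000 here),
# so the median is the more conservative measure of the lift.
mean_lift = mean(tested_views) / mean(untested_views) - 1
median_lift = median(tested_views) / median(untested_views) - 1
print(f"Mean lift: {mean_lift:.0%}   Median lift: {median_lift:.0%}")
```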

Next, do articles with A/B tested titles get more shares?

Intuitively, they should. Our methodology for A/B testing is to load two otherwise identical copies of an article into Tracker, so that the only difference is the title. Tracker then purchases a $10 Facebook advertisement for each headline and measures which one gets more clicks from Facebook users. More details on how it works are available here.
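Tracker handles the statistics for you, but if you're curious what "which one gets more clicks" means in practice, a standard approach is a two-proportion z-test on the click-through rates of the two ads. Here's a minimal sketch with hypothetical click and impression counts (this isn't Tracker's actual code or API):

```python
from math import erf, sqrt

def two_proportion_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Compare two headlines' click-through rates; return (relative lift of B over A, two-sided p-value)."""
    rate_a = clicks_a / impressions_a
    rate_b = clicks_b / impressions_b
    # Pooled click-through rate under the null hypothesis that both titles perform the same
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return (rate_b - rate_a) / rate_a, p_value

# Hypothetical numbers: each $10 ad buys a couple thousand impressions
lift, p = two_proportion_z_test(clicks_a=38, impressions_a=2000,
                                clicks_b=61, impressions_b=2000)
print(f"Title B lift over Title A: {lift:.0%} (p = {p:.3f})")
```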

[Waterfall chart: social shares, A/B tested vs. untested articles. Data via Priceonomics Content Tracker.]

The average article we A/B tested got seven times as many shares as the average untested article. That figure was dramatically influenced by one outlier, but the median A/B tested article still performed almost 75% better than the median untested one.

Lastly, do articles get picked up by the press and generate more inbound links from other sites if we A/B test them? Interestingly, the press doesn’t seem any more likely to reference an article that has an optimized headline. The number of mentions received by articles with tested and untested titles is identical.

[Bar chart: press mentions, A/B tested vs. untested articles. Data via Priceonomics Content Tracker.]

After so many years of writing headlines designed to generate clicks, it appears the press has developed its own antibodies to optimized headlines. And so while you can often get more clicks and shares with a better headline, it may not result in more press coverage.

***

We analyzed this data in preparation for a talk at the Priceonomics Content Conference. During the talk, titled “The Science of Headline Writing,” we’ll share everything we know at Priceonomics about how companies can write better headlines. This blog post is just the tip of the iceberg of the things we’ll be discussing. If you’d like to attend, we have a few early bird tickets available here.

Also, if you want to start measuring your own company’s blog posts like we do, you can sign up for a free Priceonomics Content Tracker account here. The A/B testing feature is part of a premium account, but most of the other features are free.

***

For Priceonomics, A/B testing headlines has been essentially “free performance.” When we A/B test an article’s title, the median article gets 28% more pageviews and almost 75% more shares. Optimizing our titles, however, doesn’t seem to garner more links from the press.

A/B testing titles seems to work on everyone except journalists.
