Split Testing Your Content Marketing Campaigns: How and Why To Do It

How much do you know about split testing?

Chances are, you’re familiar with its role in helping to boost the conversion rate of an email, a web page, or an ad.

But have you ever considered that split testing could help boost the performance of your content marketing campaigns?

Despite its prevalence in landing page conversion rate optimization, a split test can actually be applied to most situations in which we want to optimize performance.

Including a content marketing campaign.

According to CMI’s most recent survey of content marketing trends, 88% of B2B companies are using content marketing.

But only 6% say their content marketing is very effective.

This says to me that there is clearly some room for improvement.

There are many ways to boost the effectiveness of a content marketing campaign, but one that I very rarely hear mentioned is split testing.

Want to know more?

Then let’s start from the top.

Split Testing

If you’re new to split testing: for best results, you’ll want to invest in a split testing platform.

You’ll also want to get yourself up to speed with exactly how it works.

In short, a split test involves pitting multiple variants of something against each other. Your goal is to find out which of those variants perform best.

The simplest split test is an A/B test, which pits 2 versions of something against one another. You can add more variants to the mix (an A/B/n test), or test changes to several elements in combination (known as multivariate testing).
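Under the hood, splitting traffic just means assigning each visitor to one variant, consistently, so a returning visitor always sees the same version. As a minimal sketch (not any particular platform’s implementation), a deterministic hash-based bucketing function might look like this:

```python
import hashlib

def assign_variant(user_id, variants=("A", "B")):
    """Deterministically bucket a visitor into a variant.

    Hashing the user ID means the same visitor always lands in the
    same bucket, with roughly even traffic across variants.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always sees the same version:
assert assign_variant("visitor-42") == assign_variant("visitor-42")

# Adding a third variant (an A/B/C test) just widens the tuple:
assign_variant("visitor-42", ("A", "B", "C"))
```

A split testing platform handles this (plus reporting) for you, but the principle is the same: the split must be random with respect to user behavior, and stable per visitor.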

In theory, multivariate testing shouldn’t be any more complicated to execute than an A/B test.

Problems arise, however, because the more variants you test, the more traffic you need to send to each version in order to gather accurate and valuable data.

Bear this in mind if your site only welcomes a few thousand (or fewer) visitors a month.

In split tests you’re looking for big wins. You need to see a large uplift in results to be certain that you’re seeing real results, and not natural variance in user behavior.

The fewer visitors you have, the larger that uplift needs to be for you to have confidence in what you’re seeing.
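The trade-off between traffic, number of variants, and detectable uplift can be sketched with the standard two-proportion sample-size approximation. This is a rough rule of thumb, not a substitute for your testing platform’s own calculator:

```python
import math
from statistics import NormalDist

def visitors_per_variant(p_base, p_target, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a lift from
    p_base to p_target (two-tailed test, equal traffic split).

    Uses the textbook formula:
        n = (z_alpha/2 + z_beta)^2 * 2 * p_bar * (1 - p_bar) / d^2
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p_base + p_target) / 2            # pooled conversion rate
    d = abs(p_target - p_base)                 # minimum detectable difference
    return math.ceil((z_a + z_b) ** 2 * 2 * p_bar * (1 - p_bar) / d ** 2)

# Detecting a lift from a 2% to a 3% conversion rate:
print(visitors_per_variant(0.02, 0.03))  # ~3,800 visitors per variant
```

With a simple A/B test that’s roughly 7,700 visitors in total; with four variants it balloons past 15,000, which is why low-traffic sites are usually better off sticking to A/B.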

It’s important that tests run over a long enough period of time to accrue enough data for you to be certain that, as above, what you’re seeing isn’t occurring “by chance”. Tests that are cut short run the risk of leaving you with inaccurate results.

Neil Patel has a calculator that’ll show you whether or not you’ve reached statistical significance, but let’s look at an example to see why it’s so important.

Let’s say you’ve sent 100 visitors to both your control page and your variation page.

You’ve seen 2 conversions on your control page, and 4 on your variation page.

That’s statistically significant, right? You’ve doubled your conversion rate, so it seems safe to call your variation page the winner.

But don’t get ahead of yourself. According to the calculator, you aren’t there yet.

In fact, you’d have to leave your test running until you had 7 conversions on your variation page – compared to 2 on your control page – to be sure your results hold water.

Peep Laja of ConversionXL breaks it down for those using A/B testing tools that report results as percent confidence:

“You should not call tests before you’ve reached 95% or higher. 95% means that there’s only a 5% chance that the results are a complete fluke.”

Another key point to remember is that A/B split tests only work if you change (and test) one thing at a time.

If you make more than one change to a page, it will be almost impossible to tell which change is responsible for the results.

This can make split testing seem like a long and laborious process. What it really means however is that optimization is never complete. There is almost always an improvement to be made.

In short: you should always be testing.

But when it comes to content marketing, exactly what should you be testing?

Headlines

For something only a few words long, your headline plays a huge part in the success of your content. And its influence goes beyond determining whether or not someone will click.

Writing for The New Yorker, Maria Konnikova said:

“By now, everyone knows that a headline determines how many people will read a piece, particularly in this era of social media. But, more interesting, a headline changes the way people read an article and the way they remember it. The headline frames the rest of the experience.”

This says to me that when split testing a headline, you should look beyond clicks as a metric of that headline’s success.

You should look at how people respond to that content too. Are they sharing it? Commenting on it? If you’re using a scroll-mapping tool, does it show that people are scrolling to the bottom of the page?

The best headlines are the ones that do more than get people to click.

They are the ones that put the reader in the right state of mind for consuming the content.

Whether you want a reader to laugh at your content, or cry at it, the headline should set them up for that emotion.

Buffer test 2 to 3 different headlines per post. Upworthy test 25.

Most of us, however, will do better testing just 2 or 3 headlines. That’s because each headline needs a lot of visits before the results become statistically significant.

This isn’t a problem for a site like Upworthy, which can top 20 million visits in a single month.

Most sites, though, rely on a far smaller pool of visitors and will get better results from a simple A/B or A/B/C test.

Length of Content

According to Buffer, the ideal length of a blog post is 1600 words (an average reading time of 7 minutes).

Of course, every site and its audience are different. The only way to find out what length of content your audience responds to best is to split test it.

To test this properly, you’ll need to create two different versions of your content. To gain an accurate result, you’ll want to repeat the test with multiple pieces of content. You’ll likely want to play around with different content lengths each time until you land on that “sweet spot”.

It’s worth noting that although we’ve been speaking about word count, this test doesn’t have to only be applied to written content. If you make them, you could also try split testing the length of videos and infographics.

Positioning of Suggested Posts

Great content shouldn’t just keep a visitor on their initial landing page – it should entice them to read more.

To encourage this, you should be linking out to other content, both from within the content itself, like this.

You can also link from above and below the content:

And in the sidebars:

You can also experiment with pop-ups:

The precise positioning of these links can impact how effective they are. To maximize their effectiveness, you need to test them. This might include tests for:

How many links to include

Which links to include (do you link to your most popular content? Related content? Or both?)

Where you include links (above or below the content? In the sidebars? Should the sidebar content move as the user scrolls? Do you use pop-ups? If so, when should they appear?)

Images

Images serve to illustrate points and break up long pieces of text. This makes the content more appealing to the eye and easier to read.

However, as important as images are, they’re not all created equal.

Images that aren’t relevant to your content can actually serve to confuse and distract the reader, while the ever popular stock image is generally accepted as a conversion killer.

Visual Website Optimizer proved this theory when they worked with Harrington Moving and Storage. Replacing a stock image with a “real” photo saw their conversion rate increase by more than 45%.

Your blog posts might not be as reliant on having the “right” images as a product or service page, but there’s no doubt that images matter.

Try testing:

Images you’ve taken yourselves vs. images found online

Images with people vs. images without

Diagrams vs. pictures

Photos vs. illustrations

Calls to Action

If you’re not wrapping up your content with a clear indication of what you want visitors to do next, you’re not leveraging it to its full potential.

Calls to action in content differ somewhat from the CTAs you’d include on a product or service page. You’re unlikely to be selling anything, for starters. Visitors to your content are also (often, though not always) going to be much higher up the sales funnel than visitors to a product or service page.

The call to action (or actions) on your content might ask visitors to:

Comment

Share

Read more content

Visit a product or service page

Subscribe to your email list

Enquire about your product or service

A combination of the above

The nature of the call to action is likely to be determined by the subject matter of the content and the goals you hope it will achieve. The wording you use should be determined by a split test.

There are some great ideas for blog post calls to action in this post from Writtent. What you’ll notice about them is their length. CTAs on product and service pages are often very short; these take their time to get the reader fired up about engaging with the site further. For example:

“Do you use any of these productivity hacks in your daily workflow? Feel free to brag about your success in the comments!”

“Like what you’ve read? Subscribe to our blog by adding your email address to the form on the right. You’ll be the first to hear about our updates 5 times a week!”

“Did you know we offer 30 days of our product for free? Click here to sign up for your no-obligation trial – no credit card number required!”

Of course, these are just a few ideas to get you started. Split testing will enable you to establish what works best, specifically for your site and your audience.

Conclusion

Have you ever carried out any split tests designed to optimize your content marketing campaigns? Let me know what you did and the results you saw in the comments below.

About the Author: Aaron Agius, CEO of worldwide digital agency Louder Online is, according to Forbes, among the world’s leading digital marketers. Working with clients such as Salesforce, Coca-Cola, IBM, Intel, and scores of stellar brands, Aaron is a Growth Marketer – a fusion between search, content, social, and PR. Find him on Twitter, LinkedIn, or on the Louder Online blog.