How to Run an A/B Test in Google Analytics

Designs don’t always work out as intended.

The layout looks good. The color choices seem great. And the CTA balances clever and clear.

But…

It’s not working. All of it. Some of it. You’re not completely sure, but something’s gotta give.

Despite everyone’s best intentions, including all the hours of research and analyses, things don’t always work out as planned.

That’s where continuous testing comes in. Not a one-and-done or hail-and-pray attempt.

Even better, your testing efforts don’t need to be complex and time-consuming.

Here’s how to set up a split test inside Google Analytics in just a few minutes.

What are Google Analytics Content Experiments?

Let’s say your eCommerce shop sells Pug Greeting Cards. (That’s a thing by the way.)

Obviously, these should sell themselves.

But let’s just suspend disbelief for a moment and hypothesize that sales are low because you’re having trouble getting people into these individual product pages in the first place.

Your homepage isn’t a destination; it’s a jumping off point.

Peeps come in, look around, and click somewhere else.

Many times that’s your Product/Service pages. Often it’s your About page.

Regardless, the goal is to get them down into a funnel or path as quickly as possible, (a) helping them find what they were looking for while also (b) getting them closer to triggering one of your conversion events.

The magic happens on a landing page, where these two things – a visitor’s interest and your marketing objective – intertwine and become one in a beautiful symphony.

So let’s test a few homepage variations to see which do the best job at directing new visitors into your best-selling products.

One has a video, the other doesn’t. One is short and sweet, the other long and detailed. One has a GIF, the other doesn’t.

New incoming traffic gets split across these page variations, allowing you to watch and compare the number of people completing your desired action until you can confidently declare a winner.

(It’s probably going to be the one featuring this video.)

Running a simple and straightforward split test like this is landing page optimization 101: identify the specific page variables that drive the best results for your audience, then multiply them across your site.

Google Analytics comes with a basic Content Experiments feature that allows you to compare different page variations, split traffic between them accordingly, and get email updates about how results are trending and whether you’re going to hit your defined objective.

But… they’re technically not a straightforward A/B test. Here’s why, and how that’s actually a good thing.

Why Content Experiments Can Be Better than Traditional A/B Tests

Your typical A/B test selects a very specific page element, like the headline, and changes only that one tiny variable in new page variations.

The interwebs are full of articles where switching up button color resulted in a 37,596% CTR increase* because people like green buttons instead of blue ones. Duh.

(*That’s a made up number.)

There are a few problems with your classic A/B test, though.

First up, tiny changes often regress back to the mean. So while you might see a few small fluctuations when you first begin running a test, small changes usually only equal small results.

The second problem is that most A/B tests fail.

And if that weren’t bad enough, the third issue is that you’re going to need a TON of volume (specifically, 1,000 monthly conversions to start with and a test of at least 250 conversions) to determine whether those changes actually worked.
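To put that volume requirement in perspective, here’s a rough back-of-the-envelope sketch (the standard two-proportion sample-size formula, not a Google Analytics feature) of how many visitors a classic A/B test needs per variation before it can call a winner. The numbers in the example are illustrative, not from the article:

```python
import math

def sample_size_per_variation(baseline_rate, relative_lift,
                              z_alpha=1.96, z_beta=0.84):
    """Rough visitors needed per variation to detect a relative lift
    at ~95% confidence and ~80% power (two-proportion z-test formula)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# e.g. a 2% baseline conversion rate, hoping to detect a 20% relative lift:
print(sample_size_per_variation(0.02, 0.20))
```

Small changes need enormous traffic to verify: the smaller the lift you’re trying to detect, the larger the sample each variation requires.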

Google Analytics Content Experiments use an A/B/N model instead, which is a step in between one-variable-only A/B tests and coordinated, multiple-variable multivariate tests.

(After typing that last sentence, I realized only hardcore CRO geeks are going to care about this distinction. However, it’s still important to understand at a high level so you know what types of changes to make, try, or test.)

You can create up to 10 different versions of a page, each with their own unique content or changes.

In other words, you can test bigger-picture stuff, like: “Does a positive or negative Pug value proposition result in more clicks?”

Generally these holistic changes can be more instructive, helping you figure out what messaging or page elements you can (and should) carry through to your other marketing materials like emails, social and more.

And the best part is that instead of requiring a sophisticated (read: time-consuming) process to make sure all of your variable changes are statistically significant, you can use Google Analytics Content Experiments to run faster, iterative changes and learn on the go.

Here’s how to get started.

How to Set Up Google Analytics Experiments

Setting up Content Experiments only takes a few seconds.

You will, however, have to set up at least one or two page variations prior to logging in. That topic’s beyond the scope here, so check out this and this to determine what you should be testing in the first place.

When you’ve got a few set up and ready to go, log in to Google Analytics and start here.

Step #1. Getting Started

Buried deep in the Behavior section of Google Analytics – you know, the one you ignore when toggling between Acquisition and Conversions – is the vague yet innocuous-sounding ‘Experiments’ label.

Chances are, you’ll see a blank screen when you click on it that resembles:

To create your first experiment, click the button that says Create Experiment on the top left of your window.

With me so far? Good.

Let’s see what creating one looks like.

Step #2. Choose an Experiment

Ok now the fun starts.

Name your experiment, whatever.

Then look down to the Objective selector. Here’s where you set an identifiable outcome to track results against and determine a #winning variation.

You have three options here. You can:

Select an existing Goal (like opt-ins, purchases, etc.)

Select a Site Usage metric (like bounce rate)

Create a new objective or Goal (if you don’t have one set up already, but want to run a conversion-based experiment)

The selection depends completely on why you’re running this test in the first place.

For example: most people are surprised to find that their old blog posts often bring in the most traffic. The problem? Many times those old, outdated pages also have the highest bounce rates.

Navigate to: Behavior > Secondary Dimensions + Google/Organic > Top Pageviews > Bounce Rate.

Here’s an example:

(Here are a few other actionable Google Analytics reports to spot similarly low hanging fruit when you’re done setting up an experiment.)

Let’s select Bounce Rate as the Objective for now, so we can make changes to the page layout or increase the number of high-quality visuals to get people to stick around longer.

After selecting your Objective, you can click on Advanced Options to pull up more granular settings for this test.

By default, these advanced options are off, and Google will “adjust traffic dynamically based on variation performance”.

If enabled, however, your experiment will simply split traffic evenly across all the page variations you add, run for at least two weeks, and shoot for a 95% statistical confidence level.

Those are all good places to start in most cases, however you might want to change the duration depending on how much traffic you get (i.e. you can get away with shorter tests if this page will see a ton of traffic, or you might need to extend it longer than two weeks if there’s only a slow trickle).
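As a quick sanity check on that duration decision (this is just arithmetic, not a Google Analytics feature), you can estimate how long a test needs to run given your traffic. The visitor counts here are hypothetical:

```python
import math

def estimated_test_days(visitors_per_day, variations, needed_per_variation):
    """How many days until each variation has seen enough visitors,
    assuming traffic is split evenly across variations (the
    advanced-options setting described above)."""
    total_needed = variations * needed_per_variation
    return math.ceil(total_needed / visitors_per_day)

# e.g. 500 visitors/day, 3 variations, ~2,000 visitors needed per variation:
print(estimated_test_days(500, 3, 2000))  # 12
```

A high-traffic page might clear its sample in under two weeks, while a slow trickle of visitors could mean extending the experiment well past Google’s default window.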

So far so good!

Step #3. Configure Your Experiment

The next step is to simply add the URLs for all of the page variations you want to test.

Literally, just copy and paste:

You can also give them helpful names to remember. Or not. It will simply number the variants for you.

Step #4. Adding Script Code to your Page

Now everyone’s favorite part – editing your page’s code!

The good news is that the first thing you see under this section is a helpful toggle button to just email all this code over to your favorite technical person.

If you’d like to get your hands dirty however, read on.

First up, double-check all of the pages you plan on testing to make sure your default Google Analytics tracking code is installed. If you’re using a CMS, it should be, as it’s usually added site-wide from the start.

Next, highlight and copy the code provided.

You’re going to need to look for the opening head tag in the Original variation (which should be located toward the top of your HTML document). Search for <head> to make it easy:
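As a rough sketch of the placement (the actual experiment snippet is the one Google Analytics generates for you in this step; the page contents below are placeholders), the experiment code goes right after the opening head tag and before your regular tracking code:

```html
<html>
<head>
  <!-- 1. Paste the Content Experiments code FIRST, right after <head> -->
  <script>/* experiment snippet copied from Google Analytics */</script>

  <!-- 2. Your regular Google Analytics tracking code comes after it -->
  <script>/* existing Google Analytics tracking snippet */</script>

  <title>Pug Greeting Cards</title>
</head>
<body>
  <!-- page content -->
</body>
</html>
```

Order matters here: putting the experiment code below the regular tracking code is exactly the mistake Google flags during verification.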

Once that’s done, click Next Step back in Google Analytics to have them verify if everything looks A-OK.

Not sure if you did it right? Don’t worry – they’ll tell you.

For example, the first time I tried installing the code for this demo I accidentally placed it underneath the regular Google Analytics tracking code (which they so helpfully and clearly pointed out).

After double checking your work and fixing, you should see this:

And now you’re ready to go!

See, that wasn’t so bad now was it?!

Conclusion

Websites are never truly done and finished.

They need iteration: constant analysis, new ideas, and changes that continually increase results.

Many times, that means analyzing and testing entire pages based on BIG (not small) changes like value propositions or layouts. These are the things that will deliver similarly big results.

Landing page optimization and split testing techniques can get extremely complicated and require special tools that only CRO professionals can navigate.

However, Google Analytics includes its own simple split testing option in Content Experiments.

Assuming you already have the new page variations created and you’re comfortable editing your site’s code, they literally only take a few seconds to get up-and-running.

And they can enable anyone in your organization to go from research to action by the end of the day.

About the Author: Brad Smith is the founder of Codeless, a B2B content creation company. Frequent contributor to Kissmetrics, Unbounce, WordStream, AdEspresso, Search Engine Journal, Autopilot, and more.