Over the past few months, we’ve been focused almost exclusively on building out Cacher’s features (VSCode, Atom and Sublime integrations), with very little in the way of marketing. Our marketing website design hasn’t changed much since its launch in September 2017.

With the development of so many features, the homepage needed an update to reflect all the extra value added. I gave myself the task of testing whether changes to the site were helping user conversion rates.

In marketing parlance, trying out different variations of the same page in order to find the best performer is called A/B testing. These tests (or experiments) are periods during which variations are shown simultaneously to subsets of users, with the intention of analyzing and drawing conclusions from the results. Once an experiment is over, you can make the winning variant the one all users see.

Requirements

For my first experiment, I tested whether having the user download the Cacher desktop client or visit the web app led to a higher chance of registration.

The two variants of the homepage experiment

I’m an engineer by trade and didn’t set up cacher.io the way many digital marketers typically do, using a heavy CMS like WordPress or Drupal. The marketing site uses Jekyll, is completely static and has no database backend. Since I know HTML/(S)CSS pretty well, I didn’t feel the need to pay for more UI-friendly tools like Optimizely or Unbounce. Instead, I decided to combine Mixpanel with a bit of JavaScript to roll my own lightweight experiment.

Testing with AlephBet

After looking at ABalytics.js and signalerjs, I settled on AlephBet due to its great documentation and ability to set weights on variants. While you could pick a library with fewer features, you’d still want something that can ensure a particular user gets the same variant with every page visit.
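To illustrate why stickiness matters, here is a minimal sketch of how a library can deterministically assign the same weighted variant to a returning user: hash a stable user identifier into the range 0–99, then map that number through the cumulative variant weights. This is not AlephBet’s actual implementation (it persists assignments in localStorage), and `pickVariant`/`hashToBucket` are hypothetical names for illustration.

```javascript
// Sketch of sticky, weighted variant assignment -- not AlephBet's code.

// Hash a string deterministically to an integer in [0, 100).
function hashToBucket(str) {
  var hash = 0;
  for (var i = 0; i < str.length; i++) {
    hash = (hash * 31 + str.charCodeAt(i)) % 1000000007;
  }
  return hash % 100;
}

// Map the user's bucket through cumulative weights (weights sum to 100).
function pickVariant(userId, weights) {
  var bucket = hashToBucket(userId);
  var names = Object.keys(weights);
  var cumulative = 0;
  for (var i = 0; i < names.length; i++) {
    cumulative += weights[names[i]];
    if (bucket < cumulative) {
      return names[i];
    }
  }
  return names[names.length - 1];
}
```

Because the result depends only on the user identifier and the weights, the same visitor lands in the same variant on every page load.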

Step 1: The HTML

Here is a simplified version of the HTML involved in the experiment.

```html
<!-- index.html -->
<span data-variant="download" data-block>
  <button>Download for Mac OS</button>
</span>

<span data-variant="visit_web" data-block>
  <button>Get Started</button>
</span>
```

Why the data-variant attributes? I’ll use those in the CSS and JS below to toggle the visibility of the elements.

Step 2: Hide your elements

I had to make sure the JavaScript got to determine which version of the UI to show, so I hid all of the experiment’s HTML elements with CSS and added a separate class that could be toggled on.

```scss
// tests.scss
*[data-variant] {
  display: none;

  &.variant-visible {
    display: inline-block;

    &[data-block] {
      display: block;
    }

    &[data-inline] {
      display: inline;
    }
  }
}
```

Notice that I used custom data- attributes to denote whether an element should be shown as inline, inline-block or block.

Step 3: Initialize the experiment

When the homepage loads, the experiment starts. To make setting up future experiments easier, I wrote a wrapper function around AlephBet’s Experiment initialization. In the code below, note that I used jQuery’s document-ready handler to ensure all the HTML loaded first.

```js
// testing.js
function newExperiment(name, variants) {
  var args = {
    name: name,
    variants: {}
  };

  _.each(_.keys(variants), function(variant) {
    args.variants[variant] = {
      activate: function() {
        $('*[data-variant="' + variant + '"]').addClass("variant-visible");
      },
      weight: variants[variant]
    };
  });

  new AlephBet.Experiment(args);
}

$(function() {
  newExperiment("homepage", {
    download: 50,
    visit_web: 50
  });
});
```

The newExperiment() function does a couple of things: (1) it initializes the experiment with variant weights, and (2) it shows every element whose data-variant attribute matches the variant (download or visit_web) the user was assigned.

In this case, I’ve set up an experiment with 2 variants (you can do more), each shown to 50% of website visitors.
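The same wrapper extends naturally to more than two variants with an uneven traffic split. This fragment is hypothetical: it assumes the newExperiment() wrapper above, and the video_demo variant (plus matching data-variant="video_demo" markup) is an invented example, not something the original experiment ran.

```javascript
// Hypothetical three-way split using the same wrapper.
// Assumes newExperiment() from testing.js and corresponding
// data-variant="video_demo" elements in the HTML.
$(function() {
  newExperiment("homepage", {
    download: 40,
    visit_web: 40,
    video_demo: 20
  });
});
```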

Step 4: Track events with Mixpanel

I’ve opted to use Mixpanel to track the results of this experiment, but you may choose another analytics service. AlephBet comes with built-in support for Google Analytics, so that would be another fine choice.

```js
// tracking.js
function testingStore() {
  var store = localStorage.getItem('alephbet');
  if (store) {
    return JSON.parse(store);
  } else {
    return null;
  }
}

function mixpanelTrack(event, data) {
  data = data || {};
  var store = testingStore();

  if (store) {
    _.each(_.keys(store), function(key) {
      if (key.indexOf("variant") >= 0) {
        var tokens = key.split(":");
        data["experiment:" + tokens[0]] = store[key];
      }
    });
  }

  mixpanel.track(event, data);
}
```

testingStore() takes the localStorage data that AlephBet uses to keep track of experiments and returns its parsed JSON value. The _.each and _.keys functions are from lodash.

For each event, I sent each experiment and its active variant as properties of the Mixpanel event.
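To make that mapping concrete, here is a small, self-contained sketch of the transformation mixpanelTrack() performs on the store. The store shape shown ({"homepage:variant": "download"}) is an assumption about how AlephBet names its localStorage keys, inferred from the key.split(":") logic above, and extractExperimentProps is a hypothetical helper name.

```javascript
// Hypothetical helper: turn an AlephBet-style store object into
// Mixpanel event properties of the form "experiment:<name>" -> variant.
function extractExperimentProps(store) {
  var props = {};
  Object.keys(store).forEach(function(key) {
    if (key.indexOf("variant") >= 0) {
      var tokens = key.split(":");
      props["experiment:" + tokens[0]] = store[key];
    }
  });
  return props;
}

// Example: a store recording that this visitor saw the "download"
// variant of the "homepage" experiment (assumed key format).
var props = extractExperimentProps({ "homepage:variant": "download" });
// props is { "experiment:homepage": "download" }
```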

I added a line of JS on the homepage to indicate that the user has visited:

```html
<!-- index.html -->
<script>
  $(function() {
    mixpanelTrack('visited homepage');
  });
</script>
```

On the registration success page, I added another line to track the user registration.

```html
<!-- signup-success.html -->
<script>
  $(function() {
    mixpanelTrack('user registered');
  });
</script>
```

Step 5: Set up funnels in Mixpanel

I created a new funnel in Mixpanel for the download button variant of the homepage:

Notice that because I’ve sent along all experiments as event properties, experiment:homepage was selectable for both events. It was set to download for this funnel. I also created another funnel with visit_web as the experiment:homepage value.

Step 6: Wait

Once I had set up my funnels and deployed the website changes, I waited for the results. How long? Neil Patel has a handy A/B Testing Calculator that estimates how many visitors each variant needs. For marketing site changes, I generally wait for enough visitors to achieve >95% confidence in the result. Depending on your own website’s traffic, that could mean waiting a few hours or several days.
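If you would rather compute significance yourself instead of relying on an online calculator, the standard method for comparing two conversion rates is a two-proportion z-test. This is a sketch of that general statistical technique, not the calculator’s internals; the erf approximation is Abramowitz–Stegun formula 7.1.26.

```javascript
// Two-proportion z-test for comparing the conversion rates of two variants.

// Abramowitz-Stegun 7.1.26 approximation of the error function.
function erf(x) {
  var sign = x < 0 ? -1 : 1;
  x = Math.abs(x);
  var t = 1 / (1 + 0.3275911 * x);
  var y = 1 - (((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
    - 0.284496736) * t + 0.254829592) * t) * Math.exp(-x * x);
  return sign * y;
}

// Standard normal cumulative distribution function.
function normalCdf(z) {
  return 0.5 * (1 + erf(z / Math.SQRT2));
}

// visitorsA/conversionsA: visitors and sign-ups for variant A, etc.
// Returns the z statistic and the two-sided confidence level.
function abTestSignificance(visitorsA, conversionsA, visitorsB, conversionsB) {
  var pA = conversionsA / visitorsA;
  var pB = conversionsB / visitorsB;
  var pPool = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  var se = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  var z = (pB - pA) / se;
  return {
    z: z,
    confidence: 1 - 2 * (1 - normalCdf(Math.abs(z)))
  };
}
```

For example, with 1,000 visitors per variant and 100 vs. 150 sign-ups (hypothetical numbers), z comes out around 3.4, which is comfortably past the 95% confidence threshold.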

Step 7: Look at Results

After running the experiment for some time, I got the following results:

Funnel results for the two homepage variants

Putting these numbers into the A/B testing calculator gave me the following:

A/B testing calculator results

In this case, users were more likely to register after visiting the Cacher web app as compared to downloading the desktop client. This makes intuitive sense as there could be some friction involved with installing a piece of software versus trying it out on the web. As I was confident in the conclusion drawn from the experiment, I’ve since changed the homepage to direct visitors to the web app by default.

Why roll your own?

With the popularization of A/B testing, there are certainly a number of established services out there for running tests and collecting results. If you’re a marketing pro who doesn’t want (or isn’t allowed) to touch code, a tool like Optimizely would be a good bet. On the other hand, if you’re an engineer who’d rather have granular control over what happens on the page, there is no better option than writing your own framework.

Want to remember this technique? Save the snippet for this blog post to your Cacher library.