If you clicked on a 42Floors ad for “new york office space” a while back, there’s an 89% chance that you landed on one of these eight versions of our site.

They’re all fake: just throwaway static HTML mockups.

Design evolution

When 42Floors launched back in March 2012, we had relatively few listings, so we could fit them all on a list/map combo like this.

As our database grew, though, the map started to bog down with data points. When we added 1,300 listings for our New York launch, render times were hitting 12 seconds for Midtown. Plus, it was visually overwhelming to have so many data points on the map.

We took the obvious next step and started clustering.
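The idea behind clustering is simple: collapse nearby markers into one so the map renders one pin per area instead of one per listing. A minimal grid-based sketch, purely illustrative and not our production code, looks something like this:

```python
import math

def cluster(points, cell_deg=0.01):
    """Bucket (lat, lng) points into fixed-size grid cells.

    Returns {cell_key: [points in that cell]}; the map would then draw
    one marker per cell with a count, instead of one marker per point.
    """
    buckets = {}
    for lat, lng in points:
        key = (math.floor(lat / cell_deg), math.floor(lng / cell_deg))
        buckets.setdefault(key, []).append((lat, lng))
    return buckets

# Hypothetical coordinates: two Midtown listings and one downtown.
listings = [(40.7549, -73.9840), (40.7551, -73.9838), (40.7128, -74.0060)]
clusters = cluster(listings)
# The two Midtown points share a cell; downtown gets its own.
```

Real map SDKs do this with smarter zoom-dependent cell sizes, but the payoff is the same: render time scales with the number of cells, not the number of listings.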

Clustering solved our speed and density issues, but we still weren’t happy with the design because it took too many clicks to drill down to individual listings.

So we came up with Unified View. It was a three-month undertaking that burned up a lot of dev hours, but it seemed worth it. The finished product was blazing fast and had all the goodies: big photo cards, infinite scrolling, a mini navigation map, and the ability to switch to a full-page list or map view just like Airbnb or Yelp.

Conversion rates

Given the unquestionable awesomeness of the new site, we expected a conservative 50% lift in conversion rates when we deployed. But there was nothing — not even a blip in bounce rate, time on site, or conversion rates to mark the deploy date.

Unsurprisingly, enthusiasm for working on the front end cratered for a while and our focus shifted over to the supply side of the business — adding more listings, making tools for building owners, etc.

We eventually did start talking about iterating on the design again but nobody wanted to sign up to build yet another version of the site and see it flop.

Ignoring the user

At this point, you might be suspicious that we’d been designing in a vacuum. Not so. We’d done plenty of interviews and mockup sessions with users, trying to tease out what they wanted; it just never got us anywhere.

Here’s where I issue a disclaimer, lest I get trolled: talking to users is really important, just not about conversion funnels. Here’s my handy user feedback weighting algorithm:

Knowing something is broken: 100% accuracy

Knowing what they want: 30% accuracy

Knowing what they’ll actually click on: 2% accuracy

With this in mind, we set out to disregard the focus groups and instead drive the design entirely from browsing data on live traffic.

The problem was: how to get data? The types of wholesale redesigns that we needed to split test aren’t possible with Optimizely. Feedback services like 5SecondTest are unrepresentative of real users. We also couldn’t dedicate months to building multiple working versions of the site that would, in all likelihood, fail.

So we decided to fake it.

Faking it

Over the course of a few days we sketched out eight designs in Moqups and converted them into sloppy Photoshop mockups. We were pretty optimistic that the design called Modals would win. Its modal-based UX paradigm was intended to act like a one-page app and solve the common back-and-forth problem of loading search results, clicking on an item, going back, and so on.

In addition to ignoring our users’ design preferences, though, we also wanted to try ignoring our own design instincts. So we designed versions that we hated: Ugly Banking Site, Landing Big Buttons, etc.

We shipped the PSDs off to PSD2HTML and got back the quotes: $295 for the simplest design, $972 for the most complex one.

We added two extras to the typical package:

Copy and paste in listing descriptions and photos from specific real 42Floors listings instead of lorem ipsum placeholders

Use Bootstrap to make editing easier for us

Even though these were going to be static mockups, they had to feel real to first-time visitors; otherwise, the test results would be worthless. So each mockup got rollovers, hovers, lists, and modals that showed real but pre-populated data.

Live testing

PSD2HTML finished in nine days at a cost of $4,197. The next day we uploaded eight folders of static HTML into /public/land/v1-8 in our Rails app. We cloned eight copies of our AdWords campaign and changed the destination URLs to match.
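Because each AdWords clone pointed at its own folder, telling the variants apart afterward was just a matter of reading paths out of the access logs. A small sketch of that tally, with hypothetical log lines and helper names, might look like:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample of requested paths pulled from access logs.
# The eight static mockups live under /land/v1 ... /land/v8.
log_paths = [
    "/land/v3/index.html",
    "/land/v3/listing.html",
    "/land/v7/index.html",
    "/land/v3/index.html",
    "/search/new-york",  # normal app traffic, ignored
]

VARIANTS = {f"v{i}" for i in range(1, 9)}

def variant_of(path):
    """Return the variant folder ('v1'..'v8') for a /land/ hit, else None."""
    parts = urlparse(path).path.strip("/").split("/")
    if len(parts) >= 2 and parts[0] == "land" and parts[1] in VARIANTS:
        return parts[1]
    return None

# Per-variant hit counts; None (non-test traffic) is filtered out.
hits = Counter(v for p in log_paths if (v := variant_of(p)))
```

Since the folders sit in Rails’ /public directory, the app serves them as plain files with no routing or deploy work beyond the upload.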

Then we waited.

The winner

The winning variation was Google Hover Clone — a 1:1 mockup of Google’s site preview pane.

We call that static test Hover1. Our criterion for picking the winner came down to one number: tour request rate. In other words, the percentage of users on a given design who found a listing they liked and contacted it. We use other data points to augment this core metric (bounce rate, time on site, number of listings viewed, search criteria revisions), but ultimately a variation only wins if it’s better than the control at getting users to an office space that they like.
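Concretely, tour request rate is just tour requests divided by unique visitors per variation. With made-up numbers (the real tallies aren’t published here), picking a winner on that metric reduces to:

```python
# Hypothetical per-variation tallies; the metric is the one described
# above: tour requests divided by unique visitors.
visitors = {"v1": 1200, "v2": 1150, "v5": 1300}
tour_requests = {"v1": 18, "v2": 31, "v5": 22}

def tour_request_rate(variation):
    return tour_requests[variation] / visitors[variation]

# Best variation by conversion; it still has to beat the control
# before it ships.
best = max(visitors, key=tour_request_rate)
```

The secondary metrics (bounce rate, time on site, and so on) act as sanity checks on this number rather than tiebreakers.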

We went on to build a working version called Hover2 and put it in production back in August. With it, the long-awaited conversion rate lift finally materialized. As of this writing, we’re on Hover3 and about to deploy a split test for Hover4.

PS – The strangest outcome of the process was that Ugly Banking Site had the lowest bounce rate of all the designs by a wide margin.