Summary: Across 429 UX professionals surveyed, 71% of teams report performing some kind of quant UX research at least sometimes, yet nearly everyone reported challenges in getting quant research done.

UX is sometimes perceived as a “soft” science. Often, that’s due to our field’s reliance on qualitative research and observations. To investigate how digital product teams use quantitative research to get “hard” data (or why they don’t), we surveyed 429 UX professionals.

How Often Teams Use Quantitative Research

We asked respondents to roughly estimate how frequently they, or someone else on their team, perform quantitative studies.

When interpreting these results, bear in mind that there may be some sample bias at play. We recruited respondents through Twitter and LinkedIn outreach, offered a chance at a free report or online seminar as an incentive, and mentioned that the survey was about quant practices. As a result, our sample consisted of NN/g-following UX practitioners who were likely already interested in, or at least aware of, quantitative research methods. I’d bet that this sample (and probably you, since you’re reading this) performs more quant research on average than the wider UX community as a whole.

(Then again, since you are reading this article, the results are probably fairly representative of the kinds of projects you work on, even if not of all design projects in the world.)

We were somewhat surprised to find that the majority of our respondents (71%) performed quantitative research either “sometimes” or “at least one study per project.” This result makes sense when we look at the methodologies that respondents report using.

Which Methodologies Teams Use

We asked respondents to tell us how frequently their teams were using 11 popular UX research methodologies — 7 quantitative and 4 qualitative. For each methodology, respondents told us if they were using it “often,” “sometimes,” “rarely,” or “never.”

The most frequently used methodologies, as reported by our respondents, were, in order:

Analytics (Quant)

Qualitative usability testing (Qual)

Interviews (Qual)

Large-sample surveys (Quant)

Small-sample surveys (Qual)

A/B or multivariate testing (Quant)

Card sorting (Quant/Qual)

Quantitative usability testing (Quant)

Focus groups (Qual)

Tree testing (Quant)

Eyetracking (Qual/Quant)

Predictably, the relatively lower-cost methodologies rank near the top (analytics, qualitative usability testing, interviews), while the more expensive methodologies are toward the bottom (quantitative usability testing and eyetracking, both of which can be prohibitively costly).

These results give some context to the surprisingly high frequency of quantitative studies in design projects. 74% of respondents who reported using quantitative research at least once per project also reported using analytics “often” (86 out of 117).

Analytics data can play a significant role in UX design. Unfortunately, this survey doesn’t tell us exactly how respondents were using analytics in their projects.

It’s possible that many of the analytics-heavy respondents, who reported using quantitative research at least once per project or sometimes, used that analytics data in a meaningful way to guide their design projects — for example, to help them identify problem areas in the product.

However, it’s also possible that those respondents were simply reporting that their teams collected analytics data in every project, not that it had any real significance for their work. This possibility is supported by some of the open text-field comments from our respondents.

“We don’t have funding for more advanced quantitative research methods (beyond click tests, surveys, etc.)”

“My company started as an A/B testing and CRO company and hasn't evolved their thinking beyond conversion rates.”

“My manager has access to the analytics and measures success on traffic within our products/sites rather than interpreting the numbers to extract meaningful insights.”

“They're the only metrics we have access to. We don't collect any usage data, so sales and revenue exclusively drive product decisions.”

Success Criteria

User research (not just quantitative, but qualitative too) can help us determine whether our designs work as we want them to, and whether we’re meeting our goals.

We asked respondents in our survey how they know when a design project is successful. In a multiselect, they could choose from 5 options:

Based on calculated improvements using metrics and quantitative research (like quantitative usability testing, NPS, and analytics)

Based on observed improvements through qualitative research (like qualitative usability testing and interviews)

As long as the leadership/executives are happy, the design changes are counted as a success

We don’t really know

Other

In an ideal world, success would be evaluated by both qualitative and quantitative data — observations and measured results. That would tell us if we’re hitting our goals and making our users happy, which would make the leadership or stakeholders happy as a natural consequence.

Unfortunately, those aren’t the results we found: only 24% of respondents checked both of the options covering quant and qual research. While quant and qual data were each prioritized by over 40% of our respondents, so were happy executives/stakeholders. Additionally, a disheartening 18% of respondents admitted that they “don’t really know” whether or not their design changes are actually improvements.
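As a concrete illustration of one of the quantitative success metrics mentioned above: NPS is computed by subtracting the percentage of detractors (ratings 0–6) from the percentage of promoters (ratings 9–10). A minimal sketch in Python; the ratings below are made up for illustration:

```python
def net_promoter_score(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical batch of 0-10 "how likely are you to recommend us?" ratings:
# 5 promoters, 2 passives (7-8), 3 detractors
print(net_promoter_score([10, 9, 9, 8, 7, 6, 5, 10, 3, 9]))  # 20.0
```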

Quant Research Challenges

We often hear complaints from UX teams that they want to do more quant research, but too many insurmountable obstacles get in the way. To capture this in our survey, we asked respondents to choose their quant-research challenges from a multiselect list of nine of the concerns we hear most frequently, plus “other” (with a write-in field) and “I’m not sure” options.

Quantitative research is too expensive

Quantitative research is too time-consuming

Difficulty recruiting enough participants for large sample sizes

Lack of knowledge on the team about how to conduct or analyze quantitative research

Lack of knowledge on the team about what quantitative research is, when to use it, or what the methodologies are

Lack of understanding of the value of quantitative research

Lack of understanding of the value of research in general — not just quantitative research

Difficulty interpreting or reporting quantitative research findings

Another group in the organization is responsible for quantitative research, and UX isn’t included

Other

I’m not sure

Difficulty recruiting large samples was the most popular response (37%). Some respondents reported struggling to collect large samples because their end users were blocked by gatekeepers. For example, one respondent who works on an enterprise product explained, “We rely on Product Management to decide when we can or cannot contact customers who may not want to offer their employees' time.”

After difficulty recruiting, the rest of the options had fairly similar rates of selection (16–29%). Only 2 respondents out of 429 reported performing at least one quantitative study per project and having no significant concerns. The primary takeaway here seems to be that almost everyone struggles with quantitative research in some way — even those who reported doing quantitative research frequently.
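The sample size a quant study needs depends on the precision you want. For estimating a proportion (say, a task-success rate), the standard normal-approximation formula gives a rough recruiting target. A minimal sketch, assuming a 95% confidence level and the worst-case variance (p = 0.5):

```python
import math

def required_sample_size(margin_of_error, z=1.96, p=0.5):
    """Participants needed to estimate a proportion within +/- margin_of_error.

    Uses the normal approximation. p=0.5 gives the most conservative (largest)
    estimate; z=1.96 corresponds to a 95% confidence level.
    """
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

print(required_sample_size(0.05))  # 385 -- within +/-5 percentage points
print(required_sample_size(0.10))  # 97  -- within +/-10 percentage points
```

Loosening the required precision shrinks the sample quadratically, which is one practical lever when recruiting is the bottleneck.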

Ignorance as a Roadblock

Notably, lack of knowledge about quant methods and analyzing quant data ranked towards the top of this list. Quant research can be intimidating to UX professionals, their teams, or their stakeholders, and ignorance is a substantial roadblock.

“[Our challenge is,] in particular, the advanced math behind A/B testing”

“Lack of data scientists (1 at the moment) and limited quant skills (or training) for qualitative researchers. Can be off-putting […] when there are better people who correct you too ;)”

Several respondents working in consultancies or agencies complained that they struggled to “get client buy-in” on quant research.

“Our UX team understands the value of it, and how/when to use it. The product teams in the rest of the organization, however, is a different story...”

“Owner does not see the value in quantitative research. Crazy, I know.”
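For context, the “advanced math” behind a basic A/B comparison is often just a two-proportion z-test, which fits in a few lines. A minimal sketch using only the standard library; the conversion counts are hypothetical:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B conversion comparison.

    Returns the z statistic and a two-sided p-value (normal approximation,
    pooled standard error).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical data: 120/1000 conversions for variant A vs. 150/1000 for B
z, p = two_proportion_z_test(120, 1000, 150, 1000)
# z is about 1.96, p about 0.0497 -- just under the conventional 0.05 threshold
```

This doesn’t remove the need to understand the assumptions behind the test, but it does show that the arithmetic itself is not the hard part.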

Despite these challenges, there’s some good news in these results: only 16% of respondents found quant UX research to be too expensive. Obviously, whether or not something is considered “expensive” depends on three factors: the cost, the cost–benefit ratio, and the available budget. (Even research with a favorable cost–benefit ratio will be too expensive if the cost exceeds the available budget.)

A decade ago, it was the common understanding in the UX field that quantitative studies were expensive and reserved for extravagant, well-funded projects in big companies. We’ve always advocated discount usability instead of deluxe usability methods, in order to get user research more widely used.

However, while cheap methods are still great and should account for the majority of research on a design project, quantitative methods are less of an unaffordable luxury than they used to be, for three reasons:

The cost is down, due to improvements like remote research services (e.g., UserZoom) and automated data collection.

The cost–benefit ratio is more favorable, because quantitative findings are used as more than just vanity statistics — instead, they’re often used for longitudinal tracking, demonstrating ROI, and triangulation with qualitative findings.

User-research budgets in general are growing, as more companies move to higher levels of UX maturity.

Your Quant To-Do List

Based on these findings, we have the following recommendations for you to consider on your next project.

You can probably afford to do some quant research, so plan for it. Even if that means starting with the cheaper or more lightweight methods like analytics, that’s ok. Getting your team or your client started with quant research is the important thing, and you can work to expand your quant methodologies as you build up expertise.

As early as possible, consider how you can get the necessary sample size, since this is the problem holding most teams back and shouldn’t be left to the last moment. If you struggle with getting around gatekeepers who block your access to users, it might take some networking to convince them of the importance of your research. If recruiting is a constant obstacle for your team (and you have the resources), consider dedicating an in-house part-time or full-time recruiter to the job.

Before even starting the project, educate yourself and your team on the available quant methods, the appropriate uses for each, and how to interpret findings and turn them into action items. Don’t let “lack of knowledge” stand in your way.

Don’t just use the most popular methods, but consider when some of the more specialized methods will be more valuable to answer specific design questions.

Combine methods. In particular, use a combination of quant and qual studies to inform each other, increasing the effectiveness of both.

Plan how you will judge the success of your design at the very beginning, preferably using a combination of quantitative data and qualitative observation. That way you’ll join the elite 24% of UX practitioners who do this right.

To learn about the value of quantitative research, quantitative methodologies, and how to choose between them, check out our full-day seminar, Measuring UX and ROI.