Gamifying Surveys to Increase Completion Rate and Data Quality

One of the biggest challenges for research involving surveys is maintaining a high rate of completion and compliance with survey requirements. First, we want a reasonably representative sample of whomever we send the survey to. Second, we want those who do complete the survey to do so honestly and thoughtfully. One approach that researchers have taken to improve these outcomes is to gamify their surveys. But does gamification actually improve data quality?

In an empirical study on this topic, Mavletova examined the impact of gamification on survey completion rate and data quality among 1050 Russian children and adolescents aged 7 to 15. In her study, Mavletova compared three conditions: a traditional text-only survey; a visually enhanced survey incorporating graphics, background colors, interactive slider bars, and Adobe Flash-based responses; and a gamified survey.

To gamify the survey, Mavletova first developed four guidelines for effective gamified assessment: 1) narrative, 2) rules and goals, 3) challenges, and 4) rewards. To realize this vision, respondents in the gamified condition set their name, chose an avatar, and followed a story in which they traveled in the Antarctic among friendly penguins. In the story, respondents were asked to tell the penguins about themselves in order to travel home. They also played mini-games between sections of the survey, and regular feedback indicated their progress through the survey.

So did all this extra effort pay off? Here’s what happened:

Total respondents (N = 1050) to the text, visual, and gamified surveys on desktops and laptops did not differ in the total number of drop-outs.

On mobile devices (N = 136), fewer respondents dropped out of the gamified survey than the visual survey, and fewer dropped out of the visual survey than the text-based survey.

Participants took the least amount of time on the text version (13.9 min), more on the visual (15.2 min), and the most on the gamified version (19.4 min). However, the gamified version was also substantially longer, due to the extra content.

When asked about the amount of time they spent, participants in all three conditions reported approximately the same subjective experience of time.

Respondents found the gamified survey easier to complete than either the visual or text surveys (however, this was analyzed by comparing response rates to “Strongly Agree” on the scale across conditions, which is a strange way to do it).

More respondents were interested in further surveys after completing either the visual or gamified surveys in comparison to the text-based surveys (although this was determined with the strange analytic approach described above).

Item non-response was highest in the gamified survey, lower in the visual survey, and lowest in the text-based survey.

When omitting Flash-based questions (which required additional technology to view), there was no difference in non-response rate between conditions.

There were no differences in socially desirable responding by condition.

Straight-line responses (answering all “c” or all “b”, for example) were more common in the text survey (11.4%) than in both the visual (2.8%) and gamified (3.2%) survey.

Extreme responses (answering “a” or “e” on a five point scale) did not differ by condition.

Middle responses (answering “c”s) were more common in the text survey than in either of the other two conditions.

There were no differences on open-ended questions with regard to length of responses, number of examples provided, or the distribution of that number.
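To make the data-quality metrics above concrete, here is a minimal sketch of how straight-lining, extreme responding, and middle responding are commonly operationalized on Likert-type data. The function names, sample data, and exact definitions are illustrative assumptions, not taken from Mavletova’s paper.

```python
# Hedged sketch: common operationalizations of "lazy responding" metrics
# on 1-5 Likert responses. Definitions are illustrative, not Mavletova's.

def straight_lined(row):
    """True if the respondent gave the identical answer to every item."""
    return len(set(row)) == 1

def response_rates(responses):
    """Return (share of respondents who straight-lined,
               share of all answers that are extreme (1 or 5),
               share of all answers that are middle (3))."""
    n_respondents = len(responses)
    n_answers = sum(len(r) for r in responses)
    straight = sum(straight_lined(r) for r in responses) / n_respondents
    extreme = sum(a in (1, 5) for r in responses for a in r) / n_answers
    middle = sum(a == 3 for r in responses for a in r) / n_answers
    return straight, extreme, middle

# Example: three respondents answering a five-item battery.
data = [
    [3, 3, 3, 3, 3],   # straight-liner (and all middle responses)
    [1, 5, 2, 4, 5],   # heavy extreme responder
    [2, 3, 4, 3, 2],   # mixed response pattern
]
straight, extreme, middle = response_rates(data)
```

Comparing these rates across the text, visual, and gamified conditions is essentially what the analyses above report.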

So is the effort to create a gamified survey with a narrative worthwhile? Gamified surveys had lower drop-out rates for mobile respondents, respondents found the survey a little easier, and some types of lazy responding decreased. However, they also had a higher item non-response rate. Most benefits were small and were also realized by the visually enhanced survey.

Overall, this is not very good news for gamified surveys. It appears that the (rather extreme) time and cost investment to develop gamified surveys did not really help much. However, because this study design did not isolate particular gamification elements, it’s difficult to say what did or did not lead to this result. Were the gains seen because of the mini-games? Were the gains seen simply because the respondents were able to take breaks between sections of the survey? Did the narrative help? We don’t really know.

In fact, the cleanest comparison available here is between the text-based survey and the visual survey, since the visual survey was essentially the text-based survey plus graphics and interactive item responses. This is a relatively small investment (relative to gamification) and thus I can recommend it more easily.

This doesn’t mean that gamifying surveys is a bad idea – it just means that gamification designed like this – with mini-games and Flash-based response scales – is unlikely to do much. This type of gamification may simply be too shallow; fancy graphics may not convince anyone, even kids, that your survey is more interesting than it really is. More transformative gamification, such as integrating the story into the questions themselves (which wasn’t done here) or taking more innovative approaches to data collection (rather than dressing up normal Likert-type scales), is an area of much greater promise.