FEATURE
27 September 2011

More than a game

Experiments by Engage Research and GMI show how gamification can help researchers improve feedback from online surveys. Engage’s Deborah Sleep reports.


Have you ever missed your train stop because you were absorbed in your crossword or magazine quiz? Whiled away a rainy afternoon with cards or a board game? Played the slot machine on a seaside pier? Or do you own one of the 22 million games consoles in use in the UK?

Before we started thinking about making surveys more like games, we knew about the negative effect that bored respondents can have on the information you get from surveys. We’ve developed effective techniques for overcoming this through better design, language and imagery, using more creative questioning techniques and applying learning from behavioural science.

Over the course of our experiments it became clear that the more respondents enjoyed the experience, the more feedback we received. Once we observed this correlation between the amount of fun respondents had completing a survey and the quality – not to mention quantity – of the feedback they gave us, we began to look to games for further inspiration.

The year-long study conducted by Engage Research and GMI involved more than thirty research experiments undertaken with more than 5,000 participants on behalf of six clients: Sony Music, Allianz Insurance, AMS Media Group, Heinz, Kimberly-Clark and Mintel Research. The findings could change the way online research is conducted.

“Instead of asking respondents how much they liked a brand, we asked them how happy they would be to wear a given brand on their T-shirt”

We began by exploring how questions could be redesigned to be more fun and game-like in nature, and what impact this would have. We experimented with the wording of questions to humanise them, make them more engaging and link them to potential real-world emotional experiences. For example, instead of asking somebody to tell us what clothing they liked to wear, we asked them what clothing they would wear for a first date. Instead of asking them where they liked to go on holiday, we invited them to imagine that they had to publish a magazine offering holiday recommendations, and instead of asking respondents how much they liked a brand we asked them how happy they would be to wear its name on their T-shirt. The results were instructive: we got two or even three times as much feedback from the more engaging questions, and participants consistently took more time providing their answers.

Rules can transform a boring task into a game. How does a ten-mile hike in the rain carrying a 15kg rucksack sound? But what about a game of golf? We explored how rules that we all know from playing games could be adapted to turn questions into puzzles. A question such as "Describe yourself" yielded an average of 2.4 descriptive words, with 85% of respondents answering. When that question was changed to present the challenge "Describe yourself in exactly seven words", the average rose to 4.5 descriptors and the response rate rose to 98%.
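As a concrete illustration, the "exactly seven words" rule is straightforward to enforce in an online survey front end. This is a minimal sketch under our own assumptions; the function name and the word pattern are illustrative, not part of the study:

```python
import re

def is_exactly_seven_words(answer: str) -> bool:
    """Return True only if the answer contains exactly seven words.

    Words are taken to be runs of letters, apostrophes or hyphens,
    so punctuation between them is ignored.
    """
    words = re.findall(r"[A-Za-z'-]+", answer)
    return len(words) == 7
```

A survey page could keep the submit button disabled until the rule is met, turning an open text box into a small puzzle.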

We also spent some time looking at the way many video or online games are structured to provide the player with tasks which when put together are turned into a series of quests. Our study found that by adding a motive to answer a question we could improve the response rates. For example, we asked respondents how much they liked each of a list of music acts. Typically, this yielded evaluations of 83 artists. Not bad. However, when we asked them to imagine that they owned their own radio station and to decide which of the artists they would put on their station’s playlist, respondents were willing to spend longer deliberating their answers and the average number of artists evaluated rose to 148.

Most game playing, of course, taps into our competitive spirit, so we explored ways in which we could add a more competitive framework to questions. When we asked respondents to make a list of their favourite foods, we received an average of six items in response. When we told them they had two minutes to make a list of their favourite foods, not only did respondents spend the full, arbitrary two minutes we had allocated, but the exercise produced an average of 35 items in reply. That can make a major difference to the quality and insight a brand receives from its online research.

“When we asked respondents to make a list of their favourite foods, we received an average of six items in response. When we told them they had two minutes to do it, the figure rose to 35”
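The two-minute listing task above can be sketched as a simple deadline loop. This is an illustrative sketch, not the survey software the study used; the clock is injectable so the behaviour can be exercised without waiting:

```python
import time

def timed_list(seconds: float, reader, clock=time.monotonic) -> list[str]:
    """Collect entries from `reader` until the allotted time runs out.

    `reader` is any zero-argument callable returning the next entry
    (for example, a wrapper around input()); `clock` returns the
    current time in seconds. Blank entries are skipped.
    """
    deadline = clock() + seconds
    items = []
    while clock() < deadline:
        entry = reader().strip()
        if entry:
            items.append(entry)
    return items
```

In a real survey page the countdown would also be shown on screen; the visible timer, not just the cutoff, is what nudges respondents to keep listing.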

Rewards also have a major role to play. Most games have mechanisms for rewarding players, and we explored how we could incorporate these into surveys. A typical question such as “What emotions do you think people associate with…?” produced an average response time of eight seconds and a 50% reported enjoyment rate. When we told respondents they would win a point for every answer they guessed right, time spent rose to 12 seconds and 90% said they enjoyed the experience.
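The point-per-correct-answer mechanic can be expressed as a small scoring function. A minimal sketch, assuming the researcher holds a predefined set of accepted answers; the names and the normalisation rules are our own:

```python
def score_guesses(guesses, accepted) -> int:
    """Award one point per distinct guess found in the accepted set.

    Comparison is case-insensitive and ignores surrounding whitespace,
    so "Joy" and " joy " count as the same, single guess.
    """
    accepted_norm = {a.strip().lower() for a in accepted}
    guessed = {g.strip().lower() for g in guesses}
    return len(guessed & accepted_norm)
```

Echoing the running score back after each guess provides the instant feedback that respondents in the study reported enjoying.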

But it’s not just about language and question wording – design and visual stimuli are also important. We experimented with a range of fun selection processes. For example, on the playlist question, respondents could place a ‘Banned’ sticker over an artist’s name. When we gave respondents a packet of crisps and a series of labels to describe it, ranging from “well designed” and “good colours” to “unoriginal”, we saw 15% more activity (measured in clicks) and respondents reported having 50% more fun. The game concept became even more popular when we gave instant feedback, for example by confirming correct suggestions on screen – 80% of respondents said they enjoyed this approach.

With the theory established, the next challenge was to put it into practice. We worked with clients from different industries to see how we could apply these techniques to the types of research they typically conduct. Experiments included media diaries, usage and attitude studies and co-creation-style questions, using a range of game approaches to great effect. Across the board we saw consistently high completion rates and enjoyment scores, respondents willingly spending significantly more time answering, and a greater quantity and quality of response to our gamified questions. The research showed that almost everyone will respond to game mechanics, and with the right mechanics for your audience and research needs these game-inspired techniques really deliver.

Deborah Sleep is a director of Engage Research. She and Jon Puleston, vice president of innovation at GMI, won the best methodological paper award at this year’s Esomar Congress for their paper The Game Experiments.

14 Comments

13 years ago

Great article. Where does "gamification" change the nature of the response rather than just enhancing it? I was struck by your example of clothing - favourite clothing vs clothing for a first date. These could be very different. The question to me is different, not just the playfulness of the expression. Play per se can be fun, but it can change our actual perceptions through the game - a source of additional cognitive bias, perhaps?


13 years ago

''When we asked respondents to make a list of their favourite foods, we received an average of six items in response. When we told them they had two minutes to do it, the figure rose to 35'' – the old-school equivalent of this is to increase the size of the write-in box for open-ended responses. The interviewer feels obligated to probe more!


13 years ago

Interesting.... How do clients react to this approach? Do most embrace the idea of engaging the audience more or do many reject this approach as unscientific/not serious enough etc? Regarding Edward's comment about changing rather than enhancing the response....this assumes that a typical questionnaire elicits a 'true' response. I'm of the school of thought that responses often vary by situation. A questionnaire about brand favourability of breakfast cereals might not mirror my behaviour in the aisle at the supermarket, for example.


13 years ago

Edward has a good point – in some instances, the fun element you introduce may well change the nature of the response. We have other examples of this. It’s a question of understanding what effect the question might have and interpreting the data in light of it. In some instances it will be better to have a richer response to a slightly different question than you might otherwise ask!


13 years ago

In response to Matt, we are finding clients to be increasingly receptive. The initial work we did several years back proved the negative impact of disengaged respondents. These game applications are the latest set of tools and techniques we can use to overcome this. And in response to Pete – I absolutely agree! It's one of a number of ways you can explicitly or implicitly set an expectation!


13 years ago

To Matt's comment: responses are likely to be determined by context, sure. But adding another layer of potential interpretation on top makes the task of interpretation more challenging, and potentially also allows for even more "bias" via the perspective of the interpreter. A fractured-glass-versus-refracted-truth sort of concern.


13 years ago

On the subject of "gaming" the survey-taking process: personally, I would always prefer an honest response to a slightly different question over a dishonest response to the original question.


13 years ago

Deb, I absolutely love this article. I have been pushing for the importance of this approach internally and this is really helpful. Thanks for sharing! I shall be purchasing the full paper! Jonny



13 years ago

Interesting approach! Anything we can do to engage the average online respondent is valuable. I do wonder if there aren't major differences in how respondents in different demographic categories will respond, however. For many of your examples, I pictured respondents in younger age groups.

