OPINION
17 July 2017

‘OK Google, start the survey’


Alexa, Siri, Cortana. Are these all new players in the field of research, asks Zoe Dowling.


I recently played a game of 20 questions with Alexa and my seven-year-old niece. There was much hilarity, and a great deal of fascination, at interacting with this sometimes very smart, sometimes not-so-smart bot. Despite some hiccups, seeing the ease with which a seven-year-old picked up the rules of engagement with Alexa, and the ensuing ‘conversation’, was a clear glimpse of the future.

The voice-controlled technology behind the likes of Amazon’s Alexa, Apple’s Siri and Google Home marks a new and interesting stage in the way humans interact with technology. Brands are already asking how they can latch on to it. The most obvious application is a route to single-voice-command shopping, but many other possibilities are yet to be uncovered. Looking closer to home, how, as researchers, can we grasp this technology and use it fruitfully for market research?

Voice technology is quickly getting stronger and smarter. Gartner reports that, in less than a year, 30% of our interactions with technology will be through conversations. The big names, Google and Amazon (the latter controlling 70% of the market), will no doubt continue to invest in voice technology and push mainstream adoption.

The standalone voice-activated device market (Google Home, Amazon Echo and the like) is forecast to grow by nearly 130% this year, according to eMarketer. Notably, this figure doesn’t include voice assistant software such as Siri and Google Now, which are used by one in four smartphone users.

This rapid growth undoubtedly spells opportunity for other industries to create new uses for the technology. Within the market research industry, we need to ask what the rise of the voicebot will mean for data collection. Could they be a new mode for generating quality insight?

Is Alexa’s personality powerful?

Do machines have a personality? Certainly, the voice behind the machine can influence responses in the same way the voice of the telephone interviewer can introduce bias. While the voicebot’s tone is directed by algorithms and enhanced by tech advancements, we need research to explore the voice’s impact in creating a connection with the respondent, and how that connection might differ across voicebots.

Is Siri’s voice more trustworthy?

Another important consideration is how comfortable respondents will feel providing answers to sensitive questions. Do you trust the voice behind Siri (originally provided by Susan Bennett)? What about the voice behind the Sainsbury’s self-checkout? Sure you trust her: you willingly hand her your money. Of course voicebots are just machines, but they are personified ones, and it would be foolish not to consider the limitations in respondent comfort levels.

Does Google Home respect your privacy?

Privacy is already a concern with voicebots, and any privacy concern in our industry is a significant issue. Take Burger King’s TV ad featuring the phrase ‘OK, Google’, which was designed to prompt Google Home devices to describe the Whopper burger from its Wikipedia page. The stunt quickly backfired: consumers made some mortifying edits to the Wikipedia entry, causing Google Home to read out that the Whopper is made of ‘medium-sized child’ and contains ‘cyanide’. Even though the ad was pulled within hours, it had already received widespread coverage, reminding us how new this technology is and how susceptible it is to flaws. The episode also highlighted a false sense of privacy: the idea that anyone, or even anything, may be listening in on a personal statement is unsettling. As respondents’ answers disappear into the digital ether, questions will arise as to where the information is going, where it is being stored and how it is being used.

Does Cortana talk too much?

Survey length is another factor: what will be acceptable to respondents? Will respondents tolerate only two-to-three-question micro surveys, or can we push to five minutes or more? Furthermore, what will respondent engagement look like for a survey designed for this mode? How much will respondents be able to talk back? Will this increase our ability to ask open-ended questions? Would those answers provide richer data? All are areas for investigation.

A voice portal to research?

Despite the many questions surrounding voicebots, the potential is apparent. We can even think beyond surveys to more ethnographic or qualitative approaches, where Alexa conducts research over the course of a number of days. And then there is gamification. Going back to that game of 20 questions with Alexa: it is bizarre, but very engaging. How can we capitalise upon that?
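To make the survey idea a little more concrete, here is a minimal sketch of how a single voice survey question might be handled as an Alexa skill, using Amazon’s Alexa Skills Kit SDK for Python. The skill wording, the ‘AnswerIntent’ intent and the ‘rating’ slot are illustrative assumptions, not part of any existing prototype mentioned in this piece.

```python
# Minimal sketch of a one-question voice survey as an Alexa skill,
# using the Alexa Skills Kit SDK for Python (ask-sdk-core).
# The question text, "AnswerIntent" and the "rating" slot are hypothetical.

from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type, is_intent_name

sb = SkillBuilder()


class LaunchHandler(AbstractRequestHandler):
    """Opens the session and asks the first survey question."""

    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        question = ("Thanks for taking part. On a scale of one to ten, "
                    "how likely are you to recommend us to a friend?")
        return (handler_input.response_builder
                .speak(question)
                .ask("You can answer with a number from one to ten.")
                .response)


class AnswerHandler(AbstractRequestHandler):
    """Captures the spoken answer from the 'rating' slot and ends the survey."""

    def can_handle(self, handler_input):
        return is_intent_name("AnswerIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots
        rating = slots["rating"].value if "rating" in slots else None
        # In a real skill the answer would be written to a data store here.
        return (handler_input.response_builder
                .speak(f"Thanks, I've recorded {rating}. That's the end of the survey.")
                .set_should_end_session(True)
                .response)


sb.add_request_handler(LaunchHandler())
sb.add_request_handler(AnswerHandler())

# Entry point for an AWS Lambda deployment of the skill.
lambda_handler = sb.lambda_handler()
```

In practice the captured answer would be pushed to a survey platform or database, and the interaction model (intents, slots and sample utterances) would be configured separately in the Alexa developer console.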

Zoe Dowling is lead research strategist at FocusVision 

2 Comments


I now have a couple of Google Homes (GH) and have had the Echo since it launched. I just do not get how anyone honestly comparing the Echo to the GH can say they are close. The GH is completely different, IMO. The Echo uses commands, so you memorize how to ask it something. An example for songs: it has a "song goes like" command if you do not know the name. With the Google Home you just talk like you would to your wife. The GH is the most human-ish technology I have used. I am starting to use a shorter English, as the inference is so incredible with the Google Home. So I say "hey google play sting gwen bottle on tv" and Google figures out that I want to watch a video of Gwen Stefani and Sting singing "Message in a Bottle" on my TV. It then turns the TV on, sets the proper input, and the video starts playing. Our brains' inference capabilities allow us to communicate with one another in a compressed manner: information can be inferred rather than said. This is what Google is doing, and for some (many?) things it can do this better than a human. Maybe it is because I have an engineering background, but what Google is doing with the Google Home from a technology standpoint just blows me away. The GH is also picking up some nuances. I started talking to it and then said "I forgot", and she said "no worries, happens to me all the time". Another time I said "nevermind" and it indicated "Yes, let's move forward". It gets the "I". The answers change; I guess from Google crawling the web all these years it learned how to respond to humans in a human-type manner. I am often blown away by the AI in the Google Home (GH).



We've been working on a few prototypes the past several months. Here is the first with Alexa/DOT: https://www.linkedin.com/pulse/check-voice-enabled-survey-amazon-echo-dot-e-david-zotter?articleId=6240691839934959616#comments-6240691839934959616&trk=prof-post
