FEATURE
12 June 2018

Virtual behaviour: measuring emotion in VR


The mass market tipping point for virtual reality may be several years away, but technological advancements from Isobar in conjunction with MIT Media Lab mean user experiences can now be more accurately measured. By Jane Bainbridge.


It’s immersive, intense and all-consuming – the emotional impact of virtual reality (VR) is evident to anyone who’s put on a headset. But for brands to see this technology as anything more than the latest gimmick, a more scientifically replicable emotion-measurement technique is required. This is exactly what Dave Meeker, vice-president of digital marketing agency Isobar US, has been working on with a team of collaborators at MIT Media Lab.

Isobar’s owner, Dentsu Aegis, has been a consortium member of the Media Lab for more than six years, so the two had an established history of collaboration. But the need for this kind of robust measurement was driven by clients. “The question of how you measure VR continued to surface, and without being able to quantify the experience, why would brands invest in it?” asks Meeker. “We know that there’s a payback, purely from the visceral reaction you get when users step inside these worlds, but there was no hard science behind the measurement.”

Different teams at the Media Lab were working on different aspects of virtual and mixed reality. This overlapped with what Isobar was trying to do: tracking users to understand their behaviour, and putting multiple users into the virtual space together.

Meeker’s team worked with Pattie Maes and Scott Greenwald of the Fluid Interfaces Group at MIT Media Lab to integrate the group’s behavioural tracking code with Isobar’s biometric neuroscience and measurement work.

Isobar already had its proprietary tool, MindSight, for accessing the emotional brain in real-life environments, which could be incorporated into this work.

It then integrated this with iMotions’ biometric research platform, which measures human emotional response to visual stimuli using biometric signals such as eye-tracking, electroencephalography (EEG), galvanic skin response (GSR), electrocardiography (ECG) and facial electromyography (EMG).

A wrist sensor was used to measure heart rate and GSR, and all the sensor data was collected and processed algorithmically.
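To make this concrete, the sketch below shows one common way such wrist-sensor data can be processed: smoothing a galvanic skin response trace and counting skin conductance responses (discrete arousal events). It is an illustration only, assuming a NumPy/SciPy stack with an invented sampling rate and threshold; it is not Isobar’s actual pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

SAMPLE_RATE_HZ = 32  # assumed wrist-sensor sampling rate (hypothetical)

def count_arousal_events(gsr_microsiemens: np.ndarray) -> int:
    """Count skin conductance responses (SCRs) in a raw GSR trace."""
    # Subtract a one-second moving average to remove the slow tonic
    # level, leaving the fast phasic component linked to arousal.
    window = SAMPLE_RATE_HZ
    tonic = np.convolve(gsr_microsiemens, np.ones(window) / window, mode="same")
    phasic = gsr_microsiemens - tonic

    # Treat peaks above a small amplitude threshold, at least one second
    # apart, as discrete arousal events. Both values are illustrative
    # assumptions, not published parameters.
    peaks, _ = find_peaks(phasic, height=0.05, distance=SAMPLE_RATE_HZ)
    return len(peaks)
```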

“We take the biometrics in real time on every frame of the VR experience – these run at between 60 and 90 frames a second. We know what you’re looking at, what our biometric data says, what you’re interacting with. We can see if users pick up objects, how fast they move from space to space, how they are navigating, what they’re interested in and what they’re not interested in,” says Meeker.
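As an illustration of what per-frame capture like this might look like in code, the sketch below stamps every rendered frame with the user’s gaze target, what they are holding and the latest biometric readings. All names here are hypothetical; this is not Isobar’s code.

```python
from dataclasses import dataclass, field
from typing import Optional
import time

@dataclass
class FrameSample:
    timestamp: float
    gaze_target: str            # object the user is currently looking at
    held_object: Optional[str]  # object picked up, if any
    heart_rate_bpm: float
    gsr_microsiemens: float

@dataclass
class SessionLog:
    frames: list = field(default_factory=list)

    def record(self, gaze_target, held_object, heart_rate, gsr):
        # Called once per render-loop iteration (60-90 times a second),
        # so the biometric stream can later be replayed frame by frame
        # alongside what the user was looking at and doing.
        self.frames.append(FrameSample(
            timestamp=time.monotonic(),
            gaze_target=gaze_target,
            held_object=held_object,
            heart_rate_bpm=heart_rate,
            gsr_microsiemens=gsr,
        ))
```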

“What’s most fascinating, and what we haven’t achieved yet in full, is taking all the data we’re capturing – demographic, behavioural and biometric – and giving it back to the team at the MIT Media Lab so they can try to gain insight from it. Could we take how you’re responding to content and tune it so it’s most meaningful to you individually? It’s about applying machine learning to that data set to drive the VR experience,” he says.
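Since this step is still a goal rather than a finished system, the following is only a toy sketch of the idea: a model trained on captured features (heart rate, arousal events, gaze dwell) predicts which candidate scene will most engage a given user, and the experience branches accordingly. The data, features and scikit-learn model are all invented for illustration.

```python
from sklearn.ensemble import RandomForestClassifier

# Toy training data, one row per past scene segment:
# [mean heart rate (bpm), GSR arousal events, gaze dwell (seconds)]
X_train = [[72, 1, 0.5], [95, 6, 4.0], [70, 0, 0.2], [98, 5, 3.5]]
y_train = [0, 1, 0, 1]  # 1 = user was engaged, 0 = not

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def choose_scene(candidate_features: dict) -> str:
    """Pick the scene the model predicts is most likely to engage the user.

    candidate_features maps a scene name to the biometric feature vector
    we expect that scene to produce for this user.
    """
    return max(candidate_features,
               key=lambda s: model.predict_proba([candidate_features[s]])[0][1])
```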

“We found VR is a very strong tool to create empathy and to really do some of the things we want as storytellers.” 

Meeker says, initially, they were trying to measure VR in a similar way to measuring web experiences or mobile apps. “The biggest challenge of measuring VR was that with traditional approaches – such as copy testing or talking through user scenarios – the sheer act of participating in the process creates bias; you know you’re being watched and tested, or you’re talking out loud so it removes you from the immersion, which skews the results.”

With this approach, users – although they know they’re being tested – have no idea what the researchers are looking for in their responses.

“We also do facial coding,” says Meeker, “but it’s really hard to do with a mask on your face, so we’re looking at new types of sensors that would sit inside a headset mask. The Media Lab is working on getting a more accurate read of your facial expressions and some additional data points.”

Meeker is in no doubt that, if the costs come down, uptake will grow. “People take off their VR headset and they have a look in their eyes that says they have just returned from somewhere else.” 

This article was first published in Issue 21 of Impact magazine.
