Virtual behaviour: measuring emotion in VR

The mass market tipping point for virtual reality may be several years away, but technological advancements from Isobar in conjunction with MIT Media Lab mean user experiences can now be more accurately measured. By Jane Bainbridge.


It’s immersive, intense and all-consuming – the emotional impact of virtual reality (VR) is evident to anyone who’s put on a headset. But for brands to see this technology as anything more than the latest gimmick, a more scientifically replicable emotion-measurement technique is required. This is exactly what Dave Meeker, vice-president of digital marketing agency Isobar US, has been working on with a team of collaborators at MIT Media Lab.

Isobar’s owner, Dentsu Aegis, has been a consortium member of Media Lab for more than six years, so the two already had an established history of collaboration. But the need for this kind of robust measurement was driven by clients. “The question of how you measure VR continued to surface, and without being able to quantify the experience, why would brands invest in it?” asks Meeker. “We know that there’s a payback, purely from the visceral reaction you get when users step inside these worlds, but there was no hard science behind the measurement.”

Different teams at Media Lab were working on different aspects of virtual and mixed reality. This overlapped with Isobar’s own goals: tracking users to understand their behaviour, and supporting multiple users in the same virtual space.

Meeker’s team worked with Pattie Maes and Scott Greenwald of the Fluid Interfaces Group at MIT Media Lab to integrate the group’s behavioural tracking code with Isobar’s biometric neuroscience measurement.

Isobar already had its proprietary tool, MindSight, to access the emotional brain in real-life environments, which could be incorporated with this work.

It then integrated this with iMotions’ biometric research platform, which measures human emotional response to visual stimuli via biometric signals such as eye-tracking, electroencephalography (EEG), galvanic skin response (GSR), electrocardiography (ECG) and facial electromyography (EMG).

A wrist sensor measured heart rate and GSR, and the combined sensor streams were processed algorithmically.

“We take the biometrics in real time on every frame of the VR experience – they run at between 60 and 90 frames per second. We know what you’re looking at, what our biometric data says, what you’re interacting with. We can see if users pick up objects, how fast they move from space to space, how they are navigating, what they’re interested in and what they’re not interested in,” says Meeker.
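The frame-synchronised capture Meeker describes – pairing what the user sees and does on each rendered frame with simultaneous biometric readings – can be sketched as follows. This is a minimal illustration, not Isobar's actual pipeline; all names and the simple dwell-time metric are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class FrameSample:
    """One record per rendered VR frame (60-90 fps): gaze, interaction
    and biometric readings sampled at the same instant."""
    frame: int
    timestamp: float                      # seconds since session start
    gaze_target: str                      # object the eye-tracker reports as fixated
    heart_rate: float                     # bpm, from the wrist sensor
    gsr: float                            # galvanic skin response, microsiemens
    interacting_with: Optional[str] = None  # object picked up, if any


@dataclass
class SessionLog:
    """Accumulates per-frame samples for later analysis."""
    samples: List[FrameSample] = field(default_factory=list)

    def record(self, sample: FrameSample) -> None:
        self.samples.append(sample)

    def dwell_time(self, target: str, fps: float = 90.0) -> float:
        """Approximate seconds spent looking at a given object,
        derived from how many frames it held the gaze."""
        frames = sum(1 for s in self.samples if s.gaze_target == target)
        return frames / fps
```

Because every biometric reading is tied to a specific frame, analysts can later ask what the user was looking at when, say, skin conductance spiked.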

“What’s most fascinating, and what we haven’t achieved yet in full, is taking all the data we’re capturing – demographic, behavioural and biometric – and giving it back to the team at the Media Lab so they can try to gain insight from it. Could we take how you’re responding to content and tune it so it’s most meaningful to you individually? It’s about applying machine learning to that data set to drive the VR experience,” he says.
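The adaptive loop Meeker envisages could look something like this: estimate engagement from arousal signals, then steer the experience accordingly. The weighted-sum "model" below is a toy stand-in for a learned one, and the thresholds, weights and scene names are purely illustrative assumptions.

```python
def engagement_score(gsr_delta: float, hr_delta: float) -> float:
    """Toy stand-in for a learned model: combines normalised rises in
    skin conductance and heart rate (relative to the user's baseline)
    into a 0-1 engagement estimate."""
    raw = 0.6 * gsr_delta + 0.4 * hr_delta
    return max(0.0, min(1.0, raw))


def next_scene(score: float) -> str:
    """Steer the experience: deepen content the viewer responds to,
    change tack when engagement drops."""
    if score > 0.7:
        return "deepen_current_storyline"
    if score < 0.3:
        return "switch_content"
    return "continue"
```

In a real system the scoring function would be replaced by a model trained on the captured demographic, behavioural and biometric data, but the control flow – score each moment, then adapt the content – is the same.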

“We found VR is a very strong tool to create empathy and to really do some of the things we want as storytellers.” 

Meeker says, initially, they were trying to measure VR in a similar way to measuring web experiences or mobile apps. “The biggest challenge of measuring VR was that with traditional approaches – such as copy testing or talking through user scenarios – the sheer act of participating in the process creates bias; you know you’re being watched and tested, or you’re talking out loud so it removes you from the immersion, which skews the results.”

With this approach, users – although they know they’re being tested – have no idea what the researchers are looking for in their responses.

“We also do facial coding,” says Meeker, “but it’s really hard to do with a mask on your face, so we’re looking at new types of sensors that would sit inside a headset mask. The Media Lab is working on getting a more accurate read of your facial expressions and some additional data points.”

Meeker is in no doubt that, if the costs come down, uptake will grow. “People take off their VR headset and they have a look in their eyes that says they have just returned from somewhere else.” 

This article was first published in Issue 21 of Impact magazine.

Research Live is published by MRS.

