Look into my eyes
Eye-tracking and facial coding have made the jump online. Robert Bain looks at the web-based tools that can tap into the emotions our faces reveal.
Studying the face for market research purposes is nothing new. Analysing expressions is one of the best-established ways of gauging emotion, and eye-tracking technology allows us to record where people’s gaze is drawn in order to hone products, ads and websites.
But until recently these techniques have been unwieldy. Facial coding requires trained analysts to spend hours poring over videos of people responding to stimuli, while eye-tracking usually requires dedicated technology (either screens equipped with sensors or specialised glasses), limiting their use to labs and planned tests in the field.
Now, though, more and more people have access to the basic equipment needed to read the face: the humble webcam. Most new computers are fitted with webcams, and mobile phones come with front-facing cameras to allow video calling. As a result, the potential to tap into what the face reveals has exploded.
One of the companies taking advantage of this is Affectiva, whose co-founder Rana el Kaliouby developed software for reading facial expressions for her PhD. Initially the technology was applied to helping people with autism understand emotional communication, but consumer goods companies also began to take an interest and in 2009 Affectiva was set up to pursue its commercial applications.
El Kaliouby told Research: “A lot of companies we work with, it turns out that they already record the face because they realise it’s an important channel of information, but they never have the funding or the resources to actually code the video, so it sits there unused. That’s where we’re able to bring value.”
The way Affectiva’s technology reads expressions is based on the Facial Action Coding System, which has been the standard way of coding expressions since the 1970s. Its weakness is that it has traditionally had to be done by hand. Software that can study a video, recognise the shape of a face and track the movement of reference points such as the eyebrows, eyes and lips allows the process to be automated.
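To give a flavour of what automated coding involves, here is a minimal illustrative sketch of one step: scoring a brow raise (the kind of movement FACS describes as an action unit) from tracked landmark positions. The landmark coordinates and the threshold are hypothetical, and this is not Affectiva’s method; in practice the positions would come from a face-tracking library frame by frame.

```python
# Illustrative sketch: scoring a "brow raise" from 2D facial landmarks.
# Landmark positions here are hypothetical, hard-coded stand-ins for
# values a face tracker would produce for each video frame.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def brow_raise_score(landmarks, neutral):
    """Compare brow-to-eye distance against a neutral baseline,
    normalised by inter-ocular distance so the score does not depend
    on how close the face is to the camera."""
    scale = dist(landmarks["left_eye"], landmarks["right_eye"])
    cur = dist(landmarks["left_brow"], landmarks["left_eye"]) / scale
    base = dist(neutral["left_brow"], neutral["left_eye"]) / dist(
        neutral["left_eye"], neutral["right_eye"])
    return cur - base  # positive means the brow sits higher than at rest

neutral = {"left_eye": (30, 50), "right_eye": (70, 50), "left_brow": (30, 40)}
raised = {"left_eye": (30, 50), "right_eye": (70, 50), "left_brow": (30, 33)}

score = brow_raise_score(raised, neutral)
print(score > 0.05)  # a simple threshold flags the movement
```

A real system tracks dozens of such reference points and many action units at once, but the principle is the same: geometry relative to a neutral face, repeated for every frame.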
Affectiva’s solution is cloud-based, so all the research participant needs is a webcam and an internet connection. A browser plug-in takes the video feed and streams it back to Affectiva’s server to be analysed.
“We want to be measuring emotion in the wild and at scale,” says el Kaliouby. “Right now in the lab you can basically do 10 or 20 people, but you can’t measure affect in 4,000 or 10,000 people, so that was our vision. And the only way we felt we could do that was to build a cloud platform. You land on a site, you get prompted to turn your webcam on, and if you agree you can basically consume the content as your face is being recorded.”
The firm recently used its technology to gather more than 4,000 videos of people responding to Super Bowl ads. “There isn’t a dataset like this with facial expressions,” says el Kaliouby, “so it’s really cool.” The scale, speed and flexibility of the technology provide an answer to critics who say that online methods miss out on the face-to-face connection you get in person.
Companies including P&G, PepsiCo and Interpublic have worked with Affectiva, as well as universities and health organisations. El Kaliouby says she has seen interest in measuring emotions increase dramatically.
“When I first started this research, people were like, ‘Why would you want to measure emotions?’ Now everybody wants to measure emotions. There’s been a lot more research indicating emotions play a big role in how we take decisions.”
Another company moving into similar territory is Realeyes, which has developed facial coding software to complement its eye-tracking system, both of which will soon be available through a web-based platform. Until now Realeyes has conducted its tests using portable equipment in malls, cafés and train stations.
Realeyes’ founding partner Mihkel Jäätma says he has wanted to expand from eye-tracking to facial analysis since he set up the firm in 2007, but the technology is only now making it feasible. “Eye-tracking can do some things very well, but in other cases it falls short on its own,” says Jäätma. “People start asking things like, ‘He focused there, but was it good or bad?’ So the combination of these two different dimensions adds up to a much more valuable insight. We can’t read thoughts, but we can definitely now make a distinction between whether the user’s emotional state was positive or negative.”
Jäätma believes the demand for emotional measurement has been strong for some time, but the cost and difficulty of using the technology have held the market back. “We don’t require anything extra; it’s purely software and mathematical models. If you think about the alternatives that clients have to assess the user experience, you can have people in labs, or you can look at web analytics data which is quite sparse. The combination of being able to do it via webcam over the internet at large scale and measure subconscious behaviour in the background without affecting users’ interaction really makes for a very nice package.
“The kind of measurement that we can bring is more useful, more relevant than some of the self-reporting questions where people try to tell you whether they were actually engaged or not.”
Inevitably, taking the technology online involves a trade-off – and not all webcams are of a high enough quality to be effective for reading the face. But the firms pushing this technology believe that, for many clients, the scale and flexibility of their web-based offerings will more than compensate. Jäätma believes 30-40% of computers already have webcams good enough to use, and his company’s lab-based solution will remain available for when clients need higher levels of precision.
Realeyes is now looking at ways its technology could be used for movie editing and to enhance video-calling services, while Affectiva’s latest innovation monitors reflections of light from the face to measure people’s heart and breathing rates. Online methods may have taken over from face-to-face in some instances, but thanks to the webcam, the face has the potential to reveal more than ever before.
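The heart-rate technique el Kaliouby describes is generally known as remote photoplethysmography: blood flow causes tiny periodic changes in how the skin reflects light, which show up in a webcam’s pixel values, and the dominant frequency of that flicker gives the pulse. A minimal sketch of the frequency-analysis step, using a synthetic signal in place of real video (the face-tracking and pixel-averaging stages are assumed to have produced the signal already):

```python
# Sketch of the signal-analysis step of remote photoplethysmography:
# recover a pulse rate from average skin-pixel brightness over time.
# The input signal is synthetic; a real pipeline would average pixels
# from a tracked face region in every video frame.
import numpy as np

def pulse_bpm(signal, fps):
    """Estimate heart rate as the dominant frequency of the signal,
    searched within a plausible human range (40-180 beats per minute)."""
    signal = signal - np.mean(signal)              # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

fps = 30.0
t = np.arange(0, 10, 1 / fps)                      # 10 seconds of "video"
signal = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)  # faint 1.2 Hz pulse = 72 bpm
print(round(pulse_bpm(signal, fps)))  # → 72
```

The same idea extends to breathing rate by searching a lower frequency band; the engineering challenge in practice is isolating these faint signals from lighting changes and head movement.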
It doesn’t get much more personal than reading people’s facial expressions. So Affectiva’s Rana el Kaliouby is keen that consumers should have some control over how their data is used.
“It would be cool to redress the imbalance between companies that track your data, versus what you as an individual are able to learn about yourself.
“I think there’s an opportunity to give you feedback as an individual on how you respond and also enable you, if you wanted, to share this with market research companies or over a social network.”
But how do her clients feel about this idea? “I think they like it, because it will increase engagement with content, and increase trust too. So far companies are open to that. We’re very big on respecting people’s privacy and having them in control of their own data. Especially with emotions, that’s important.”