
OPINION | 16 May 2016

Face up to it: emotion detection transforms market research


Advances in facial recognition technology are making it a valuable tool for researchers, says Confirmit’s Terry Lawlor.

Understanding emotions is hugely powerful in market research. We all know that facial expressions are strongly linked to emotions, and research organisations have analysed facial responses by human observation of video for many years.

However, human assessment has its limitations. To overcome these, technologists have been striving to deliver effective emotion-detecting technologies that can provide much deeper insight about personal sentiment and reaction. This has been a notoriously difficult journey, but with commentators declaring 2016 ‘The Year of Emotion’, is this all about to change?

The power of emotion

Emotions, even fleeting ones, have the capability to reveal a person’s beliefs and their propensity to act or buy. For researchers, being able to capture these emotions through minute facial expression changes can be incredibly powerful – informing tailored advertising, customer loyalty and other programmes based on reactions at the point of experience. This knowledge not only gives researchers a greater understanding of behaviour patterns but also helps predict likely future actions of that consumer.

Combined with data from other areas of research, emotion recognition can deliver an unprecedented level of insight into what impacts customer emotions. This can, in many cases, drive improvements in product and service offerings and experiences. It also offers a way for researchers to circumvent the continuing decline in response rates: emotion detection is a highly engaging method of gathering feedback and is being used increasingly to complement panels, focus groups and surveys.

Application to research

The range of emotion detection applications out there is already vast, used by researchers primarily for ad testing. Traditionally, respondents answer questions about the advert they’ve been shown, rating it on various scales. While broadly effective, this relies on respondents’ ability to recall what they’ve just seen, to interpret their own emotions, and to express those emotions in the form the researchers need. Of course, researchers can also observe and record emotions while video content is being shown, but this requires specific skills and is difficult to perform consistently.

Technology that monitors facial expressions bypasses these issues by capturing data as the respondent views a video. With a traditional view-then-report approach, some fleeting emotions may not even be recognised by respondents, who are more likely to remember how they felt at the more memorable points in the advert, and at the end. If the emotions of the respondent are being observed, the risk is that different observers may interpret emotions differently. Using technology throughout the viewing stage removes this risk, enabling advertisers to understand how the tiniest elements of their video may impact audience response.
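The advantage over view-then-report can be sketched in a few lines: if a classifier emits a confidence per emotion for every frame, the peak moment of each emotion is recoverable even when it was too fleeting for the respondent to recall. The emotion labels, probabilities and frame rate below are hypothetical placeholders, not any specific vendor's output.

```python
# Hypothetical per-frame output of an emotion classifier: one confidence
# value per basic-emotion label for each frame of the advert.
EMOTIONS = ["joy", "surprise", "sadness", "anger", "fear", "disgust", "contempt"]

def peak_moments(frames, fps=25):
    """For each emotion, return (timestamp in seconds, confidence) at the
    frame where the classifier's confidence for that emotion peaked."""
    peaks = {}
    for emotion in EMOTIONS:
        best = max(range(len(frames)), key=lambda i: frames[i].get(emotion, 0.0))
        peaks[emotion] = (best / fps, frames[best].get(emotion, 0.0))
    return peaks

# A fleeting flash of surprise at frame 2 — something a respondent might
# never mention in a post-viewing survey — is still captured here.
frames = [
    {"joy": 0.1, "surprise": 0.2},
    {"joy": 0.1, "surprise": 0.3},
    {"joy": 0.2, "surprise": 0.9},
    {"joy": 0.6, "surprise": 0.1},
]
print(peak_moments(frames)["surprise"])  # (0.08, 0.9)
```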

This type of technology builds a huge reference database of expressions against which to judge the face being viewed. Researchers can then compare the aggregate emotional performance of their video clip against a benchmark.
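The benchmarking step described above can be illustrated with a small sketch; the engagement scores and the category norm here are invented for illustration, and real providers will use far richer models.

```python
def aggregate_score(respondent_scores):
    """Mean emotional-engagement score across the respondent sample."""
    return sum(respondent_scores) / len(respondent_scores)

def vs_benchmark(clip_scores, benchmark_mean):
    """Express a clip's aggregate emotional performance relative to a
    category benchmark, as a percentage above or below the norm."""
    agg = aggregate_score(clip_scores)
    return (agg - benchmark_mean) / benchmark_mean * 100

# Hypothetical engagement scores for one test clip against a norm of 0.50:
# the clip lands 12% above the benchmark for comparable ads.
print(round(vs_benchmark([0.62, 0.55, 0.48, 0.59], 0.50), 1))  # 12.0
```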

Is emotion global?

Perhaps the biggest challenge with emotion recognition is global representation. With an increasing number of research programmes conducted on a global scale, and localised programmes relying on multicultural respondent samples, ensuring consistency while understanding regional nuances may seem like an impossible task.

Academic research over the past 10 years has shown that, contrary to the Darwinian theory of ‘universality’ in emotional expression, facial emotional responses cannot be easily categorised across different cultures. An expression of ‘disgust’ in Canada, for example, does not necessarily mirror the same reaction in China. This is because reactions are made up of ‘macro’ and ‘micro’ expressions, and are also influenced by genetics, relationships and other complex factors.

But it is possible to tailor and fine-tune emotion recognition technologies for global programmes. Many providers will have worked hard during the set-up period to capture and categorise the way in which different nationalities and cultures react to the same visual prompts.

However, set-up needs to go deeper than this, because it is not only facial expressions that vary from culture to culture, but also the level of emotional response that people of different nationalities display.

Facing the future

Like every next ‘big thing’, emotion detection software is constantly evolving, and will only improve as further investment is made in it. The technology is not designed to replace tried-and-tested research tools; rather, it is an addition to the toolkit from which experienced researchers can select specific, tailored solutions.

As with most advances of the past decade – mobile, social analytics, text analytics, beacon technologies, and more – emotion detection will find its place and help forward-thinking researchers to continue to add value to the increasingly strategic services they provide to their customers.

Terry Lawlor is executive vice-president of product management at Confirmit

2 Comments

3 years ago

At the moment facial recognition software is entirely full-facial and seems to be limited to just a handful of emotions. Are we over-promising here with something so rudimentary? Even Ekman, the father of this research, postulated many more emotions than the 6-7 in these models. And hand on heart, many marketers must wonder about the value of emotions like hate and anger when viewing ad campaigns that are hardly intending to challenge our emotions on these dimensions. More care is needed here, not gushing sales-driven claims.


3 years ago

There are pros and cons of using facial recognition software in market research. On one hand, as pointed out by Chris, the Ekman model defines a relatively small set of evolutionary emotions that are expressed consistently across cultures, while humans have many thousands of ways of expressing their emotions and there are multiple models of emotion. The upside of this approach, though, is the simplicity, universality, and scientific validity of the measurements. There is a clear temptation to measure what marketeers want to hear, but the basic set of 7 emotions is valid, applicable across cultures, and measures what it claims to measure. Hence it can be used as a solid foundation for further analysis and models. However, there is generally a misconception around the names used in such models, as exposed by Chris: Anger is not anger as such, but rather a placeholder for a dimension of anger that can take different forms, especially depending on the secondary emotions present (https://en.wikipedia.org/wiki/Contrasting_and_categorization_of_emotions). Hence the article would be better served with a deeper explanation of how these primary emotions can be interpreted for the purposes of marketing research, based on the psycho-evolutionary theory of emotions.
