FEATURE | 1 August 2009

Eyes wide open

Eye-tracking is moving out of the lab and into the real world, making it an increasingly tempting way to get inside consumers’ heads. Tim Phillips eyes up its potential.

If, recently, you spotted some travellers wandering around Heathrow Airport wearing oversized safety goggles wired to a shoulder bag, don’t be alarmed.

They had been recruited to spend the long hour before boarding their flight taking part in an eye-tracking survey for JC Decaux, to see whether digital media screens were at the right height and in the right places to catch their attention (they were).

If when you visit your local supermarket one of the shoppers is wandering around in a similar pair of glasses, there’s a decent chance that they’ve been recruited to Nunwood’s ‘Bioshopping’ panel, which aims to link how we look at things to how we feel about them – and what we buy as a result.

These are just two of many new studies which are attempting to take the discipline of eye-tracking, for 100 years a subject of academic research, into the real world. The proponents of eye-tracking research think they have finally created methods which are cheap, practical and effective, and powerful enough for companies like Think Eyetracking to research how two-year-olds use the internet.

Iain Janes, director of Eyetracker, who completed the study for JC Decaux and who has been conducting eye-tracking research for 27 years, says that the move out of the lab and into the real world is a step forward not just in the attractiveness of the technology to clients, but also in the accuracy of the results. “We’d like to think that people come to us because they need results rather than eye-tracking,” he says. “The laboratory ‘pretend you are in a shop’ approach has some value, but it’s not how we really buy corn flakes.”

It takes about five minutes to put on the Eyetracker goggles (originally developed by defence tech specialist QinetiQ to help train pilots). They are held in place by an adjustable strap and calibrated by looking in turn at five fixed points on the wall. From then on you have the (slightly eerie) experience of knowing that the camera in the goggles is recording a film of what is in your field of view and exactly where your eyes are looking at any time.
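For readers curious what that calibration step amounts to, here is a minimal sketch: assuming the goggles report raw pupil coordinates and the five wall points have known positions in the scene camera's frame, a simple affine mapping can be fitted by least squares. The numbers and function names are illustrative only, not Eyetracker's or QinetiQ's actual method.

```python
import numpy as np

# Illustrative five-point calibration: fit an affine map from raw pupil
# coordinates to scene coordinates. All values below are invented.
pupil_points = np.array([[0.31, 0.42], [0.55, 0.40], [0.78, 0.43],
                         [0.44, 0.61], [0.66, 0.60]])    # from the eye camera
scene_points = np.array([[100, 100], [640, 100], [1180, 100],
                         [370, 620], [910, 620]])         # known wall targets (px)

A = np.hstack([pupil_points, np.ones((5, 1))])             # rows of [px, py, 1]
affine, *_ = np.linalg.lstsq(A, scene_points, rcond=None)  # 3x2 parameter matrix

def gaze_to_scene(px, py):
    """Map a raw pupil reading to an estimated point in the scene video."""
    return np.array([px, py, 1.0]) @ affine

print(gaze_to_scene(0.5, 0.5))   # estimated gaze point on the wall
```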

Eyetracker, whose customers include Tesco, Halfords, Sainsbury’s, L’Oréal and Heineken, spends 60 per cent of its time working alongside MR agencies to enhance existing research, and 40 per cent working directly with brand owners. Both, Janes explains, prefer to work with a specialist because, although eye-tracking may be becoming more accessible, they still need an expert to interpret the results they get.

The recording of where the participant looked the most is known in the military-derived jargon of eye-tracking as a heat map. But it takes more than just this to reach any conclusions.
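As a rough illustration of what a heat map is, the sketch below bins fixations (already mapped onto the scene being viewed) into a grid weighted by dwell time. The cell size and function name are assumptions for illustration, not the output format of any of the tools mentioned here.

```python
import numpy as np

def gaze_heat_map(fixations, width, height, cell=20):
    """Bin (x, y, duration) fixations into a coarse grid.

    fixations: iterable of (x_px, y_px, duration_s) tuples already mapped
    into scene coordinates. Hotter cells received more gaze time.
    """
    grid = np.zeros((height // cell + 1, width // cell + 1))
    for x, y, duration in fixations:
        grid[int(y) // cell, int(x) // cell] += duration
    if grid.max() > 0:
        grid /= grid.max()   # normalise so maps are comparable across people
    return grid

# Example: three fixations on a 1280x720 frame of the scene video.
heat = gaze_heat_map([(200, 150, 0.4), (210, 160, 0.3), (900, 500, 0.1)], 1280, 720)
```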

“A heat map looks great, but it shows only that you were looking at something. It doesn’t explain why your eyes were pointing in that direction,” says Janes. Immediately after taking part in an eye-tracking experiment, Eyetracker subjects do a debrief interview to tell the researcher what grabbed their attention, whether they recall looking at the things they saw and whether they felt positive or negative emotions.

“We had one competitor who had drawn conclusions because the eye was constantly drawn to the same place. What he didn’t know is that was where the system’s crosshairs would default to when they lost the real position of the eye”

Iain Janes, Eyetracker

Attention does not always mean recall, Janes warns. “Sometimes their eyes will rest on something, but they don’t remember looking at it. Sometimes they look at something for a fraction of a second, but they recall it,” he explains. “We had one competitor who had drawn conclusions because the eye was constantly drawn to the same place. What he didn’t know is that was where the system’s crosshairs would default to when they lost the real position of the eye.”

Ian Addie, research and development director at Nunwood, is currently recruiting participants for his Bioshopping supermarket study. The brands that take part will receive data from a panel who volunteer their biometric feedback from a regular shopping trip. This allows those brands to see not only how shoppers look at their products and others in their category, but how they respond to the shop and to products, promotions and stresses. “If you fixate for about 3/25 of a second, then we know the visual image is having some impact on your mind,” Addie explains. “That is more or less where most companies get to. They know the subject has seen something, they know it potentially has an effect on them – but that doesn’t mean it has a positive effect.”
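Addie’s 3/25-of-a-second figure works out at roughly 120 milliseconds. Purely as a sketch of the idea, assuming gaze samples arrive as timestamped points, a simple filter might treat any run of samples that stays within a small radius for at least that long as a fixation; the dispersion radius below is a made-up value, not Nunwood’s.

```python
# Sketch of a duration-based fixation filter (not Nunwood's actual pipeline).
# A run of gaze samples that stays within a small radius for at least
# 3/25 s (about 120 ms) is counted as a fixation worth reporting.
MIN_FIXATION_S = 3 / 25       # the threshold Addie quotes
MAX_DISPERSION_PX = 30        # illustrative radius, not from the article

def detect_fixations(samples):
    """samples: list of (t_seconds, x_px, y_px) tuples, sorted by time."""
    fixations, run = [], []
    for t, x, y in samples:
        if run and (abs(x - run[0][1]) > MAX_DISPERSION_PX
                    or abs(y - run[0][2]) > MAX_DISPERSION_PX):
            if run[-1][0] - run[0][0] >= MIN_FIXATION_S:
                fixations.append(run)
            run = []
        run.append((t, x, y))
    if run and run[-1][0] - run[0][0] >= MIN_FIXATION_S:
        fixations.append(run)
    return fixations
```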

So Addie’s research, whether in one-off projects for FMCG clients or as part of Bioshopping, will combine the feedback from the eye-tracker with two other measurements taken from a sensor placed on the subject’s finger: galvanic skin response, which uses activity in the sweat glands to monitor arousal, and a temperature reading sensitive to 0.01 degrees Fahrenheit, which indicates whether that arousal represents a positive emotion (hotter) or a negative one (colder).
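To make the logic of that pairing concrete, here is a minimal sketch of how the two finger readings might be combined: a rise in skin conductance flags arousal, and the direction of a small temperature shift is read as positive or negative. The thresholds are placeholders, not Nunwood’s calibration values.

```python
def classify_response(gsr_rise_microsiemens, temp_change_f):
    """Crude pairing of arousal (GSR) with valence (fingertip temperature)."""
    if gsr_rise_microsiemens <= 0.05:     # placeholder arousal threshold
        return "no measurable response"
    if temp_change_f >= 0.01:             # warmer: read as positive
        return "aroused, positive"
    if temp_change_f <= -0.01:            # colder: read as negative
        return "aroused, negative"
    return "aroused, valence unclear"

print(classify_response(0.12, 0.03))      # -> "aroused, positive"
```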

“Once you start to research a shopper as a person, rather than the customer of your brand, you need to create understanding of their behaviour. That includes their subconscious emotions,” he explains. Biometric feedback goes one stage further than an after-the-fact narrative, because it allows you to capture an emotional response that may be unconscious.

Eyetracker, Nunwood and the other nascent real-world eye-tracking specialists can often provide quick, inexpensive results, which attract the attention of FMCG companies and retailers who don’t want lengthy in-store trials if there’s an easier way to the same conclusion. Typically, eye-tracking studies use between 20 and 40 respondents – more tend only to duplicate what is already a large amount of data.

Eye-tracking has already become commonplace for internet brands. Usability or media effectiveness researchers can use eye-tracking systems based on a camera mounted on a monitor, which can be calibrated in seconds.

Giles Colborne, managing director of Cxpartners (whose client list includes Tesco, Lastminute.com and Direct Line), uses eye-tracking in his research. “It’s a common technique for us now. We use it all the time. While we are listening to the user’s narration, we have someone watching the eye-tracking. They can pick up little extra things that we would otherwise miss,” he says. “This is a level of detail that a user could never describe to you because they’re not conscious of it.”

But Colborne warns that insights are often not obvious to the casual observer, and eye-tracking is not yet at the stage where it is a useful tool for general research. “I wouldn’t bother with eye-tracking unless you know something about how people look at things, about how visual perception works, and about how the interview technique can interfere with, rather than help, the process.” In usability research users narrate what they are doing as they do it, but that involves them rationalising or interpreting it. “You can hear what they think they are thinking. But if you are running an eye-tracking study, potentially you can see what they actually are thinking,” he says.


Keeping an eye on the drawbacks of eye-tracking

There is a danger that eye-tracking might over-promise, warns Doug Edmonds, managing director of 2CV, who used the technique to investigate whether readers of The Times were spotting and recalling advertising for Old Speckled Hen beer. Some readers gave very little visual attention to the advertising but could still recall the brand, even though they could not recall other brands that they looked at for longer. “It needs careful calibration. You can’t just hand the technology to people and expect them to get on with it,” he says. “From a technical point of view the results are interesting, but this is often confirming things we sort of know anyway. It’s a very big hammer to crack a nut.”

He warns that clients are currently wary of new methods unless they can show an advantage in speed or cost. “Sometimes agencies like to use technology to add theatre to what they are doing, but it doesn’t add to our collective knowledge.”

This theatre includes the rush to capture brain activity directly while we are looking, using real-time EEG or MRI scans to supplement eye-trackers. The problem of attaching electrodes to the head of someone who came in to do their weekly shop makes this impractical outside the lab, Addie says. “EEG requires multiple electrodes to be attached to the scalp, and for a good electrical connection you need a conductive gel. It’s difficult to get people to agree. And even the blink of an eye can have an effect on the data you are getting, so you need some very sophisticated algorithms to strip out the noise. We are investigating, but I have some big caveats.”
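The “stripping out the noise” Addie mentions can be pictured, very crudely, as blanking stretches of signal where a blink or muscle artefact swamps the trace, the leave-a-hole-in-the-trace approach also described in the reader comments below. The sketch is an assumption for illustration, with an invented threshold; real cleaning algorithms are far more sophisticated.

```python
import numpy as np

def mask_artefacts(signal, threshold=100.0):
    """Blank out samples whose absolute amplitude exceeds the threshold.

    A crude stand-in for artefact rejection: affected sections become
    NaN gaps in the trace rather than being repaired.
    """
    cleaned = np.asarray(signal, dtype=float).copy()
    cleaned[np.abs(cleaned) > threshold] = np.nan
    return cleaned
```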

Janes, too, has his doubts about this. Using eye-tracking in the field can establish simple truths quickly, he says: are people responding to your pack design? Where should it be in the store? Where should we put the ‘buy’ button? For that, he says, it’s rarely useful to look inside our brains. “Lying in an MRI scanner,” Janes says, “is not how I usually read a catalogue.”

2 Comments


In our view, the main limitation with GSR is that, while industry insiders may find their own categories fascinating, shoppers themselves are pretty bored by most of them. In so many FMCG categories there is little or no shopper involvement as shoppers' purchases are habituated: much the same week in, week out. So they know where the relevant displays are situated and the on-shelf locations of the SKUs they normally buy. This means there is insufficient difference in emotional arousal between these categories. We use eye tracking and EEG, combined with respiration, heart rate, temperature and head motion measurements, to capture shoppers’ emotional and cognitive responses. By gaining insight into responses to the visual stimuli that the shopper actually looks at, as well as prefrontal cortex responses, we can gauge the relative roles of emotion and cognition at each and every stage of the shopping journey. The role of memory - implicit and explicit - is also key to understanding why shoppers buy and we are working with fMRI techniques to understand more about the role of memory in store.



What needs to be considered is the precise nature of GSR and the fact that it measures something subtly different from what is captured by EEG, but nevertheless equally valid. GSR is a response operating via the Autonomic Nervous System, activated by structures within the Limbic System deep within the brain which control our reflex response to stimuli. In the context of a shopping experience, where the individual is deeply immersed in the activity (be that consciously or subconsciously), stimuli of relevance to the activity initiate such reflex responses in order to prepare the individual for action. This is not a cognitive process, but it may then lead to cognitive processing in the Pre-frontal Cortex.

Our developmental work in this area has found GSR responses to specific stimuli across the shopping experience to be clearly evident, and of varying magnitudes indicative of their significance and relevance. In our experience GSR effectively measures the "shout volume" of a stimulus. Whether the "shout" is heeded, however, depends on the conscious or subconscious processing of the sensory information contained within that stimulus by the Cerebrum, and the Pre-frontal Cortex in particular, which then determines its relevance to the individual.

Whilst we accept the principle of EEG measurement of Pre-frontal Cortex activity in order to objectively gauge cognitive activity and higher-level emotional processing, this is a somewhat different endeavour to the measurement of GSR. Indeed, EEG as a measure of surface brain activity would be unable to effectively measure activity in the Limbic System, of which GSR provides a secondary measure.

The use of EEG to measure cognitive activity is also fraught with difficulties, which are compounded by taking the technology out of the lab and into the real world, as is the case in a shopping environment. Specifically, EEG requires electrodes to be affixed to the subject's scalp. Achieving a good connection is a significant practical problem when subjects are mobile, and generally requires conductive gels to be applied. Although technology is advancing towards dry electrodes, it is apparent from conversations we have had with neuroscientists that such technology is still some way from being perfected.

The electrical potentials measured by EEG are also very weak, and as such are prone to outside "noise" from muscle contraction and other external electrical sources. In a lab environment these can be kept to a minimum, but in real-world applications they must be tolerated and dealt with during the data processing stage to a greater degree. Conventionally, such bursts of noise have been dealt with by stripping out sections of affected data entirely, leaving holes in the EEG trace. Clearly this is not an ideal option when we are concerned with split-second responses to specific stimuli which cannot be repeated. As such, sophisticated algorithms are required to account for this noise and clean the data. The more noise, the more difficult this is to achieve.
