OPINION
12 June 2015

Uncovering the truth


Market research is taking a battering in some quarters, with opponents fired up since the General Election. But are market research techniques really to blame, ask ICM Unlimited’s Tom Wormald and David Perry?


New Coke, the McLean burger and KFC’s low-fat skinless chicken are all examples of innovative concepts from some of the world’s biggest brands that tested well in the rational, controlled environment of the focus group or survey, yet failed at launch. They are examples the author Philip Graves used recently to critique market research, on the assumption that asking people questions is not an accurate way to predict their behaviour.

In an article published in the FT on 5th June, Graves says the market research industry should have used “better” techniques to get past the rational conscious mind, and understand “the effect of the [Coke] logo on your emotions, the memories of summer evenings when you were a teenager, the shape of the bottle, the music from the ads.”

“Without knowing it, we want something else.” The implication here is that these products failed because research suggested that people wanted something when, in reality, they did not. Asking questions failed to get to the heart of what people ‘really’ want – a similar critique to that levelled at the polling industry.

Graves suggests that observation can be better than questions for understanding the non-rational drivers that can predict behaviour. At ICM Unlimited we certainly agree – for some time our strategy has been based on the belief that you can’t always get the whole answer just by asking questions.

However, we’d challenge the portrayal of the market research industry as trading solely on the “baseless belief” that “straightforward” questions can tell you everything. Whether via surveys, groups, online communities or other methods, we help our clients by asking questions of many different kinds, or by applying analysis that provides far richer findings than the “rational” approach that Graves rightly criticises.

Just one example is in the analysis of the speed with which participants answer questions – where even the smallest differences in reaction time can speak volumes about the depth of feeling in the unconscious mind. Another comes from our Wisdom of the Crowds work, predicated on the increasingly well-recognised assertion that respondents are far better at understanding and judging others’ behaviour than their own.

So yes, results will be far richer if combined with other data, whether observational data collected by researchers or the vast datasets held by corporate organisations, governments and other parties. But this does not mean that asking questions has no place at all, particularly for “why” questions. After all, while people are certainly not rational decision-making machines, neither do we believe they are automatons, blind slaves to emotional drivers over which they have no control or understanding.

This is the premise of our own observational panel, Real Life, where we combine observational and question-based data in a way that allows us to explore, in entirely new ways, the relationships between actual behaviours and attitudes, preferences and decisions of various kinds. What behaviours can be linked to an action you took? Can this kind of observation offer accurate predictions of future behaviour?

And what about what happens once the research has been carried out? How are products marketed, once the decision to launch has been made? Could the failure of the McLean burger be found in the overly rational tone of the advertising campaign? “It’s healthy, you told us you want it, now buy it!” Perhaps research isn’t solely to blame after all…

Tom Wormald is director, future insights, and David Perry is associate director at ICM Unlimited

1 Comment

9 years ago

I don't think there is anything wrong with market research at all. The problem is that researchers get comfortable and experienced with a specific method and then they use that method to solve every problem whether it's appropriate or not. We need to think about problems with methodology blinders on so that we end up using the best methods, not the easiest methods.
