OPINION
16 January 2020

Welcome to the world of Qual 3.0


2020 brings Qual 3.0, a hybrid of traditional, technology-enabled and data-led approaches to deliver speed, quality and value. Firefish’s Jem Fawcus argues it will bridge the empathy gap between businesses and their data.

A new decade always encourages us to take stock of where we are and where we are going, and it is a bit of a shock for me to realise that the ‘20s will be my fifth decade in research and insight – and, in its broadest definition, as a qualitative researcher. (That means I started at the very end of the ‘80s, not that I’ve been at the coalface for 50 years, I hasten to add.)

I am interested in how where we’ve come from informs where we are now, and where we are going. Qualitative research as a commercial discipline has been around since the late 50s (though you could argue that it has been around as long as conversation itself).

It has always been valued for its depth and richness – to understand people’s needs and motivations, the meanings they ascribe to their lives and the role that brands can play for them.

Qualitative research made sense of different strands of human attitudes and behaviour and brought this to life as human stories and experiences – a powerful way of communicating. Marketers not only gained insight into their audience; the humanity of qual also led to understanding, intuition and empathy. It wasn’t the only way of approaching problems, nor the answer to everything, but Qual 1.0’s role was recognised and valued.

There were many frustrations, however. It could be time-consuming, expensive, used badly and done badly – it didn’t always get it right.

And then came the first headrush of technology, promising a brave new world with a solution to everything. Technology was new, clean and had no baggage. It was fast, cheap – if not free – and exciting. And it would make all other approaches redundant, always get it right, and best of all, reduce the need for expensive humans.

Qual 2.0 evolved into this environment, and it is fair to say, it was a difficult evolution. It was forced to play by the new rules of fast and cheap, ideally adding some sexy new whizzbang thing that sounded cool, whether or not it enhanced the outcome.

Qual 2.0 seemed to be about trying to use tech to do things the same way we’d always done them – faster, cheaper but rarely, if ever, better.

The tech was often in its infancy and didn’t work very well (early online video groups, anyone?), or tried to replicate existing methodologies it didn’t fit (remember text ‘groups’?). It often made projects more time-consuming and expensive, though we generally hid this from our clients – did any agency’s margins not decrease during this phase?

Most worryingly, because Qual 2.0 was about replacing existing approaches with technology, and the outcomes from the technology were often not very good, there was a danger that the most important part of qual, the stuff you can’t get from anywhere else – the richness and depth, the meaning and understanding, the human stories and intuition – was getting dumbed down. In this period, it often felt like we had to work extra hard to get to the deep and meaningful insight despite the technology, not through it.

But we learned. And the tech improved and evolved.

Meanwhile in the ‘real’ world, CMOs and heads of insight were under ever more pressure to deliver under tighter financial constraints, and with more efficiencies.

They need to keep up with cultural and technological change. Budgets are being ploughed into acquiring their own data centres and into capturing and storing more data.

Thousands of marketing messages, touchpoints, opinions and responses get logged every minute of every day, to the point of overwhelm. With apologies to Coleridge: data, data everywhere, and no-one stopped to think.

We seem to have lost sight of a golden rule of research: it is not the data that is useful, but the meaning extrapolated from it. And often, relying on data alone ignores that each issue we are trying to get to the bottom of is inherently human.

Many recent client conversations have been characterised by a number of interlinked themes: I have lots of data, but I can’t really make sense of it; or I have lots of data, but it all seems to say something different; or I have lots of data, but I’m not really sure what to do.

Many businesses have identified a gap between their data and insight – between data and meaning and how their audience feels. It is sometimes called the empathy gap and it has led to some high-profile missteps in recent years (PepsiCo and Gillette, to name two).

It is into this gap that Qual has evolved and developed: not Qual 1.0, which is time consuming and costly; not Qual 2.0, trying to replicate traditional approaches with second rate tech; but Qual 3.0, which has learned from what has gone before, filled the empathy gap and unleashed the power of data.

Qual 3.0 is made possible by several interlinked developments:

  • Tech has improved (and is no longer seen as a magic bullet)
  • Thinking has moved from binary (approach A vs approach B) to hybrid approaches and multiple data sources
  • There is recognition that smart human brains are still the most powerful intelligence we have – and that augmented human intelligence is more realistic than artificial intelligence.

Qual 3.0 makes the tech work for us, not us for the tech. Routine tasks can be speeded up and made more efficient – from faster, smarter recruitment driven by programmatic tech to swifter first-stage analysis through auto-transcription and simple video analytics.

Online discussion platforms and communities mean that geographical and numerical scale no longer come at prohibitive cost. An expanding digital toolbox can be dipped into across the project cycle – from social media tools and blog aggregators that streamline collating disparate information (aka desk research), through to self-serve quant tools and data platforms that add sharp, focused ‘numbers’ to give further weight to our findings.

We can get closer than ever to people – into hidden areas of their lives and literally under the skin to their physiologies, through wearable technology and cameras. And we can even observe reactions to experiences that don’t exist yet, through virtual reality.

It frees up time for the strategist to assimilate many different data sources and develop hypotheses – which can focus and target ensuing stages of research. It reduces the cost of initial stage cultural insight to a fraction of its traditional cost, allowing budget to be invested in a more targeted manner. And most importantly, it increases the quality of the insight, by allowing a combination of data sources, iterative approaches and rich human insight.

Qual 3.0 has overcome the drawbacks of Qual 1.0 – slow and expensive – by targeting time and spend where they deliver most value. And it has overcome the race to the bottom of Qual 2.0 by putting smart human insight back at the centre, derived from more sources than ever.

Welcome to 2020. Welcome to Qual 3.0, a golden age of qual.

Jem Fawcus is group CEO of Firefish


4 weeks ago

Great piece. Like you, I have been at the qual coalface for longer than I might care to remember. GULP. Whatever the journey, whether we are chasing new technology or it is chasing us, I am still a huge believer in the adage that 'poor quality data, badly interpreted, leads to the wrong answer'. Often people do not mean what they say and do not say what they mean, and technology continues to struggle with this. BUT an experienced researcher can see through the haze and occasional bullshit to generate genuinely strategically valuable insights, not merely regurgitate detail.


3 weeks ago  |  1 like

I completely agree, but I also feel qual has sometimes been guilty of giving the impression that only face-to-face will do and that technology and data are somehow the enemy. I am strongly against this view; I believe data and tech are fantastic additions to the search for better insight – with the obvious caveat, when used well. And as I say in the article, I think qual's 'human' element will come to be seen as vital to unlock the power of data.
