OPINION | 18 January 2024

The case against data


It’s crucial for researchers to understand people, culture and context, and data should not be an end in itself, says James Fox.

[Graphic: colourful jigsaw puzzle pieces being put into place by four hands]

Data is evidence, not insight. Viewed in isolation, a data point can be made to mean just about anything – and you’ll always be wrong. The steady decline of budgets allocated to traditional market research raises a further question: even if your conclusions about a specific data set aren’t wrong, so what? What then?

Data points reflect the questions we ask. They rarely tell you anything about the data collection methods, study design (and intentions), why results look the way they do, or how to meaningfully action findings (if you even can!). When it comes to surveys, these weaknesses can produce outputs and reports that are so predictable you probably didn’t need to run them in the first place.

Some will try to get around this problem by making surveys longer, more detailed and more complex. But good market research requires good data, and good survey data comes from questionnaires that are straightforward and easy to answer. Others will turn to big data – and now even synthetic data – hoping that if they collect enough of it, they’ll eventually find something useful. That doesn’t solve the problem either: algorithms are only as good as the information they’re given, and analysis is only as good as the researcher interrogating the results.

Consider the so-called “value-action gap”, which marketers love to cite as proof that what consumers claim is important to them clearly isn’t – otherwise the gap wouldn’t exist. But looking at results through such a narrow lens misses obviously important context.

Instead, we need to step back. You don’t have to collect information on everything; you just need to know what you don’t know – and to really understand which factors are relevant to the actual research objectives. Designing good analysis requires methodologies beyond standard quant and beyond data analytics. Information is important, but context is king. Without it, we might miss relevant factors, alternative approaches and hypotheses, answers and opportunities. No amount of inferential analytics will make up for asking the wrong questions.

In the case of the value-action gap, we might design the study around the COM-B model (capability, opportunity, motivation – behaviour), collecting information on the factors that affect people’s ability to act on their values, and on competing needs and motivators. Or, if we genuinely care about the gap between values and action, we might draw on the theory of reasoned action. The value-action gap is evidence of a gap – it is not an insight that consumers are lying.

Employing qualitative methods and/or desk research should be a mandatory step in any insights programme. There is simply no way around it – and that should be seen as a good thing. It’s crucial for good researchers to understand people, culture and context. We need reports that don’t rely on a single source, but instead bring together insights from as wide a pool of information as possible. Reports that tell you how, why and what next – in other words, actual insights and ideas.

Good reports should be backed by research, not a 300-slide information dump. Data is one tool for achieving specific goals, not an end in itself. A nationally representative study delivered as page after page of charts is not nice to look at – and if you don’t want to look at it, why would anyone else? The real value of research comes from understanding how quant and qual techniques, expert perspectives, and methodical desk research and planning can combine to produce actual insights – not just the what but also the why – creating something genuinely useful and compelling for the audience.

James Fox is head of data & analytics at Canvas8