FEATURE
19 November 2014

Back in the day


From innovations such as QR code surveys and gamification to Skype calls reducing the need for international travel, much in the research landscape has changed over the past 10 years, as Chime Insight & Engagement’s Crispin Beale explains.

During a recent discussion about geolocation beacons and omni-channel marketing opportunities, I was struck by the huge changes that have taken place in the industry in recent years, not only in my own work – which is to be expected having moved from client to agency side – but also in the way that brands use research.

In the words of American scientist Carl Sagan, “You have to know the past to understand the present”, so with this in mind I reflected on the elements of change I have seen over the past 10 years, drawing on some of the four Ps of marketing as a guide. 


Product

Research is no longer limited to traditional methodologies such as face-to-face interviews and paper-based surveys. In this age of technological evolution, research takes place across an ever-growing range of smart devices. The pace of technological change is accelerating, and brands expect faster turnaround times, real-time data and the ability to access information across the globe.

Ten years ago, online research was seen as the latest and greatest and, as predicted, has transformed research as we know it. However, there are downsides to the fast turnarounds and broad samples on offer – as Fujitsu’s Simon Carter explains (see below). While there will always be new and innovative approaches to capturing consumer opinion, nothing can replace the nuances and insights gained from speaking to a respondent in person.


Price

Research used to be the exclusive domain of blue-chip multinationals, but technology has made it more accessible to SMEs using highly targeted samples over short time frames. Budgets are tighter on all sides (see Danny Russell of Telefónica’s comments below). Large research agencies still dominate the industry as they did 10 years ago, but at the other end of the spectrum there has been an increase in small-scale consultancies and boutique agencies catering for SMEs.


Place

We now operate in a truly global marketplace. When I was head of insight at the Post Office 10 years ago, weekly trips to France and Italy to view fieldwork were inevitable. Now, I can seamlessly deliver insights to APAC clients via daily updates and weekly Skype calls, and it’s no longer necessary to be physically present to conduct business or oversee projects.


Promotion

New ideas are constantly being sought to engage the consumer. Where previously video was considered innovative, it’s now the norm. From podcasts and interactive websites to gamification, QR code surveys and live audience response, the trick is to keep it new and interesting. Tables and graphs cannot be replaced for reporting survey findings, but brands are finding infographics and interactive visuals more effective for presenting results. The next big thing is the Internet of Everything, and we are already looking at ways in which smart devices, from home appliances to wearables, can be used for research purposes.

So, while there have been dramatic changes in the research industry in the past decade – some positive, some less so – what is clear is that there is plenty of change still to come. Understanding what works and what doesn’t is the first step in adapting to an ever-evolving, ever-more competitive landscape.

Simon Carter, marketing director, UK and Ireland, Fujitsu

A decade ago I worked mainly with UK agencies, but now global partnerships are possible, irrespective of where each party is based. Similarly, where regional teams previously worked on the ground across the world, they are now able to work virtually from anywhere, and hours of operation have become 24/7 to accommodate time-zone differences.

Whereas 10 years ago the focus was on sampling, Big Data and the ability to mine information now mean ‘the whole universe’ can be accessed – an exciting proposition indeed. However, I believe this plethora of information has caused a downturn in marketing skills. One example: email is ‘free’ (versus paper direct mail needing a stamp), which has made marketers lazy. They now send to thousands of addresses without worrying whether the recipient is alive or dead, arguing that it is cheaper to waste the effort than to spend time cleaning and segmenting the data. The result is recipients who are less open to this channel of communication.

In this fast moving world, it is also harder to undertake research that generates the real nuggets of insight, as customers have become increasingly fickle toward brands or suspicious of marketing and research activity. With lower levels of brand loyalty, opinions change quickly and researchers need to find ways to adapt.

Having moved from B2C to B2B roles in the past decade, I have also found significant differences in research requirements between the two. While for B2C the best way to gain trend and satisfaction insight is via focus groups and street interviews, B2B research is mainly used to develop a thought-leadership positioning, and 80% of it is undertaken online, which is almost the only way to achieve a response.

So have things improved over the past 10 years? On balance I would say yes, but technological innovation should never be a replacement for human intelligence and rigour.

Danny Russell, general manager, business intelligence, Telefónica UK

Firstly, technological advances have made the pace of everything today much faster. Gone are the days of dial-up internet; now, there are complaints if a page doesn’t load within three seconds. As a result, individuals have high expectations of delivery time frames. In market research terms, this means some of the more traditional research methodologies have fallen out of favour with some stakeholders. Setting up a traditional research group takes time, and it can be hard to justify when a panel already exists or a virtual group can be set up online within hours. Similarly, waiting a week or more for a debrief meeting now seems cumbersome, and stakeholders will readily compromise on the purity of the research for the sake of quick decision-making.

The next major difference lies in the way brands use market research in the wake of the great data explosion. Ten years ago, stakeholders would often start with a much smaller knowledge base and rely on market research to fill in the gaps. They would be happy to wait six months and allocate a considerable budget to a project, because the accuracy of the results formed the basis of major decisions. Now, stakeholders have access to a wealth of data, which they can layer and analyse to build 50–60% of their knowledge base as a starting point. This means decisions are often close to being made before research starts, and when stakeholders do commission research, they often seek contextual insight to reinforce and refine their decision-making.

Thirdly, greater collaboration with procurement departments is often required. With technology now available that enables risk assessments, business-continuity checks and costs to be broken down by project, area and methodology – all at the click of a mouse – the need to justify budgets is greater than ever before.

Finally, data-rich and email-laden stakeholders can be harder to convince of the need for further insight. When the knowledge base was smaller, they were more involved and keen to attend research debriefs, but often that is no longer the case, so we must work much harder on our communication to achieve cut-through and highlight the genuine value that research and insight bring to a business.

1 Comment


Here we have a piece on how research has changed over the past 10 years when, in fact, absolutely no progress has been made at all. If anything, the market research industry has gone backwards. Why? Because all of the focus is on data volume and speed rather than data quality and insight, which is what REALLY matters.

This raises another massive problem: all of the data being collected is based on cognitive responses and statistical techniques developed almost 100 years ago. We now know that cognitive responses are frequently worse than useless, while the statistical techniques still in use rely on data assumptions that are rarely, if ever, achieved – such as genuine random sampling from a representative population. Things that could address these issues, such as the capture of emotion-based data and statistical analysis using the bootstrap, are conveniently overlooked.

Is it any wonder that more and more marketing professionals are questioning the ability of market research to deliver useful insight, when what is sold as ‘progress’ actually represents nothing of the sort?
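For readers unfamiliar with the technique the commenter names, the bootstrap can be sketched in a few lines: resample the observed data with replacement many times, recompute the statistic on each resample, and read a confidence interval off the resulting empirical distribution, with no assumption that the sample was drawn randomly from a known population. The function name and survey scores below are purely illustrative, not from the article.

```python
import random

def bootstrap_ci(sample, stat, n_resamples=10000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for an arbitrary statistic.

    Draws n_resamples resamples (with replacement) from the observed
    data, computes the statistic on each, and returns the alpha/2 and
    1 - alpha/2 percentiles of the resulting empirical distribution.
    """
    rng = random.Random(seed)
    n = len(sample)
    stats = sorted(
        stat([rng.choice(sample) for _ in range(n)])
        for _ in range(n_resamples)
    )
    lo = stats[int((alpha / 2) * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Illustrative: a 95% interval for the mean of ten satisfaction scores
scores = [7, 8, 6, 9, 7, 5, 8, 7, 6, 8]
low, high = bootstrap_ci(scores, lambda xs: sum(xs) / len(xs))
```

The same function works unchanged for medians, proportions or any other statistic, which is the appeal of the method for small or non-random survey samples.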
