FEATURE – November 2010

Negotiating the data deluge

DATA SPECIAL – Andrew Wiseman, director of ICM Research, on how a bit of careful thought can help companies extract value from the deluge of data.

It seems not a week goes by without another stream of information becoming available to an already perplexed insight community. Location-based this, social media that… It’s all very exciting, but how much value can clients really expect to garner from these new sources?

IT researcher IDC has estimated that in 2010 we will create some 1,200 exabytes of data. That’s more than a trillion gigabytes – an eightfold increase since 2005 and still growing. How can organisations use this information to drive business results? In fact, how can they even absorb this deluge of data?

Times have changed
It all seems a million miles away from the technological landscape of 20 or so years ago, prior to the commercial advent of the internet, when the information available to client organisations was at best limited and at worst sketchy. The key sources back then would have included some form of sales data – either purchased from a syndicated provider or using data on goods leaving factories as a proxy (even in the data-rich retail arena, sales data was still some way behind today’s scanner-based methods).

As for the marketing budget, a client would have some access to data in relation to specific promotional or media activities it was undertaking, the measurement of which was relatively simple provided the campaign was executed effectively in-store. Meanwhile the media marketplace was firmly centred in traditional and terrestrial media, rather than the hugely fragmented media we have today.

Next, organisations may well have had access to both primary and secondary market research, which typically was held in silos, far away from the intelligence that had been collected elsewhere.

Fast forward to today
Since that time, advances in technology have made things far more complex. Mobile telephony became the norm for many individuals and businesses, and the internet burst on to the scene to provide instant access to pretty much anything. IT infrastructure also developed to the extent that businesses with regular or contracted customers (such as banks or mobile phone companies) could develop cavernous data warehouses, capable of collecting a whole host of behavioural information.

With the arrival of Web 2.0 it now seems there are endless possibilities for harnessing new sources of information and bringing them together to create better propositions for customers. We have already seen the technology being used in the development of online brand communities and customer panels, like having a virtual room full of customers on hand to get involved in activities on a regular basis.

Yet with these new and powerful sources of information at their fingertips, many organisations are failing to exploit the huge potential to build a more holistic view of the customer from attitudinal, behavioural and financial metrics. In many respects, the key to harnessing these individual streams lies in the thought process, rather than the data itself. This process can be broken down into five parts.

1. Make sure data adds value
The guiding principle for all business data should be a link to a commercial outcome. The most obvious KPI is profitability, but in practice the indicator may be any factor that influences business performance. Sales, brand preference, Net Promoter Score, customer churn and revenue per user can all be used to create a link to something tangible.

Ultimately the data fusion or mining process needs to have a ‘so what’ attached to it – allowing us to create something that is useful rather than just interesting.

But what about a business’s ability to turn these value judgements into reality? A collaborative approach between client organisations and agencies allows for the production of simple tools that immediately prioritise the key action areas.

Using new and emerging data sources such as social media helps to create an additional diagnostic for such programmes. Understanding what customers are saying about a brand, product or service adds a further degree of richness, building on the mining of verbatim comments from a survey. At ICM, this is a central tool in helping clients to understand and prioritise issues in their businesses – and adds depth in the same way as the more traditional vox pop.

2. Join the dots of existing data
It sounds simple to say that organisations should know what data is currently at their disposal – but in the majority of companies this is not the case. Knowing what the business already knows is the starting point of a successful data strategy, one which leverages the right information sources and creates value from them.

Historically, organisations have tended not to have a single team looking after the various streams of data. For example, most organisations will, at one stage, have had a research team or customer insight team whose remit was to manage all the primary research. There may also have been a market insight team with responsibility for syndicated sales and other information sources coming into the business. To make things more complex, businesses developed CRM teams, which looked after the increasing wealth of consumer information stored on the customer database.

Even in this relatively simple framework, there was often no triangulation of these three sources. The result? Wasted effort and wasted budget spent on research and analysis projects to uncover answers to questions that were either already known in the business or could be found using some relatively straightforward analysis.
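As a minimal sketch of what triangulating these three sources might look like in practice – the datasets, field names and values below are entirely hypothetical – merging survey, syndicated and CRM extracts on a shared customer key can surface answers the business already holds:

```python
# Hypothetical illustration: three teams each hold one view of the customer.
# Joining them on a shared customer ID gives the triangulated picture that
# siloed teams often commission fresh research to rediscover.

# Primary research (customer insight team): stated brand preference
survey = {"C001": {"brand_preference": 8}, "C002": {"brand_preference": 3}}

# Syndicated/market data (market insight team): category spend
market = {"C001": {"category_spend": 120.0}, "C002": {"category_spend": 45.0}}

# CRM database: observed behaviour
crm = {"C001": {"purchases_last_qtr": 5}, "C002": {"purchases_last_qtr": 1}}

def triangulate(*sources):
    """Merge per-customer records from several sources into one view."""
    merged = {}
    for source in sources:
        for customer_id, fields in source.items():
            merged.setdefault(customer_id, {}).update(fields)
    return merged

view = triangulate(survey, market, crm)
# view["C001"] now combines attitudinal, market and behavioural metrics
```

In a real programme the join key might be a customer ID, a region or a time period; the point is that the merge itself is straightforward once the business knows what it holds.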

With all the new data sources now available, the key question is how best to manage the thirst for new data, while ensuring organisations can cope with the deluge that might result.

3. Don’t assume that new data will produce new insights
Metrics move at different paces, and recycling can be a great way of saving budget and mining existing data more effectively. Markets also move at different paces, and clients need to understand their markets first if they want to understand how and when to invest in more data, and when to work with what they already have.

A good example of this is marketing mix modelling for mature FMCG brands. Each year businesses choose to invest tens of thousands of pounds in updating their knowledge of the marketing initiatives for their core brands and competitors. In some cases this can be absolutely the right thing to do. In the spirits and wines market of the late 1990s and early 2000s, the liqueurs sector was in growth and innovative campaigns were being executed by many brands as they competed for a larger share of the expanding market. Conversely, sherries and ports had been stable for years, with no real innovation in products or marketing strategies. Despite this, clients returned annually to update a set of numbers that would return the same findings. Not only did this waste valuable resources, it made the researcher’s job far less enjoyable as there was seldom any news to shout about.

By stopping this particular stream of insight, the client was able to divert the budget into other areas where new insight was created through the exploration of new data and new techniques for analysing it.

4. Don’t rely on pure science to drive business value
One of the key reasons that much data fusion fails is that the links between data streams are not always very clear. Trying to find links between cause and effect can be especially difficult.

In many ways, this is where an over-reliance on pure statistics will lead to a disappointing return. Statistics on their own allow us to understand some of the nuances of a business problem, but the most powerful analytical solutions tend to be those that can be directly used to develop new strategies for business. A reliance on pure statistics can create something academically interesting and robust, but without an ability to implement the results successfully into a business model organisations are left with a set of statistical artefacts.

Working collaboratively with clients to understand the desired commercial outcomes, and relaxing some of the assumptions of pure statistics, are both critical in creating real value from such programmes. For example, analysis of the effect of advertising has historically focused on the short-term (two to three months) impact a particular campaign had on sales. In fact, the majority of any advertising impact occurs in the mid to long term. Proving the link between current advertising and future sales is difficult, but by working with the client to define what the data needs to look like, how it should be lagged and so on, it is possible to deliver a pragmatic (if not statistically perfect) solution that creates significant value.
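One common way to ‘lag’ the advertising data is an adstock transformation, which carries a share of each period’s advertising pressure forward into the periods that follow. The sketch below is illustrative only – the carry-over rate and figures are assumptions, not recommendations:

```python
# Hypothetical sketch of lagging advertising data before modelling.
# Adstock retains a fraction of last period's pressure each period, so a
# sales model built on the transformed series can pick up mid- to
# long-term effects rather than only the immediate campaign window.

def adstock(spend, carryover=0.5):
    """Transform raw ad spend into adstocked pressure.

    carryover is the fraction of the previous period's pressure
    retained; 0.5 here is purely illustrative.
    """
    pressure = []
    previous = 0.0
    for s in spend:
        previous = s + carryover * previous
        pressure.append(previous)
    return pressure

monthly_spend = [100, 0, 0, 50]
print(adstock(monthly_spend))  # pressure persists after spend stops
```

The transformed series, rather than raw spend, then goes into the marketing mix model, which is what allows advertising from earlier months to explain current sales.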

5. Stay focused
When dealing with multiple sources of data, it’s common for nuggets of information to fall out of the analysis. While these can be interesting and sometimes useful, they can also prove an unwelcome distraction from the overall analytical process.

Maintaining a focus on the business issues at hand, and not being distracted by unnecessary diversions, is central to a successful data programme. The nuggets produced as a by-product can always be revisited in further analysis, but a tight focus ensures the greatest commercial value is driven from each programme of analysis.

The road to success
The excitement surrounding the myriad data streams that are becoming available can lead to ‘kid in a sweet shop’ syndrome – where the desire to have the latest data overcomes the thought process about what to do with it. This can lead to a poor use of the information, and at worst it means yet another potentially valuable stream of data being buried and not capitalised on.

Many client organisations have difficulty managing the information they already have – simply adding more streams will do little to help these firms distil that data into insight. Fundamentally, the success of data-mining programmes in the new data landscape will depend on a realistic view of what is possible.
