
NEWS | 23 March 2018

Insight experts react to Facebook and Cambridge Analytica data scandal

Data analytics, GDPR, News, North America, Privacy, Public Sector, Technology, Trends, UK

UK – Insight industry professionals have reacted to the news that elections consultancy Cambridge Analytica allegedly harvested data from millions of Facebook user profiles in the lead-up to the US presidential election.

Following reports from The Guardian and The New York Times over the weekend, Facebook’s market value dropped by $40bn in two days as the company came under pressure to address the data privacy issues raised by the revelations.

Facebook has since said it will change the way in which third-party applications can access data on its platform, with chief executive Mark Zuckerberg vowing to take its data privacy issues seriously, saying: “We have a responsibility to protect your data, and if we can't then we don't deserve to serve you.” 

Meanwhile, the Information Commissioner’s Office said it would seek a warrant to access Cambridge Analytica’s information and systems as part of a wider investigation into the use of data for political purposes. The company’s chief executive, Alexander Nix, has been suspended pending an independent investigation, following an undercover investigation by Channel 4.

Research Live spoke to a cross-section of research and insight companies to get their perspectives on the scandal, and what the ramifications may be for the data and insight industry more widely.

Steve King, co-founder and chief executive, Black Swan Data

“Governments and regulators should play a more active role”

This news has certainly caused shockwaves around the world, but we need to be careful we don't throw the baby out with the bathwater. On the one hand, profiling, segmenting and targeting groups of consumers for individual gain is what modern marketing is built on. On the other, this doesn't paint the best picture of data businesses. But there is a way to do it without infringing the privacy of individuals. We, for example, use anonymised data to understand trends from the many, not the few. Clearly, our underlying goal is to be commercially profitable, but we made a fundamental decision to use this type of technology to try to enrich consumers’ lives, not to manipulate or target individuals in an underhand way.

I've been a big advocate of the need to regulate this type of technology – after all, should I, as CEO of Black Swan, be responsible for applying my own moral code to this emerging and powerful field of work? I think not.

Instead, governments and regulatory bodies should play a more active role, working with social media platforms and technology companies to make it safer for everyone. Technology isn't to blame – it can be a force for good when used with positive intentions – but, like anything powerful, it can be exploited for personal gain if not regulated properly.

Ben Page, chief executive, Ipsos MORI

“Getting informed consent is essential”

These revelations have raised questions about how the market research industry is dealing with people’s data. Without being complacent, I think all reputable research companies would regard this as complete anathema. The MRS Code of Conduct – and GDPR from 25 May – mean that this type of exercise would be unthinkable.

Getting informed consent is essential. We only process data for the stated purpose of research, and never for sales or marketing, whether consumer or political. Trust in pollsters remains far higher than trust in politicians, journalists and business leaders – and this latest scandal is a reminder of the need to maintain our standards, and for those in charge to always remember their responsibilities.

Ryan Howard, head of analytics, Simpson Carpenter

“The model of data in exchange for a service is still sound”

While this may indeed be the tip of the iceberg, the news itself is a nothing-burger. We have long accepted that our personal data is monetised and have made an uneasy peace with being micro-targeted and influenced. After all, our personal data is at the heart of the golden data economy. However, when it is used in ways we are not aware of, we are forced to consider its wider impact, and find it chilling.

The business model of data in exchange for a service or benefit, though humbled, is still sound. A minor course correction is required before we will believe, once again, in big data’s utopian promise.

Jane Frost, chief executive, MRS

“It’s not difficult, it’s a matter of will”

We’ve always campaigned on the need for transparency within businesses, particularly as data and analytics grow in importance. That is one of the reasons we launched the Fair Data accreditation in the first place – to give consumers the ability to make educated decisions about their personal data. We believe this week’s news emphasises the need for a mark like that, preferably one that’s international.

There have been queries in the past about the transparency of some of Facebook’s research, so I think this is an opportunity for the company to look at its own practices, by undertaking a health audit using one of the many already-established codes of best practice.

Facebook – and all platforms and businesses handling vast amounts of consumer data – should approach this from the perspective of ‘just because we can do it, doesn’t mean we should’. It’s not difficult, it’s a matter of will. 

Tom Ewing, head of communications, System1 Group

“Ethics codes mean we’re not data cowboys”

This is a major issue for any data industry, as people don’t have the time or inclination to sort the verified facts – which look bad enough – from wilder speculation about psychological operations and conspiracies. But for market research, it could have been much worse.

A few years ago, there were a lot of industry debates on whether we should go with the flow, relax our ethics codes, accept that ‘privacy is dead’ and so on. Some said we should or we’d be left behind. Others said no, we need to modernise and strengthen the codes. The ethical side won, thank goodness, as it’s the codes of ESOMAR, the MRS and other bodies that let us say, truthfully, that we’re not like Cambridge Analytica or other data cowboys.

The MRS has issued a reminder statement on its Code of Conduct.

2 Comments

6 years ago

Hear hear Tom Ewing and Jane Frost! But I can't agree with Ryan Howard. A key point here is that most people actually have no idea what Facebook and others do with their data (and burying that stuff away in the small print is simply not good enough). That's why news such as this comes as a shock to them, unsettles them and decreases their trust in anyone they give data to, for whatever purpose.


6 years ago

I agree with Ben that "getting informed consent is essential", and I would add "no ifs or buts"! Ryan's "golden data economy", I would say, is the "black data economy". I was interested to see there are around 1,000 UK data brokers. Do they have a code of good practice?
