OPINION | 7 June 2018

How private is your brain?


With companies investing more in neurotechnology, retaining control of our brain data could be the most pressing privacy issue of the future, says Dipesh Mistry.


Data and analytics have been in the news a lot recently, largely for the wrong reasons. Cambridge Analytica’s alleged use of personal Facebook data to change audience behaviour has raised questions about how companies can currently access and use personal data, while the recent implementation of GDPR in the UK has also grabbed headlines.

Today’s data news is about personal data. But what will tomorrow’s data news be?

As neuroscience becomes more accessible, ‘brain data’ is a topic high on the future agenda. This can be defined as the information trail generated by the brain, one that maps out how decisions are made.

Brain data on the rise

The use of brain imaging and other neuroscience-related techniques is being heavily researched in both the academic and commercial sectors, in an effort to understand more about one of the most complex and least understood parts of the human body.

Techniques such as fMRI (functional magnetic resonance imaging) and EEG (electroencephalography) produce vast, complicated data sets, but once translated and interpreted, they can tell us a lot about how individuals think. So much so that brain-computer interfaces (BCIs) are currently being researched to help those who have suffered brain trauma regain lost function. A BCI uses a computer connected to a microchip to interpret an individual’s neural activity, effectively predicting what someone will do based on their neural patterns. It may take a while before BCI technologies are part of our daily lives, but it will happen in our lifetimes.
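To make this concrete, here is a minimal, purely illustrative sketch of what ‘decoding’ neural activity means in practice. Everything in it is invented for the example: the data is random numbers standing in for EEG band-power readings, and the labels (rest, left hand, right hand) are hypothetical. Real BCI pipelines involve signal filtering, artefact removal and far more sophisticated decoding models.

```python
# Toy sketch of the core BCI idea: map a pattern of neural activity to an
# intended action. All data is synthetic; nothing here is real neuroscience.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical recording: 500 trials x 8 channels of band-power readings,
# each labelled with the intent (0 = rest, 1 = left hand, 2 = right hand).
n_trials, n_channels = 500, 8
X = rng.normal(size=(n_trials, n_channels))
y = rng.integers(0, 3, size=n_trials)

# Inject a simple class-dependent signature so there is something to learn.
X[y == 1, :4] += 1.0   # pretend left-hand intent raises channels 0-3
X[y == 2, 4:] += 1.0   # pretend right-hand intent raises channels 4-7

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The 'decoder': given a new pattern of activity, predict the intent.
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```

However crude, this is the shape of the thing: neural signals in, predicted intention out.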

Global companies have also been investing heavily in brain data. Google, IBM, Microsoft, Apple, and Facebook have all claimed to have dedicated research departments looking at brain technologies and neural networks in some form, while Facebook has said it is developing a BCI that would one day allow users to talk to each other purely through their brains, using computers, AI and electrophysiological data.

Meanwhile, Elon Musk, founder of Tesla, has created the company Neuralink, investing millions in the creation of devices that can both ‘read’ human brain activity and write neural information into the brain.

Current spending on for-profit neurotechnology is already estimated at $100m a year and growing. We are heading towards a world where it will be possible to decode people’s mental processes and, consequently, manipulate the brain mechanisms underlying their intentions, emotions, and decisions.

Some of the claims about what these companies are aiming to do may be optimistic, but it’s almost certain that reading the human brain will be the next arms race. One of these juggernaut companies, with unlimited resources, will get there. And what will happen when they do?

Proceed with caution

The Cambridge Analytica US election scandal may come to look like a drop in the ocean once companies are storing neural data on their servers. Imagine a world where Facebook held data on a user’s emotional reaction to an image of a dog, for example. That individual could then be targeted with advertising featuring animals, because such imagery is assumed to be more memorable to them. Insurance companies could calculate premiums based on brain health or activity that isn’t detectable at a visible level. Effectively, any type of algorithm will become exponentially more powerful if it can draw on neural information.
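As a rough illustration of that last claim, the sketch below trains the same toy ad-targeting model twice on fabricated data: once on behavioural signals alone, and once with a synthetic ‘neural response’ feature added. Every variable is invented for the example; the point is only that a model given access to a strongly predictive neural signal leaves the behavioural-only version behind.

```python
# Toy comparison: the same click-prediction model with and without a
# (fabricated) neural-response feature. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000

behaviour = rng.normal(size=(n, 5))   # stand-ins for clicks, dwell time, etc.
neural = rng.normal(size=(n, 1))      # stand-in for emotional response to an ad

# Assume the outcome (clicking the ad) depends weakly on behaviour but
# strongly on the neural response -- the scenario described above.
signal = 0.3 * behaviour[:, 0] + 2.0 * neural[:, 0]
clicked = (signal + rng.normal(size=n) > 0).astype(int)

for name, X in [("behaviour only", behaviour),
                ("behaviour + neural", np.hstack([behaviour, neural]))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, clicked, random_state=1)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"{name}: accuracy {model.score(X_te, y_te):.2f}")
```

On this fabricated data the second model is markedly more accurate, which is the article’s worry in miniature.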

This may seem to be a problem for the distant future. However, if it has taken this long for something like GDPR to come into force, we should start thinking now about what’s next for other forms of personal data that could end up in the wrong hands. In future, we may be talking about a scandal in far more severe terms: millions of users’ brain data exploited to sway votes, with individuals targeted using trigger messages designed to cause spikes in their brain activity.

Lawmakers should learn from how outdated the pre-GDPR data protection legislation had become and ensure this doesn’t become a problem we revisit in the future. Storing an individual’s brain data should require their active, willing consent, to ensure this type of data isn’t exploited by a technology giant. Companies shouldn’t be able simply to pay for large amounts of behavioural data in order to influence elections or any other type of campaign. This may seem a problem for the future, but as we’ve seen in the news, it’s one that will quickly become part of the world we live in.

Dipesh Mistry is senior research executive at Northstar Research
