OPINION | 27 October 2022

Tech vs human: Increasing trust in health data


How do you increase trust in how data is stored and used by technology firms? The example of the healthtech sector holds some answers, argue Gabi Kriaucionyte and Jessica Miller.


If I had a dollar for every time I hear “data” in a day…

Data is a nebulous concept – it is everywhere, intangible, and increasingly essential to business processes, social systems and personal organisation across domains, especially health. And while data can seem technological and robotic, in health and fitness it is, ironically, a reflection of intrinsically human, organic experiences.

A simple tap on a screen may now tell us more about our daily health than an annual doctor’s check-up. We can use an Apple Watch to count our steps, an Oura ring to track our sleep cycles, or a Whoop to monitor our respiratory rate. The data these wearable devices capture (perhaps combined with those genealogy test results you got from 23andMe) allows companies to analyse our vitals, behavioural patterns and resulting health risks at any given moment.

But how can one ensure people trust these devices’ data the same way they would trust a family doctor’s prescription?

In times of increasing distrust in people and systems of power, including tech, businesses have struggled to maintain consumers’ trust in their health wearables and the data they accumulate. One way to increase consumers’ trust is through transparency. In the health wearable space, companies often attempt to explain data and its collection to consumers by sharing the methodologies behind it. However, health tech often rests on extensive algorithms or scientific systems such as “Optical Heart Rate Technology”, so even when these are shared with the ordinary consumer, they may not clarify what data collection actually entails.

The human tendency to trust what is overly complex (complexity bias) may work to health companies’ advantage when they present their methods in similarly scientific language. But today it is harder to convince customers that data collection methods are personalised, ethical and accurate, given our growing scepticism of business claims.

Unlike other industries, health tech faces a challenge that is new to it – to speak consumers’ language while holding the authority of a trusted family doctor. After all, when ideas are explained in scientific or algorithmic language, how transparent are they really?

Healthtech wearable companies therefore have to find ways to make their products and data processing more trustworthy for consumers.

According to Alessandro Latif, founder and chief executive of consumer biotechnology artificial intelligence (AI) startup Watz: “Medical researchers and doctors would need to reach consensus around the health decisions the algorithms provide. Algorithms alongside domain experts would automate your super-doctor, able to decipher patterns, in particular biometric signals, better than a human can.”

Latif is pointing to a crucial partnership between algorithms and humans – optimised healthtech comes from their interplay. He follows this with two key strategies for improving consumer trust in healthtech: (1) transparency about the health experts involved, and (2) a move away from standardisation.

First, Latif suggests making transparent not only the algorithms and what they are doing but also that real humans, in this case medical researchers and doctors, are behind the tech and collaborating. Furthermore, by highlighting human involvement in creating these devices, we can hold these companies more accountable for the personalised data they claim to offer.

Second, while people may fear receiving generalised recommendations based on population-wide health metrics, increasing human involvement, and thus moving away from standardisation, can lead to accurate, individualised health and fitness recommendations.

If we want any technology to feel safe and trustworthy, it is important to understand what truly drives humans: their values. If a health technology feels too alien, the company has misstepped in how it is building trust with consumers. Health data is among the most sensitive data one can collect, so any company putting healthtech products out there has to make its consumers feel secure and be honest about its products, providing individualised recommendations that are both relevant and accurate.

Gabi Kriaucionyte is a research assistant and Jessica Miller is a research manager at One Minute to Midnight.