FEATURE | 24 June 2021

Civic-minded: The rise of public interest technology



Public interest technology is a growing field focused on the ethics of how we build, apply and interact with systems. By Katie McQuater.


In 2020, Netflix documentary-drama The Social Dilemma brought a disturbing vision to our screens: shady controllers monitored people’s every online move, influencing the content they came across and, thereby, their mood and even their likelihood of being indoctrinated.

If that sounds like a dystopian vision of some distant future, it is far from it. Facial-recognition technologies; the hyper-targeted digital advertising market; disinformation and conspiracy theories that threaten democracy and fuel violence such as the attack on the US Capitol; online abuse of female journalists and politicians. All are negative outcomes of technology.

There is a growing field that recognises technology should be designed, understood and applied in a way that considers all outcomes and potential impact. Enter public interest technologists – those who operate in and around tech, who understand that designers and coders don’t function in a silo and that tech should be used to benefit society as a whole.

“Public interest technologists produce knowledge, hardware and software optimised to advance the public good. Therefore, they are less likely to mine and trade in personal data,” says Mutale Nkonde, founding chief executive of non-profit communications agency AI for the People.

“They provide resistance against technologies with sexist and racist inputs, and they look for new ways to design, deploy and govern advanced technological products through justice-affirming frameworks.”

Public interest technology has emerged from a need to reshape how technology works – and to hold accountable those who hold the power. Definitions of the field can vary, however.

Sarah Drinkwater, director of responsible technology at Omidyar Network, and former head of Google’s physical start-up hub, Campus London, points to a lack of shared language.

“A core challenge for those of us working to build a more equitable technology ecosystem is the lack of common language to rally around,” says Drinkwater. “Technologists, academics, activists and us, as the public, all use different words to mean the same things. When I hear the term ‘public interest technology’, my first question is: who is that public? There’s such clarity around what we don’t want – systems that entrench inequalities – but we need to get clearer on what we do, to help us all accelerate; whether it’s responsible technology, humane or ethical technology, or public interest.”

Algorithmic bias

Conversations about the problematic impact of technology often refer to algorithmic systems and their increasing prevalence in our public and private lives. Algorithms are not built in a bubble, and their potential negative consequences, either intended or unintended, are far-ranging – from pressurising us to purchase, to infringing on our fundamental right to privacy, such as police forces’ use of facial recognition for public surveillance.

Algorithmic bias also means that the effect is intersectional: race, disability, class and gender all have an impact on how decisions are made about our lives, without us necessarily being conscious of it.

Nkonde was part of a team that helped introduce the Algorithmic Accountability Act into the US House of Representatives in 2019. The bill calls for large tech companies to audit the algorithmic systems they build, to ensure they do not violate the civil rights of people from protected classes.

To address algorithmic bias, Nkonde says the US government should mandate independent impact assessments that would make clear how people from protected classes are affected. She likens this to the role of the US Food and Drug Administration, which makes sure food and drugs are safe.

Public interest technology goes beyond discussions of regulation, which often becomes a back-and-forth game between big tech and government authorities. So, are discussions about technology’s wider impact too dominated by regulation? No, says Nkonde. “We need discussions about regulation, because there is none, but we also have to discuss how to ensure communities understand how algorithmic decision-making systems work to build agency among the most impacted groups. Companies themselves should also take their corporate responsibility obligations seriously.”

Carl Miller, research director at the Centre for the Analysis of Social Media at think-tank Demos, says regulation is just one part of the picture. “We often try to deal with online problems simply by regulating the tech giants, rather than trying to find other ways of dealing with the problem,” he adds.

Technology ‘up for grabs’

It’s important that there exists the prospect of an alternative to the systems we’ve become accustomed to, says Miller. “Within my own field of social media research and online analytics, the reason it’s so important to have a civic society that can do this is to, basically, keep an independent voice that can hold the tech giants accountable.

“Technology is political in many ways; it’s kind of what power flows through, and it shapes our lives – and you don’t have to look far to find this radical alternative to the settled order. In Taiwan, digital democrats have made entirely new ways of connecting people with politics that are very open-source and radically transparent.”

He adds: “The decisions being made around platforms, or where data goes, or what our experiences are using these platforms – they’re up for grabs, they’re disputed, and they don’t have to be the way that they are.”

Public interest technology can take the form of platforms that operate differently from the established tech players – a search engine that plants trees, for example. But commercial sustainability can be an issue, notes Miller.

“The problem is, because they’re not run for profit – and they’re certainly not run using this enormous capitalist architecture that has proven itself to be so successful – very, very few, if any, have managed to scale up and make genuine competitors,” he says.

Funding is an ongoing challenge for public interest technologists, says Nkonde. “Funding is dependent on philanthropic funds, and that carries a certain amount of risk. Therefore, organisations like mine have to find a way to capitalise on their work to make it sustainable.”

Drinkwater also points to the power imbalance. “Whether you’re an ‘ethical owner’ embedded in big tech, or a start-up designed in opposition to Silicon Valley norms, the core challenge is the imbalance of power between several large companies and everybody else.”

The Omidyar Network has developed a toolkit for technologists to help them understand complex topics, including AI bias and surveillance. “We see incredible appetite from workers to do the right thing, but employers are constrained by their business practices,” says Drinkwater. “So, for those working in the field – and the field is so broad, from engineers building in co-ops to researchers and beyond – they’re pushing against the grain, exactly as earlier generations of tech workers did in building Silicon Valley.”

This article was first published in the April 2021 issue of Impact.
