FEATURE
24 May 2024

Hannah Perry in seven: Disinformation, AI and democracy



Hannah Perry discusses researching disinformation and conspiracy theories, the rise of generative AI and how her experience as a teacher has helped her research career.


1. What is the biggest challenge of researching disinformation and conspiracy theories?

Not having access to data from the biggest social media platforms, which restrict access for external researchers or only make it available for a significant fee. While the EU’s Digital Services Act includes transparency provisions for accredited researchers, the UK’s Online Safety Act doesn’t mandate this, so access remains very limited.

2. How can regulators and platforms draw the line between misinformation and freedom of speech?

Lines are already drawn in law surrounding speech that we recognise is harmful and, therefore, illegal, so the task for regulators and platforms is about ensuring that illegal content is removed rapidly. Where there’s speech that can be harmful but legal – which might include misinformation – it’s more helpful to recognise this as on a spectrum of risk, where sensitivity to the topic and context is crucial for decision-making. ‘Speech’ or ‘content’ decisions need to be incredibly nuanced, so it’s important that platforms invest sufficiently in sensitive oversight and consider how they develop these policies (and with what level of citizen engagement) very carefully.

3. Are you concerned by the proliferation of generative artificial intelligence (AI)?

Yes, I’m concerned that the adoption of large language models and audiovisual generators has accelerated before we have sufficient safeguards in place – especially in such a big year of elections. These innovations challenge our understanding and assumptions about what is ‘real’, what is accurate, and ‘who’ is acting. When trust in institutions and political actors is already very low, this can damage values that we hold dear, particularly truth. Political and corporate leaders will need to act rapidly to establish, transparently, how they intend to use these tools, so that trust in their communications remains.

4. Can generative AI be effectively and ethically moderated?

Generative AI companies and platforms have significant responsibilities to improve the approach to moderation, whether that be ensuring users have clear guardrails for how tools should be used, providing clear information about the potential inaccuracy of content, or investing in ‘prompt hacking’ and ‘red teaming’ exercises that can be used to mitigate model misuse. I don’t think social media platforms need new rules for synthetic content, but they should double down on enforcement, removing harmful content whether it’s generated by human or machine. Synthetic content should be labelled, or enable labelling, so that users have transparency about its provenance.

5. How important are citizens in helping to answer these questions?

They are crucial to tackling digital policy challenges. Sadly, digital skills and jobs remain the preserve of an elite few, which means a range of voices and risks are not considered in the design or regulation of powerful technologies. This has resulted in technologies incentivising behaviours that aren’t conducive to a society that cares about equality, truth and mutual respect. Deliberative research methods can be very effective for bringing in voices at multiple stages of digital policy development.

6. You are a qualified secondary school teacher. What lessons have you brought from teaching into your research career?

Teaching showed me the variety of complex factors that can affect how someone behaves – and that, often, ‘the solution’ isn’t the one you had in mind. It reinvigorated my love of observational and qualitative research; it taught me to start with open questions, humbly listen to the challenges that people are facing, and be ready to reconsider and learn from others what policy solutions might actually work.

7. Are you optimistic about how democratic systems are evolving in the digital age?

No, I haven’t seen sufficient evidence that they are evolving fast enough. It’s crucial that all citizens have access to the internet and digital civic literacy skills, so they can source good-quality information and deliberate confidently with others. National and local government could be using online methods more effectively to enable participation in policy-making, as a means of rebuilding trust between citizens and their political representatives.

Hannah Perry is lead researcher at CASM, the digital research unit at think tank Demos, where her work includes programmes to investigate disinformation and conspiracy theories in local information ecosystems. Prior to Demos, Perry led research and social impact programmes with a focus on harmful attitudes and behaviours online and offline.

This article was first published in the April 2024 issue of Impact magazine.
