News – 12 March 2024

Consider intent and consent in AI research design


UK – Researchers should consider questions of intent, consent and accountability when it comes to applying AI in public sector research design, according to speakers at the MRS Annual Conference.


The public want AI to be used in a way that augments skills and human capabilities, according to Sylvie Hobden, head of public attitudes at the Responsible Technology Adoption Unit.

Speaking during a session on considering public attitudes to AI in research design, Hobden said: “AI is a partner in knowledge creation, but it still feels like there is a huge role for the researcher to apply expertise and context.”

While AI “does not feel like a silver bullet” as it cannot yet develop its own knowledge, that may change in future, added Hobden. She continued: “We may get to the point where AI can create knowledge in its own right and we may then want to draw lines. Are there certain tasks that we would want to reserve for researchers, for example?”

Lucy Farrow, associate partner at Thinks Insight & Strategy, said the issue of intent should be considered when thinking about AI in social and government research. 

Farrow said: “As researchers, we approach our work with intent – we have an ethical orientation. If an AI tool is interviewing a participant, how do we think about the intent which is coded into a machine system, and how does that compare to the intent of a human researcher?”

Later in the session, Farrow made the point that intent is not easily measured: “Our legal system is based partially on intent. Harm is one thing we can measure, but intent is not the same. If we put that in the context of research, what is the intent of the systems we use?”

These questions are particularly pertinent in the realm of public sector research, said Chloe Juliette, associate director at Thinks Insight & Strategy. Juliette said: “There is a real sense of the social contract with government. Research for the public sector is generally seen as trying to improve lives or policies in some way and so there’s a kind of value placed on the respect that is shared for stories.”

There is “quite a lot of openness from the public” to the use of AI in the public sector, according to Hobden. She said: “People might start off in a [focus] group saying that they don't know what to say about AI, but then 10 minutes in, you get a lot of value. It reinforces the role of public attitudes research.”

However, Hobden added: “People really aren't happy with AI if it is not going to be consistently accurate. Transparency is also very important to people – both in terms of whether AI has been used at all, and also how it is being used. This feeds into accountability – people want to be able to challenge and question the outcomes of AI and to do that they need to understand a bit about how those decisions are being made.”

Additionally, Hobden said there must be a tangible benefit for individuals as to why AI has been used. She said: “When we speak to the public, there is an overarching sense of risk aversion – the risks tend to be more salient than the benefits. As researchers, it is important to articulate the benefit for the individual, not talking about the efficiency of the research in general. It needs to be articulated powerfully.”
