FEATURE | 21 November 2024

How can AI support financial services research?


Applications of AI can help to predict vulnerability in the financial services sector, but relying on technology to help people make financial decisions can also be detrimental. Liam Kay-McClean reports on the recent MRS Financial Services Research conference.


How can technology help improve financial services research? Last week, the MRS Financial Services Research conference featured three presentations that examined the potential impact of artificial intelligence (AI) and synthetic data, but also carried a note of warning for anyone conflating increased use of digital technology in banking and finance with supporting people who are financially vulnerable.

Predicting vulnerability
Michael Fisher, head of data at The Wisdom Council, discussed an AI platform the company built to help predict which people were most financially vulnerable and identify the reasons why.

The Financial Conduct Authority reports that more than 24 million people in the UK display characteristics of vulnerability, and the tool built by The Wisdom Council sought to identify those who would struggle financially to cope with an adverse life event.

Fisher said: “What we are trying to understand is if something was to happen to you – a life moment – how would you react? And would you react differently to someone else?

“If you have 60 million people on your database, who do you want to speak to first? The people you want to speak to first are the people who can’t cope – those are the people who are going to break, rather than bend, if something was to happen to them.”

Factors that indicated someone’s potential vulnerability to a personal financial crisis included curiosity and whether they could think of a way to solve a problem, such as a job loss; creativity, such as understanding how to make a solution happen; and connectedness, or the support available to deal with an issue, such as family and friends.

Some of those most or least vulnerable could be surprising, Fisher said. “There are people who are financially very low, but their ability to cope is very high,” he explained. “On the other side, some people have substantial assets, but if something was to happen to them, their ability to cope was very low, and therefore their likelihood of making the wrong decision in that moment is very high.”

Using the predictive AI model, financial services companies could identify the people who are potentially most vulnerable and target support accordingly.
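The talk did not go into implementation detail, but a highly simplified scoring model along these lines might look like the sketch below. It is purely illustrative: the factor scales, weights and logistic threshold are assumptions made for the example, not details of The Wisdom Council’s platform.

```python
# Illustrative sketch only: a toy vulnerability score built on the three
# factors Fisher described (curiosity, creativity, connectedness).
# The scales, weights and threshold are invented for illustration and do
# not reflect The Wisdom Council's actual model.
import math

def vulnerability_score(curiosity: float, creativity: float,
                        connectedness: float) -> float:
    """Return a 0-1 score; higher means more vulnerable.

    Each input is a 0-10 rating. Low ratings on all three factors
    suggest someone more likely to 'break rather than bend'.
    """
    # Invert the ratings: low coping resources -> high vulnerability.
    risk = ((10 - curiosity) * 0.4
            + (10 - creativity) * 0.3
            + (10 - connectedness) * 0.3)
    # Squash the weighted risk onto 0-1 with a logistic function.
    return 1 / (1 + math.exp(-(risk - 5)))

# A respondent with strong problem-solving habits but little support:
print(round(vulnerability_score(curiosity=8, creativity=7, connectedness=2), 2))
```

In this toy version, a respondent who scores low on all three factors rises to the top of the contact list, echoing Fisher’s “break rather than bend” framing.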

Digital divides
Ben Skelton, chief executive officer at Quadrangle, examined whether driving up digital capability helped to tackle financial vulnerability, and found a disconnect: half of those classed as financially vulnerable were in fact digitally capable, which meant the approach to these groups needed a rethink.

“Digital is not the silver bullet in helping to address capability and confidence,” Skelton told the conference, adding that while “it might well help with financial freedom”, there was not enough evidence to say that digital services are effective at boosting capability and confidence in dealing with finances.

AI tools intended to help people with financial decisions could also prove detrimental, Skelton added, with people outsourcing decision-making to technology rather than learning to make the best financial decisions for themselves.

Skelton said: “How do we create frictions to put into the process so people can see a little bit more about what is going on the more we outsource to digital and AI? I know that some people will massively benefit and will love many of these digital tools out there and will love some of the AI stuff that is going on, but there is almost for some people the luxury of not having to think as opposed to actually trying to help people’s capability, confidence and understanding of what is going on with their financial information and the decisions they are making.

“We could put some friction in somewhere that is helpful, that’s intelligent and helps people with this subject.”

He added that consumers did not want to know everything about financial services, but wanted support at the ‘big moments’, such as getting a mortgage. Face-to-face advice at these times helps boost confidence and trust in the advice provided, Skelton said.

“AI is not engineered the same way as human beings are, so therefore just replacing a lot of what we have been doing as human beings for many years with digital and AI means it is going to be built on different principles, and we need to stop and think about what those different principles are.”

Synthetic support
Paul Hudson, founder and chief executive at FlexMR, and Tereza Anderson, insight senior consultant at Aegon, discussed a trial comparing a panel that included synthetic respondents with one featuring only real human responses.

Anderson said that many long-term tracker studies struggle to keep response rates high, largely because sustaining engagement over time requires exercises to boost interaction with the panel or sample refreshes to bring in new respondents. “It is a constant battle to keep response levels high,” she said, and the decision was taken to examine how AI could support a tracker study and boost responses.

Hudson detailed an experiment that tested using synthetic data in two ways: completing partial responses, and filling survey quotas by generating entirely new responses. The study compared a control group of human responses with a sample of 1,032 responses, of which 767 came from real people and 265 were synthetic.

There was little difference between the two samples in the accuracy of the data they produced, Hudson said. However, he struck a note of caution on using synthetic data to predict future consumer sentiment and behaviour: “From a use case perspective, it would appear that synthetic data would not work as well in future-orientated questionnaires. So don’t use it to predict the future. It wouldn’t be able to predict Covid, for example.”
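Hudson described the comparison at a headline level only. As a generic illustration of how one might check whether a blended sample tracks a human-only control, the snippet below compares answer distributions for a single question; the data and the distance measure are invented for the example and are not FlexMR’s method.

```python
# Generic illustration of comparing answer distributions between a
# human-only control and a blended (human + synthetic) sample.
# The data here is invented; this is not FlexMR's actual analysis.
from collections import Counter

def answer_shares(responses: list[str]) -> dict[str, float]:
    """Proportion of respondents choosing each answer option."""
    counts = Counter(responses)
    total = len(responses)
    return {option: n / total for option, n in counts.items()}

def total_variation(control: list[str], blended: list[str]) -> float:
    """Half the sum of absolute share differences (0 = identical)."""
    a, b = answer_shares(control), answer_shares(blended)
    options = set(a) | set(b)
    return 0.5 * sum(abs(a.get(o, 0) - b.get(o, 0)) for o in options)

control = ["agree"] * 60 + ["neutral"] * 25 + ["disagree"] * 15
blended = ["agree"] * 58 + ["neutral"] * 27 + ["disagree"] * 15
print(f"Distance between samples: {total_variation(control, blended):.3f}")
```

A distance near zero on question after question would support Hudson’s finding that the blended sample behaved much like the human-only one, while future-orientated questions would be the place to expect the gap to widen.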
