FEATURE | 25 September 2017

Human intervention



Using machine learning to make decisions around public services can save time and money, but can also exacerbate existing discrimination. Bronwen Morgan looks at potential solutions

[Image: police tape]

Machine-learning algorithms are everywhere. They are used by taxi apps such as Uber to calculate when to impose surge pricing during peak times, and by Facebook to decide what content – and ads – to display. 

They are also beginning to be used by some financial services companies to inform lending decisions, and by police forces for predictive policing. Predictive-policing algorithms use past crime data to flag the people and areas at greatest risk of future crime, helping police chiefs decide where best to allocate resources. However, recent studies have highlighted that this approach may reinforce bad policing habits.
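The feedback loop at the heart of this concern can be shown with a toy simulation (an illustrative sketch, not a model from the article or from any real policing system; all figures are invented). Two areas have identical true crime rates, but patrols are allocated in proportion to historically recorded crime, and only crime where the patrols are gets recorded:

```python
TRUE_CRIMES = 100.0  # actual crimes per period in each area (identical)

def simulate(rounds, records):
    """Patrols follow cumulative recorded crime; newly recorded crime
    follows the patrols. Returns each area's patrol share per round."""
    records = list(records)
    history = []
    for _ in range(rounds):
        total = sum(records)
        shares = [r / total for r in records]
        history.append(shares)
        # True crime is equal in both areas, but an area's recorded
        # crime grows in proportion to the patrol presence there.
        records = [r + TRUE_CRIMES * s for r, s in zip(records, shares)]
    return history

# Seed with a small historical imbalance in the records: 55 vs 45.
history = simulate(rounds=10, records=[55.0, 45.0])
print(history[0], history[-1])
```

Even after many rounds, the patrol split stays at 55:45: the historical imbalance in the data never self-corrects, because the algorithm can only see the crime its own allocation choices allowed police to observe.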

A review by the Human Rights Data Analysis Group – a not-for-profit organisation that uses science to analyse potential violations of human rights around the world – showed that, as these algorithms are based on databases of crimes known to police, they cannot predict patterns of crime that are different ...