Independent report

Report commissioned by CDEI calls for measures to address bias in police use of data analytics

The Royal United Services Institute (RUSI) has published research, commissioned by CDEI, into the use of algorithms in policing and the potential for bias.

Documents

RUSI Report - Data Analytics and Algorithmic Bias in Policing


Details

RUSI’s research involved interviews with UK police officers, who describe a varied landscape of technological sophistication across forces in the UK. The evidence suggests an absence of consistent guidelines for the use of automation and algorithms, which may be leading to discrimination in police work.

This research forms an important part of the CDEI’s overall review into algorithmic bias. We are working on draft guidance to help address the potential for bias in predictive analytics in policing, and will make formal recommendations to the Government in March 2020. Read more about our 2019/20 Work Programme.

What were the findings?

  1. Multiple types of potential bias can occur. These include discrimination on the grounds of protected characteristics; real or apparent skewing of the decision-making process; and outcomes and processes which are systematically less fair to individuals within a particular group.
  2. Algorithmic fairness is not just about data. Rather, to achieve fairness there needs to be careful consideration of the wider operational, organisational and legal context, as well as the overall decision-making process informed by the analytics.
  3. A lack of guidance. There remains a lack of organisational guidelines or clear processes for scrutiny, regulation and enforcement for police use of data analytics.

How were these findings identified?

RUSI carried out a series of in-depth interviews and a roundtable with a range of police forces in England and Wales, civil society organisations, academics and legal experts.

What are the implications of this piece of research?

RUSI highlighted the following important implications:

  1. Allocation of resources. Police forces will need to consider how algorithmic bias may affect their decisions to police certain areas more heavily.
  2. Legal claims. Discrimination claims could be brought by individuals scored “negatively” in comparison to others of different ages or genders.
  3. Over-reliance on automation. There is a risk that police officers become over-reliant on the use of analytical tools, undermining their discretion and causing them to disregard other relevant factors.

Published 16 September 2019