Report commissioned by CDEI calls for measures to address bias in police use of data analytics
The Royal United Services Institute (RUSI) has published research, commissioned by CDEI, into the use of algorithms in policing and the potential for bias.
RUSI’s research involved interviews with UK police officers, who described wide variation in technological sophistication across forces in the UK. The evidence suggests an absence of consistent guidelines for the use of automation and algorithms, which may be leading to discrimination in police work.
This research forms an important part of the CDEI’s overall review into algorithmic bias. We are working on draft guidance to help address the potential for bias in predictive analytics in policing, and will make formal recommendations to the Government in March 2020. Read more about our 2019/20 Work Programme.
What were the findings?
- Multiple types of potential bias can occur. These include discrimination on the grounds of protected characteristics; real or apparent skewing of the decision-making process; and outcomes and processes which are systematically less fair to individuals within a particular group.
- Algorithmic fairness is not just about data. Achieving fairness requires careful consideration of the wider operational, organisational and legal context, as well as of the overall decision-making process informed by the analytics.
- A lack of guidance. There are still no organisational guidelines or clear processes for the scrutiny, regulation and enforcement of police use of data analytics.
How were these findings identified?
RUSI carried out a series of in-depth interviews and a roundtable with a range of police forces in England and Wales, civil society organisations, academics and legal experts.
What are the implications of this piece of research?
RUSI highlighted the following important implications:
- Allocation of resources. Police forces will need to consider how algorithmic bias may affect their decisions to police certain areas more heavily.
- Legal claims. Discrimination claims could be brought by individuals scored “negatively” in comparison to others of different ages or genders.
- Over-reliance on automation. There is a risk that police officers become over-reliant on the use of analytical tools, undermining their discretion and causing them to disregard other relevant factors.