CDEI publishes review into bias in algorithmic decision-making
The CDEI has published the final report of its review into bias in algorithmic decision-making.
Details
The government commissioned the CDEI to review the risks of bias in algorithmic decision-making. This review formed a key part of the CDEI’s 2019/2020 Work Programme, though completion was delayed by the onset of COVID-19. This is the final report of the CDEI’s review and includes a set of formal recommendations to the government.
The government tasked the CDEI with drawing on expertise and perspectives from stakeholders across society to provide recommendations on how it should address this issue. The CDEI also provides advice for regulators and industry, aiming to support responsible innovation and help build a strong, trustworthy system of governance. The government has committed to responding to the CDEI's review.
What was the focus of the review?
The CDEI focused on the use of algorithms in significant decisions about individuals. The review looks at the use of algorithmic decision-making in four sectors (policing, local government, financial services and recruitment) and makes cross-cutting recommendations that aim to help build the right systems so that algorithms improve, rather than worsen, decision-making. These sectors were selected because they all involve significant decisions about individuals, and because there is evidence of both the growing uptake of algorithms and historic bias in decision-making in these sectors.
What recommendations is the CDEI making?
The measures that the CDEI has proposed are designed to produce a step change in the behaviour of all organisations making life-changing decisions on the basis of data, with a focus on improving accountability and transparency. Key recommendations include:
- Government should place a mandatory transparency obligation on all public sector organisations using algorithms that have an impact on significant decisions affecting individuals.
- Organisations should be actively using data to identify and mitigate bias. They should make sure that they understand the capabilities and limitations of algorithmic tools, and carefully consider how they will ensure fair treatment of individuals.
- Government should issue guidance that clarifies the application of the Equality Act to algorithmic decision-making. This should include guidance on the collection of data to measure bias, as well as the lawfulness of bias mitigation techniques (some of which risk introducing positive discrimination, which is illegal under the Equality Act).
What happens next?
The CDEI will support industry, regulators and government in taking forward the practical delivery work needed to address the issues identified in the review, as well as future challenges that may arise.