Independent report

BritainThinks: Complete transparency, complete simplicity

This report, commissioned by the CDEI, outlines the findings from a deliberative public engagement exercise conducted by BritainThinks about algorithmic transparency in the public sector.

Details

What does this report cover?

In its review into bias in algorithmic decision-making, the CDEI recommended that government place a mandatory transparency obligation on all public sector organisations using algorithms to make significant decisions affecting individuals. To take this recommendation forward, the CDEI has been supporting the Central Digital and Data Office (CDDO) as it develops a standard for algorithmic transparency. This report details the findings from a deliberative public engagement exercise, carried out by BritainThinks on the CDEI’s behalf, exploring what meaningful transparency about the use of algorithmic decision-making in the public sector could look like in practice.

What are the key findings?

  • Participants felt that all categories of information about algorithmic decision-making should be made available to the public. They prioritised information about the role of the algorithm, why it is being used, and how to get further information or raise a query.

  • When working to design a prototype information format, participants allocated information categories to different tiers, with information such as the purpose of the algorithm in ‘tier one’, and more detailed information, such as the technicalities of the algorithm, in ‘tier two’. Participants expected the information in ‘tier one’ to be immediately available at the point of, or in advance of, interacting with the algorithm, and expected easy access to the information in ‘tier two’ should they choose to seek it out.

  • The use-case affected how proactively participants felt transparency information should be communicated. For lower-risk, lower-impact use-cases (such as the use of an algorithm in a car park), passively available transparency information was acceptable on its own. For higher-risk, higher-impact use-cases (such as the use of an algorithm in a recruitment process), participants wanted active communication of basic information upfront, alongside the passively available transparency information, to notify people that the algorithm is being used and to what end. They felt this upfront communication should be more targeted and personalised.

What happens next?

In addition to this public engagement work, CDDO, as policy sponsor, has been running a series of workshops with internal stakeholders and external experts to discover what information on the use of algorithmic decision-making in the public sector they would like to see published, and in what format. The findings from these workshops will be consolidated with the outcomes of this public engagement project and will inform the development of a standard for algorithmic transparency. A prototype of this standard will then be tested and evaluated in an open and participatory manner.

Updates to this page

Published 21 June 2021
Last updated 21 June 2021
  1. Added HTML version of the report

  2. First published.
