Algorithmic Transparency Recording Standard (ATRS) Mandatory Scope and Exemptions Policy
Published 17 December 2024
1. Summary
This is the mandatory scope and exemptions policy for the Algorithmic Transparency Recording Standard (ATRS). It sets out which organisations and algorithmic tools are in scope of the mandatory requirement to publish ATRS records, as announced in February 2024 in the previous government’s response to the consultation on the AI White Paper “A pro-innovation approach to AI regulation”. It also sets out the steps required to ensure that sensitive information is handled appropriately.
2. Introduction
Since the beginning of 2022, the Algorithmic Transparency Recording Standard (ATRS) has been an established mechanism for the proactive publication of information on the use of algorithmic tools in the public sector. The ATRS has been piloted with various organisations and enhanced and iterated multiple times, and on 6 February 2024 it was mandated across all government departments, with a stated intent to extend the requirement to the broader public sector over time. To implement this mandatory rollout, we need to draw clear lines around which organisations and tools are in scope, and around the types of information that, for various reasons, may be too sensitive for publication on GOV.UK.
Transparency around how the public sector is using algorithmic tools is useful and appropriate in most circumstances and should be our default position. However, there is some need for caution to make sure that information that is sensitive or confidential is handled properly.
This document sets out the scope for mandatory publication of ATRS records, implementing the cross-government policy set out in the AI White Paper consultation response of 6 February 2024.
It sets out:
- The organisations in scope of this policy, which currently comprise central government (with an intent to extend this more broadly across the public sector in future)
- The algorithmic tools that are in scope
- The steps to be taken to make sure that sensitive information is handled appropriately
3. Organisations in scope
3.1 Overall scope
The mandatory requirement to complete ATRS records currently applies to central government. For the purposes of this policy, this consists of the following:
- Ministerial departments, and
- Non-ministerial departments, and
- Arm’s-length bodies (ALBs), meaning executive agencies and non-departmental public bodies, which provide public or frontline services, or routinely interact with the general public.
This scope is intended to capture the majority of central government uses of algorithmic tools, without placing a disproportionate burden on large numbers of very small or non-frontline arm’s-length bodies that are unlikely to be responsible for any in-scope algorithmic tools. We will work with departments to finalise which arm’s-length bodies fall in or out of scope, but as examples we would anticipate the following:
Examples of organisations in mandatory scope:
- All ministerial departments, e.g. MoJ, DfE, DSIT
- All non-ministerial departments, e.g. HMRC, The National Archives, the Competition and Markets Authority
- All other arm’s-length bodies (ALBs) that provide public or frontline services, or routinely interact with the general public, e.g. HM Land Registry, HM Prison and Probation Service
Examples of organisations out of mandatory scope:
- Arm’s-length bodies that do not provide public or frontline services, or routinely interact with the general public. Likely examples may include:
  - Shared Business Services Limited
  - National Infrastructure Commission
  - Biometrics and Forensics Ethics Group
- Any organisations that are not in scope of the Freedom of Information Act.
3.2 Initial rollout
The initial rollout of the mandatory policy is proceeding in two phases:
Phase 1: Most ministerial departments and HMRC, specifically:
- Cabinet Office
- Department for Business and Trade
- Department for Culture, Media & Sport
- Department for Education
- Department for Energy Security & Net Zero
- Department for Environment, Food & Rural Affairs
- Department for Science, Innovation & Technology
- Department for Transport
- Department for Work & Pensions
- Department of Health and Social Care
- Foreign, Commonwealth & Development Office
- HM Revenue & Customs
- HM Treasury
- Home Office
- Ministry of Defence
- Ministry of Housing, Communities & Local Government
- Ministry of Justice
Phase 2: The remaining ministerial and non-ministerial departments, and arm’s-length bodies that fall in the scope listed above.
As a DSA-endorsed Standard, the ATRS remains recommended across the broader public sector. Hence, the sections below will also help other organisations determine the tools for which it would be good practice to publish ATRS records.
4. Algorithmic tools in scope
Organisations determined to be in mandatory scope above are required to publish ATRS records for algorithmic tools they are currently using in relevant use cases.
4.1 What is an ‘algorithmic tool’?
An algorithmic tool is a product, application, or device that supports or solves a specific problem using complex algorithms.
We use ‘algorithmic tool’ as an intentionally broad term that covers different applications of artificial intelligence (AI), statistical modelling and complex algorithms. An algorithmic tool might often incorporate a number of different component models integrated as part of a broader digital tool.
4.2 For which tools must I complete an algorithmic transparency record?
The mandatory requirement to publish an ATRS record applies to algorithmic tools that either:
- have a significant influence on a decision-making process with public effect, or
- directly interact with the general public.
‘Significant influence’ includes where an algorithmic tool meaningfully assists, supplements, or fully automates a decision-making process. This could be a tool that plays a triaging or scoring function within a wider process.
By ‘public effect’ we mean a decision-making process having an impact on members of the public, where the latter are understood as any individuals or groups of individuals, irrespective of their nationality or geographical location. Impact on members of the public also includes algorithmic tools directly processing data or information people have submitted as part of a wider process, e.g. an application, complaint or consultation submission.
To decide whether a decision-making process has a public effect, you might want to consider whether usage of the tool assists, supplements or fully automates a process which:
- materially affects individuals, organisations or groups
- has a legal, economic, or similar impact on individuals, organisations or groups
- affects procedural or substantive rights
- impacts eligibility for, receipt of, or denial of a programme
Note that this is intended to apply to situations where an algorithmic tool is influencing specific operational decisions about individuals, organisations or groups, not where a tool is an analytical model supporting broad government policy-making. Analytical models in scope of the guidance in the Aqua Book will typically be outside of ATRS scope (though it is possible to envisage some specific circumstances where both would be applicable, see examples below).
Examples of tools that could fall within the scope of these criteria are:
- a machine learning algorithm providing members of the public with a score to help a government department determine their eligibility for benefits (impact on decision-making with public effect)
- a chatbot on a government website interacting directly with the public which responds to individual queries and directs members of the public to appropriate content on the website (direct interaction with the public)
Examples of tools that would likely not fall within the scope of the criteria include:
- a tool being used by a government department to transform image to text (e.g. used in digitisation of handwritten documents) as part of an archiving process (no significant decision or direct public interaction)
- an automated scheduling tool which sends out internal diary invites from a mailbox (no public effect)
Further examples are listed below in Annex 1.
To emphasise, the context of use of the algorithmic tool matters here. The same image-to-text algorithm above might be in scope if used instead to digitise paper application forms for a government service (e.g. poor performance of the algorithm on some handwriting styles might well have an influence on success rates for individual applicants).
Note that the algorithmic tool scope listed above is that of the policy for mandatory ATRS adoption in central government. If you are using an algorithmic tool that does not strictly meet these criteria but you would like to provide the general public with information about it, you can still fill out and publish an algorithmic transparency record.
4.3 At which stage in a tool’s development lifecycle should an ATRS record be created?
The mandatory requirement to publish an ATRS record applies to tools that are in Beta/Pilot or Production phase.
Teams are welcome to submit records for tools in earlier stages of the lifecycle, but it is not mandatory to do so.
When a tool that already has a published record is later retired, the responsible team should submit an updated record changing the information in the phase field to ‘Retired’. This update will be reflected on the published record on GOV.UK.
5. Exemptions
5.1 What information should organisations not publish?
The ATRS has been designed to minimise likely risks that could arise from publication of records (e.g. to security, privacy or intellectual property).
Situations where no information can be safely published are expected to be unusual (e.g. in cases where even the existence of a tool cannot be made public for security reasons).
More commonly, for some tools, there may be particular information requested in the ATRS that you may be concerned about releasing into the public domain, even if the majority of the information about the tool is publishable. This may relate to a risk of gaming the tool, risks to national security, infringing intellectual property or releasing commercially sensitive information.
In most such instances, the appropriate response is to reduce the level of information supplied for relevant fields, for example giving a broad description of the type of data used by a tool instead of specific details of individual data sources, or a broad summary of how a tool works instead of precise information about the system architecture.
In developing the rationale behind the exemptions for this policy, we have aligned with those set out in the Freedom of Information Act 2000 (“FOIA”). Although FOIA is a reactive means of providing information to the public, while the publication of ATRS records is proactive, we settled on the FOIA exemptions as the basis for our exemptions policy because the logic around which types of information are too sensitive to publish openly remains the same. Moreover, FOIA handling is firmly established in the public sector as business as usual, which reduces the administrative burden on organisations complying with the ATRS mandate.
As a general rule, this ATRS Scope and Exemptions Policy does not require the publication of information that would be subject to an exemption under access to information legislation, i.e. the FOIA, Environmental Information Regulations and data protection legislation.
To understand how this applies in practice, imagine that you created a full internal version of an ATRS record and (hypothetically) received an FOI request to publish that record. How would you respond to such a request?
- If you would release the record in full, then the same applies to proactive publication of the record.
- If you would release some of the information in the record, but would need to redact parts of it under FOI exemptions, then you should remove or de-sensitise the exempt content from the ATRS record prior to publication.
- In (rare) circumstances where you would not be able to confirm the existence of the tool, e.g. you would issue a neither-confirm-nor-deny response to the hypothetical FOI request, it would be inappropriate to publish the ATRS record at all.
Not all FOIA exemptions are relevant here. Specifically:
- The ATRS is designed to capture tool-level information rather than personal information. Concerns about publishing personal data should therefore not apply when considering whether to remove or de-sensitise partial or whole ATRS records (FOIA section 40).
- There are limitations on cost of responses, vexatious queries and information already in the public domain that are necessary for a reactive duty such as the FOIA, to avoid disproportionate effort in responding to an unbounded number of incoming requests (FOIA sections 12, 14, 21, 22). They are not relevant to the publication of ATRS records, which is inherently limited to one record per tool.
- Exemptions for reasons of commercial sensitivity (FOIA section 43) need to be applied with care; see the section below on dealing with commercially sensitive information.
5.2 How to exempt types of information within a transparency record
As mentioned above, in most cases it will be sufficient to give higher-level information within particular fields. However, where entire fields are fully exempt from publication, or you wish to indicate explicitly why the information in a record is limited, we recommend recording this within a record in the following format, with the third column giving a general description of the reason. For example:
Example 1
2.4.2.9 | Dataset purposes | Indicate how each dataset was used during the model development process (e.g. training, validation, testing). | EXEMPT: National security |
Example 2
2.4.2.9 | Dataset purposes | Indicate how each dataset was used during the model development process (e.g. training, validation, testing). | EXEMPT: After considering factors for and against disclosure of this information, we have concluded that it is not in the public interest to disclose information on the specific datasets. This is based on the specific information contained within them having the potential to give malicious actors insights on how to evade/game the tool and therefore endanger the UK’s defence capabilities (section 26 of the FOIA 2000). |
If you have any concerns about publishing information that are not covered above, please get in touch with the ATRS team at algorithmic-transparency@dsit.gov.uk to discuss.
5.3 Dealing with commercially sensitive information
Many algorithmic tools used in the public sector will involve an external supplier in some form, and hence publication of an ATRS record will require some consideration of commercially sensitive information.
There is a need for care in applying commercial exemptions, i.e. in applying Section 43 of the Freedom of Information Act.
If this exemption is applied too broadly, it would undermine the overall intent of this policy to increase transparency on the government’s use of algorithmic tools, and limit its benefits. Commercial suppliers that wish to sell algorithmic solutions to public bodies, for use in processes that impact members of the public, should be comfortable with the level of transparency expected of the public sector. Public bodies that are procuring solutions from vendors should make this expectation clear in their invitation to tender or other route to market.
The primary focus of the ATRS is the use case that a tool is deployed into, and the steps taken to ensure that a tool is appropriate in that context. Though there is some information required about the technical aspects of the tool, there is flexibility on how much information is provided here, and the Standard is designed to be consistent with emerging industry good practice on model cards (and indeed information that might often be published by vendors in a white paper).
As such, public authorities are encouraged to work with their supply chains at the start of the process to minimise the amount of information being withheld for commercial reasons.
6. Annex 1: additional examples of in and out of scope tools
| Type of tool | Example use case(s) in mandatory ATRS scope (rationale) | Example use case(s) out of mandatory ATRS scope (rationale) |
| --- | --- | --- |
| Large language model | An LLM developed/used as a digital assistant to suggest services or resources someone may be eligible for or benefit from after they explain their circumstances to it (direct interaction with the public)<br>A tool used to analyse text submitted by members of the public, providing a summary to be read by humans who will take decisions based on that information, e.g. consultation submissions feeding into an overall summary, which could later inform policy or legislation (direct interaction with the public as the first-line analysis of their submissions, and impact on decision-making) | Ad-hoc usage of large language models, such as Microsoft Copilot being used by individual civil servants internally within an organisation/department to transcribe and summarise meetings (no significant decision or direct public interaction) |
| Chatbots | A chatbot on a website interacting directly with the public which responds to individual queries and directs members of the public to appropriate content on the website (direct interaction with the public) | A similar chatbot for purely internal purposes, e.g. as part of an internal IT support channel (no significant decision or direct public interaction) |
| Text recognition | A tool being used by a government department to transform image to text to digitise paper application forms for a government service (potential influence on decisions, e.g. poor performance of the algorithm on some handwriting styles might well have an influence on success rates for individual applicants) | A tool being used by a government department to transform image to text (e.g. used in digitisation of handwritten documents) as part of an archiving process (no significant decision or direct public interaction) |
| Productivity | An AI-based automated scheduling tool applying prioritisation criteria to schedule medical appointments (direct interaction with the public) | An automated scheduling tool which sends out internal diary invites from a mailbox (no significant decision or direct public interaction)<br>An algorithmic tool that uses written input to generate financial reports, emails and charts, or autofill internal admin documents (no significant decision or direct public interaction) |
| Scoring and risk assessment | A machine learning algorithm providing members of the public with a score to help a government department determine their relative priority for accessing a public service (impact on decision-making)<br>A tool used to score the risk of harm associated with a detainee and recommend the level of precaution required (impact on decision-making)<br>A tool used to calculate the complexity of a case and assign a different caseworker depending on the score assigned (if assessment determines that how this is done could create an impact on decision-making)<br>A real-time analysis tool that combines readily available data with newly obtained on-the-spot data to score or risk-profile a person, object, location or event, e.g. a customs scan (potential to be used in decisions such as who or what to search or check, impacting individuals) | A statistical model estimating overall demand for a public service to inform policy-making and overall capacity planning (not decisions about individuals)<br>A predictive analysis tool that uses previous and predictive data to inform requirements and aid in risk assessment, e.g. future energy grid, road and rail usage, flood risk, or staffing requirements (not decisions about individuals) |
| Biometrics | A facial recognition model used to verify the identity of an applicant for a product (e.g. a passport) against a face held on file (potential to influence decisions, e.g. poor performance on matching particular faces could have an influence on success rates for individual applicants) | Use of facial or fingerprint recognition by a public sector employee to unlock a corporate mobile device (no significant decision or direct public interaction) |
| Image recognition | An image recognition algorithm reviewing images and labelling them when spotting a pre-trained identifier in the image, e.g. driving while on the phone (potential to influence decisions, e.g. individual drivers receiving fines/penalties)<br>A machine learning algorithm that reviews x-ray images using pre-trained images to aid with diagnosis (aids decision-making in dictating a healthcare response, plus direct public interaction with patients) | A matching algorithm that takes multiple data sources of varying quality to decide if two or more records of varying data types refer to the same individual, e.g. to clean up a database (no significant decision or direct public interaction) |
| Complex statistical models | A model creating a routine statistic used to directly inform a decision-making process, e.g. the QCOVID algorithms used as part of the Population Risk Assessment to add patients to the Shielded Patient List in February 2021 (directly influences decisions) | A model creating a routine statistic used to help an organisation understand broad trends and patterns, e.g. understanding the busyness of streets, or health indicators across populations (no significant decision or direct public interaction) |