Inclusion monitoring report findings 2024
Published 19 November 2024
Executive summary
Background
The Department for Science, Innovation and Technology (DSIT) is working to enable the widespread use of secure and trustworthy digital identity and attribute services across the UK economy. Digital identities can improve people’s lives by making everyday transactions simpler, quicker and more secure. Using digital identity services is not mandatory, but the Government is committed to ensuring that anyone who chooses to use them can do so.
The UK digital identity and attributes trust framework is a set of rules and standards that show what a good digital identity looks like. Digital verification services can get independently certified to demonstrate they are following these robust standards. The trust framework outlines how digital identity services can improve inclusion and encourages companies to adopt inclusive and accessible practices, such as choosing technologies which have been tested with users from a variety of demographics. While the potential barriers to using a digital identity are complex, the trust framework aims to ensure that digital identity services are as inclusive and accessible as possible.
Digital verification services which are certified against the trust framework must complete an inclusion monitoring report. The inclusion monitoring reports are a mechanism for building a general picture of the inclusivity of the certified digital identity market. Providers with a certified service respond to a survey of around 30 questions on areas such as identity evidence, technology, accessibility, data collection and biometrics. The results are anonymous and individual services are not assessed. The anonymised and aggregated results will be used to inform inclusion policy and further development of the trust framework.
Whilst the requirement to submit an inclusion monitoring report has existed in the trust framework since the beta version was published in June 2022, this year the report was significantly expanded to include more questions on a broader range of areas. The format for submitting the report was also changed to allow quantitative data collection that can be replicated and compared in future years. This is the first time that the results have been published.
The survey was completed online, and 42 certified firms submitted a response by the deadline in mid-May. This represents an 84% completion rate amongst certified services. There were some late responses, but these have not been included in the analysis for this publication.
Key findings
- Certified services accept a reasonably broad range of documents and evidence to create a digital identity or verify an attribute, from a passport or driving licence through to a utility bill or PASS card. This is in spite of some uses of digital identity services (such as right to work checks) requiring specific documents to be used.
- A range of providers offer their services via both web browsers and apps, which reduces the risk of excluding users with access to only one type of device.
- 54% of surveyed services adhere to the Web Content Accessibility Guidelines (WCAG) 2.0 (AA) or higher.
- Half of surveyed services do not collect any demographic data about their users. Of the half that do, only a handful of services are using this to monitor the inclusivity of their service.
- Of the services offering biometric technology, 49% offer an alternative route if users do not wish to use biometrics.
- Certified services using biometric technology conduct various types of testing. Around 30% of these services have information on the accuracy rates of their biometric technologies for different demographic groups.
- Cost and a lack of access to government-held data were the most commonly cited challenges to improving inclusion. Other frequently mentioned concerns included security, lack of data about inclusion, regulatory restrictions and a request for more government guidance.
Methodology
Aims
The government has committed to enabling the widespread use of inclusive and accessible digital identity and attribute services, so that digital identity products can be accessed by all those who choose to use them.
The aim of the inclusion monitoring reports is to collect data to build a general picture of the inclusivity of digital identity services certified against the UK digital identity and attributes trust framework beta (0.3) version. Completing the inclusion monitoring report is a requirement of certification.
The report provides key evidence for policy making, particularly for understanding if further policy intervention is required to support the provision of inclusive and accessible digital identity services.
Approach
The data was collected using a survey. The main considerations in the survey design were:
- Limiting the length of the survey to ensure it would take around 30 minutes to complete.
- Ensuring only the analysis team had access to the raw data and that responses were presented anonymously. Treating responses anonymously was considered integral to receiving honest responses from digital identity services.
The population of interest was all 50 services that were certified at the time of initial survey distribution. The achieved sample was 42. This was in part due to some services approaching the end of their certification and deciding not to re-certify (and therefore not to complete the survey).
Timing
The majority of responses were collected from mid-April 2024 to mid-May 2024, with late responses collected until the end of June 2024. Responses received after the end of June 2024 are not included in this analysis. There were 8 non-responses in total at the time of analysis.
Responses
42 firms responded by the deadline. Of those 42 firms, 37 had a live service. Those who did not yet offer a live service were directed to the end of the survey, as the survey was intended to capture information on how live services are designed and used. The response rate for firms included in the analysis was 84%. In total, the response rate was 88%.
Analysis of results
Analysis was conducted in Excel, with the raw data converted into table format and organised by question. The tables were then used to produce graphs to visualise the results. Responses were analysed by question and then quality assured by DSIT analysts.
Responses were divided into organisations with a live service and those without. Only those with a live service could respond to the detailed questions.
Open-text responses were coded by theme. Where possible, open-text responses were then grouped and counted.
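As an illustration of the kind of per-question tabulation and open-text theme coding described above, the minimal sketch below uses Python and pandas rather than Excel. The file name, column names and theme keywords are hypothetical and are not drawn from the actual survey data.

```python
import pandas as pd

# Hypothetical export of the survey responses: one row per certified service,
# one column per question (file and column names are illustrative only).
responses = pd.read_csv("inclusion_monitoring_2024.csv")

# Closed questions: convert each question into a frequency table,
# mirroring the per-question tables produced in Excel.
transaction_counts = responses["transaction_type"].value_counts(dropna=False)

# Open-text questions: code responses against a set of themes and count how
# many responses mention each theme (themes and keywords are illustrative).
themes = {
    "cost": ["cost", "expensive"],
    "government data": ["government data", "government-held"],
    "guidance": ["guidance", "guidelines"],
}
challenges = responses["inclusion_challenges"].fillna("").str.lower()
theme_counts = {
    theme: int(challenges.str.contains("|".join(keywords)).sum())
    for theme, keywords in themes.items()
}

print(transaction_counts)
print(theme_counts)
```

The same approach extends to each question in the survey: one frequency table per closed question, and one set of theme counts per open-text question.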
Findings
The report was divided into 7 different sections: general information about the service provider, documentation and evidence, technology, accessibility, data collection, biometrics and questions on possible future improvement.
General (Q1-7)
These questions gathered basic information about the organisations completing the survey so the aggregated results could be interpreted more accurately.
Most of the providers who completed the report are certified exclusively as an identity service provider. A smaller proportion are identity and attribute service providers, identity and orchestration service providers, or exclusively attribute service providers. A small number of services are certified as all three roles.
Most providers offer business-to-business services, or both business-to-business and business-to-consumer services. Very few providers exclusively offer business-to-consumer services. This could impact the inclusivity and accessibility of the certified services, as many are only offered via other businesses, which may embed them within their own systems that vary in how accessible and inclusive they are.
Transaction type
Transaction type | Number |
---|---|
B2B | 27 |
B2C | 1 |
Both | 7 |
Other - B2B2C | 3 |
Figure 1 (Base: 38)
In terms of organisation size, there is currently a broad range amongst providers, with a roughly even spread of large, medium and small size[footnote 1] businesses with a smaller proportion of micro businesses.
Most of the services stated that they offered between 1-5 use cases for customers. This aligns with the current number of live supplementary schemes (supplementary schemes are rules that exist in addition to the rules in the trust framework for specific use cases or sectors; the current schemes are Right to Work, Right to Rent and Disclosure and Barring Service (DBS) checks). However, around a quarter of services report offering more use cases, showing that there are further applications for digital identities beyond the current government owned schemes. These use cases included age verification, Know Your Customer (KYC) checks and qualification checks.
Documentation and evidence (Q8-10)
Lack of traditional identity documents, such as a passport or driving licence, can be a key barrier to someone being able to prove their identity or attributes. However, document acceptance can be driven by business demand and sector-specific regulatory compliance rather than the services themselves. For example, the right to work supplementary scheme specifies that employers can only use digital identity services that check a person’s identity using a British or Irish passport or Irish passport card for the check to be compliant.
The inclusion monitoring report nonetheless shows that certified services accept a broad range of documents to create and verify a digital identity or attribute, from traditional documents such as passports and driving licences to other sources, including telecom data and bank statements. This encouragingly suggests that there is already technical capacity to process and verify a variety of document types which people may have easier access to. We hope this will support the sector’s inclusivity as other use cases with less prescriptive requirements continue to grow.
Figure 2: Types of evidence accepted (Base: 37)
Types of evidence accepted
Type of evidence | Number of organisations that accept the evidence type |
---|---|
UK Passport | 33 |
UK Passport with NFC | 29 |
Non-UK Passport | 29 |
UK Driving Licence | 29 |
EEA Driving Licence | 27 |
National ID card with photograph | 25 |
EEA Identity card with NFC | 20 |
Residence permit | 20 |
Electoral register | 14 |
Utility bill | 12 |
Credit data | 10 |
Bank statement | 10 |
Council tax letter | 8 |
Birth or adoption certificate | 8 |
PASS card | 7 |
Bank account details | 7 |
Telco data | 6 |
Marriage certificate | 6 |
Knowledge Based Verification (KBV) questions | 4 |
Other - please specify | 4 |
Evidence of power of attorney (mark of Public Guardian) | 3 |
Social media | 1 |
Figure 3 (Base: 37)
The survey also asked certified services about the different Good Practice Guide 45 (GPG 45) profiles they offer. GPG 45 outlines the government guidelines on how to prove and verify someone’s identity. There are four different confidence levels (‘low’, ‘medium’, ‘high’ and ‘very high’) for identity checking that are applicable for different scenarios, and a number of different profiles within each confidence level. With the exception of 4 ‘very high’ confidence level profiles, there is at least one service offering each of the GPG 45 profiles. The most offered profiles are at a ‘medium’ level of confidence. This reflects the profiles that are needed for the three live supplementary schemes. If an individual doesn’t have traditional identity documents, it is less likely that they would be able to access ‘high’ or ‘very high’ confidence profiles.
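As an illustration of how the GPG 45 profile responses can be summarised by confidence level, the sketch below groups profile codes by their leading letter (L, M, H or V, for 'low', 'medium', 'high' and 'very high'). The example responses are invented and purely illustrative.

```python
from collections import Counter

# Map the first character of a GPG 45 profile code to its confidence level.
LEVELS = {"L": "low", "M": "medium", "H": "high", "V": "very high"}

# Hypothetical responses: each certified service lists the profiles it offers.
service_profiles = [
    ["M1B", "M1C", "H2B"],
    ["M1B", "L1A"],
    ["M2A", "M1B", "V1A"],
]

# Count how many services offer at least one profile at each confidence level.
level_counts = Counter()
for profiles in service_profiles:
    levels_offered = {LEVELS[p[0]] for p in profiles}
    level_counts.update(levels_offered)

print(level_counts)  # Counter({'medium': 3, 'high': 1, 'low': 1, 'very high': 1})
```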
Two services reported they support vouching as a means of proving identity. Vouching is when a person vouches for someone else by declaring they know them as the claimed identity. This is typically used as a method for proving identity when the person doesn’t have traditional identity documents. For example, vouching may be used to verify the identity of a child for their first passport. Certified services offering vouching is likely to support the inclusivity of the digital identity sector by facilitating access for those without traditional identity documents.
Technology (Q11-13)
Digital access is a key part of digital inclusion and being able to use a digital identity or attribute service. Whilst digital access can include access to the internet and digital skills, these were not in scope of the survey. The questions in this section focused on the type of device needed to access a certified digital identity or attribute service.
A mobile device such as a tablet or smartphone is required for most certified services, but 76% of respondents offer their service via web browsers that can be used on a desktop or laptop. Services generally support a range of Apple and Android operating systems and browsers.
Accessibility (Q14-19)
Ensuring that services are accessible means that individuals with different needs can use them. This could include individuals with vision or hearing impairments, individuals with physical disabilities or neurodiverse individuals, but accessibility features usually benefit a wide range of people and can make a service easier to use for all.
54% of services adhere to the Web Content Accessibility Guidelines (WCAG)[footnote 2] 2.0 (AA) or higher[footnote 3]. However, 27% of services answered that they did not know if their service met this standard.
Figure 4: Number of organisations adhering to WCAG (Base: 37)
Number of organisations adhering to WCAG
WCAG Version | Number of organisations |
---|---|
2.0 (A) | 0 |
2.0 (AA) | 4 |
2.0 (AAA) | 0 |
2.1 (A) | 2 |
2.1 (AA) | 9 |
2.1 (AAA) | 1 |
2.2 (A) | 0 |
2.2 (AA) | 4 |
2.2 (AAA) | 0 |
No | 2 |
Don’t know | 10 |
Not applicable | 4 |
Figure 5 (Base: 37)
Number of organisations adhering to WCAG by success criteria level
WCAG Rating | Number of organisations |
---|---|
A | 2 |
AA | 17 |
AAA | 1 |
Figure 6
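The grouping in Figure 6 can be reproduced from the version-level counts in Figure 5 by summing across WCAG versions at each success-criteria level, as the short sketch below shows.

```python
# Counts by WCAG version and level, as reported in Figure 5 (Base: 37).
figure_5 = {
    "2.0 (A)": 0, "2.0 (AA)": 4, "2.0 (AAA)": 0,
    "2.1 (A)": 2, "2.1 (AA)": 9, "2.1 (AAA)": 1,
    "2.2 (A)": 0, "2.2 (AA)": 4, "2.2 (AAA)": 0,
}

# Aggregate across versions by success-criteria level (A, AA, AAA).
by_level = {"A": 0, "AA": 0, "AAA": 0}
for version, count in figure_5.items():
    level = version[version.index("(") + 1 : version.index(")")]
    by_level[level] += count

print(by_level)  # {'A': 2, 'AA': 17, 'AAA': 1}, matching Figure 6
```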
8% of services reported that they met at least one additional internationally recognised accessibility standard such as the European Telecommunication Standards Institute (ETSI) standard for accessibility requirements EN 301 549 V3.2.1 (2021-03) and accessibility requirements suitable for public procurement of ICT products and services in Europe EN 301 549 V1.1.2 (2015-04).
65% of services reported they offered at least one of the possible accessibility features listed in the survey. These were: compatibility with assistive technologies, text magnification, text to speech capability, voice recognition or keyboard navigation. 5 services reported that they offered no accessibility features.
51% of services report that they offer non-digital routes for users to access support. 19 services offer telephone support and 2 of these services also offer in-person support. However, access to non-digital support does not necessarily mean that the service itself can be used non-digitally. Non-digital support routes mean that individuals who have lower digital skills, limited digital access, or who may not be comfortable accessing support via a webchat are less likely to be excluded and are more likely to be able to access the help needed to use a digital identity service.
5 services offer a route for verifying an identity or attribute via delegated authority. Delegated authority is when a subject nominates a representative to do things for them. For example, someone may give lasting power of attorney to a family member or caregiver.
33% of firms offered their service in more than 5 languages and 19% in more than 10 languages. 50% of services reported only offering their service in English.
Data collection (Q20-29)
In this section, the questions aimed to understand what sort of data services collect about their users, including the proportion of services in the certified market that are collecting demographic data and whether they are using this to monitor the inclusivity of their service. Without accurate data, it can be difficult to assess whether a service is inclusive across all demographics.
Around half of surveyed services do not collect and store any data about their users. Of the half that do collect and store this data, only 4 of these services reported using this information to monitor the inclusivity of their service. The most common data collected is name, age, address and nationality. Other personal and demographic data collected by a smaller proportion of services includes ethnicity, religion, marital status, gender and sex.
73% of services collect data on the drop-out rates from their service. Of this group, 11 services collected information on the reasons for drop-outs but the inclusion monitoring survey did not record how this data is collected. The most common reasons for drop out according to the survey responses include: the process takes too long, change of mind, lack of understanding of the process and lack of trust. Other reasons include unstable internet connection, poor quality capture of an ID document or expired documentation.
78% of services collect data on users being unable to verify their identity or an attribute. The most common reason is lack of necessary identity evidence, followed by (in descending order) submission of inaccurate information, technology failure and suspicion of fraud.
Biometrics (Q30-35)
Biometric technology is commonly used as part of identity verification or authentication to conduct liveness checks and likeness checks. A liveness check – usually a video of the user taken by themselves – is used to check that the person is a real person. A likeness check compares the person trying to verify their identity with the photo shown on the identity document they are presenting to make sure they are the same.
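As a deliberately simplified illustration of the likeness-check idea, the sketch below compares a selfie against a document photo by measuring the similarity of two face embeddings against a threshold. The embedding values, the similarity measure and the threshold are hypothetical and do not represent any certified provider's implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def likeness_check(selfie_embedding, document_embedding, threshold=0.8):
    """Return True if the selfie and the document photo are judged to match.

    In practice the embeddings would come from a tested face-recognition
    model and the threshold from performance testing; both are placeholders.
    """
    return cosine_similarity(selfie_embedding, document_embedding) >= threshold

# Illustrative low-dimensional embeddings (real embeddings are much larger).
print(likeness_check([0.1, 0.9, 0.3], [0.12, 0.88, 0.29]))  # True
```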
While biometric technology can be very accurate, there is also a risk that it can be biased towards certain demographic groups if it is not robustly tested. For example, some types of facial recognition technology may not identify people of certain ethnicities or genders as well as others.
92% of surveyed services use facial recognition technology with a small number using voice and fingerprint biometrics. 49% of the services offering biometric technology offer an alternative route if users do not wish to use biometrics.
Various types of testing are conducted to assess bias in the biometric technologies used. This includes operational testing (testing in conditions similar to the target operating environment), scenario testing (testing in a simulated operational setting closely resembling the target operational conditions) and technology testing (testing of biometric algorithms using a defined data set). 5% of services reported they are not conducting any performance testing for bias.
Figure 7: Types of performance testing (Base: 37)
Types of performance testing
Type of performance testing | Number of respondents |
---|---|
Operational testing (testing in conditions similar to the target operating environment) | 21 |
Scenario testing (testing in a simulated operational setting closely resembling the target operational conditions) | 18 |
Technology testing (testing of biometric algorithms using a defined data set) | 22 |
Other (Please specify) | 9 |
No performance testing for bias was conducted | 2 |
I don’t know if performance testing was undertaken to assess the biometric technology for bias | 3 |
Figure 8 (Base: 37)
Of the testing that was conducted, the most common method was internal testing that is delivered by a team within the service provider, using a recognised testing methodology that meets international standards. The second most common method was external independent testing delivered by an ISO/IEC 17025 accredited biometric test laboratory. 27% of services are conducting bespoke internal testing that does not follow a recognised testing methodology.
Around 30% of services have information on the accuracy rates of their biometric technologies for different demographic groups.
Future improvements to inclusion (Q36-38)
This section was designed to understand certified services’ priorities and recommendations for improving inclusion and the key challenges from their perspective. Unlike the other questions in the survey, these questions were optional.
Cost and a lack of access to government-held data were the most common responses for challenges to improving inclusion. Other frequently mentioned barriers included security, lack of data about inclusion, regulatory restrictions and a request for more government guidance.
The survey also asked what services were intending to do in the next year to improve the inclusion of their service. Answers included offering their service in more languages, meeting WCAG guidelines, conducting user surveys to gather feedback and implementing diversity training for internal teams.
Conclusion
The certified digital identity service market remains relatively nascent, but it is being integrated and used increasingly widely across the economy and has a good platform for growth with forthcoming legislation. As this is the first year of this data collection, the results represent an important baseline. Mapping these data points over time will allow us to draw more concrete conclusions about the inclusivity of the certified digital identity and attributes market and track progress as the market develops.
However, it is already clear that some certified services are doing more than others to make their offerings accessible and inclusive for a variety of users. Subject to any regulatory or legislative constraints for specific use cases, we remain committed to supporting a genuinely inclusive digital identity market, and encourage certified services to continue this work. The results of the inclusion monitoring reports will inform our ongoing work on inclusion policy and the further development of the trust framework. We will also be using the findings from this report to consider how we can improve the survey questions for next year.
Annex A: survey limitations
Survey response
It is not possible to be sure who specifically from each organisation completed the survey. The respondent may not have been the most appropriate person, or may not have been able to gather accurate information. In at least one case a duplicate response was submitted, which suggests more than one representative of the certified service answered the survey.
Only live services responded in full
The survey was designed so that only data on organisations with a live service was collected. The survey did not collect any detailed data on services that are not yet live.
Question design
Following analysis of results, there were some cases where questions produced unexpected results. These have been highlighted below:
Use cases
- Responses to this question were in open text format, which made it difficult to code responses and identify where they pointed to the same use cases
- Some firms responded that they are already providing services across 20 or more use cases. This seems unlikely based on DSIT’s knowledge of the current UK digital identity market, but the question format did not allow for further exploration
- For the next iteration, this question should be changed to a list of options with an option to select ‘other’
- Additionally, a more detailed definition of a ‘use case’ should be provided
Operating systems
- This question asked respondents to select the ‘oldest operating system available’. Many respondents selected all compatible operating systems, rather than selecting the single oldest operating system that can be used
- For the next iteration, this question could be redesigned so that it is clear what is being asked, and to avoid the possibility of respondents selecting too many options.
Languages offered
- Respondents were asked to select languages offered out of the top 10 spoken languages in the UK
- They were then asked to list all languages offered beyond the 10 most spoken
- Analysis looked at the proportion of firms offering a certain number of languages, but it may have been more efficient to ask respondents to provide a total number of languages offered
- This question could be changed to ask for the number of languages offered
- Additionally, asking if services offer a translate tool would potentially be beneficial
Demographic information
- Respondents were asked if they collect the demographic information of users
- They were then asked if they use this information to monitor the inclusivity of the service
- This question could be amended to prompt the respondent to give a reason if they answer ‘no’
Data collection
A number of certified services missed the original deadline for completion of the inclusion monitoring report, and had to be offered extensions. For some, this was because they were planning to withdraw from the market, but no clear reason was provided for others.
In future, we will work with conformity assessment bodies to ensure that certified services recognise that their certification against the trust framework requires that inclusion monitoring reports are submitted within the requested time frame.
Next phase
We did not run any regression analyses to assess relationships between different variables. This may be useful for some questions in future, for example exploring whether there is a relationship between firm size and the number of use cases offered.
It would be possible to analyse future iterations in different data analysis software, such as R. This would enable more advanced analysis to be conducted on the results and offer the potential for results to be presented in dashboard form. We did not conduct this type of analysis at this stage because this was the pilot version of the survey.
Due to the small sample size, regression analysis may have provided little additional insight, as we would not be able to establish any findings as statistically significant. We will reconsider additional analysis subject to any future iterations of the question design and changes in the sample size.
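If regression analysis is attempted in a future iteration, a minimal sketch along the following lines could be used to explore the relationship between firm size and the number of use cases offered. The data frame, column names and coding of the variables are hypothetical and not taken from the survey.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical coded responses: firm size band and number of use cases offered.
df = pd.DataFrame({
    "firm_size": ["micro", "small", "medium", "large", "small", "large",
                  "medium", "micro", "large", "medium"],
    "use_cases": [1, 2, 3, 5, 1, 6, 4, 1, 7, 3],
})

# Ordinary least squares with firm size as a categorical predictor.
# With a sample this small, estimates are unlikely to be statistically
# significant, which is the caveat noted above.
model = smf.ols("use_cases ~ C(firm_size)", data=df).fit()
print(model.summary())
```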
Annex B: DSIT inclusion monitoring report questions
1. Which organisation are you reporting on behalf of?
2. Is your product or service currently active in the UK market?
- Yes
- No
Section 1: general
3. What role is your service certified as? Please select all that apply.
- Identity service
- Attribute service
- Orchestration service
- None of the above
4. What type of transactions does your business primarily focus on? Please select all that apply.
- Business to business
- Business to consumer
- Other - please specify
5. Please specify the size of your organisation
- Micro: Fewer than 10 employees and turnover of less than or equal to £2 million or balance sheet total of less than or equal to £2 million
- Small: 10-49 employees and turnover of less than or equal to £10 million or balance sheet total of less than or equal to £43 million
- Medium: 50-249 employees and turnover of less than or equal to £50 million or balance sheet total of less than or equal to £43 million
- Large: 250+ employees and turnover of over £50 million or balance sheet total of over £43 million
6. How many distinct use cases does your service currently operate in? Please only include use cases your service currently operates in, even if you are likely to expand into other use cases in the future.
- Not Active
- 1-5
- 6-9
- 10-14
- 15-19
- 20 or more
- Don’t know
7. Please list all of these use cases.
Section 2: documentation
8. Across all the use cases your service provides, which GPG 45 profiles do you meet? Please select the options that apply to current use cases.
- L1A
- L1B
- L1C
- L2A
- L2B
- L3A
- M1A
- M1B
- M1C
- M1D
- M2A
- M2B
- M2C
- M3A
- H1A
- H1B
- H1C
- H2A
- H2B
- H2C
- H2D
- H2E
- H3A
- V1A
- V1B
- V1C
- V1D
- V2A
- V2B
- V2C
- V2D
- V3
9. What forms of evidence can someone use to prove their identity or verify an attribute through your service? Please select all that apply for current use cases.
- UK Passport
- UK Passport with NFC
- Non-UK Passport
- UK Driving Licence
- EEA Driving Licence
- EEA Identity card with NFC
- National ID card with photograph
- Residence permit
- PASS card
- Telco data
- Council tax letter
- Electoral register
- Social media
- Credit data
- Utility bill
- Bank statement
- Birth or adoption certificate
- Marriage certificate
- Bank account details
- Knowledge Based Verification (KBV) questions
- Evidence of power of attorney (mark of Public Guardian)
- Other - please specify
- Not applicable
10. Does your service support vouching as a means of identity or attribute verification for any current use cases?
Vouching is a type of evidence that can be used to verify a claimed identity. A person vouches for someone else by declaring they know them as the claimed identity.
- Yes
- No
- Don’t know
- Not applicable
Section 3: technology
11. What technology is required for a user to access your service to prove their identity or verify an attribute? Please select all that apply.
- Smartphone
- Smartphone with NFC reader
- Smartphone with camera
- Desktop computer or laptop
- Tablet
- Other mobile device – please specify
- Other - please specify
- Not applicable
12. If your service is available as an app, what is the oldest operating system for Apple and/or Android products that your service is compatible with?
- Not applicable - the service is not available via an app
- iOS 17
- iOS 16
- iOS 15
- iOS 14
- iOS 13
- iOS 12
- iOS 11
- Android 15
- Android 14
- Android 13
- Android 12
- Android 11
- Other - please specify
- Don’t know
13. If your service is available via a browser, what browsers does your service support? Please select all that apply.
- Not applicable – the service is not available via a browser
- Google Chrome
- Firefox
- Microsoft Edge
- Safari
- Opera
- Maxthon
- Brave
- Vivaldi
- Duck Duck Go
- Other – please specify
- Don’t know
Section 4: accessibility
14. Does your service adhere to Web Content Accessibility Guidelines (WCAG)?
- 2.0 (A)
- 2.0 (AA)
- 2.0 (AAA)
- 2.1 (A)
- 2.1 (AA)
- 2.1 (AAA)
- 2.2 (A)
- 2.2 (AA)
- 2.2 (AAA)
- No
- Don’t know
- Not applicable
15. Does your service adhere to any other accessibility guidelines or best practice? Please select all that apply.
- The European Telecommunication Standards Institute (ETSI) standard for accessibility requirements EN 301 549 V3.2.1 (2021-03)
- Accessibility requirements suitable for public procurement of ICT products and services in Europe EN 301 549 V1.1.2 (2015-04)
- Other – please specify
- None
- Don’t know
- Not applicable
16. What, if any, accessibility tools does your service offer? Please select all that apply.
- Text to speech capability
- Voice recognition
- Text magnification
- Keyboard navigation
- Compatibility with assistive technologies
- None
- Other - please specify
- Don’t know
- Not applicable
17. What support does your service offer for users verifying an attribute or proving identity? Please select all that apply.
- Telephone support
- In person support
- Live chat support
- Email support
- None
- Don’t know
- Other - please specify
- Not applicable
18. Does your service offer a route for verifying an attribute or proving identity via delegated authority?
Delegated authority is where a nominated individual is given permission to act on behalf of another. For example, when an individual has power of attorney for someone in their care.
- Yes
- No
- Don’t know
- Not applicable
19. Please specify which language/s your service offers. The 10 most commonly spoken languages in the UK after English and Welsh are listed below (2021 Census data). Please select all that apply.
- English
- Welsh
- Polish
- Romanian
- Panjabi
- Urdu
- Portuguese
- Spanish
- Arabic
- Bengali
- Gujarati
- Italian
- Other - please specify
- Don’t know
Section 5: demographic data
20. Does your service collect any demographic information at any point during identity or attribute verification?
- Yes
- No
- Don’t know
21. Does your service store any of this demographic data?
- Yes
- No
- Don’t know
22. What demographic information does your service collect? Please select all that apply.
- Name
- Age or Date of Birth
- Sex
- Gender
- Marital Status
- Nationality (e.g. via country of issue)
- Address
- Ethnicity
- Religion
- Socio-economic group
- Other - please specify
- Don’t know
23. Is any of this demographic information used for monitoring the inclusivity of your service?
- Yes
- No
- Don’t know
24. Does your service collect data on ‘drop-out’ or incomplete user journeys?
- Yes
- No
- Don’t know
- Not applicable
25. Does your service collect data on the reasons for ‘drop-out’ or incomplete user journeys?
- Yes
- No
- Don’t know
- Not applicable
26. For single-use identity or attribute checks, what are the most common reasons for incomplete user journeys? Please select all that apply.
- Not applicable
- Session ‘times out’ before completed
- Lack of identity evidence
- Lack of understanding about how to complete
- Change of mind
- Lack of trust
- Process takes too long
- Other - please specify
27. For reusable identity or attribute checks, what are the most common reasons for incomplete user journeys? Please select all that apply.
- Not applicable
- Session ‘times out’ before completed
- Lack of identity evidence
- Lack of understanding about how to complete
- Change of mind
- Lack of trust
- Process takes too long
- Other - please specify
28. Does your service collect any data on the reasons for a user being unsuccessful in verifying their identity or attribute?
- Yes
- No
- Don’t know
- Not applicable
29. What are the most common reasons for users being unsuccessful in verifying their identity or attribute? Please select all that apply.
- Lack of necessary identity evidence
- Technology failure
- Inaccurate information
- Suspicion of fraud
- Other – please specify
- Don’t know
Section 6: biometric technology
30. What biometric technology does your service use, if any? Please select all that apply.
- Facial recognition
- Fingerprint
- Retina
- Voice
- None
- Don’t know
- Other - please specify
31. Is there an alternative route for users to verify an attribute or prove identity if they do not wish to provide biometric information?
- Yes
- No
- Don’t know
32. What performance testing has been completed to assess the biometric technology you use for bias? Please select all that apply.
- Operational testing (testing in conditions similar to the target operating environment)
- Scenario testing (testing in a simulated operational setting closely resembling the target operational conditions)
- Technology testing (testing of biometric algorithms using a defined data set)
- Other (Please specify)
- No performance testing for bias was conducted
- I don’t know if performance testing was undertaken to assess the biometric technology for bias
33. How was this testing conducted? Please select all that apply.
- External independent testing delivered by an ISO/IEC 17025 accredited biometric test laboratory
- Internal testing that is delivered by a team within the service provider, which uses a recognised testing methodology that meets international standards
- Internal testing that is bespoke to the service provider that does not follow a recognised testing methodology
- Other - please specify
- Don’t know
34. Do you have information on the accuracy rates of the biometric technology your service uses for different demographic groups?
- Yes
- No
- Don’t know
35. Please provide the most recent accuracy rate for each demographic.
Section 7: future improvements (optional)
36. What are the challenges to improving the inclusion and accessibility of your service? Please select all that apply.
- Cost
- Security
- Lack of data
- More guidance needed
- Lack of access to government data
- Regulatory restrictions
- Other - please specify
37. If you would like to, please add any information about steps you are planning to take to improve the inclusivity of your service in the next year.
38. Are there any other comments you would like to make about inclusion or accessibility? This can be about your service or about the market more generally.
1. Business size is defined as follows:
   - micro: fewer than 10 employees and turnover of less than or equal to £2 million or balance sheet total of less than or equal to £2 million
   - small: 10-49 employees and turnover of less than or equal to £10 million or balance sheet total of less than or equal to £43 million
   - medium: 50-249 employees and turnover of less than or equal to £50 million or balance sheet total of less than or equal to £43 million
   - large: 250+ employees and turnover of over £50 million or balance sheet total of over £43 million ↩
2. The Web Content Accessibility Guidelines (WCAG) are an internationally recognised set of recommendations for improving web accessibility. They explain how to make digital services, websites and apps accessible to everyone. ↩
3. Public bodies are required by The Public Sector Bodies (Websites and Mobile Applications) Accessibility Regulations 2018 to meet WCAG 2.2 (AA) standard. ↩