Authentication and Digital Identity
The report for GDS’s Authentication and digital identity alpha assessment on 7 July 2021
Service Standard assessment report
Authentication and digital identity
| From: | Central Digital & Data Office (CDDO) |
| Assessment date: | 07/07/2021 |
| Stage: | Alpha |
| Result: | Met |
| Service provider: | Government Digital Service |
Service description
In government, there are an estimated 121 different single sign-on solutions. This is because service teams build their own authentication solutions rather than using a shared Government as a Platform (GaaP) product. People who use multiple government services often have multiple sets of sign-in details.
This means it can be difficult for people to:
- understand they have more than one account
- know which account to use for which service
- remember the sign-in details for each account

This in turn can make it harder for people to access government services that require signing in. There’s an opportunity to reduce this confusion and improve access to services that require signing in by moving to a single set of sign-in details that allows access to all services.
Service users
This service is for any member of the public needing to ‘sign on’ as part of their GOV.UK journey. For example, users that are accessing their Personal Tax Account or updating personal details which need to be kept secure.
This service is also for Government delivery teams to use. Teams will integrate with the single sign-on solution to enable users to securely log in to their service.
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has conducted an impressive amount of research using a varied range of methods. With over 300 end-users engaged in research so far, the team have been able to understand the common needs and behaviours of their user groups and have articulated these in a set of ‘mindsets’ that clearly explain each user type
- research has been conducted with both end users and internal Government users to understand the problem from a range of perspectives. Equal weighting has been given to each user group, and the team are planning research strategically between two researchers to enable appropriate coverage of all of their user groups
- access needs and digital inclusion have been considered from the start
- users with cognitive access needs have been engaged in research during Alpha
- each user has been mapped onto the digital inclusion scale, with some users falling into the ‘never have and never will’ to ‘willing and unable’ categories
- the team have already begun to consider access arrangements and assisted digital support models, for example not always requiring a user to have access to a mobile phone/signal to perform 2-factor authentication
What the team needs to explore
Before their next assessment, the team needs to:
- further refine their user needs. Although the team has a set of user needs for multiple user types, these needs were often articulated as solution-specific and didn’t meet the test of a good user need, in particular: “focus on the user’s problem rather than possible solutions (for example, needing a reminder rather than needing an email or letter)”. Conversationally, the team discusses users and their needs from a more behavioural standpoint, for example, “users need to be seen and heard”, which shows that they have a very clear understanding of their users’ behaviours and pain points. In a follow-up workshop, the panel will work with the service team to refine their user needs and translate those pain points into needs that describe the problem to be solved rather than the proposed solution
- although the team have conducted research with users with cognitive access needs, they recognised that they have not yet been able to conduct research with users of assistive technology due to prototype constraints. Whilst this is acceptable in the Alpha phase, the team should include users of assistive technology in their Beta research to ensure that they’re building a truly inclusive service
- the team indicated that they intend to conduct more surveys in their Beta phase to further understand user behaviours. Whilst this is okay, the team should look to utilise a range of methods to understand user behaviour and should generate qualitative insights to support their user needs and promote a deeper understanding of their users
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team are working closely with other parts of the wider team, Accounts and Verification, to solve this problem
- the team have had promising conversations with service teams in a range of different departments
What the team needs to explore
Before their next assessment, the team needs to:
- continue to engage with service teams across multiple departments, utilising contacts from their programme board and from GDS communities of practice to understand a range of perspectives around take-up of this service
- the team runs the risk that they could build an excellent, user-centred service that individual departments, directorates or service teams are unwilling to take up which, in turn, would impact the team’s ability to solve a whole problem for users. Early, regular engagement with decision makers from a range of departments is key to mitigating this risk
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the related teams are meeting regularly to ensure consistent interactions and language
- the teams are working closely with service teams across government to understand the end-to-end journey within those services
What the team needs to explore
Before their next assessment, the team needs to:
- continue to engage with related teams and the service teams that will be integrating the service
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- Government Gateway, NHS Login and GOV.UK Account are being used as examples and the teams working on these account services are being engaged
- blueprints have been created for both end users and service teams. For end users they cover their journey from arriving at GOV.UK to leaving a service (including supporting processes and responsibilities)
- account confusion has been identified as a primary pain point. The team experimented with a wide range of potential designs to mitigate account confusion
- the team have moved away from navigating to this service via a button in the GOV.UK header, to initially just introducing this account inside services (to mitigate high user expectations, at least until the service is widely integrated and meeting all core needs)
What the team needs to explore
Before their next assessment, the team needs to:
- develop a more cohesive picture of the wider end-to-end service in collaboration with their related teams, for example through shared service mapping
- progress naming the service (the team already has naming workshops and scoring sessions planned with relevant stakeholders)
- remove deviations from the design system where there isn’t a compelling argument for variance. For example, non-standard colours for buttons and confirmation panels
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team are analysing GOV.UK Verify and GOV.UK Account queries to predict the support requirements for this service
- they are planning support lines, responsibilities and response times
- phone message authentication can be via compatible home phone lines (an alternative to mobile SMS)
- additional alternative authentication methods are being considered, for example authenticator apps (see the sketch after this list)
- assistive tech testing is planned
- a rota is planned for a service team Slack support channel and support email inbox
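
As context for the authenticator app option mentioned above, the sketch below shows how a time-based one-time password (TOTP, RFC 6238) can be generated and verified, which is the mechanism most authenticator apps use. This is illustrative only and not the team’s implementation; the secret handling and drift window shown are assumptions.

```python
# Minimal, illustrative TOTP (RFC 6238) generation and verification.
# Not the team's implementation; secret handling and the drift window
# are assumptions for the sake of the example.
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_base32, timestamp=None, digits=6, step=30):
    """Generate the code an authenticator app would show for this time step."""
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int((timestamp if timestamp is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


def verify(secret_base32, submitted_code, window=1, step=30):
    """Accept the current code, plus one time step either side to allow for
    clock drift between the user's device and the server."""
    now = int(time.time())
    return any(
        hmac.compare_digest(totp(secret_base32, now + offset * step), submitted_code)
        for offset in range(-window, window + 1)
    )
```

Because the app and the service derive the same code from a shared secret exchanged at registration (usually via a QR code), no SMS, mobile signal or home phone line is needed at sign-in.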
What the team needs to explore
Before their next assessment, the team needs to:
- investigate ways to include users:
  - with non-UK phone numbers
  - who can’t receive SMS on UK mobiles / home phones
- consider bringing forward assisted digital support. No assisted digital support has been planned in the first year. It seems particularly necessary if the digital service cannot support expats or users who cannot receive phone messages
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team have all the expected roles in place and have clear plans for how to scale the team for Private Beta and beyond
- the team have thought about the structure of their team in relation to the teams working on the other parts of this service
- there is a good balance of civil servants and contractors and plans in place to ensure no loss of knowledge or confusion when people leave and join the team
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team are using agile ways of working with a range of ceremonies and techniques that they regularly review
- the structure of their 6-week cycles across the wider programme, with 3 x 2-week sprints within each cycle, seems to work well and ensures wider communication and coordination across the programme
- the team have access to all the appropriate tools
- the team showed a good emphasis on sharing their work, including holding cross-programme show and tells at the end of each 6-week cycle
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to demonstrate how they had iterated and improved the service based on user feedback
- the team are empowered to make changes to their service
What the team needs to explore
Before their next assessment, the team needs to:
- continue to iterate and improve their service using their findings from user research and their findings from their KPIs
- ensure they continue to learn from research in the wider programme and how users will experience this journey within the wider service journey
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team had been working so closely with NCSC and spent considerable effort looking into the strategies and options for storing PII data
- ensuring only the minimal data required is stored
- AWS is proposed for the beta; the shared responsibility model is acknowledged and data security and integrity are planned for
What the team needs to explore
Before their next assessment, the team needs to:
- construct a robust plan for user migration from GOV.UK accounts
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team clearly recognise the importance of measuring the success of their service
- the team have a well thought through plan for measuring success in Private Beta and beyond
- the team have worked hard to identify KPIs for different parts of their service and have planned to review these throughout private beta.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure they use the data gathered during their private beta to improve their service
- continue to think about ways to measure their service and iterate their approach
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has chosen to integrate with and build on the Nimbus code base
- the team are looking to adopt serverless technology to help manage the scale needed in the next phase (a minimal sketch of the pattern follows this list)
- for the POC they used GOV.UK PaaS, an appropriate decision to accelerate the delivery and get findings early while scale was not an issue
- the team have explored a number of options
- the team are contributing to the GDS Way documentation, updating it with their experience of serverless technology
- the team built dummy services to test the integration of the service with third party government services
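
As an illustration of the serverless pattern mentioned above, the sketch below shows a minimal AWS Lambda-style handler behind an HTTP API. The route, field names and logic are assumptions for illustration only, not the team’s code; the relevant property is that the platform creates and scales function instances on demand rather than relying on pre-provisioned servers.

```python
# Minimal, illustrative serverless handler (AWS Lambda behind an API Gateway
# proxy integration). The request fields and logic are assumptions, not the
# team's implementation.
import json


def handler(event, context):
    """Receive an HTTP request as an `event` dict and return a response dict."""
    body = json.loads(event.get("body") or "{}")
    email = body.get("email")  # illustrative request field

    if not email:
        return {
            "statusCode": 400,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "email is required"}),
        }

    # A real sign-in or sign-up step would run here.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "request received"}),
    }
```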
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the OpenID Connect service and example relying parties are open source: https://github.com/alphagov/di-auth-oidc-provider, https://github.com/alphagov/di-authentication-api and https://github.com/alphagov/di-auth-stub-relying-party
- the team uses GitHub Actions and Concourse CI to ensure code is well integrated
What the team needs to explore
Before their next assessment, the team needs to:
- ensure there are good code practices around review and active conversation in PRs
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the OpenID Connect standard has been fully embraced, and the approach of following known, modern authentication practices will ensure ease of adoption, a key factor in the success of the service (a minimal sketch of the flow follows this list)
- components of the application use existing open source libraries
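
To illustrate the standard the panel refers to, the sketch below shows the OpenID Connect authorization code flow from a relying party’s point of view. The issuer URL, client ID, client secret and redirect URI are illustrative placeholders rather than the service’s real configuration, and the real provider may require a different client authentication method.

```python
# Minimal, illustrative OpenID Connect authorization code flow from a relying
# party's point of view. Issuer, client credentials and redirect URI below are
# placeholders, not the real service's configuration.
import secrets
import urllib.parse

import requests  # assumed available; any HTTP client would do

ISSUER = "https://oidc.example.gov.uk"                     # placeholder issuer
CLIENT_ID = "example-service"                              # placeholder client ID
REDIRECT_URI = "https://service.example.gov.uk/callback"   # placeholder

# Step 1: redirect the user to the provider's authorization endpoint.
state = secrets.token_urlsafe(16)   # protects against CSRF
nonce = secrets.token_urlsafe(16)   # binds the ID token to this request
auth_url = f"{ISSUER}/authorize?" + urllib.parse.urlencode({
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "openid email",
    "state": state,
    "nonce": nonce,
})

# Step 2: after the user signs in, the provider redirects back with a
# one-time code, which the relying party exchanges for tokens.
def exchange_code(code, client_secret):
    response = requests.post(f"{ISSUER}/token", data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": client_secret,
    })
    response.raise_for_status()
    return response.json()  # contains the id_token and access_token
```

The stub relying party linked under point 12 appears to play this role for testing, acting as a minimal client against the provider.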
What the team needs to explore
Before their next assessment, the team needs to:
- create good onboarding material and standardised approaches to allow departments to adopt the service quickly
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- as this was an alpha assessment, the team provided enough information about their plans for the next phase to give confidence that a reliable service would be built