View My DBS Result alpha assessment
Service Standard assessment report
View My DBS Result
From: Central Digital & Data Office (CDDO)
Assessment date: 04/10/2022
Stage: Alpha
Result: Met
Service provider: Disclosure and Barring Service
Service description
DBS will replace and enhance the existing services that customers use to view the results of their DBS checks, including the paper certificate and a legacy online portal.
A key need of our service is to provide a tool that allows our customers (applicants) to view and understand the results of their DBS checks, and to quickly and easily share their result with any third party that needs to see it, usually an employer making safe recruitment decisions.
Service users
Primary users:
- Job and volunteer applicants who require a level of DBS check
- Employers who are recruiting roles that require a DBS check
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- Through a variety of research methods, the team identified 4 user groups (across primary and secondary users) that the proposed service will need to cater to
- The team provided a strong understanding of the pain points in the existing ways in which DBS results are viewed and shared. The user needs have been clearly articulated and the team have demonstrated a thorough understanding of the people being designed for (and with). The team also explained that 96% of applicants are users who have clear results
- The team are working closely with adjacent services (such as the Request a Standard or Enhanced DBS check service) as part of the wider service design, and research is being observed and shared across the programme of work
- The team clearly demonstrated how the research has influenced design choices in alpha. For example, the team learned that a proportion of users who have previous convictions feel a level of anxiety when receiving their DBS results. Because of this, information in the service must be presented in a way that does not add to this anxiety, while still drawing the user's attention to what the result means
- A strong research plan is in place for beta. The service is dependent on the Request a Standard or Enhanced DBS check service to recruit users for private beta, however, the team explained they have back-up plans in place if that service is delayed in any way (e.g. users from the existing Basic online service and existing paper routes)
What the team needs to explore
Before the next assessment, the team needs to:
- Use the private beta window to gather real usage data that complements existing research methods and helps triangulate research findings. This will build further confidence in the team's understanding of their users as they continue to iterate designs on solid evidence
- Undertake further research with secondary user groups. The team explained that with more time in alpha they would have engaged further with regulatory bodies (such as CQC and Ofsted). This research should be expanded to include contact centre advisors and, once in private beta, further engagement around common enquiries to help address any service pain points
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- The team have been communicating well with other government organisations and have held workshops with them.
- The team are working closely with other adjacent services (such as the Request a Standard or Enhanced DBS check service and the Basic DBS check service) to provide a central place where applicants can view and share results with employers, and have contingency plans in case one of those services is not ready in time.
- The team have collaborated with adjacent teams, including contact centre management, through show and tells, and have worked with the business readiness team.
- They have designed the service around how the user thinks about their data and how they want to share it, balancing the needs of users who want to control their data with those who want to share it widely to avoid being seen as a blocker.
What the team needs to explore
Before the next assessment, the team needs to:
- Undertake further research with regulatory bodies (such as CQC, Ofsted etc.) and work with them to explore how this service can fit into or influence existing working practices around the need to share applicant DBS results
- Continue testing the sharecode and upfront consent model and apply the findings from that testing
- Keep working to join up the adjacent journeys and DBS services for the user so that it feels as seamless as possible.
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- The team considered search terms for the service and where it sits in online content for the start of the user journey.
- The team has not only tested multiple prototypes but also multiple ways of solving the users’ problem. They have structured their research and design around hypotheses, success criteria and measurement.
- The team have considered the online and offline journeys, the communication plan and the strategy around digital uptake.
What the team needs to explore
Before the next assessment, the team needs to:
- Document the support model, particularly how it would work across One Login for Government and DBS, and who is responsible for supporting users at each step
- Explore the sharecode journey and upfront consent model further.
- Work closely with the contact centre so that users can “stay online” instead of leaving the platform to seek support.
- Ensure that they test the multichannel and omnichannel journeys.
- Document and develop the to-be journey, including the unhappy paths, and the roadmap.
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- The team is effectively using the GOV.UK Design System and, where appropriate, is looking to expand on its patterns, in particular navigation.
- They have tested multiple versions of the service with the users and refined based on feedback. This includes related parts of the journey such as emails that form part of the expectation management.
- They have tested the system with non-native English speakers to ensure simplicity of wording and have updated the text, including guidance text, accordingly.
What the team needs to explore
Before the next assessment, the team needs to:
- Ensure that if they are creating new patterns, they share them back to the community.
- Document the support model so it is clear in the journey, particularly if they move to One Login, who will be supporting users at each step (DBS or GDS).
- Work with members of the support team to be able to provide guidance on the digital parts of the journey and associated troubleshooting.
- Review the timing expectations for how long the service will take users, so that the start page is correct.
- Ensure that the service is simple to use not just for individuals but also for users at large entities (e.g. employees in HR departments, who will be the recipients of the sharecodes and may be receiving and tracking hundreds of them). There likely needs to be some mechanism for an audit trail.
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- The team managed to undertake a reasonable amount of accessibility testing in alpha, focusing on users with cognitive, motor and visual impairments and working primarily with a third-party agency for most of the recruitment activity.
- They have also explored assisted digital factors and have more planned for beta.
- The team have considered that accessibility of the service, in this case, can include users with lived experience of criminal records, for whom this process can be stressful. They have not only tested with these users but have designed the results page sensitively, so as not to cause them additional stress.
What the team needs to explore
Before the next assessment, the team needs to:
- Explore the reactions of users with different types of colour blindness to the summary screen. Although it is clear that meaning is not conveyed only through colour, the use of blue and green may affect some users. Online tools such as the one provided by Toptal (https://www.toptal.com/designers/colorfilter) may be a useful starting point.
- Test that consistency between the DBS service and the authentication service (One Login for Government – OLfG) does not affect accessibility. WCAG success criteria 2.1.1, 2.4.2, 3.2.3 and 3.2.4 provide further information on the importance of consistency.
- WCAG 2.2 (still in draft) calls for ‘consistent help’ (see https://www.w3.org/WAI/standards-guidelines/wcag/new-in-22/#326-consistent-help-a). Given that the service will combine OLfG and the DBS service, it is worth planning ahead to understand the impact of this point.
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- The team have continuity of people across several services, particularly the designer and the content designer, which is enabling them to ensure consistency of user experience across services.
- The team are operating efficiently, observing all agile ceremonies and including a cross-section of business stakeholders in reviews and updates
- The team understand the importance of working coherently with external partners and ensuring knowledge is shared effectively
- Other members of the team have been able to join in user research.
What the team needs to explore
Before the next assessment, the team needs to:
- Consider whether the current model of a multi-service designer will be too burdensome for a single person in beta, and look for other ways of providing knowledge sharing and continuity.
- Ensure there is a clear plan for who remains in the team moving forward and who is leaving, and that any gaps in skills are filled.
- Continue to strengthen methods of transferring knowledge between external partners and internal support teams.
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- All ceremonies are observed, and retrospectives are undertaken for the team to reflect and continually improve
- Engagement across the business is high and good levels of communication and information transfer are maintained.
- The importance of knowledge transfer between parties is understood, and ways of working are clear across the service.
What the team needs to explore
Before the next assessment, the team needs to:
- Continue to work in sprint cycles with sessions to review goals and ensure issues are escalated appropriately
- Continue to ensure the right level of governance is maintained through the Project Boards, to allow the team to be empowered to make independent decisions
- Ensure the product roadmap is visible, and the backlog is sufficiently populated to provide stories to the development team
- Ensure sufficient time is given to run retrospectives and apply outcomes to sprints moving forward
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- The team covered six different versions of the prototype during the alpha and demonstrated a firm understanding of what service pattern they have selected
- Outcomes of user research sessions clearly informed the development of the service.
What the team needs to explore
Before the next assessment, the team needs to:
- Continue to test and iterate, particularly with those who are not able to use the service and edge cases
- Utilise the findings from analytics and other sources to influence design changes
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- The team is working closely with security architects within the department
- Minimal citizen data is shared
- Attack vectors are well understood and documented
- The service is protected using GPG44-compliant standards via OLfG
- Appropriate measures are in place for the design and build of a secure service
- The team have completed a DPIA and shared this for review
What the team needs to explore
Before the next assessment, the team needs to:
- Revisit the need to use mixed case for the share code. This has been a recommendation from DBS security colleagues, but it is not clear to the panel what degree of risk is introduced by not using mixed case. Making the share code case-sensitive will make the service harder to use for some users, whereas a case-insensitive code would make the experience more consistent with other parts of GOV.UK, for example Prove your right to work (https://www.gov.uk/view-right-to-work). Consistency between these parts of GOV.UK should also be an ambition.
- Continue working with OLfG to resolve the cross-domain cookies question, in other words removing the need for users to accept cookies on two or more domains (i.e. GOV.UK, OLfG and DBS).
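As an illustration of the trade-off the panel is questioning, a short sketch can quantify how much keyspace a case-insensitive code gives up, and how little extra length would recover it. The code length and alphabets below are hypothetical, chosen for illustration only; the actual DBS share code format is not specified in this report.

```python
import math

def keyspace_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a uniformly random code of the given length."""
    return length * math.log2(alphabet_size)

# Hypothetical 8-character share code.
LENGTH = 8
insensitive = keyspace_bits(26 + 10, LENGTH)       # A-Z and 0-9, case-insensitive
mixed_case = keyspace_bits(26 + 26 + 10, LENGTH)   # a-z, A-Z and 0-9

print(f"case-insensitive, {LENGTH} chars: {insensitive:.1f} bits")
print(f"mixed-case, {LENGTH} chars:       {mixed_case:.1f} bits")

# Adding roughly one character to a case-insensitive code closes the gap:
print(f"case-insensitive, {LENGTH + 1} chars: {keyspace_bits(36, LENGTH + 1):.1f} bits")
```

Under these assumptions, dropping mixed case costs around 6 bits of entropy at a fixed length, which a single extra character more than recovers, one way of framing the usability versus risk question the panel raises.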
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- The team have developed comprehensive performance metrics
- Data is updated regularly and supports decision making
- Common technologies (Google Analytics) are being implemented.
What the team needs to explore
Before the next assessment, the team needs to:
- Continue to develop the GA4 changes to maximise the additional benefits and insight.
- Continue to ensure service data is fit for purpose, and meets business needs
- Ensure any outcomes (reports, data feeds etc.) are well documented so they can be maintained moving forward
- Ensure metrics are provided to capture users who start using the service but cannot complete it
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- The service has made appropriate technology choices and has a plan for implementing these in private beta
- The service will take advantage of cloud hosting
- Technology and architectural choices are subject to clear internal governance and have the necessary approvals
What the team needs to explore
Before the next assessment, the team needs to:
- Carry out an environmental impact assessment of the service. This should include the environmental impact of the existing paper-based service. Is there an opportunity to reduce carbon emissions, for example by offering citizens a paperless version? Further guidance is available in the Technology Code of Practice - https://www.gov.uk/guidance/make-your-technology-sustainable
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- The prototype is available on GitHub
- The team has a clear plan for making their source code open and is working with DBS to establish a clear policy, as open source is a new area for DBS
What the team needs to explore
Before the next assessment, the team needs to:
- Move the prototype code from its current location on a personal GitHub account to a departmental one
- Publish their beta code to the DBS GitHub account, following best practice (see https://www.gov.uk/guidance/government-technology-standards-and-guidance#open-source)
- Work with other teams in DBS to open source any shared components
- The slides presented to the panel indicated the use of GitLab for source control. For the next assessment, it would be useful to explore the source code tool of choice, specifically considering coding in the open. If the plan is to keep code on GitLab and clone it to GitHub, the team may wish to consider whether GitLab CI could replace Jenkins.
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- The team has worked closely with the GDS OLfG service and have influenced the design and build of OLfG
- GOV.UK Notify is used for email communication with citizens
What the team needs to explore
Before the next assessment, the team needs to:
- Review the copy component (see https://github.com/alphagov/govuk-design-system-backlog/issues/249) and test this pattern with users.
- Work with other departments where the ‘share code’ approach has been implemented (e.g. Share driving licence details, Right to work/rent check) and where there is a common interest. Collaborate to establish a common pattern using the design system backlog. Some work has been started already (see https://github.com/alphagov/govuk-design-system-backlog/issues/234), but it would be useful to see the team work across government to suggest a common pattern.
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- There is integration between DBS and the technology supplier, with clear plans for supporting the service in private beta
What the team needs to explore
Before the next assessment, the team needs to:
- Work with GDS to monitor any potential issues users may have with OLfG. The service has been designed to have two areas of support – OLfG and the DBS service itself. During private beta the team will need to monitor support levels carefully.