Surge response service Alpha assessment
Service Standard assessment report
Surge response service
Assessment date: 03/10/2024
Stage: Alpha
Type: Assessment
Result: Amber
Service provider: DHSC / UKHSA
Previous assessment reports
- N/A
Service description
The Surge Response Service will provide an end-to-end set of capabilities for responding to a public health incident. The service will:
- provide digital testing services to manage any pathogen that requires processing by laboratories
- enable case and contact tracing to be conducted at scale for outbreaks, and in the surge phase of a pandemic
- provide targeted advice and guidance to the public
Service users
This service is for:
- members of the public who may be infectious or at risk
- UKHSA staff: Health Protection team
- UKHSA staff: contact centre agents
Things the service team has done well
- the service team has worked well with ambiguity. They have demonstrated senior-level engagement and close alignment with the wider, complex strategic imperative, and have evidenced business change management across the wide coalition of stakeholders involved.
- the team has a good understanding of the data they plan to capture and how it will be protected, and this has been taken through the relevant governance. The team has reviewed multiple technical options and made decisions based on practicality and maintainability, ensuring that UKHSA will be able to support the development of the service into beta. UKHSA has a good open-sourcing policy, with examples, which the team plans to follow. The team has reused common components from both UKHSA and wider government, and has explored the option of integrating with One Login for Government in the future. The team plans to build the service on a well-provisioned cloud provider, which will allow for scaling up and fast iterations.
- good user research was undertaken during the Alpha phase, drawing on lessons from Test & Trace about the naming of the service.
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- research with GPs, pharmacies, clinics and anyone else who will use the service, to understand the different entry and exit points of the user journey and uncover any additional user needs.
In private beta:
- the team needs a robust user research plan for the private beta, considering who to research with, what to research and test, and when, how and why
- the team needs to organise in-person user research to understand the overall user experience from ordering a test, to receiving a test, using a test, returning a test, and receiving a test result. The overall user journey is not just a digital experience. By conducting in-person observational research they will pick up on how things work both inside and outside of the system and service.
- the team needs to increase the number of users they are researching with and continue to ensure that they are from a diverse user group. One in five people has an access need, so they should try to replicate this in their research.
- the team should continue to research with people who have access needs. In-person research will help to understand people’s circumstances and how they will engage with the service.
2. Solve a whole problem for users
Decision
The service was rated green for point 2 of the Standard.
Optional advice to help the service team continually improve the service
- in private beta, ensure that your user group engages with end-to-end service testing on various devices and browsers, and through offline and assisted digital channels, to confirm that you’ve got the scope of the journey right.
- before the beta assessment, research, iterate, test and confirm the offline and assisted digital routes.
- continue to build relationships and work with other organisations that feed into this user journey, for example the NHS, pharmacies and clinics.
- understand and address any potential constraints, for example the delivery time to receive a test, not receiving a result, and policy or technology constraints.
3. Provide a joined-up experience across all channels
Decision
The service was rated amber for point 3 of the Standard.
During the assessment, we didn’t see evidence of:
- a fully developed set of end-to-end user scenarios covering both happy and unhappy paths, online and offline, with particular attention to understanding the offline parts of the service. This should include close attention to, and potentially in-person user testing of, the experience of receiving, using and returning test kits. The team should use these scenarios to help monitor how well the private beta is conducted.
4. Make the service simple to use
Decision
The service was rated amber for point 4 of the Standard.
During the assessment, we didn’t see evidence of:
- a version of the service that the service team is planning to take into private beta. It was unclear from the designs and prototype presented at the Alpha assessment which functionality would actually make up the version taken into private beta
- a clear plan for when the different integrations presented at the Alpha assessment will be added to the service through private beta, and what benefits each one will provide to the user and to the long-term success of the service
- the service team using the GOV.UK Design System and the GOV.UK Prototype Kit. It is strongly advised to design and prototype public-facing services using the Prototype Kit from as early a point as possible, so teams build in good practices around accessibility and consistency. Designing this way avoids having to redesign later and helps focus the service team on solving the problem for the user.
- a plan for resolving the naming of the different parts of the service so that they fit together and make sense to the user whenever the service is required to be accessible on GOV.UK.
- prioritising the development of a service start page with the GOV.UK content team to ensure the service could go live on GOV.UK at short notice.
- prioritising more user testing on mobile devices through the private beta. Consider why a user would be using the service and the different locations in which that may occur, then consider how what is learnt may change the design carried over from the Alpha phase (for example, the location of the call to action currently requires scrolling down on a mobile device)
As the private beta phase begins, the team should continue to monitor how the rounds of research, synthesis and design iteration are working, ensuring that they continue to match the objectives of the policy and what was learnt through Discovery and Alpha.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- how the team will onboard users into the private beta
- a clear picture of the demographics of the users the service intends to go into private beta with, and a clear plan for how the team will learn the most from them
- a plan for how the service team will deal with any abuse of the system (for example, users ordering test kits on behalf of others without seeking their consent; the team confirms that they are not currently tracking consent)
- enough focus on understanding the vulnerable user journey (the potential combination of struggling to use the online service and struggling to successfully conduct a test)
- a clear enough plan for how the service team will learn from the way services were delivered during Test & Trace, and how the project team can build trust with the general public around the emotive subjects of test kits, pandemics and lockdowns
- a clear enough plan for how the service team will understand, map out and develop the full assisted digital capability for the service, so that it works for users whenever they require support
In private beta, the team should ensure that the service remains easy to access throughout the service journey.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
Optional advice to help the service team continually improve the service
- continue to strive to increase the proportion of civil servants in the team.
- ensure a continued formal practice and cadence of documentation and knowledge sharing, to mitigate the risk posed by the high proportion of contractors.
- maintain access to specialist experts in the wider health security ecosystem.
7. Use agile ways of working
Decision
The service was rated green for point 7 of the Standard.
Optional advice to help the service team continually improve the service
- build in regular agility health checks (or similar) across the full project team and wider ecosystem during private beta, to mitigate change risks, including the need to respond urgently outside of the beta roadmap.
- at the next assessment, ensure that more voices in the team, for example product managers and user researchers, attend and interact, to demonstrate their empowerment and perspectives.
- continue the essential governance and interactions stretching across the organisation and wider ecosystem, to ensure an up-to-date understanding and the best possible opportunity to deliver against changing needs.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
Optional advice to help the service team continually improve the service
- regularly monitor service goals and the flexibility of the team.
- the team’s biggest stated risk is not knowing the exact scenario it will face in public beta. To mitigate this, ensure that the private beta approach includes the ability to evolve the team’s understanding of how to resource iteration and improvement depending on the need.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
Optional advice to help the service team continually improve the service
- ensure an increased focus on non-digital elements.
- the team could consider using automated penetration and vulnerability testing, such as OWASP ZAP or JFrog Xray (or equivalent technologies for the chosen technical stack); see the sketch after this list.
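As a minimal, illustrative sketch only (not the team’s chosen tooling): the Python snippet below drives a locally running OWASP ZAP daemon through the python-owasp-zap-v2.4 client to spider a target, wait for passive scanning to complete, and fail if high-risk alerts are raised. The target URL, API key and proxy port are placeholder assumptions.

```python
# Sketch: automating an OWASP ZAP scan via its Python API.
# Assumes a ZAP daemon is already running locally and that the
# python-owasp-zap-v2.4 package is installed. The target URL and
# API key below are placeholders, not values from the service.
import sys
import time

from zapv2 import ZAPv2

TARGET = "https://staging.example.gov.uk"  # placeholder target
ZAP_API_KEY = "changeme"                   # placeholder API key

zap = ZAPv2(
    apikey=ZAP_API_KEY,
    proxies={"http": "http://localhost:8080", "https": "http://localhost:8080"},
)

# Crawl the target so ZAP knows which URLs to test.
scan_id = zap.spider.scan(TARGET)
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

# Wait for passive scanning of the crawled pages to finish.
while int(zap.pscan.records_to_scan) > 0:
    time.sleep(2)

# Fail the pipeline if any high-risk alerts were raised.
high_risk = [a for a in zap.core.alerts(baseurl=TARGET) if a["risk"] == "High"]
for alert in high_risk:
    print(f"{alert['risk']}: {alert['alert']} at {alert['url']}")
sys.exit(1 if high_risk else 0)
```

A step like this could run in the team’s delivery pipeline against a staging environment; equivalent approaches exist for other stacks.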
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
Optional advice to help the service team continually improve the service
- build on the progress made during alpha by dedicating user research capability in private beta to the different stakeholder and user needs related to performance data.
- continue the wider ‘team sport’ ethos demonstrated in defining KPI goals, and plan to learn from analytics during private beta.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.