Record a patient safety event
The report for the Department of Health and Social Care's record a patient safety event beta assessment on 3 March 2020.
From: GDS
Assessment date: 3 March 2020
Stage: Beta
Result: Met
Service provider: NHS England and NHS Improvement (NHSE/I)
Previous assessment reports
Service description
This service aims to help support national learning for improvement in the safety of NHS care.
Service users
This service is for healthcare staff who wish to voluntarily share information about things that have gone wrong in healthcare. The information collected is used by the National Patient Safety team at NHSE/I to fulfil statutory duties to understand the safety of NHS services, and to provide advice and guidance back to the NHS on how they can improve their practice to keep patients safer.
1. Understand user needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team tested with a variety of users (for example occasional, regular, system users) and organisations (for example small, large, with LRMS and without, engaged and not-engaged)
- the team used a variety of methods appropriate to beta (for example moderated and unmoderated, qualitative and quantitative/survey)
What the team needs to explore
Before their next assessment, the team needs to:
- plan their next recruitment and testing cycles around groups of users (for example, engaged/disengaged, clinical/non-clinical, information inputter/information user, with LRMS or not), rather than around organisations (for example, integrated or not). This is to ensure even more variety and representativeness in their sample
2. Do ongoing research
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team used research-based user needs and stories to prioritise their backlog in an iterative way, testing different features across testing rounds
- the team continually test content and language with a variety of users (for example clinical and non-clinical reporters), challenging their own assumptions about what is more intuitive
What the team needs to explore
Before their next assessment, the team needs to:
- do end-to-end user flow testing (possibly A/B testing), gathering solid evidence of:
- which flow users prefer (for example, a text box at the end or the beginning) to maximise uptake and decrease under-reporting
- how different flows might produce higher or lower quality outputs for the system users (for example, academics being able to use the information quickly)
It is important this is investigated by testing the riskiest assumptions, so that the balance between the needs of the users and the needs of the system is based on research and testing
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team is fully blended, with NHS staff and contractors co-located and working as one team
- the team already has knowledge sharing activities underway to mitigate the loss of the supplier in the future
- the team is engaging closely with the vendors, and using various comms methods to get the message out
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the knowledge sharing activities continue as the work increases, as it is very easy for such activities to be deprioritised
- make sure that a content designer and an interaction designer are involved in any further iteration, and that they're familiar with the guidance on designing and writing for NHS services
4. Use agile methods
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team is co-located and working in an iterative way
- the team are working together to drive prioritisation, and the whole team seems fully engaged with the vision for the product
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team are iterating the frontend design well; it was great to see such a clear example of using feedback to improve the user experience
What the team needs to explore
Before their next assessment, the team needs to:
- continue iterating the design of the service to make it easier for a user to complete their task and to collect the relevant data required for the business, specifically enabling the user to submit earlier
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- there is extensive use of native cloud functions, infrastructure as code and automatic deployment of updates
- APIs are used to good effect for LRMS and for moving data to the long-term datastore
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- significant effort has been put into identifying unnecessary personal information in the submitted reports
- there is good role separation between development and live environments
What the team needs to explore
Before their next assessment, the team needs to:
- continue the development of machine learning in the linguistics module to identify inappropriate PII with more confidence
- prepare for a possible increase in erroneous submissions as the new solution becomes more accessible
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- open source products are being used where possible and appropriate
- although new code is not being published, it is already being shared and reused within NHSE
What the team needs to explore
Before their next assessment, the team needs to:
- produce a credible process for reviewing new code so that it can be made public
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- open source platforms and tools are being used where possible and appropriate
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has separated collecting incident reports from analysis, as recommended at alpha. However, this has led to some loss of the overall context when this service is considered in isolation
What the team needs to explore
Before their next assessment, the team needs to:
- have a credible plan for monitoring the effects of increasing load, especially on the linguistics processing, which doesn't scale automatically
- continue to iterate based on user feedback to improve confidence that expectations are being met
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team understood patterns of use of the old NRLS system and used this to argue that the new service being offline would not present any real hardship
- the team understood the inherent resilience of the cloud services that should make outages less common
- supplementary information made it very clear what steps would be taken if the service nevertheless did go offline
12. Make sure users succeed the first time
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team took on board the recommendation of having an interaction designer and a content designer as part of the team
- a significant amount of effort was devoted to testing the online recording journey, despite the fact that it only applies to a minority of users
- the team has gathered feedback from a variety of users, testing some of the riskiest assumptions around language
- the team made sure the service is accessible
What the team needs to explore
Before their next assessment, the team needs to:
- use the online incident report journey as the main proxy to design and test the riskiest assumptions about the way data is structured
- run testing with a version of the content that’s fairly stable and they feel confident about, using A/B testing for terminology
- understand how the messaging on the landing page can be improved to support digital take-up and to align with the wider comms plan, removing detailed guidance that should be provided throughout the journey
- make sure all the assumptions tested for this phase are tested again before the online reporting journey is made available to other user groups, in case this happens before the Live assessment. Special consideration should be devoted to users who have access needs and less familiarity with the terminology
13. Make the user experience consistent with GOV.UK
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team largely used standard patterns and components - a list of suggested improvements was shared with the team where the current version diverged
What the team needs to explore
Before their next assessment, the team needs to:
- focus their content reviews on making language as plain as possible, following the NHS style guide, particularly addressing passive voice and using plain English synonyms where terminology isn't required to meet the expectations of the current user groups
- make sure that a content designer and an interaction designer are involved in any further iteration, and that they're familiar with the guidance on designing and writing for NHS services
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team are engaging closely with the vendors and have a good view of when each vendor will roll on to the service and the support they may require
What the team needs to explore
Before their next assessment, the team needs to:
- spend some time understanding the support models in place within the organisations that will be using the service, as they are currently working on the assumption that what is already in place will suffice
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team are collecting data from a variety of sources
- data is being combined for use in decision making
What the team needs to explore
Before their next assessment, the team needs to:
- define the data collected for MI and ensure this can continue if the service moves into live
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team have a suite of measures they can apply to their service
- the team have measured the goals they set out in private beta
- the team have measurements aligned to their hypotheses and iterations
- the team used analysts to understand sample sizes despite not having access to a digital performance analyst
What the team needs to explore
Before their next assessment, the team needs to:
- consider the measurable goals of their public beta. There are some high level goals for the service overall, but some micro level goals will allow the team to measure their achievement as they move through private beta and understand when they are ready to move on
- understand how they will evidence achievement of these goals, and highlight measures and benchmarks that will show progress
- have a performance analyst with time dedicated to the team
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team are reporting outside of the development team
- the team has automated the regular sharing of data, which will put them in a good position to have data ready to upload to the Performance Platform
What the team needs to explore
Before their next assessment, the team needs to:
- understand how to get the data they share onto the Performance Platform before going live
18. Test with the minister
Decision
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the service is highly visible within the department and to wider stakeholders, including a mention in the minister's comms