Digital Service Standard assessment report
Give feedback on care
From: Department of Health and Social Care (DHSC)
Assessment date: 20/08/2019
Stage: Beta
Result: Met
Service provider: Care Quality Commission (CQC)
Service description
The Give feedback on care service is a way for people to tell the Care Quality Commission (CQC) about their experience of a health or social care provider. CQC inspectors and analysts use the information from this service to help CQC decide when, where and what to inspect.
Service users
- anyone who has used health or social care services in England
- carers/relatives of people who use health and social care services in England
- individuals who work in health and social care settings in England (for example, potential whistleblowers)
- organisations that are part of CQC's charity partnerships work and are incentivised to gather feedback from seldom-heard users
- internal CQC staff who triage, analyse and act on the information that members of the public submit
1. Understand user needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the service team (operating in a complex context) is working in a way that is highly user-focused and evidence-based
- the researcher has developed simple yet effective behavioural personas (and anti-personas) which have allowed the team to consider deeper user needs when designing content
- the service has been developed to take account of natural user behaviour and preferences, while accommodating the needs of the organisation, and the team has thought carefully about how to handle situations where these may be in conflict. As an example, the team were able to clearly articulate how they had designed for staff whistleblowers, to encourage them to share important information about patient care while advising them about possible complexities in the legal situation
- research has been conducted across multiple platforms, and the team showed a very good understanding of mobile performance and constraints
- accessibility and assisted digital needs have been researched thoroughly, and the team showed a deep understanding of access needs in relation to the service
- the team is also future-proofing the service, by considering new care contexts, such as e-health/online GPs
What the team needs to explore
Before their next assessment, the team needs to:
- share their ways of working with others - there is a great deal of good practice here which others could learn from
- continue to build their understanding of real users and important subgroups, as they operate at greater scale
2. Do ongoing user research
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has taken a considered, thoughtful and strategic approach to user research during private beta, using a mixture of methods to inform the development of their service design
- the team has looked at the realistic end-to-end journey, including email acknowledgements
- the team has continually used evidence and feedback from live service teams to guide their thinking and develop a service which meets user needs
- they have made highly intelligent use of analytics and A/B testing during private beta, and carefully monitored the differences between general user research and evidence from real, live users of the service
What the team needs to explore
Before their next assessment, the team needs to:
- continue their thinking about the end-to-end journey (including search and the CQC start page)
- continue to use feedback and data from real users, at volume, as they scale up the service
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- there is a full, agile, multidisciplinary team in place: 14 people, of whom 66% are permanent staff and the rest contractors
- there is currently a vacancy for an interaction designer, which a content designer is filling temporarily
- going into public beta, most of the team will remain the same, with contractors replaced by permanent staff
What the team needs to explore
Before their next assessment, the team needs to:
- make sure that when contractor roles are replaced by permanent staff, there is enough time for handover to ensure knowledge transfer
4. Use agile methods
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team demonstrated a mature use and good understanding of agile techniques
- they are using Scrum and working in 2-week sprints
- they showed that they are regularly engaging with their stakeholders, doing show and tells and inviting other teams to shadow
- the team are not all co-located, but have made good use of a variety of collaboration tools (Jira, Confluence, Google Hangouts, Skype). They have set times for their agile ceremonies and also meet regularly as a whole team
- the team are champions of the agile approach at CQC and have set up a reverse mentoring scheme for leaders with no agile experience
What the team needs to explore
Before their next assessment, the team needs to:
- make sure that they stay empowered to make decisions on their service as governance structures change under the wider transformation programme
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- this is a user-driven project; prioritisation is informed by user research and performance data
- the team demonstrated how user research and data feed into their weekly sprint review, how changes are taken forward, prototyped, tested and iterated, and how permanent changes are developed, tested and published to the live service
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses recognised technologies such as Azure and Azure Cosmos DB
- deployment is configured using Azure DevOps, with two main integrations: the Notify API and Siebel CMS
- a very clear and informative technical diagram was presented at the assessment which enabled both technical and non-technical assessors to appreciate the technology used
What the team needs to explore
Before their next assessment, the team needs to:
- consider using RESTful APIs and, if this is not feasible, provide a short report explaining why not (a sketch of what this could look like follows)
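As an illustration only, here is a minimal sketch of what a RESTful endpoint could look like on the service's .NET Core stack. The route, controller and Feedback type are hypothetical, not the team's actual API:

```csharp
// Minimal sketch of a RESTful endpoint in ASP.NET Core. The route,
// controller and Feedback type are hypothetical illustrations.
using Microsoft.AspNetCore.Mvc;

public record Feedback(string ProviderId, string Experience);

[ApiController]
[Route("api/feedback")]
public class FeedbackController : ControllerBase
{
    // POST /api/feedback - create a new piece of feedback.
    [HttpPost]
    public IActionResult Create([FromBody] Feedback feedback)
    {
        // Persistence (for example, to Azure Cosmos DB) would happen here.
        var id = System.Guid.NewGuid().ToString();

        // Return 201 Created with a URI for the new resource - the core
        // RESTful convention: resources addressed by URIs and manipulated
        // through standard HTTP verbs and status codes.
        return Created($"/api/feedback/{id}", feedback);
    }
}
```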
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team worked frequently with assurance colleagues to produce a threat model as part of their Data Protection Impact Assessment (DPIA), which also covers the GDPR requirements for the service
- the team used peer review and automated steps
- penetration tests carried out by a third party were passed
- there is an independent QA Team
- back office security is in place
What the team needs to explore
Before their next assessment, the team needs to:
- keep up the good work
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the service is built on .NET Core, with a roadmap to publish the code in the open
- the team is considering code reuse, for example around working user registrations
What the team needs to explore
Before their next assessment, the team needs to:
- provide a link to the code on public GitHub
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses Tag Helpers to implement the GDS Design System
- the service uses GOV.UK Notify (a sketch of a typical integration follows)
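As an illustration, here is a minimal sketch of a typical GOV.UK Notify integration for the email acknowledgements mentioned under point 2, assuming the official .NET client (the GovukNotify NuGet package). The API key, template ID and personalisation fields are placeholders:

```csharp
// Minimal sketch of sending an acknowledgement email via GOV.UK Notify
// using the official .NET client. All identifiers here are placeholders.
using System.Collections.Generic;
using Notify.Client;

public class AcknowledgementSender
{
    private readonly NotificationClient _client;

    public AcknowledgementSender(string apiKey) =>
        _client = new NotificationClient(apiKey);

    public void SendAcknowledgement(string emailAddress, string referenceNumber)
    {
        // Personalisation keys must match the placeholders in the Notify template.
        var personalisation = new Dictionary<string, dynamic>
        {
            { "reference_number", referenceNumber }
        };

        _client.SendEmail(
            emailAddress,
            "feedback-acknowledgement-template-id", // hypothetical template ID
            personalisation);
    }
}
```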
What the team needs to explore
Before their next assessment, the team needs to:
- look at how to include the locations of all providers
- look at how address search functionality might be reused
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- unit testing of all source code was carried out (a sketch of this kind of test follows this list)
- automated testing was carried out, covering smoke testing, user story acceptance, regression testing and compatibility testing
- performance and resilience testing (soak testing, ramp/volume testing and third-party security/penetration testing) was carried out
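As an illustration, here is a minimal sketch of the kind of unit test described above, written with xUnit (a common choice for .NET Core projects). The FeedbackValidator class is a hypothetical stand-in, since the team's code is not yet published in the open:

```csharp
// Minimal sketch of a unit test with xUnit. FeedbackValidator is a
// hypothetical stand-in for the service's real validation logic.
using Xunit;

public static class FeedbackValidator
{
    // A feedback submission must include some free-text experience.
    public static bool IsValid(string experience) =>
        !string.IsNullOrWhiteSpace(experience);
}

public class FeedbackValidatorTests
{
    [Fact]
    public void RejectsWhitespaceOnlyFeedback() =>
        Assert.False(FeedbackValidator.IsValid("   "));

    [Fact]
    public void AcceptsNonEmptyFeedback() =>
        Assert.True(FeedbackValidator.IsValid("The staff were kind."));
}
```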
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate expanded proactive monitoring that feeds back into the web application
- make the web application truly reusable
- plan the integration with the data warehouse: CQC is considering how the data can be used and what the warehouse should do, rather than just dropping data into it
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a plan for handling offline scenarios via the pre-existing 'Share your experience' form
- the ultimate fallback is the customer service centre
What the team needs to explore
Before their next assessment, the team needs to:
- report on progress in replacing the current backup, ATOS, with the ITSM framework
12. Make sure users succeed first time
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the service is intuitive, and does a good job of guiding users through the process of giving their feedback
- the journey adapts based on what type of provider the user is giving feedback about, to avoid asking the user redundant or confusing questions
- the team could clearly articulate the rationale for design decisions that felt counterintuitive, for example putting the 'whistleblowing' page after the 'feedback' page
- the team has a good understanding of which parts of the design are the least effective. It was refreshing to see a team share work in progress during an assessment. The panel was confident that the good work they are doing will continue in public beta, and will be focused on the parts of the service that will most benefit from it
- the team had an accessibility audit carried out by the Digital Accessibility Centre (DAC), as requested in their alpha assessment
- they have been working on fixing issues raised in the audit report and, going into public beta, they expect all A and AA issues to be resolved
What the team needs to explore
Before their next assessment, the team needs to:
- define a quantitative metric that measures whether their search is working for users. This metric (or metrics) should be a fair proxy for what they think is a good user experience, based on what they've observed in usability testing. Some examples that other teams have used are how often the user clicks one of the top three results, or the number of users who have to edit their search terms before clicking a result. Without a quantitative measure it will be difficult to know if tuning the search has been effective, or has had unintended side effects. The panel recommends talking to the search team on GOV.UK to help define this measure (a sketch of how one such metric could be computed follows this list)
- continue to experiment with varying the questions they ask to get the most useful detail in the feedback their users are providing, balanced against the user’s need to voice their experiences
- address the findings of both the third party and GDS accessibility reports
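As an illustration, here is a minimal sketch of how the first example metric (the share of searches where the user clicks one of the top three results) could be computed. The SearchEvent record and its data source are hypothetical; real figures would come from the service's analytics:

```csharp
// Minimal sketch of computing a top-3 click-through rate for search.
// SearchEvent is a hypothetical analytics record: ClickedResultRank is the
// 1-based rank of the result the user clicked, or null if they clicked none.
using System.Collections.Generic;
using System.Linq;

public record SearchEvent(string SessionId, string Query, int? ClickedResultRank);

public static class SearchMetrics
{
    // Fraction of searches that ended with a click on one of the top 3 results.
    public static double TopThreeClickRate(IEnumerable<SearchEvent> events)
    {
        var searches = events.ToList();
        if (searches.Count == 0) return 0.0;

        int topThreeClicks = searches.Count(e =>
            e.ClickedResultRank is >= 1 and <= 3);

        return (double)topThreeClicks / searches.Count;
    }
}
```

Tracked over time, a measure like this would show whether tuning the search is helping users or having unintended side effects.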
13. Make the user experience consistent with GOV.UK
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- while the service is exempt from being on GOV.UK, the team has extensively reused patterns and components from the GOV.UK Design System
What the team needs to explore
Before their next assessment, the team needs to:
- look at the consistency between the search page and search results page. The search page has a magnifying glass icon on a blue background; the results page has a green ‘Search’ button. It’s also worth looking at how far down the list of providers is pushed on the results page. Ideally users should be able to see a few results before they have to scroll
- consider taking users back to the ‘check your answers’ page after they have changed one of their answers (rather than making them continue through the rest of the journey again). Other teams on the GOV.UK Design System backlog have reported that this has worked better on their services
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- because the current (legacy) service drives failure demand to phone lines, the team is working closely with call handlers while developing the new service to help reduce the number of people using non-digital channels
- the team recognises that the form's search is their biggest challenge and that this drives calls to call centres. They are taking steps to improve search, intend to prioritise this for public beta, and will continue to evolve it
- the team has done research on the title and start page of the service to make it easy for users to understand what the service is for and how to use it
- the team explained that healthcare providers are legally mandated to display posters
- the team has done search keyword analysis in order to drive users to the service
- the depth of the user research work and understanding, including on assisted digital, was impressive
What the team needs to explore
Before their next assessment, the team needs to:
- do more research into where and how the service is represented on the CQC homepage and how search keywords can help drive users to the service
- link up with the GOV.UK search team and other teams that have overcome similar challenges
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team demonstrated how they use performance data to make decisions, drawing on an iteration of the 'give your feedback' page to show how data is used alongside user research
- the team made excellent use of their legacy platform for baselining, but also explained nuances to these comparisons; for example, changes in the number of anonymous users may be due to clearer content
What the team needs to explore
Before their next assessment, the team could explore:
- making more use of visualisation tools such as Google Data Studio to present data, which might help the wider team with their understanding of data, as well as with benchmarking and the analysis of A/B tests
- planning a suitable transition process if, as suggested, the team moves away from Google products in the longer term: maintaining existing standards of anonymising data and considering dual running with the new tool
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- there is a good understanding of how the mandatory KPIs are calculated. They have also identified additional service-specific KPIs which are monitored internally
What the team needs to explore
Before their next assessment, the team could explore:
- performance frameworks, which the panel discussed briefly and which can be useful as services transition between phases of development. While the team has an established measurement plan, a framework might help focus the ongoing roadmap and the connections with user research
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team are ready with their Performance Platform dashboard
What the team needs to explore
Before their next assessment, the team needs to:
- provide a link to their dashboard on the Performance Platform
18. Test with the minister
Decision
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the Chief Executive met with the Minister of State for Care on 23 July and tested the service with her
- the Chief Executive intends to share an update on plans for the launch into public beta with the minister
What the team needs to explore
Before their next assessment, the team needs to:
- continue to test with the Chief Executive and engage with the minister
Next Steps
This service can now move into a public beta phase, following the recommendations outlined in the report.
The service must pass a live assessment before:
- turning off their legacy service
- reducing the team’s resource to a ‘business as usual’ team, or
- removing the ‘beta’ banner from the service
The panel recommends this service sits a live assessment in around 6 to 12 months' time. Speak to your Digital Engagement Manager to arrange it as soon as possible.
This service now has permission to launch on a GOV.UK service domain with a Beta banner. These instructions explain how to set up your *.service.gov.uk domain.