Give feedback on care live assessment report

The report for the Care Quality Commission's give feedback on care live assessment on 6 October 2020

From: Government Digital Service (GDS)
Assessment date: 06/10/2020
Stage: Live
Result: Met
Service provider: Care Quality Commission (CQC)

Service description

The give feedback on care service gives people a way to tell the Care Quality Commission (CQC) about their experience of a health or social care provider. CQC inspectors and analysts use the information from this service to help decide when, where and what to inspect.

Service users

  • anyone who has used health or social care services in England
  • carers/relatives of people who use health and social care services in England
  • individuals who work in health and social care settings in England, for example potential whistleblowers
  • organisations that are part of CQC's charity partnership work and are incentivised to gather feedback from users whose voices are seldom heard
  • internal CQC staff who triage, analyse and act on the information that a member of the public submits

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to clearly articulate who their users are and what they need - the user needs for the service were clear and evidence-based, and the team brought their users to life and built empathy through the use of personas and user scenarios
  • research has been conducted with users of multiple skill levels, users with access needs and those with assisted digital needs - the team have also conducted research across a range of devices to ensure that users can give feedback on care regardless of their digital skills or device availability
  • the team have considered the universal factors which affect their users’ behaviour and interaction with the service, such as their available time, their state of mind, their relationship to the person receiving care, their relationship to the care provider, and their knowledge of CQC and its remit - by taking each of these factors into account, the team has built a user-centred service that truly meets the needs of their users
  • research is conducted as a team sport; at least one team member is involved in every research session and analysis is conducted as a team activity - it was clear that this process has enabled the team to build empathy for their users, leading to a truly user-centric design of this service

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • research has been conducted for 2-3 days each sprint, allowing for regular contact with users and quick design iteration
  • the team have used a variety of methods, such as A/B testing, first click testing, exit interviews and completion surveys to paint a holistic picture of how users are accessing and engaging with the service
  • analytics are used effectively to paint a picture of user behaviour - the team were able to explain how they assess the effectiveness of new iterations using analytics, and how they triangulate research methods to combine qualitative and quantitative insights
  • a comprehensive research plan is in place for live, with both short-term and long-term goals documented - where research may pose some difficulties (such as with ‘whistleblower’ users), the team have considered workarounds and chosen suitable methodologies to guarantee that the service can be tested effectively once live
  • the researchers plan to use ‘surrogate users’ to test aspects of the service typically accessed by hard-to-recruit users

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that enough time is allocated to meeting both short and long-term research goals - the team has a very ambitious plan for ongoing research, but the team’s availability will reduce significantly in live
  • work with the wider CQC to consider the amount of time needed to continue monitoring and iterating this service, and plan appropriate time to continue their great work in this area

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has the expected range of roles and is staffed with CQC employees in all the roles
  • the team clearly see user research as a whole team activity and also demonstrated an impressive commitment to learning from analytics
  • the team will be allowed to continue to work together on this service into live, and there is a flexible approach to when this work will happen, allowing for the most effective iteration of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • regularly review the live team’s capacity and ensure that the team are able to continue to iterate and improve the service in the time available - the panel were concerned that the proposed team for live will struggle to tackle the work outlined in their roadmap within the confines of half a day per week
  • encourage CQC to consider funding teams rather than individual pieces of work, as this would allow this clearly high-performing team to continue working together to deliver other services for the organisation

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team use agile methods well and were able to demonstrate how they have adapted their ceremonies and processes to help them deliver at pace
  • the team is using appropriate tools to help support their work and mitigate the risks of not being co-located
  • the team has tackled the specific challenges of working from home during Covid-19 and has conducted most of their public beta while working remotely

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has iterated the service and were able to demonstrate to the panel how they had researched and tested different elements of the service to inform these iterations
  • the team was able to outline a plan for how they will use the time available in live to iterate and improve the service in the future
  • the team has an extensive plan for future areas of improvement

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the service uses common technologies: Azure, .NET Core and Azure Cosmos DB
  • the service interacts with, and feeds into, existing and legacy processes and systems (Siebel CMS)
  • comprehensive technical documentation and diagrams were available and provided for assessment
  • the service uses Azure cognitive services to provide an intelligent provider search feature - an illustrative sketch of this kind of search follows this list
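
The report notes only that cognitive services power the provider search, so the sketch below is illustrative rather than a description of the team’s actual implementation. It assumes Azure Cognitive Search via the azure-search-documents Python client; the endpoint, index name, field names and key are all placeholders.

    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient

    # Placeholder values - the CQC service's real endpoint, index and
    # schema are not published in this report.
    client = SearchClient(
        endpoint="https://example-search.search.windows.net",
        index_name="care-providers",
        credential=AzureKeyCredential("YOUR_API_KEY"),
    )

    # A free-text query lets users find a provider even with a partial
    # or misspelled name.
    results = client.search(search_text="st marys hospital", top=5)
    for provider in results:
        print(provider["name"], provider["postcode"])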

What the team needs to explore

Before their next assessment, the team needs to:

  • explore ways of improving or redeveloping the existing APIs to meet the service’s needs, rather than encapsulating and enhancing the provider data within the service
  • explore opening up the search feature/function for other, similar, services and, if appropriate, external developers
  • consider sharing lessons learnt from the development and iteration of the provider search feature

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has demonstrated a robust security checking and testing process including third party penetration testing
  • the design of the system minimises the trust boundaries and encapsulates existing services
  • the user privacy experience is well considered and a data protection impact assessment (DPIA) has been completed with internal colleagues

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • all code for the service is open source on the CQC digital space on GitHub

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service uses Tag Helpers to implement GOV.UK Design System components
  • the service is using GOV.UK Notify for email receipts - a minimal illustrative sketch follows this list
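
As a minimal sketch of the email receipt pattern, the snippet below uses the official GOV.UK Notify Python client (the service itself is .NET Core, so this is illustrative only). The API key, template ID and personalisation fields are placeholders, not the service’s real values.

    from notifications_python_client.notifications import NotificationsAPIClient

    client = NotificationsAPIClient("YOUR_NOTIFY_API_KEY")

    # Sends a templated receipt email; the template and its fields are
    # hypothetical examples, not taken from the CQC service.
    client.send_email_notification(
        email_address="member.of.public@example.com",
        template_id="f33517ff-2a88-4f6e-b855-c550268ce08a",  # placeholder ID
        personalisation={
            "reference_number": "ABC123",
            "provider_name": "Example Care Home",
        },
    )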

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has mature DevOps and automated testing processes that code must pass through before it can be promoted to live
  • automated testing covers multiple environments and browsers - a sketch of one way to do this follows this list
  • accessibility testing and auditing is being undertaken with results acted upon
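
The report does not describe the team’s pipeline in detail; as one illustrative way to run the same checks across several browsers, a parametrised pytest and Selenium smoke test might look like the sketch below. The URL, browser list and assertion are assumptions.

    import pytest
    from selenium import webdriver

    SERVICE_URL = "https://www.cqc.org.uk/give-feedback-on-care"  # public start page

    # Hypothetical browser matrix - a real pipeline might add more browsers
    # and run once per environment (test, staging, live).
    BROWSERS = {
        "chrome": webdriver.Chrome,
        "firefox": webdriver.Firefox,
    }

    @pytest.fixture(params=list(BROWSERS))
    def browser(request):
        driver = BROWSERS[request.param]()
        yield driver
        driver.quit()

    def test_start_page_loads(browser):
        browser.get(SERVICE_URL)
        assert "feedback" in browser.title.lower()  # assumed page title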

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has developed clear guidance for citizens on the types of feedback that are not appropriate for the service, and provides alternative feedback routes if the service does not meet their needs

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a mature process of using analytical tools to identify user experience (UX) problems - a comprehensive example was given about how the service team used analytical tools to understand why users were struggling to get through the “search” part of the journey
  • the team were able to give robust metrics to prove their design changes had improved the journey to help users succeed
  • the team has worked through all significant accessibility issues, solving complex design challenges to make the service accessible to all

What the team needs to explore

Before their next assessment, the team needs to:

  • continue the standard and quality of work they have been producing to date

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team consistently applied common patterns as the first port of call, but iterated these patterns with a rigorous testing approach to ensure deviations worked for users
  • the team has solved a lot of common accessibility challenges, particularly with regard to the checkbox within a filter - the rest of government would benefit hugely from this insight being shared back into the GOV.UK Design System

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a range of ways in which they have encouraged users to use the digital service - they understood how usage of the service had increased during public beta and had a plan for how to further drive usage in the future

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed the use of offline and online channel data to gauge end-to-end service performance both pre and post improvements, which strengthened assurance that improvements were realised and caused no detrimental impact to any other part of the journey
  • the team showed multiple examples of where experimentation was used to evolve the service, with both hypotheses and less formal assumptions - this meant they were able to articulate problems, assumptions and improvement plans, and then show the realised improvements, using statistical methods to validate findings (a sketch of this kind of check follows this list)
  • the team showed strong evidence of close working with performance analytics on both service running and service improvement opportunities which meant that improvements could be designed with insight from all professions, giving a stronger story to the opportunity and a more comprehensive way of measuring change
  • the team uses Google Analytics for the majority of the key performance indicators (KPIs) - the team are aware of the limitations of Google Analytics as a monitoring tool, so have actively tracked reported volumes against server volumes to triangulate and verify that the data is trustworthy
  • the use of (and access to) multiple data sources, coupled with data visualisation tools like Data Studio and Power BI, has allowed the team to pool end-to-end journey data into a single source and share findings with stakeholder groups for a more complete picture of the service’s performance
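
As an illustration of the kind of statistical validation described above (the report does not show the team’s actual method or numbers), a two-proportion z-test comparing completion rates between a control and a variant might look like this sketch; the counts are invented.

    from math import sqrt
    from scipy.stats import norm

    def two_proportion_z_test(success_a, n_a, success_b, n_b):
        """Two-sided z-test for a difference between two completion rates."""
        pooled = (success_a + success_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (success_b / n_b - success_a / n_a) / se
        p_value = 2 * norm.sf(abs(z))
        return z, p_value

    # Invented example: control converts 410/1000 starts, variant 465/1000
    z, p = two_proportion_z_test(410, 1000, 465, 1000)
    print(f"z = {z:.2f}, p = {p:.4f}")  # a small p supports a real improvement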

What the team needs to explore

Before their next assessment, the team needs to:

  • for future experiments, explore running multivariate tests and monitoring success/defence metrics - having multiple variations against the control may offer a wider spread of results and give the team options when discussing proposed new versions of service elements (the experiments shown produced strongly positive results)

The team has seen challenges with sharing digital data across multiple stakeholder groups, including access rights and stakeholders’ comfort in navigating the data applications. The team has done a lot to simplify the dashboards and to use applications that are both user friendly and self-service. This is a common challenge with analytics, and while the team has done a lot to reduce concerns in this area, it remains an ongoing one.

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has revisited the performance indicators to ensure the key measures reflect the service as it moves across the stages - there is an understanding that improving KPI performance is a continuous challenge, and the team showed evidence of a wide variety of improvements across the end-to-end journey, using the KPIs as a major component in measuring those changes
  • the team showed a strong understanding of the performance of the service, both as a snapshot and over time through the beta stages, identifying periods of stability and ‘what good looks like’ as well as where erratic events occurred, whether by external influence or technical/service team changes - this gave the panel assurance that the team knew the service’s journey through the lens of the KPIs
  • the team showed a clear understanding of the key metrics and why they were chosen, as well as the performance of the metrics over time against baselines - the use of a team-developed baseline has allowed the team to challenge and validate their assumptions of what good looks like, while also easily visualising service performance for all stakeholder skillsets, which helps to remove the misinterpretation of data that is inevitable when sharing dashboards across multiple stakeholder groups
  • the team showed success in channel shift and discussed future strategies for improving shift, including online communications plans, working with third-party associations and targeted call-to-action campaigns
  • the team discussed user satisfaction and completion rate successes and challenges comprehensively and showed an understanding of where and why users are failing to complete (especially first time)
  • the team is aware, through thematic analysis of the satisfaction comments, that a portion of users were scoring and commenting on care providers through the customer satisfaction part of the journey - however, the team has consistently scored over 80% on customer satisfaction, so while this contamination is a concern, improvements elsewhere in the service are taking priority

What the team needs to explore

Before their next assessment, the team needs to:

  • develop additional metrics that focus on ‘succeeding first time’ as these would add value to the suite of measures already available
  • do further work to improve users’ transition from service journey to service feedback, so users are aware of the purpose of the customer satisfaction survey - removing contamination in the scores will give a cleaner, more accurate reflection of the service’s satisfaction score

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • all KPIs were available and automated where possible
  • cost per transaction estimates showed considerable cost reduction going into live

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had tested with their minister and showed an impressive commitment to ensuring senior leaders were aware of their work and the service’s development

Published 10 December 2020