Share a concern, make a complaint on care – alpha assessment
The report from the alpha assessment of the Care Quality Commission's share a concern on care service on 7 November 2017.
From: | Central Digital and Data Office |
---|---|
Assessment date: | 07 November 2017 |
Stage: | Alpha |
Result: | Met |
Service provider: | Care Quality Commission |
The service met the Standard because:
- the service team have a strong understanding of their users and their needs and have built and continuously iterated a prototype based on ongoing user research
- the team are capable and working in a collaborative and iterative fashion
- assisted digital support is well defined and there are strong plans in place to test this support.
About the service
Description
The service is for the public and the staff working in health and social care in England to express their concerns, complaints or positive comments about their care or care they have witnessed.
Service users
The users of this service are:
- anyone who has used health or social care services in England
- carers and relatives of people who use health and social care services in England
- individuals who work in health and social care settings in England (i.e. potential whistleblowers)
Detail
User needs
The panel were impressed with the level of research the team conducted in Discovery and Alpha, how the team has been exposed to the research, and the insights they have generated.
In Discovery the team spoke to 33 users about their needs and experiences of using the existing service. In Alpha this was expanded to a further 40+ users to understand needs, motivations and pain points, which also enabled the team to conduct user testing on the Alpha service.
The team have a good understanding of the users of the service, categorising them as:
- Primary Users: users of social care, relatives, and healthcare professionals (potential whistleblowers)
- Super Users: users and charities who make complaints or complete forms on behalf of users, and commissioners in charge of making decisions around funding
- Secondary Users: CQC Intelligence and CQC contact centre staff in charge of reading through the submissions and analysing the data, as well as deciding which submissions should be followed up on and looked into urgently
A deep understanding was demonstrated of the needs of users, and the service they are building reflects this.
Users across the digital inclusion scale have been spoken to, and this will be broadened in Private Beta. The team has also spoken to users who have disabilities to understand their needs and any barriers they might have in using the new service. Plans are in place to conduct an Accessibility Audit in Private Beta.
The panel were also impressed by the way in which the team has made product decisions based on the research they conducted and other available data. This was particularly evident in the work that had been conducted on the name of the service, and in other content decisions.
An inclusive approach to research was apparent, including observation, note taking and analysis of user research, which has enabled the team to make confident service decisions.
The team has a clear plan for research in Private Beta which includes more testing on mobile devices, research with assisted digital users, and remote testing to increase the scope of the user testing they are doing.
Team
The team have all key roles present and are all co-located. Plans are in place to reduce the current 50% of the team who are contingent labour by upskilling permanent staff and transferring knowledge to them; this is something the team should focus on in the next phase.
The team have very good agile disciplines in place. The team showed they had made good use of Google Drive to document product decisions, and clearly demonstrated the service improvements that have been made in the Alpha. This approach will serve them well as they move into Private Beta.
Communication both within the team and with wider stakeholders is strong. Examples include the team sharing how user research has influenced decisions and changes to the service, as well as the overall progress of the service, via Show and Tells followed by circulation of the highlights of each session.
The team have developed excellent governance for the service which is used only when needed and works well with agile delivery.
Technology
The service is based on Drupal, a PHP-based content management system, making use of its CMS elements. The site will use Drupal's API and data modelling functionality. Drupal has been chosen as it meets the use cases the service presents.
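To illustrate the kind of data modelling and API work this enables, below is a minimal sketch of a custom Drupal 8 REST resource plugin. The `concern_api` module name, the `concern` content type and the response fields are hypothetical, not the team's actual implementation.

```php
<?php

namespace Drupal\concern_api\Plugin\rest\resource;

use Drupal\node\Entity\Node;
use Drupal\rest\Plugin\ResourceBase;
use Drupal\rest\ResourceResponse;
use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;

/**
 * Exposes a submitted concern over Drupal's REST API.
 *
 * @RestResource(
 *   id = "concern_resource",
 *   label = @Translation("Concern resource"),
 *   uri_paths = {
 *     "canonical" = "/api/concerns/{id}"
 *   }
 * )
 */
class ConcernResource extends ResourceBase {

  /**
   * Responds to GET requests with a summary of one concern node.
   */
  public function get($id) {
    $node = Node::load($id);
    // Only expose nodes of the (hypothetical) 'concern' content type.
    if (!$node || $node->bundle() !== 'concern') {
      throw new NotFoundHttpException();
    }
    return new ResourceResponse([
      'id' => $node->id(),
      'title' => $node->getTitle(),
      'submitted' => $node->getCreatedTime(),
    ]);
  }

}
```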
The service is hosted on Azure, using Kubernetes and Docker for continuous integration and deployment, which supports the microservices architecture.
GOV.UK PaaS will be trialled for elements of the service as it develops, starting with Notify. Solr provisioning could also serve this role and will act as an alternative while the service matures.
The team will use Solr or Elasticsearch as the search tool, and as storage once the existing database is phased out. The choice of tool will be determined in the Beta phase by whichever is better supported by GOV.UK PaaS.
Returns are stored in the site instance and then pushed to a storage engine to provide a query snapshot, so that responses can be reviewed against the questions the user was presented with. This will allow the service team to review different iterations of content through A/B testing and analysis.
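As a rough sketch of how such a snapshot might be stored and queried if Elasticsearch is chosen, the example below uses the official elasticsearch-php client (assuming a recent 7.x release); the index names, document fields and fuzzy-search settings are illustrative only, and an equivalent Solarium-based approach would apply for Solr.

```php
<?php

require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

// Index a submission snapshot: answers are stored alongside the exact
// question wording the user saw, so each response can later be reviewed
// against the content iteration that produced it.
$client->index([
  'index' => 'concern-snapshots',
  'body' => [
    'submission_id' => 'abc-123',
    'questionnaire_version' => '2017-11-v3',
    'questions' => [
      ['id' => 'q1', 'text' => 'Who is your concern about?', 'answer' => 'Example Care Home'],
    ],
    'submitted_at' => '2017-11-07T10:00:00Z',
  ],
]);

// Fuzzy matching on provider names tolerates the misspellings that
// cause drop-out in the current search.
$results = $client->search([
  'index' => 'care-providers',
  'body' => [
    'query' => [
      'match' => [
        'name' => ['query' => 'exampel care home', 'fuzziness' => 'AUTO'],
      ],
    ],
  ],
]);
```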
The service runs over HTTPS. API interfaces are secured over HTTPS using tokens, which allows other government departments to perform analysis on the data.
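A token check of roughly this shape could sit in front of the analysis endpoints; the `Authorization` header format and the `API_TOKEN` environment variable are assumptions for illustration, not the team's actual scheme.

```php
<?php
// Illustrative bearer-token check for an HTTPS API endpoint.
$header = $_SERVER['HTTP_AUTHORIZATION'] ?? '';
if (!preg_match('/^Bearer\s+(\S+)$/', $header, $matches)
    || !hash_equals(getenv('API_TOKEN') ?: '', $matches[1])) {
  http_response_code(401);
  header('Content-Type: application/json');
  echo json_encode(['error' => 'invalid or missing token']);
  exit;
}
// Token accepted: the department's analysis query can proceed.
```

Using `hash_equals` rather than `===` avoids leaking information through string-comparison timing.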
Database fields can be marked as sensitive. Sensitive data is not used in the development environment. Developers don’t have access to production data.
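One way this separation can be enforced is a sanitisation step when refreshing a non-production copy of the database; the sketch below is hypothetical, with illustrative table and column names.

```php
<?php
// Blank out fields marked as sensitive so developers never see real
// personal data in the development environment.
$pdo = new PDO(getenv('DEV_DB_DSN'), getenv('DEV_DB_USER'), getenv('DEV_DB_PASS'));

// Hypothetical map of tables to their sensitive columns.
$sensitiveFields = [
  'concern_submissions' => ['reporter_name', 'reporter_email', 'details'],
];

foreach ($sensitiveFields as $table => $columns) {
  foreach ($columns as $column) {
    // Replace real values with an obvious placeholder. Identifiers are
    // hard-coded here for illustration; they are not user input.
    $pdo->exec(sprintf("UPDATE %s SET %s = 'REDACTED'", $table, $column));
  }
}
```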
Users who log into the system will have unique IDs to provide an audit trail.
Okta has been selected to manage IDs; this will be introduced during the beta phase.
The tools being used in the service are open source. Components of the code are being open sourced as Drupal plugins. Once the service is live the entire project’s code can be published as appropriate.
The service doesn’t use any proprietary systems; the team are using community modules for Drupal, including the GDS theme. A Composer-based workflow is being used to manage dependencies and libraries.
Non-CQC testers are providing functional and unit testing; these tests are part of the Jenkins build. Code is then committed to Bitbucket before being deployed to the development environment.
The team currently has 3 environments: test, pre-production and production. Kubernetes is being used to scale the service based on need, and it is expected that DevOps can manage peak demand.
The long-term aim is to use the monitoring tools included in GOV.UK PaaS. In the interim the team are planning to use New Relic; the full requirements for monitoring are still being explored.
Design
The team showed great attention to detail in both interaction and content design, and the panel were impressed by the design iterations presented. It was evident the team had tried out new approaches for flow, groupings of questions and how the questions were framed.
Despite not being on GOV.UK, the service has made excellent use of GOV.UK design patterns from the service manual, and only deviated from the patterns where they have discovered a strong user need to do so.
The team explained that the biggest issue with the current service, and the main cause of user drop-out, is the search facility used to identify who the feedback is about. The team have worked hard to fix this, and have a plan to continuously improve their care service search by capturing data from the free text entry where users can identify the service if they cannot find it in search. Allowing users to enter care service information in a free text box lets them proceed through the service, while also ultimately helping to improve the search functionality.
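A minimal sketch of how those fallback entries might be captured for later analysis; the function, table and column names below are purely illustrative.

```php
<?php
// Record a free-text fallback entry: the user could not find their care
// service in search, so their typed entry is stored to improve the index.
function recordUnmatchedProvider(PDO $pdo, string $searchTerm, string $freeText): void {
  $stmt = $pdo->prepare(
    'INSERT INTO unmatched_provider_entries (search_term, free_text, created_at)
     VALUES (:term, :text, NOW())'
  );
  $stmt->execute([':term' => $searchTerm, ':text' => $freeText]);
}
```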
There is a strong plan for supporting assisted digital users through the service, with a contact number on every page; users can call, and contact centre staff will be trained to guide them through the service and help them complete it online.
Work has taken place on the phrasing of questions: the team have worked with the contact centre, reflecting on conversations with callers to inform the wording.
There is a comprehensive support model in place for when the service is offline.
Users can speak to someone during office hours to log a complaint or leave a recorded message after hours.
Calls flagged as urgent safeguarding issues out of office hours are forwarded to an inspector who can decide what action to take.
Plans are in place for increasing digital take-up of the service, ramping up campaigns targeted at specific user groups to tell CQC about their care.
Analytics
The team had spent considerable time analysing the current service using Google Analytics to understand the pain points, the main one being searching for the care provider the user wishes to complain about or give feedback on.
It was clear from the plans the team have and the prototype that the pain points identified have been addressed within the service.
In addition to the mandatory KPIs, the team plan to capture others, including one to flag whether safeguarding issues are being correctly identified.
The team intend to use Google Analytics to focus where improvements can be made in Beta and there is a dedicated person within the team to undertake analysis of the service.
The team have had initial conversations on registering the service for the Performance Platform.
Recommendations
To pass the next assessment, the service team must:
- conduct user research with users of accessibility software to complement the Accessibility Audit planned for Private Beta
- in Private Beta, do further research on how users find the service, including general communications or campaigns; this will help with digital take-up in the Beta phase
- include further research in Beta on the full end-to-end user journey:
- understand what users do before engaging with the service and what actions could be taken first, and how best users can be supported before using the service, for example around understanding that a formal complaint would need to be made directly with the care provider rather than via the service
- understand the needs around, and use of, the confirmation code on the last page, what expectations users have of it, and how often users call to follow up on submitted complaints; also how the confirmation email is used and how next steps are communicated to and understood by users
- emails could be mocked up for usability testing without having to build anything with Notify.
The service team should also:
- continue to progress current plans that are in place to reduce the percentage of contingent labour within the team
- consider replicating the benefits of the planned remote testing in Beta in a lab environment. This will give the added advantage of the team being able to observe users completing tasks and to follow up with questions. It is easier to control recruitment in the lab compared to recruiting through companies like What Users Do, which can result in participants having higher digital skills than the average person and the potential to be ‘professional testers’
- consider sharing learnings on the service design with the wider design community, both for interaction and content design, including feedback from users on the terminology around complaining about a service, alpha banners and other valuable learnings
- continue with plans to test assisted digital support; lab testing with assisted digital users, alongside a contact centre agent playing their part from the observation room, could be considered to enhance this testing further.
Get advice and guidance
The team can get advice and guidance on the next stage of development by:
- searching the Government Service Design Manual
- asking the cross-government Slack community
- contacting the Service Assessment team
Next Steps
To get your service ready to launch on GOV.UK you need to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content

Before arranging your next assessment you’ll need to follow the recommendations made in this report.
Digital Service Standard points
Point | Description | Result |
---|---|---|
1 | Understand user needs | Met |
2 | Do ongoing user research | Met |
3 | Have a multidisciplinary team | Met |
4 | Use agile methods | Met |
5 | Iterate and improve frequently | Met |
6 | Evaluate tools and systems | Met |
7 | Understand security and privacy issues | Met |
8 | Make all new source code open | Met |
9 | Use open standards and common platforms | Met |
10 | Test the end-to-end service | Met |
11 | Make a plan for being offline | Met |
12 | Make sure users succeed first time | Met |
13 | Make the user experience consistent with GOV.UK | Met |
14 | Encourage everyone to use the digital service | Met |
15 | Collect performance data | Met |
16 | Identify performance indicators | Met |
17 | Report performance data on the Performance Platform | Met |
18 | Test with the minister | Met |