Manage my adult social care workforce data alpha assessment

The report from the alpha assessment for Skills for Care's Manage my adult social care workforce data (NMDS-SC) service on 30 August 2018

From: Central Digital and Data Office
Assessment date: 30 August 2018
Stage: Alpha
Result: Met
Service provider: Skills for Care (DHSC Delivery Partner)

The service met the Standard because:

  • the service team showed how they were working in an agile, iterative manner (and also how they were showing the rest of their organisation how this worked)
  • the service team were able to evidence multiple different approaches that had been trialled during the alpha phase
  • the three core user journeys - create an account, add and edit an establishment and add and edit a worker record - were well understood

The panel were impressed by the way in which the service team had adapted to working in line with the service standard. In particular, we were impressed by the way in which the team had ensured that Alpha was about building prototypes to test, rather than the creation of production-ready code. The use of field user research at care locations rather than in labs, and the sharing of this through a video link, was also impressive.

Although the panel have awarded the service an overall Met at Alpha, there are some individual points of the standard that were Not Met. In particular, the panel was concerned about:

  • the plans for the Private Beta phase to be a trial service, with no actual data input or genuine use
  • the bulk upload option, where the options tested in Alpha had not resulted in a definite plan for functionality to be taken forward
  • the provision of accessibility for users with assisted digital needs
  • the architectural approach, and - in particular - the approach to security and fraud

The panel has included mandatory recommendations on these areas below. The service team’s evident motivation and commitment to the service standard gives the panel confidence that these issues can be resolved as part of the work in the next phase.

About the service

Description

The NMDS-SC is an online data collection service managed by Skills for Care on behalf of the Department of Health and Social Care.

Service users

The users of the service are organisations that provide, commission or organise adult social care.

Detail

User needs

The team have worked hard to understand the needs of their data-inputting users in terms of the information they need to provide to the system. The team articulated that their service needs to be easy to use because use is voluntary, and that there are funding incentives that drive people to use the service. They highlighted that these needs were kept in mind when designing the service. The panel recommends the team formally articulate user needs at a higher level to ensure that the wider team are aware of them. This should help make sure that user wants are not confused with user needs, and that lower-level processes can be easily understood within the context of the high-level user needs.

The panel would remind the service team of the importance of involving all user groups in their exploratory research. Lots of research has been done with existing users of the current service. The team are aware that there is a group of non-users that is not represented within their work, and they were not clear how they would engage with this large and important user group moving forward. Nor had the team involved users with accessibility needs during Alpha. Only including users who are familiar with the current process risks skewing results and unduly influencing emerging prototypes. The panel felt that the existing network of engagement managers could play a role in identifying non-users.

The service team were less clear on their plans for beta research. It was good to hear that the team have ongoing plans for recruitment and are potentially going to reach out to current non-users. Once a private beta approach has been outlined by the wider team, it is strongly recommended that a detailed beta research plan is developed, covering a range of user groups and methodologies. Accessibility should also be considered; the team are encouraged to engage with accessibility specialists within their own organisation or at GDS to ensure that the service takes these needs into account and succeeds for all users.

The User Researchers are doing an excellent job of involving the wider team in research. Most of the team have observed research using streaming tools, which is a great, unobtrusive way to ensure involvement while ensuring that users are not overwhelmed. There are regular feedback sessions involving the full team, and the findings feed into sprint planning and prioritisation, showing that the team value and use the insight gained. It was very clear that the Delivery Manager, in particular, was well versed in the user research that had been conducted.

Team

The team demonstrated that they were working together in an agile, iterative way. They were managing the challenges of remote working by having set days where everyone was present in one location, and were using a number of tools (e.g. appear.in and a Trello feedback board) to manage this at other times. Agile ceremonies - including stand-ups, retros, show and tells and planning - happen regularly, involve the entire team and inform the direction of work.

Whilst there are two service owners, both were involved in the work of the team and they were able to show good evidence of working together. There were clearly understood governance approaches and risks were being actively managed through regular contact with the audit and risk committee.

The team evidenced skills transfer from contract to permanent staff and the plans for the next phase included more development resource. The panel’s main concern was the availability of interaction design resource. Whilst the team had been able to access some interaction design advice at Alpha, the panel’s opinion is that the team will need substantially more interaction design input in the next phase.

Technology

The team is using GitHub for source control, building RESTful APIs, and managing the backlog and requirements with Jira and Trello boards, working in two-week sprints.
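
As an illustration of the RESTful style described above, a minimal route in Node.js using Express might look like the sketch below. The route path, field names and stubbed response are assumptions for illustration, not the team’s actual API.

    const express = require('express');
    const app = express();

    // Hypothetical read endpoint for a single worker record.
    app.get('/api/establishments/:establishmentId/workers/:workerId', (req, res) => {
      const { establishmentId, workerId } = req.params;
      // The real service would query a database here; a stub is returned instead.
      res.json({ establishmentId, workerId, jobRole: 'Care worker' });
    });

    app.listen(3000, () => console.log('API listening on port 3000'));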

The service will be hosted on GOV.UK PaaS, with bulk upload files stored in an AWS S3 bucket.
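
A minimal sketch of storing an uploaded file in S3 using the AWS SDK for JavaScript (v2) is shown below. The bucket name, key scheme and region are hypothetical, not the team’s actual configuration.

    const AWS = require('aws-sdk');
    const s3 = new AWS.S3({ region: 'eu-west-2' });

    async function storeBulkUpload(fileName, fileBuffer) {
      // Hypothetical bucket and key naming; the object is encrypted at rest.
      const result = await s3.upload({
        Bucket: 'nmds-bulk-uploads',
        Key: `incoming/${Date.now()}-${fileName}`,
        Body: fileBuffer,
        ServerSideEncryption: 'AES256',
      }).promise();
      console.log('Stored at', result.Location);
    }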

The service is built with Node.js and a JavaScript frontend, and will use PostgreSQL to replace the current Oracle database. The team is also using Docker images and a microservices approach.
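
A minimal sketch of querying the replacement PostgreSQL database from Node.js with the pg driver is shown below. The connection settings, table and column names are assumptions for illustration.

    const { Pool } = require('pg');

    // Connection details come from the environment; the database name is hypothetical.
    const pool = new Pool({
      host: process.env.PGHOST,
      database: 'nmds',
      user: process.env.PGUSER,
      password: process.env.PGPASSWORD,
    });

    async function getWorkersForEstablishment(establishmentId) {
      // Parameterised query avoids SQL injection.
      const { rows } = await pool.query(
        'SELECT id, name, job_role FROM workers WHERE establishment_id = $1',
        [establishmentId]
      );
      return rows;
    }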

The service uses the common platform GOV.UK PaaS. The team are proposing to use a DMZ along with an internet gateway and an encrypted database. The current encryption is a 128-bit SSL cryptographic protocol (industry standard). The service team stated that the new service will use the same encryption as the existing service, and that the data will be migrated across under the same encryption standards.

The panel recommends that the service team do more work on security and fraud protection, for example around CSRF and CORS.
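
A sketch of how these protections are commonly added to an Express application is shown below. The csurf and cors middleware packages and the allowed origin are illustrative assumptions, not confirmed choices of the service team.

    const express = require('express');
    const cookieParser = require('cookie-parser');
    const csrf = require('csurf'); // CSRF token middleware (illustrative choice)
    const cors = require('cors');  // CORS middleware (illustrative choice)

    const app = express();
    app.use(cookieParser());

    // Only allow cross-origin requests from the service's own frontend (hypothetical origin).
    app.use(cors({ origin: 'https://nmds.example.gov.uk' }));

    // Require a valid CSRF token on state-changing requests.
    app.use(csrf({ cookie: true }));

    app.get('/form', (req, res) => {
      // The token would be embedded in the form and verified on submission.
      res.json({ csrfToken: req.csrfToken() });
    });

    app.listen(3000);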

The team is planning unit and integration testing, and proposes functional testing using Selenium alongside load testing with JMeter. The team is using a staging environment and also has plans for non-functional requirement (NFR) testing and penetration testing. The use of GOV.UK PaaS means that the new service will be able to provide the same levels of service as the current service’s SLAs.
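
For example, a functional test of one of the core journeys could be scripted in Node.js with selenium-webdriver along the following lines. The URL and expected page content are placeholders, not the team’s actual staging environment.

    const { Builder, By, until } = require('selenium-webdriver');

    (async function testCreateAccountPage() {
      const driver = await new Builder().forBrowser('chrome').build();
      try {
        // Hypothetical staging URL for the create-account journey.
        await driver.get('https://staging.example.com/create-account');
        await driver.wait(until.elementLocated(By.css('h1')), 5000);
        const heading = await driver.findElement(By.css('h1')).getText();
        console.log('Page heading:', heading);
      } finally {
        await driver.quit();
      }
    })();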

The panel recommends that the service team work on testing for accessibility and assistive technology, and be prepared to show evidence of this at a future assessment.

Design

The team has prototyped and tested a variety of solutions to the problems they found in discovery. Observations from user research influenced changes to later prototypes, and the panel was impressed by how the team had documented their user research findings clearly alongside their prototypes.

However, it seems that the team has not focussed enough on the most challenging aspects of the service, particularly the ‘bulk upload’ flow. Although potentially used by fewer people, the majority of data submitted to the service will come via this route, so the quality of the resulting dataset depends on it. The team has done tech spikes into how to make bulk uploads easier, but showed little evidence of prototyping based on user research. It’s a risk to the service’s success if this user journey isn’t as smooth as possible. To proceed to public beta, the team will need to demonstrate they have made significant improvements to this journey to make sure that users succeed first time.
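
As a sketch of the kind of first-time validation feedback that could make this journey smoother, the Node.js function below checks an uploaded CSV for basic structural problems before the file is accepted. The required column names are assumptions for illustration; the actual NMDS-SC file format will differ.

    // Hypothetical required columns for a bulk worker-record upload.
    const REQUIRED_COLUMNS = ['establishment_id', 'worker_id', 'job_role'];

    function validateCsv(text) {
      const [headerLine, ...rows] = text.trim().split('\n');
      // Naive split; a production parser must also handle quoted fields.
      const headers = headerLine.split(',').map((h) => h.trim());
      const errors = [];

      for (const column of REQUIRED_COLUMNS) {
        if (!headers.includes(column)) errors.push(`Missing required column: ${column}`);
      }

      rows.forEach((row, i) => {
        const cells = row.split(',');
        if (cells.length !== headers.length) {
          errors.push(`Line ${i + 2}: expected ${headers.length} values, got ${cells.length}`);
        }
      });

      return errors; // an empty array means the file passed these basic checks
    }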

The point ‘make the service consistent with GOV.UK’ has been partly overlooked for now, as the service has applied for a GOV.UK exemption. Even if the service adapts the branding, the panel recommends that it otherwise adhere to GOV.UK design patterns.

The team’s research indicated that the vast majority of users will use the service on desktop, and as such they planned to design only for this screen size. However, it must be stressed that some users need to magnify their screen in order to use online services, and analytics do not pick up this behaviour. The service needs to work for all users. To pass the standard in beta, the service will need to be fully responsive and work at different screen sizes.

The service team has access to input from an interaction designer, though it was not clear how much time that designer could commit to the team. The team would benefit greatly from a full-time interaction designer and must add one to get the most out of the beta phase.

Analytics

The service team had plans to implement Google Analytics for the Private Beta. They had considered the data from the existing NMDS service in drawing up 11 KPIs to measure and report on. A satisfaction survey would be used to measure completion rate and satisfaction scores. This would be accessible from each page, and prompted at the end of the user flow.

Given the likely low percentage of users completing a survey, the panel encouraged the use of additional data where possible in reporting against these KPIs. It may also be possible to consider segmenting some KPI results by user type (e.g. local authority/others or bulk upload/online entry may show different patterns of use of support channels).
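
One way to support that segmentation in Google Analytics is to record the user type as a custom dimension, sketched below with the analytics.js API. The property ID, dimension index and value names are placeholders.

    // Assumes the standard analytics.js snippet has loaded and defined `ga`,
    // and that custom dimension 1 has been registered as "user type" in the
    // Google Analytics admin console (both are assumptions).
    ga('create', 'UA-XXXXXXX-Y', 'auto');   // placeholder property ID
    ga('set', 'dimension1', 'bulk-upload'); // e.g. 'bulk-upload' vs 'online-entry'
    ga('send', 'pageview');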

As the service will not be located on GOV.UK, it will not be included on the Performance Platform but the service team will consider publishing this data openly elsewhere.

Recommendations

To pass the next assessment, the service team must:

  • create as soon as possible, and share with GDS, a detailed plan for the private beta phase of this service. This plan should detail how many users will pass through the private beta before the service team return for a beta assessment. In order to ensure that the private beta provides actionable data, this plan should also outline how at least some users will be allowed to enter real data into the beta system without having to additionally re-enter this data into the existing system
  • determine the approach to the bulk upload option to the service. It’s unlikely that a fully automated validation approach can be delivered within the timescale, so it’s important that the team have an agreed approach that can be developed and tested as part of the Private Beta
  • ensure that the user research plan for Private Beta includes qualitative as well as quantitative work. It should also include a focus on research with non-users/users who haven’t previously used the NMDS service and those users identified during Alpha as having Assisted Digital needs (particularly users with dyslexia and users who score low on the digital inclusion scale)
  • prototype and test the approach to the start of the journey into the service, including deciding on whether this lives at the existing Skills for Care site, the existing NMDS service URL or a new URL
  • assign an interaction designer to be an integral part of the team in the next phase of work
  • make the service accessible and use responsive design to ensure that it works with JavaScript turned off and regardless of the device being used to access it (and show the testing approaches used to ensure both accessibility and device agnosticism)
  • ensure that all elements of the service are secure enough to handle the sensitive personal data it contains by developing a robust approach to security architecture and testing
  • amend the approach to the measurement of the mandatory KPIs so that they are not solely dependent on users filling out survey data (for example, by measuring the number of broken journeys that are not restarted within a set time period)

The service team should also:

  • work on the longer-term plans for migration of users of the existing service into the future version of this service, avoiding a “big bang” approach where all users are suddenly moved to a new version of the service. This work should include the creation of a detailed communication plan for communicating these changes to users
  • ensure that developers and testers are a fully integrated part of the wider service team; and that testing is considered part of the definition of done rather than a separate process
  • consider options for publicly reporting on the KPIs they have chosen. (If the service is not included on GOV.UK, then the performance platform is not an option but the service team could, for example, publish details through one of their own online dashboards)

Next Steps

Before arranging your next assessment you’ll need to follow the recommendations made in this report.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Not Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 23 January 2019