NHS 111 - Alpha Assessment

The report from the second alpha assessment of the National Health Service’s NHS 111 service on 28 April 2016.

Stage: Alpha
Result: Met
Service provider: National Health Service (NHS) England

Result of service assessment

The assessment panel has concluded the NHS 111 service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria and should proceed to private beta.

Detail of the assessment

Lead Assessor: Steve Wood

Researching and understanding user needs (points 1, 2)

The team have done a great deal of good work to understand who will need the service, using a variety of research methods such as interviews and surveys. Research has been conducted with a wide range of user groups, including people with language difficulties and accessibility needs; the latter group included people with severe physical and mental health conditions. During research the team developed data-driven personas, including people in chronic pain and people over 65. The team have also formed links with the blind and deaf communities, and with people who have motor neurone disease.

The team demonstrated how they have iterated the service based on research.

The panel was impressed by the team’s work to accommodate transgender people who would not be able to identify with the initial filtering question: “Are you male or female?”. Because the choice made at this stage determines the set of questions asked later, the team created alternative filtering options based on anatomy and the health issues the user may be experiencing.
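The branching described above can be sketched as a simple data structure in which the answer to the filtering question selects the question set that follows. All question and answer text here is hypothetical, for illustration only, and does not reflect the service’s actual clinical content:

```python
# Illustrative sketch: the answer to the opening filtering question
# selects which set of triage questions is asked next.
# All wording here is made up, not the service's real content.

FILTER_QUESTION = "Which of these best describes you?"

# Alternative filtering options based on anatomy and the health issue
# being experienced, rather than a binary male/female question.
QUESTION_SETS = {
    "I have a health problem related to pregnancy": [
        "How many weeks pregnant are you?",
    ],
    "I have abdominal pain": [
        "Where is the pain?",
        "How long have you had it?",
    ],
    "None of these apply to me": [
        "What is your main symptom?",
    ],
}

def next_questions(filter_answer: str) -> list[str]:
    """Return the follow-up question set for a given filtering answer."""
    return QUESTION_SETS.get(filter_answer, QUESTION_SETS["None of these apply to me"])
```

The point of the sketch is that the filter answer is a key into later content, so adding an alternative filtering option only requires adding a new question set, not rewriting the flow.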

It was good to see that the team have also carried out research with dentists, nurses, GPs, and call-handlers on the telephony service.

Although the team explained the many ideas they have about researching with different groups and communities of interest, the research plan itself was not clear and the panel would like to hear more at the next assessment.

There is not, as yet, a dedicated researcher embedded in the team. The service is currently working with two external companies, but is planning to recruit in-house researchers.

It was interesting to learn about the effort made to educate the developers in the benefits of user research. Involving the developers in research sessions (and having them do guerrilla testing themselves) has resulted in the development team demanding to understand the user needs underpinning every story.

The service team (points 3, 4)

The team is very dedicated and skilled, but the panel believes it is too small for a service as important as NHS 111 online, which will eventually become part of the UK’s critical infrastructure. For example, the tech lead has doubled up as the service manager, which is not sustainable. The panel was encouraged to hear that the service is recruiting, and that this includes employing a dedicated service manager and more researchers. The current tech lead will then become the CTO across NHS.UK/Choices/Emergency care, while retaining very close involvement with the NHS 111 online service.

The panel was impressed that there is a clinical lead in the team. This is important because clinicians are major stakeholders, and their involvement is essential to the continual development of the service.

The team is split over a number of sites. This brings its own challenges, but good efforts have been made to overcome them. The technical team is based in Southampton and is in very regular contact with the product and service managers. Webex is used for sprint planning, and the show & tell sessions are recorded. The product manager is on-site for the start and end of each sprint. We would encourage the product manager to be on-site as often as possible as remote working does have limitations when it comes to managing things such as team harmony and effectiveness.

The team appears to be well supported by senior stakeholders, with the service manager reporting directly to the SRO, showing that the decision making process is not overly hierarchical.

Designing and testing the service (points 11, 12, 13, 14, 18)

The service uses NHS branding, and the team should work with other NHS online services to create a style guide and NHS-specific design patterns.

Parts of the service currently use NHS or medical terms. Making directions, questions and outcomes clinically accurate and understandable by everyone is a complex task. We recommend hiring a non-clinical, GDS-trained content designer to aid comprehension.

The team has started iterating the telephone question set into something that works in a digital service. They should continue to investigate how the questions can be enhanced, for example by using images where appropriate.

The team should also consider how the service should behave if users want to change their answers to questions, and if functionality like save & return and personalisation could aid completion and accessibility.

More work needs to be done on the final pages of the service to improve clarity about what the user should do next (especially as this can be region- and time-specific). This includes investigating what personal information a user will need to give, and sending notifications, perhaps by email or text message, about what will happen next.

The Secretary of State has already taken a keen interest in the service.

Technology, security and resilience (points 5, 6, 7, 8, 9, 10)

The technology used so far seems sound. The service uses Azure, having switched from Amazon Web Services. A big concern is what happens should the system go offline while someone is using it. To mitigate this, the service uses Akamai for caching, route optimisation and distributed security. The team has plans in place for penetration testing and is looking at fraud vectors.

Apart from the Pathways (the user journeys defined by clinicians), which are licensed, the service has made everything it uses open source. The Pathways come as .csv files and encode the business logic; they are the same as those used by the call centre teams.

It was good to hear the team is interested in using GOV.UK Notify.

It was also good to learn that the developers have carried out many deployments to the integration environments and that any clinical changes that affect the algorithms underpinning triaging can be implemented almost immediately into the online service.

There was much discussion at the assessment about the team’s use of a graph database, and the panel deemed it an excellent choice.
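As a sketch of why a graph model suits this kind of triage, a pathway can be thought of as a directed graph in which each node is a question or an outcome and each answer is an edge to the next node. This minimal in-memory version uses entirely made-up questions and outcomes to illustrate the shape of the data, not the team’s actual Pathways content:

```python
# Minimal sketch of a triage pathway modelled as a directed graph.
# Nodes are questions or outcomes; edges are labelled with answers.
# All questions and outcomes here are invented for illustration.

PATHWAY = {
    "q_breathing": {
        "question": "Are you struggling to breathe?",
        "answers": {"yes": "out_999", "no": "q_duration"},
    },
    "q_duration": {
        "question": "Have you felt unwell for more than 3 days?",
        "answers": {"yes": "out_gp", "no": "out_selfcare"},
    },
    "out_999": {"outcome": "Call 999 now"},
    "out_gp": {"outcome": "Contact your GP practice"},
    "out_selfcare": {"outcome": "You can look after yourself at home"},
}

def triage(start: str, answers: list[str]) -> str:
    """Walk the pathway graph, following one edge per answer,
    and return the first outcome reached."""
    node = PATHWAY[start]
    for answer in answers:
        node = PATHWAY[node["answers"][answer]]
        if "outcome" in node:
            return node["outcome"]
    raise ValueError("ran out of answers before reaching an outcome")
```

In a graph database the same structure becomes nodes and labelled relationships, which makes it straightforward to update a single question or edge (for example, after a clinical change) without touching the rest of the pathway.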

If the service were taken offline, the current call centre operation would be the fall-back option.

Improving take-up and reporting performance (points 15, 16, 17)

This service is intended to run alongside the NHS 111 call centre for the foreseeable future. IVR (interactive voice response) messages will be used during busy times, offering users the opportunity to get a quicker response by using the online system. A lot of data is available to the team from the current call centre service (and, as noted earlier, the researcher interviewed some of the call handlers). The team has spoken with the GDS Performance Platform team, but as this is an NHS service, that avenue may not be appropriate. There is not yet a performance analyst on the team, but this was discussed at the assessment.

The service uses Webtrends as its analytics tool, due to already having the necessary licences and skills in the team. This will be evaluated during the beta phase.

Recommendations

User research and design:

The understanding of the potential users of the NHS 111 online service is much better than at the previous assessment. While the team outlined some needs around convenience of time, access and privacy, the panel will expect a clearer explanation at the next assessment of how the team captures user needs and feeds them into the backlog.

The panel recommends running at least two design and research teams in parallel, so prototypes can be iterated and researched in every sprint. They could be backed up by multiple development teams to increase the speed of iteration and implement quick changes to the service. This would also allow the backlog to be driven more by research and design decisions than by purely technical stories.

To pass the next assessment the service team must:

  • increase the number of full-time designers, user researchers and content designers so that the service can be iterated quickly

  • recruit a GDS-trained content designer to help comprehension of questions and outcomes

  • have a clear testing plan for browsers on mobiles, tablets and desktops, including progressive enhancement throughout the service

  • be clear on how findings from user research turn into stories and get implemented into the service

  • have a rollout plan for private beta, public beta and beyond, including Assisted Digital provisions and ideas on how to promote channel shift

  • be fully accessibility tested and compliant, and have completed research with a wide range of users with differing situations and needs

The service team should also consider:

  • iterating on the best way to ask useful questions, including looking at using images and potential for other languages

  • working with other NHS digital projects to create an NHS style guide and design patterns

  • recruiting a second complete development team to speed up iteration

Conclusion

It is possible that the discovery wasn’t carried out as recommended in the GDS Service Manual. This was suggested by the service team’s presentation at the assessment, which asked whether people understood the NHS 111 proposition (a survey found that 58% of the population do) and whether they would use the service if it existed. This is not the same as building a proposition around user needs.

That observation aside, we think that the new approach to the service is much better defined from the user research perspective. However, as noted above, we strongly recommend embedding more researchers and service designers in the team. Also, it is a very good idea to use the services of a content designer who does not have any clinical background or experience of working with the NHS. That way the designer will be better placed to see the service through the eyes of the user.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 22 December 2016