Your NHS Pension - Alpha Assessment

The report from the alpha assessment of the NHS Business Services Authority's NHS Pensions service on 23 September 2016.

Stage Alpha
Result Met
Service provider NHS Business Services Authority

The service met the Standard because:

  • The team are regularly conducting user research (once per week or twice every sprint) with a good coverage of users during alpha. They have identified user personas and user needs based on this research. The team are also approaching research as a team sport with everyone in the team participating in research sessions and the subsequent analysis. It wasn’t just the researcher and designer who were able to talk about user needs at the assessment.

  • They are a full multidisciplinary team working in an agile way to meet user needs. For example, they were able to show evidence of iterating the design of the alpha based on the needs of users. The team was also able to explain how they have iterated their processes regularly during the alpha.

  • The team demonstrated a sensible approach to changing technology, re-using the front-end prototyping tools which emerged from the NHS alpha, and building and discarding prototypes as they learnt more about the needs of their users. The panel was also impressed that the team includes developers who had worked on the back-office service, and has established a close working relationship with the supplier to quickly make changes needed by the team.

About the service

Service Manager: Darren Curry

The NHS Pension Scheme is a member and employer contribution-based scheme with 2.8 million members (1.4m active, 0.5m deferred and 0.75m pensioners). This service aims to give users visibility, control and understanding of their NHS pension, and to promote accuracy and ownership by empowering members to make more informed decisions about their pension.

Detail of the assessment

Lead Assessor: Mark McLeod

User needs

The team are regularly conducting user research (once per week or twice every sprint) with a good coverage of users during the 6-week alpha. They have conducted research with 16 deferred members, 34 members, 9 employers and 7 pensioners (66 users in total). They have identified user personas and user needs based on this research.

The team were able to talk eloquently about users and some of the challenges they currently face with what is a complex paper-based system. They are asking the right questions and looking into existing pain points for users.

The team are also approaching research as a team sport with everyone in the team participating in research sessions and the subsequent analysis. It wasn’t just the researcher and designer who were able to talk about user needs at the assessment.

The team are using the digital inclusion scale to plot the digital skills of users and have attempted to recruit users across age ranges and salary bands. However, it doesn’t look like they are currently including many users with low digital skills in their research. There needs to be much more focus on understanding the current digital skills of staff and employers, so that their needs can be fully understood. It doesn’t necessarily follow that people who carry out administrative tasks on a daily basis at work (using spreadsheets and email as their main tools) have high digital skills.

No research has yet been carried out with users who have access needs. This means that the needs of these users have not been explored or understood.

Team

The service has had a co-located multidisciplinary team in place during the alpha and has identified the roles it will add to the team beyond alpha: a QA, more developers and a content designer. The team is made up of both civil servants and staff from a supplier, and they are consciously sharing knowledge within the team by pairing to ensure the service is sustainable in the longer term.

The team are working in an agile way to focus on delivering an outcome for the user and are following agile methods including hypothesis-driven development, Scrum, Lean UX, user-centred design and test-driven development. The team were able to give examples of how they have iterated the design of the alpha to better meet the needs of users, and how they have iterated their processes.

Technology

The team demonstrated a sensible approach to changing technology, re-using the front-end prototyping tools which emerged from the NHS alpha, building and discarding prototypes as they learnt more about the needs of their users.

The panel was also impressed by how the team includes developers who had worked on the back-office service, and has established a close working relationship with the supplier to quickly make changes needed by the team.

These are good approaches for an alpha with a narrowly-defined scope, but there were examples where constraints of the back-end were apparent to users, for example the structure of people’s names, their title, gender, etc. These constraints will become limitations if a beta service needs to accept changes of circumstance and provide calculations more quickly, rather than requiring the user to make multiple visits to the service.

A beta assessment panel will expect to learn more about the team’s ability to make fundamental changes to the back-office technology or supplier if necessary, in particular to respond more quickly to the needs of staff working in the back office and to cut costs.

The team said that they would be making their code open source. The assessment panel would encourage them to do this as soon as possible so that any other departments building similar services can reuse their code. It’s easier to work in the open from the start, identifying what should be kept private, than to work in a closed way and later identify what can be made open.

Design

The team described lots of compelling user needs and business needs that their service aims to address, with at least three groups of users: existing members of the NHS pension scheme, NHS employees who are interested in becoming members of the scheme, and admin users who need to update relevant employment information. The team showed the designs they had explored in three different prototype journeys which they plan to take into beta.

Although the prototype journeys were of a high standard, it wasn’t clear what user needs were being solved by each one. This made it difficult to assess if the journeys are simple and intuitive enough that users can succeed first time, because it’s not clear what ‘success’ actually means for the user. It also makes it hard to understand how the journeys will reduce the demand on the current paper processes.

The panel would like the team to more tightly define the proposition and scope for each journey, being explicit about who the users are and what need the journey solves. And although the team were correct to start with researching user needs rather than the current paper process, it is also important to know what paper forms the service will be replacing if they want to hit their ambitious digital take-up targets.

Analytics

The service team did not have a dedicated performance analyst in the team during alpha. However, they were able to talk through data from the current pension scheme that was relevant for the alpha. They plan to recruit a performance analyst for the private beta. They also explained that they will be using Google Analytics to track events and to set up a conversion funnel for the service. They intend to report the 4 KPIs on the Performance Platform and have started the registration process.

Recommendations

To pass the next assessment, the service team must:

  • Open source the code as soon as possible.

  • Include users with accessibility needs in each round of user research and usability testing. They should also get a better understanding of the needs of assisted digital users and start to test the support they will provide.

  • More tightly define the proposition and scope for each journey, being explicit about who the users are and what need the journey solves.

  • Have a dedicated performance analyst in the team. Currently, the user experience designer in the team is covering the responsibilities of a performance analyst. As the team moves beyond alpha and the service begins to use real data, it’s important that there is a dedicated performance analyst in the team. They should be responsible for identifying actionable data insights, adding appropriate tracking to the code, creating a conversion funnel, and helping to inform the direction of the team’s user research efforts.

  • Improve their ability to make fundamental changes to the back-office technology or supplier if necessary, in particular to respond more quickly to the needs of staff working in the back office and to cut costs. The team should have a long-term plan and commitment to transform the full end-to-end service, including both the front end and the back end.

The service team should also:

  • Continue to keep the entire team engaged in the research and design process as they move into beta. It was great to see this at the alpha assessment and they should keep this up during the beta.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 13 January 2017