Customer View alpha assessment


Service Standard assessment report

Customer View

From: Central Digital & Data Office (CDDO)
Assessment date: 05/07/2023
Stage: Alpha
Result: Met
Service provider: DWP

Service description

Customer View aims to give customers more control over their benefits and State Pension online, allowing them to view and update their information with a single login and giving 24/7 access to complete simple tasks in one place. The minimum viable product (MVP) for Customer View will allow a customer, regardless of the benefit they receive (excluding Universal Credit), to view the personal details that DWP holds about them and to change the bank account their payment is made to.

Service users

This service is for any customer in receipt of a benefit or State Pension (excluding those on Universal Credit and Child Maintenance) who needs to change the bank account their benefits are paid into, or to check the personal information held about them.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • user research is clearly informing the team’s iteration of their designs and their understanding of users
  • the team has a clear and well-articulated set of user needs that they are working towards, and plans to develop an understanding of the different user mindsets
  • the team is effectively combining complementary quantitative and qualitative research methods to build up a good understanding of why users are changing their bank details
  • the team is building up a new pool of potential research participants from actual users recruited through DWP agents
  • the team is making the most of previous research done in this area about users

What the team needs to explore

Before their next assessment, the team needs to:

  • increase their research cadence to make sure they can build up a sufficiently detailed picture of all the different users identified and their needs as they move into beta, and are regularly testing their hypotheses with light-touch prototypes
  • in addition to what the team has already identified to explore next, consider looking at differences or additional factors in the needs of State Pension users, partially ineligible users (that is, where the service applies to some but not all of their benefits), and those acting on behalf of children and others
  • continue to explore and reflect on user groups and needs that they might not yet be researching with, to ensure that the service can work for as many users as possible

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is aware of (and in some cases has overcome) technical and policy constraints to make it easier for the user to complete a task, for example by removing the ‘existing bank details’ screen; however, the team and the department will only be able to do so much without policy reform to simplify the benefits landscape (for example, different requirements across each benefit)
  • the team has refined the scope for their minimum viable product (MVP) to focus on changing bank details and viewing personal details, which carries less risk both for users and for DWP
  • the team is challenging historic practices to provide a shared, cross-benefit view for users who are not getting Universal Credit or Child Maintenance
  • the team is regularly sharing their work with the other teams in the programme, particularly the Colleague View team, and with operational colleagues in the wider department, including through show and tells

What the team needs to explore

Before their next assessment, the team needs to:

  • understand and test the journey for users who get Universal Credit or Child Maintenance as well as other benefits, to make sure the user journey is not broken where digital services for Universal Credit and Child Maintenance already exist
  • work closely with both the GOV.UK content team in DWP and GDS during private beta and get agreement on how users will access the service from GOV.UK guidance, considering content for each benefit, State Pension and any overall ‘change of circumstances’ guidance
  • work with the online identity verification (OIDV) team to clarify in the journey (as part of the OIDV-owned service and/or this service) where users might already have an account from another DWP service, for example telling users that if they’ve used another service, they’ll also have an account for this one
  • develop and test unhappy or partly unhappy journeys, and what the next actions are, for users who are not eligible to use the service or a specific feature in it, who select ‘not sure’, or who are getting 2 or more benefits including Universal Credit or Child Maintenance
  • do user research and, if necessary, add signposting for users viewing their personal details about how they can change them

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is building on current processes for users phoning DWP, for example by including bank details verification in the MVP
  • the team is aware from user research of users’ perceptions and feelings about using different channels, and acknowledges that some users will continue to prefer phoning rather than using a digital service
  • the team is collaborating with colleagues across the department including operational teams and the assisted digital team

What the team needs to explore

Before their next assessment, the team needs to:

  • as planned, and as seen in user research, explore using notifications after a user has submitted a change so users are reassured that DWP has received their change, within any fraud and security constraints
  • explore any changes to telephone scripts or guidance that agents answering calls will need about the new service
  • explore whether there is potential value in increasing awareness of the service by referring users to it at the existing ‘decision’ or ‘payment’ stages of the current offline journey

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team is using GOV.UK Design System components and patterns, for example the established checkboxes component and the ‘Check a service is suitable’ pattern to make sure users are eligible to use the service
  • the team has designed and user tested different components for different parts of the service, for example tiles or lists on the homepage, to make sure users succeed first time and are clear on the actions they can take
  • the team has iterated the content and design after rounds of user research and has a plan to continue to test and iterate into private beta

What the team needs to explore

Before their next assessment, the team needs to:

  • review that the content in the service is consistent with the GOV.UK style guide
  • explore how to set user expectations upfront about what they’ll need if they do not already have an account and need to complete identity verification, for example a passport and other information, to reduce user anxiety

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is testing the service with users with access needs as part of the recent rounds of user testing
  • the team is designing and building prototypes using accessible design components, has used visually hidden content for additional context, and has considered how to structure and group content and features logically based on user insights

What the team needs to explore

Before their next assessment, the team needs to:

  • complete a full accessibility audit of the end-to-end service journey
  • research with a wider set of users, including those with a range of accessibility needs and particularly those at lower levels on the digital inclusion scale, considering the significant number of users who will be eligible for this service
  • continue to research with users who may stress the team’s designs, such as those acting on behalf of others or their children
  • consider exploring the Universal Barriers framework to see if it can help

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has all the roles needed in the alpha phase, with permanent staff in leadership roles
  • the team that worked in alpha will continue working in the beta phase
  • there is an agreed budget for the next phase, including for user research and to onboard two developers, who will be supported by the core development team

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how to increase the ratio of civil servants to contractors and have plans in place to ensure knowledge transfer
  • consider whether a service owner role is required, for example to ensure policy and digital can work together to remove silos between separate benefits

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used agile ways of working and held the necessary ceremonies, including well-attended show and tells
  • there was good collaboration to solve challenges found in user research, for example a technical solution was found to avoid asking users for their previous bank details

What the team needs to explore

Before their next assessment, the team needs to:

  • remove impediments to performing user research more frequently, for example by trying different ways of recruiting users

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to iterate areas that created confusion for users because of technological constraints
  • the team looked at existing services, such as the personal tax account, and iterated the home page to ensure that users could easily understand how to get to the information they needed

What the team needs to explore

Before their next assessment, the team needs to:

  • increase the regularity with which the team is testing assumptions and designs in user research sessions
  • iterate risky areas identified in user research, for example letting users know through different channels when their changes have been completed
  • improve journeys where users will need to use different services to update different benefits
  • explore how to improve the user journey when the service redirects users to another service such as OIDV, for example by making it easy to understand when users won’t need to register

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is making use of an existing platform, currently used in production, which is built and maintained in line with best practices
  • DWP has a comprehensive security sign-off process, including analysis of data and penetration testing
  • authentication and authorisation are controlled by the OIDV process, which is proven in production

What the team needs to explore

Before their next assessment, the team needs to:

  • although it’s unlikely to cause issues, consider how onboarding to One Login for Government could impact users’ access to data, and ensure that equivalent authorisation rules are applied to One Login credentials

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team worked with performance analysts to develop theoretical metrics based on users and their goals
  • the team has procured support for the beta phase from performance analysts to further develop their performance framework

What the team needs to explore

Before their next assessment, the team needs to:

  • use the theoretical metrics the team has developed to define what good looks like and to set benchmarks and goals for their KPIs
  • specify the metrics that will be used to report on cost per transaction (a conventional definition is noted after this list)
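
For reference, and as a conventional definition rather than anything specified in the assessment, cost per transaction is usually calculated as the total cost of providing the service divided by the number of completed transactions; the exact cost base and transaction count are for the team to agree with their performance analysts:

  \text{cost per transaction} = \frac{\text{total cost of providing the service}}{\text{number of completed transactions}}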

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using a tried-and-tested technology stack, with an array of widely used tools, making support simple
  • the delivery stack that the team has chosen is modern and allows for fast iteration with a heavy reliance on automated testing

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • there is an automated process to push code to a public GitHub repository (https://github.com/dwp) as part of the continuous integration (CI) process (an illustrative sketch of this kind of step follows this list)
  • the team is following the standard DWP open-sourcing process to manage any risks with this
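
The report does not include the team’s pipeline configuration, so the following is a purely illustrative sketch of the kind of automated open-sourcing step described above, written in TypeScript. The remote URL, script name and PUBLIC_REMOTE environment variable are placeholders rather than details from the assessment; it assumes git is available in the pipeline and that internal checks (tests, secret scanning, the DWP open-sourcing review) have already passed.

    // mirror-to-public.ts - illustrative sketch only, not the team's actual CI step
    import { execSync } from "node:child_process";

    // Placeholder public mirror; real DWP repositories live under https://github.com/dwp
    const publicRemote = process.env.PUBLIC_REMOTE ?? "git@github.com:dwp/example-repo.git";

    // Run a shell command and stream its output into the CI log
    function run(command: string): void {
      console.log(`$ ${command}`);
      execSync(command, { stdio: "inherit" });
    }

    // Push the current commit (and any tags) to the public mirror
    run(`git push ${publicRemote} HEAD:main --follow-tags`);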

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is reusing a number of DWP components, such as the CASA framework and the DWP HCS platform
  • the team makes use of standard GOV.UK Design System components as part of the CASA framework

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team makes use of a standard high-availability platform that has already been used in production
  • the team uses a suite of monitoring tools as part of this platform to ensure the service runs effectively
  • the team has identified other monitoring tools, such as Dynatrace, that may be useful in the future

Next steps

This service can now move into a private beta phase, subject to implementing the recommendations outlined in this report and getting approval from the GDS spend control team. The service must pass a beta assessment before launching its public beta.

The panel recommends this service sits a beta assessment. Please speak to your Digital Engagement Manager to arrange it as soon as possible.

To get the service ready to launch on GOV.UK the team needs to:
