Claim Child Benefits beta assessment

Service Standard assessment report

Claim Child Benefits

From: Central Digital & Data Office (CDDO)
Assessment date: 03/05/2023
Stage: Beta
Result: Met
Service provider: HMRC

Previous assessment reports

  • Alpha assessment review

Service description

HMRC intends to offer a fully digital process to claim Child Benefit. This will be delivered through an iterative roll-out of digital functionality, moving from a purely paper-based process to a full end-to-end digital service for as many users as possible.

This beta assessment is for the transition to that fully digital, end-to-end journey from a print and post experience.

The Claim Child Benefit service consists of a single digital front-end journey available to all claimants. At the end of this journey, claimants will either be able to submit digitally at the click of a button or will be asked to print off and post the PDF output to HMRC, potentially with supporting documentation. Which route they take (illustrated in the sketch after this list) depends on:

  • whether users log in with Government Gateway
  • the answers they give throughout the journey (for example, whether they need to send in supporting documents, or whether HMRC currently needs a fully paper or human-handled process for certain claim types)
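
As a purely illustrative sketch (not the team's actual code - the type names, rules and helper below are all hypothetical), the routing decision described above can be thought of as a small function of authentication status and claim answers:

    // Hypothetical sketch of the end-of-journey routing decision described
    // above; names and rules are illustrative, not the team's actual code.
    sealed trait SubmissionRoute
    case object DigitalSubmission extends SubmissionRoute // submit at the click of a button
    case object PrintAndPost extends SubmissionRoute      // print the PDF output and post it to HMRC

    final case class ClaimAnswers(
      needsSupportingDocuments: Boolean, // e.g. documents that must be posted in
      requiresPaperProcessing: Boolean   // claim types HMRC still handles on paper
    )

    def routeForClaim(signedInWithGovernmentGateway: Boolean, answers: ClaimAnswers): SubmissionRoute =
      if (signedInWithGovernmentGateway && !answers.needsSupportingDocuments && !answers.requiresPaperProcessing)
        DigitalSubmission
      else
        PrintAndPost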

This digital change is part of a managed digital transformation of one of HMRC's core services, as part of the wider Single Customer Account programme. Child Benefit claims have historically been 0% digital, and a user can wait 16 weeks to have their claim processed. The iterative digital transformation roadmap will allow an increasing proportion of claimants to conduct the whole process in a digital channel and have their claim processed and paid within 3 days.

Service users

This service is for people who are responsible for caring for a child that is under 16, or under 20 and in full-time, non-advanced education (for example, studying past 16 in a school or college).

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • There was good documentation of user pain points and how these were relayed to the team to enable design iterations
  • There was good awareness of users' journeys into the service, including knowledge of search terms and sources of entry
  • Good use was made of unmoderated research to identify problems and pain points while the service is in private beta
  • Clear next steps were identified, including how analytics will be used to continue to develop understanding of user needs and pain points

What the team needs to explore

Before the next assessment, the team needs to:

  • Use performance metrics and online feedback to enhance understanding of the user journey, including identifying any pain points or user needs which may not have been picked up via qualitative research
  • Map unhappy journeys to understand the needs of users who are unlikely to be able to complete the online service
  • Map the user journeys for non-standard use cases, such as rival claims and instances of domestic abuse, to ensure the user needs and pain points for these use cases are fully understood
  • Continue to review and develop user need statements to reflect any emerging differences between different user groups, using ongoing user research and metrics/analytics

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have a good understanding of the happy path of the end-to-end journey and have made design iterations to address user pain points
  • There is clear evidence from research and analytics showing users are able to successfully complete their journey for the main use cases
  • They have worked collaboratively with GOV.UK content designers to improve starting points within guidance
  • They have an understanding of non-straightforward claims, particularly those that result in a hybrid online-offline journey
  • There are clear support channels available throughout the service, with user feedback considered on a weekly basis from both a front-end and an operational perspective. This has led to improvements in journeys - for example, the team knew that people were calling to add a second child to their claim because the guidance wasn’t clear, and have since changed the guidance to make it clearer that they can use this service to do this

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure they have sufficiently explored the needs in the ‘unhappy’ paths and designed for use cases such as the ‘Save & Return’ journey
  • Gain a greater understanding of the experience of users who have hybrid online-offline journeys

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • The team worked with caseworkers and operations team members to help understand, co-design and iterate the end-to-end user experience
  • They mapped out the full end-to-end service, including online and offline routes
  • They mapped out and understood the various entry points into the service on GOV.UK, and worked collaboratively to agree a plan for updating existing guidance to ensure a consistent experience for users
  • They created clear exit points from the service to follow-on tasks a user may need to do, for example into Self Assessment

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure that the print and post user journey fits with the rest of the online journey and is iterated as much as possible to ensure successful outcomes
  • Be confident that the new digital claims journey fits well within the wider ‘Child Benefits’ programme and that users can move gracefully between services

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • The service has been designed using GOV.UK patterns and styles
  • The team have made significant iterations to the service based on usability testing, and it was great to see examples where particularly complex parts (for example, ‘where do you usually live’) have been researched with users and iterated to improve content and journeys and reduce cognitive load
  • Data is being used to inform design decisions
  • An accessibility audit has been carried out, and there is one issue remaining for the team to address

What the team needs to explore

Before their next assessment, the team needs to:

  • Consider how to provide clearer confirmation to users about their claim and next steps, such as through a confirmation email or a reference number that can be checked against. It was mentioned that some users submitted multiple claims without realising they had already made one
  • Conduct usability testing with users with access needs. While the service uses existing patterns and has had an accessibility audit, it is important that usability testing with these users takes place
  • Explore what language and design is used in other government services, particularly services which aim to capture similar information or serve similar users (such as other benefit or financial support claim services), to improve consistency and familiarity for users across different government channels

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • The team had engaged with the Disability Inclusion and Accessibility Standards team to assess WCAG compliance
  • There is a clear plan for ongoing UR with users with access needs
  • Users with other needs, such as those who speak English as an additional language (EAL), have been included in the UR at private beta

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure a round of research has been conducted with users with access needs, including those who use assistive tech and those who don’t, so that the team is not only relying on the outcome of an accessibility audit.
  • Capture the digital literacy of users in future rounds of UR to demonstrate that the service is accessible to users with a range of digital skills
  • Continue to do UR with users who are at most risk of being excluded from the service, including EAL users, users with low digital literacy, digitally excluded users and users with access needs. This will ensure that the service can be used and accessed by everyone
  • Use metrics/analytics to identify users who drop out or do not complete their journey, and explore the reasons for non-completion. This will enable the team to identify any issues with inclusion or accessibility

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • The team is currently well set up and understands their users and the wider context they operate in.
  • There is a lot of collaboration with other teams, including operations and policy colleagues, the wider Single Customer Account (SCA) programme, as well as HMRC’s ‘Model Office’.
  • The Service Owner and Product Manager in particular engage actively with their senior stakeholders and have managed to push back on unrealistic expectations or asks that are not driven by user needs or business rationale.
  • It’s obvious that the team works well together across different disciplines and they share a lot of knowledge such as between user research, performance analysis and service design.

What the team needs to explore

Before their next assessment, the team needs to:

  • Come up with a more robust, written plan for knowledge-sharing and handing over the service to the wider Child Benefits team as part of the SCA programme, and be given the space and time to implement this handover plan. There is pressure for the work to be handed over as soon as possible, which poses a significant risk to the service improvement backlog. There is a willingness to do a detailed handover over a number of workshops as well as through written documentation; however, all that work is still in flux and subject to change due to external pressures.
  • Ensure that the new team which will own this service is well set up to not just support but also iterate and improve the Claims process. At the moment, this team, which is part of the SCA ‘Child Benefit’ squad, has their own long backlog of work. It’s not clear whether the new team that will own the product will be sufficiently resourced to ensure that all the items on the longer backlog of the service will get worked on.
  • Achieve a more sustainable pace of delivery. To meet the deadlines the team has worked really hard, including long hours, and there is a risk of burnout or diminished quality of work. This should be recognised by senior management, and provisions should be made to ensure that this does not happen in the future.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • The team uses Agile tools and techniques, and there is a lot of cross-discipline collaboration.
  • There’s a strong commitment to iterative development and releasing value to users and the business as early as possible.
  • The team is trying to champion agile ways of working to their senior stakeholders and other colleagues who may not appreciate the benefits of delivering in this way. They’ve collaborated with comms colleagues and senior leaders to derisk delivery and ensure that the platform is not launched without essential features.

What the team needs to explore

Before their next assessment, the team needs to:

  • Address the significant pressures from external stakeholders to deliver in a fairly ‘big bang’ manner by a certain deadline, which goes against the principles of sustainable agile delivery.
  • Ensure that the new team that takes over the service will continue to work in agile.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • There have been a lot of iterative improvements to the service, especially for the main use cases / happy path.
  • The team releases features and improvements at pace, although it wasn’t obvious if the team have had a chance to retest some of the improvements they release.

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure that once the service is in public beta it continues to be iterated on. There is a risk that senior stakeholders perceive the service as ‘done’ once the main user journeys and ‘MVP’ marked items are delivered, and there is little resourcing available to continuously improve the product.
  • Ensure that the pace of iteration is more sustainable (as listed in point 6). The assessment panel felt that often an issue will get iterated based on user findings but then there is little opportunity to retest in a qualitative manner and ensure the changes have resulted in improved outcomes.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • The service team demonstrated, through their service performance metrics, dashboards and KPIs, how they collect and process users’ personal information, and how it is used, stored and accessed by the caseworker team
  • Their reliability and observability framework sets out how they carry out penetration and vulnerability testing, both within the MDTP platform and externally with third-party banking service providers

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • The team had produced a comprehensive Performance Framework document showing how their KPIs had been developed from user needs and hypotheses about meeting those needs. The team also included external stakeholders in developing the Performance Framework to align metrics and KPIs
  • The team uses Google Analytics and Splunk for user behaviour data, in line with the rest of HMRC, with Splunk as the primary data source
  • The Performance Analysts have the necessary security clearance to access the data needed in order to provide aggregated data to stakeholders
  • The team has a full-time Performance Analyst embedded in the team, with support from the wider Performance Analytics team lead. There are structured meetings within the team, and also ad hoc discussions when required, to feed insights into the wider team
  • The team demonstrated a number of examples where the Performance Analyst worked with multiple disciplines to feed insights into the team, leading to data-driven improvements to the service
  • There is a concerted effort within HMRC to establish who will own the data published on data.gov.uk, and the team are working to align with this
  • The team are also working to ensure that their dashboard solution could be used as a standard across HMRC measurement teams
  • Their SIRO has signed off the use of Google Analytics and Splunk

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • The CCBS service is built on HMRC’s existing PaaS platform, MDTP (Multi-channel Digital Tax Platform), which runs on AWS, their strategic cloud provider
  • The choice of web services, APIs and microservices is built on a technically strong architectural foundation
  • The team have adopted a balanced approach to managing their print-and-post service alongside the digital transformation, taking into consideration expectations based on learnings from iForms and the paper-based claims process
  • Their automation-first approach to security, performance and accessibility testing has been evident in their approach to providing a consistent and reliable service (a hypothetical example of such a check follows this list)
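
As a hedged illustration of what one such automated accessibility check might look like (the report does not describe the team's actual test suite; this naive ScalaTest example and its helper are hypothetical):

    // Hypothetical example of the kind of check an automation-first
    // accessibility test suite might run; not the team's actual tests.
    import org.scalatest.funsuite.AnyFunSuite

    class AccessibilitySpec extends AnyFunSuite {

      // Naive helper: find <img> tags in rendered HTML that lack an alt attribute.
      private def imagesMissingAlt(html: String): Seq[String] =
        "<img\\b[^>]*>".r.findAllIn(html).toSeq.filterNot(_.contains("alt="))

      test("all images on the claim start page have alt text") {
        val renderedHtml = """<img src="/logo.png" alt="HMRC logo"><p>Claim Child Benefit</p>"""
        assert(imagesMissingAlt(renderedHtml).isEmpty)
      }
    }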

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • The CCBS service team have adopted the GOV.UK Design System patterns for their public-facing interfaces within the platform
  • Common components have been built as shareable services, such as their DMS submission and ‘user allow list’ services
  • CCB usability testing has been used to inform, configure and develop cost-effective reusable patterns to make disparate documents part of a user’s digital journey

What the team needs to explore

Before their next assessment, the team needs to:

  • Evidence which open standards (as distinct from open source components) have been adopted within their service

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • The service team operates a reliability and observability framework within the MDTP platform, both to scale up their deployed services and to deploy changes to software components with minimal service downtime
  • Their monitoring tooling (Kibana, PagerDuty, Grafana and Splunk) provides performance metrics to detect failures in downstream services much earlier, and supports triaging problems across platforms, processes and payment timescales (a sketch of the kind of probe such tooling alerts on follows this list)
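
As a sketch of the kind of downstream health probe that feeds such dashboards and paging tools (the report does not detail the team's actual checks; the endpoint, threshold and names below are hypothetical):

    // Hypothetical downstream health probe of the kind that feeds dashboards
    // and paging tools such as Grafana and PagerDuty; not the team's actual code.
    import java.net.{HttpURLConnection, URL}
    import scala.util.Try

    final case class HealthResult(service: String, healthy: Boolean, latencyMs: Long)

    def probe(service: String, url: String, timeoutMs: Int = 2000): HealthResult = {
      val start = System.nanoTime()
      val ok = Try {
        val conn = new URL(url).openConnection().asInstanceOf[HttpURLConnection]
        conn.setConnectTimeout(timeoutMs)
        conn.setReadTimeout(timeoutMs)
        conn.setRequestMethod("GET")
        conn.getResponseCode == 200
      }.getOrElse(false)
      HealthResult(service, healthy = ok, latencyMs = (System.nanoTime() - start) / 1000000)
    }

    @main def checkDownstream(): Unit = {
      // A failed or slow probe would raise an alert, e.g. a PagerDuty incident.
      val result = probe("bank-account-check", "https://example.internal/ping") // hypothetical service and URL
      if (!result.healthy || result.latencyMs > 1000)
        println(s"ALERT: ${result.service} is unhealthy or slow (${result.latencyMs} ms)")
    }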

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure they can fully scale for increased demand in public beta, and that users are given more assurance about the expected timescales for processing of their claims, and how to get in touch if they have a problem.

Published 11 December 2023