Service Standard assessment report
PAYE Service
Assessment date: 03/12/2024
Stage: Alpha assessment
Result: Amber
Service provider: HMRC
Service description
HMRC receives over 800,000 phone calls that could be handled digitally if information were displayed to customers in a more intuitive way, allowing users to better self-serve. The Single Customer Account (SCA) aims to use a new low-code platform to present more information to users to help them understand their tax position, including any recent changes. This, coupled with a redesign of the service offering, should help more users self-serve and reduce low-value calls to HMRC's contact centres.
Service users
This service is for:
- users who have had a change in circumstances which leads to a change in take-home pay (a tax code change resulting in 'pay shock')
- users who have a new job or second job
- users who want to check their income tax circumstances
- users who want to report a change in income tax circumstances
Things the service team has done well:
- the service team has worked well together at considerable pace within the constraints of replacing the existing service, and has demonstrated that it has been empowered to deliver within its set scope. The team is aware of its risks and riskiest assumptions.
- it was good to see the amount of research done in a short space of time, and it was clear this research had an impact on the design. It was also great to see that the researcher had adapted the original digital inclusion scale to fit the project.
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- a full discovery to fully understand the user needs and key pain points for users of the current system. The team did a lot of work in a short space of time and received information from the existing team, but there was not enough time to fully understand the user needs, user groups and pain points from primary research and existing service analytics
- an extensive research plan for private beta covering different user groups. The team went through a high-level plan, and it was good to see them prioritising access needs, but it would be good to see a more in-depth plan prioritising different user groups and showing how the work would be split between the two user researchers
- testing of the support model; the team should ensure this is part of their private beta plan
- extensive testing with users with multiple jobs, given that they are the users who may potentially have the most issues; this should be included in the team's private beta plan
2. Solve a whole problem for users
Decision
The service was rated amber for point 2 of the Standard.
During the assessment, we didn’t see evidence of:
- demonstrating the risks to outcomes of a limited scope. The team was not commissioned to solve the whole problem for users, but rather to meet a specific objective scoped around improving the current experience. However, business and user outcomes would benefit in the next phase from an expanded view
- communicating to leaders the risks discussed around the consequences of the pace of delivery
3. Provide a joined-up experience across all channels
Decision
The service was rated amber for point 3 of the Standard.
During the assessment, we didn’t see evidence of:
- the team being empowered to consider the offline journeys that could impact users coming to their service. They should also consider how the content and information changes they make to the online service could impact any offline or telephony routes: are users getting the same information as they would had they accessed the online service?
- working with support staff to design scripts and training, or of how the team plans to keep support staff up to date with the project's roadmap and any new features or functionality. The team should be proactive in understanding the needs of support staff and how they prefer to be updated
- prioritising mobile testing in the research and in the plan for private beta. The team should review the analytics from the current service to plan and prioritise testing on different devices
- researching and designing the unhappy paths in more depth
4. Make the service simple to use
Decision
The service was rated green for point 4 of the Standard.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- testing with a varied range of users of different assistive technologies; this was acknowledged by the team and there are plans to address it going forward. The team should try to test with these users to make sure the service meets their needs and is not likely to cause harm or drive more calls to call centres
- exploring alternatives to the online service. Understanding whether there are existing offline or other channels through which users can currently get similar information about their tax changes, such as through work coaches or employers, might have informed how users expect to be told of a change. With the new notifications of changes coming into effect, the team could take this opportunity to understand users' expectations in this area
6. Have a multidisciplinary team
Decision
The service was rated amber for point 6 of the Standard.
During the assessment, we didn’t see evidence of:
- demonstrable evidence that an impact assessment has been completed for the key roles required in the service team in private beta and beyond, reflecting the scale of this service. The team has been unconventionally resourced in alpha, but this has been working well. However, there are single points of failure and heavy dependencies on key roles, for example user research
- with a full contractor team, evidence of a formal and active 'handshake' plan to permanent civil servants to allow knowledge sharing, manage the risk of single points of failure and cope if contractor team members change
7. Use agile ways of working
Decision
The service was rated green for point 7 of the Standard.
Optional advice to help the service team continually improve the service
- the team has got the service in front of real users quickly and used the learnings, but this agile approach has included a short discovery phase and launching the private beta before the private beta assessment. The degree of pace, and the delivery targets, could harm the team's ability to flex, take the opportunity to understand and learn from users, and adapt to change.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
Optional advice to help the service team continually improve the service
- the product manager is focused on the minimum viable product and has prioritised the team's work to deliver the improvements realising the most value. The backlog of out-of-scope priorities should be communicated.
- with a relatively short private beta, the team needs to ensure it mitigates known risks. Some of this crucial activity cannot be rushed; it is about taking sufficient time and headspace. The actions and advice in this report need to be absorbed, including taking the space for the essential user engagement required across the ecosystem and iterating based on what works and what doesn't.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated amber for point 10 of the Standard.
During the assessment, we didn’t see evidence of:
- the work and definition around measuring performance was well thought out and clearly presented; however, it would benefit in the next phase from evolving to include quantifiable targets that align to the service aims and the problems to solve (for example, call reductions)
- a beta plan to resource performance measurement and to integrate it, within a single system, with user research, feedback and other forms of user intelligence
11. Choose the right tools and technology
Decision
The service was rated amber for point 11 of the Standard.
During the assessment, we didn’t see evidence of:
- how the service, which relies on JavaScript, will work for all users. The team is working with the GDS team on a design and has agreed on one that would display a message directing the customer to other digital or non-digital channels (one possible form of this message is sketched below). The team needs to check how this works with assisted digital and accessibility needs
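As a purely illustrative sketch, not the team's agreed design, such a message could be rendered with a `<noscript>` element wrapping the GOV.UK Design System notification banner component; the wording and the suggested alternative channels here are assumptions:

```html
<!-- Shown only when JavaScript is unavailable. Wording and channels are hypothetical. -->
<noscript>
  <div class="govuk-notification-banner" role="region"
       aria-labelledby="govuk-notification-banner-title"
       data-module="govuk-notification-banner">
    <div class="govuk-notification-banner__header">
      <h2 class="govuk-notification-banner__title" id="govuk-notification-banner-title">
        Important
      </h2>
    </div>
    <div class="govuk-notification-banner__content">
      <p class="govuk-notification-banner__heading">
        This service needs JavaScript to work.
      </p>
      <p class="govuk-body">
        You can use another channel instead, for example the HMRC app, or contact HMRC by phone or post.
      </p>
    </div>
  </div>
</noscript>
```

Whatever form the final design takes, it should be tested with assistive technology and assisted digital users so the fallback itself does not create new barriers.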
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
Optional advice to help the service team continually improve the service
- the team could seek feedback on the code from other departments, and optimise and improve it wherever required.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
Optional advice to help the service team continually improve the service
- the team is using tested components from the design system, which is good to see. They have also introduced a timeline component into their designs, which seems to be testing well with users. It would be great if the team were able to share their findings and feedback with the community, as that might help build a better understanding of the component's use.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Optional advice to help the service team continually improve the service
- ensure adequate time to run through scenarios with real users in private beta to mitigate risks in public beta, including the risk of the chosen technology not having been used at such scale before.
- ensure all SLAs are agreed and workflows are published for the running of the service.
Next steps
This service can now move into a private beta phase, subject to addressing the amber points within three months' time and to CDDO spend approval.
To get the service ready to launch on GOV.UK the team needs to get a GOV.UK service domain name.