Service Standard assessment report
Claim a Self Assessment refund
Assessment date: 19/11/2024
Stage and type: Beta assessment
Result: Amber
Programme: Income Tax Self Assessment
Service description
The ITSA Claim a Self Assessment Refund service allows users to request an ad hoc refund against any Self Assessment credit balance they hold. It also allows them to view the history of their requests and track their statuses. There is currently a legacy refund service built on the old ‘green screens’; the new service replaces it while improving the user journey. The scope of the assessment includes the ITSA View & Change service and the ITSA Claim a Self Assessment Refund service. BTA/PTA remain out of scope as they are existing services covered elsewhere by GDS assurance.
Service users and needs
- as an ITSA individual/organisation, I need to request a refund so that I can retrieve money I am owed
- as an ITSA agent, I need to request a refund for my client so that I can retrieve money they are owed
- as an ITSA individual/organisation, I need to view my refund history so that I can track the status of my refund
- as an ITSA agent, I need to view my client’s refund history so that I can track the status of their refund
Things the service team has done well:
- the team has used the GOV.UK Design System and iterated designs over time to ensure the service uses the most appropriate components and patterns while meeting WCAG 2.2 standards. The team also looked at the language in other government services to ensure consistency and familiarity for users.
- the Performance Analyst on the team works closely with all disciplines, sharing insights to drive data-driven improvements. The team has developed a number of performance dashboards to share metrics with their stakeholders.
- the team understood the technical constraints of the refund process and has tested different approaches to ensure users do not need to understand those constraints to complete their journey.
- the team has made good use of HMRC’s digital tax platform, reusing common components and patterns to ensure a reliable, familiar service while meeting all technical requirements on security and governance.
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- conducting accessibility testing with a diverse range of users who have access needs or rely on assistive technology. It is noted that planned testing aims to include an inclusive group of users to uncover unmet needs or pain points that have yet to be identified.
- recognising that user needs evolve over time; relying on research panels alone may limit insights, particularly into the needs of novice users.
- leveraging the call centre to test and refine the support journey.
- gaining a deeper understanding of users who rely on additional support to complete online tasks due to limited digital skills or a lack of trust.
2. Solve a whole problem for users
Decision
The service was rated green for point 2 of the Standard.
3. Provide a joined-up experience across all channels
Decision
The service was rated green for point 3 of the Standard.
Optional advice to help the service team continually improve the service:
- the team should monitor whether contact increases when users have problems with the service, particularly when requests are rejected, as the service team is unable to provide users with an explanation.
4. Make the service simple to use
Decision
The service was rated green for point 4 of the Standard.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- researching with participants with access needs and low digital literacy.
- consideration of an assisted digital route for those unable or unwilling to use the online service. The panel recognises the assisted digital solution has been separated from the scope of this work at a programme level, but this needs to be addressed before users are mandated to use the service. Designing assisted digital routes earlier would ensure the service works for everyone from the beginning.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
7. Use agile ways of working
Decision
The service was rated amber for point 7 of the Standard.
During the assessment, we didn’t see evidence of:
- the service team engaging with the legacy service. Even though this is a new service on a new platform, users will still be trying to complete the same tasks, and the team ought to engage with the legacy service to support the transition and decommissioning, as outlined here: https://www.gov.uk/service-manual/agile-delivery/retiring-your-service.
- testing the service with real users at scale. At the time of the assessment, the service had been tested with only a small proportion of the users who will eventually have to use it. While the panel believes this testing was comprehensive and the team has done a great job within the constraints in place, the panel would like to see how the team adapts and iterates based on the waves of real users who will be using the service.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated amber for point 10 of the Standard.
During the assessment, we didn’t see evidence of:
- a Performance Measurement Framework. The team has produced a set of measures for the service, derived from conversations with different stakeholders including their Senior Management Team. The team should consolidate the reporting needs of each stakeholder group into a single, live Performance Measurement Framework document.
- the team using legacy data to measure before and after. This data will help measure how the new service performs compared to the legacy service. The team should investigate how to get access to the legacy data and incorporate it into their measurement strategy.
- the team investigating alternative tools for sharing insights. The current solution, which requires CIP-compliant laptops, makes sharing insights more arduous. The team should consider alternative methods for sharing insights, removing a barrier that could limit the team’s ability to make data-driven decisions when iterating on the service.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Next steps
This service can now move into a public beta phase, subject to addressing the amber points within 6 months and subject to CDDO spend approval.
This service now has permission to launch on a GOV.UK service domain with a beta banner. These instructions explain how to set up your *.service.gov.uk domain.
The service must pass a live assessment before:
- turning off a legacy service
- reducing the team’s resources to a ‘business as usual’ level, or
- removing the ‘beta’ banner from the service