View and change your tax account alpha assessment

Service Standard assessment report

View and change your tax account

Assessment date: 25/06/2024
Stage: Alpha assessment
Result: Red
Service provider: HM Revenue & Customs

Previous assessment reports

Not applicable.

Service description

This service allows Self Assessment users who have signed up for Making Tax Digital (“Sign up to use software for your Self Assessment”) to view their income tax details, obligations, charges and payment details, and to pay their income tax and other charges.

Service users

This service is for…

  • self-employed individuals
  • individuals with rental properties
  • agents of self-employed individuals
  • agents of individuals with rental properties

Things the service team has done well:

The team had a clear understanding of user needs and how they link to pain points, and demonstrated a clear strategy for gathering ongoing feedback. User needs were developed and iterated throughout the initial phases of development, and the team collaborated with other teams to build an understanding of user needs and the wider user journey.

The team delivered a good demo during the assessment and it was great to see how some designs had been iterated based on user feedback. The team also understood where their service fits into the broader user journey for Making Tax Digital. Evidence had been used to prioritise the ‘opt-out’ feature, ensuring users who are not mandated to use the service have a simple online journey.

The Performance Analyst discussed how the KPIs of the service were developed using a performance framework. This involved collaboration with service team colleagues in a workshop, whose input ensured the framework aligned with the user needs of the citizen. There is also a plan to start developing hypothesis statements based on the measurement framework and usage volumes (to ensure statistical significance of findings), which will be a valuable tool when measuring future iterations of the service. The Performance Analyst is working closely with both service teams and is embedded in their ways of working. It was also discussed that an ITSA/MTD Performance Analyst working group is being set up to share insight across the end-to-end journeys; the assessment panel encourages this.


1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

During the assessment, we didn’t see evidence of:

  • exploration of the unhappy path: users who use the service incorrectly, or how to support users to get to the right service for them
  • testing of the wider user journey beyond adjustment of payments on account (POA), including how users opt out and add businesses

2. Solve a whole problem for users

Decision

The service was rated amber for point 2 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team thinking about how this journey will join up with the HMRC app

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team being empowered to find the best way of solving the problem - the scope of the service and the problem the team is working to solve are focused on delivering the policy, rather than meeting the variety of user needs.
  • the team having tested the offline journey. As take-up increases, the team needs to investigate whether the new service will affect the offline route.

4. Make the service simple to use

Decision

The service was rated amber for point 4 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team testing, iterating and analysing the performance of the account homepage/dashboard
  • the team designing and testing on mobile devices
  • content being simple to understand. In some parts of the service, the content is simple, for example ‘What you owe’, and the team has iterated based on user feedback. However, in other areas, content appears to use more internal language, doesn’t always explain what the content means and feels policy-led, for example the ‘Select a reason’ screen, where there is no way out for users who don’t think their reason fits the suggested options.

5. Make sure everyone can use the service

Decision

The service was rated amber for point 5 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team testing the prototype with assistive technology - the service uses many tables and pages with lots of content. These should be tested thoroughly with screen readers and on mobile devices.
  • consideration of how a mandatory digital service will impact users with low digital literacy or digitally excluded users.

6. Have a multidisciplinary team

Decision

The service was rated amber for point 6 of the Standard.

During the assessment, we didn’t see evidence of:

  • a sustainable plan to ensure continuity within the delivery team, which is made up solely of suppliers. How will HMRC ensure that the service is built to run, not just to launch?
  • a plan to transfer knowledge and skills from contractors to permanent staff. As there are no permanent staff, it is difficult to see how this will be achieved. What does the long-term View and Change or MTD team look like?

7. Use agile ways of working

Decision

The service was rated red for point 7 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the team will prioritise the riskiest assumptions. While the team demonstrated that they have researched and iterated designs for a complex transaction within the service (POA), there was no evidence that the team had researched, prioritised and tested riskier assumptions around uptake of the service, how agents will use the service when they have many (potentially hundreds of) clients registered, and whether users can succeed in navigating the end-to-end service

8. Iterate and improve frequently

Decision

The service was rated red for point 8 of the Standard.

During the assessment, we didn’t see evidence of:

  • where team priorities lie and which improvements will deliver the most value. It is unclear whether the team understands or has prioritised the improvements in the end-to-end service that will deliver the most value. For example, will users know how to navigate the ‘portal’? How will agents progress through the service when they have multiple clients? Has View and Change been designed to address the pain points identified for users, including managing multiple clients and using third-party software?
  • a plan for Private Beta to develop and test an end-to-end user journey that does not focus solely on a specific interaction within the service. It was unclear how the Private Beta phase will run for View and Change or how users will be onboarded to use the service in anger.
  • how the team is prioritising use of the voluntary MTD period to ensure the service meets user needs before it is made mandatory.

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

11. Choose the right tools and technology

Decision

The service was rated amber for point 11 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the service will effectively handle the relationships between agents and customers. Although there was awareness that agents working for organisations handling many clients are likely to need more flexibility in managing these relationships, there was no evidence presented on how this might be addressed. Given the anticipated audience size, further effort must be put into solving more complex user management questions.
  • how the Agent Services tool integrates with the View and Change platform.
  • a robust approach to how third-party software will be tested for resilience, accessibility, security and reliability. Although vendor software is optional for View and Change, the expectation is that a significant range of software vendors will integrate with View and Change. Evidence of a robust approach to evaluating and testing vendor software would help clarify how this risk is to be managed.

12. Make new source code open

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated red for point 13 of the Standard.

During the assessment, we didn’t see evidence of:

14. Operate a reliable service

Decision

The service was rated amber for point 14 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the service will support citizens and agents with technical issues or other queries. Although the team explained that HMRC offers end user support through its existing helplines, there was no evidence of how the mandatory elements of Making Tax Digital (MTD) are likely to affect the nature and volume of calls and queries to the support desk. The Alpha phase is the right time to test risky assumptions (see https://www.gov.uk/service-manual/agile-delivery/how-the-alpha-phase-works), and the team should therefore explore how the nature and volume of support queries might change with MTD. User expectations around support are likely to be high and growing. A 9 to 5, business-as-usual support model should be challenged, for example by exploring options such as chatbots or live chat, a ticketing system or even the use of AI. End user support will be an important part of the success of MTD, but this was not adequately evidenced at the assessment.
  • how tools such as ‘messaging’ integrate into the service. The messaging tool was present in some places and not others, and it was unclear what its purpose was. More evidence is needed on how the View and Change service fully integrates with other parts of MTD and the HMRC platform to ensure that users experience a reliable and joined-up journey.

Next steps

In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are rated red at this assessment.

Updates to this page

Published 30 December 2024