Service Standard assessment report
Manage Import Duties and VAT Accounts
Assessment date: 27/03/2024
Stage: Live assessment
Result: Amber
Service provider: HMRC
Service description
This service aims to solve the problem of users needing to view and manage the various accounts they use to pay for customs, including the ability to add agents to use those accounts.
The service sits within the Customs Declaration Service (CDS) and facilitates self-service to help:
- businesses importing goods into the UK to make payments at the point of entry, so their goods can cross the border into the country
- customers to retrieve their transaction data, statements and other information related to their import accounts, enabling them to reconcile the VAT, duty and excise owed to HMRC for the organisation
Service users
This service is for:
- micro, small, medium and large businesses that import from outside the UK
- sole traders that import from outside the UK
- freight forwarders
Things the service team has done well:
- since the last service assessment, the team has changed its ways of working so that it is now directed by user research findings and user needs. It ensures these insights have impact by using them to iterate the product. There was a distinct lack of user research at the time of the last assessment, so the team’s dedication to turning this around is commended.
- the use of quantitative analytics and other teams’ qualitative findings to inform the direction and prioritisation of future user research was encouraging, and the evidence demonstrated that this approach worked well.
- good use of ecosystem and journey maps, with a clear example of how the ecosystem map was used to inform improvements to the service.
- the service is consistent with other government services and has been frequently tested with a range of users.
- good use of quantitative and qualitative data to inform design decisions such as resolving delivery issues for live statements.
- there was clear evidence that the service owner and product manager are now empowered to make priority decisions, based on their own understanding of users’ needs. This is a great achievement following on from the beta report.
- the team evidenced that it is adequately funded to maintain itself in its current form, which contains no shared roles. It has a clear roadmap for the next nine months and a clear direction on the improvements it intends to make and iterate on during this period.
- stakeholder management has improved dramatically, with the team actively engaged by stakeholders to provide their expertise on user needs during design phases and working with them to integrate business requirements into the service.
- the team works in fortnightly sprints, whose ceremonies include backlog and story refinement sessions, sprint planning, retrospectives, bi-weekly show and tells, and daily stand-ups. They also employ the principles of huddles and Three Amigos sessions, which points to a good level of agile maturity. There is also a programme-level scrum of scrums to ensure information sharing, enabling best practice to be leveraged across teams.
- the team showed good examples of where user research had influenced change (for example, ‘Managing CDS email address’ and ‘Cash Account’) and how the whole team was involved in the discussions to decide how to iterate these changes.
- great collaboration with external teams, particularly to solve the problem of users finding the service. The team worked with the GOV.UK content team and the HMRC Business Tax Account team to solve this user problem.
- the analytics are in-depth, complex and well understood.
- dashboards can be interrogated and adjusted, and break journey and type metrics down into usable flows.
- data security and cookie consent are well understood; dashboards do not contain personally identifiable information (PII) and are restricted to a team view with no external access.
- both qualitative (CSAT, Deskpro) and quantitative (ASAD, Google Analytics) data are used holistically to understand the journeys, present visual data and identify improvements.
- performance analysis and user research have good examples of before-and-after improvements.
- error tracking is comprehensive and granular.
- the team has created custom dashboards that help them observe how well their service is performing and have sensible alerting configured to identify issues quickly.
- the team has made use of reusable services, such as those for secure messaging, email and file transfers.
- the team has made the majority of their source code available openly, under an appropriate licence.
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- adequate accessibility user research. The team demonstrated restrictive thinking towards accessibility research during the service assessment, which caused concern. Its rationale for conducting no neurodivergence research, and research with only a narrow range of disabilities, centred on a decision to prioritise other research areas. The team saw these focuses as mutually exclusive, but this is inaccurate: research with neurodivergent participants can be embedded into research that does not solely focus on neurodivergence. A more thorough and joined-up approach is needed.
2. Solve a whole problem for users
Decision
The service was rated amber for point 2 of the Standard.
During the assessment, we didn’t see evidence of:
- user needs being met in the journey for giving account permission to other authorities. The team presented an example of continuous improvement to the cash account system to handle higher transaction volumes. From what was described, the change seemed counterintuitive to the journey for giving account permission, and users were finding workarounds to how the service was originally intended to work.
3. Provide a joined-up experience across all channels
Decision
The service was rated green for point 3 of the Standard.
4. Make the service simple to use
Decision
The service was rated green for point 4 of the Standard.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- compliance with the WCAG AA standards. The team has a plan of action to meet the standards and is working towards it.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
7. Use agile ways of working
Decision
The service was rated green for point 7 of the Standard.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Next steps
This service can now move into a live phase, subject to CDDO spend approval and to addressing the points of the Standard rated amber within three months’ time.
The team should repeat the development phases (discovery, alpha, beta and live) for smaller pieces of work as the service continues running, following spend control processes as necessary.