Manage your Transit Movement beta assessment
Service Standard assessment report
Manage your Transit Movement
Assessment date: 12/06/2024
Stage: Beta
Type: Assessment
Result: Amber
Service provider: HMRC
Previous assessment reports
Service description
The service allows users to:
- submit departure declarations required to move goods via the transit procedure within the EU and other Common Transit Convention (CTC) contracting parties
- submit an arrival notification when goods reach their destination
- check their Guarantee Balance to ensure that they have sufficient financial cover for the excise and duties payable on the goods
Service users
This service is for users of the New Computerised Transit System (NCTS), mainly:
- traders who move goods
- agents working on behalf of traders
- third-party software developers who make software for agents and traders to use
Things the service team has done well:
- the team has embedded user-centred design (UCD) approaches in their work, and actively worked to understand user needs
- the team has worked within constraints to test, iterate and improve the service offering
- there is a clear driver to improve on the current service offering, and the team has started to engage with their user base to inform them of the changes through webinars and other activity
- the team has created usable products for both the API and user interface
- the team has worked across HMRC on the changes, including with their contact centre, and has considered how other teams can be made aware of the work
- the team has reused multiple components from their wider HMRC tech stack, avoiding duplication and unnecessary spending
- the team has continued working in the open, publishing their code in open repositories so it is accessible
1. Understand users and their needs
Decision
The service was rated green for point 1 of the Standard.
2. Solve a whole problem for users
Decision
The service was rated amber for point 2 of the Standard.
During the assessment, we didn’t see evidence of:
- the team articulating how their service fits in with the other services highlighted in their ecosystem map
- a plan for how the service will join up with other services or areas over its lifetime, and evidence that the team is actively considering how their service fits into this wider user journey
- the service team actively working with the GOV.UK content teams on other areas of content
3. Provide a joined-up experience across all channels
Decision
The service was rated amber for point 3 of the Standard.
During the assessment, we didn’t see evidence of:
- testing and iteration of the end-to-end service: the service includes both digital and offline assets, but work done by the service team on the offline assets was not demonstrated
- how the team is actively testing with front-line staff and ensuring that the service joins up regardless of channel (for example, API or online journey) and user need
4. Make the service simple to use
Decision
The service was rated green for point 4 of the Standard.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- testing of the specialist language used in the service with users (the team told us that this has happened and that they have not identified any problems with the terminology used in the service)
- testing of offline service assets (paper forms)
6. Have a multidisciplinary team
Decision
The service was rated amber for point 6 of the Standard.
During the assessment, we didn’t see evidence of:
- policy expertise being actively embedded and working in the team
- a stable resourcing plan: whilst the team has many of the expected roles, the proportion of contractors is high, and the team should take care, particularly as they expect high turnover
- the team having the opportunity to explore the end-to-end picture and how the API and front-end products can join up through user-centred design; a dedicated service designer may be able to help explore this
7. Use agile ways of working
Decision
The service was rated amber for point 7 of the Standard.
During the assessment, we didn’t see evidence of:
- the team working at an appropriate scale and with suitable governance; some roles in the service team appear to be operating at a lead level or to have cross-cutting responsibilities
- the team actively considering the impact of a large team, to ensure everyone is heard
- collaborative and iterative working between content and interaction design in the prototype; the team described a way of working in which the content designer works in text documents and the interaction designer uploads this content into a coded prototype hosted on Heroku
8. Iterate and improve frequently
Decision
The service was rated amber for point 8 of the Standard.
During the assessment, we didn’t see evidence of:
- a clear roadmap of future problems, or of functionality the team still needs to build or consider to meet user needs, and how they plan to ensure iteration is at the heart of their public beta activity
- how the team plans to use all the data they are collecting to prioritise iterations
9. Create a secure service which protects users’ privacy
Decision
The service was rated amber for point 9 of the Standard.
During the assessment, we didn’t see evidence of:
- performance testing carried out on the new technical components the team has introduced. The temporary solution has been to block access to the new technical component until the penetration test has been carried out. Once this happens, and if no critical risks are highlighted, the new component will be introduced. The new technical component should only be introduced once the team knows it has not introduced any vulnerabilities or risks to the service.
10. Define what success looks like and publish performance data
Decision
The service was rated amber for point 10 of the Standard.
During the assessment, we didn’t see evidence of:
- how the whole service journey would be measured, although the digital service was thoroughly tracked; there was mention of, but no specific plan for, measuring the offline or non-digital aspects of the service, such as contact centre calls from users
- how the service plans to measure the mandatory KPIs (or the reasons why these are not appropriate measures), and how they will report and publish these
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated amber for point 14 of the Standard.
During the assessment, we didn’t see evidence of:
- a formal disaster recovery procedure (including restoring data from backups) that has been fully tested and signed off by the relevant HMRC service owner. At the moment, the team can only estimate what would happen should a critical incident take place. Given the importance and impact of the service, a plan to maintain service continuity after an incident would minimise data loss and reputational damage.
Next steps
This service can now move into a public beta phase, subject to addressing the amber points within three months and receiving CDDO spend approval.
This service now has permission to launch on a GOV.UK service domain with a beta banner. These instructions explain how to set up your *.service.gov.uk domain.
The service must pass a live assessment before:
- turning off a legacy service
- reducing the team’s resources to a ‘business as usual’ level, or
- removing the ‘beta’ banner from the service