Service Standard assessment report

Get a goods movement reference

Assessment date: 27/07/2023
Stage: Live
Result: Met
Service provider: HMRC

Service description

As the UK has left the EU, goods being imported to, exported from or transiting through the UK must have customs declarations made against them in order to meet legal requirements and ensure the correct taxes and duties are paid.

The service is known as the Goods Vehicle Movement Service (GVMS); its public-facing name is ‘Get a goods movement reference’. The service:

  • links declaration references together into one digital envelope, so the person moving the goods (the haulier) only needs to present one reference at the frontier to prove that their goods have the required import or export declarations; this has ensured traffic flow is maintained
  • links the movement of goods to declarations, meaning they can be automatically arrived and departed in HMRC customs systems (CDS, CHIEF and NCTS) in near real time; this ensures duty points are created and the required risking is performed by those systems
  • allows customs and transit declarations and associated safety and security declarations to be added to a goods movement reference (GMR), and also caters for other types of movement, for example ATA/TIR Carnet movements, declarations by conduct, and empty vehicles that need to move through GVMS-enabled ports
  • notifies users whether or not their inbound or outbound goods have been successfully cleared in HMRC systems by the time they arrive in or exit the UK, and whether goods they are moving require inspection, ensuring customs controls are in place
  • now caters for movements from GB to the EU, the EU to GB, GB to NI and NI to GB

There are currently around 100,000 transactions per week, with 30,800 hauliers registered. There have been 22 releases of new features and improvements (excluding minor content and cosmetic changes) since January 2021.
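The ‘digital envelope’ concept described above can be sketched as a minimal data model. This is illustrative only: the class, field names and example reference formats below are assumptions for the sketch, not the actual GVMS schema or API.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: names and reference formats are assumptions,
# not the real GVMS data model or API.
@dataclass
class GoodsMovementReference:
    """One GMR acts as a digital envelope: a single reference the haulier
    presents at the frontier, linking every declaration made against the
    goods on that vehicle."""
    gmr_id: str                  # the single reference presented at check-in
    vehicle_registration: str    # VRN of the vehicle moving the goods
    declaration_refs: List[str] = field(default_factory=list)

    def add_declaration(self, ref: str) -> None:
        # Each customs, transit, or safety and security declaration
        # reference is added to the same envelope.
        if ref not in self.declaration_refs:
            self.declaration_refs.append(ref)

# The haulier presents only gmr.gmr_id at the port, rather than
# each declaration reference individually.
gmr = GoodsMovementReference("GMR-EXAMPLE-001", "AB12 CDE")
gmr.add_declaration("23GB1234567890ABC1")  # hypothetical customs declaration ref
gmr.add_declaration("23GB0987654321XYZ9")  # hypothetical transit declaration ref
print(len(gmr.declaration_refs))  # 2
```

The design choice the service makes — one envelope per movement — is what lets downstream systems (CDS, CHIEF, NCTS) arrive and depart all linked declarations together in near real time.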

Service users

  1. the haulier or driver, who moves the goods for the trader. The haulier is the primary service user and will enrol to use the service and create, populate and present a goods movement reference (GMR) to the carrier
  2. the carrier or port operator, who checks in the GMR and sends the notification to HMRC that the vessel has embarked
  3. Border Force, who use the GVMS Common Transit Convention (CTC) portals to view movements and add or remove a control or hold notification

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has tried innovative ways to recruit drivers for usability testing and research (as this was challenging), including going to their places of work
  • the team has changed its approach to prioritising policy and user needs to be more user-centric, and has a collaborative, proactive way of working with policy expertise
  • the team has an excellent understanding of how this digital service fits on the ground with hauliers, drivers and border inspection staff
  • the team has distinguished the needs of the API users from the web service users and has now completed research with API users, which was previously a gap

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to seek user feedback through a number of channels (user research, surveys, customer support and so on) and use the insights as a basis for continuing improvements to the service
  • continue to make efforts to research with end users from Northern Ireland, although it is acknowledged that this is politically sensitive, and that the team has mitigated the risks of this research gap through engaging border force staff from Northern Ireland
  • continue to work on revising and refining the user needs, ensuring that they are framed in terms of the user’s problems and in the user’s language rather than in terms of technology, policy and existing services

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team gave a good example of the value of researching and understanding a problem from a user perspective with a problem statement, rather than assuming a digital and technology solution was the answer: for the proposed ‘un-check-in’ feature, they found that drivers and hauliers solve this problem in a different way on the ground
  • the team has produced a comprehensive and up-to-date service blueprint and ecosystem map, and undertaken end-to-end research to identify and address pain points
  • the team has undertaken contextual research and is able to demonstrate in-depth understanding of the entire journey using the storyboard and video clips
  • the team has been able to explain and document the empowered relationship between the policy team and the user-centred design team, and how decisions are made to the benefit of the service, ensuring the designed service solves a whole problem for users
  • the team has underpinned design decision-making with performance analysis at every step of this phase of the project

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to seek opportunities to provide information at the point where it is required, and to join up different aspects of the journey so that users do not have to access disparate websites and apps to complete their goals
  • continue to monitor any wider policy changes in trade agreements to ensure that the service continues to meet the needs of any new user groups that may arise
  • ensure that, as policy adds further requirements and expectations, the service team is able to scale, recognising that continuing to solve a whole problem for users may mean creating new teams to work on the necessary problems

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to cite an example of bringing information into one place for the border force officers thereby removing the need to go into the customs declaration service to check declarations on hold one by one
  • the team was able to describe how they had worked to allow internal call handlers access to the service’s staging so that the help on offer to external users could be improved
  • the team has constantly monitored how the service works for users in real time and has, where relevant, sought to improve it to ensure it provides the best joined-up experience across all channels
  • the team has used as many different methods as possible to fully understand how the service works in the real world for end users, including storyboarding and on-site user research, in particular travelling in a lorry to experience as much of the real service in real time as possible

What the team needs to explore

Before the next assessment, the team needs to:

  • consider offering contact information for customer support within the government digital service itself. While it is acknowledged that not all end users interact directly with the government digital service, and that research has shown that contact information is readily accessible using a search engine, it seems a shame to make those who do use the online service leave it in order to find contact information
  • continue to develop the use of story boards in conjunction with the continued user research in order to be able to clearly describe the end-to-end user journey

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • for one of the more complicated pieces of new functionality, the file upload component, the team conducted a mini-discovery phase with two rounds of user testing where they were able to iterate the functionality based on user insight and make the necessary content changes
  • the above file upload component is grounded in the GOV.UK Design System file upload component, and all further functionality is sourced from cross-government design research
  • where the design team has had to go off-pattern, this has always been clearly justified by design research and fed back into wider cross-government component and pattern development and research, and ultimately the GOV.UK Design System
  • the team conducted A/B testing with different sets of content, which showed that presenting a harsh warning to users would not have the desired impact
  • through performance analysis, the team was able to improve compliance checks at the border, improve the experience and save time for both user groups

What the team needs to explore

Before the next assessment, the team needs to:

  • explore the additional paperwork required at the inspection facility that is not known upfront. Border Force can manually decide to hold a movement rather than this being an automated decision, so it is not always known in advance, but it may be worth analysing how often this has happened to date to see whether it could be improved
  • consider whether the location of inspection could be shown on the same screen while the user is getting a GMR, and whether this would make the journey easier
  • continue to monitor the unhappy paths and error generation on the file upload function to ensure it continues to work for everyone who uses it
  • continue to review user research and user feedback for opportunities to improve the service, particularly to speed up processes and to help users remain compliant

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • a Digital Accessibility Centre (DAC) audit was completed in July 2023, the service is fully compliant, and the accessibility statement has been updated
  • the team has a good understanding of how their service is used on mobile devices with 80% desktop use in an office environment and 20% mobile / tablet
  • work has been done with proxy users with additional accessibility needs and users with low levels of digital confidence
  • plans are in place for continuous monitoring of feedback from everyone using the service and whether it is meeting their needs, with particular attention paid to people with different access needs
  • plans are in place for continuous usability testing
  • the team continues to develop good relationships with the HMRC Customer Support Group (CSG) for the above purposes

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to undertake research with users with additional accessibility needs and users with low levels of digital confidence
  • continue to seek genuine end users of the service with whom to carry out accessibility and assisted digital research. The number of contacts to customer support from users with assisted digital needs is very low, but is the team completely confident that the actual need is low, or are users perhaps being excluded prior to contacting customer support, due to the lack of an alternative to the online service?

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the Service Owner is part of the daily stand-ups
  • performance analysis is well integrated into the wider team to ensure all decision making is underpinned by data analysis

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is mature and has found the best way to work together (a combined Scrum and Kanban approach), which was evident from the way the team interacted in the assessment and from their collaborative working
  • the service continues to be iterated in response to policy change, continuous improvement and API development with external organisations like DAERA with good governance in place to prioritise work

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has continued to iterate and improve the service with a user-centric approach, and has a roadmap defined until June 2024

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have regular security risk assessments with the HMRC digital security team
  • the team have engaged in both internal and external penetration testing
  • limited data is captured across the journey, with the only identifiers being VRNs, TRNs and CRNs linked to businesses

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has developed a comprehensive and robust Performance Framework document that feeds into defining what success looks like
  • the Performance Framework document is a live document updated as the user needs change
  • the team gave multiple demonstrations on examples where analytics data contributed to changes to the service
  • the team provided examples of using A/B testing to understand the effect changes had on the service and on users’ ability to understand how to complete their tasks
  • the team demonstrated a positive data culture in which the Performance Analyst is actively engaged with other disciplines in the team, including sharing insights in regular meetings, testing hypotheses raised by user researchers from research sessions, and providing ad hoc insights as and when they are developed
  • the team shares their insights as and when required and will be uploading their data to data.gov.uk

What the team needs to explore

The team needs to:

  • ensure the details in the Performance Framework are regularly updated and reviewed. They also need to ensure any changes to the KPIs and metrics are reflected in their reporting and dashboards

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team makes use of the standard HMRC digital technical stack, which is a proven, reliable stack with plenty of documentation and support available
  • the team makes heavy use of automated testing, using the HMRC digital testing tech stack

What the team needs to explore

Before the next assessment, the team needs to:

  • tidy up the areas of the code the team has identified, once requirements have been finalised

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a plan going forward to open source non-sensitive portions of their code

What the team needs to explore

Before the next assessment, the team needs to:

  • identify more opportunities to open source code as the legal requirements are finalised

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team use shared components provided by MDTP, such as the EIS integration layer and the software-vendor-facing API platform
  • the team have contributed reusable components, including the Push/Pull notifications API and the EORI Registration service
  • since their last assessment, the team have reached out to the Digital Contact team to reuse solutions for SMS and email contact
  • the team have explored whether business identity verification would be suitable, and have decided it is not necessary for the service

What the team needs to explore

Before the next assessment, the team needs to:

  • engage as planned with the One Login for Government team on MDTP to ensure that the migration will not cause any issues for the service

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team make use of the standard MDTP platform and the reliability it brings, such as zero-downtime deployments, high availability across multiple data centres and easy scalability
  • the team have identified a non-digital route to ensure that users are not affected by downtime

Next steps

This service can now move into a live phase, subject to implementing the recommendations outlined in the report and getting approval from the CDDO spend control team.

The team should repeat the development phases (discovery, alpha, beta and live) for smaller pieces of work as the service continues running, following spend control processes as necessary.

Updates to this page

Published 14 April 2025