Manage your vehicle operator licence

This is the live assessment report for the Department for Transport (DfT) DVSA ‘Manage your vehicle operator licence’ service, from 27 January 2021.

Digital Service Standard assessment report

Vehicle Operator Licence

From: Government Digital Service
Assessment date: 27/01/21
Stage: Live
Result: Not met
Service provider: DfT - DVSA

Service description

A vehicle operator licence is the legal authority needed to operate goods or passenger vehicles in Great Britain and, in the case of Northern Ireland, goods vehicles only. Licensing is required under primary UK legislation.

The Vehicle Operator Licence (VOL) digital service enables users to apply for and manage their vehicle operator licence online, and complements the offline options available for the same purpose, including post.

Service users

This service is for any business which, as part of its operations, uses a motor vehicle on the road with:

  • a gross plated weight of more than 3.5 tonnes; or
  • if it has no gross plated weight, an unladen weight of more than 1525kg.

The service is also for any business which needs a Public Service Vehicle (PSV) operator’s licence to operate:

  • a vehicle for hire or reward (payment or payment in kind) that can carry 9 or more passengers
  • a smaller vehicle carrying passengers and charging separate fares for the journey

‘Business’ in this context applies to any kind of trade or business operating in the UK, including but not limited to sole traders, partnerships and companies. It is the responsibility of the company director, partner or sole trader to acquire and manage their licence in line with statutory requirements.

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has conducted and retained extensive research into the service’s extremely diverse user group, their different levels of knowledge about transport issues and the different ways they use the service, including validating assumptions with trade bodies and industry
  • the team has considered the needs of people who have a disability, building on the findings of a full accessibility audit across the whole licence management cycle and undertaking research to understand the functional and emotional needs of people who have low IT literacy

What the team needs to explore

As the service continues, the team needs to:

  • build a research library to present and share actionable insight with other teams inside and outside the agency, and ensure the team’s knowledge of its users is retained

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has planned how it will research internal user needs and tasks in preparation for improving efficiency and reducing application processing time for the live service (speed being a key identified user need)
  • the team plans to continue its research into how and when people access the service, with particular focus on type of device and trigger points for change (eg update vehicle details on licence via mobile device during or after a spot check)
  • the team conducted attitudinal research in informal settings (eg ‘Truckfest’) to understand hard-to-reach users’ attitudes and barriers to transacting digitally

What the team needs to explore

As the service continues, the team needs to:

  • research the effectiveness of the service’s assisted digital support mechanisms – while face-to-face opportunities to ask questions at ‘new operator seminars’ may help people manage licences, video tutorials and blogs may not be accessible to people with low IT literacy
  • research the user journey for citizens, businesses, local authorities and police who want to object to a licence application (1.5% of applications are objected to) and consider ways the service might meet their needs
  • do more research to understand if people with low IT literacy or concerns about their financial history can use GOV.UK Verify (or equivalent) to prove their identity when applying for or surrendering licences

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a strong commitment from the wider organisation to retain key roles into the live phase, allowing the service to be continually improved in line with user feedback, research and performance metrics
  • the team has secured the funding required to ensure the service can be continually developed and iterated
  • the service team comprises the key DDaT roles recommended by the Service Manual
  • the team is reducing reliance on contractors

What the team needs to explore

As the service continues, the team needs to:

  • further reduce reliance on contractors and recruit individuals into the remaining key service team roles which are not currently filled by civil servants
  • improve the knowledge transfer process, including collecting additional contextual information that would help the service team manage changes in team membership smoothly

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team continues to show that it is working in an agile manner, with iterative development and the momentum to maintain an appropriate level of continuous service improvement into live
  • DVSA is seeking to adopt, across the wider organisation, the agile service delivery methods that have been tested by the VOL service team
  • the team has established clear service and product ownership, with decision making appropriately devolved to service team level
  • the use of agile project management tooling such as Jira and Confluence has been embedded in the team, allowing easier and more proactive delivery management as well as providing a strong knowledge base for future service team members

What the team needs to explore

As the service continues, the team needs to:

  • maintain the momentum of iteration and test at appropriate intervals whether more significant transformations of the service are required as time passes and user requirements change

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is maintaining a backlog and holding regular challenge sessions, including service reviews and Programme Increment planning, to agree priority items for iteration with the business
  • the team evidenced that releases and the completion of backlog items are occurring with reasonable regularity and pace
  • opportunities for future major iterations have been identified and the team intends to bid for top-up funding to enable delivery

What the team needs to explore

As the service continues, the team needs to:

  • evidence greater collaboration with policy colleagues and agree a roadmap for legislative changes to unblock key future iterations
  • improve the use of performance analytics as an evidence base for iteration
  • challenge existing ways of thinking about how the service has historically operated and give greater consideration to more significant changes to the surrounding business support processes

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has started to adopt native cloud functions for simpler development and support of the solution in future; this could make any future migration of the solution between cloud hosting providers more difficult, but that risk is likely outweighed by the ongoing benefits
  • infrastructure as code and automation are both used to ensure reliable and fast deployment of new versions and to aid recovery and version control (see the sketch after this list)
  • there is good use of standard tools for monitoring infrastructure performance, service availability and cloud service costs
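
To illustrate the kind of approach described above, here is a minimal infrastructure-as-code sketch. The report does not name the team’s tooling or cloud provider, so this assumes AWS CDK in Python purely for illustration; the stack and resource names are invented.

    # Minimal infrastructure-as-code sketch, assuming AWS CDK (Python).
    # The report does not name the team's tooling; names here are invented.
    from aws_cdk import App, Stack, aws_s3 as s3
    from constructs import Construct

    class ArtefactStack(Stack):
        """Declares a versioned bucket for deployment artefacts, so every
        release is recorded and can be rolled back -- the 'recovery and
        version control' benefit the report mentions."""
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            s3.Bucket(self, "DeploymentArtefacts", versioned=True)

    app = App()
    ArtefactStack(app, "ArtefactStack")
    app.synth()  # emits a CloudFormation template for repeatable deployment

Because the whole definition lives in version control, deployments become reviewable, repeatable and easy to roll back.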

What the team needs to explore

As the service continues, the team needs to:

  • continue the adoption of serverless and low-code functions and decommission traditional servers. This should simplify maintenance and continuous improvement and may reduce costs

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the DPIA is kept under regular review
  • WAF protection is in place and actively monitored
  • there are regular penetration tests and IT health checks
  • code is scanned and checked before being published

What the team needs to explore

As the service continues, the team needs to:

  • consider more frequent vulnerability monitoring of external interfaces, starting with making full use of NCSC’s free Web Check service

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team publish all code on GitHub
  • the team recognise the risks of coding in the open and so code is developed in private then put through a rigorous review process before publication

What the team needs to explore

As the service continues, the team needs to:

  • keep under review the possibility of coding in the open within a regime of tighter coding standards

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the solution is based on standard design patterns
  • the team has a good understanding of their unique components
  • there is some standardisation and sharing of code elements
  • the team is starting to work more closely with other organisations to share good practice and publicise their contribution
  • the team is committed to continuous improvement and plans involvement in government API groups

What the team needs to explore

As the service continues, the team needs to:

  • continue to build closer collaboration with organisations using similar technology

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a good range of testing including functionality, security and performance
  • there is a strong commitment to continuous improvement

What the team needs to explore

As the service continues, the team needs to:

  • consider strengthening the link between user feedback and setting priorities for continuous improvement

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team appreciate that the impact of the service being offline for a short time is usually just a minor inconvenience to users
  • there is resilience and redundancy built into the infrastructure to reduce the risk of unplanned service outages

What the team needs to explore

As the service continues, the team needs to:

  • consider strengthening plans for issuing notifications in suitable places (VOL website, DVSA website, DVSA press office) should the system be offline unexpectedly
  • consider – subject to user demand – developing apps to allow applications and updates to be created offline and synchronised later

12. Make sure users succeed first time

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is conducting user research and using that to shape design decisions and drive regular improvement
  • the service includes a joined-up phone channel which allows assisted digital users to access the service
  • the team have implemented digital features which help make the journey more seamless for users, such as pulling data through automatically, responsive questions and extended time-out periods
  • the team is developing an upfront tool to help clarify eligibility for the service
  • the team is running a user-led ‘Get your applications right first time’ project to address cases where users are not succeeding first time
  • the team has had an external accessibility review of the most popular service journey and is addressing points of failure against compliance with WCAG accessibility standards
  • the team runs new features through accessibility tools and uses screen reader analysis, colour contrast and colour blindness checks
  • the team has contact with staff network groups and industry, enabling consultation and testing with people who have a variety of accessibility needs

What the team needs to explore

As the service continues, the team needs to:

  • address the issues highlighted by the substantial number of digital users who are making errors and submitting incomplete applications
  • consider that extensive guidance on separate pages might not be the only or best way to increase the number of users who succeed first time
  • conduct an in-depth usability and content review to understand exactly which pages and page elements are blocking users from completing their journey independently; if possible, repeat this review before the release of each major iteration
  • review content on the service start page and identify any critical information to include there that could help users succeed
  • consider where terminology and the placement of elements on the page might be confusing users
  • find alternative ways to distil and break up guidance so it can be included directly on relevant pages; for example, consider inset text, which stays on the page, to give more specific information on what different options mean
  • review the presence of non-interactive radios, for example the way the ‘Where will you operate your vehicles’ element is presented. Through usability testing, find out if re-phrasing these elements as statements makes the journey easier to navigate
  • explore further use of data validation to reduce errors, and see if it is possible to identify more points in the journey where data already held by government can be pulled through automatically, reducing the amount of information a user has to complete (a minimal sketch follows this list)
  • review the speed at which points of failure against the accessibility standard are being addressed and see how improvements can be made faster
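
As a sketch of the kind of upfront validation suggested above, the snippet below checks two hypothetical application fields before submission. The field names, error messages and the registration-mark pattern (current-style GB marks only) are assumptions, not the service’s actual rules.

    # Illustrative upfront validation for a licence application form.
    # Field names and rules are hypothetical, not the service's actual schema.
    import re

    # Matches current-style GB registration marks such as "AB12 CDE" only.
    VRM_PATTERN = re.compile(r"^[A-Z]{2}[0-9]{2} ?[A-Z]{3}$")

    def validate_vehicle(form: dict) -> dict:
        """Return a field -> error message map; an empty map means valid."""
        errors = {}
        vrm = str(form.get("registration_mark", "")).strip().upper()
        if not vrm:
            errors["registration_mark"] = "Enter the vehicle registration mark"
        elif not VRM_PATTERN.match(vrm):
            errors["registration_mark"] = "Enter the registration mark in the correct format"
        weight = str(form.get("gross_plated_weight", ""))
        if not weight.isdigit() or int(weight) <= 0:
            errors["gross_plated_weight"] = "Enter the gross plated weight in kilograms"
        return errors

    print(validate_vehicle({"registration_mark": "AB12 CDE", "gross_plated_weight": "7500"}))  # {}

Catching these errors before submission reduces the number of incomplete applications caseworkers have to return.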

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the GOV.UK Prototype Kit to design, develop and test service improvements
  • the team is maintaining a backlog of work to keep them up to date with the standard
  • in many cases the team is using standard styles, components and patterns from the GOV.UK Design System; where they have additional needs, new features are being developed based on user insight and design need

What the team needs to explore

As the service continues, the team needs to:

  • wherever possible use standard GOV.UK styles, components and patterns when implementing new features to the service. When designing new features ensure the team is actively finding ways to stick to the standard
  • when it’s necessary to deviate from the GOV.UK design system, be sure components are fully tested with users and meet accessibility requirements
  • where the team have developed new components and patterns that could be valuable to other teams, always contribute these back to the GOV.UK design system through the GOV.UK design system community

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has evidenced very high take-up of the digital service, a testament to its hard work engaging with the industry through attendance at trade fairs and relevant trade and interest events
  • non-digital routes through the service remain available, allowing users who either do not wish to use online services or who have problems accessing them to complete the transactions they need
  • testing has been completed with users with accessibility needs

What the team needs to explore

Before their next assessment, the team needs to:

  • remediate the remaining accessibility issues in order to bring the service into compliance with the required WCAG AA standard. It was noted during the assessment that the team are committed to delivering against this within the next couple of months
  • continue to explore methods of assistance that can be offered to users who find digital technology a challenge, including reviewing the guidance accompanying the service to ensure it is easy to understand and accessible

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has the tools and processes in place to collect performance data for the mandatory performance indicators and management information to run the service
  • performance data has been collected throughout the service’s lifecycle giving the team an extensive data source to analyse historic trends
  • the team has plans to improve their data collection through use of automation and self-service dashboards to make data more readily available with less manual effort

What the team needs to explore

As the service continues, the team needs to:

  • review existing sources of performance data to assess where they can be improved or expanded. This should include reviewing user satisfaction surveys to consider including short-term questions investigating specific trends and issues
  • collect and use data on additional performance indicators to build a comprehensive view of the whole service

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified performance indicators for the digital service including measuring transaction volumes, user satisfaction, completion rates and digital take-up. They also monitor average processing times to understand the end-to-end service
  • they have the tools in place to generate automated reports and interactive dashboards that enable users within the team and wider organisation to review performance data and act on insights; they have recently introduced automated alerts when performance deteriorates below a threshold, enabling faster investigation and action (see the sketch after this list)
  • the team has a segmentation strategy for analytics to identify and investigate trends by user type and transaction. This is combined with user research to understand user behaviour, their challenges and test new approaches
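
As an illustration of the automated alerts described above, the sketch below flags any metric that falls below a floor. The metric names and thresholds are invented for the example; the report does not describe the team’s actual implementation.

    # Illustrative threshold alerting; metric names and floors are invented.
    THRESHOLDS = {
        "completion_rate_pct": 25.0,    # alert if completion rate drops below 25%
        "user_satisfaction_pct": 80.0,  # alert if satisfaction drops below 80%
    }

    def check_thresholds(latest: dict) -> list:
        """Return one alert message per metric that is below its floor."""
        return [
            f"ALERT: {metric} at {latest[metric]:.1f} is below the floor of {floor:.1f}"
            for metric, floor in THRESHOLDS.items()
            if latest.get(metric, float("inf")) < floor
        ]

    for message in check_thresholds({"completion_rate_pct": 22.5, "user_satisfaction_pct": 86.0}):
        print(message)  # ALERT: completion_rate_pct at 22.5 is below the floor of 25.0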

What the team needs to explore

As the service continues, the team needs to:

  • review and improve completion rate performance, as this has been consistently low (25-40%). The team explained that this reflects how users interact with the service: they start the process to trigger a required timeframe, before they have all the required information. It could, however, suggest that users are unable to complete successfully or that there is a problem with how the indicator is measured (see the sketch after this list)
  • review their performance framework and identify additional performance indicators to evaluate service performance. This could include metrics such as avoidable contact rate, first contact resolution, average interactions per case, and average time cases are in a pending state
  • make greater use of performance data to iteratively improve the service as part of their research and design process. The team should have clear examples of identifying issues in performance, making changes to the service, and measuring the results to assess if the changes are an improvement. A new performance analyst has recently joined the team with plans to improve this through regular performance review sessions
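
To make the measurement question in the first point concrete, the sketch below shows how the indicator changes if deliberate early starts are deduplicated before counting. All figures are invented; only the 25-40% band comes from the report.

    # Invented figures; only the 25-40% band is taken from the report.
    def completion_rate(starts: int, completions: int) -> float:
        """Completed transactions as a percentage of started transactions."""
        return 100 * completions / starts if starts else 0.0

    raw = completion_rate(starts=4000, completions=1200)      # 30.0%, inside the reported band
    deduped = completion_rate(starts=2400, completions=1200)  # 50.0%, one start per applicant
    print(raw, deduped)

If deduplication moves the figure substantially, the low rate is partly a measurement artefact; if not, it points to a genuine usability problem.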

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has an established dashboard on the Performance Platform with frequently updated data. For some metrics, the service has been providing data since 2016
  • they are monitoring the performance data published to identify trends and issues that need further investigation, such as the deterioration in user satisfaction that they identified as being a result of case processing delays caused by COVID-19

What the team needs to explore

As the service continues, the team needs to:

  • investigate whether data can be collected on cost per transaction and how they can improve it. This is one of the four standard performance indicators, so it should be published on their service dashboard on the Performance Platform. They should review the guidance on Measuring cost per transaction (a worked example follows this list)
  • consider what additional metrics can be published onto the service dashboard to be more open and transparent on service performance
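
For reference, cost per transaction as defined in the GOV.UK guidance is the total cost of providing the service divided by the number of completed transactions. The figures in the sketch below are invented for illustration.

    # Invented figures; the formula follows the GOV.UK definition of the metric.
    def cost_per_transaction(total_cost_gbp: float, completed_transactions: int) -> float:
        """Total cost of running the service divided by completed transactions."""
        return total_cost_gbp / completed_transactions

    print(f"£{cost_per_transaction(1_500_000, 120_000):.2f} per transaction")  # £12.50 per transaction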

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been tested with the Senior Traffic Commissioner, who has overall oversight and responsibility for the service. This is likely an appropriate level for routine testing with senior officials

What the team needs to explore

Before their next assessment, the team needs to:

  • consider arranging a single meeting with an appropriate Minister as the service enters the Live phase to agree any additional requirements or priorities

Published 25 August 2022