Apply for duplicate log book (V5C) beta assessment report
The report for the Department for Transport's 'Apply for a duplicate log book (V5C)' beta assessment on 20 October 2020
From: Central Digital and Data Office
Assessment date: 20/10/20
Stage: Beta
Result: Met
Service provider: Driver and Vehicle Licensing Agency (DVLA)
Service description
This service enables registered vehicle keepers to apply for a duplicate V5C registration document (log book) through an online channel, as opposed to the current paper (form V62) and telephone processes. The information provided will be used to issue a replacement V5C to the registered keeper when the original has been lost, stolen or destroyed, or when the vehicle has been involved in an accident and has been identified as Category S salvage. There are no changes to the legal requirements: the service is being built under the current regulations, which require the registered vehicle keeper to be in possession of a V5C registration document (log book) for the vehicle on which they are recorded.
Service users
This service is for a keeper of a vehicle who needs to apply for a duplicate copy of their V5C registration certificate (log book) in compliance with the law.
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team did user research with a diverse range of users, ensuring research was inclusive of users within the assisted digital spectrum and users who rely on assistive technology to go online
- the team used a number of methods to uncover in-depth user needs and pain points, including desk research, survey analysis, in-depth interviews, and moderated and unmoderated usability sessions
- the team tested error pages as part of the unhappy path of the service journey
- the team ran 3 rounds of usability sessions with 14 users, uncovering very few issues and finding the service very simple to use
- the team uncovered support needs, identifying where users would go if there are any issues, for example friends or family members
- users would also contact the customer contact centre with enquiries
- the team tested the end-to-end journey with users to address any additional pain points
What the team needs to explore
Before their next assessment, the team needs to:
- conduct all planned research and iterate the service journey accordingly, for example around the vehicle identification number (VIN)
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team had considered where this service sits within the greater suite of services with an eye towards a digital logbook in future
- the feedback from related services was examined for points relating to V5C
- users’ existing data is surfaced for them so they can confirm rather than re-enter
- by working with GDS, the team was able to improve the start page to try to stop users going through and paying if the service would not apply to them
What the team needs to explore
Before their next assessment, the team needs to:
- explore whether more can be done to guide users who are not on a simple duplicate journey - those who need to make a change before requesting a duplicate have a much harder path
- consider other places to join up, for example vehicle tax, and keep those conversations going
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team did research with the call centre to make sure users' questions are understood and addressed
- the team has tested both mobile and laptop as standard journeys and found no issues
- the team has a plan to update the paper form rather than withdraw it, and currently nudges users online through phone system messaging and in-person guidance at the Post Office
- the team ensured that the final page says the transaction is being processed rather than complete, and was able to explain why a transaction might fail in the back end and how the user would be contacted
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- while people may leave the service to check their vehicle identification number (VIN), the service does not time out if they choose to do so partway through
- DVLA is considering how terminology changes over time; for example, although the document is no longer in book form, people still refer to it as a log book, so that is the word chosen
- a customer expectation survey is run periodically to gather information on users' digital versus non-digital preferences and their thoughts on this service
- GOV.UK patterns and inline verification are used
What the team needs to explore
Before their next assessment, the team needs to:
- continue to improve guidance to help people find their VIN
- continue testing error messaging and monitoring failure demand
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team tested the end-to-end service journey with users of assistive technology, for example screen readers, and with users with dyslexia
- a good support model is in place for users who are less confident with online services
What the team needs to explore
Before their next assessment, the team needs to:
- complete their assessment at the Digital Accessibility Centre (DAC), which has been scheduled but was not available before this service assessment
- ensure any recommendations from DAC are actioned
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the service owner is empowered and able to make decisions to prioritise the work to improve the service
- a core team is working on the service with all digital, data and technology disciplines included
- the team is working closely with those managing other channels such as policy and those managing the paper channels
- the same team will continue during beta to ensure consistency and continuous improvement of the service
What the team needs to explore
Before their next assessment, the team needs to:
- start planning for their live assessment, taking on board the recommendations from the beta assessment
- have a clear plan for how the service will be run during live, including moving it over to the legacy replacement service
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team apply agile principles to their work
- a clear governance process is in place to allow the team to prioritise their work and move forward very quickly
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- since launch, the team has already improved the service in response to user feedback a number of times
What the team needs to explore
Before their next assessment, the team needs to:
- continue to iterate in response to user research
- continue to monitor user needs in respect of the amount of information on the start page and iterate as necessary
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team obviously take security very seriously
- the team has completed a number of activities, including an IT health check, penetration testing and a data protection impact assessment (DPIA)
- the team will bring this new solution into existing security monitoring
- data will be encrypted in transit and at rest
- the Payment Card Industry Data Security Standard (PCI DSS) has been factored in from the beginning
What the team needs to explore
Before their next assessment, the team needs to:
- consider the use of outbound email protection such as Domain-based Message Authentication, Reporting and Conformance (DMARC) or Sender Policy Framework (SPF)
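For context, email protection of this kind is normally configured through DNS TXT records on the sending domain. A minimal illustrative sketch follows; `example.gov.uk` and the report address are placeholders, not DVLA's actual domains or policy:

```
; SPF: declare which servers may send mail for the domain
example.gov.uk.        IN TXT "v=spf1 include:_spf.example.gov.uk -all"

; DMARC: tell receiving servers what to do with mail that fails
; SPF/DKIM checks, and where to send aggregate reports
_dmarc.example.gov.uk. IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.gov.uk"
```

A typical rollout starts with `p=none` (monitor only) and tightens to `p=quarantine` or `p=reject` once the aggregate reports confirm legitimate mail passes.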
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- key metrics had been developed via discussion between the performance analyst (on a centralised analytical team) and the core team to identify bespoke metrics that show whether the overall purpose of the service is being met, as well as required metrics
- a range of evidence from research, customer feedback and data are used to understand what’s working and not working for users
- quality assurance of data using external sources such as the tech tracker is taking place
- achieving sustainability - looking at paper and carbon emissions - is measured
- the team understands the impact of the tracking consent change to opt-out by default rather than opt-in, by baselining Google Analytics against back-end data systems
- caveats in Google Analytics metrics such as location are understood, and the team had considered reasons for including these
- data is used to understand the user journey flow, with well-named drop-out points that are clear to understand
- the team is working with the information assurance group to balance data security options against creating a unique identifier to measure completion rate
- impact on calls following introduction of the digital service is understood
- reasons for the current higher demand are understood
What the team needs to explore
Before their next assessment, the team needs to:
- show how data and evidence are driving iterations of the service
- show how iterations of the service are measured: form a hypothesis of expected results before a change is made, for example to content, then measure the impact to understand whether the change was successful. For example, what impact did the team expect adding a 'try again' button to the failure page to have, which metrics are relevant to measuring this, and did those metrics improve or decline after the change?
- continue investigating how to measure completion rate with a split journey for payments, in particular where the start and end points of the journey are; if the current start point is maintained, continue focusing on why so many users drop out of the journey to look for the VIN
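The hypothesis-then-measure approach recommended above can be illustrated with a simple before/after comparison of completion rates. This is a generic sketch using made-up numbers, not DVLA data; the 'try again' button scenario and all figures are hypothetical:

```python
# Illustrative only: compare completion rates before and after a
# hypothetical change (e.g. adding a 'try again' button), using a
# two-proportion z-test. All counts below are invented examples.
from math import sqrt, erf

def two_proportion_z(successes_a, total_a, successes_b, total_b):
    """Test whether rate B differs from rate A; returns (z, two-sided p)."""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothesis: the change increases completion rate.
z, p = two_proportion_z(successes_a=8_000, total_a=10_000,   # before: 80%
                        successes_b=8_300, total_b=10_000)   # after: 83%
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point is the discipline rather than the statistics: state the expected direction of change first, pick the metric up front, then let the measured result confirm or refute the hypothesis.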
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has selected from existing technology already used extensively within DVLA
- the team is making extensive use of cloud technologies, along with tooling that supports a continuous integration and continuous delivery (CI/CD) pipeline
- the team recognises that some decisions have been shaped by the limitations of the systems it interfaces with, for example the use of Simple Object Access Protocol (SOAP)
- the team has made extensive reuse of existing modules within DVLA, for example the use of the payment broker to provide the interface to GOV.UK Pay, and the reuse of the email gateway
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working towards making their source code open and uses GitHub
- the team use open and common components
- the team has a clear view of what can be open (the front end) and what cannot (the back end)
What the team needs to explore
Before their next assessment, the team needs to:
- work towards a greater level of openness - this is a recommendation only and not a failing of what has been achieved so far
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team realises the benefit of open source and making things open
- the team is making use of existing patterns and common components within DVLA as well as wider GDS solutions such as GOV.UK Pay
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using cloud technology and making use of multiple availability zones
- the team has accounted for the service not being available and can revert to other systems and processes as required
- the team has factored in its dependency on third-party systems and can switch providers as required, for example multiple payment service providers via the common payment gateway