Duplicate and replacement Driving Licences alpha assessment report
Assessment date: 26/11/2024
Stage: Alpha
Type: Assessment
Result: Amber
Service provider: DVLA
Service description
- the service will allow customers to change the address on their driver record and licence
- customers can also request a duplicate licence if the original has been lost, stolen, defaced or destroyed
- during these transactions, customers will be able to carry out optional tasks, such as changing their photograph, or be prompted to complete mandatory requirements, which differs from the legacy system
- the service may also require a fee before submission; customers will also be required to create a DVLA customer account as part of the transaction
Service users
This service is for…
- customers who hold UK driving licences
- customers who hold provisional UK driving licences
- vocational UK drivers
- UK customers with medical conditions
- users who want to transact online
Things the service team has done well:
- the service re-uses user flows from other DVLA services, which the team has iterated and improved throughout alpha
- there is an internal portal that allows a DVLA operator to see what a user sees, helping to support users and keep them in the digital channel
- the service re-uses many technical components from the wider DVLA estate, including open source code, security considerations and the performance dashboard already in place
- good evidence was provided of iterations to the service, including the change of address journey where an old address is involved, the 'before you continue' page and the GB driving licence page
- the team has worked closely with the Home Office on passport and picture upload to confirm the technology to be used
- the service is available 24/7 for users, using a queuing technique for consistency
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- user research with people with a wide range of access needs. The team has not tested with users with access needs. They must change their approach to recruitment and, before public beta, test the service with:
  - those with a wide range of accessibility needs
  - those who have never been online and never will be
  - those who have been online but no longer are
  - those who want to be online but are unable to
- user research with those who use assistive technology. The team are aware of a gap: although they use DAC for accessibility testing, they have not tested with enough users with accessibility needs. This must be done before they launch into public beta.
- user research on the unhappy paths and exit pages
- research with users on mobile devices
- face-to-face user research
These gaps should be addressed before the service moves into public beta. As things stand, the team do not have the evidence to show the service works for everyone who needs to use it.
- the team's personas featured stock photos and ages. It was unclear whether they were personas (based on multiple users' experiences) or case studies (an in-depth look at one person's experience). It is important to label and represent them correctly to aid interpretation and decision making.
2. Solve a whole problem for users
Decision
The service was rated amber for point 2 of the Standard.
During the assessment, we didn’t see evidence of:
- a clear picture of the end-to-end journey. The team described how these new services co-exist alongside services such as the DVLA account, but this didn't effectively communicate all the steps a user must complete to request a duplicate or replacement licence. Artefacts such as a map of the whole service would communicate this better, demonstrating the team's understanding of the end-to-end journey for users, including aspects not owned by the project team.
3. Provide a joined-up experience across all channels
Decision
The service was rated amber for point 3 of the Standard.
During the assessment, we didn’t see evidence of:
- work to improve non-digital channels. The primary focus for the team has been improvements to the digital service; it wasn't clear how the team are working with other parts of the department to improve non-digital channels as part of this work.
4. Make the service simple to use
Decision
The service was rated amber for point 4 of the Standard.
During the assessment, we didn’t see evidence of:
- testing of the designs with mobile users. The team didn't have statistics showing the percentage of mobile users of their service, but mobile consistently accounts for over 50% of visits to GOV.UK (source: https://insidegovuk.blog.gov.uk/2020/07/23/improving-the-mobile-experience-on-gov-uk/). This highlights the importance of testing the designs on mobile devices and ensuring the experience is equal to that on other devices.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- the team have been unable to test the service with users of assistive technology, such as screen readers
- users who don’t want to create a DVLA account will be sent to the legacy service, which may have accessibility issues.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
7. Use agile ways of working
Decision
The service was rated amber for point 7 of the Standard.
During the assessment, we didn’t see evidence of:
- although the team has access to a pool of user researchers and one is allocated to the team, the user researcher is not part of some of the agile ceremonies (for example, stand-up). The panel recommends that all members of the multidisciplinary team join stand-up to share updates and progress and to link in with the rest of the team.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Next Steps
This service can now move into a private beta phase, subject to addressing the amber points within three months' time and obtaining CDDO spend approval.
To get the service ready to launch on GOV.UK the team needs to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content