Change your address on your vehicle log book (V5C) beta assessment report

The report for the Department for Transport's change your address on your vehicle log book (V5C) beta assessment on 20 September 2020

Service Standard assessment report

Change your address on your vehicle log book (V5C)

From: Government Digital Service
Assessment date: 2 September 2020
Stage: Beta
Result: Met
Service provider: Driver and Vehicle Licensing Agency

Service description

Service users

This service is for registered vehicle keepers who want to change the address recorded on their vehicle log book (V5C).

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has demonstrated a good understanding of two potential user types and their common need to update vehicle address details in a log book, a need that stems from a real service problem
  • the team carried out an impressive amount of remote research under pressure. The team used a range of appropriate and mixed research methods, including 29 usability tests with participants who ranged in age, technical confidence and ability, and location across the UK
  • the team has managed to do research with users with cognitive and neurological access needs and has mapped user personas from GOV.UK along the service journey - the service has also undergone a WCAG audit, and the team has conducted usability tests on a fully coded prototype
  • user research was a team sport: the user researcher made sure the whole team was involved in research sessions and analysis, and findings were regularly played back to the service team - the team also worked closely with the contact centre, looking at speech analytics, contact centre messages, feedback and user satisfaction scores

What the team needs to explore

Before their next assessment, the team needs to:

  • do more research with fleet operators and people who live in new-build homes in the UK to get a deeper understanding of potential pain points, unhappy paths and needs - are there more user groups that haven’t yet been identified?
  • do more research with users with access needs across a range of devices and tools - look to include users with visual, motor/mobility, auditory, neurological/cognitive access needs
  • do more research with users who score low on the team’s digital inclusion scale - remote working makes this difficult at present
  • do home visits and observations of users going through the offline journey and at dealerships, to build a greater understanding of their journeys and needs - remote working means this isn’t feasible at present

Considerations, not recommendations:

  • it would be good to consider DVLA contact centre staff as users, especially as they are involved in both the offline and online journeys - do contact centre advisors have needs too?
  • be mindful that a partnership with a single recruiter doesn’t limit the breadth of users - participants recruited from one recruitment agency can become too familiar with, or primed for, research with government services, so consider using different approaches to recruitment

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a clear understanding of drivers and vehicle owners across the multiple journeys a user might take, and how this service works alongside other services offered by DVLA to solve the whole problem
  • the team is constantly feeding insights from developing this service into the wider digital programme, which is looking at the services DVLA delivers across the drivers and vehicles portfolio

What the team needs to explore

Before their next assessment, the team needs to:

  • keep evolving the understanding of types of users and their needs when it comes to updating a vehicle log book as a whole service, not just updating vehicle address details - for example, people who need to update their name or the colour of their vehicle
  • do more research looking at transaction confirmation and any potential unhappy paths - for instance, what if someone doesn’t supply an email address and gives an incorrect home address?

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is supported by an offline call centre journey for users who are unable to complete the online journey
  • the service is still accessible via the postal channel

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is consistent with other government services and the team has used existing interaction patterns
  • where the team has needed to introduce new patterns, it has tested and iterated designs based on user feedback and intends to feed these back to the design community
  • the service team has clearly thought about the cases in which the user can’t complete their task online and has created new pages to guide the user on what to do next

What the team needs to explore

Before their next assessment, the team needs to:

  • do more research around mobile testing and the offline parts of the journey - the team is planning to update the V5C log book to include a URL for the service, and it would be ideal to test and iterate a few different designs to encourage channel shift

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated their commitment to provide a fully accessible service - they have conducted extensive user research into accessibility and assisted digital users

What the team needs to explore

Before their next assessment, the team needs to:

  • for the address lookup, carry on working with the third-party provider to reduce the delay between new houses being built and the new addresses being added to the lookup register
  • test the end-to-end journey including non-digital parts to carry on their commitment to accessibility

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a service owner with a deep understanding of the context of the users affected by this service, and with a wide range of knowledge and experience of delivering services that meet the Service Standard
  • the team took on board the recommendations given at the first peer review assessment and has made considerable progress towards meeting them - the team recognises the constraints that the service places on its users and is actively working to address these
  • the team includes a blend of policy professionals and external stakeholders who are kept informed as part of delivery - the core team covers all the DDaT disciplines, and the service owner is empowered to make decisions to improve the service

What the team needs to explore

Before their next assessment, the team needs to:

  • start planning for the live assessment of this service within the recommended time frames

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a clear understanding of using agile techniques and principles which are embedded in their ways of working
  • the team has done extensive research and preparation for their assessment - the same team will be continuing with the service into business as usual and continuous improvement
  • the team has an agile governance structure which supports rapid delivery and change to meet additional needs as a result of research

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to clearly articulate why they built the service, the user needs the service is meeting and the plan for the future
  • the team is constantly reviewing user feedback and looking to push out improvements as quickly as possible to the service
  • the team has proved they’re able to iterate and push out updates without significant disruption to users

What the team needs to explore

Before their next assessment, the team needs to:

  • continue exploring and taking strides towards making the service available 24/7 - they should consider what additional effort is needed to implement this by the live assessment

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has completed a data protection impact assessment (DPIA) and engaged with the DPIO on the data retention period
  • the team has carried out a risk assessment and penetration testing
  • data is encrypted in transit and at rest, and the service uses role-based access control (a minimal sketch of such a control follows this list)
  • the team has implemented lessons learned from other services
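As an illustration of the role-based access control mentioned above, here is a minimal Ruby on Rails-style sketch. The controller, roles and current_user helper are assumptions made for the example; the report does not describe the team's actual implementation.

    # Illustrative role-based access control guard. The controller name,
    # roles and current_user helper are assumptions, not DVLA's code.
    class AdminController < ApplicationController
      ALLOWED_ROLES = %w[service-admin support-agent].freeze

      before_action :require_permitted_role

      private

      def require_permitted_role
        # Deny access unless the signed-in user holds a permitted role
        head :forbidden unless ALLOWED_ROLES.include?(current_user&.role)
      end
    end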

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the main success measures are derived directly from the aims of the service, such as digital take-up (% digital), user satisfaction levels and reducing the volume of ‘progress chasing’ calls
  • the volume of progress chasing calls has fallen from 50% to 15% since the launch of the service
  • the team is working to calculate the cost savings from the digital service
  • the team had practical examples of where data has led to changes in the product

What the team needs to explore

Before their next assessment, the team needs to:

  • look to implement more features from Google Analytics that will help the team understand how the service is used - for example, showing the number of form validation errors and where they occur (see the sketch after this list)
  • look to compare like-for-like user satisfaction scores for different channels across the full process (after the log book has been received)
  • make further progress on calculating costs and savings
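Because the service works with JavaScript turned off, one way to count validation errors is to send an event server-side via Google Analytics' Measurement Protocol (the client-side event API is the more common route). This is a minimal Ruby sketch, not the team's configuration; the tracking ID, client ID and event names are placeholders.

    # Sketch: report a form validation error as a Google Analytics event
    # using the Universal Analytics Measurement Protocol. The tracking ID
    # and event names below are hypothetical placeholders.
    require "net/http"
    require "uri"

    def track_validation_error(field:, message:, client_id:)
      params = {
        v: "1",                       # protocol version
        tid: "UA-XXXXXXX-1",          # hypothetical tracking ID
        cid: client_id,               # anonymised client identifier
        t: "event",                   # hit type
        ec: "form-validation-error",  # event category
        ea: field,                    # event action: the failing field
        el: message                   # event label: the error message shown
      }
      Net::HTTP.post_form(URI("https://www.google-analytics.com/collect"), params)
    end

    track_validation_error(field: "postcode",
                           message: "Enter a real postcode",
                           client_id: "555")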

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using Ruby on Rails for the front end and a SOAP web service (Java) for the back end, deployed on Kubernetes hosted on the AWS platform
  • the service works with JavaScript turned off, adhering to progressive enhancement
  • there is no permanent data storage in AWS - once transaction processing has completed in Redis, the data is stored in the VSS Oracle database (see the sketch after this list)
  • the team has a CI/CD pipeline using Drone and Spinnaker
  • for testing, the team is using Cucumber (Ruby) and an open source component called DVLA-Wizard Flow
  • the team manages all its project documentation and workflows using Atlassian tools (Jira, Confluence, Bitbucket)
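The "no permanent data storage in AWS" bullet describes a transient-state pattern: transaction data lives in Redis only while the transaction is in flight, and the completed record is written to the VSS Oracle database. A minimal Ruby sketch of that pattern using the redis gem follows; persist_to_vss is a hypothetical stand-in for the real back-end call.

    # Sketch: transient transaction state in Redis, permanent record in
    # the VSS Oracle database. `persist_to_vss` is hypothetical.
    require "redis"
    require "json"

    REDIS = Redis.new(url: ENV.fetch("REDIS_URL", "redis://localhost:6379"))
    TTL_SECONDS = 30 * 60  # expire abandoned transactions after 30 minutes

    def start_transaction(id, details)
      REDIS.setex("txn:#{id}", TTL_SECONDS, JSON.generate(details))
    end

    def complete_transaction(id)
      raw = REDIS.get("txn:#{id}") or raise "transaction expired or unknown"
      persist_to_vss(JSON.parse(raw))  # write to the permanent record
      REDIS.del("txn:#{id}")           # nothing persists in AWS afterwards
    end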

What the team needs to explore

Before their next assessment, the team needs to:

  • track progress on replacing the legacy VSS Oracle database (DVLA’s source record) - this will be changed as part of a strategic transformation programme, and the team should add an item to their backlog to track that progress

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using open source technologies wherever possible
  • the team has published the code but, for security reasons, cannot expose any of DVLA’s back-end services

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is reusing existing ‘Get’ services with EVL DMS and VSS Web to avoid creating new validation checks
  • the team is using DVLA’s common components, reusing and integrating with existing services
  • all APIs are SOAP web services (a sketch of a typical call follows this list)
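A typical call from the Ruby front end to one of these SOAP services might use a client such as the savon gem. This is a sketch only; the WSDL location, operation name and message fields below are invented for illustration and are not DVLA's real interface.

    # Sketch: calling an existing 'Get' SOAP service from Ruby with Savon.
    # The WSDL URL, operation and fields are hypothetical placeholders.
    require "savon"

    client = Savon.client(
      wsdl: "https://vss.example.internal/vehicle-service?wsdl",  # placeholder
      log: false
    )

    response = client.call(
      :get_vehicle_details,                      # hypothetical operation
      message: { registration_number: "AB12CDE" }
    )

    puts response.body[:get_vehicle_details_response]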

14. Operate a reliable service

The team has created non-functional requirements (NFRs) and observed that peak demand occurs on Mondays; the service has remained stable and fully operational.

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the Kubernetes platform, which allows auto-recovery and self-healing (a sketch of a typical health check endpoint follows this list) - the team also uses monitoring tools such as Instana
  • application and non-repudiation logs will be stored in the common platform ELK stack
  • application monitoring and alerting metrics will be sent to and stored within Instana (a managed service) and PagerDuty
  • Google Analytics data will be sent to and stored within Google (a managed service)
  • the service will be available from 7am to 8pm, 7 days a week
  • data will be published on the Performance Platform
  • the team provides 24/7 support, using real-time monitoring and alerting to minimise system outages
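Kubernetes' auto-recovery depends on the application exposing a health endpoint for liveness and readiness probes to call. A minimal Rails-style sketch follows; the controller, route and dependency check are assumptions, as the report does not describe the service's actual checks.

    # Sketch: health check endpoint of the kind a Kubernetes liveness or
    # readiness probe would poll. The controller, route and Redis ping
    # are illustrative assumptions, not the service's actual checks.
    class HealthcheckController < ApplicationController
      def show
        REDIS.ping  # fail fast if a critical dependency is unreachable
        render json: { status: "UP" }, status: :ok
      rescue StandardError => e
        render json: { status: "DOWN", error: e.class.name },
               status: :service_unavailable
      end
    end

    # config/routes.rb
    #   get "/healthcheck", to: "healthcheck#show"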

Published 24 August 2022