Driver examiner service beta assessment report
The report for the Driver and Vehicle Standards Agency's driver examiner service beta assessment on 21 November 2019.
From: GDS
Assessment date: 21/11/2019
Stage: Beta
Result: Met
Service provider: Driver and Vehicle Standards Agency (DVSA)
Previous assessment reports
Service description
The service provides learner drivers with a driving test so that they can become full licence holders and drive a vehicle unaccompanied. The current service starts with driving examiners receiving a daily journal with details of the candidates they will be testing during the working day. These details are manually copied onto seven different DL25 test reports, one for each candidate. The examiner then greets the candidate in the waiting room to make the necessary identity checks.
The driving examiner conducts the test, including manoeuvres and necessary legal requirements, recording the faults on a test report. Details of 'show me / tell me' questions, the vehicle registration, driving instructor details and so on are also recorded on the test report. The test ends with a result and debrief provided to the candidate, along with a copy of the test report. A full and thorough write-up of weather conditions, candidate description, faults and so on is then completed back in the office. On successful tests the candidate will receive a full licence from the Driver and Vehicle Licensing Agency (DVLA).
A key factor in keeping Britain’s roads safe is ensuring that all drivers have the skills, knowledge and understanding to be a safe and responsible driver.
There are different types of tests for different variations of vehicles including cars, buses, lorries, tracked vehicles, tractors and motorcycles.
In a separate service, candidates complete a test to demonstrate theory knowledge and hazard perception skills. Once they have passed, they can book a practical test where an examiner will assess their driving ability and knowledge against specific criteria; this is where our service begins.
The service currently accepts (among other channels) online test bookings; however, for the examiner the process is completely manual and paper-based.
The examiner’s daily schedule and associated candidate data is faxed and printed out at test centres daily, where it is transposed manually onto paper test forms to be used for examinations. During the test, faults are marked on the paper form; once the test is complete the candidate gets the result and feedback is given orally. Back in the office the examiner must write up the test and candidate information manually.
The paper forms are posted to Newcastle upon Tyne for scanning and population into the legacy system of record. This scanning process has a high failure rate and high manual intervention need.
Following the scanning, DVLA issues a full driving licence to successful candidates.
Service users
Primary users
- 1,800 Driving Examiners
- 175 Local Driving Test Managers
- 1.8 million driving test candidates
Secondary users
- Approved Driving Instructors
- Corporate Correspondence
- ADI Enforcement Examiners
- Quality Assurance
- Trainers
- Trainees (New Entrants)
1. Understand user needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team understood the end-to-end journey, which includes multiple settings, behaviours and transitions
- the team showed a very good understanding of the complex physical context of the driving test, including the transitions from office to car park to vehicle, and showed a deep commitment to understanding this environment
- the team took care to investigate the ergonomic challenges of using an iPad to record outcomes in the driving test, which were raised in the previous assessment; took expert advice from an academic psychologist on cognitive load and attentional switching, and then devised a programme of work to assess and measure examiner behaviour with the new device
- the team worked hard to understand examiner psychology, including fears and blockers to them using the digital service, and used research both to evaluate design and to build engagement with the examiner community
- the team, based on their understanding of driver needs and environmental constraints, improved not only test designs, but also device settings, device handling and device carrying/protection
- the team explored accessibility issues and identified design principles to shape design development
What the team needs to explore
Before their next assessment, the team needs to:
- continue to conduct research with new groups and new settings
- continue to do work over the next few months, particularly with less confident or reluctant users. On the whole, it seems as though the new service will be well received
2. Do ongoing user research
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team completed a substantial volume of research across the country, with different groups, and succeeded in ramping up from a small controlled group to larger open groups
- the team conducted research aimed at addressing their biggest challenges, such as reluctant users
- the team used multiple methods, including usability testing and device analytics, to understand the performance of their designs and come up with improvements
What the team needs to explore
Before their next assessment, the team needs to:
- show how they can continue to improve and iterate with the live service, with a smaller and likely less experienced team
- show how to measure performance clearly and meaningfully, going beyond user satisfaction
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a service owner with deep understanding of the context of the user groups affected by the service (in particular driving examiners)
- the service owner clearly has sufficient authority to make day-to-day decisions to improve the service - supporting and empowering a highly engaged and collaborative cross-functional team
- the team took on board previous alpha recommendations, ensuring each delivery team has its own user researcher and that a content designer has also been brought on board
- the team has done extensive work in response to prior recommendations, and built the internal and external partnerships required to research, design, build and successfully roll-out the service in a matter of months – whilst ensuring user needs remain at the heart of decision making about the service
- the team consulted academic experts to inform their user research methodology on in-car safety and device use, and continuously engaged the network of driving examiners nationwide throughout development. The panel considers this work exemplary in spirit and practice, and encourages DVSA to share the story of this work (and the lessons learned from it) both internally and across government
What the team needs to explore
Before their next assessment, the team needs to:
- make and implement plans to maintain their rigorous focus on user needs throughout the transition to Live and beyond. Whilst obviously benefitting from the skilled support of their co-located contractor teams, the team’s stated timescales for transition to a separately-located, permanent ‘Continuous Improvement’ team are of some concern. In particular this offers relatively limited time to ensure that the deep knowledge the team has gained is internalised (and any necessary new skills developed) within the Continuous Improvement team, who will continue to run, maintain and improve the service once contractor team members move on
4. Use agile methods
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working with a clear understanding of agile ways of working, tools, techniques and disciplines
- the team was well-prepared and able to discuss in detail the extensive user research they had done, and the impact this had on design decisions related to the service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the team’s strong agile disciplines and ways of working are adopted and maintained by the Continuous Improvement team - allowing the end-to-end service to remain responsive to change going forward
- ensure lessons learned from the delays the team reported at MVP stage are collaboratively fed back, so that the relevant governance and assurance approaches are proportionate, well-suited to agile working and well-understood within the department - avoiding unnecessary delays to future service design and development work
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team was able to explain in considerable detail what they built and why, demonstrating a clear understanding of how their service is built to meet user needs
- the team was able to describe the lifecycle of a user story from user research to production and explain the process for identifying and prioritising insights from user research
- the team proved it has the ability to deploy software frequently with minimal disruption to users
- the continuous integration and continuous deployment infrastructure selected by the team is appropriate and supports frequent updates. The weekly hotfix releases seem to be working well; the Continuous Improvement team should continue to keep a frequent cadence of releases
- the Continuous Improvement team at DVSA seems to be well-positioned to take on the operation and iteration of the service from a development perspective
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the Continuous Improvement team remains able to research, design, test and deploy significant design iterations to the end-to-end service. The panel understands that this team has so far spent more effort supporting ongoing technical fixes than continuously improving the design. Whilst in future the ‘core’ service design may not significantly change – the team cannot afford to assume that this means design is now effectively ‘done’. Design improvements should be planned and folded into the process of continuous improvement. The team should plan accordingly
- consider and address the risks of the current fixed year-end funding envelope for the team and develop contingencies for these. Specifically, they should address the risks that: 1) the remaining work (addressing the other, smaller licence categories) may be rushed, rather than carried out with appropriate attention to detail; and 2) insufficient time may be available to onboard and upskill the Continuous Improvement team – limiting their ability to continue improving the service going forward. The team should ensure these contingencies are in place for user research, design, development and performance analysis capabilities in particular - and they should not seek to return for a Live assessment until they can demonstrate their ability to run the service sustainably. DVSA may wish to reconsider the fixity of funding for some part of the current team – at least as far as is required to ensure that the Continuous Improvement team can preserve (in a sustainable way) the deep knowledge of users and the end-to-end service that has been developed. Given the extent and depth of work done by the team to date (which the panel wholly commends) - to lose this prematurely for want of a few weeks would be at best short-sighted
- ensure that the security and privacy representatives on the project remain closely connected to the Continuous Improvement team so that their viewpoints can be included in subsequent releases
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the use of serverless technology in particular seems to be a good choice and provides a great deal of security and performance resilience with limited ongoing effort from the team
- the use of the common TypeScript language in the front end and back end of the application supports good skills transfer and allows feature developers to code end-to-end journeys (see the illustrative sketch after this list)
- the application seems to be reasonably well insulated against changes to the target deployment platform, with a minimal amount of rewriting needed for new versions of iOS or even other operating systems. However, this is an area for DVSA to continue to monitor closely, and to create any new branches in good time to support planned updates to deployed hardware and software versions
- the use of speech to text was a very good example of taking advantage of a sophisticated hardware function
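As a purely illustrative sketch (not the team's actual code; all module and type names below are hypothetical), the pattern the panel describes lets the iPad front end and a serverless back end share a single TypeScript domain type, so the same definition is used end to end:

```typescript
// shared/test-result.ts -- hypothetical domain type shared by front end and back end
export interface TestResult {
  slotId: string;
  outcome: 'pass' | 'fail' | 'terminated';
  drivingFaults: number;
  seriousFaults: number;
  dangerousFaults: number;
}

// backend/submit-result.ts -- hypothetical AWS Lambda handler reusing the same type
import type { APIGatewayProxyHandler } from 'aws-lambda';
import { TestResult } from '../shared/test-result';

export const handler: APIGatewayProxyHandler = async (event) => {
  const result: TestResult = JSON.parse(event.body ?? '{}');
  // ...validate and persist the result here...
  return { statusCode: 201, body: JSON.stringify({ received: result.slotId }) };
};
```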
What the team needs to explore
Before their next assessment, the team needs to:
- consider further enhancements fundamentally enabled by the digital service, beyond merely ‘replacing’ the old paper system. In future, as more digitally native examiners join and know only the digital service, it might be possible to use the iPad hardware to support test administration and monitoring. For example, the presence of a GPS radio, the ability to do inertial mapping and access to the device accelerometer would all seem relevant to the test (the sketch below illustrates the kinds of signals available)
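The panel is not prescribing an implementation here. Purely as an illustration of the signals mentioned, and assuming the application runs in an iOS web view where standard web APIs apply, position and motion data can be read as follows:

```typescript
// Illustration only: standard web APIs exposing positional and motion data.
// Whether these apply depends on how the app is actually packaged on the device.
const watchId = navigator.geolocation.watchPosition(
  (pos) => {
    // latitude/longitude/speed could support route or manoeuvre monitoring
    console.log('position', pos.coords.latitude, pos.coords.longitude, pos.coords.speed);
  },
  (err) => console.warn('geolocation unavailable', err.message),
  { enableHighAccuracy: true } // keep watchId to call clearWatch() when the test ends
);

window.addEventListener('devicemotion', (event: DeviceMotionEvent) => {
  // accelerometer readings could flag harsh braking or cornering during a test
  const a = event.accelerationIncludingGravity;
  if (a) {
    console.log('acceleration', a.x, a.y, a.z);
  }
});
```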
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- there was a clear understanding of threats, including insider threats; these are mitigated through various controls, and monitoring is in place
- the encryption of data at rest and in transit is helpful, with the exception of the minimal bootstrap dataset
- integrating with the existing DVSA AWS IAM security structure is a good choice and should enable the Continuous Improvement team to control the many security zones within the application
What the team needs to explore
Before their next assessment, the team needs to:
- configure Google Analytics to anonymise IP addresses, as these are considered personally identifiable information (a minimal configuration sketch follows)
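A minimal configuration sketch, assuming the service uses the standard gtag.js or analytics.js snippets; the property ID shown is a placeholder, not the service's real one:

```typescript
// With gtag.js: pass anonymize_ip when configuring the property (placeholder ID).
declare function gtag(...args: unknown[]): void;
gtag('config', 'UA-XXXXXXX-X', { anonymize_ip: true });

// With the older analytics.js snippet, the equivalent setting is:
declare function ga(...args: unknown[]): void;
ga('set', 'anonymizeIp', true);
```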
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- all application code from alpha and beta is open and inspectable
- the team is particularly commended for contributing Node.js code back into the npm dependency management ecosystem
What the team needs to explore
Before their next assessment, the team needs to:
- explore whether there are more streamlined ways to manage secrets that do not require an entire GitLab instance; this would need to be considered as part of the larger DVSA secrets management process (one illustrative option is sketched below)
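Purely as an illustration of one such option, and assuming the team's existing AWS estate, a secret could be read from a managed store such as AWS Secrets Manager rather than a self-hosted instance; the secret name below is a placeholder:

```typescript
// Illustration only: reading a secret from AWS Secrets Manager (placeholder name).
import { SecretsManagerClient, GetSecretValueCommand } from '@aws-sdk/client-secrets-manager';

const client = new SecretsManagerClient({ region: 'eu-west-2' });

export async function getSecret(name: string): Promise<string | undefined> {
  const response = await client.send(new GetSecretValueCommand({ SecretId: name }));
  return response.SecretString;
}

// Example usage (hypothetical secret name):
// const dbPassword = await getSecret('dvsa/examiner-service/db-password');
```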
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the REST APIs used represent good practice
- the team is commended for the use of the GOV.UK Notify platform to share test results with candidates by email and letter (an illustrative sketch of this integration follows)
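An illustrative sketch of this kind of integration using the GOV.UK Notify Node.js client; the API key, template ID, email address, reference and personalisation fields are placeholders, not the team's real values:

```typescript
// Illustration only: sending a result email via GOV.UK Notify (placeholder values).
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

export async function emailResult(candidateEmail: string): Promise<void> {
  await notifyClient.sendEmail('TEMPLATE_ID', candidateEmail, {
    personalisation: { test_result: 'pass', test_date: '21 November 2019' },
    reference: 'example-reference',
  });
}
```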
What the team needs to explore
Before their next assessment, the team needs to:
- explore whether useful new open data could be made available via an API based on this service, similar to the MOT service. The panel would like to see DVSA investing in this area
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- there appeared to be extensive user testing of the end-to-end service
- technical test frameworks are appropriate, though there are some difficult areas around the integration with the legacy TARS systems. Technical tests are embedded within the build framework
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that testing includes the failure of dependent services, such as the back-end TARS system, the back-end MI system and the driving test scheduling service (an illustrative failure-path test is sketched below)
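As an illustration of the kind of failure-path test the panel has in mind (all module and function names below are hypothetical, not the team's actual code), a Jest test can stub the legacy dependency and assert that the service degrades gracefully:

```typescript
// Hypothetical Jest sketch: the modules and functions are illustrative only.
import { submitTestResult } from './result-service'; // hypothetical module under test
import * as tarsClient from './tars-client';         // hypothetical legacy TARS client

jest.mock('./tars-client');

test('queues the result for retry when TARS is unavailable', async () => {
  (tarsClient.uploadResult as jest.Mock).mockRejectedValue(new Error('TARS timeout'));

  const outcome = await submitTestResult({ slotId: '1234', outcome: 'pass' });

  // the service should hold on to the examiner's data rather than lose it
  expect(outcome.queuedForRetry).toBe(true);
});
```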
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- a manual paper workaround form has been implemented, aligned with the new design of the iPad app – so minimising the overall cognitive burden for examiners
- the entire iPad application works in an offline mode, which is a significant advantage over pure web applications. Subsequent synchronisation has been considered and tested (a sketch of this pattern follows this list)
- by providing 3G/4G iPads with extra batteries the team has mitigated many of the risks of application unavailability
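A minimal sketch of the offline-first pattern described above, assuming a simple local outbox that is flushed when connectivity returns; the storage key and types are illustrative, not the team's actual implementation:

```typescript
// Illustration only: persist results locally first, then synchronise later.
interface PendingResult {
  slotId: string;
  payload: unknown;
  savedAt: string;
}

const OUTBOX_KEY = 'pending-test-results'; // hypothetical storage key

export function saveOffline(result: PendingResult): void {
  const outbox: PendingResult[] = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
  outbox.push(result);
  localStorage.setItem(OUTBOX_KEY, JSON.stringify(outbox));
}

export async function syncOutbox(upload: (r: PendingResult) => Promise<void>): Promise<void> {
  const outbox: PendingResult[] = JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? '[]');
  const stillPending: PendingResult[] = [];
  for (const result of outbox) {
    try {
      await upload(result);      // e.g. POST to the results API once back online
    } catch {
      stillPending.push(result); // keep anything that fails for the next attempt
    }
  }
  localStorage.setItem(OUTBOX_KEY, JSON.stringify(stillPending));
}
```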
What the team needs to explore
Before their next assessment, the team needs to:
- explore the potential downtime implications if some of the legacy back-end systems are not available, especially in the initial Journal scheduling. While not ideal, it might be necessary to explore being able to create a “blank” test where the candidate details have not been preloaded into the examiner’s daily schedule
12. Make sure users succeed first time
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team used their user research to inform both the amount and content of the training provided to examiners meaning that examiners are familiar with the service when using it in a live test. It is good that this training includes the unhappy paths
- the team has iterated the design since alpha and addressed the recommendations in the alpha report, in particular making it easier to remove faults and making the interactions and layout more consistent
- releasing the journal part of the service early on so that examiners could adjust to the new technology in a safe way is an excellent example of using agile methods to reduce risk and to meet user needs
- the service design around the digital tool is impressive; the physical, cognitive and social contexts examiners are working in have all been considered and catered for
- the team had considered how to maximise consistency in future categories of tests by creating a design library
What the team needs to explore
Before their next assessment, the team needs to:
- show how they have incorporated the local quality assurance process as part of making sure that examiners are using the service successfully and accurately
- use the analytics data gathered during private beta to validate the positioning of the buttons on the in-test screen and iterate if needed (a smaller private beta group would have made this less disruptive, but it is better to do it once based on data while the service is still relatively new)
- consider using colour or other design conventions to further visually separate the sections on the in-test screen
- consider using colour and/or other design conventions to show which buttons stay consistent between the different categories of test
- provide evidence that examiners can successfully switch between the different categories of test, including infrequently used categories
13. Make the user experience consistent with GOV.UK
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used standard GOV.UK patterns and has justified the adaptations made for the app, including using the native iOS font so that accessibility settings for text sizes work
- they have been working with the DVSA design patterns working group to increase consistency across the organisation
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- a ‘device familiarisation’ period was used to gather early feedback from driving examiners on their needs and concerns about transitioning from a paper-based to digital-first examining regime
- broad, deep and continuous involvement of and engagement with the network of 1,900 driving examiners has been undertaken over the 18 or so months since their last assessment, and their views have been listened to, internalised by the team, and used to shape the design of support interventions
What the team needs to explore
Before their next assessment, the team needs to:
- provide evidence showing how low-skilled digital users have been and are continuing to be supported to use the service – and how those who are unable (or unwilling) to use the service have been supported with alternative arrangements
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- there is a full-time performance analyst working across both teams
- the performance analyst feeds their findings back to the team and all actions are added to the backlog to be prioritised at sprint planning
- the team is using quantitative data to help inform changes to the service
- the team has a very mature analytics structure, using a variety of tools to collect and analyse data, such as Google Analytics, BigQuery and Data Studio. The team considered other tools and found these were the most suitable for their needs
- the team has a performance framework in place, ensuring they are collecting the correct KPIs to measure whether their hypotheses are meeting user needs. The entire team has been instrumental in producing the performance framework, and its output also fed into the design process
- the SIRO for the project has signed off the use of Google Analytics 360 as their analytics package. There are plans in place to use additional tracking, such as custom dimensions, once there is more data available in the public beta stage of the project
- the team is looking to build a self-service analytics function, allowing users of the data to have easy access to the data they need, when they need it
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the performance analysis implementation, learning and experience gained in the build stage of the project are maintained once the service is live and handed over to DVSA
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team has produced a performance framework to develop their key performance indicators
- the KPIs are used to check that their hypotheses are meeting user needs
- as the service matures and develops, they will review their KPIs to ensure they are still relevant
- the team produces the mandatory KPIs: cost per transaction, completion rate, user satisfaction and digital take-up
What the team needs to explore
Before their next assessment, the team needs to:
- keep reviewing their KPIs as the service develops and the team has access to more data
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team has had discussions on whether they need to display their mandatory KPIs on the Performance Platform
- they currently do not have a requirement to share their data on the Performance Platform
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that they store the mandatory KPI data, so that it can be shown on the Performance Platform should the situation change
18. Test with the minister
Decision
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made arrangements to test the service with the relevant minister - this has been delayed due to events outside the team’s control
What the team needs to explore
Before their next assessment, the team needs to:
- confirm that their service has been tested with the relevant minister