Book Your Theory Test alpha assessment
The report from the alpha assessment of DVSA's Book your theory test service on 10 April 2019.
From: Central Digital and Data Office
Assessment date: 10/04/2019
Stage: Alpha
Result: Met
Service provider: Driver and Vehicle Standards Agency (DVSA)
Service description
The service enables users to book a driving theory test for all categories of vehicle. The test itself is delivered by a third party provider of theory test centres. Test results are provided by the test centre when the test is complete. Fraud prevention and detection are important for the integrity of the test and are supported by staff based at the theory test centres.
In order to gain a driving licence in each vehicle category, the candidate must successfully complete a driving theory test and a practical test (the practical test cannot be booked until the user has passed the theory test). The Driver and Vehicle Standards Agency (DVSA) has overall responsibility for the delivery of the theory test service, which has been entirely outsourced since its introduction in 1996. The current theory test service has 100% UK population coverage through a national network of 176 test centre locations, and over 98% of bookings are made digitally. The Driver and Vehicle Agency Northern Ireland is a partner to the contract, with 6 additional test centres.
Service users
- driving theory test candidates, including professional lorry and bus drivers
- individuals seeking to qualify to be trainers of learner drivers
- organisations with delegated authority to deliver theory tests for their employees, such as Ministry of Defence and bus companies
- any individual who books a theory test such as candidates, parents and trainers
- organisations who book theory tests under arrangements that support coordination with a training event
- employees of the current service provider: Driver and Vehicle Standards Agency; Driver and Vehicle Agency Northern Ireland
1. Understand user needs
Decision
What the team has done well
The panel was impressed that:
- the team has done an in-depth discovery to gain a detailed understanding of their user groups. Qualitative and quantitative research methods have been used to understand the demographic and needs of their users. Personas have been used well to show the motivations and pain points that need to be considered in the service design
- the whole team is engaged in the research and what the service needs to provide for users
- while some pain points identified occur for users outside the actual booking of a test, the team are thinking about how such issues, for example users experiencing anxiety when taking a test, can be alleviated or better supported during the booking phase
- research has been carried out in a variety of geographical locations to cover a range of test centres in rural and inner city areas
- the team have worked hard to include users with low digital skills and access needs throughout their research and their needs are well evidenced. Research has also been carried out with specialist driving instructors to understand the support and assistance that they provide to users with access needs regarding the wider journey of learning to drive which includes booking a theory test. Currently users that require assistance during their test are directed to a contact centre to book which can cause additional problems. The team are working to have all journeys included regardless of the assistance required
What the team needs to explore
Before their next assessment, the team needs to:
- further research the needs of users who take the test to access employment or as training for a role (for example, police or bus drivers) before private beta; the team are aware that they have not yet focused on these users, which is a gap in their understanding given that they are the most likely users to be invited to private beta
- continue their attempts to gain data about the current service from the current third party provider to further understand the user journey regarding contact
- ensure that regular usability testing is carried out on the prototype so that insights around the proposed service are as well understood and evidenced as the needs
2. Do ongoing user research
Decision
What the team has done well
The panel was impressed that:
- user needs of members of the public are thoroughly understood; to further this understanding the team have utilised co-production approaches and card sorts as well as usability testing to understand the expectations of users in contrast to the prototype journey
- a variety of users have been included in the research to date and there are plans to continue recruiting individuals with access needs or assisted digital requirements; the team have a good understanding of these users and how to better meet their needs when compared to the current service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that thorough, frequent usability testing is conducted by the team themselves to understand the end to end user journey and ensure the insights are clearly evidenced in the prototype
- further develop the plan for private beta; this will need to include business users, a group the team has not yet fully explored and whose needs potentially differ from those of the general public.
3. Have a multidisciplinary team
Decision
What the team has done well
The panel was impressed that:
- the service is managed by a service owner who has the power to make day-to-day decisions to improve the service
- the team works well together and has worked hard, using collaboration tools effectively to create a team across two sites, for example by alternating planning sessions between the sites
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that all the roles that they have identified for beta are filled, for example a content designer
- ensure that the deep understanding of users and their needs that has been built up in the team is not lost if there is a delay due to procurement processes for Book a theory test and other parts of the end-to-end theory test service
- try to build civil service capability by increasing the number of civil servants in practitioner roles; the team is currently a combination of permanent members of staff and contract staff from the supplier Kainos
- plan how they will ensure they have the skills, senior buy in and support needed to manage changes to test booking alongside procurement and redesign of other parts of the full theory test service
4. Use agile methods
Decision
What the team has done well
The panel was impressed that:
- the team demonstrated a mature use and good understanding of agile tools and techniques
- the team showed that it is regularly engaging with its stakeholders and the wider organisation, doing show and tells in a central location of the Nottingham office and inviting them to user research sessions
- the team could outline problems found in research and discuss plans to continue to find solutions for these in beta, for example, in continuing to iterate content on requesting support during the theory test
- the team has looked at what other teams are doing to solve similar design problems to learn from others and reduce duplication of effort and is working with teams in other departments to iterate the design of the appointment booking components of the user journey
What the team needs to explore
Before their next assessment, the team needs to:
- consider how they will conduct a useful and broad enough private beta given contract constraints with the third party supplier; the existing plan to use in-house theory test centres could restrict the learning and iteration of the service in private beta
- consider how the complexity of the redesign of the end-to-end theory test service will be managed effectively, and how learning from Book a Theory Test design and development will be fed into the much broader journey and a wider, interlinked set of contracts and procurements; the activity, people and coordination needed to create a coherent, user-centred, valuable new service should not be underestimated
5. Iterate and improve frequently
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that the team:
- explored four potential operating models for the service
- ran two separate alphas to understand the viability of the two most promising models
- did not pursue the model which their first alpha showed was not viable
- have a sensible process for prioritising user stories based on research findings
What the team needs to explore
Before their next assessment, the team needs to show that they have explored a range of possible design solutions to a particular problem, rather than only iterating the existing journey in a linear way. The iteration they’ve done in alpha is closer to what the panel would expect during the beta phase.
During the alpha phase the panel would like to have seen the team:
- test more fundamental changes to the design of the form (for example changing the order of questions) which will be more difficult to make once they have real users in beta
- use lower-fidelity prototyping methods to get faster feedback, and not invest so much in working, production-ready code
The team will need to show that they have done this at the start of their beta and that the deep insight they clearly have about their users is translating into meaningful changes to the service. If they continue to make small refinements – as they have done in alpha – there is a clear risk that they will miss larger opportunities to make the service work better for their users.
6. Evaluate tools and systems
Decision
What the team has done well
The panel was impressed that:
- the technical architecture is up to date, making use of AWS cloud, serverless and microservices technologies
- well-known technologies such as Lambda, Node.js, Terraform and Jenkins (for continuous integration) are being used in the solution
- the technical architecture documentation sent was very clear and struck a good balance, explaining the architecture clearly without providing more detail than is needed to assess a technical service
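As an illustration of the serverless pattern the panel describes, a booking endpoint on AWS Lambda with Node.js is typically a small handler like the sketch below. This is a generic example following the standard API Gateway proxy integration, not DVSA's actual code:

```javascript
// Generic sketch of a Node.js AWS Lambda handler using the standard
// API Gateway proxy integration -- not DVSA's actual code.
const handler = async (event) => {
  // API Gateway delivers the request body as a JSON string
  const booking = JSON.parse(event.body || '{}');

  // ...validation and calls to downstream booking services would go here...

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ received: booking }),
  };
};

module.exports = { handler };
```

Because each request runs as an isolated, short-lived function invocation, this style of architecture also underpins the DDoS mitigation noted under point 7.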
What the team needs to explore
Before their next assessment, the team needs to:
- explore DVSA’s use of Amazon RDS CRM and its applicability to the service
- investigate any benefits of using Azure cloud services and CloudFormation for the service
- investigate using GOV.UK Pay to take payments
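If the team does adopt GOV.UK Pay, a payment is created with a single POST to the Pay API. The sketch below only builds the request payload; the field names follow GOV.UK Pay's public API, while the amount, reference and URL are made-up placeholders, not DVSA's real values:

```javascript
// Builds the JSON body for GOV.UK Pay's "create payment" call
// (POST /v1/payments). Field names follow the public Pay API;
// the values used below are illustrative placeholders.
function buildPaymentRequest(amountPence, reference, description, returnUrl) {
  return {
    amount: amountPence,   // in pence, e.g. 2300 for a 23 pound fee
    reference,             // the service's own booking reference
    description,           // shown to the user on the payment page
    return_url: returnUrl, // where Pay redirects the user afterwards
  };
}

// Example usage -- in a real integration this payload would be POSTed to
// the Pay API with an "Authorization: Bearer <API key>" header.
const payload = buildPaymentRequest(
  2300,
  'THEORY-TEST-0001',
  'Driving theory test booking',
  'https://book-theory-test.example.gov.uk/payment-complete'
);
```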
7. Understand security and privacy issues
Decision
What the team has done well
The panel was impressed that:
- data is encrypted at rest and in transit (TLS in transit and DynamoDB encryption at rest)
- a serverless platform is being used (AWS Lambda) mitigating a large proportion of potential DDoS attacks
- the service makes use of AWS (Amazon Web Services) infrastructure at the front end to intercept traffic and mitigate denial-of-service attacks
- the service will make use of active monitoring to detect intrusions, using Amazon GuardDuty with Alert Logic
What the team needs to explore
Before their next assessment, the team needs to:
- report the outcome of their GDPR (General Data Protection Regulation) assessment once it has taken place
- carry out static code analysis
The panel noted that, with someone’s driving licence, you could find out:
- where the user is going to be at a specific date and time
- any medical conditions the user has given as reasons for needing support.

The team should make sure the sensitivity of the information revealed to a user accessing the service again is proportionate to the low barrier to entry.
8. Make all new source code open
Decision
What the team has done well
The panel was impressed that:
- the team has a plan to open up their code by mirroring the private repository’s master branch into GitHub in the same way as the DVSA MOT service does
- the team are talking to teams in other departments about the design and build of the appointment booking component of their service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that all new source code is published in an open repository
9. Use open standards and common platforms
Decision
What the team has done well
The panel was impressed that:
- the Node.js runtime is being used, which is a de facto standard across the industry
What the team needs to explore
Before their next assessment, the team needs to:
- investigate the use of GraphQL to connect to the Department for Transport
- look into using GOV.UK Pay in FTTS; if GOV.UK Pay is not currently applicable, the team should engage with GOV.UK Pay to agree requirements and future implementation, both to support a seamless journey for service users and to meet any technical constraints within DVSA
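To make the GraphQL recommendation concrete, the sketch below shows the shape of a GraphQL request built from Node.js. The query and field names (`testCentres`, `near`) are hypothetical, since a real schema would need to be agreed with the Department for Transport:

```javascript
// Sketch of building a GraphQL request from Node.js. The query and field
// names (testCentres, near) are hypothetical examples -- the real schema
// would be agreed with the Department for Transport.
function buildGraphQLRequest(query, variables) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, variables }),
  };
}

// One endpoint, one shape of request: the query string selects exactly
// the fields the service needs, and variables carry user input safely.
const request = buildGraphQLRequest(
  `query TestCentres($postcode: String!) {
     testCentres(near: $postcode) { name address }
   }`,
  { postcode: 'NG1 1AA' }
);
```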
10. Test the end-to-end service
Decision
What the team has done well
The panel was impressed that:
- the end-to-end testing of the service was completed (with the exception of the Test Engine)
- a walking skeleton was used for testing
- the team had carried out test integrations during alpha with other services that this service will rely on
What the team needs to explore
Before their next assessment, the team needs to:
- carry out testing on the Test Engine
11. Make a plan for being offline
Decision
What the team has done well
The panel was impressed that:
- test delivery has a wide operating window (7am to 8pm), giving good coverage in the event of an outage
- a customer contact centre will be available as an alternative if the service is offline
- a continuity plan is in place for the existing service, and the team intends a similar plan to be put in place through the procurement taking place for the beta and live phases
What the team needs to explore
Before their next assessment, the team needs to:
- agree the continuity plan with call centre partners as part of procurement during beta
12. Make sure users succeed first time
Decision
What the team has done well
The panel was impressed that the team:
- recognises the existing service has very high completion rates and user satisfaction and is starting with this as a baseline
- have added specific guidance where their research into the wider theory test journey has shown that there are issues
- are working to expand the proportion of users with support needs who will be able to book their test using the digital service
What the team needs to explore
Before their next assessment, the team should:
- design for mobile-first, especially where the primary action risks being lost off the bottom of the page (for example on ‘What type of support do you need?’ and ‘Choose appointment’)
13. Make the user experience consistent with GOV.UK
Decision
What the team has done well
The panel was impressed that the team is:
- using the GOV.UK Design System
- proactively talking to other teams who are working on patterns which aren’t yet codified, for example, appointment booking.
What the team needs to explore
Before their next assessment, the team needs to:
- reconsider the user need for a progress indicator and start by testing the service without one (see progress indicators in the GOV.UK Design System)
- contribute what they’ve learned about appointment booking to the issue on the GOV.UK Design System backlog
- try using GOV.UK Pay to make the user experience consistent with GOV.UK
14. Encourage everyone to use the digital service
Decision
What the team has done well
The panel was impressed that:
- the team has a deep understanding from thorough user research of the needs, attitudes, fears and behaviours of users across the full journey of booking and taking their driving theory test
- the team knows that the digital take-up rate of the existing service is already high and is keen to ensure that online and offline channels are used appropriately to support users’ needs
What the team needs to explore
Before their next assessment, the team needs to:
- keep up their efforts to gain access to the call centre and call centre records, which are currently delivered by a third party, so they can study and learn more about existing offline support to inform the design of online-to-offline handovers and assisted digital support
15. Collect performance data
Decision
The service did not meet point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team have identified areas for improvement by looking at quantitative and qualitative data from the wider end-to-end user journey of taking a theory test; for example, the team has identified that users who require support or accommodations to take their theory test are less likely to pass the test first time and are designing the booking process to provide more information and reassurance to users
What the team needs to explore
Before their next assessment, the team needs to:
- continue their efforts to get meaningful data about the performance of the existing service delivered by a third party, for example the team know that there are a high number of changes to booked test dates, but do not currently have the data to know if it’s a small number of users repeatedly changing test dates or lots of users each making one change
- be able to explain how they will track a user’s journey through the full service to identify completions, areas of poor performance and handovers from online to offline parts of the service
16. Identify performance indicators
Decision
What the team has done well
The panel was impressed that:
- the team is identifying gaps in the data that they have available about the existing service run by a third party supplier and are continuing to pursue different routes to get access to this data to provide a solid baseline
- the team will measure completion rate across users and platforms – this information is not currently available from the existing supplier
- the team are measuring cost per transaction and digital take-up as a percentage
What the team needs to explore
Before their next assessment, the team needs to:
- be able to explain how performance data collected in private beta can be used as a baseline, given that the private beta is likely to be limited to users of in-house theory test centres, who do not represent the demographics of the public service
- look into completion rates across users and platforms – the team has established this information is not currently available from the existing supplier
17. Report performance data on the Performance Platform
Decision
What the team has done well
The panel was impressed that:
- the team is planning to publish a dashboard and has engaged with the Performance Platform team
- the team will measure ‘test slot’ availability, aiming for 95% of candidates to be offered a test booking within 2 weeks of their preferred date at the test centre of their choice, on a monthly and annual basis
- 95% of candidates requiring special arrangements are to be offered a test booking within 4 weeks of their preferred date at the test centre of their choice
- the team is measuring ‘travel time’ to take the theory test, so that candidates will be able to take their theory test at a test centre with a travel time of 40 minutes or less or, in areas of lower population density, no more than 40 miles away
What the team needs to explore
Before their next assessment, the team needs to:
- have a plan to collect performance data during private beta and be able to report it on the Performance Platform during public beta, as they do for the existing service
18. Test with the minister
Decision
Point 18 of the Standard does not apply at this stage.