Plan and manage roadworks (previously known as Street Manager) beta assessment report
The report from the beta assessment for DfT's Plan and manage roadworks service on 2nd October 2019.
From: Central Digital and Data Office
Assessment date: 02/10/2019
Stage: Beta
Result: Met
Service provider: Department for Transport
Previous assessment reports
Service description
The service allows registered users from local highway authorities and utility companies to plan and manage street works, including applying for permits.
Service users
The primary users of this service are professionals from utility companies and the local highway authorities who want to carry out highway works. Secondary users will approve permits and carry out inspections.
1. Understand user needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has conducted extensive testing with their main user groups throughout their private beta, with over 57 local highway authorities, 46 utility companies, 150 survey responses and a large amount of API testing
- the testing has taken place in parallel to the existing system that the service seeks to replace, so that they can monitor how their users will interact with the new service and have iterated based on those needs
- the wider team has participated in research activities throughout the phase by observing testing and helping with note-taking and analysis. The team demonstrated how hypotheses are formed with the product owner and user representatives, tested in research, and used to inform high-level user needs. It was clear that the team referred back to their personas and high-level user needs to inform planning and design decisions, for example the decisions to stay with the payments system users preferred, to use the 24-hour clock, and to offer varied map drawing options
- the team have plans for when the service isn’t accessible online, either because workers can’t get a signal when in rural areas or if the system is down. Photos of roads can be uploaded onto the system at a later time, and there are also paper versions of forms so that they can be completed manually and entered onto the system at a later date
- the team have done some great work in addressing the recommendations from their previous assessment and workshop. For example, they have conducted API testing with 23 organisations and run a number of workshops around API usage. They developed their personas based on previous feedback to make them more behaviour-based. It was mentioned throughout the assessment that the personas have been integral to the design approach: they are embedded in story maps, and the team refer back to them regularly. The team have also had an increased focus on accessibility testing. In this testing, users have been asked about disabilities, impairments and digital skills, and these have been tested where appropriate. There have also been 2 rounds of accessibility testing with people with severe disabilities.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to use and iterate their personas based on the testing and feedback received over the next phase. There are a large number of personas, which isn’t an issue in itself if the team are using them well, but as the team learns more about each of them they should iterate and prioritise as they move towards a live service
- monitor and test the process for getting users up to speed on using the service - the training. This feels like an important ‘add-on’ to the service that will impact the workload of the people running it. How is this training being evaluated? The training should also be accessible and easy for people to use, and is potentially a way to upskill those with low digital skills. At the next assessment, we would expect to see more research in this area, including evidence that the training is needed at all (as opposed to, say, investing instead in content).
2. Do ongoing user research
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- there are plans for the user researcher to work closely with the performance analysts during the next phase to understand how the roll-out of the service is performing, and to look for opportunities for further user testing. The team also plan to make use of a survey linked from the website, and pop-ups asking for further feedback
- the team have plans to roll out the service incrementally over the next phase and to monitor usage as people gradually begin to use the new service exclusively for real work. It is good to hear the team are planning to collect data across different geographical areas and additional street data mapping.
What the team needs to explore
Before their next assessment, the team needs to:
- incrementally roll out the service as planned and monitor how their user groups interact with the service from end to end. It’s particularly important to test the service in real conditions, without the fallback of the previous EToN service, as this is likely to surface further questions and issues from users.
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the service owner is empowered and knowledgeable, and has a strong relationship with the product owner and the product and technical teams
- the team have such a strong focus on product, design and performance analysis.
What the team needs to explore
Before their next assessment, the team needs to:
- have one or more civil servants either in or shadowing key positions, particularly delivery management, user research, design, or technical leadership. While the team have done well to ensure that knowledge is retained as different contractors fill different roles, there remains a significant risk to the Department for Transport that so much knowledge is outsourced. Additionally, while the panel understands the service will ultimately be self-funded by its users, it would be better value for money to try and bring in internal capability if possible
- similarly, consider the procurement strategy for post-March 2020. The proposed single contract covering 1st to 3rd line support and all ongoing development may exclude small and medium-sized enterprise (SME) bidders, and again exposes the Department for Transport to the risks of outsourcing knowledge and responsibility.
4. Use agile methods
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the large team have developed processes that allowed them to work effectively across three locations
- the team have a sensible approach to managing risk, which involves regular lightweight meetings between the senior product team members.
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- a Change Advisory Board is in use only for unplanned change; regular planned change is pre-approved
- change is regularly communicated to users through other tools they are already using, such as Trello and Jira Service Desk, which are also used for reporting issues
- the team are planning to make changes that will improve the overall process as well as the system, for example challenging the need to apply for two permits for one piece of work that crosses two streets.
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- technical design decisions are considered and captured for future reference with Open Design Proposals
- the team used the experience of other government departments to help them select the right mapping technology
- potential lock in has been avoided by building a containerised solution and using open source orchestration technology
- comprehensive monitoring and alerting has been set up to enable the team to understand the health of the service from the infrastructure, security and application perspectives
- the team have developed and tested a support and incident process during private beta.
What the team needs to explore
Before their next assessment, the team needs to:
- show how monitoring, alerting and the support model have continued to evolve in response to any problems experienced during public beta.
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- a framework and process are in place for identifying, recording and monitoring potential threats and risks to the service
- risk and information ownership is well defined
- the team have commissioned independent reviews of the architecture and penetration testing
- a Data Protection Impact Assessment has been conducted and privacy notices and terms of use updated accordingly
- data retention policies have been designed.
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- code is stored in private repositories in GitHub, with the aspiration to share it publicly as soon as the commercial and stakeholder environment permits
- automated test tooling is in place to ensure repositories do not contain sensitive information such as access credentials
- documentation for the service is being developed in the open, with its source available on GitHub.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the closed repositories that do not contain sensitive information are made publicly available.
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- open source and open standards have been preferred in the vast majority of technology choices
- the OpenAPI standard has been used to document the evolving API
- the team have followed the API technical and data standards including designing a versioning strategy
- GOV.UK Notify is used to send emails and the team have carefully considered other common platforms for use within the service
- the team have demonstrated a strong commitment to open data
- the team are engaging with potential users of the open data to help them design the open data sets.
What the team needs to explore
Before their next assessment, the team needs to:
- show the evolution of the open data design and demonstrate how the data is accessed.
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- a sophisticated Continuous Integration and Deployment pipeline has been built to enable developers to deploy with confidence
- automated testing has been baked in and includes accessibility, performance and security tooling
- usage patterns have been modelled from user research and engagement to inform capacity planning and performance testing for both the user interface and the API
- autoscaling and resource sizing has been tuned based upon the performance tests.
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team have a very detailed plan for helping users to deal with unexpected service downtime. The service shutter page will redirect users to a GitHub page with a step-by-step offline process
- the API and the digital front end are independent of each other, so only one set of users should be affected by any outage
- resilience has been considered from the start and the service operates across multiple Availability Zones
- third party dependency failures have been considered and mitigations designed so the impact of any failure is minimised
- monitoring, alerting and service desk tools are independently hosted
- an incident process including communication plans has been designed and tested
- databases are replicated, regularly backed up and restores tested.
What the team needs to explore
Before their next assessment, the team needs to:
- as per point 6 - show how monitoring, alerting and the support model have continued to evolve in response to real incidents.
12. Make sure users succeed first time
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team have an impressive understanding of users and their needs, ensuring that all functionality they build can be traced back to a genuine need
- the team spoke to users to identify potential unhappy paths that the service could include and tested them extensively
- the team iterated the design of the service based on user research, for example changing how they capture dates of roadworks after learning users aren’t always able to accurately record these
- the team made sure that the service is accessible and worked with DAC to perform accessibility audits
- the team made sure that the design patterns they chose, for example maps, are available to all, including users without JavaScript.
What the team needs to explore
Before their next assessment, the team needs to:
- do more research with users who are less aware of subject matter regulations or are new to their jobs, so that the team can make the service easier to use for those users. For example, the confirmation page for submitting a new roadwork should surface what happens next and how long it takes
- consider the wider journey this service is part of and how its parts join up. If internal training in organisations is likely to happen, what will it entail, and what can the service do to eventually reduce the need for it?
- test the journey of signing up to use the service from GOV.UK as a new organisation or as a new user, as the demand for this journey will grow once the service enters public beta
- make forms that enable offline use of the service available at appropriate points in the journey in the event of users not wanting to, or not being able to, use the digital service.
13. Make the user experience consistent with GOV.UK
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team contributed their findings about displaying maps and uploading files to the GOV.UK Design System backlog
- the majority of the patterns used in the service do not diverge from the GOV.UK Design System; the few that do, do so to improve the experience for users.
What the team needs to explore
Before their next assessment, the team needs to:
- consider using the GOV.UK Design System so that they don’t need to duplicate effort on a lot of the patterns they use and can benefit from improved accessibility
- clarify the intent behind some of the features. For example, the team was able to explain why they offer three different ways to draw a map, but it’s unclear from the interface why a user would choose one of them over another and what behaviour the team is trying to encourage.
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- there is a plan to make a legislative change to ensure all street works will be managed through this new service, but only after an extensive period of communications and onboarding
- the team are aware that some local authorities may be more reluctant to move to the new service, and have a plan to identify these users early and provide them with additional support
- the team run roadshows and have an active digital support network, including Slack, Trello and Jira Service Desk. They are encouraging users to support each other as well as providing support themselves.
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the service team now have an embedded Performance Analyst working with them to identify actionable data insights
- there was clear evidence of the team working together with the Performance Analyst to identify opportunities to iterate the service, such as the ordering of lists.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to look for suitable solutions to improve the data collection and dissemination, for example the team mentioned exploring the use of Google Tag Manager and BI tools.
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team have identified a number of performance indicators that will allow them to identify the service’s strengths and weaknesses
- the team have produced a Performance Framework clearly outlining what data they will collect and how it will be used to support improvements, and that this was created with involvement from the team and other stakeholders.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that they have put in place a suitable method to increase the take up of the satisfaction survey used on their site and use this information to improve their service
- update the performance framework to include the GDS key performance indicators discussed and evidenced in the assessment presentation including cost per transaction, satisfaction, completion rate and take up.
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team has contacted the Performance team to feed their data on to the Performance Platform.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to speak to the GOV.UK Performance team and finalise a plan and next steps to get beta data onto their Performance Platform dashboard.
18. Test with the minister
Decision
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the team had already tested the alpha with their previous minister and had plans to test the service with the current minister.