Apply for a Blue Badge beta assessment
The report from the beta assessment for DfT's Apply for a Blue Badge service on 4 December 2018.
From: | Central Digital and Data Office |
---|---|
Assessment date: | 04 December 2018 |
Stage: | Beta |
Result: | Met |
Service provider: | Department for Transport |
The service met the Standard because:
- the service team is empowered by the organisation to make effective decisions in relation to the service
- the service team has demonstrated a strong focus on user needs and user centred design
- the service team has put good emphasis on performance metrics and data to complement its approach to user centred design.
About the service
Description
The Blue Badge Scheme provides 3 million severely disabled people across the UK with a parking permit which allows them to park closer to their destination, usually for free and without a time limit. The Department for Transport is responsible for the central policy and legislation governing the scheme, with the devolved administrations responsible for their own guidance, but the legal obligation to issue badges to eligible disabled people sits with local authorities.
The contract for the existing service, which is managed by a third party, is due to end in February 2019. As a result, DfT is building a new service to replace it which better meets the needs of applicants and badge holders while reducing overall costs and workload for local authorities.
Service users
Our users are severely disabled people who meet the eligibility criteria, the organisations who care for them, and the local authorities who assess applications, issue badges and enforce their correct use.
Applicants who need a Blue Badge:
- someone who has a health condition that affects their mobility
- someone who applies on behalf of someone else
- someone who applies on behalf of an organisation that provides a mobility service.
Local Authorities that manage Blue Badges:
- blue badge administrators (case workers and managers)
- blue badge assessors
- enforcement teams
- support centre staff.
Detail
User needs
The team have continued to do high quality research using a range of methods, with excellent involvement from the wider service team. The service owner understands the value of good user research, has empowered the researcher and uses them effectively to improve the service. The team has iterated their understanding of user needs and broadened the personas they use to represent different user groups. This is evidenced in having followed the previous recommendation to focus on the enforcement user type, resulting in valuable improvements to this user journey.
After identifying that 44% of users will require support to complete their application, the team has put significant effort into researching with these users, making effective use of guerrilla and offline methods. This has been supplemented with good research into how different types of local authority might support these users, informing comprehensive assisted digital guidance for local authorities to use. In private beta this guidance has been tested through contextual research and shadowing of local authority users. In public beta the team should continue this research, focus on following the journey of applicants from their perspective, and ensure they have a method of identifying any local authorities that are performing poorly in providing a good assisted digital journey.
The team has also looked at implementing a mechanism to capture data on the proportion of each type of application, to track channel use and support through public beta. The panel would like to see how this informs the iteration of the support model at the live assessment. During public beta the team should look to validate that their design encourages channel shift while seeking to provide consistent and supported experiences for offline channels.
The panel was particularly impressed by the team working to maximise the impact of their research with applicants, by live streaming playbacks to local authorities and establishing a community to discuss these findings. This open and transparent style shows hard work and consideration in overcoming the legislative constraints that limit departmental control of how local authorities manage the scheme. Combined with providing tools to help local authorities meet user needs, for example the start page template, the team have created momentum and impact around their research which is exemplary. The team have done well to balance the needs of local authority users with those of applicants. The team has discovered and met local authority needs, for example through the print formatting improvements. Where there is tension between the needs of each user group, the team has used evidence from user research and hypothesis-driven design to encourage behaviour change. A strong example of this is the work to reduce the medical eligibility questions from 18 to 9, and the iteration of the walking questions to meet the needs of both groups.
The team demonstrated strong collaboration between user research and performance analytics, particularly in validating design hypotheses, as demonstrated in the iterations for eligibility guidance. The team has made full use of the direct access to real users available in private beta, both in validating qualitative insights with data from the live service, and in monitoring user support and feedback to identify or prioritise product improvements.
Team
The service team has a good approach to user centred design, agile working and software development. Because some team members work remotely, the team has focused on being inclusive and on having the right tools to enable effective remote working.
The service team seems to have the right balance of technically skilled, user centred design and business representative team members. The service team also demonstrated solid agile processes which help and empower the team to make the right decisions, while also providing the right visibility to the organisation to ensure accountability for the service and its development.
The team demonstrated that both the organisation and the delivery partner had established effective and pragmatic governance processes for managing the service and senior business stakeholders.
There is, however, an ongoing imbalance between contractors and permanent civil servants, with only two civil servants working on the team. The service team did demonstrate that the organisation is actively putting plans in place to build capability with the support of the delivery partner.
Technology
The team demonstrated that they are working in an agile environment using open source. The team is using GOV.UK, GOV.UK Registers and GOV.UK Notify, and is proposing to use GOV.UK Pay. They are using Confluence, Docker, Jira, Terraform and GitHub for agile project management, requirements gathering and source code collaboration. The team is using RESTful APIs that support JSON. The APIs have good maturity and are stored in SwaggerHub using OpenAPI, with OAuth2 used for tokens. The team is following OWASP standards for web security. The team takes a collaborative approach to detailed design and to peer review of design and code: all developers participate in the proposed design, which distributes knowledge around the team, and all code is peer reviewed by other developers. The team of 12, with relevant roles shared between DfT and Valtech, is proposed to increase to 19 in the beta phase.
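As a purely illustrative sketch, assuming only what is described above (JSON over REST, documentation in OpenAPI, OAuth2 bearer tokens) rather than the service's actual API, an endpoint in a Spring Boot stack of this kind might look like the following; the controller, path and fields are hypothetical:

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical example: a JSON REST endpoint in the style described above.
// The resource name and fields are invented for illustration.
@RestController
public class ApplicationStatusController {

    // Plain data object; Spring serialises it to JSON automatically.
    public static class ApplicationStatus {
        private final String applicationId;
        private final String status;

        public ApplicationStatus(String applicationId, String status) {
            this.applicationId = applicationId;
            this.status = status;
        }

        public String getApplicationId() { return applicationId; }
        public String getStatus() { return status; }
    }

    // In the live service this endpoint would sit behind OAuth2 bearer-token
    // validation, for example via Spring Security's resource server support.
    @GetMapping("/applications/{id}/status")
    public ApplicationStatus status(@PathVariable("id") String id) {
        return new ApplicationStatus(id, "SUBMITTED");
    }
}
```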
The delivery team is co-located in Manchester and is regularly visited by the Product Owner and Service Manager. The team actively uses video conferencing and screen sharing for daily stand-ups, user research playbacks and so on. Team members travel to local authorities for knowledge sharing, shadowing and Q&A sessions as required.
The team is using Java and Spring Boot, with Redis as the database and Amazon Web Services for cloud hosting. The team is using elastic scaling for capacity planning and load management, and Sonar for code quality checks. The team regularly mirrors the master branch of each source code repository into a public-facing open source repository, ensuring no sensitive information is leaked while still maintaining the ability to accept pull requests.
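As an illustrative sketch only — assuming the team uses Spring Data Redis, which the report does not confirm — persisting a domain object to Redis from Spring Boot can be as small as this; the entity and repository are hypothetical:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.redis.core.RedisHash;
import org.springframework.data.repository.CrudRepository;

// Hypothetical example: an application draft stored as a Redis hash.
// The entity name and fields are invented for illustration.
@RedisHash("blue-badge-application")
public class ApplicationDraft {

    @Id
    private String id;
    private String applicantName;

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getApplicantName() { return applicantName; }
    public void setApplicantName(String applicantName) { this.applicantName = applicantName; }
}

// Spring Data generates the Redis-backed implementation at runtime,
// providing save(), findById(), delete() and so on.
interface ApplicationDraftRepository extends CrudRepository<ApplicationDraft, String> {}
```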
The team is actively monitoring and alerting, using Kibana for monitoring infrastructure load and Pingdom for service uptime and basic transaction monitoring. The team displays the data on the relevant service dashboard. Alerts are sent to the Service Owner, Web Ops, the Technical Architect and the Delivery Manager.
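A common pattern for supporting uptime checks of the kind Pingdom performs is to expose a health endpoint from the service itself, for example via Spring Boot Actuator. The sketch below is a hypothetical illustration, not the team's actual configuration (Actuator in fact ships a Redis health check out of the box; the point here is the shape of the pattern):

```java
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.stereotype.Component;

// Hypothetical example: a custom check contributing to Actuator's
// /actuator/health endpoint, which an uptime monitor can poll.
@Component
public class RedisHealthCheck implements HealthIndicator {

    private final StringRedisTemplate redis;

    public RedisHealthCheck(StringRedisTemplate redis) {
        this.redis = redis;
    }

    @Override
    public Health health() {
        try {
            // A cheap round trip to confirm the datastore is reachable.
            redis.getConnectionFactory().getConnection().ping();
            return Health.up().build();
        } catch (Exception e) {
            return Health.down(e).build();
        }
    }
}
```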
The team must ensure that frontend assets (CSS, fonts, JavaScript, etc) are served with appropriate:
- cache headers
- minification
- compression
At the moment the service is forcing the user to re-download these assets on every page load. This makes the service take a long time to load for users on a slow internet connection. It also consumes more of the user’s data plan if they are accessing the service from their phone.
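One way to address this in the Spring Boot stack described above is to serve static assets with long-lived cache headers and content-hashed URLs, so that browsers only re-download a file when its content actually changes. The following is a sketch under those assumptions, not the team's code; the /assets path is hypothetical:

```java
import java.util.concurrent.TimeUnit;

import org.springframework.context.annotation.Configuration;
import org.springframework.http.CacheControl;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
import org.springframework.web.servlet.resource.VersionResourceResolver;

// Hypothetical example: long-lived caching for static assets, made safe
// by embedding a content hash in each asset URL.
@Configuration
public class StaticAssetConfig implements WebMvcConfigurer {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/assets/**")
                .addResourceLocations("classpath:/static/assets/")
                // Cacheable for a year: the hashed URL changes whenever
                // the underlying file content changes.
                .setCacheControl(CacheControl.maxAge(365, TimeUnit.DAYS).cachePublic())
                .resourceChain(true)
                .addResolver(new VersionResourceResolver().addContentVersionStrategy("/**"));
    }
}
```

Response compression can similarly be enabled in Spring Boot with the server.compression.enabled application property, and minification is typically handled in the frontend build pipeline.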
The panel also suggests the team calculate the carbon footprint of the service and any savings it delivers.
Design
The team have designed an intuitive service which most users should be able to complete quickly and easily. In beta they have worked in an iterative way, sensibly prioritising the changes that address the biggest difficulties a user might have. The panel were particularly pleased to see:
- a variety of different solutions explored for each design problem that the team have worked on
- the genuine empathy the team have, and the care they’ve taken to design something which meets their users’ needs
The team have done a good job of balancing the needs of the different users of the service.
They have considered, for example, the tension between asking users for the minimum amount of information and giving the staff in councils enough information to make immediate decisions on an applicant’s eligibility (where possible). In public beta they should continue to analyse how well this is working, and look for opportunities to optimise:
- the number of questions asked of an applicant
- the number of applicants who require a mobility assessment.
Both in content and in its use of the GOV.UK Design System the service presents an experience that is highly consistent with GOV.UK. They have developed some new interaction patterns, for example a way for users to upload multiple pieces of evidence. Where new interactions have been developed they are well-considered and have emerged as solutions to real problems found in user research. The team should contribute these new interactions back to the discussion on the GOV.UK Design System backlog.
The panel were also pleased to see the team reusing a template for admin interfaces originally developed in the Ministry of Justice. This template has helped them design a clear and uncluttered interface for council staff. The screen that these staff will use to make a decision on someone’s eligibility could, however, benefit from a stronger visual hierarchy. Observing staff use this system for real decisions will help the team understand which pieces of information are most important.
The service design work that the team have done on support for assisted digital users is excellent, especially in the way that they have engaged with the councils who will be delivering it to make it practical and realistic. The model of a council delivering a service using a platform developed by central government is itself novel and will be of interest to other teams in government. The team should be encouraged to share what they’ve learned about the way they’ve worked and how it’s made this mode of delivery possible and successful.
Analytics
The team made good use of data throughout the demonstration of the service, using it to highlight where improvements had been made, or where further iterations were required.
The vision for the service is to make it as simple as possible; two of the key metrics they identified in measuring success are time taken to complete and completion rate, both of which are being tracked by Google Analytics (GA).
The data they get from GA is heavily segmented, allowing them to identify any significant changes in different types of application, and also allowing them to provide bespoke reports which have been requested by individual local authorities.
In return, local authorities participating in the private beta provide a significant amount of useful data, including user satisfaction, processing times, and the cost of processing applications. This currently covers only 19 authorities, but the team explained that they have a mechanism for ensuring that this data is still provided as the number increases.
Data is shared among the team using Data Studio dashboards, which are managed by a dedicated performance analyst.
They gave good examples of where data had helped them make improvements, and also of where analytics and user research combined to enable informed decision making.
The Blue Badge scheme is consistently the Department for Transport’s service that receives the most complaints. The team has access to these figures and will monitor them closely. Many complaints are outside their direct control (e.g. not enough parking spaces), but they receive a breakdown of telephone complaints at a fairly granular level that allows them to make improvements to the application process if necessary.
A dashboard for the service has been set up on the Performance Platform. The team will need to ensure they are able to keep this up-to-date.
Overall, the panel was impressed with the team’s use of analytics, the range of data they receive, and their plans for improving the tracking of user journeys.
Recommendations
To pass the next assessment, the service team must:
- follow through on plans to validate the current understanding of the proportions of different user types, and look for changes in real-world usage compared with findings from survey data
- continue with and expand their research with users applying on behalf of others, especially around submission of identity evidence, and conduct some focused research to understand whether providing identity documents is a barrier to some users
- conduct more research with applicants around paying the application fee, exploring the impact of paying on application compared with paying on issue of the badge; this should be shared openly to support their work in influencing policy
- address the balance of contractors and civil servants working on the service team to ensure that there is a long-term sustainable team to run the digital service.
The service team should also:
- review their plans for public beta to ensure that they are realistic, deliverable and do not put undue strain on the service
- develop robust disaster recovery and business continuity plans during public beta to ensure they are fully effective prior to the service’s live assessment
- continue to test and refine the assisted digital process for the service with a broad spectrum of users with assisted digital needs
- continue to invest time and effort conducting user research with users who have accessibility needs.
Get advice and guidance
The team can get advice and guidance on the next stage of development by:
- searching the Government Service Design Manual
- asking the cross-government Slack community
- contacting the Service Assessment team
Next Steps
To get your service ready to launch on GOV.UK you need to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content

Before arranging your next assessment you’ll need to follow the recommendations made in this report.
Digital Service Standard points
Point | Description | Result |
---|---|---|
1 | Understand user needs | Met |
2 | Do ongoing user research | Met |
3 | Have a multidisciplinary team | Met |
4 | Use agile methods | Met |
5 | Iterate and improve frequently | Met |
6 | Evaluate tools and systems | Met |
7 | Understand security and privacy issues | Met |
8 | Make all new source code open | Met |
9 | Use open standards and common platforms | Met |
10 | Test the end-to-end service | Met |
11 | Make a plan for being offline | Met |
12 | Make sure users succeed first time | Met |
13 | Make the user experience consistent with GOV.UK | Met |
14 | Encourage everyone to use the digital service | Met |
15 | Collect performance data | Met |
16 | Identify performance indicators | Met |
17 | Report performance data on the Performance Platform | Met |
18 | Test with the minister | Met |