Claim a power of attorney refund - beta
The report from the beta assessment for OPG’s Claim a power of attorney refund service on 7 December 2017.
| From: | Central Digital and Data Office |
| --- | --- |
| Assessment date: | 7 December 2017 |
| Stage: | Beta |
| Result: | Met |
| Service provider: | Office of the Public Guardian (Ministry of Justice) |
The service met the Standard because:
- The service team are developing the service using an exemplary agile service design approach, releasing early value and learning from users to inform rapid iterations
- The service team are using all recommended practices to get a deep understanding of the service users and their needs (including assisted digital) and the context in which the service needs to operate, in order to design the service accordingly and sensitively
- The service team drew on their knowledge of their users and prior experience with a similar service to make a compelling case to policy colleagues and the minister to take extra time to rapidly build an excellent digital service, rather than manually process automatic refunds. The new service will go into public beta later than the minister would originally have liked, and yet the service will deliver a far more successful outcome for users and the department. The panel fully endorses this approach.
About the service
Description
The service will refund part of the power of attorney (POA) fees paid between 2013 and 2017. It allows users who believe they are eligible to apply for a refund.
Service users
Donors and attorneys of POAs registered between 2013 and 2017, and up to 20 Office of the Public Guardian case managers.
Detail
User needs
The team were well prepared and presented their user research findings clearly and positively. There is at least one user researcher working on the service throughout development, with user research deeply embedded into the delivery cycle. The team showed a deep understanding of their users and their needs and have used a variety of user research methods, such as face-to-face and telephone interviews, remote testing using screen sharing, online surveys and lab-based usability testing. It was pleasing to see that, since the Alpha assessment, the team has expanded their user research outside London to a broad range of users, including those with assisted digital needs.
The team presented detailed insight into their users, showing the age profile, position on the digital inclusion scale and so on, to ensure they reach all users who fit the usual service profile. They have joined up with organisations such as the Alzheimer's Society to recruit users for usability testing, which has proved extremely successful.
The team were able to provide strong examples of where they have iterated the service based on user feedback, and they did this through a combination of video clips of real users and the demonstration of the current service. The private beta period has allowed them to analyse user behaviour in more detail and they were also able to demonstrate how they had tested the entire end-to-end service and taken on board user feedback relating to the email content and how the transaction appears on users’ bank statements.
The team has carried out two independent accessibility audits and was able to observe some of the accessibility testing first-hand. This gave them greater insight into accessibility needs and they have taken the findings on board. A quick accessibility check done by the assessors also proved very positive for the service.
The fast pace of iteration and delivery from this team, while ensuring that user needs are always at the forefront, is to be commended and should be shared more widely with other service teams across government.
Team
This is a great example of a high-performing digital service design team. They are working across 3 locations, with some roles shared with other teams. They make a success of this by using best practices for remote working. They also co-locate biweekly and use this time effectively for collaborative design and decision making.
They are using agile governance techniques to ensure a fast pace of delivery.
We heard lots of excellent examples of ways of working, eg bringing non-delivery teams into Slack, pointing to a good relationship between disciplines and across functions in the department. The service team have a counterpart team in legal and policy and are also working closely with caseworkers who will be processing the claims. The service team are rallying these teams around the project, holding retros and joint planning sessions where needed.
The service team can scale up and down in response to demand, and the organisation has planned for digital staff to support live services as well as carry out rapid development.
Technology
The team has built a solid platform and has addressed most of the recommendations the Service Manual offers. They make good use of the AWS stack, including components not often used, such as AWS's Key Management Service (KMS).
The triple security risk (personal information, lists of vulnerable people, and government paying out money to citizens) has been mitigated by thorough monitoring of all components, by reducing attack vectors through limiting the user data exposed without sacrificing user experience, and by the use of standard modern encryption.
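As a rough illustration of this kind of approach (not the team's actual implementation), the sketch below shows how sensitive claim details might be encrypted and decrypted with AWS KMS via boto3. The key alias, region and field names are placeholders assumed for the example.

```python
import boto3

# Minimal sketch, assuming a symmetric KMS key exists under the hypothetical
# alias "alias/poa-refund"; all names here are illustrative only.
kms = boto3.client("kms", region_name="eu-west-1")

def encrypt_bank_details(plaintext: str) -> bytes:
    """Encrypt sensitive claim data before it is stored."""
    response = kms.encrypt(
        KeyId="alias/poa-refund",             # hypothetical key alias
        Plaintext=plaintext.encode("utf-8"),
    )
    return response["CiphertextBlob"]         # store ciphertext only

def decrypt_bank_details(ciphertext: bytes) -> str:
    """Decrypt only at the point a caseworker processes the claim."""
    response = kms.decrypt(CiphertextBlob=ciphertext)
    return response["Plaintext"].decode("utf-8")
```

Restricting decryption to the point where a claim is processed is one way of limiting the user data exposed by the public-facing service.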
The panel was impressed with the amount of work put into monitoring and mitigating system load issues and possible downtime.
A development team working on two separate systems adds complexity in planning, which this team seems to have managed well, based on the amount of work demonstrated and the quality of the end products.
Design
The panel were pleased to see the team follows an iterative design approach. The team showed different design variants they have explored over the past three months. During their private beta phase, the team investigated and tested a number of alternative versions. The team showed various examples of content and interaction design iterations – both on the start page and in the transaction. The team collected evidence that this helped users to understand the service better and complete their journeys faster. For example, the service now repeats the name of the donor and attorney later in the journey instead of referring only to roles. The team also changed the type of data collected to validate a claim based on in-depth research with users.
Content and interaction designers paired and used shared tools to collaborate. The designers in the team made effective use of the guidance in the Service Manual, the GOV.UK style guide and patterns. They also engaged with the cross-government design community via Slack. Furthermore, they ran design critiques with colleagues from the MOJ design community; this is good practice for getting more feedback from the wider departmental design community, and something we rarely see from service teams. The team also worked with other designers on a pattern library for internal services. The panel would like the team to share these to benefit the wider cross-government community.
As recommended in the Alpha assessment, the team developed a plan for the larger user journey. It now considers not only news outlets but also charities as starting points for the service. For this, the team plans to publish a dedicated news page on GOV.UK. The team is also working with the GOV.UK team on integration into the existing Lasting Power of Attorney content. In addition, the team plans partnerships with two other services – Carer's Allowance and Winter Fuel Payments – which we commend as a good example of cross-departmental collaboration.
The team shared critical feedback through video recordings from user test sessions and explained how they addressed it. To ensure accessibility, the service was tested by 6 users with accessibility needs at the Digital Accessibility Centre (DAC). The users either had a good user experience or were able to complete the task with minor issues.
A high-level accessibility audit, conducted by a member of GDS's accessibility team, came to a similar result. The reviewer did not find many issues, and those found – except one – are known issues with GOV.UK Elements and therefore not within the team's control. The accessibility reviewer sees this as a clear indication that the team is thorough and effective in meeting the accessibility needs of users.
Among the 217 private beta users of the service, 8 were assisted digital users. Two of them had no mobile phone. The team outlined how GOV.UK Notify will be able to send letters to these users. The service team consulted assisted digital experts from other services and worked closely with in-house caseworkers. Two caseworkers and a case manager took part in sprints, engaged on Slack, and gave input on the service's assisted digital route. This is exemplary practice that delivers effective outcomes, and more service teams would benefit from adopting it. The caseworkers' interface contains an advanced assisted digital (AD) version with a free text field and a dedicated script.
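The report does not describe the team's Notify integration in detail, but as an illustrative sketch only, a letter could be sent through GOV.UK Notify's notifications-python-client along these lines; the API key, template ID and personalisation fields below are placeholders.

```python
# Illustrative sketch: sending a letter through GOV.UK Notify using the
# public notifications-python-client. All values shown are placeholders,
# not taken from the service.
from notifications_python_client.notifications import NotificationsAPIClient

notify_client = NotificationsAPIClient("YOUR_NOTIFY_API_KEY")

notify_client.send_letter_notification(
    template_id="placeholder-letter-template-id",
    personalisation={
        "address_line_1": "Ms A Donor",        # recipient name
        "address_line_2": "1 Example Street",  # first line of address
        "postcode": "SW1A 1AA",
        "reference_number": "REF-0001",        # hypothetical template field
    },
)
```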
Analytics and performance measurement
The team demonstrated a robust use of data analysis in the development of the service so far. The team used financial and human resource data to prove that building a GOV.UK-compliant service with casework capabilities was cheaper than manual processing. User research also demonstrated that users would not be confident supplying bank information to a non-GOV.UK site, which would have driven up avoidable costs.
Regarding the digital analytics implementation, the team have built impressive additional functionality, for example labelling every error field to make it easier to identify and prioritise where users are struggling, and importing anonymised data from the caseworker system.
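The report does not name the analytics tool, but as a hedged sketch of the error-labelling idea, each failed validation could be recorded as an analytics event with the field name as the label – shown here using the Google Analytics Measurement Protocol with a placeholder property ID.

```python
# Rough sketch only: recording one analytics event per validation error,
# labelled with the field the user struggled with. The property ID, page
# and field names are placeholders.
import uuid
import requests

def record_field_error(page: str, field_name: str) -> None:
    """Send a labelled 'validation error' event via the Measurement Protocol."""
    requests.post(
        "https://www.google-analytics.com/collect",
        data={
            "v": "1",                   # protocol version
            "tid": "UA-XXXXXXXX-X",     # placeholder property ID
            "cid": str(uuid.uuid4()),   # anonymous client ID
            "t": "event",
            "ec": page,                 # category: the page containing the error
            "ea": "validation error",   # action
            "el": field_name,           # label: the field that failed
        },
        timeout=5,
    )
```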
The team has also built a ‘Statistics dashboard’ that provides an overview of the service and is used by the case worker manager to provide positive feedback.
The team has developed a mature set of 24 KPIs, including the four mandatory ones. All of the indicators relate back to the service mission statement. It was impressive to see that each KPI has a:
- Purpose
- Way of calculating
- Data source
- Target
- Owner
Performance indicators are monitored over time and backlog tickets are written for future sprints to analyse and act where appropriate.
The team is in discussion with the Government Services data team about a performance platform dashboard, and plans to publish cost per transaction 3 months after entering public beta, when volumes have increased.
Finally, the team is using SurveyMonkey effectively to collect feedback on the service.
Recommendations
To pass the next assessment, the service team must:
- Publish on the performance platform all the system's data that is of interest to stakeholders at the MOJ and the OPG, as well as the public at large – specifically the number of transactions, the amount of money refunded, and so on
- Publish the source code as soon as the service enters public beta
The service team should also:
- Provide more documentation for the benefit of future developers who might inherit the code and may have to rebuild the platform if it needs changing. Although README files are present in the GitHub repositories, they aren't always easy to find and use, especially with a multiple-repository project like this one. A single operations manual is usually the best way to centralise up-to-date information
- Consider a more streamlined use of GitHub and popular online continuous integration systems, which would further automate the development and deployment of the application
- Whilst the team's user researcher and front-end developer have done a good job of implementing digital analytics and performance measurement so far, the Service Owner should recruit a performance analyst to work across LPA services. This was indicated to be in progress
Next Steps
You should follow the recommendations made in this report before arranging your next assessment.
This service now has permission to launch on a GOV.UK service domain.