Modernising Lasting Powers of Attorney alpha assessment
From: Central Digital & Data Office (CDDO)
Assessment date: 24/05/2022
Stage: Alpha
Result: Met
Service provider: Ministry of Justice
Service description
The service will enable a citizen to make a lasting power of attorney (LPA) and to submit it for checking and registration with the Office of the Public Guardian (OPG). We are in the process of drafting legislation to update the Mental Capacity Act (2005) to allow creation and submission via a fully digital channel, and to move to a future where the LPA is data OPG holds, rather than a physical paper instrument. The service will also check the identity of actors to reduce the potential for fraud by false representation and to prove citizens' eligibility for the service.
Service users
- donors (people who are making the LPA)
- attorneys (people who will be using the LPA)
- certificate providers (independent witnesses who confirm the donor is able to make the LPA)
- witnesses
- legal professionals and charities
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the service team have worked hard to overcome the barriers they have faced in recruiting research participants, caused by covid restrictions and budget pressures that prevented external recruitment. As well as sourcing people using the current online LPA service, they worked with third sector organisations and wider MoJ staff to find participants for usability sessions, and when restrictions eased they widened their scope to start in-person research
- it’s clear that the team realised that the usability sessions they needed to carry out were far more involved than a single transactional application and they addressed this by having a prototype that provided extra context for users to explain the point in the application they were in and what they were trying to do. This demonstrated a good working relationship between design and research with both working together to create the best prototype they could for what is a necessarily artificial testing process
- the user researcher demonstrated a good understanding of the service user groups and was able to articulate not only the demographic characteristics but also the different user needs that each actor had (Donor, Certificate Provider, Attorney). It was also clear that the research has been communicated in such a way that all parts of the team are working with a common understanding of their users
What the team needs to explore
Before their next assessment, the team needs to:
- prioritise research with users who would currently choose the paper channel. The barriers the team has faced in recruiting participants have meant that much of the research has focussed on users who are currently applying, or attempting to apply, online. This is potentially a problem because the channel shift ambition, more than doubling the proportion choosing online over paper, means the team would benefit from research with users who would currently choose paper. The team should speak to these users as they move into private beta, either by following up current applications or, ideally, by using external participant recruitment
- the current alpha design places the fee payment towards the beginning of the journey, partly because of an understandable operational concern that OPG should collect a fee before starting identity checks, which incur costs. While this is understandable, research has already shown that this may put users off applying online, leading them to revert to paper, where they can see and complete more of the application before deciding whether to proceed. The team are hopeful that payment and identity verification can be moved further back in the journey; if resistance is met, further research should be designed to communicate the risks this could pose to channel shift by explaining why users may not accept an upfront payment
- the organisational ambition is that everyone aged 18+ should have an LPA in place, and the team has tried to reflect that by including people of all ages in their research. However, the team may wish to narrow the focus of their research in private beta to the older age groups who are most likely to put these agreements in place
- the team commissioned external research from an agency indicating that approximately 70% of disabled users would be willing to verify their identity online. As the service moves into private beta it will be important that the team has a way to refine this estimate, because the survey was largely online-based and drawn from a sample of those aged over 18, whilst the current service remit means it will likely serve a narrower, older user group
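One way the team could put bounds on, and later refine, a survey-based estimate like this is to report a confidence interval around the observed proportion rather than a single figure. The sketch below uses a Wilson score interval; the sample figures (70 of 100 respondents) are illustrative and are not taken from the actual RiDC survey.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a proportion (z=1.96 gives ~95%)."""
    if n <= 0:
        raise ValueError("sample size must be positive")
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# Illustrative only: 70 of 100 respondents willing to verify identity online
low, high = wilson_interval(70, 100)
print(f"{low:.2f} to {high:.2f}")  # roughly 0.60 to 0.78
```

Re-running the same calculation on private beta completion data, segmented by age group, would show whether the survey estimate holds for the narrower user group the service actually serves.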
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team commissioned and consulted the Research Institute for Disabled Consumers (RiDC) to seek views from elderly users
What the team needs to explore
Before their next assessment, the team needs to:
- source users with actual assisted digital needs and disabilities to test the end-to-end service
- address the opportunities and findings in the Research Institute for Disabled Consumers report
- source more users from a diverse ethnic background
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team have iterated and refined content based on user needs; this included auditing existing content and identifying missing content
What the team needs to explore
Before their next assessment, the team needs to:
- audit all screens and align the content with GDS guidance and style, as a few parts of the service deviate from the guidance and are inconsistent with each other: https://www.gov.uk/service-manual/design/writing-for-user-interfaces
- test error messages and shutter pages with users
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the design team have iterated and refined content and screens
- the team have explored a range of signing iterations that includes video
- the team have worked with third parties regarding content
What the team needs to explore
Before their next assessment, the team needs to:
- identify and simplify screens whose content could be split across multiple screens to make user journeys easier; for example, the ‘create your password’ page also asks the user to agree to the terms of service
- test the content with users that have assisted digital needs
- consider cloze testing, highlighter testing and comprehension testing as part of user research when testing content
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team have consulted RiDC to seek views on disabled and older users’ issues with this service
What the team needs to explore
Before their next assessment, the team needs to:
- as this is a paid-for service, research and test which guidance content, and which placement of it (or links to it), are most effective within users’ end-to-end journeys at raising awareness, so that those who are eligible can claim the available financial support from the Ministry of Justice
- test the service and prototypes with users who have assisted digital needs, or consult an organisation such as the Digital Accessibility Centre (DAC)
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team is multidisciplinary with appropriate expertise for the development, including embedded specialist skills such as a digital sociologist
- the team actively and effectively collaborates with Ministry of Justice Operations and Policy colleagues, and has strong collaborative working relationships with them
- collaboration with the Operations end-to-end service manager has enabled service design to develop rapidly, for example placing the fee payment element later in the user journey rather than at the start to meet user needs
- collaboration with Policy colleagues is providing usable insight for policy formation and, in turn, influencing the potential legislation the service will need in order to launch
- the team is utilising expertise from data scientists and technologists to inform its approach
- the team is made up of permanent civil servants and is empowered to seek support from the established digital specialist communities within MoJ
What the team needs to explore
Before their next assessment, the team needs to:
- maintain development velocity in the interim between the current user researcher moving to a new role, and their replacement taking up the role
- enable user researchers to access MoJ’s external user recruitment resource, in order to reduce the potential for selection bias in research samples and to enable a broad range of people with disabilities to be included when recruiting users with accessibility or assisted digital needs during the private beta phase
- ensure the service has appropriate access to performance analyst expertise during private beta
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team uses effective agile ways of working, using appropriate ceremonies and tooling to maintain velocity in an entirely remote working environment
- the team’s collaborative working relationships with Policy and Operations colleagues enable the team to adjust trajectory in response to potential legislative outcomes
What the team needs to explore
Before their next assessment, the team needs to:
- ensure continuous iteration of the service in the next phase of development
- continue to work closely with policy colleagues to inform and adapt to potential change and to support appropriate legislation; the largest risk to delivery is legislation not being passed, or legislation not enabling digital signing
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team demonstrated learning from hypotheses tested in alpha, with multiple prototypes tested and discarded
- the team demonstrated examples of design iteration based on feedback
What the team needs to explore
Before their next assessment, the team needs to:
- continue to explore providing the paper form for help with fees, and awareness of the support in conjunction with DWP
- continue sharing insight and evidence from user research with policy colleagues to inform the placement of fee payment and identity verification in the journey, to adapt to potential legislative outcomes, and to manage risk
- continue to explore and iterate new designs, as content could be refined and reduced by splitting it across screens
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to talk extensively about their plans for technical observability using OPG standard tooling
- the team plans to deploy security measures in pipelines to automatically identify vulnerabilities prior to production, use least privilege principles and use a range of cloud based security tooling including a WAF
- authentication options, personal data stores and other concepts relating to users’ privacy and security were considered - with proof of concept work documented clearly
What the team needs to explore
Before their next assessment, the team needs to:
- carefully consider fraud vectors, extend observability events, and act on the data gathered; the discovery work done around User Behaviour Analytics and Transaction Monitoring should be adopted to help ensure there is adequate protection for such important data
- carry out a full Data Protection Impact Assessment and IT health check
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a well developed approach to performance measurement for the alpha stage with ownership by the product manager and lead architect
- the service is utilising insight from existing services to inform and refine the approach to performance measures in Alpha and in planning toward private beta
- the service has identified initial tooling to measure performance
- the service has taken a proactive approach to planning performance measures and the technology to enable measurement. The team deploys expertise from Data scientists and Technologists to inform their approach to ensure the service measurement and data needs could be met in the next phase of development
What the team needs to explore
Before their next assessment, the team needs to:
- explore potential dedicated metrics to provide insight on the ability of users with access needs to succeed specifically in verifying their identity digitally
- continue to collaborate with the Office of the Public Guardian to provide critical success measures and benchmarks that enable benefits realisation to be measured over time for users and the organisation
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team will make use of a public cloud offering with containerised solutions, in line with OPG and MoJ infrastructure provision
- the team have plans to adopt a number of GOV.UK components - GOV.UK Notify, GOV.UK Pay and GOV.UK Sign on
- the team has invested time in consultation with The LSSA (Legal Software Suppliers Association) and thoroughly thought out the role of an external API and the concept of operating OPG LPA as a service, to allow for machine to machine interactions to take place
- the team has a well thought out approach and pattern for incrementally delivering the solution through a combination of micro frontends and API services
What the team needs to explore
Before their next assessment, the team needs to:
- carefully explore the API offering, following GOV.UK guidance (https://www.gov.uk/guidance/gds-api-technical-and-data-standards) and paying particular attention to authentication and authorisation techniques; common standards such as OAuth 2.0 should be adopted as a minimum, and the team may consider further hardening such as FAPI 2.0 or similar
- ensure that, as they start to deliver the solution, the reusable components are appropriately versioned and/or mocked to allow the teams to build unimpeded
- consider options such as Fargate Spot for development and testing environments, for cost optimisation reasons
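As a hedged sketch of what adopting OAuth 2.0 for the machine-to-machine API implies: each caller presents an access token, and the API grants a request only if the (already validated) token carries the scope that endpoint requires. The endpoint names and scopes below are illustrative assumptions, not OPG's actual design.

```python
from dataclasses import dataclass

# Illustrative endpoint-to-scope mapping for a hypothetical LPA API
REQUIRED_SCOPES = {
    "GET /lpas/{id}": "lpa:read",
    "POST /lpas": "lpa:write",
}

@dataclass(frozen=True)
class AccessToken:
    """Minimal stand-in for a validated OAuth 2.0 access token payload."""
    client_id: str
    scopes: frozenset

def authorise(token: AccessToken, endpoint: str) -> bool:
    """Allow the call only if the token holds the scope the endpoint requires."""
    required = REQUIRED_SCOPES.get(endpoint)
    return required is not None and required in token.scopes

# A hypothetical case-management client granted read-only access
token = AccessToken(client_id="case-mgmt-system", scopes=frozenset({"lpa:read"}))
print(authorise(token, "GET /lpas/{id}"))  # True
print(authorise(token, "POST /lpas"))      # False
```

In a real deployment the token itself would be issued by an authorisation server and verified (signature, expiry, audience) before any scope check; schemes such as FAPI 2.0 add further constraints on top of this basic model.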
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team has shared architectural design decisions in open source repositories and has a commitment to open source all code for the project / products
- the team has repository and readme guidance readily available
- the team has made contributions to and uses the MOJ pattern library and GDS Design System
- the team has published information on GitHub at https://github.com/ministryofjustice/opg-modernising-lpa-docs
What the team needs to explore
Before their next assessment, the team needs to:
- continue to work in the open, sharing patterns and practice with the wider government community
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team have made use of common shared platforms in government
- the team are adopting the OpenAPI Specification for internal and external APIs
- the team have explored and researched new design patterns
What the team needs to explore
Before their next assessment, the team needs to:
- once APIs are developed, submit them to the government API catalogue
- continue to share findings about the new patterns with the design system working group
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team have proposed an architectural design which should allow for high availability
- the team plan to use a combination of CloudWatch, PagerDuty and Slack to keep them informed of system health and performance
- the team plan to use auto-scaling to manage demands on the platform
- the team describes numerous points of testing in line with test pyramid principles
What the team needs to explore
Before their next assessment, the team needs to:
- define, document and test the recovery time objective (RTO) and recovery point objective (RPO) with the business, and make sure the ability to rebuild the solution as described is maintained