Apply for probate beta assessment

The report from the beta assessment for HMCTS's apply for probate service on 3 May 2018.

From: Central Digital and Data Office
Assessment date: 3 May 2018
Stage: Beta
Result: Met
Service provider: HM Courts and Tribunals Service (HMCTS)

The service met the Standard because:

  • The service team has a strong, empowered service owner, a skilled and passionate multidisciplinary team, and proportionate governance and support from the wider HMCTS organisation
  • The team has designed a service based on detailed user research that will transform the experience of seeking probate, including adapting policy and legislation to support online applications
  • Technology choices are proportionate and well considered, with an established practice of working in the open and use of GOV.UK Pay and GOV.UK Notify

About the service

Description

Apply online for a grant of probate to get the legal right to deal with the property and belongings of someone who’s died.

Service users

Members of the public who are named as an executor or helping someone who is an executor of a will.

Detail

User needs

The team has a grasp of how the service journey fits within the wider bereavement experience and understands the circumstances and mindsets that users bring to the service. It is researching with a diverse range of users, and their needs inform the design of the digital service as well as changes to probate policy.

The service users are people who are named in a will, either as executors, beneficiaries or both, and the personal connections or solicitors who help them apply. The team has identified the characteristics that influence how users interact with the service and what they need from it. For instance, some users are prepared to be executors before they use the service; others only become aware that they are executors after a death and need help to understand what to do. For the latter, the team designed an experience that explains the difference between executors and beneficiaries, and lets people who are not executors know upfront that the service doesn’t apply to them. The team has also observed that users have different levels of exposure to the probate process and legal language. For newcomers, terms like ‘codicil’ cast doubt, slow progress or lead to incorrect answers. Experienced users and solicitors expect to see the term, and for them the simpler ‘update’ is ambiguous. To provide prompts that are meaningful to a range of users, the team is designing content that includes both terms.

The team is doing research with users who have limited digital skills and limited access to the internet. It is partnering with the Good Things Foundation to support users who need help using the digital service. Whilst it aims to have most people submit applications digitally, users will continue to have the option to submit a paper application. The team is also regularly researching with people who have sensory, motor, cognitive and emotional challenges, and is addressing issues raised in DAC reviews. The team’s research shows that users coping with grief, life changes and the administrative burden associated with a death sometimes need help with digital tasks that they would otherwise find straightforward. So the team is committed to making it simple for users to complete an application themselves without being overwhelmed.

To accomplish this, the team is planning to research whether users who choose to save their application and continue later are able to resume it easily. It is also planning to research whether applicants who must provide further information after they submit an application are able to receive and understand the request and provide the information. The team must carry out this research as planned with users with a range of skills and abilities. It must also research whether users can find the support channels, whether the support interactions enable them to progress and, if not, what changes would help.

The team must also research the extent to which users are succeeding the first time, and what changes would help. Researching whether users succeed the first time typically involves comparing the number of people who try to accomplish something against the number who do, and measuring how much time it takes them or whether they did it in a single sitting. Whilst the team is monitoring the volume of submissions that result in a grant, it recognises that these may not be appropriate measurements, since there are legitimate reasons for people to stop and complete their application later. Users may need time to complete an inheritance tax form, a prerequisite for applying for probate, they may need to collect information about the person who died, or they may simply wish to stop for personal reasons and resume when they are ready. Yet the team is also observing how misunderstandings, incomplete submissions and requests for further information after a submission may delay a submission or grant. The team should define ‘success’ and ‘the first time’ with a view to what users need to accomplish, and should distinguish necessary pauses from preventable interruptions. For users who have completed the inheritance tax form before beginning the probate application, and who have all the necessary documents to succeed, the team’s research should detect whether a preventable obstacle is stopping them from succeeding the first time.

Team

The team has key specialisms well represented and a clearly empowered service owner. It demonstrated great passion and commitment to delivering a high quality service for its users. All team members had a strong understanding of the service users’ needs and great empathy for what is clearly a challenging and sensitive landscape.

Agile practices are well established, and the team is making good use of available technology to achieve effective multidisciplinary working.

The team is relatively large, comprising approximately 30 people split into separate delivery teams. Of these, around one third are permanent civil servants. Whilst significant efforts are being made to recruit and train permanent staff, this still represents a significant risk to the long-term sustainability of the probate service and, more widely, to the programme, as described in previous HMCTS assessment reports.

Technology

The service team has a very conscientious approach to delivering and supporting their software platform, with a strong awareness of security and fraud issues.

Although there are challenges with the ongoing HMCTS platform migration, the team has asserted it has the power to push back its launch date if it feels the new platform has not been tested in the field for long enough. The same new architecture is already in production for other delivery streams in the same programme, making knowledge and patterns widely available.

The new platform, a cloud-native solution based on Microsoft Azure, will support zero-downtime deployments and will contribute to a more uniform tech stack within the programme.

The team has addressed the two main points raised in the previous assessment: its code is now open source, and the team has improved its knowledge of potential fraudulent activities and ways to mitigate them.

For example, caseworkers at the registry office manually check, among other things, that the will and the list of executors submitted by the applicant match. Another safeguard is the checks that the bank or building society performs before releasing any assets. The team has worked closely with UK Finance to make sure its approach is secure enough. In general, the team’s effort in making the service as user friendly as possible without compromising security is commendable.

The team has not experienced any security incidents. The platform is monitored 24/7, and alerts are in place in case of intrusion or any suspicious activity. The team’s response to top priority alerts is very fast, including outside working hours. The platform has undergone penetration testing by the NCSC, and the team runs its own automated security tests (ZAP tests) overnight. Data at rest is encrypted.

The team is prepared to meet increased volumes of support requests and traffic, having already carried out performance testing. It does not foresee any significant spike in traffic, but rather a volume of applications spread evenly throughout the year. If the platform becomes unavailable, a page is shown to advise users on what to do.

The team is using reusable government components, such as GOV.UK Notify and GOV.UK Pay. Its software stack includes modern and mainstream technologies, such as Java and Node.js, and its frontend code is lightweight, without the use of complex frameworks.

In the long term, the team should keep exploring the use of other components, such as GOV.UK Verify for user authentication and the GDS PaaS.

The service is in a very good place, but the team has to make sure the new platform is robust and stable enough under real usage by citizens before the launch to public beta.

Design

The team has a good understanding of the journey through Government for someone who has been bereaved, how probate fits into this journey, and what happens before and after the application for probate.

The team has also done good work to change legislation to make the overall probate application easier, removing the need to go to a probate office to swear an oath, and is working towards removing other processes that are difficult to understand. The team will work on the paper form to improve the experience for all users on all channels.

The team is generally using the GOV.UK design patterns. A new design for a declaration has been created and seems to work well. This pattern should be shared back with the design community, as this is a common need across services. The team has made some visual changes to the task list pattern that are unacceptable, with some text overlapping other text. The team should revisit the original task list design and move closer to it, and work with designers around government to improve the suitability and flexibility of the task list pattern.

Many recent changes to the interaction and content design, which should improve the overall flow, were not reflected in the deployed production code. These must be implemented before switching to a public beta.

The service will expand to cover more situations, including where there is no will. The language in the current form is hard to understand, and it is good to hear that this has been reconsidered and the new wording will be much easier to understand.

The team is working with DAC to improve the accessibility of the service, but there are still basic frontend errors that should be easy to fix. The frontend developers should do a sweep of the service and fix navigation errors before going to public beta. The team must continue to check the site’s accessibility and make improvements as the service is iterated.

It is good to hear that user support has been considered carefully and that support methods and scripts will be iterated. The team should continue to check that support is adequate, and that users who need assistance are getting the help they need. The team should also try different methods to highlight the support available throughout the user’s journey.

Common component – Identity and Access Management (IDAM)

The team uses common components for several parts of the user’s journey, including the HMCTS Identity and Access Management (IDAM) component. The panel was disappointed that little visible progress had been made on IDAM, and that problems encountered in previous HMCTS service assessments had not been resolved.

As with both the Civil Money Claims and Divorce services, the registration and sign-in process makes no reference to the service being used and may cause confusion for users. The team should continue to push for changes to the sign-in flow and will need support from the rest of HMCTS to make services more usable.

Common components should enable service teams to rapidly implement common functionality, meeting well-understood user needs, in their services. However, these must be implemented such that they do not distract from users’ primary purpose – to complete a task like applying for Probate – which is a risk with the current IDAM implementation.

Analytics

The team outlined a wide range of metrics specific to their service that they’ll be using to measure its effectiveness. Within this, they described a subset of KPIs which were of particular importance, including the number of ‘stops’ and drop-out points in the journey.

They explained that sometimes key targets, such as the need to reduce the time taken for transactions, need to be balanced with the need to be sensitive towards users who may be recently bereaved. They gave an example of how they had reduced the time spent on applications by stripping out superfluous requests for information.

The team is collecting data from a wide range of sources, including web analytics, the system’s backend, helpdesk contacts, and the casework system. This is being presented and shared using dashboards built with Google Data Studio.

In the short term, data for digital applications will be held separately from data for paper-based applications, making direct comparison more difficult.

The team has targets and a plan for increasing digital take-up in the coming months.

The team lacks a dedicated performance analyst and does not have ongoing access to a specialist to support performance analysis; analytics work to date has been carried out by other team members.

The panel recommends that the service team get support from a dedicated analytics specialist to improve and iterate the service during public beta. The service team said this was now being arranged through a programme function.

Recommendations

Before moving to public beta, the service team must:

  • Make sure the new platform is robust and stable enough under real usage by citizens
  • Implement a version of the task list closer to the original design pattern
  • Fix accessibility errors
  • Implement and deploy the designs that have already been prototyped and tested

To pass the next assessment, the service team must:

  • Share the declaration pattern with the design system team and designers around government, and work towards a new design pattern
  • Be able to make improvements to the registration and sign-in flows
  • Continue to rewrite legal terminology into language users understand as new cases are covered by the online journey, and iterate the paper form
  • Research and iterate the support process, including adviser scripts and how users access support
  • Obtain guidance from a dedicated performance analyst to further develop analysis and iterate the current set of performance measures
  • Research whether users with a range of skills and abilities can find support channels, whether support interactions enable them to progress and, if not, what changes would help

The service team should also:

  • Research the extent to which users are succeeding the first time, and what changes would help. Use observation and measurements appropriate to a process that may not be possible to complete in a single sitting.

Next Steps

Subject to resolving the mandatory recommendations above, the service will be approved to launch on a GOV.UK service domain. The team should liaise with Digital Engagement Manager James Pitman to confirm that the necessary actions have been taken, and subsequently begin the process to set up its *.service.gov.uk domain.

The team should follow the remainder of the recommendations made in this report before arranging its next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 6 August 2018