Claim a power of attorney refund alpha

The report from the alpha assessment for MOJ's claim a power of attorney refund service on 1 September 2017.

From: Central Digital and Data Office
Assessment date: 1 September 2017
Stage: Alpha
Result: Met
Service provider: Ministry of Justice - Claim a Power of Attorney Refund

The service met the Standard because:

  • The service team have a clear understanding of the challenges and scope of the service, and a well-developed plan for the private beta phase.
  • The service team demonstrated an understanding of the users of the service, and plans to develop this understanding further.
  • The service team are working successfully together in an iterative, user-centred manner.

About the service

Description

The service will allow users to apply for a partial refund of the fee paid to register a Power of Attorney between 1 April 2013 and 31 March 2017.

Service users

The users of this service are people who previously paid to register a Power of Attorney between 1 April 2013 and 31 March 2017.

Detail

The panel would like to commend the service team on their preparation for the assessment. Their presentation was focused and well delivered and they were able to provide a lot of relevant information to the panel.

User needs

The service team demonstrated a strong understanding of their users and their needs. The information gathered from the Power of Attorney service, combined with the research methods used, has helped to identify not only the high-level needs of users but also a clearer demographic picture of users and their behaviour.

The service team showed a plan to provide support for users during the process and to resolve queries that users may have after claiming the refund. It was also positive to see that, even though they are not yet working on the backend of the service, they have considered internal users in their plans.

We strongly recommend increasing the number of user research sessions to cover any possible variance in users’ needs or behaviour when using the service. We also encourage the use of a navigational prototype that participants can use by themselves, rather than walking them through the service.

It was very positive to hear that there is a plan to do more research with assisted digital users and users with accessibility needs. We would like to see more information about users who might have age-related issues (memory loss, cognitive difficulties, accessibility issues and so on). We also recommend extending research to users with lower digital skills who might need extra support.

Team

Despite working in different locations, the team showed strong evidence of the way in which they were managing to work together as a multi-disciplinary team in an iterative way. There was a good blend of skills, although the team will need to ensure that they have enough developer skills available to them in future phases. The panel were particularly impressed by the way in which the team had challenged approaches to the delivery of the service in order to ensure that a user-focused digital service is delivered.

Technology

Technical prototypes of the service have supported an iterative programme of user research in the alpha phase, and the team has done a good job of understanding the overall technical risks for the service. The use of existing Office of the Public Guardian tools and platforms for build and deployment seems reasonable, and we are glad to see gradual improvements being made to the stack.

The team are to be commended for their good plan for managing source code, especially the use of pull requests to ensure a second pair of eyes on every merge. We note that the application source code will need to be made public by the launch of public beta.

Many of the most sensitive areas of the system have not yet been fully designed or tested, such as the scheme for managing and rotating cryptographic keys and the mechanisms to protect this very sensitive personal and financial data from rogue actors. We look forward to seeing more in this area before the casework system begins to process actual money refunds.
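
As an illustration of the kind of key rotation scheme the panel would expect to see, the sketch below uses MultiFernet from the Python cryptography package, which encrypts with the newest key while still decrypting data protected by older keys. This is a generic technique under assumed key handling and a sample payload, not the team’s actual design.

    # A minimal key-rotation sketch using the Python "cryptography"
    # package. The sample payload and key handling are illustrative
    # assumptions, not the service's actual design.
    from cryptography.fernet import Fernet, MultiFernet

    old_key = Fernet(Fernet.generate_key())
    new_key = Fernet(Fernet.generate_key())

    # MultiFernet encrypts with the first (newest) key and can decrypt
    # with any of them, so old ciphertexts stay readable mid-rotation.
    f = MultiFernet([new_key, old_key])

    token = old_key.encrypt(b"sort code 12-34-56, account 12345678")

    # rotate() re-encrypts the token under the newest key, so stored
    # data can be migrated away from a retiring key in bulk.
    rotated = f.rotate(token)
    assert f.decrypt(rotated) == b"sort code 12-34-56, account 12345678"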

Design

The service team have demonstrated a user-centred process with design and research fully embedded in the process, and have iterated significant parts of the design based on things they’ve discovered in user research. The design is consistent with GDS design patterns and the team have made use of common platforms by integrating with GOV.UK Notify.
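
As an illustration of the Notify integration, the snippet below sends a templated email with the official notifications-python-client package; the API key, template ID and personalisation fields are placeholders rather than the service’s real values.

    # A minimal sketch of sending a confirmation email via GOV.UK Notify
    # using the official notifications-python-client package. The API
    # key, template ID and personalisation fields are placeholders.
    from notifications_python_client.notifications import NotificationsAPIClient

    client = NotificationsAPIClient("your-notify-api-key")

    client.send_email_notification(
        email_address="claimant@example.com",
        template_id="00000000-0000-0000-0000-000000000000",  # placeholder
        personalisation={"reference": "POA-12345"},  # assumed template fields
    )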

The panel would like to see more iterations of ‘unhappy paths’ and what the user experience would be in those cases. As an example, the first step of the journey is an eligibility question. If ‘No’ is selected, the user lands on a page with no further guidance on next steps.

As mentioned in the user research section, the panel would like to see research with more participants to get the best possible feedback from users. For example, the hint copy on the bank details page instructs users to find their details on their card. Some banks, such as NatWest and the Co-op, do not display this information on bank cards. Testing with real data and a wider pool of users may have surfaced this beforehand.

More testing should be done with users on different devices. For example, the ‘Start now’ button sits very low on the start page, a problem that is amplified on tablet and mobile. We recommend that more options are explored.

The team have shown that they are doing the hard work to make things simple for users, reducing the amount of information that an applicant has to submit. They have a good understanding of the scope of the service, pushing back to senior stakeholders to improve the experience for users.

Analytics

The team had a good understanding of the KPIs and funnels that they will use to measure success in the private beta phase. They know how they will gather this information from the digital service, from their call centre and from the team processing the applications.
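
As a sketch of what such funnel measurement might look like, the snippet below counts the distinct users reaching each step of an assumed journey; the step names and event format are illustrative, not the team’s actual KPI definitions.

    # A minimal funnel sketch: count distinct users reaching each step.
    # Step names and the (user_id, step) event format are assumptions.
    from collections import defaultdict

    STEPS = ["eligibility", "your-details", "bank-details", "confirmation"]

    def funnel(events):
        seen = defaultdict(set)
        for user_id, step in events:
            seen[step].add(user_id)
        return {step: len(seen[step]) for step in STEPS}

    events = [
        ("u1", "eligibility"), ("u1", "your-details"),
        ("u2", "eligibility"), ("u2", "your-details"),
        ("u2", "bank-details"), ("u2", "confirmation"),
    ]
    print(funnel(events))
    # {'eligibility': 2, 'your-details': 2, 'bank-details': 1, 'confirmation': 1}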

Recommendations

To pass the next assessment, the service team must:

  • Undertake user research with:
      • users outside of the London area;
      • Assisted Digital users;
      • users using their own devices to access the service;
      • people using non-desktop devices;

  • Ensure that processes are in place to record metrics for Assisted Digital users who use the service;
  • Ensure that the date of birth for an attorney is mandatory and not optional;
  • Have a further conversation with GDS technical assessors when the back office system is ready for deployment;
  • Carry out penetration tests and privacy impact assessments on the donor/attorney-facing component as planned, and then schedule another round of these tests once the casework components are built. Ensure that the back office system has been the subject of a penetration test before being used to process live applications;
  • Create a plan for managing, rotating and securely storing and using the encryption keys necessary to decrypt personal and financial data. Demonstrate that financial data cannot be leaked to the front-end application under any circumstances, and document the circumstances in which this data is held and processed in a decrypted state;
  • Document the end-to-end operations model, to include the vetting, training and supervision of caseworkers and developers and the implementation of any “two-person rules” in the system. This model should also include a dedicated and empowered service manager;
  • Document the audit characteristics of the system, including the proactive ability to monitor sensitive data egressing the network environment;
  • Open all source code before public launch;
  • Inform GDS if their plans to involve around 800 users in the private beta stage change significantly.

The service team should also:

  • Test with users:
      • the addition of the appeals process into the user flow;
      • a flow that allows the case/reference number to be verified and removes the need for further details to be entered;
      • options for the start page that include moving the ‘Start now’ button higher up the page;
      • the text phone option (or another non-audio route for accessing the service);

  • Consider their approach to requests for updates on the status of applications, and whether this is a candidate for being delivered digitally;
  • Consider other government services that may share users with this service, and the potential for cross-referencing;
  • Ensure that the OPG data retention policies support the retention of information about past applicants for the appropriate length of time;
  • Ensure that the technical resources allocated to the service are adequate to support it before inviting each successive group of users;
  • Continue their good work on a comprehensive rollout and communication plan for the public beta stage by working with likely partners and stakeholders to develop this plan;
  • Consider the auditing scheme of the system carefully, to include field-level provenance information in order to fully support future requirements of the General Data Protection Regulation;
  • Make a plan to publish any data which can be shared publicly from the service, especially of a statistical or performance nature. This data should be made available in machine-readable, open formats.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 30 July 2018