Help With Fees - Beta Assessment

The report from the beta assessment of the MoJ Help With Fees service on 14 June 2016.

Stage: Beta
Result: Met
Service provider: Ministry of Justice

Result of service assessment

The assessment panel has concluded the Help With Fees service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria and should proceed to launch on a service.gov.uk domain as a public beta.

Detail of the assessment

Service Manager: Zuz Kopecka

Lead Assessor: Steve Wood

Researching and understanding user needs [points 1, 2]

There are two user groups for this service: members of the public applying for help with court fees, and the staff who process the applications.

The service team has a sound understanding of its users. The public’s high-level needs centred on being able to complete the application form correctly and to provide the necessary supporting evidence. Failure to do either led to problems for HMCTS staff, who were rejecting around 70% of claims; this in turn caused delays for applicants, and the subsequent resubmissions created extra work for staff.

The service team had discovered that the public user group subdivides into two general types. One is exemplified by divorce applicants, who tend to be proactive and submit claims by post. The other includes people reacting to an ongoing legal process, such as being threatened with immediate eviction, who often need help at the court.

Some users of this service will clearly need assisted digital support. It was therefore good to learn that scripts have been created and tested with court and call centre staff.

The service team has conducted testing with users who have accessibility needs, including people with cognitive and visual impairments.

A sound research plan is in place for the public beta phase. It will focus on assisted digital support - looking at face-to-face and telephony as options - and on new features for the service.

The value of the user research and iterative development the team has undertaken is demonstrated by the reduction of the aforementioned 70% rejection rate to 16%. Moreover, much of the remaining rejection is due to ineligibility rather than the way applications have been completed.

Additionally, the time taken to process an application has more than halved, and the service now has a user satisfaction rating of 88%.

Running the service team [point 3]

The team is made up of a product manager, user researcher, delivery manager, designer, content designer, web ops engineer and three developers. It is clearly a talented team that possesses a deep understanding of its users and of the technology needed to develop a very good service. The team is co-located, visits courts all over the country, and collectively takes part in research. The service is being built using agile methods, working in a fortnightly sprint cycle with the usual ceremonies such as daily stand-ups, planning and retrospectives in place.

It was noted that there isn’t a data analyst on the team, but one will be made available to the service soon. This role becomes an increasingly important element of a service as it moves through the development phases. When the team returns for the live assessment, the GDS panel will be looking for evidence of actionable insights - and improvements - that have come from analysing the way people use the service.

Designing and testing the service [points 4, 5, 12, 13, 18]

Research shows that around half of the users at the courts were able to complete the service unaided. In lab sessions the completion rate was almost 100%. There is evidence that using the service on a mobile device works at least as well as, if not better than, using it on a desktop PC.

The service was tested with users who are blind, who are deaf, who have dyslexia, and who have various cognitive disabilities such as Asperger syndrome.

The minister knows about the service and it will be demonstrated to him during the public beta phase.

Technology, security and resilience [points 6, 7, 8, 9, 10, 11]

The approach the team is following is sensible and is in line with other Ministry of Justice services. The team is using MoJ platforms and GDS templates and styles. The code is hosted on GitHub under an open source licence (available for other departments and teams to use).

The service does not store user data in the public-facing part. A privacy impact assessment has been carried out, and the service has also been tested by the MoJ’s ethical hacker. The connection between the public-facing service and the staff backend is made using secure APIs.
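
As a rough illustration of the kind of integration described above (the endpoint URL, token handling and payload shape below are assumptions made for the sketch, not details of the service’s actual API), a submission from the public-facing service to the staff backend over HTTPS with token authentication could look like this in Python:

    # Sketch only: the URL, token handling and payload shape are assumptions,
    # not the real Help With Fees API.
    import os
    import requests

    BACKEND_URL = "https://staff-backend.example.internal/api/applications"  # hypothetical
    API_TOKEN = os.environ["BACKEND_API_TOKEN"]  # secret supplied by the environment, not the codebase

    def submit_application(application: dict) -> str:
        """Send a completed application to the staff backend and return its reference."""
        response = requests.post(
            BACKEND_URL,
            json=application,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=10,
        )
        response.raise_for_status()  # surface any rejection from the backend
        return response.json()["reference"]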

The service is hosted on AWS and is scalable. The public-facing and staff environments are in separate AWS VPCs and can be recreated within half an hour.
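
To illustrate that separation (the region, CIDR ranges and names below are assumptions, and the team’s actual environments will have been provisioned through its own templates rather than a script like this), two isolated VPCs for the public-facing and staff environments could be sketched with boto3:

    import boto3

    # Sketch only: region, CIDR blocks and names are assumptions.
    ec2 = boto3.client("ec2", region_name="eu-west-1")

    def create_environment_vpc(name: str, cidr: str) -> str:
        """Create an isolated VPC for one environment and return its ID."""
        vpc_id = ec2.create_vpc(CidrBlock=cidr)["Vpc"]["VpcId"]
        ec2.create_tags(Resources=[vpc_id], Tags=[{"Key": "Name", "Value": name}])
        return vpc_id

    # Keeping the public-facing and staff environments in separate VPCs means
    # neither can reach the other except through the published APIs.
    public_vpc_id = create_environment_vpc("hwf-public", "10.10.0.0/16")
    staff_vpc_id = create_environment_vpc("hwf-staff", "10.20.0.0/16")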

In the event of the service being taken offline, an error page will appear on GOV.UK and users will be pointed to the printable PDF version of the form.

It was good to hear that the service team are talking with GOV.UK Notify. GOV.UK Verify has been considered, but was found not to be suitable for this service.

Improving take-up and reporting performance [points 14, 15, 16, 17]

Google Analytics is already deployed on the service, but sufficient data to provide meaningful insight will not be available until the public beta stage. Work has already started with the GDS Performance Platform, and the service team is looking to use an MoJ data analyst to help provide that insight.

The KPIs that the service team are looking at focus on reducing the rejection rate and improving user satisfaction.

The target for digital uptake is 70%.

The improvements the team has made to management information (MI) recording were impressive, and the panel will be interested to hear at the live assessment whether the team has managed to use it for policy evaluation.

Recommendations

Before the live assessment the service team must:

  • Reconsider the 10-minute timeout and ‘start new application’ features. These features make sense within the context of the private beta, but should either be removed or be restricted to the public iPads located in the courts used in the private beta.

  • Optimise the start page title for search on GOV.UK and the broader web. The service name “Help with fees” makes sense within the context of user journeys through the courts and tribunals system, but in the broader context of GOV.UK (and users searching on Google) it could give the impression that it applies to other government fees. The team should optimise the start page title to make the domain of the service clearer.

The service team should also:

  • Ensure that the plan to bring a data analyst into the team is adhered to. Questions about improvements to the service based on data analytics are likely to be asked at a live service assessment.

  • Reconsider the decision to cut the budget by 66% for the public beta phase. The panel is concerned that the excellent work to date will stall without continued investment at or near the current level. The department needs to consider this decision very carefully, as the team will be splitting its attention across two services. This is likely to adversely affect delivery, and may pose problems at the live assessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 18 January 2017