Submit your appeal alpha assessment

The report from the alpha assessment for HMCTS's submit your appeal service on 11 September 2017.

From: Central Digital and Data Office
Assessment date: 11 September 2017
Stage: Alpha
Result: Met
Service provider: Her Majesty’s Courts and Tribunals Service (HMCTS)

The service met the Standard because:

  • the team is developing the service in a user-centred way
  • the team has a clear understanding of their users and their needs
  • the team is thinking about the end-to-end service rather than just a digital form

About the service

Description

The ‘Appeal a Benefit Decision’ service is for benefit claimants who have received a benefit decision they do not agree with. They have the right to appeal to an independent tribunal, which will look at the decision again. This part of the journey is the submission of the appeal to the tribunal.

Service users

The users of this service are benefit claimants or their representatives who are looking to contest a decision by DWP to reduce or remove their benefits.

Detail

User needs

The team showed a good understanding of user needs that was grounded in appropriate levels of research. Research had been conducted with appellants, HMCTS staff, welfare rights advisers and judges. A broad range of techniques had been used, including observation, interviews, surveys and lab-based research. These had led to a comprehensive list of user needs. Team members were able to talk confidently about how design features in the prototype were linked to research findings.

The understanding of the needs of Assisted Digital users was less well developed and focused more on how intermediaries would cater for users with the most complex needs. It was not completely clear how a telephone line met user needs or how the service would make best use of the upcoming HMCTS face-to-face assisted digital service.

Pain points in the current service were understood, particularly around receiving information, providing reasons for appealing, legal challenge and feelings of confusion. There was evidence in the prototypes shown of these pain points being addressed.

Team

The team has demonstrated that they are operating in a multidisciplinary way with good agile practices and tools. The team demonstrated that they have appropriate governance that empowers them to make decisions, and that they have encouraged DWP (a key stakeholder) to be part of their programme board, with a mirror of the board created in DWP.

There is also a long-term plan for building sustainable multidisciplinary teams within HMCTS, with the Service Owner, Delivery Manager and Product Owner being civil servants. The organisation has identified other roles that it plans to convert from contractor posts to civil servant posts over the next few years.

Technology

The team have used simple prototypes to quickly test changes to forms with users, a good approach for alpha.

The service collects information from a user and submits it to a back-end system that is currently being replaced with a new system. The team demonstrated that the service is decoupled from the back-end system, and that design decisions such as the format of fields were dictated by the needs of users rather than the constraints of integration with back-end systems.
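
As an illustration of the kind of decoupling described above, the sketch below shows a form layer that talks to the back end only through a small gateway interface, so user-facing field formats are not dictated by the case system. This is not the team's actual code; the class names, fields and payload shape are assumptions for the example.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Appeal:
    """Appeal details captured in the format users naturally provide them."""
    appellant_name: str
    nino: str
    reasons_for_appeal: str


class AppealGateway(ABC):
    """Boundary between the citizen-facing form and any back-end case system."""

    @abstractmethod
    def submit(self, appeal: Appeal) -> str:
        """Send the appeal to the case system and return a case reference."""


class LegacyCaseSystemGateway(AppealGateway):
    """Adapter for the current back end; only this class changes when it is replaced."""

    def submit(self, appeal: Appeal) -> str:
        # Translate user-friendly fields into the shape the back end expects.
        payload = {
            "APPELLANT": appeal.appellant_name.upper(),
            "NINO": appeal.nino.replace(" ", ""),
            "GROUNDS": appeal.reasons_for_appeal,
        }
        # In a real service this would be an API call; here we just fake a reference.
        return f"SC-{abs(hash(frozenset(payload.items()))) % 100000:05d}"


if __name__ == "__main__":
    gateway: AppealGateway = LegacyCaseSystemGateway()
    reference = gateway.submit(
        Appeal("Sam Smith", "QQ 12 34 56 C", "The decision ignored my latest medical report.")
    )
    print("Appeal submitted, reference:", reference)
```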

Notifications about the progress of the appeal are sent using the GOV.UK Notify platform. Tracking the progress of an appeal is handled by a separate tribunals-wide service, which is in private beta and is being assessed separately.
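
For readers unfamiliar with GOV.UK Notify, the minimal sketch below shows how a progress email might be sent using the official notifications-python-client library. The API key, template ID, recipient and personalisation fields are placeholders, not the service's real values.

```python
# Minimal sketch of sending a progress update through GOV.UK Notify.
from notifications_python_client.notifications import NotificationsAPIClient

notify_client = NotificationsAPIClient("your-notify-api-key")  # placeholder key

response = notify_client.send_email_notification(
    email_address="appellant@example.com",                    # placeholder recipient
    template_id="3f9f8e6a-0000-0000-0000-000000000000",       # placeholder template
    personalisation={
        "appeal_reference": "SC-00123",
        "status": "Your appeal has been received by the tribunal",
    },
)
print(response["id"])  # Notify returns a notification id that can be used for auditing
```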

For private beta, the team outlined plans to work with DWP to reduce the amount of information the user has to re-submit during the appeals process, and to use another tribunals platform to provide uploading of supporting evidence.

Design

The team has made good use of established styles and design patterns and has adapted them to fit the needs of their users where appropriate.

The team showed that they had tested multiple solutions to specific problems and iterated the design of their service accordingly.

The team’s prototypes were very well organised and were of an appropriate fidelity.

The team recognised that the submit process is just one part of the end-to-end user journey, and that much of that journey is delivered by DWP. They showed that they were working with DWP to improve those aspects of the user journey, with some success.

The team showed that users were able to complete this part of the service, but when evidence submission and appeal tracking are added they should test whether users need more support. In particular, design patterns like save-and-return and task list may become necessary.

Analytics

The team have been considering the key performance indicators they would like to measure from private beta onwards, speaking to the wider business and DWP. The team have also registered with the Performance Platform.

Recommendations

To pass the next assessment, the service team must:

  • Show that users can complete the end-to-end service, from the point where they dispute a DWP decision to the point where a tribunal verdict is given.
  • Make sure that users who can’t complete the end-to-end service in a single session (for example if they need time to gather evidence) are given adequate support.

The service team should also:

  • Make sure they’re using the latest styles, components and design patterns from GDS.
  • Publish information about any new patterns they’ve developed (for example, the high-level progress bar).
  • Pilot Assisted Digital help with their supplier and make sure it meets user needs.

Next steps

Pass - alpha

You should follow the recommendations made in this report before arranging your next assessment with the Digital Engagement Manager.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 30 July 2018