Service Standard assessment report

MoJ Forms

Beta reassessment

Assessment date: 14/06/2024
Stage: Beta reassessment
Result: Amber
Service provider: Platforms & Architecture

Service description

MoJ Forms is a form-building platform that enables teams to prototype, create and host fully digital, accessible forms quickly and affordably. The platform aims to:

  • improve the quality and structure of data collection across MoJ
  • accelerate the digital transformation of paper, PDF and email processes

Service users

There are multiple users in the MoJ Forms product space.

Initial target users, when the team started private beta in late April 2021, were user-centred design (UCD) professionals within DDaT (Digital, Data and Technology):

  • content designers
  • user researchers
  • interaction designers

This was because of their knowledge of the service standard and the GOV.UK Design System as well as their experience of designing citizen-facing forms for government.

As part of private beta, the team quickly learned that user groups outside the DDaT profession also had needs and requirements: to improve their data collection processes and to make forms more accessible for their end users.

The team has therefore started going beyond these core UCD roles. By doing so they support the Justice Digital strategy, empowering teams that lack DDaT professionals to make simpler, faster and better services with a form-building platform. MoJ Forms ensures they can design and build efficient, accessible forms that are hosted securely via MoJ Cloud Platform.

Non-UCD users are typically in:

  • operational teams, such as caseworkers or operations leads
  • policy teams responsible for certain processes

The team knows that additional support will be required here, something they explored in a discovery in 2021. They have since put measures in place to provide extra support to those user groups.

Things the service team has done well at reassessment:

  • shared analytics with the Government Digital Service, to benefit the new strategic cross-government product
  • taken on board the feedback given around point 5 of the Service Standard; the service is more accessible and inclusive as a result

1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

Recommendations

  • aside from the user needs for the ‘form fillers’, the team needs to review their user needs. A user need should focus on the problem a user is facing, not the solution. For example, ‘As an MoJ employee I need to be able to create online forms quickly and easily so that I can ensure efficient access to justice for all’ suggests that online forms will solve access to justice for all. This need also focuses on the solution (forms), not the problem, such as needing a way to collect structured information from users.
  • some of the user needs are incomplete, and the ‘so that’ part of the user needs should be completed if they are being considered as user needs of the service
  • the team should consult the service manual guidance on user needs and guidance on tests of a good user need. The panel can support the team with this if needed.
  • the team would benefit from having granular user needs that fall under several overarching, high-level user needs. This would help them communicate the needs of form creators and other user types, which are not always clear.
  • the team should be plotting their users on the digital inclusion scale and should also be recruiting users with lower digital skills where possible, to understand how their needs differ from users with higher digital skills.

2. Solve a whole problem for users

There are significant gaps in the governance of the end-to-end service, in particular the editorial process. The team explained that ‘guard rails’, such as guidance and an engagement call, had been built into the online service to help users comply with accessibility standards and the Service Standard. While these features will go some way towards helping form creators in the design process, they will not prevent users who lack the relevant design skills and knowledge from creating inaccessible, poorly designed forms that do not comply with GDS standards and that particularly fail users with cognitive disabilities and users with English as a second language. The personas most at risk of this include ‘Alex - Senior Policy Advisor’ and ‘Jamie - Operations Manager’.

The team said there were close to 200 form creators, with approximately 25% being non-UCD professionals, and that this number would scale up. This is concerning, as at present form creators do not need to demonstrate that they have the competence to create good quality, accessible forms that comply with GDS, GDPR and security standards. There is no robust process in place to check form creators’ work and the resulting forms, and form creators in non-design roles are not given any formal training by the MoJ to enable them to create forms that comply with GDPR, security, design and accessibility standards. The guidance currently provided is highly unlikely to give these form creators the skills they need, in the same way that it would be difficult to give someone an article to read on how to swim and then expect them to be a competent swimmer.

This approach creates a legal risk, as forms hosted on the platform may be neither accessible nor compliant with GDS standards. It also creates reputational risks, as well as the risk that MoJ will need to cover the cost of increased contact from citizens who do not understand what they need to do to complete a form, and the cost of reworking poor quality forms in future.

Decision

The service was rated amber for point 2 of the Standard.

Recommendations

  • It is recommended that the team agree and document a governance process for MoJ so that published forms comply with the Service Standard. For example, only allow people with the relevant content design, UX design and form creation skills to create and edit forms on behalf of non-designers, enabling them to meet their deadlines; or adopt the GOV.UK website publishing approach, where a content designer designs the content and ‘owns’ the words while a subject matter expert ‘owns’ the facts but does not write the content. While this change might not sit with the design team, it should be the responsibility of the wider service team and MoJ.
  • while some forms go through GDS assessments because of their number of transactions, many do not seem to go through any formalised assurance process to ensure they are compliant, for example that they are accessible, easy to use, meet user needs and follow GDS styles. The team mentioned that an alternative assessment process was being discussed. It is recommended that an effective assurance process be agreed and put in place as soon as possible, with assessors skilled in areas such as user-centred design, the Service Standard and accessibility. Assessors should be empowered to compel teams to rework forms that are not accessible or do not meet the Service Standard.

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

Recommendations

  • The team said form owners were responsible for designing a joined-up end-to-end user experience. However, the team also needs to understand how their part of the service fits into end-to-end user experiences and to work to help services join up that experience. The team needs to extend their existing service map to cover the full end-to-end user experience, including the user needs and experience of form fillers and returning form fillers, form creators, those processing the forms and those using the data captured. The map should include pain points and unhappy paths; so far the team has only mapped the happy-path steps for form creators.
  • the team should speak to downstream users of the data to find out what impact the absence of data validation is having on the quality of data collected, including reference numbers and addresses
  • the team needs to follow the government’s standard for property and street information: introduce a postcode lookup for UK addresses to collect definitive street information, while also giving users without a UK postcode the option to enter their address manually (see the sketch after this list)
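As a rough illustration of that recommendation, the sketch below shows a postcode lookup with a manual-entry fallback. It is a minimal sketch in TypeScript, not MoJ Forms code: the `lookupAddressesByPostcode` function, the `Address` shape and the simplified postcode pattern are all assumptions standing in for whichever address lookup service and validation rules the team adopts.

```typescript
// Illustrative address shape; not the MoJ Forms data model.
interface Address {
  line1: string;
  line2?: string;
  town: string;
  postcode: string;
}

// Simplified UK postcode pattern; a production service should apply the
// full validation rules from the government's address standard.
const UK_POSTCODE = /^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$/i;

// Hypothetical stand-in for whichever address lookup API is chosen.
async function lookupAddressesByPostcode(postcode: string): Promise<Address[]> {
  throw new Error("not implemented in this sketch");
}

async function resolveAddress(
  postcode: string | undefined,
  promptManualEntry: () => Promise<Address>
): Promise<Address> {
  // Users without a UK postcode always have the manual-entry route.
  if (!postcode || !UK_POSTCODE.test(postcode.trim())) {
    return promptManualEntry();
  }
  try {
    const matches = await lookupAddressesByPostcode(postcode.trim());
    // No match found: fall back to manual entry rather than blocking the user.
    if (matches.length === 0) return promptManualEntry();
    // In a real form the user would pick from `matches`; here we take the first.
    return matches[0];
  } catch {
    // Lookup service unavailable: again, fall back rather than block.
    return promptManualEntry();
  }
}
```

The design point the sketch makes is that the lookup is an aid, not a gate: every failure path leads back to manual entry, so the form remains usable for people without a UK postcode or when the lookup service is down.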

4. Make the service simple to use

Decision

The service was rated amber for point 4 of the Standard.

Recommendations

  • The team should do usability testing with form fillers on mobiles and tablets to get feedback on the user experience on those devices. The team provided data showing that more users complete forms on a mobile device than on any other device.
  • the team needs to test the usability of the service with users with access needs so they can be confident the service meets the needs of those users. For example, can users of assistive technology easily use the ‘right click’ functionality to select a component when building a form?
  • the team has done some testing of forms being filled in by form fillers and citizens, but needs to expand on this to include users with access needs, so they can be confident form fillers find it simple to fill in forms quickly and easily. They should also analyse data such as completion rates and citizens’ feedback to help them understand pain points that can be addressed by iterating the form builder, for example where users need to save and come back later, or need to have only one thing to do per page.

5. Make sure everyone can use the service

Decision

The service was rated green for point 5 of the Standard.

6. Have a multidisciplinary team

Decision

The service was rated amber for point 6 of the Standard.

Recommendations

  • The team should have access to a permanent performance analyst who can provide the support and expertise required to measure the impact of iterations and to implement analytics tracking on the form builder.

8. Iterate and improve frequently

Decision

The service was rated amber for point 8 of the Standard.

Recommendations

  • effectively measure the impact of the iterations made in private beta
  • start incorporating success measures using design hypotheses into their iterations, for example (see the sketch after this list):
      • if we: improve the content on the ‘What you will need’ page
      • we will: see fewer users dropping out of the service and better quality applications
      • success measures: percentage of users dropping out of the service (decrease), average number of attempts to complete the service (decrease), number of repeat returns (decrease)
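To make the success measures concrete, here is a minimal sketch of how two of them could be computed from analytics events. The event shape and the event names (`form_started`, `form_completed`) are illustrative assumptions, not the MoJ Forms analytics schema.

```typescript
// Illustrative event shape; field and event names are assumptions.
interface AnalyticsEvent {
  userId: string;
  type: "form_started" | "form_completed";
}

// Drop-out rate: share of users who started a form but never completed it.
function dropOutRate(events: AnalyticsEvent[]): number {
  const started = new Set<string>();
  const completed = new Set<string>();
  for (const e of events) {
    if (e.type === "form_started") started.add(e.userId);
    if (e.type === "form_completed") completed.add(e.userId);
  }
  if (started.size === 0) return 0;
  let droppedOut = 0;
  for (const u of started) {
    if (!completed.has(u)) droppedOut++;
  }
  return droppedOut / started.size;
}

// Average number of attempts (starts) per completing user.
function averageAttempts(events: AnalyticsEvent[]): number {
  const starts = new Map<string, number>();
  const completed = new Set<string>();
  for (const e of events) {
    if (e.type === "form_started") {
      starts.set(e.userId, (starts.get(e.userId) ?? 0) + 1);
    }
    if (e.type === "form_completed") completed.add(e.userId);
  }
  if (completed.size === 0) return 0;
  let total = 0;
  for (const u of completed) total += starts.get(u) ?? 0;
  return total / completed.size;
}
```

Comparing these figures before and after an iteration ships would show whether the hypothesis held.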

9. Create a secure service which protects users’ privacy

Decision

The service was rated amber for point 9 of the Standard.

Recommendations

  • An IT Health Check has been performed on the previous technical solution, components of which have migrated to the new solution. The team should continue to work with internal Cyber Security teams to ensure compliance with MoJ security policies and practices, and to get the most out of technologies that may be available from central resources.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

14. Operate a reliable service

Decision

The service was rated amber for point 14 of the Standard.

Recommendations

  • it was not clear whether all risks identified as part of the threat modelling activity had been captured in risk logs. The recommendation is to track risks centrally in a single risk register.
  • it was discussed at the tech pre-meet that data recovery had not been tested. The team should work with MoJ security and business continuity teams to exercise test plans and ensure that data can be recovered, and should carry out a risk assessment and track it in the risk register.

Next Steps

This service can now move into a public beta phase, subject to addressing the recommendations for the amber points within three months and to CDDO spend approval.

This service now has permission to launch on a GOV.UK service domain with a Beta banner. These instructions explain how to set up your *.service.gov.uk domain.

The service must pass a live assessment before:

  • turning off the legacy service
  • reducing the team’s resource to a ‘business as usual’ team, or
  • removing the ‘beta’ banner from the service

Updates to this page

Published 14 October 2024