Service Standard assessment report
MoJ Forms
From: Assurance team
Assessment date: 06/12/2023
Stage: Beta
Result: Red
Service provider: Platforms & Architecture
Service description
- MoJ Forms is a form building platform, enabling teams to prototype, create and host fully digital, accessible forms quickly and affordably.
- Improving the quality and structure of data collection across the MoJ
- Accelerating the digital transformation of paper, PDF and email processes
Service users
There are multiple user groups in the MoJ Forms product space.
Initial target users were user-centred design (UCD) professionals within DDaT (Digital, Data and Technology) when the team started private beta in late April 2021:
- content designers
- user researchers
- interaction designers
This was because of their knowledge of the service standard and the GOV.UK Design System as well as their experience of designing citizen-facing forms for government.
As part of private beta, the team quickly learned that other user groups that sit outside of the DDaT profession also had needs and requirements. These user groups had a need to improve their data collection processes and improve accessibility of forms for their end users.
The team has therefore started going beyond these core UCD roles. By doing so they are enabling the Justice Digital strategy to empower teams that lack DDaT professionals to make simpler, faster and better services with a form building platform. MoJ Forms ensures they can design and build efficient, accessible forms that are hosted securely via the MoJ Cloud Platform.
Non-UCD users are typically in:
- operational teams, such as caseworkers or operations leads
- policy teams responsible for certain processes
The team knows that there will be additional support required here and that’s something they explored in a discovery in 2021. They have since put measures in place to provide extra support to those user groups.
Things the service team has done well:
The team should be very proud of having developed an easy-to-use, intuitive service that makes it easy for users to create forms and visualise complex workflows, and of the positive feedback they have received from users.
The team has used a range of research methods to understand its users and their behaviour, has actively engaged a good number of users in its research, and has iterated based on their feedback; for example, it has developed and iterated the branching feature, making a complex task simple for the user to achieve. Many important aspects of the wider end-to-end journey have been covered. For example, the team has:
- helped teams determine whether using MoJ Forms is right for their service
- provided accessibility guidance, as well as a template accessibility statement to enable teams to easily create statements for their forms
- added the ability to connect to Google Analytics to measure the performance of published forms
- added validation to help prevent form builders from sending emails from a personal email address (a minimal sketch of this kind of check follows this list)
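The report does not say how this validation works under the hood. Purely as an illustration of the kind of check involved, a minimal sketch follows; the allowed domain list and function name are assumptions, not MoJ Forms’ actual code.

```typescript
// Illustrative only: reject "send responses from" addresses that are not on
// an approved organisational domain, so a form cannot send email from a
// personal address. The domain list here is an assumption for the sketch,
// not MoJ Forms' actual configuration.
const ALLOWED_SENDER_DOMAINS = ["justice.gov.uk", "digital.justice.gov.uk"];

function isAllowedSenderAddress(email: string): boolean {
  // Require exactly one "@" and capture the domain part.
  const match = email.trim().toLowerCase().match(/^[^@\s]+@([^@\s]+)$/);
  if (!match) {
    return false; // not a plausible email address at all
  }
  const domain = match[1];
  return ALLOWED_SENDER_DOMAINS.some(
    (allowed) => domain === allowed || domain.endsWith(`.${allowed}`)
  );
}

// Example: a personal address is rejected, an organisational one is accepted.
console.log(isAllowedSenderAddress("someone@gmail.com"));        // false
console.log(isAllowedSenderAddress("team.name@justice.gov.uk")); // true
```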
The team is sharing knowledge with other organisations looking to develop their form building capability, such as other UK government departments and the German government.
The team has created high-level goals and derived KPIs from these.
1. Understand users and their needs
Assessed by: User research assessor (and design assessor when relevant)
Decision
The service was rated amber for point 1 of the Standard.
Recommendations:
- aside from the user needs for the ‘form fillers’, the team needs to review their user needs. A user need should focus on the problem a user is facing, not the solution. For example, ‘As an MoJ employee I need to be able to create online forms quickly and easily so that I can ensure efficient access to justice for all’ suggests that online forms will solve access to justice for all. This need also focuses on the solution (forms), not the problem, such as needing a way of getting structured information from users.
- some of the user needs are incomplete, and the ‘so that’ part of the user needs should be completed if they are being considered as user needs of the service
- the team should consult the service manual guidance on user needs and guidance on tests of a good user need. The panel can support the team with this if needed.
- the team would benefit from having granular user needs that fall under several overarching/high-level user needs. This would really help in communicating their user needs for form creators and other user types, which aren’t always clear.
- the team should be plotting their users on the digital inclusion scale and should also be recruiting users with lower digital skills where possible, to understand how their needs differ from users with higher digital skills.
2. Solve a whole problem for users
Design assessor (with input from Research and Lead assessors where relevant)
There are significant gaps in the governance of the end-to-end service, in particular the editorial process. The team explained that ‘guard rails’ had been built into the online service to help users comply with accessibility standards and the Service Standard, such as guidance and an engagement call. While these features will go some way towards helping form creators in the design process, they will not prevent users who lack the relevant design skills and knowledge from creating poorly designed forms that do not comply with GDS standards and that are inaccessible, particularly to people with cognitive disabilities and people with English as a second language. The personas most at risk of this include ‘Alex - Senior Policy Advisor’ and ‘Jamie - Operations Manager’.
The team said there were close to 200 form creators, approximately 25% of whom are non-UCD professionals, and that this number would scale up. This is concerning, as at present form creators do not need to demonstrate they have the competence to create good quality, accessible forms that comply with GDS, GDPR and security standards. There is not currently a robust process in place to check form creators’ work and the resulting forms. Form creators in non-design roles are not given any formal training by the MoJ to enable them to create forms that are compliant with GDPR, security, design and accessibility standards.
The guidance currently provided is highly unlikely to give these form creators the skills they need, in the same way that it would be difficult to give someone an article to read on how to swim and then expect them to be a competent swimmer.
This approach creates a legal risk, because forms hosted on the platform may not be accessible, in addition to not being compliant with GDS standards. It also creates reputational risks, and a risk that MoJ will need to cover the cost of increased contact from citizens who do not understand what they need to do to complete a form, as well as the cost of reworking any poor quality forms in future.
While the team itself might not be able to address these issues, the wider MoJ should address them by taking on board the recommendations below before the service moves into public beta.
Decision
The service was rated amber for point 2 of the Standard.
Recommendations:
- it is recommended that MoJ agree, document and implement a governance process that results in published forms that are accessible, easy to use, meet user needs, collect quality data, follow GDS styles and comply with the Service Standard. For example, they could consider approaches such as only allowing people with the relevant content design, UX design and form creation skills to create and edit forms quickly on behalf of non-designers, so those teams can still meet their deadlines. The MoJ should consider adopting the GOV.UK website publishing approach, where a content designer designs the content and ‘owns’ the words while a subject matter expert ‘owns’ the facts, but the subject matter expert does not write the content. These recommendations should be taken forward by the MoJ in consultation with the wider service team.
- while some forms, due to the number of transactions, go through GDS assessments, many do not go through any formalised assurance process to ensure they are accessible, easy to use, collect quality data, meet user needs and follow GDS styles. While there is currently no requirement for all forms to go through a GDS assessment, forms built with MoJ Forms should still be subject to internal assurance checks by MoJ on their quality, accessibility and usability. The team mentioned that an alternative assessment process was being discussed. It is recommended that MoJ agree and put in place an effective assurance process as soon as possible, with assurers skilled in areas such as user-centred design, the Service Standard and accessibility. The assurers should be empowered to require teams to rework any forms that are not accessible or do not meet GDS standards.
3. Provide a joined-up experience across all channels
Design assessor (with input from Research and Lead assessors where relevant)
Decision
The service was rated amber for point 3 of the Standard.
Recommendations
- the team said form owners were responsible for designing a joined-up end-to-end user experience. However, the team also needs to understand how their part of the service fits into end-to-end user experiences and needs to work to help services join up that experience. The team needs to extend their existing service map to map out and understand the end-to-end user experience. This includes the user needs and experience of form fillers and returning form fillers, form creators, those processing the forms and those using the data captured. They should include pain points and unhappy paths. The team has so far only mapped the happy path steps for the form creators.
- the team should speak to downstream users of the data to find out what impact the absence of data validation is having on the quality of data collected including any reference numbers and addresses.
- poor quality location data is more likely to be collected when users are allowed to supply free-text addresses, for example because users can mistype their address or not provide a complete one. Poor address data makes it harder for postal workers to deliver letters to the correct addresses, which means citizens might not receive important correspondence, the department may be unable to serve documents on citizens, and letters are returned to sender marked as ‘dead’ letters
- it is recommended that the team researches the user needs for collecting high quality location data and identifies any pain points created by the collection of poor quality address data. This should include both citizens and internal users, for example those using location data within MoJ and potentially those using it across government. The team should also better understand the quality of the location data MoJ Forms is collecting at the moment and why the MoJ is collecting addresses. For example: does the department need to send and serve letters on users at an address a postal worker can find? Do citizens have a user need to receive the letters the department sends to them? Does the department want to reduce the number of ‘dead’ letters it receives? Do internal MoJ and cross-government users need high quality address data so they can match user records across databases and better identify fraudulent activity through entity matching? Does the MoJ wish to use AI to interrogate and manage data, either now or in future? If so, high quality address data will make the application of AI more effective
- to ensure the MoJ is collecting high quality address data, the team should follow the government’s standard for property and street information: design an accessible and user-friendly postcode lookup for UK addresses to collect definitive street information, while also giving users without a UK postcode the option to enter their address manually (a minimal sketch of this pattern follows this list). See point 10 of the Technology Code of Practice (TCoP). The DWP has developed a design pattern for this
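To illustrate the shape of the recommended pattern, here is a minimal sketch of a postcode lookup with a manual-entry fallback. It is not MoJ Forms code: the lookup service, function names and behaviour are assumptions for the example, and a real implementation would follow the government address standard and the DWP pattern referenced above.

```typescript
// Illustrative sketch only: validate a UK postcode via a lookup service and
// fall back to manual entry when the user has no UK postcode or the lookup
// fails. postcodes.io is used purely as an example of a public lookup
// service; a production implementation would follow the government address
// standard and would let the user pick a full premise-level address.

interface AddressInput {
  postcode?: string;       // optional: the user may not have a UK postcode
  manualAddress?: string;  // free-text fallback for non-UK or failed lookups
}

async function isValidUkPostcode(postcode: string): Promise<boolean> {
  const response = await fetch(
    `https://api.postcodes.io/postcodes/${encodeURIComponent(postcode)}/validate`
  );
  if (!response.ok) {
    return false;
  }
  const body = (await response.json()) as { result: boolean };
  return body.result;
}

async function resolveAddress(input: AddressInput): Promise<string> {
  if (input.postcode && (await isValidUkPostcode(input.postcode))) {
    // A real form would now present the addresses found at this postcode
    // for the user to choose from, rather than returning a string.
    return `Validated UK postcode: ${input.postcode.toUpperCase()}`;
  }
  if (input.manualAddress && input.manualAddress.trim().length > 0) {
    return `Manually entered address: ${input.manualAddress.trim()}`;
  }
  throw new Error("No usable address supplied");
}

// Example usage (hypothetical values):
// resolveAddress({ postcode: "SW1H 9AJ" }).then(console.log);
// resolveAddress({ manualAddress: "12 Example Street, Dublin" }).then(console.log);
```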
4. Make the service simple to use
Design assessor
Decision
The service was rated amber for point 4 of the Standard.
Recommendations
- the team should do usability testing with form fillers on mobiles and tablets to get feedback on the user experience on those devices, since the data the team provided shows that more users complete forms on a mobile device than on any other device.
- the team needs to test the usability of the service with users with access needs so they can be confident the service meets the needs of those users; for example, can users of assistive technology easily use the ‘right click’ functionality to select a component when building a form?
- the team has done some testing of the forms being filled in by form fillers/citizens, but needs to expand on this and include users with access needs, so they can be confident that form fillers find it simple to fill in forms quickly and easily. They should also analyse data such as completion rates and citizen feedback to help them understand any pain points that can be addressed through iterating the form builder, for example where users need to save and come back later or need to have only one thing to do per page.
5. Make sure everyone can use the service
Design assessor with User Research assessor input
Decision
The service was rated red for point 5 of the Standard.
Recommendations
- the panel appreciates that the team is using GOV.UK patterns, but the team should test the form builder with more than one user who has an access need during private beta. For this phase, the team should test with users who have different types of access needs, like thinking and understanding, sight, mobility, and others, to identify additional user needs in this space and potential iterations to the product.
- the team should develop a comprehensive strategy for recruiting users with access needs, and a way of asking users about those needs that makes it easy to identify participants. The team should consider recruiting across government for users with access needs if they haven’t done so already, to give a wider group to recruit from. The panel can support the team with this.
6. Have a multidisciplinary team
Lead assessor
Decision
The service was rated amber for point 6 of the Standard.
Recommendations
- the team should have access to a permanent performance analyst who can provide the support and expertise required to measure the impact of iterations and to implement analytics tracking on the form builder.
7. Use agile ways of working
Lead assessor
Decision
The service was rated green for point 7 of the Standard.
8. Iterate and improve frequently
Lead assessor with input from User Research, Design and Performance Analyst when relevant
Decision
The service was rated amber for point 8 of the Standard.
Recommendations
- effectively measure the impact of the iterations made in private beta
- start incorporating success measures using design hypotheses into their iteration, for example (a minimal sketch of computing one such measure follows this list):
- if we: improve the content on the What You Will Need page
- we will: see fewer users dropping out of the service and better quality applications
- success measures: % of users dropping out of the service (decrease), average number of attempts to complete the service (decrease), number of repeat returns (decrease), etc.
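As an illustration of how a success measure like this could be quantified, the sketch below computes a drop-out rate from funnel counts and compares it before and after an iteration; the figures and names are invented for the example.

```typescript
// Illustrative only: quantify "% of users dropping out of the service" for a
// design hypothesis by comparing funnel counts before and after a change.
// The figures and field names below are invented for the example.

interface FunnelCounts {
  startedForm: number;    // sessions that reached the first question page
  completedForm: number;  // sessions that reached the confirmation page
}

function dropOutRate(counts: FunnelCounts): number {
  if (counts.startedForm === 0) {
    return 0; // avoid dividing by zero when there is no traffic
  }
  return (counts.startedForm - counts.completedForm) / counts.startedForm;
}

const beforeChange: FunnelCounts = { startedForm: 1200, completedForm: 780 };
const afterChange: FunnelCounts = { startedForm: 1150, completedForm: 840 };

const before = dropOutRate(beforeChange); // 0.35
const after = dropOutRate(afterChange);   // roughly 0.27

console.log(
  `Drop-out rate moved from ${(before * 100).toFixed(1)}% ` +
    `to ${(after * 100).toFixed(1)}% after the change`
);
```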
9. Create a secure service which protects users’ privacy
Tech assessor with Performance Analyst input when relevant
Decision
The service was rated amber for point 9 of the Standard.
Recommendations
- an IT Health Check was performed on the previous technical solution, components of which have been migrated to the new solution. The team should continue to work with internal Cyber Security teams to ensure compliance with MoJ security policies and practices, and to get the most out of technologies that may be available from central resources.
10. Define what success looks like and publish performance data
Lead assessor at alpha. Performance Analyst at beta and live
Decision
The service was rated red for point 10 of the Standard.
Recommendations
- the team needs to ensure there are sufficient ways of working in place to incorporate analytics into the continual improvement of the service and to measure the impact of iterations.
- implement analytics tracking on the builder side of MoJ Forms before progressing into public beta (a minimal sketch of one possible approach follows this list)
- evaluate the GA4 set-up and ensure it is absorbed into any strategic solution adopted by the Ministry of Justice as the department matures its performance analytics practice. The aim should be to move away from siloed instances of the free Google Analytics implementation.
- expand the high-level goals of Measuring Success into a full performance framework. Start with the overall purpose of the service, then break that down into overall aims. From there, turn the aims into success/failure points (what will we tangibly see if we are succeeding or failing), and then work out how to quantify those success/failure points. See: https://www.gov.uk/service-manual/measuring-success/how-to-set-performance-metrics-for-your-service
- use the performance framework to set your service’s KPIs. This will allow you to prioritise data collection and utilise data sources outside of Google Analytics.
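As an illustration of what builder-side tracking could look like, the sketch below sends a custom GA4 event when a form creator publishes a form. The event and parameter names are assumptions, not an agreed tracking plan for MoJ Forms.

```typescript
// Illustrative only: send a custom GA4 event from the form builder when a
// creator publishes a form, so the impact of builder iterations can be
// measured. The event and parameter names are assumptions for the sketch,
// and gtag() must already be loaded on the page by the standard Google
// Analytics snippet.

declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, unknown>
): void;

function trackFormPublished(
  formId: string,
  pageCount: number,
  usedBranching: boolean
): void {
  gtag("event", "form_published", {
    form_id: formId,               // which form was published
    page_count: pageCount,         // a rough measure of form complexity
    used_branching: usedBranching, // whether the branching feature was used
  });
}

// Example call from the builder's publish flow (hypothetical values):
trackFormPublished("example-contact-form", 12, true);
```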
11. Choose the right tools and technology
Tech assessor
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Tech assessor
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Tech assessor for open standards and components, and design assessor for the design system
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Tech assessor with input from Lead and Design assessors
Decision
The service was rated amber for point 14 of the Standard.
Recommendations
- it was not clear whether all risks identified as part of the threat modelling activity had been captured in risk logs. The recommendation is to track risks centrally in a single risk register.
- it was discussed at the tech pre-meet that data recovery had not been tested. The team should work with MoJ security and business continuity teams to exercise test plans and ensure that data can be recovered, and should carry out a risk assessment and track it in the risk register.
Next Steps
In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are rated red at this assessment.