GOV.UK Forms beta assessment report

Service Standard assessment report

GOV.UK Forms

From: Cabinet Office
Assessment date: 22/04/2024
Stage: Beta
Result: Amber

Service description

GOV.UK Forms is a form building platform that allows users in central government departments to create HTML-based forms with little to no digital skills required. The platform has been created to support digitising existing document-based forms on GOV.UK and to provide a useful tool for publishing new digital forms. It’s designed to improve the standard, stylistic consistency and accessibility of all forms published on GOV.UK, so that they are accessible to all users.

Service users

This service is for:

  • form creators (primary users)

  • end-to-end owners, creators and processors

  • operations or policy professionals

  • UCD professionals

  • technical professionals and teams

  • form stakeholders (secondary users)

  • form processors

  • form reviewers

  • form publishers

  • product governors

  • form fillers (end users)

On behalf of:

  • themselves

  • others

  • an organisation

Things the service team has done well:

  • the panel was impressed by how the team worked together and made decisions effectively to deliver a user-centred service.
  • the team clearly demonstrated the importance of user research. It was great that the user researcher and team members were able to clearly outline the research undertaken and respond to relevant questions.
  • following the assessment conversation and review of the slide deck, it was good to understand that a range of usability testing had taken place. There had been 48 rounds of testing, including end-to-end observations and sessions using the prototype kit, Figma, production code in a user research environment and the live product.
  • testing with users who have access needs is key to the development of all services. The team ensured that internal and external (Digital Accessibility Centre) accessibility audits took place, and fixed the bugs identified in line with WCAG 2.2 AA.
  • the team has a comprehensive performance framework and has started creating reports to share with form builders to allow them to understand the performance of their forms.
  • the team had some good ideas on what additional metrics were needed in private beta to monitor the form builder and the forms being built.
  • used the task list design pattern effectively and ingeniously to guide form builders through the end-to-end considerations of what a form should contain.
  • implemented design system components, and thoughtfully iterated how form builders encounter these components in the platform. This enables form builders to structure questions effectively, while form fillers benefit from autofill efficiency.
  • effectively communicated improvements to the platform, such as the recent reference number and payment facilities, to current users using notification emails.
  • the team described a strong security culture, building security into the development process and team planning.
  • the team has clearly defined how responsibility for security and data privacy is split between the platform and its users, and communicated that effectively in the MOU, particularly around sending submitted data by email.
  • their technical choices enable the team to iterate the platform very frequently, particularly by using continuous deployment and investing in tooling to automate manual and error-prone tasks. Their commitment to using “boring” tech and keeping things simple is also valuable in this context.
  • the platform is well-integrated with common components, using GOV.UK Notify, GOV.UK Pay and the GOV.UK Design System.
  • the team is well-prepared for operating a reliable service in public beta. They have tested their disaster recovery processes several times, hold productive incident reviews and game days and iterate based on what they learn from them, have built confidence in the platform’s technical ability to operate at scale through performance testing, and have a well-defined sustainable plan for out of hours support.

1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

During the assessment, we didn’t see evidence of:

  • the research plan. During the demo a plan was mentioned. The user researcher confirmed their plan focuses on two areas: 1) features on the roadmap and 2) adoption of the product.

Recommendation: the panel would like to see evidence within 3 months that the research plan exists as a separate artefact, presented in a way that summarises the intended approach and aligns with the intended roadmap.

2. Solve a whole problem for users

Decision

The service was rated amber for point 2 of the Standard.

During the assessment, we didn’t see evidence of:

  • clarity in responsibilities. The service team has explained why assurance is passed on to departments, and the lack of clearly defined central standards for forms on form building platforms makes this inevitable. But form builders will not always have a good idea of when to ask for help, or understand the impact of bad design. While the team has been able to engage with users in private beta, this will be harder under the incoming self-service model.

Recommendation: even though the team must ultimately pass responsibility on to the departments using the platform, they could give a more definitive steer. They should cover, within the MOU and potentially within the platform itself, the limits of the responsibilities the platform accepts and the downsides of poor user experience, bad data and unnecessary contact. The panel would like to see evidence of this within 1 month.

  • advocating consistent and collaborative assurance processes, detailed guidance and support materials. These will all help non-designers create good user experiences and return more useful data. While the service team reports that designers in departments have been significantly involved during private beta, it will become increasingly difficult for them to devote time as the platform becomes more widely used. Overall, it may be optimistic to hope that departments, in establishing their own processes, will always be able to source designers. Regardless of their process, the panel is concerned that stakeholders without a design background may also intentionally bypass the designers who could help, due to time pressure to make forms live or to make quick changes to already published forms.

Recommendations: continue to implement clearly defined guardrails. This is particularly important for ensuring forms are designed to be usable, in plain English and accessible (in terms of both design and content). The platform could build on the call to engage with digital teams by advocating a collaborative, or even just a 2i, process; sharing and iterating the ‘Creating good forms’ draft guidance; and considering providing other materials as a starter pack to help form builders, such as the mentioned question protocols and approaches to planning forms. The panel would like to see evidence of this within 2 months.

  • the ability to manage change and version forms, which will likely become more important as user numbers rapidly increase. This should be considered an important part of the self-service assurance process; documenting and advocating best practice, alongside any technical solutions, will be important for helping non-designers ensure forms do not balloon in size and scope.

Recommendation: build on the thinking that has already gone into archiving, deleting and returning forms to draft state, to look into how users can manage changes to their forms once live and what the implications are for users who are mid-journey, and advocate awareness of this. The panel would like to see evidence of this within 6 months.

  • the platform solving a whole problem for users. During the assessment the team acknowledged that the functionality of GOV.UK Forms is still quite limited at this stage. The team outlined its Now/Next/Later plan, including the intention to work on additional submission types, adding extra users, and save and return functionality.

Recommendation: if not already in place, introduce a means to capture and articulate how form builders feel about whether the forms they create are constrained by the platform’s limited functionality, perhaps using exit survey questions. The panel would like to see evidence of this within 3 months.

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

During the assessment, we didn’t see evidence of:

  • user data being passed through the service in a structured way. Even if the correct security is in place and user data is protected, email submission is not ideal as the only submission option (for example, there is potential for error when caseworkers rekey information), nor does it encourage onward use of data in government systems. It would also likely significantly limit adoption by larger departments like HMRC and DWP.

Recommendation: immediately follow up on the commitment to explore different submission types; a sketch of the benefit follows. The panel would like to see evidence of this within 3 months.
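
To illustrate the benefit of structured submissions (a hypothetical sketch only: GOV.UK Forms is not built in Python, and the field names and destination below are not part of the platform), answers could be delivered as keyed, typed JSON that a case-working system can validate and ingest without rekeying:

```python
import json

# Hypothetical structured submission: every answer is keyed and typed,
# so a receiving system can ingest it automatically instead of a
# caseworker rekeying values from an email body.
submission = {
    "form_id": "apply-for-a-licence",            # hypothetical form slug
    "submission_reference": "GOVUK-FORMS-1234",  # hypothetical reference
    "submitted_at": "2024-04-22T10:15:00Z",
    "answers": {
        "full_name": "Jane Smith",
        "date_of_birth": "1990-01-31",
        "previously_held_licence": False,
    },
}

# Delivered over HTTPS to a department API, or written to a queue,
# rather than sent as free text by email.
print(json.dumps(submission, indent=2))
```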

  • guaranteed quality of the data that caseworkers and others will receive from form outputs. Validating data at the point of capture is one of the key benefits of a digital journey over a paper form, and the platform could do more to maximise that benefit.

Recommendation: build on the existing validation within the form to include more nuanced validation, for example comparing dates, or bespoke error messages, which would also improve the content accessibility of the platform; a sketch follows. The panel would like to see evidence of this within 6 months.
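
As an illustration of the kind of nuanced check meant here (a sketch only; GOV.UK Forms is not written in Python and these names are hypothetical), a date-comparison rule could return bespoke, plainly worded error messages rather than a generic one:

```python
from datetime import date

def validate_date_range(start: date, end: date) -> list[str]:
    """Return bespoke error messages for a start/end date question pair."""
    errors = []
    if end < start:
        # Names the fields and says how to fix the problem, in the style
        # of GOV.UK Design System error messages.
        errors.append("The end date must be the same as or after the start date")
    if start > date.today():
        errors.append("The start date must be today or in the past")
    return errors

print(validate_date_range(date(2024, 4, 22), date(2024, 4, 1)))
# ['The end date must be the same as or after the start date']
```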

4. Make the service simple to use

Decision

The service was rated amber for point 4 of the Standard.

During the assessment, we didn’t see evidence of:

  • end users being able to save or download their answers, or being able to save their progress and return.

Recommendation: devote time now to improving these features, which will support form builders to make more involved forms, and will help the experience of longer journeys with more complex requirements, where users need to find information outside of the form. The panel would like to see evidence of this within 6 months.

  • the ability to create routing or logic beyond a basic yes/no radio answer showing a subsequent question. Richer routing would allow more complex and tailored journeys, doing the hard work on behalf of users and asking the most appropriate questions at the right times.

Recommendation: investigate what is possible to allow form builders control over more complex routing within the confines of the form builder interface; one possible shape for this is sketched below. The panel would like to see evidence of this within 6 months.
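
As one possible shape for this (a hypothetical sketch, not the platform's actual data model), routing could be expressed as declarative rules that a form builder edits through the interface and the form runner evaluates against the answers given so far:

```python
# Hypothetical declarative routing rules: "if this question was answered
# with this value, go to that page next"; otherwise follow linear order.
ROUTING_RULES = [
    {"question": "do_you_live_in_the_uk", "equals": "No", "goto": "ineligible"},
    {"question": "are_you_over_18", "equals": "No", "goto": "guardian_details"},
]

def next_page(question: str, answer: str, default_next: str) -> str:
    """Return the next page slug, falling back to the linear order."""
    for rule in ROUTING_RULES:
        if rule["question"] == question and rule["equals"] == answer:
            return rule["goto"]
    return default_next

print(next_page("do_you_live_in_the_uk", "No", "page_2"))   # ineligible
print(next_page("do_you_live_in_the_uk", "Yes", "page_2"))  # page_2
```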

  • exploring the ability to reuse data the user has already entered into the form. This would allow form builders to design cleaner forms and avoid repetition for users (for example, selecting a previously entered address, to avoid rekeying errors and confusion). The panel would like to see evidence of how this has been explored within 6 months.

5. Make sure everyone can use the service

Decision

The service was rated green for point 5 of the Standard.

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

9. Create a secure service which protects users’ privacy

Decision

The service was rated amber for point 9 of the Standard.

During the assessment, we didn’t see evidence of:

  • the platform’s expectations of its users around email security keeping pace with evolving government standards. This is particularly important here because submitted data is sent by email. The panel recommends reviewing the MOU regularly to keep it current with government security standards: for example, replacing the link to the withdrawn Minimum Cyber Security Standard with the current Cyber Security Standard, and including MTA-STS and TLS-RPT (both illustrated below) in the list in paragraph 24, as they were added to the “must do” list in Securing government email in March 2024. The MOU should be updated, and a plan put in place for regular updates after that, within 3 months.
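
For reference, and as an illustrative sketch only (the domain and mailbox below are placeholders, not taken from the MOU), both standards are enabled through DNS TXT records, with MTA-STS additionally publishing a small policy file over HTTPS:

```
; Hypothetical records for a department's email domain
_mta-sts.department.gov.uk.   IN TXT "v=STSv1; id=20240401000000Z"
_smtp._tls.department.gov.uk. IN TXT "v=TLSRPTv1; rua=mailto:tls-reports@department.gov.uk"

; MTA-STS also requires a policy file served at
; https://mta-sts.department.gov.uk/.well-known/mta-sts.txt, for example:
;   version: STSv1
;   mode: enforce
;   mx: mail.department.gov.uk
;   max_age: 86400
```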

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

12. Make new source code open

Decision

The service was rated amber for point 12 of the Standard.

During the assessment, we didn’t see evidence of:

  • good reasons for the team’s infrastructure configuration code not to be open. The team has started exploring what they would need to do before making this repository open, and all their application code is already open, along with their architectural decision records and some product decisions. The team should make this repository open within 6 months.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated amber for point 13 of the Standard.

During the assessment, we didn’t see evidence of:

  • the structure of individual live forms on the platform being published in an open, machine-readable format under an Open Government Licence, as GOV.UK does through its public content API. The existence of this platform is a unique opportunity to enable analysis by people inside and outside government across a potentially large number of forms, such as exploring how many forms ask for the same information, how often and in what ways a particular design system component is used, and comparing the wording of similar questions at scale, as well as to support transparency. The team should explore whether there is scope for collaboration on the (currently dormant) open standards challenge for forms. The team should start publishing this data within 6 months; a sketch of what it could look like follows.
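
As a sketch of what such publication could look like (the schema below is hypothetical, not an existing GOV.UK Forms format), each live form's structure could be exposed as JSON under the Open Government Licence:

```python
import json

# Hypothetical machine-readable description of one live form's structure.
# Publishing this for every live form would enable the analyses described
# above, such as counting component usage and comparing question wording.
form_structure = {
    "form_id": "apply-for-a-licence",  # hypothetical slug
    "live_at": "2024-04-22T10:15:00Z",
    "pages": [
        {"position": 1,
         "question_text": "What is your full name?",
         "answer_type": "full_name"},      # design system component used
        {"position": 2,
         "question_text": "Do you live in the UK?",
         "answer_type": "yes_no_radio"},
    ],
}

print(json.dumps(form_structure, indent=2))
```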

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.

Next steps

This service can now move into a public beta phase, subject to addressing the amber points within 12 weeks and CDDO spend approval.

This service now has permission to launch on a GOV.UK service domain with a beta banner. These instructions explain how to set up your *.service.gov.uk domain.

The service must pass a live assessment before:

  • turning off a legacy service

  • reducing the team’s resources to a ‘business as usual’ level, or

  • removing the ‘beta’ banner from the service

Updates to this page

Published 3 December 2024