Apply for a Grant beta assessment

Service Standard assessment report

Apply for a Grant

From: Central Digital & Data Office (CDDO)
Assessment date: 21/02/2023
Stage: Beta assessment
Result: Not met
Service provider: Cabinet Office

Service description

There has been no single place to find and apply for government grants. The process is complicated, time-consuming and costly for applicants and exposes the government to significant risk of fraud, error and duplication of funding. Apply for a Grant will be the accompanying service to the recently launched Find a Grant. It will provide grant seekers with a more streamlined application process and grant administrators with the tools required to build digital grant application forms and take advantage of the Spotlight due diligence tool for compliance.

Service users

This service is for:

  • Grant applicants who spend funding to deliver outcomes: individuals, the Voluntary, Community and Social Enterprise (VCSE) sector, small businesses and large businesses

  • Grant administrators whose public body organisations offer funding to meet policy objectives

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have segmented their users, both applicants and administrators, in a way that reflects the problems those people might have using the service. In doing this, they have taken into account access needs, digital skills, understanding of the grant application process and organisational capability.

  • a variety of methods have been used to understand how the service is being used: interviews, user feedback and usability testing

  • the team is aware of user groups that the service doesn’t yet cater for, for example where organisations come together to form consortia to apply for grants. Meeting their needs is on the roadmap.

  • the team has continued to facilitate the cross-government community of grants administrators. This is proving highly valuable as it provides a small team with a mechanism to learn at scale.

What the team needs to explore

Before the next assessment, the team needs to:

  • speak to more applicants who have completed the whole journey. The team has only had time to understand the complete experience of one applicant, for understandable reasons. While feedback has been used to understand points in the journey, it is important that the team gets a view of the user experience from end to end. The team plans to do this research; it is important that the further roll-out of the service allows for this work to be done.

  • explore how users might be encouraged to test the end-to-end journey once a form is built. As the service is essentially a form builder, it is critical that administrators are able to test their finished forms with potential users as part of a wider journey.

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have interpreted the whole problem to be the complexity of the current system, which assumed that applicants understood the structure of government.
  • the team recognised that the previous system was not fair to people and organisations that want to apply for grant funding (it was biased towards larger organisations). The new service provides better visibility of the organisations that government wants to fund, with a new focus on applicants.
  • there is an ‘advert builder’ available within the service. This has replaced a manual process where grant application form URLs needed to be linked from the grant advert within the ‘Find a grant’ service. Grant administrators can now create application forms and adverts from a single place, joining up both the ‘Find a grant’ and ‘Apply for a grant’ services into a single ‘Manage a grant’ dashboard.
  • the service makes due diligence easier, passing grants application data to the ‘Spotlight’ system and assisting grant administrators by automating their processes.

What the team needs to explore

Before the next assessment, the team needs to:

  • recognise that only certain parts of the end-to-end journey are in scope of this service; most grant applications still go to the as-is grants systems in departments.

  • note that, although it is joined up with the ‘Find’ service, the ‘Apply’ stage does not yet deliver a user-centred outcome (the receipt of funding).

  • scale up usage of the service in order to gather the data they need to prove its success and strengthen the case for bringing those latter parts of the end-to-end service into scope.

  • create a service blueprint of the end-to-end service, as the panel recommends. This will help to visualise and communicate the activity that happens at, and across the boundaries of, the ‘Apply’ service and other connected services.

3. Provide a joined-up experience across all channels

Decision

The service partially met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • outside of the product, there are communication campaigns directed at the grant administrator community.

  • regular sessions are delivered by ‘grants champions’ who push information out into departments. The team has also run a number of Q&A sessions that grant administrators can attend.

  • the Grants Centre of Excellence site is the starting point for grant administrators. Guidance can also be found on the Cabinet Office YouTube channel.

  • grant administrators can use their grant application form URLs to market their grants on other platforms and campaigns.

  • each grant application has a link directly to the correct support team in departments.

What the team needs to explore

Before the next assessment, the team needs to:

  • address the fact that, in order to access the service, grant administrators need to email the service team to request account set-up; they are unable to self-serve. The team should communicate their plans and progress for onboarding new users at the next assessment.

  • gather and present more data around onboarding grant administrators to the service and the impact on them, both from an account set-up and a system migration perspective (from their existing grants system to the new service).

  • make full use of the Grants Centre of Excellence and deliver on their plans to add more structured content and guidance for accessing the service.

  • consider how support offered by departmental grants teams is assured. The team does not have full control over how a departmental grants team offers support to its users (including via which channels), and this presents a risk to meeting the needs of grant applicants.

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified a risk related to ensuring the eligibility criteria for each grant are clear to users. For grant administrator users there is guidance within the service about how to structure eligibility criteria. The service also provides visibility of other grant teams’ eligibility criteria.

  • the interaction design of the form builder has been iterated and the latest versions work better for grant administrators. The team have kept to standard design patterns, noting that users felt it was a reliable service they could trust.

  • where they have diverged from standard patterns, they have contributed to the Design System communities, asked questions on Slack, been upskilled by other community members and put problems to the wider community, asking how they might solve them.

  • the team have been reusing frontend code from government repositories and making effective use of the GOV.UK Prototype Kit.

What the team needs to explore

Before the next assessment, the team needs to:

  • add a preview of forms created in the form builder; the panel is confident that this is an established pattern within form builders across industries. The service team also recognises this as a gap, noting that grant administrators rely on the URL generated by the service as a workaround. A minimal sketch of one possible preview approach follows this list.

  • decide whether internal users need the same frontend experience as GOV.UK. The team noted that they might not, and found that some grant administrators were struggling to distinguish between the frontend (application) and dashboard (form builder) sides of the service. The team needs to gather data and evidence to support a decision either way.

  • develop a content strategy to ensure content is clear and not duplicative, as the panel recommends. Content will become more prominent and prevalent as the service matures, and the team acknowledges that they will need to build a picture of the content landscape across the service and across GOV.UK. The central grants team currently has to run spot checks for duplication with GOV.UK.
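
To illustrate the preview gap, here is a minimal sketch of how a stored form definition could be rendered as a read-only preview. This is a hypothetical illustration, not the team’s implementation: the Question schema, its field names and the rendering approach are all assumptions.

```python
# Hypothetical sketch: render a stored form definition as static HTML so
# grant administrators can check content and ordering before sharing the
# form URL. The schema below is illustrative, not the service's data model.
from dataclasses import dataclass
from html import escape

@dataclass
class Question:
    title: str       # question text shown to the applicant
    hint: str        # supporting hint text ("" if none)
    field_type: str  # e.g. "text" or "textarea"

def render_preview(form_title: str, questions: list[Question]) -> str:
    """Build a read-only HTML preview of a form definition."""
    parts = [f"<h1>{escape(form_title)} (preview)</h1>"]
    for q in questions:
        parts.append(f"<h2>{escape(q.title)}</h2>")
        if q.hint:
            parts.append(f'<p class="hint">{escape(q.hint)}</p>')
        # Inputs are disabled: the preview is for checking the form,
        # not for submitting answers.
        if q.field_type == "textarea":
            parts.append("<textarea disabled></textarea>")
        else:
            parts.append('<input type="text" disabled>')
    return "\n".join(parts)

if __name__ == "__main__":
    print(render_preview("Example Grant", [
        Question("What is your organisation's name?", "As registered", "text"),
        Question("Describe your project", "Maximum 500 words", "textarea"),
    ]))
```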

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are making the service usable for less experienced organisations, expecting that it will therefore also work for more experienced organisations.

  • the team have spoken with users with limited digital expertise and users with accessibility considerations.

  • the team have used the universal barriers framework, mapping what they know about their users to the various facets of the framework, as well as using them as prompts to expand areas of research and exploration.

  • the team have conducted accessibility audits in partnership with the Digital Accessibility Centre and the service is compliant with WCAG 2.1 AA. Where they have learnt about opportunities to improve the accessibility of parts of the service not within their control (e.g. COLA), they have shared those findings with the relevant service owners.

What the team needs to explore

Before the next assessment, the team needs to:

  • work with grant administrators across government to ensure that they collectively publish high-quality information about how users can get support with their application (applicants), technical and design support with the service (grant administrators), and support in situations where the service may be unavailable (everyone).

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have most of the expected roles in place for this stage in development and have plans to continue this with the new supplier.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that they continue to review their processes for handover to a new supplier and have mitigations in place for the risks of this approach.

  • create permanent roles for civil servants to work on this service and gradually reduce their reliance on contractors for this work. If this is to become a core, mandated service for other government departments to use, then the team should be resourced more sustainably and offer greater value for money.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were working together using a range of agile ways of working that helped them deliver the service.

  • the team are able to flex their chosen ways of working to better support their work and are not tied to any one set of methods.

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to iterate their ways of working to meet the needs of the team throughout the public beta phase and ensure they are able to scale the team in response to project demands and growth.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have iterated the service to reflect findings from user research that has taken place during the private beta.

  • the team were able to show how user research feeds into their improvements of the service and give examples of where this has taken place.

What the team needs to explore

Before the next assessment, the team needs to:

  • further iterate the service based on the data gathered through their analytics, as recommended under point 10 below.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has had early engagement with their department’s Technical Design Authority, which helped assess the service as low risk.

  • the service team has a solid process around code reviews, with deployments to a staging environment that are checked by two developers, automatic static analysis offered by the Terraform environment, and vulnerability scanning.

  • the Senior Responsible Owner (SRO) is thoroughly involved in the process.

What the team needs to explore

Before the next assessment, the team needs to:

  • keep project governance, in particular the SRO and Senior Information Risk Owner, aligned to the service, making sure there is broad engagement and assurance of security best practices.

  • ensure that all aspects of the service remain compliant with GDPR.

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have considered their theory of change and how their goals might be measured.
  • the team are gathering data, with consent from users, to measure the user journey.
  • the team have considered how they will measure the mandatory KPIs.
  • the team have identified some key measures related to the applicant journey.

What the team needs to explore

Before the next assessment, the team needs to:

  • consider how the suggested metrics support the different users and their needs in reaching the overarching goal, clearly distinguishing between grant administrators and grant applicants (carried over from alpha).

  • use the data collected to measure iterations made to the site and show how they have made measurable improvements to both applicant and administrator journeys.
  • demonstrate how they are creating greater links with the overall service metrics, including the ability to find a grant and make a decision (carried over from alpha).
  • review or consider their roadmap for understanding metrics across the whole service journey, for example the number of grants awarded (carried over from alpha).

  • understand the impact that this service will have on downstream users and consider them in the development.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service was designed using a simple but solid architecture, with choices that clearly follow the Technology Code of Practice.

  • the service team has taken reasonable decisions about external components and their replaceability as the service evolves.

  • the use of cloud infrastructure, including Lambdas and REST APIs, is sound and aligned to current best practice (an illustrative sketch of this pattern follows this list).

  • the team are aware of the potential duplication of technology with GOV.UK Forms. There are specific requirements within the grants service that support the need for an independent solution.
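
For illustration, the general pattern the panel refers to, a Lambda function serving a REST endpoint through API Gateway, looks roughly like the sketch below. This is a generic illustration, not the team’s code: the grantId path parameter and the response fields are hypothetical.

```python
import json

def handler(event, context):
    """Illustrative AWS Lambda handler behind an API Gateway REST API.

    A generic sketch of the pattern only: the 'grantId' path parameter
    and response fields are hypothetical, not the service's actual API.
    """
    grant_id = (event.get("pathParameters") or {}).get("grantId")
    if not grant_id:
        # API Gateway's proxy integration expects this response shape.
        return {
            "statusCode": 400,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "grantId is required"}),
        }
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"grantId": grant_id, "status": "open"}),
    }
```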

What the team needs to explore

Before the next assessment, the team needs to:

  • keep exploring whether the COLA service for authentication should be replaced by other solutions, such as GOV.UK One Login.

  • engage with the SRO and departmental leadership on evolving the programme into a business-as-usual service, especially given that year-on-year funding is problematic for sustainability.

  • progress their dialogue with GDS and continue to collaborate to the benefit of both programmes.

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team works in the open, with the live code kept in GitHub repositories.

  • the service repositories are released under the MIT licence.

What the team needs to explore

Before the next assessment, the team needs to:

  • keep releasing source code under an open licence.

  • keep making sure that the repositories host code that is in use.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service uses appropriate common components, such as the GOV.UK Design System and GOV.UK Notify (a generic example of calling Notify follows this list).

  • experts in data standards like 360Giving were consulted as part of the journey.
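
For reference, sending an email through GOV.UK Notify with the official Python client (notifications-python-client) looks like the sketch below. The report confirms the team uses Notify, but not how they call it, so the API key, template ID and personalisation fields here are placeholders.

```python
# Generic GOV.UK Notify example using the official Python client.
# The API key, template ID and personalisation values are placeholders,
# not values from the Apply for a Grant service.
from notifications_python_client.notifications import NotificationsAPIClient

client = NotificationsAPIClient("your-notify-api-key")

response = client.send_email_notification(
    email_address="applicant@example.com",
    template_id="f33517ff-2a88-4f6e-b855-c550268ce08a",  # placeholder template id
    personalisation={
        "applicant_name": "Example Applicant",
        "grant_name": "Example Grant",
    },
    reference="apply-for-a-grant-example",  # optional client reference
)
print(response["id"])  # Notify returns the new notification's id
```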

What the team needs to explore

Before the next assessment, the team needs to:

  • explore whether the authentication framework COLA could be replaced with GOV.UK One Login.

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has designed the service on solid cloud infrastructure with scalable capability (auto-scaling).

  • the thinking around business continuity planning is thorough.

What the team needs to explore

Before the next assessment, the team needs to:

  • think about the service in the longer term, exploring how to make it business as usual.

  • explore whether the current arrangement, in which the incumbent supplier provides support, should be replaced by a decoupled development and support arrangement, if that suits the department’s preference and strategy.

Next Steps

Reassessment

In order for the service to continue to the next phase of development, it must meet the Service Standard and receive CDDO spend approvals. The service must therefore be reassessed against the points of the Standard that are not met at this assessment.

While the panel were impressed with much of the work done by the team, there were wider concerns about whether the private beta period had allowed the team to test with a sufficient number of users and be sure that the service was solving a whole problem and working for all users.

We would like the team to use this extended period of private beta development to further test the service with the addition of further grants, using these to test some of the areas of concern highlighted in the assessment panel feedback above.

These learnings can then be reviewed at reassessment, along with the work done to meet point 10 of the Standard.

Published 28 November 2023