Apply for local growth funding

The report for MHCLG's Apply for local growth funding service alpha assessment on 03 June 2021

Service Standard assessment report

Apply for local growth funding

From: Central Digital & Data Office (CDDO)
Assessment date: 03/06/2021
Stage: Alpha
Result: Met
Service provider: MHCLG

Service description

Enabling teams to deliver funding programmes that work for all users, through a set of user centred tools and standards that help teams to:

  • create a consistent, easy to use experience for all

  • adapt to the different needs of existing and emerging funding programmes

  • generate quality and accurate data on funded programmes, for use by the department and local institutions

  • reduce the admin burden of those designing, managing, and applying for funds, both internally and externally to MHCLG

Our services aim to enable the funding of impactful projects that have positive outcomes for local places and provide the insight needed for the continual improvement of policy. Our vision within the application part of the service, which is the scope we will be presenting at Alpha, is to enable a simplified and consistent way of applying for funding that:

  • creates a consistent, easy to use experience for all

  • can adapt to the different needs of emerging and existing funding programmes

  • reduces the admin burden of people applying for funding

  • reduces the admin burden of people supporting those applying for funding

  • reduces the admin burden of people assessing applications for funding

Service users

This service is for:

  • funding applicants/recipients

  • internal staff of MHCLG who work on funding – for purposes such as fund set-up, appraisal of applications, award, monitoring, report production

  • local authority funding programme and project managers

  • programme managers and project coordinators from Local Enterprise Partnerships

  • cross-government auditors of funding

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified and understands its range of users, both primary and internal
  • the team has developed personas for their user types, highlighting roles, pain points and digital proficiency
  • the team has identified a range of user needs for both primary and internal users, which in turn has become the basis of design decisions
  • the team has tested with a good number of users with a range of roles, experiences and needs
  • the team has tested with a few users with accessibility needs and has a plan to do more comprehensive accessibility testing in Beta
  • the team has a comprehensive plan to test with users that they have researched minimally or not at all yet, for example policy designers, reviewers, devolved nations, Welsh speakers, and users needing help and support

What the team needs to explore

Before their next assessment, the team needs to:

  • look at doing more research with the persona Liam, in particular users who have low digital proficiency and low confidence in the task at hand. The team is aware of this and stated their desire to test with more ‘Liams’; it will be good to see the results of this at the Beta assessment
  • demonstrate evidence that the team’s research plans for Beta have taken place, for example testing with policy designers and reviewers, accessibility testing, and help and support testing. While applicants were the priority user group in Alpha, a Beta assessment will want to see that other user groups have also been researched

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has found a shared user base, shared tools and consistent needs and is looking to join up the user experience of finding and applying for multiple grants
  • the team has considered a larger strategy for transforming funding while also chipping away at manageable chunks, in this case, the application phase
  • the team carefully considered and documented their assumptions, including impact confidence ratings, and started with their riskiest
  • their pain points were clear and actionable
  • the team has reached out to other areas of government that work with grants or funding for knowledge sharing

What the team needs to explore

Before their next assessment, the team needs to:

  • consider user journey mapping to demonstrate the parts outside of the service, such as how users find out about the service, as well as any offline journey
  • design and research the help and support avenues; the team is aware of this and has a plan to address it in Beta
  • explore what the place-based approach, which the team is looking at alongside the main journey, will mean and how it will work with the service
  • continue to look at how to reduce users entering information multiple times; the team has begun this work

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team is enthusiastic and empowered to find solutions with their knowledge and close working relationship with policy
  • the team is looking at consistency at the policy level and ensuring the wording, as well as entry requirements when possible, are consistent for the user
  • the future funding model is based on ongoing learning, so that users, both internal and external, can improve the end-to-end funding process

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the name has been tested with users and that they know what language users will use when looking for the service
  • give more consideration to the journey for organisations who have collaborators and the possibility of multiple logins
  • consider how the service might interface with GOV.UK as part of the user’s journey
  • do further research with the area leads and others who support the applicants, the application appraisers and any other equivalents to front line staff

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • content professionals have not only been working on the service but have also done content audits to improve policy and supporting guidance
  • the team considered the different needs of users based on their experience, whether they already know the fund or want to explore different funds
  • the team has conducted card sorting research on what applicants would expect to find in the application

What the team needs to explore

Before their next assessment, the team needs to:

  • test with Welsh users, in the Welsh language
  • ensure they use GOV.UK Design System patterns (https://design-system.service.gov.uk/patterns/)
  • research, where possible, with other funds that have applicant pools that might be less digitally mature (for example, council and university workers might be more digitally capable than those at other institutions)
  • continue to test and iterate the service, and the tools and microservices within it, with a variety of users
  • test the wording on the start page to ensure that users know what is needed before they enter the service, for example whether “organisation’s details” is sufficiently clear

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team considered the levels of digital maturity of the organisational users applying for funds and that a user can join another group of users as they learn more
  • the team has considered the digital maturity of organisations, such as local government, as well
  • they have already begun testing with a small number of users with accessibility and assisted digital needs

What the team needs to explore

Before their next assessment, the team needs to:

  • make a firm recruitment plan to reach more users with accessibility and assisted digital needs, to ensure those users are not left until the end
  • consider research with new users such as with people who are aware of the funds but have not yet applied to them, are new to funding, or who have previously started applications but have not progressed
  • ensure that the non-digital version is considered and that those users are not disadvantaged in the application process and receive proper support

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has the right balance of skills, expertise and support for both strategic and tactical improvement work to improve the funding application journey - including associated departmental processes and structures
  • the team has built good working relationships with key stakeholders - with strong support and commitment to their approach from the relevant operational and policy areas
  • the team has thought through how demands on them will evolve through Beta, looking to establish long-lived teams (partly embedded in the funding teams themselves) that can ensure learning is preserved - rather than being lost through regular retendering of suppliers

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they have a dedicated team of developers for Beta, so that they have the ability to test different technologies using a realistic, user-centred approach
  • secure commitment from the department to establishing a long-lived team throughout and beyond Beta

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using agile ways of working to routinely inspect and adapt how they work as they learn
  • the team is committed to maintaining high degrees of engagement with operational and policy stakeholders through dedicated (and well-attended) Show & Tells

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure their ways of working and team structure scales and improves adequately as the team grows and starts to deliver an operational service

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is aiming to improve the quality and consistency of content across the funding processes, with content designers auditing previous funding applications
  • the team have matured their understanding of user needs, user groups, pain points and hypotheses sufficiently through Discovery & Alpha - providing a foundation for rapid validation of these in Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • develop a proof of concept using the described technical stack, then start experimenting, checking technical feasibility and making sure the technical solution is user-driven
  • continue testing, iterating and demonstrating how learning from users has led to improvements

9. Create a secure service that protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team considered a federated authentication system based on OAuth2, in tandem with user permissions and access (see the sketch after this list)
  • the team identified potential threats and has a plan for continuous monitoring and regular evaluation
  • data classifications and associated risks were identified and mitigation strategies have been considered
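
As a rough illustration only, since the assessment does not specify the team’s implementation, the sketch below shows a federated OAuth2/OpenID Connect login flow using Flask and Authlib. The identity provider URL, client credentials, scopes and role claim are all hypothetical assumptions, not the service’s actual design.

```python
# Minimal sketch: federated OAuth2 / OpenID Connect login with Flask + Authlib.
# Provider URL, client credentials and claims below are illustrative placeholders.
from flask import Flask, url_for, session, redirect
from authlib.integrations.flask_client import OAuth

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"

oauth = OAuth(app)
oauth.register(
    name="funding_idp",  # hypothetical identity provider
    client_id="example-client-id",
    client_secret="example-client-secret",
    server_metadata_url="https://idp.example.gov.uk/.well-known/openid-configuration",
    client_kwargs={"scope": "openid email profile"},
)

@app.route("/login")
def login():
    # Redirect the user to the external identity provider to authenticate
    return oauth.funding_idp.authorize_redirect(url_for("callback", _external=True))

@app.route("/callback")
def callback():
    # Exchange the authorisation code for tokens and read the user's claims
    token = oauth.funding_idp.authorize_access_token()
    user = token.get("userinfo") or {}
    # Claims would be mapped to application permissions and access levels here
    session["user"] = {"email": user.get("email"), "roles": user.get("roles", [])}
    return redirect("/")
```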

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to explore user collaboration, both on the applicant side and on the caseworker side
  • explore how an MHCLG account will work in a centralised, government-wide digital identity framework

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a meaningful, clearly defined view of what success looks like - and has expressed this both in terms of their key hypotheses about users, and the specific metrics they would use to measure success

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure their measures of success and KPIs are feasible. Whilst cost per application is a key measure of benefit for applicant organisations, the cost of supplying this information to the team likely outweighs the benefits to those organisations. The team should explore whether there are more feasible alternatives or proxies for these measures

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is considering an Event Hub between the Formulation and Application Systems for scalable, real-time data ingestion, reflecting the real-world situation in the architecture (see the sketch after this list)
  • the common technical stack choice (Python/Postgres and NodeJS) will enable a wide range of developers to get involved with the project, making it easier to maintain in the long term
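
To illustrate the event-based integration mentioned above, the sketch below shows how the Formulation System might publish an event to an Azure Event Hub for the Application System to consume. The connection string, hub name and payload shape are placeholder assumptions; the report does not describe the actual event schema.

```python
# Minimal sketch: publishing a Formulation System event to an Azure Event Hub
# using the azure-eventhub SDK. Hub name and payload are illustrative only.
import json
from azure.eventhub import EventHubProducerClient, EventData

def publish_fund_formulated(connection_string: str, payload: dict) -> None:
    # Create a producer for a hypothetical "fund-formulation" hub
    producer = EventHubProducerClient.from_connection_string(
        conn_str=connection_string,
        eventhub_name="fund-formulation",
    )
    try:
        # Events are sent in batches; a consumer on the Application System side
        # would read them from its own consumer group for real-time ingestion
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(payload)))
        producer.send_batch(batch)
    finally:
        producer.close()

# Example usage with a hypothetical payload:
# publish_fund_formulated(connection_string, {"fund_id": "example-fund", "status": "formulated"})
```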

What the team needs to explore

Before their next assessment, the team needs to:

  • explore and/or apply more than one solution to a problem so the team can compare and learn from different approaches; it is essential to have a dedicated team of developers to be able to test these solutions with users
  • if the team decides to use ReactJS (or an equivalent JavaScript framework) for the Formulation System, make sure a resilient and accessible approach is taken, using server-side rendering

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the initial proof of concept has been open-sourced and documented
  • the team confirmed plans to work in the open and make new code publicly available

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is considering common platforms for Beta, including GOV.UK Notify and GOV.UK PaaS (see the sketch after this list)
  • the team used common UI patterns and components in their Sketch prototype
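
For context on what using a common platform such as GOV.UK Notify might look like, the sketch below sends an application confirmation email with the official notifications-python-client. The API key, template ID and personalisation fields are placeholder assumptions, not part of the team’s documented design.

```python
# Minimal sketch: sending a confirmation email via GOV.UK Notify with
# notifications-python-client. Template ID and fields are placeholders.
from notifications_python_client.notifications import NotificationsAPIClient

def send_application_received_email(api_key: str, applicant_email: str, reference: str) -> None:
    client = NotificationsAPIClient(api_key)
    client.send_email_notification(
        email_address=applicant_email,
        template_id="00000000-0000-0000-0000-000000000000",  # placeholder Notify template
        personalisation={"application_reference": reference},
    )
```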

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has already assessed the impact of outages to the service, including early threat detection and recovery targets
  • the team has considered necessary contingencies in the event of a long service outage

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the user journey in the event of a long service outage (e.g. use of any legacy application services) is clear, and contingencies are ready to be invoked prior to Beta launch of the application service

Updates to this page

Published 18 August 2021