Report Funding Progress alpha assessment

Service Standard assessment report

Report Funding Progress

From: Central Digital & Data Office (CDDO)
Assessment date: 09/02/2023
Stage: Alpha assessment
Result: Not met
Service provider: Department for Levelling Up, Housing & Communities

Service description

The Report Funding Progress service is part of a larger Funding Service at the Department for Levelling Up, Housing and Communities (DLUHC), which includes a beta service ‘Access Funding’. Report Funding Progress is focused on the monitoring and evaluation phase of this wider funding journey. The service covers the concluding activities and aims to:

  • help grant recipient users to manage their monitoring returns and submit them to DLUHC

  • help programme delivery team users to monitor funded projects and support delivery

  • help evaluation and analysis users to prepare reports and insights

Service users

At a high level the service users fall into two categories: internal users, who work within the Department for Levelling Up, Housing and Communities, and external users, who work within funded organisations such as local authorities. These service users can be further divided into two groups:

Primary users:

  • users who submit data

  • users who process data

  • users who consume and analyse data

  • users who consume reports

Secondary users:

  • users who design funds

  • users who support funds

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the whole team was engaged with user research. This included agreeing research objectives, observing research sessions, reviewing research sessions, and discussing user needs and priorities.
  • the team made good use of previous user research, and user research generated by other teams, e.g. the Access Funding team.
  • the team shared insights using a variety of methods, for example through show and tells and communities of practice linked to related services.
  • the team has compiled a detailed list of user insights, which they plan to test further with an expanded user base.
  • the team had good engagement with the Alpha Working group.

What the team needs to explore

Before the next assessment, the team needs to:

  • undertake more extensive user research. The team was restricted by the amount of time it was allowed to spend in alpha. As a result, the team did not have sufficient time to undertake all their user research and had to rely heavily upon desk research (much of it undertaken in 2019) and a limited pool of internal users from the Alpha Working Group.
  • prioritise research with users who have accessibility needs, users with low digital confidence (preferably using the government’s own Digital Confidence Scale), and users from smaller local authorities (including community groups and parish councils), and achieve a wider geographical spread (e.g. no organisations from Northern Ireland have been included in research to date).
  • secure more time to complete this research. The team has a very clear understanding of the need to engage with more users, and which groups they need to include.
  • further develop their role profiles. The team has developed role profiles based upon the original personas developed by the Access Funding team. However, there are only four role profiles, and these do not include users with accessibility needs or users with low digital confidence, nor do they reflect the diversity of the user base. With more extensive user research the team should be able to develop more detailed profiles/personas. These should also be expanded to cover the needs of their secondary users.
  • rewrite user needs that were presented as functional or non-functional requirements. Understanding true user needs will help the team to create a service which is truly user centred. User needs should express people’s goals and not include a solution.

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team collaborated closely with the other services in the funding programme of work, Access Funding in particular. They are also working with different teams in the department across related projects.
  • the team are speaking to other departments and organisations who also have funding services

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to work in the open with other teams to make the experience of applying and delivering funding quicker and easier
  • ensure that insights can be funnelled up into other teams whose work directly affects this team, for example the work being done around simplification of data collection.
  • do additional research with external users to further validate that the scope for each service in the funding service design programme makes sense to users, and isn’t only an enabler for the way the department delivers funding
  • consider mapping out more of the journey outside of the direct service, for example how fund recipients collect and receive data before preparing and sharing it with DLUHC

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are working with operational and policy colleagues on a regular basis, and have set up a working group that feeds into design work
  • some early thinking and planning has been started around a support model for the service
  • the service is joined up with another team working on simplifying the data collection required from fund recipients

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that research and testing of designs is not limited to the working group, so that the team can be confident that the service works for a wide range of users
  • work closely with the operations team on the support model, and research and test this part of the journey with users, including notifications, call scripts and any other communication methods
  • ensure that the service name has been tested with users and that there is an understanding of what terms and language users will use to find this service

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the prototype was consistent with the GOV.UK design system and style guide, using existing components and patterns where possible
  • the team demonstrated examples of design and content iterations to make the service simpler to use, including how data is submitted, structured and filtered.

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to work with the team involved in the simplification of data collection, so that fund recipients are able to easily provide the most important information to the department
  • explore non-happy path journeys, for example overdue deadlines. Research and test notifications, correspondence and error messaging, ensuring that reporting cadences and submissions are easily understood and managed by fund recipients
  • continue to explore service-wide design/content strategies across the different funding teams (and related teams) to enable consistency and a better experience for fund recipients and internal users

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done some testing with a small number of users that have accessibility and assisted digital needs, and has plans to do more of this in beta
  • the team have been testing the service themselves using screen readers and Lighthouse to check for accessibility issues. They have also been working to test any altered GOV.UK components to make sure they meet accessibility standards - the colour coded tags in the task list being one example.

What the team needs to explore

Before the next assessment, the team needs to:

  • have a recruitment and research plan to test with more users with accessibility and assisted digital needs, both externally and internally
  • research with a wider range of users, including different sized local authorities, housing organisations and community groups, to ensure that their needs are fully understood and the service can meet them
  • ensure that the support model is thoroughly researched and tested so that users are not disadvantaged and can receive sufficient support for this service
  • continue to test and iterate new/altered design components and patterns so that they meet accessibility standards

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the Service team benefits from an active Service Owner who communicates the strategic context of the organisation to the service team, including constraints and ministerial priorities.
  • all core roles for an alpha phase were present in the team, with an adequate number of user centred design colleagues to prototype rapidly.
  • the Service Team has a well-defined governance structure with a fortnightly Steering Group meeting, which includes active SRO engagement. At a more senior level the team reports to a Departmental Process Improvement Board which includes the Chief Financial Officer - a key senior stakeholder for the overall Funding Service.

What the team needs to explore

Before the next assessment, the team needs to:

  • improve recruitment for user research, which appears to have been challenging and limited the volume of research conducted. The service team should consider whether there is more junior level administrative capacity available to support the user researcher with recruitment, both to accelerate delivery and to deliver greater value for money by allowing the contract user researcher to focus on higher value work.
  • more fully align team roles and responsibilities with the DDaT job family over beta. In particular, strengthen the Product Management role within the team to ensure core duties are not falling to wider colleagues. For example, a Product Manager should run sprint planning meetings within a team using Scrum.
  • make plans for reducing the contractor dependence within the service team by recruiting civil servants over beta, including documentation to support potential handovers. The only civil servant on the service team is currently the Service Owner. [This is a forward-looking point as the organisational context has only just made this possible].

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team are using Scrum to approximate agile ways of working and conforming well to the core ways of working outlined in the Scrum guide. For example, daily stand ups, retrospectives and sprint planning meetings are taking place.
  • Scrum of Scrums is used to ensure the service team is actively communicating interdependencies with the wider service teams under the Funding Service umbrella.
  • certain professions within the service team participate in small but active communities of practice, providing a valuable avenue for insights across linked service teams to be shared.
  • weekly internal demos were used to content crit prototypes.

What the team needs to explore

Before the next assessment, the team needs to:

  • more clearly communicate that the team is using the Scrum framework to approximate agile principles and values. Stand ups, retros and sprint planning meetings are not necessarily ‘agile rhythms’; rather, they are Scrum ceremonies. A team could use Kanban and approximate agile closely without using sprint planning meetings, for example.
  • continue to actively manage interdependencies across the wider Funding Service to remove any potential blockers to the service releasing incrementally over beta.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had adequate user centred design capability and capacity to prototype alternative ways of solving the problem.
  • the service team used the feedback loops it had available to iterate content.
  • the service team identified its riskiest assumptions to test in alpha.

What the team needs to explore

Before the next assessment, the team needs to:

  • increase the volume of user research conducted. Iterating and improving frequently should be done in response to user needs; whilst iterative prototyping has taken place, the service team has been blocked from demonstrating fundamental improvement by the insufficient user research conducted.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has completed a Threat Modelling exercise with service stakeholders.
  • the team collaborates with internal Tech Design Authority and Cyber Assurance functions on the service.
  • the team has adopted good security practices, including encryption in transit and at rest, limiting the number of components exposed to the Internet, and adoption of the Pre-Awards authentication platform with its multi-factor authentication (MFA) and single sign-on (SSO) features.
  • the team has plans to ensure all transactions and data stored within the platform have a complete audit trail.

What the team needs to explore

Before the next assessment, the team needs to:

  • explore what additional controls are required for Personally Identifiable Information (PII) data stored within the system (like contact details of a grant owner).
  • ensure that MFA is enforced on all user accounts for internal and federated authentication.
  • complete an initial penetration test and follow up on the recommended mitigations.
  • collaborate with the Data, Cyber Security and Information Assurance teams to ensure data consistency throughout the service.

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has covered the 4 mandatory KPIs in their plans for utilising performance data over beta.
  • further non-mandatory KPIs have been identified and planned for, all of which are strategically aligned to the intended business outcomes for the service.
  • the team demonstrated an understanding of how the analytics tool selected will impact data collection.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that performance data is widely shared and used as the basis for governance reporting and service team decision making across beta.
  • ensure the service team is able to routinely and easily access performance data, especially the service owner and product manager.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has evaluated buy and build options in context of user and business needs
  • the team plans to use the DLUHC strategic cloud provider, and the programming language, tools and technology used in the Pre-Awards service and across the department
  • the team has agreed with the Technical Design Authority (TDA) and Cyber Team on the reference architecture which is based on microservices and API components for data sharing
  • the team plans to use government shared services like GOV.UK Notify
  • the team collaborates internally and externally on the data model to ensure required flexibility is in place

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that technical choices meet user, business and security needs.
  • ensure all significant changes to the proposed architecture are discussed with the TDA and Cyber Team.
  • aim to have all infrastructure and services defined as code in version controlled open repositories.
  • maintain the technical documentation required to run, maintain and support the service.

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure beta service components are defined and version controlled in open repositories.
  • ensure all the repositories are accompanied by a meaningful README file and relevant licensing information.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team collaborates with other teams at DLUHC on defining open data standards for the department and other projects.
  • the team uses the GOV.UK Design System and plans to utilise the GOV.UK Notify service in the future.
  • the team plans to utilise patterns and templates developed for the Pre-Awards service to build and monitor the infrastructure.
  • alterations to the task list pattern were made as a result of testing with users, and the team have been considering how to ensure any changes meet accessibility standards.

What the team needs to explore

Before the next assessment, the team needs to:

  • engage with the Data Standards Authority (part of the Central Digital & Data Office) to ensure the service conforms with the required open data standards.
  • engage with the GOV.UK Notify team and onboard the service
  • have a plan for how to contribute to the open source community
  • continue to iterate design patterns and components when a strong user need requires it. Share any research and insight back into the cross-government design community.

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has shortlisted tools required to monitor service health and alert when required.
  • the team has shortlisted requirements and tools for application and performance testing.

What the team needs to explore

Before the next assessment, the team needs to:

  • have an initial testing, logging, monitoring and alerting functionality in place.
  • explore how service components can be scaled out automatically to sustain high load.
  • have a process to redirect traffic to a placeholder site when the service is unavailable for an extended period.
  • have a plan for how to respond to a failure of critical service components.
  • define the Business Continuity and Disaster Recovery procedures, agree with the business stakeholders on the expected Recovery Point Objective and Recovery Time Objective, and ensure the technology and the team can deliver on these expectations.

Next Steps

Reassessment

In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are not met at this assessment.

Updates to this page

Published 23 November 2023