Apply for a Grant alpha assessment
Service Standard assessment report
Apply for a Grant
From: Central Digital & Data Office (CDDO)
Assessment date: 09/06/2022
Stage: Alpha
Result: Not met
Service provider: Cabinet Office - Grants Management Function
Service description
At present, there is no single go-to place to find or apply for government grants. It is complicated, time-consuming and costly for applicants and exposes the government to significant risk of fraud, error and duplication of funding. It also leads to some organisations not being aware of grant opportunities. This pilot service will provide applicants with a single go-to place to find and apply for government grants and provide grant administrators with the tools required to build digital grant applications and integrate with the Spotlight due diligence tool for risk management.
Service users
This service is designed for:

- Grant Applicants
  - VCSE Organisations
  - Small and medium enterprises (SMEs)

  Although the pilot focuses on VCSE and SME organisations, the service is available to all applicants.

- Grant Administrators
  - Government Body Grant Admins
  - Arm’s Length Body Grant Admins
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the service team had built on what they learned from Discovery
- a range of research methods were used
- there was a clear view of users, and interesting behavioural archetypes for the team to use when designing and iterating the service
What the team needs to explore
Before their next assessment, the team needs to:
- do more research into the unknowns and untested assumptions of the end-to-end service
- clarify and research the assisted digital route through the service
- demonstrate the spread of users spoken to across the digital inclusion scale, to give reassurance that the spread is sufficient
- consider user research expertise on the team. Whilst the team presented well and were knowledgeable, no one on the team is a user research specialist. For a service of this size, that may be a risk for the team.
- meet grant service teams whose services are already in public beta and consider what could be reused, evidencing that at the next assessment
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that the team:
- worked in the open and engaged with stakeholders inside and outside the organisation
- showed signs of working with other teams and organisations to solve a whole problem for users
- had built on what they learned from Discovery
What the team needs to explore
Before their next assessment, the team needs to:
- tie work back to user needs and jobs to be done, to evidence a clear narrative of what the team is addressing in response to the research
- be able to explain how the service the team is working on will join up with the other grants services already in public beta into a journey that solves a whole problem for users, demonstrating the connections between these transactions and journeys
- expand the scope the team considers to other areas and solutions, for example creating an API so that third parties can build applications that connect with the service (see the sketch after this list)
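To make that last suggestion concrete, here is a minimal sketch of the kind of third-party integration an API would enable: a client application listing open grants for its own users. The host, endpoint, query parameter and response fields are all hypothetical placeholders for illustration; no such public API exists yet.

```typescript
// Hypothetical sketch only: the host, endpoint and fields are illustrative
// placeholders, not a published Grants API.
const BASE_URL = "https://grants.api.example.gov.uk"; // placeholder host

interface GrantSummary {
  id: string;
  title: string;
  closingDate: string; // ISO 8601 date
}

// A third-party application listing open grants for its own users.
async function listOpenGrants(): Promise<GrantSummary[]> {
  const response = await fetch(`${BASE_URL}/grants?status=open`);
  if (!response.ok) {
    throw new Error(`Grants API returned ${response.status}`);
  }
  return (await response.json()) as GrantSummary[];
}

listOpenGrants().then((grants) => {
  for (const grant of grants) {
    console.log(`${grant.title} closes on ${grant.closingDate}`);
  }
});
```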
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has demonstrated that designers and user researchers are working with front line operations staff
- the service team had built on what they learned from Discovery
What the team needs to explore
Before their next assessment, the team needs to:
- consider how users who have no access to the internet will be able to access the service in future
- be empowered to find the best way of solving the problem, rather than working up from an existing solution, which currently appears to be a constraint
- provide more evidence of how the design and usability testing sessions have shaped the service
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that the service team:
- has tested for usability with actual and potential users, using appropriate research techniques
- has made sure the service helps the user to do the thing they need to do as simply as possible
- is consistent with the design of GOV.UK
What the team needs to explore
Before their next assessment, the team needs to:
- enhance the usability testing by adding ‘routes’ and ‘error handling’ to the prototype (see the sketch after this list)
- provide A/B testing evidence for design patterns tested with users, for example the ‘warning text’ pattern versus the ‘before you continue’ pattern
- add conditional logic to the prototype to allow more realistic usability testing; the prototype should be 1 to 2 sprints ahead of development, as close to the finished product as possible
- use established patterns for grouping buttons (inline) if necessary, but avoid more than one call to action where possible
- explore the difference between ‘actions’: for example, should a button say ‘Save’ or ‘Save and continue’? Consider what happens in the backend (is the data cached or saved permanently?) and how the user understands this
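As a minimal sketch of the routing, error handling and conditional logic recommended above, the Express-style handler below branches a prototype journey on a form answer. The page names and the ‘organisation-type’ field are hypothetical examples; in the GOV.UK Prototype Kit itself this logic would sit in the routes file (in JavaScript), with answers typically read from session data.

```typescript
// A sketch of prototype branching: page names and the
// 'organisation-type' field are hypothetical examples.
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: true })); // parse HTML form posts

app.post("/organisation-type-answer", (req, res) => {
  const orgType = req.body["organisation-type"];

  // Error handling: if the question was not answered, send the user
  // back to the question page with an error flag to display a message.
  if (!orgType) {
    return res.redirect("/organisation-type?error=not-answered");
  }

  // Conditional logic: branch the journey on the answer so usability
  // testing exercises realistic routes rather than one linear path.
  if (orgType === "vcse") {
    return res.redirect("/vcse-details");
  }
  return res.redirect("/company-details");
});

app.listen(3000);
```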
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has used the GOV.UK design system that meets accessibility standards
- the service team had built on what they learned from Discovery
What the team needs to explore
Before their next assessment, the team needs to:
- avoid excluding any groups within the audience they’re intended to serve, or at least plan in detail how the service will meet these users’ needs in the next phase
- recognise that public sector organisations have a legal duty to consider everyone’s needs when designing and delivering services; the service team needs to address this critical point
- provide assisted digital research data, and/or evidence that the content reduces cognitive overload for neurodivergent users
6. Have a multidisciplinary team
Decision
The service did not meet point 6 of the Standard.
What the team has done well
The panel was impressed that:
- there is a multidisciplinary team providing expertise across different disciplines
- the service team are working well together and with policy colleagues, and there is a clear service owner in place
- there is a supplier in place to work across problem areas until 2023
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that there is a clear distinction between roles on the service team, most notably a product manager providing leadership against the vision. The panel noted that the product manager highlighted in the presentation was also working as a user researcher.
- ensure that they can appropriately challenge and set their scope around the user needs they are meeting, actively considering the whole problem and user journey
- develop clear problem ownership of an end-to-end transaction, or clearly explain how they are empowered to make decisions across the journey if they are only working on a component part, for example ‘apply for’ rather than ‘find’ a grant
- continue to ensure that policy is challenged and adapted based on user research
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team are working in scrum and running a series of design thinking sessions, providing a good balance of agile techniques
- there is an appropriate level of governance, and the service owner is actively working with the team
- the team actively encourage operational and policy colleagues to see the work they are undertaking
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that they are actively learning and adapting from their research, and using the opportunity to iterate the whole design rather than just parts of the problem
- consider how the team can use agile techniques to de-risk the problem space and ensure they inspect, learn and adapt from findings
- ensure that the team has the appropriate challenge and empowerment in place with the senior team to consider different options, ideas and hypotheses to the problem, using the governance they have established
8. Iterate and improve frequently
Decision
The service did not meet point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working in an iterative way and using agile methods to improve their thinking and knowledge
- the service team are passionate about working on the problem space
What the team needs to explore
Before their next assessment, the team needs to:
- clearly demonstrate iteration of content design and interaction design, rather than of features and functionality, and show how they are actively using user feedback to give users a clearer path to completing first time
- gather further feedback from grant applicants, especially as the team plan to use a series of templated questions provided by grant administrators, to check that these are fully understood when completing the journey. The service team should highlight how these questions have been iterated, and the content changed, based on their research.
- ensure that alpha provides the opportunity to remove impediments for users. The service team discussed that in their latest research users still found some areas problematic, which the panel would like to see addressed before the service is in beta
- develop a clear plan of areas to test and explore in beta, to cement knowledge of how users use the service on their own, and of how the service metrics will inform design
9. Create a secure service which protects users’ privacy
Decision
The service did not meet point 9 of the Standard.
What the team has done well
The panel was impressed that the team:
- have recognised that the privacy of data is important
- have identified an IDAM (identity and access management) solution to manage access
- have a pattern that considers how individual data will be managed and stored
What the team needs to explore
Before their next assessment, the team needs to:
- have completed a spike (an implementation against one set of users) to identify whether their IDAM solution will work with the rest of their architecture; this should be designed and documented, with the results available at the assessment (see the sketch after this list)
- complete a threat analysis of the security threats to their systems and the mitigations for those threats; design the security architecture for their system based on those risks and threats, and design an IDAM solution to fit with it. All of the above needs to be documented
- provide a more detailed DPIA covering how the IA of different groups will be managed, and design a data storage solution that can be managed at scale across government rather than point-to-point for this solution
- demonstrate how data is protected and managed by the architecture of the data environments, and how IDAM will be used to ensure appropriate access is given to the data stored in the grants data store
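To illustrate the shape such an IDAM spike could take, the sketch below verifies a signed bearer token before any applicant data is returned. The issuer, audience and key handling are hypothetical placeholders; a real spike would use the chosen IDAM product's actual discovery and key endpoints.

```typescript
// IDAM spike sketch: issuer, audience and key source are placeholders.
import jwt, { JwtPayload } from "jsonwebtoken";

const ISSUER = "https://idam.example.gov.uk"; // placeholder identity provider
const AUDIENCE = "apply-for-a-grant"; // placeholder API identifier
const PUBLIC_KEY = process.env.IDAM_PUBLIC_KEY ?? ""; // provider's RS256 signing key

// Verifies a bearer token and returns the applicant identifier (the
// 'sub' claim). Throws if the token is invalid, expired, or was not
// issued by the expected provider for this service.
function authenticate(bearerToken: string): string {
  const claims = jwt.verify(bearerToken, PUBLIC_KEY, {
    algorithms: ["RS256"],
    issuer: ISSUER,
    audience: AUDIENCE,
  }) as JwtPayload;

  if (!claims.sub) {
    throw new Error("token has no subject claim");
  }
  return claims.sub;
}
```

Running such a spike against one real user group, and documenting where the token flow breaks down, would give the kind of evidence the panel is asking for.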
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team have considered their performance framework and metrics across their scope of the journey
- the team are planning to use a range of tools to evaluate service performance, and actively exploring some of their data constraints
What the team needs to explore
Before their next assessment, the team needs to:
- actively consider the targets to measure against, based on current levels of activity
- demonstrate how they are creating greater links with the overall service metrics, including the ability to find a grant
- develop their plans to publish and share performance data across the team so that it can be seen and used, and so that users across government who are involved in the grants process know how to access it
- consider how the suggested metrics support the different users and their needs in reaching their overarching goal, clearly demonstrating the difference between grant administrators and grant applicants
- review or consider their roadmap for understanding metrics across the whole service journey, for example the number of grants awarded
11. Choose the right tools and technology
Decision
The service did not meet point 11 of the Standard.
What the team has done well
The panel was impressed that the team have:
- thought through how the critical systems (SAP) will form the basis of their architecture
- delivered a good pattern for information flow, given the technology constraints
- considered the basic security of their system
What the team needs to explore
Before their next assessment, the team needs to:
- review the proposed architecture: it is very point-to-point, considers only the integrations between the systems, and does not rigorously consider how this would scale across government. An architectural diagram of how grant administrators will access and use the data once it is uploaded into the grants data store needs to be designed and documented
- provide the end-to-end service design and service architecture, including how the data is processed afterwards and accessed by applicants and administrators
- consider the service design of how the centralised system hands over to, or gives access to, departments to make decisions; this needs to be documented and shown, with the data architecture and data flow post-submission better illustrated for both grant applicants and grant administrators
- consider specifically whether using one instance of SAP as a grants data store, integrated with another instance of SAP, would be a good solution if data only needed to flow into Spotlight and GGIS. However, grant administrators will need to access and extract the actual data for their own processes, validation and payment, and doing this solely through a SAP integration will prove complex and difficult
- consider that the architecture needs to be separated out so that the data collected is hosted within a system or data environment that can be accessed securely and easily by other departments. This interim data store can still integrate with Spotlight and GGIS, but also needs to be available via APIs for grant administrators (see the sketch after this list)
- define the full IDAM design, which has not been considered in enough detail at this stage, both for managing grant applicants and for managing grant administrators; grant administrators’ access to the raw grant data itself needs to be considered in more detail
- review the assumption that most government departments use Salesforce to manage grants; this needs to be explored further to ensure the architectural integration assumptions are well tested. Otherwise, an alternative approach will need to be considered to avoid locking all of government into a single proprietary technology solution
- consider the security architecture: how the solution will be managed and monitored, and how threats are managed and avoided, also needs to be demonstrated and documented
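A minimal sketch of the separated-out interim data store described above: submitted application data sits behind its own API, gated by an IDAM check, so departments that cannot integrate with SAP or Salesforce can still retrieve it. The routes, middleware and scheme identifiers are hypothetical placeholders.

```typescript
// Sketch of an API in front of a separated grants data store.
// Routes, middleware and identifiers are hypothetical placeholders.
import express from "express";

const app = express();

// Placeholder IDAM check: a real service would verify the caller's token
// against the IDAM provider and check their department's entitlement
// to the requested scheme.
function requireGrantAdministrator(
  req: express.Request,
  res: express.Response,
  next: express.NextFunction
): void {
  const token = req.headers.authorization?.replace("Bearer ", "");
  if (!token) {
    res.status(401).json({ error: "missing credentials" });
    return;
  }
  // ...verify the token and the caller's entitlement here...
  next();
}

// Departments pull submitted applications for their own scheme over HTTP,
// rather than integrating point-to-point with SAP or Salesforce.
app.get(
  "/schemes/:schemeId/applications",
  requireGrantAdministrator,
  (req, res) => {
    // ...fetch applications for req.params.schemeId from the data store...
    res.json({ schemeId: req.params.schemeId, applications: [] });
  }
);

app.listen(3000);
```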
12. Make new source code open
Decision
The service did not meet point 12 of the Standard.
What the team has done well
The panel was impressed that the team:
- made good open-source choices and took a good approach with the front-end technologies
- took a good approach to storing code on GitHub, as per CDDO standards
What the team needs to explore
Before their next assessment, the team needs to:
- consider that providing integrations and access to grants data solely via the Salesforce API carries scalability risks for this architectural approach
- demonstrate how APIs and supporting services will give other departments that do not have Salesforce, or cannot integrate with Salesforce, secure access to the grants data
- consider which elements of the end-to-end service can and will be open source, and demonstrate the thinking behind their decisions around this
13. Use and contribute to open standards, common components and patterns
Decision
The service did not meet point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the proposed solution uses common technologies and a common IDAM approach in the development of the front end
- there was a good use of GDS patterns for the front end
- there was a good use of service design patterns
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how open standards and common technical components are reflected in the architecture design and patterns
- explore how the back-end systems and connections form a pattern for interoperability and for access to the systems by other departments
- consider that, if the intention is for other departments to connect via a SAP API or via another data store solution with an API, the team need to demonstrate how this is a repeatable pattern for connectivity
- consider the expectation that the IDAM solution should follow open standards (for access to data, not to the front-end system).
- provide the appropriate level of documentation to demonstrate this service standard has been met
14. Operate a reliable service
Decision
The service did not meet point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team considered the existing technology stack and reused similar technologies to support the service
What the team needs to explore
Before their next assessment, the team needs to:
- provide documentation of how the new integrations, IDAM and access routes will be supported, scaled and managed
- demonstrate that the solution can work for most government departments and provide a consistent and reliable service to all those who need access, in particular to the grant data and for grant applicants
- demonstrate in particular how the IDAM solution will be managed and supported, and whether it can work at scale for the needs of a cross-government service
- demonstrate that the back end of the solution is appropriately designed for use at scale across government (as per the previous recommendations above) and that this architecture is reliable, scalable, managed and supported