Carry out a due diligence check alpha assessment



Service Standard assessment report

Carry out a due diligence check

Assessment date: 09/04/2024
Stage: Alpha
Result: Amber
Service provider: Department for Levelling Up, Housing and Communities

Previous assessment reports

  • Not applicable

Service description

This service makes it easier for civil servants to ensure the appropriate due diligence is carried out before government engages with external stakeholders. It achieves this by guiding civil servants through a standardised process that adheres to the government’s recently published principles of engagement. By equipping civil servants with a tool for putting the principles into practice, the key aims are: first, to reduce the risk of inadvertent harm or damage, thereby increasing confidence in decision-making; and second, to remove lack of knowledge or confidence in due diligence as barriers to engagement, thereby encouraging civil servants to engage more widely.

Service users

This service is for…

  • Civil Servant ‘Investigators’: civil servants who are planning or considering engagement with external stakeholders, and therefore need to carry out due diligence checks on an individual or organisation in order to be able to make an informed decision about whether to proceed with the proposed engagement, balancing the benefits of doing so with identified risks.

  • SMEs in Due Diligence: civil servants with subject matter expertise in due diligence, who work in dedicated due diligence functions carrying out a formal due diligence process.

  • SROs of Civil Servant ‘Investigators’: Senior Responsible Officers of the aforementioned Civil Servant ‘Investigators’.


Things the service team has done well:

  • the team successfully used existing GOV.UK Design System patterns to ensure the service was consistent and considered the needs of users with accessibility requirements (e.g. layout, colour and cognitive load)
  • the team demonstrated a good approach to prototyping and iteration, illustrating examples of content and layout changes in response to user feedback
  • the team demonstrated how they prioritised and designed the service in response to several significant policy and legal constraints

1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team has run some great user research sessions in 13 weeks, including usability testing and interviews with different user groups. However, the team must spend time conducting more user research sessions to fully understand the end-to-end user journey and user needs once the service is reproduced in Spotlight. The panel understood that the collaboration with Spotlight began only a few weeks before this assessment, so the team needs more time to iterate and to find out how end users will fare when the service is integrated into Spotlight.
  • the team has thorough as-is journey mapping and a high-level to-be concept journey in Mural. The panel expected to see an end-to-end user journey of the service that will be built and tested in the beta phase. The team must conduct usability testing with the prototype in Spotlight and must understand users’ actions before and after the digital journey.
  • currently, all the usability tests were based on scenarios. The team needs to use real cases when testing the prototype in Spotlight so that the resulting insights inform future design iterations.
  • the team must write a clear user research plan for the next phase.
  • the team demonstrated how their design responded to user research with the SME and SRO user groups; however, they have done limited testing of the prototype and end-to-end journey with these groups. Further research and testing with these user groups is required, in particular on the sign-up journey and the “search” function for previous reports (the start and end of the user journey). There wasn’t sufficient evidence to provide confidence that the service will meet the needs of these users.

2. Solve a whole problem for users

Decision

The service was rated amber for point 2 of the Standard.

During the assessment, we didn’t see evidence of:

  • overall, the team demonstrated how they have prioritised work in response to several political and legal constraints and their understanding of users’ needs established in discovery; however, the team was unable to test how the Spotlight journey meets the needs of users across user groups in the time they had. The team is confident that the needs they’ve identified can be met by integrating with Spotlight, but there wasn’t yet enough evidence to demonstrate that it will solve a whole problem for their users.
  • the team tested a limited part of the user journey with a limited set of users. This meant there was not convincing evidence that the broader journey, including its start and end (sign-up and report generation/sharing), supports a range of real users, including SMEs and SROs, in using the service successfully.
  • the team acknowledged that there is currently no systematic way to carry out due diligence. While we understand there will be mandatory training for anyone who is to use the tool, the team should explore and provide more evidence in their user research of the key pain points in adopting the new process and tool across departments, and ensure these are addressed to improve take-up across varying operational settings.

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team did not demonstrate how they’d designed an offline (or supported) service journey in response to user needs and testing.
  • to ensure that the anticipated support is sufficient and has no adverse effects on access to and take-up of the service, the team must explore and test the support channels and the offline journey of the service in Spotlight.

4. Make the service simple to use

Decision

The service was rated amber for point 4 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team should do further usability testing with SME and SRO user groups to ensure that the service helps them do the thing they need to do as simply as possible
  • the team should do further testing of how Spotlight will impact the way in which users will search for information (i.e. alternatives to search strings)
  • the team should do further testing with actual users of how the service can support users to successfully complete a due diligence check on a subject

5. Make sure everyone can use the service

Decision

The service was rated amber for point 5 of the Standard.

During the assessment, we didn’t see evidence of:

  • the team has conducted some research with users with accessibility requirements; however, the team must continue conducting research with more assisted digital users and users with access needs (e.g. a mix of people with visual, hearing, motor and cognitive impairments, including those who use assistive technology).
  • the team should also test the offline journey of the service to ensure that users are supported to use the service where needed

6. Have a multidisciplinary team

Decision

The service was rated amber for point 6 of the Standard.

The team had all expected roles for their alpha and showed recruitment plans for their next phase.

During the assessment, we didn’t see evidence of:

  • in line with their recruitment plans, ensuring the team has more DDaT roles filled by permanent civil servants.
  • agreed roles and responsibilities with Spotlight. The service team showed their governance ambition; however, since the assessment happened soon after the decision to integrate with Spotlight, the team needs to provide more evidence that this arrangement is operationally feasible.

7. Use agile ways of working

Decision

The service was rated amber for point 7 of the Standard.

The team showed a good understanding of agile ways of working and user-centred design.

During the assessment, we didn’t see evidence of:

  • testing the latest prototypes in as close to a real-life scenario as possible. For example, the panel would have liked to see more evidence of how the team increased their confidence that the tool improves the investigator’s job.

8. Iterate and improve frequently

Decision

The service was rated amber for point 8 of the Standard.

During the assessment, we didn’t see evidence of:

  • testing and iterating the new integrated Spotlight journeys
  • how the team gained confidence that the “to-be” map will be adopted across different departments. The panel would like to see more evidence of real cases of community engagement where the prototype has been tested and further iterated based on that testing.

9. Create a secure service which protects users’ privacy

Decision

The service was rated amber for point 9 of the Standard.

The team reassured the panel that data is collected on a legal basis – there will be no new powers for government as a result of introducing the tool.

During the assessment, we didn’t see evidence of:

  • workflows that include vulnerability scanning, or the workflow the team is proposing for the Spotlight module development. If the workflows are similar to those described for the custom tool, the team seems well experienced, but they would still need to ensure they use practices that catch code and open-source security vulnerabilities and issues as early as possible in the development lifecycle
  • monitoring and logging of usage and any suspicious activity – we understand the Spotlight team is well versed in this, but the community engagement team needs to ensure they do some threat modelling. The system should alert on security incidents, but also, given the highly sensitive area, ensure that any potential for misuse or malicious activity by a civil servant, SME or SRO is minimised and that all stored information is fully auditable.
  • clear alignment of data access and storage agreements with the legal arrangements for storing and collecting data.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

The team could benefit from analysing the most common engagements across different departments to support the tool’s testing strategy.

11. Choose the right tools and technology

Decision

The service was rated amber for point 11 of the Standard.

During the assessment, we didn’t see evidence of:

  • any proof of concept or value demonstration showing the solution fitting into the overall Spotlight ecosystem; it felt too early in the engagement with the Cabinet Office to fully understand that.
  • business continuity planning, provisioning and observability, which were still not clearly defined or planned at this stage.
  • accessibility remains a concern. There were good aspects – the journeys are being designed and built using the GDS Design System – but because Spotlight is a COTS platform and no demo of it was given, practices such as progressive enhancement, design considerations and usability still need to be addressed within the constraints of the platform.

12. Make new source code open

Decision

The service was rated amber for point 12 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the team was planning to open source the Spotlight module, and whether this is possible without divulging sensitive material. The team would be advised to build a plan covering the secure adoption of open source, the publishing of open-source code, and what is possible.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated amber for point 13 of the Standard.

During the assessment, we didn’t see evidence of:

  • engagement with government services that currently use the proposed platform. We suggest the team look at the patterns and processes used by other module developers to see if there is any opportunity for reuse or lessons learnt.

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.


Next Steps

This service can now move into a private beta phase, subject to addressing the amber points within six months and to CDDO spend approval.

To get the service ready to launch on GOV.UK, the team needs to:

Updates to this page

Published 14 October 2024