Register and find a beneficial owner Alpha assessment report


Register and find a beneficial owner

From: Assurance team
Assessment date: 07/08/2024
Stage: Alpha
Result: Red
Service provider: MHCLG

Previous assessment reports

  • n/a

Service description

This service aims to solve the problem of:

  • collecting data on and accurately identifying beneficial owners of property in the UK
  • supporting beneficial owners to understand their responsibilities and to fulfil their obligations to government
  • supporting fraud and law enforcement investigations relating to property ownership
  • improving transparency across property ownership

Service users

This service is for:

  • beneficial owners
  • conveyancers
  • law enforcement officials
  • local authorities

Summary

The service team has made good progress during the Alpha in a complex policy area and challenging environment. The overall ‘Red’ rating is mainly due to uncertainty in the delivery approach and the restriction of scope to the ‘registration’ journey only, excluding the ‘publication’ and search journeys. The panel highlighted that the work done to date is good but depends on critical policy and technology decisions without which it cannot progress. The expectation is that the ‘registering’ journey will be achieved with relatively small changes to an existing HM Land Registry (HMLR) service, but this has not yet been agreed; the alternative would be to create a standalone service.

Before arranging a reassessment the team must:

  1. resolve critical policy and technology questions so there is a clear approach to the Beta phase (such as whether it will be a HMLR enhancement or a standalone service)
  2. determine the approach to publication/search journeys through which users can access and use data on beneficial owners. This will require further Alpha activities without which the service cannot meet its identified user needs or policy objectives
  3. discuss with MHCLG/CDDO Assurance teams whether this will be a new service or a ‘change’ to an existing HMLR service. Their proposed approach to the ‘registering as a beneficial owner’ journey (if agreed) is not a service in its own right, but the publication/search journey could be. This will ensure it is assessed at the right time with an appropriate scope

Things the service team has done well:

The service team has explored a complex area and made challenging prioritisation decisions, recognising their constraints. The recommended approach strives to reuse existing HMLR services before creating a new service.

They adopted a hypothesis-driven approach to their research and design to maximise their insights and progress. They engaged with a variety of potential data users including Central Government, Local Authorities, Law Enforcement agencies and NGOs/Advocacy groups in Phase 1 of Alpha to understand this complex problem space.

The service team created a Figma prototype, mirroring the GOV.UK Design System, showing the journey for a conveyancer to submit information about a beneficial owner if the service is delivered standalone (rather than as part of the existing HMLR service).

The team has spent a good amount of time analysing the data requirements of the project, and have a strong understanding of what will be needed to deliver the required functionality. They have engaged with different government organisations to understand where data can be reused, and have explored options for retrieving this data. The team has looked to reuse patterns and standards from wider government, and plans to implement open standards allowing for further reuse of the data they make available. The team has also provided some detailed information on the current HMLR technical infrastructure. If they choose to build the service into this infrastructure, then this technical discovery will be valuable.

The Alpha phase included a multidisciplinary agile team (primarily through an external supplier) with policy leads embedded for oversight and guidance. They adopted a Scrum-based approach with the typical events for planning, retrospectives, review and stand ups. They identified appropriate tools to support collaboration across the core and expanded team.

1. Understand users and their needs

Decision

The service was rated red for point 1 of the Standard.

During the assessment, we didn’t see evidence of:

  • an in-depth understanding of conveyancers, the primary user group on which the ‘collect’ service relies for its success, especially as this service places additional responsibilities on this group and there has already been some ‘pushback’.
  • research with beneficial owners to validate the assumptions underpinning the research strategy decision to deprioritise them. The team articulated clearly why beneficial owners have not been a focus of primary research, but the panel would encourage some research on the wider journey of beneficial owners, such as if/how they give consent for registering and how they would raise queries.
  • an understanding that professional users have varied needs and that inclusion must be considered from the start, for example what paper processes are still in place for conveyancers and what variations in digital confidence exist among them. The team needs clear evidence of how this has been explored and which assumptions remain to be investigated as it progresses into Beta.
  • engagement with the full range of potential users of this service, including users from different size firms and geographical locations, with different levels of experience in their role and varied levels of digital confidence

2. Solve a whole problem for users

Decision

The service was rated red for point 2 of the Standard.

During the assessment, we didn’t see evidence of:

  • an understanding of the workflows and ecosystems of primary users (conveyancers); without this it is not possible to design a holistic solution that meets user needs.
  • enough iterative design and research with prototypes to explore possible solutions and approaches fully. This is particularly important when there are several risks, such as interoperability with other government departments that are also undergoing transformation.
  • a clear roadmap incorporating the ‘find’ element, reflecting when there would be enough entries to make the service useful to more users, especially as finding the beneficial owner, not registering the owner, is listed as a key user need.
  • testing of the required confidence in data accuracy, in particular whether responsibility sits with the users (conveyancers), the service, or a ‘best endeavours’ declaration model. The existing services involved have issues with data veracity, meaning that users are not confident accepting responsibility for its accuracy.

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

During the assessment, we didn’t see evidence of:

  • plans for the support channel and arrangements. This proposed service relies on another government department (HMLR) which may or may not agree to take on this component to their existing service. If HMLR cannot meet this service need, the team would need to develop a standalone service with service-specific support channels. Until that decision is made, the joined-up experience could not be properly assessed.
  • plans for how the service might incentivise conveyancers to enter this data, how the new rules would be signposted, and how they would be phased in as part of the recommendation. For example, how would users know it was mandated, and what lead time is needed to support this rollout?
  • user research completed with existing operational teams to understand the challenges they encounter. It isn’t clear how this service will work for operational delivery colleagues who need the information on beneficial owners, or how this service will address those issues. An MVP approach for ‘find’ would be needed in private beta to test this.

4. Make the service simple to use

Decision

The service was rated red for point 4 of the Standard.

During the assessment, we didn’t see evidence of:

  • adequate and robust testing of the service with users and potential users through multiple iterations.
  • iterations of the service based on insights from testing with users; the small sample did not produce enough consensus to generate meaningful insight for further change.
  • consideration of the devices a user might opt to use and the implications for design decisions.

5. Make sure everyone can use the service

Decision

The service was rated red for point 5 of the Standard.

During the assessment, we didn’t see evidence of:

  • an understanding of the varied potential needs of primary users (neurodiversity, assisted digital, access needs) and how this might impact the design of the service.
  • an understanding of any users who might be excluded from the service, particularly as there is a reliance on the services of other government departments.

6. Have a multidisciplinary team

Decision

The service was rated red for point 6 of the Standard.

During the assessment, we didn’t see evidence of:

  • a sustainable team to develop, implement and support the service during Beta. The service team for the Alpha phase has largely disbanded and there is no clear plan for how a Beta phase (and the related Alpha phase for ‘publication’) will be resourced.
  • how the insights and outputs from the Discovery and Alpha phases will be transferred to the Beta team to maintain continuity and momentum.
  • a clear Service Owner accountable for the quality of the service and empowered to make decisions on the service’s strategy and priorities.

7. Use agile ways of working

Decision

The service was rated amber for point 7 of the Standard.

During the assessment, we didn’t see evidence of:

  • how agile ways of working will be sustained and adapted in the Beta phase.
  • how the service team has reviewed, measured and improved their ways of working.

8. Iterate and improve frequently

Decision

The service was rated red for point 8 of the Standard.

During the assessment, we didn’t see evidence of:

  • having iterated the service prototype based on user research insights. The team developed a prototype and tested it with 5-10 conveyancers but did not have sufficient time to make design iterations and test these.

9. Create a secure service which protects users’ privacy

Decision

The service was rated red for point 9 of the Standard.

During the assessment, we didn’t see evidence of:

  • a clear plan for how the new functionality provided by this service will fit into HMLR’s existing e-DRS service. Though there has been exploration into what HMLR currently offers, the team needs to understand how any additional data they capture, and later make available through subsequent journeys, may impact the security requirements of the service.
  • a plan for penetration testing, both automated and manual. Though the team is potentially looking to rely on HMLR’s existing strategies, they should look to document these as part of their development plan moving into beta.

10. Define what success looks like and publish performance data

Decision

The service was rated red for point 10 of the Standard.

During the assessment, we didn’t see evidence of:

  • a defined performance framework with clear metrics and targets indicating whether the service is successful or otherwise. This should include both value/impact evaluation as well as service quality metrics.
  • how the 4 mandatory KPIs will be measured.
  • an evaluation plan for Private Beta showing how the phase will be assessed before seeking to move into Public Beta, including user volumes, segments and completions.
  • clear plans for introducing analytics tooling (e.g. web analytics) and capability (e.g. a performance analyst) to implement performance analysis into the service and support publication of performance data.
  • data and insight informing the Alpha phase and service design, such as existing offline processes, quantitative research, and related services’ performance.
  • baseline performance for current arrangements against which the new service can be evaluated.
  • an understanding of the performance metrics (and data) for the HMLR service through which beneficial owners would be identified.

11. Choose the right tools and technology

Decision

The service was rated red for point 11 of the Standard.

During the assessment, we didn’t see evidence of:

  • a confirmed decision on where this service will be hosted. If the service will be hosted as part of HMLR’s e-DRS service then the team needs to understand and document what tools and technologies are used in that environment. In particular, they should investigate the development toolchain and how it will fit into the work that needs to be done to deliver this service.
  • a test plan which makes good use of automation. Though there has been some investigation into HMLR’s current test strategy, this has shown that there is a heavy reliance on manual testing and limited capability to run services locally. As part of this work, the team should explore options to improve this testing strategy.

12. Make new source code open

Decision

The service was rated red for point 12 of the Standard.

During the assessment, we didn’t see evidence of:

  • a defined plan for how the team will make new code open source. If the service is targeting HMLR’s platform, then this could include HMLR’s current open-source policy and processes. The team must understand how they can make any new code that they produce as part of this service open source and reusable by the public.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated green for point 13 of the Standard.

14. Operate a reliable service

Decision

The service was rated red for point 14 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the team has investigated the HMLR production environment and support arrangements to confirm it is appropriate to meet their needs. The team needs to understand what is available in terms of platform reliability, monitoring, assurance and disaster recovery. This is critical for the decision on how the service will be provided.

Next steps

In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are rated red at this assessment.

Updates to this page

Published 3 December 2024