Search Local Land Charges live assessment

Service Standard assessment report

Search Local Land Charges

Assessment date: 25/04/2024
Stage: Live Assessment
Result: Green
Service provider: HM Land Registry (HMLR)

Previous assessment reports

Service description

This service aims to solve the problem of conducting a local land charges search as part of the house-buying process. The service provides a means of searching for local land charge information (a type of restriction on land) in a variety of ways, and provides users with free search results near-instantly. Users also have the option of paying for a PDF official search certificate, which conveys the same essential information.

HMLR holds a central, digital register of these charges, which is not a fixed dataset. Searching this register is a statutory service provided by HMLR, and as such is subject to various legislative constraints.

Service users

This service is for:

  • Personal Search Companies
  • Conveyancers
  • Solicitors
  • Estate Agents
  • Surveyors

Things the service team has done well:

On user research and design standards

  • the team demonstrated how they’re using their CRM and support team records, plus service feedback and wider customer engagement activities, to understand user pain points in the service. Feedback loops are in place, including reviewing logs from users of the assisted digital route so the team can learn from their experience, as well as tracking other LLC queries that come in and using these to improve the service
  • the team evidenced a comprehensive understanding of their users and their workflows and the differences in needs/search behaviours and contexts between citizen and professional users
  • (Points 2 and 5) the team tested with users with a range of access needs. They demonstrated an understanding of the barriers around using maps, and are overcoming these barriers by making the Get Help route more prominent and easier to access. They provided examples of their user-centred design process and the way they use findings from research to identify where and why users encounter problems when using the service.
  • (Point 2) the team demonstrated their understanding of obtaining local land charges information as part of a wider end-to-end journey that involves obtaining multiple pieces of documentation from local authorities. They are aware of how each user type conducts the search task and have considered this in the design of the search results page and the paid output from the service. They are exploring ways to make the free results more valuable to users and more inclusive
  • (Points 1, 3 and 4) the team shared how they prioritise insights and use personas and user feedback to inform further research plans. For example, the support team can categorise calls and feedback to provide an insight route for the team. The team understand what common user issues relate to and have designed appropriate in-service content and help routes to reflect this. They have clear plans in place to continue to iterate and improve the service, with further rounds of user research on their roadmap.
  • (Points 4 and 5) the team is using the GOV.UK Design System components and patterns, which helps to meet accessibility standards. They demonstrated an approach to iteration that’s based on the key findings from usability testing. For example, the Get help design, feedback form and Other ways to search design.
  • (Points 3 and 4) the team described the research findings and principles they’ve taken from looking at other services with search results. They described how the unique needs of their service users are central to their design rationale and have iterated after each round of usability testing.
  • (Point 4) the team demonstrated how they’ve designed the search journey in line with research findings from both user groups on search behaviour. For example, they explored a range of search parameters and used findings from usability testing to prioritise presenting postcode search.
  • (Point 4) the team shared some user research findings on the importance of displaying the charge location as a more impactful development than filtering. The insights from regular usability testing and engagement with the customer support team contribute to ongoing design iteration.
  • (Point 5) the team has used the DAC accessibility report to guide and prioritise their approach to fixes and improvements to make the service easier to use for all. They’ve focused initially on the five areas highlighted in the report.
  • (Points 4 and 5) the team described how DAC provided guidance on the map design in the accessibility report and how they’ve used this to make iterations. They’re researching solutions to accessibility issues around the charge area maps and are providing a contact option to support users. They’ve also developed a more structured priority of information around the instructions to users on editing the map and articulated a robust rationale for their approach.
  • (Point 5) the team has updated the accessibility statement with information about the map so that users understand limitations. They described continuing engagement with other services that use maps to share learnings.

On technology standards

  • the team deliver efficiently by making good use of shared tools from both HMLR and wider government, and have open-sourced the code they have produced. They have a robust, mature platform based on AWS and a range of monitoring processes and tools to ensure a reliable service. The team could reduce their dependency on manual testing by exploring automated accessibility testing using tools such as Axe or Pa11y (a minimal sketch follows this list).
  • the team have implemented all necessary security measures to protect user data.
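
As a minimal sketch of the kind of automated accessibility testing suggested above (the URLs, project setup and CI wiring here are illustrative assumptions, not the team’s actual configuration), a Pa11y check run against key service pages might look like this:

```typescript
// Minimal sketch (assumed setup): run Pa11y against key pages of the service
// and fail the CI job if any issues are reported at the WCAG2AA standard.
// The URLs below are placeholders, not the service's real addresses.
import pa11y from 'pa11y'; // assumes the pa11y package and "esModuleInterop" are available

const pages: string[] = [
  'https://search-local-land-charges.example/search-area',    // hypothetical page
  'https://search-local-land-charges.example/search-results', // hypothetical page
];

async function checkAccessibility(): Promise<void> {
  let issueCount = 0;

  for (const url of pages) {
    // Pa11y loads the page in headless Chrome and runs its default accessibility checks
    const result = await pa11y(url, { standard: 'WCAG2AA' });

    for (const issue of result.issues) {
      issueCount += 1;
      console.error(`${url}\n  [${issue.code}] ${issue.message}\n  at ${issue.selector}`);
    }
  }

  if (issueCount > 0) {
    console.error(`Found ${issueCount} accessibility issue(s).`);
    process.exit(1); // a non-zero exit fails the pipeline, catching regressions early
  }
}

checkAccessibility();
```

A similar check could be written with axe-core instead; either tool can run in an existing pipeline alongside the team’s manual testing.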

On performance standards

  • the team maintains a performance framework, deriving their success metrics from the high-level aims of the service and continually reviewing the metrics they use. Data is widely available to all team members through a set of dashboards. The team gave good examples of how data was used to help make decisions and to evaluate the effectiveness of changes.

On multidisciplinary team and agile delivery

  • the team have worked hard to bring together the right mix of skills and have created a team identity within a programme of over 200 people.
  • they model the value that long-lived teams bring to service development, with an 80:20 balance of civil servants to contractors. The team’s efforts on iterative user research, communications, sharing information across the different teams, and ways of working are focused and timely.
  • the team is robustly built; they iterate frequently, building and growing their learning and tacit knowledge, and model ways of working for others on the programme.

1. Understand users and their needs

Decision

The service was rated green for point 1 of the Standard.

2. Solve a whole problem for users

Decision

The service was rated green for point 2 of the Standard.

3. Provide a joined-up experience across all channels

Decision

The service was rated green for point 3 of the Standard.

4. Make the service simple to use

Decision

The service was rated green for point 4 of the Standard.

5. Make sure everyone can use the service

Decision

The service was rated green for point 5 of the Standard.

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

12. Make new source code open

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated green for point 13 of the Standard.

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.


Next steps

Green - live

This service can now move into a live phase, subject to implementing the recommendations outlined in the report and getting approval from the CDDO spend control team.

The team should repeat the development phases (discovery, alpha, beta and live) for smaller pieces of work as the service continues running, following spend control processes as necessary.
