Non-domestic Rate Reforms alpha assessment

Service Standard assessment report

Non-domestic Rate Reforms

Assessment date: 29/02/2024
Stage: Alpha
Result: Amber
Service provider: HMRC (VOA)

Previous assessment reports

  • Not applicable

Service description

The NDR Reforms programme is a design and implementation programme to reform non-domestic rating, enabling a sustainable move to three-yearly revaluations, providing greater transparency and implementing new reliefs. The programme is delivering on outcomes from HMT’s Business Rates Review and the Welsh Government’s NDR Reform.

The programme aims to:

  • implement a new requirement on ratepayers to comply with the Information Duty and Annual Confirmation, and design new processes to enable ratepayers to fulfil their obligations
  • make evidence summaries available to help ratepayers understand their rateable values
  • design and implement a new online system to support the processes above
  • design and implement a robust and intelligent Compliance function
  • streamline the Appeals process by removing the Check step as a separate process (the first step of the current ‘Check, Challenge, Appeal’ process), introducing a shortened Challenge window and changing the statutory deadline for the VOA to resolve Challenges

Service users

This service is for:

  • ratepayers
  • agents acting on behalf of ratepayers
  • Billing Authorities

Things the service team has done well

The panel was extremely impressed with the team’s approach to user research for this service. They stood up teams to understand and address different journeys and user groups, and allowed the space for deep insights to develop. The panel particularly appreciated the focus on universal barriers and low digital skills, as part of a wider understanding of how diverse the NDRR cohort can be. The team identified these users through dedication to understanding the product space, followed by targeted research with excluded groups. Overall, this was extremely good work by the team.

In addition, the team demonstrated:

  • collaborating well with service users, including frontline staff and SMEs, to develop the service
  • making changes to offline channels (engaging with digital support teams)
  • writing, testing and iterating mainstream guidance and how users will find the service via GOV.UK
  • testing for usability frequently with users who have low digital literacy or access needs
  • using the GOV.UK and HMRC Design Systems and engaging in a feedback loop with these teams
  • ensuring design consistency across the squads through regular design check-ins
  • carrying out regular usability testing with user groups with different needs
  • researching and reviewing assisted digital options
  • being aware of, and testing, their riskiest assumptions first
  • creating comprehensive artefacts and storing information in a way that helps bring new members of staff up to speed
  • working in an agile way, facilitating large teams (5 squads) in a co-ordinated way
  • a good understanding of users and their pain points
  • a clear definition of what success looks like for the service and how it will be measured

1. Understand users and their needs

Decision

The service was rated green for point 1 of the Standard.

2. Solve a whole problem for users

Decision

The service was rated amber for point 2 of the Standard.

During the assessment, we didn’t see evidence of:

  • a plan for how the user-facing service will join up with internal stages of the journey – how submissions will be processed internally by staff and how this might change their ways of working

3. Provide a joined-up experience across all channels

Decision

The service was rated green for point 3 of the Standard.

4. Make the service simple to use

Decision

The service was rated green for point 4 of the Standard.

5. Make sure everyone can use the service

Decision

The service was rated amber for point 5 of the Standard.

During the assessment, we didn’t see evidence of:

  • a clear plan for ongoing accessibility testing during the Private Beta phase

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

9. Create a secure service which protects users’ privacy

Decision

The service was rated amber for point 9 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the service will prove the identity of individuals accessing it, by ensuring each individual completes personal details verification and an identity verification check to reduce fraud vectors to the service
  • how the service will ensure verification has been completed for Agents accessing the service, as they will be using Organisation affinity types; because the service will not use the Agent Services Account, there is no irrefutable proof that the Organisation is a substantive Agent

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

11. Choose the right tools and technology

Decision

The service was rated amber for point 11 of the Standard.

During the assessment, we didn’t see evidence of:

  • how the service proposes to support users who need to migrate their accounts and/or data from the Check, Challenge and Appeal service, or plans to assist in the transition to Non-domestic Rate Reforms, for example through guidance and communications

12. Make new source code open

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated amber for point 13 of the Standard.

During the assessment, we didn’t see evidence of:

  • where the team have explored developing a reusable service for managing Agent and Ratepayer relationships that could be used by other VOA services in the future
  • where the team have explored developing a reusable service for looking up and matching geospatial data provided by ArcGIS, or enhancing existing MDTP components which currently use Ordnance Survey data for address lookup
  • where the team have explored all options for establishing an enrolment in the Enrolment and Agent Client Database (EACD) without the need to use ETMP as the head of duty
  • where the team have fully articulated the risks associated with not completing a digital two-way handshake, as with the current Agent Services Handshake, and that these risks have been accepted
  • how the service will comply with the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 and ensure compliance with the WCAG 2.2 AA accessibility standard, although the team did plan to meet the WCAG 2.1 AA standard

14. Operate a reliable service

Decision

The service was rated amber for point 14 of the Standard.

During the assessment, we didn’t see evidence of:

  • which casework could be automated through business rules in Microsoft Dynamics, and the potential impact non-automated casework may have on the VOA business area, given the expected increase of up to 20 times in customer engagements
  • how the service proposes to support users with recovering lost credentials, or the process to link properties to the correct Government Gateway account where a user has lost access and chooses to register for a new credential; this would also apply to users who have previously delegated access to an Agent and are then supported through the lost credential process

Next steps

This service can now move into a private beta phase, subject to addressing the recommendations given for the amber-rated points within three months and to CDDO spend approval.

To get the service ready to launch on GOV.UK, the team needs to address the recommendations given for the amber-rated points above.
