Non-Domestic Renewable Heat Incentive Alpha Assessment Report

The report from the Non-Domestic Renewable Heat Incentive alpha assessment on 25 March 2021

Service Standard assessment report

Non-Domestic Renewable Heat Incentive

From: Central Digital & Data Office (CDDO)
Assessment date: 25/03/2021
Stage: Alpha
Result: Not Met
Service provider: Ofgem

Service description

The Non-Domestic Renewable Heat Incentive (NDRHI) is a government environmental scheme that provides financial incentives to increase the uptake of renewable heat by businesses, the public sector and non-profit organisations.

Our new service will replace the existing service and will allow for the continuing administration of the NDRHI scheme and, in doing so, we will need to provide the following:

  • The ability for scheme participants to manage accreditation(s) and submit output data; any necessary sustainability information; and declarations.
  • Interactions with a payment system to ensure scheme participants are regularly paid for their heat produced.
  • The means for internal users to review data submissions and amendments to existing accreditations and come to swift decisions, such as on eligibility and accuracy.
  • A platform to manage customer interactions and communications.

Service users

  • NDRHI scheme participants
  • Biomethane producers
  • Consultants
  • Ofgem operational teams who administer the scheme, e.g. Amendments, Periodic Data, Payments, Audit and Compliance, Reporting and Enquiries.

1. Understand users and their needs

Assessed by: User research assessor

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used call data to identify the scope for their MVP. This, among other evidence, enabled them to define a clear focus around improved submission of ‘periodic data’
  • the team established detailed personas, which clearly define a range of users
  • the team worked closely with agents to identify pain points in the current journey and have proposed simple automations to resolve these issues.

What the team needs to explore

Before their next assessment, the team needs to:

  • establish user needs which link observed insights to specific interactions or features in the service prototype. This can be done with more granular user stories, or other methods like acceptance criteria. Currently, the user stories are very high-level and fail to demonstrate why the proposed solution is the best way to meet these high-level needs
  • identify and test which parts of the service have been designed based on assumptions or have been influenced by the existing service and test these assumptions with users.

2. Solve a whole problem for users

Lead assessor, particularly with design and research input

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have identified a well-defined MVP and are thoughtfully bringing incremental value to their service redesign
  • the team are acutely aware of the compliance issues that users face because of failings of the current service, and understand the value to the user that service improvement is likely to bring
  • the team identified points of repetition and error within the current service and are focussing effort on removing these
  • the team clarified the problem space by creating a traceability mapping matrix and a fishbone analysis to establish external and internal pain points
  • the team have started to look at solving the whole problem for users, identifying the key pain points throughout the NDRHI journey

What the team needs to explore

Before their next assessment, the team needs to:

  • test further iterations of their journey and refine their MVP further. Testing has been limited. The team need to think about alternative designs, because these may generate unexpected insights
  • keep prototyping and thoroughly testing new functionality as if they were in Alpha. The team are holding back a large number of features which they intend to add after Beta. They have a well-defined scope, but it is very narrow
  • look at the full end-to-end journey and how users would find the service (especially as NDRHI scheme registration closes on 31 March 2021)
  • conduct some contextual research to better understand user behaviour and any problems they may have before directly interacting with the NDRHI service
  • look at the potential opportunity of reorganising Ofgem business areas to more coherently administer the NDRHI scheme, and what would be needed in order to do this
  • maintain clarity on the user’s main goals: to get paid for the heat they produce and to remain compliant with statutory regulation. The team talked about many user needs, and keeping these core goals in view will help prioritise them.

3. Provide a joined-up experience across all channels

Lead assessor

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • research was done with internal and administrative teams to create initial personas, identify pain points and establish user needs
  • the service has its own branding but the team are using GOV.UK patterns consistently
  • the team have started to look at improving the process for back-end staff.

What the team needs to explore

Before their next assessment, the team needs to:

  • do more research to understand the platforms and devices users are likely to use, rather than basing decisions on the current service, which has been identified as difficult to use
  • consider potential options for joining-up administrative staff to best meet the user needs. This includes aiming to reduce any potential fragmentation when working across different business areas within Ofgem.
  • consider consistent use of language when interacting with users through the user interface, email or on the phone to eliminate confusion.

4. Make the service simple to use

Design assessor

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified the pain points of the existing system
  • the team have focussed on improving content and simplifying the journey for users, removing unnecessary steps and questions where possible
  • the team’s idea to introduce an overview page is good. Users would be able to see what questions they’ll be asked, how long it might take them and check their answers in one place. It could also allow the team to remove more buttons on the question pages (there should not be multiple calls to action on a page) and the progress indicator.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore a broader range of solutions for upcoming features. Currently, a lot of the service design resembles the existing service, which has presented pain points to users. It is recommended that the team explore other ways of meeting user needs
  • question additional steps, such as needing to log into an account or to use a desktop device.

5. Make sure everyone can use the service

Design assessor

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • users who need additional support are expected to use the support line, where staff can complete the online service on their behalf
  • the team are planning to have external accessibility audits during beta.

What the team needs to explore

Before their next assessment, the team needs to:

  • test with users with lower levels of digital skill. The team plan to do this during beta, but it feels as though they have missed an opportunity to understand how the service could work best for these users during alpha.

6. Have a multidisciplinary team

Lead assessor

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • all the essential roles that need to be on a multidisciplinary team were in place during alpha
  • a Service Designer was on the team from the start, working across Ofgem and identifying potential components that could be used for the NDRHI service
  • although the team lost a User Researcher towards the end of alpha, it was recognised that this role was still required and interim cover was put in place until a full-time role could be made available
  • the team have recognised the need for sustaining knowledge-share across the whole team, especially as there is a balance of contractors and Civil Servants. Activities include developers attending user research sessions and quarterly joint user story writing sessions.

What the team needs to explore

Before their next assessment, the team needs to:

  • revisit the skills needed in beta and how they are structured in scaled Agile teams. For example, there are three different work streams implementing discrete features, each with its own Business Analyst. This could introduce a risk of disconnected process flows, fragmented work management and siloed working
  • ensure demand management of shared skills across the streams of work (for example, the User Researcher) is in place, and ensure one workstream does not monopolise a single person’s time
  • integrate a Performance Analyst into the team to develop the performance framework before moving into beta. In beta, this needs to be a full-time role.

7. Use agile ways of working

Lead assessor

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a mature understanding of digital ways of working using Agile Scrum. The team run fortnightly sprints incorporating the expected ceremonies, including stand-ups, sprint reviews and retrospectives
  • the team are working competently with online collaboration tools to work remotely, including Mural, Confluence and Teams
  • the team are taking multiple opportunities to develop and grow through continuous improvement. This includes having an Agile coach on the team, and leveraging retrospectives to hone how the team works together
  • the team are taking an approach that ensures the service has the greatest value for users and the team are using a value proposition canvas to help prioritise work in the backlog
  • the team are liaising with extended stakeholders to help make informed decisions, including Policy and Subject Matter Experts who have a detailed knowledge of the NDRHI scheme.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure risks are mitigated and challenges are addressed when scaling Agile in beta, including a lack of cultural buy-in from wider stakeholders who are not directly part of the Scrum team, disconnected process flows and work management, and potential siloed working.

8. Iterate and improve frequently

Lead assessor

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have worked with subject matter experts to create content and have iterated it based on research with internal and end users
  • the team are considering bigger changes to the service, for example, combining periodic data with payment processes, which are currently handled by separate teams but do not need to be.

What the team needs to explore

Before their next assessment, the team needs to:

  • the alpha phase is an opportunity to try a number of different approaches to solving a problem for a broad range of users, and the team have missed this opportunity. By focussing on iterating what already exists, the team are overlooking potential opportunities to make the service even better
  • investigate further iterations with a breadth of users experimenting with different design patterns. The team should have gathered sufficient user insight to be confident that iterations are supporting user needs and aim to solve a whole problem
  • run a couple of sprint cycles to further test the end to end journey with a new group of users
  • further refine in beta to sustain continuous improvement.

9. Create a secure service which protects users’ privacy

Tech assessor

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • they have put security first in their consideration of technical choices, have reviewed relevant certifications and have ensured data sovereignty can be maintained in the UK
  • plans are already in place for external security testing and accreditation of the new system before any sensitive data will be migrated from the existing system.
  • the team plan to scan both the cloud infrastructure and source code for potential security vulnerabilities.
  • access to the administrative functionality for “internal” Ofgem users will be restricted by IP address, giving an additional layer of security for these higher-risk activities in the system.

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate automated dependency update tools (such as Dependabot) to ensure that any third party software libraries in use are kept up to date
  • define a process that ensures both infrastructure and software are regularly security patched.
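
As an illustration only (not the team’s actual setup), automated dependency updates of the kind recommended above could be enabled with a minimal Dependabot configuration. This sketch assumes the service’s .NET Core code lives in a GitHub repository with the project files at the repository root:

```yaml
# .github/dependabot.yml — minimal sketch; the ecosystem, directory and
# schedule below are illustrative assumptions, not the team's real config.
version: 2
updates:
  - package-ecosystem: "nuget"   # .NET Core dependencies (NuGet packages)
    directory: "/"               # location of the project/solution files
    schedule:
      interval: "weekly"         # raise update pull requests once a week
```

With a file like this in place, the hosting platform raises pull requests when third-party libraries fall out of date, which the team can review and merge as part of their regular patching process.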

10. Define what success looks like and publish performance data

Analytics or lead assessor

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have started to think about success factors for the service, and created an initial foundation to build upon when measuring service performance
  • the team are aware of the NDRHI performance data that BEIS requires to be made publicly available
  • the team are aware of the stakeholders who may be interested in performance data.

What the team needs to explore

Before their next assessment, the team needs to:

  • integrate a Performance Analyst into the Scrum team
  • consider bespoke performance indicators that are more focussed on user needs, in addition to the four generic KPIs and business-focussed KPIs
  • create a performance framework that details the KPIs, how goals are going to be measured, the measurement tools that are going to be used and how insight is tied back to user needs to demonstrate whether these are being met or not
  • investigate measurement tools in addition to Google Tag Manager (GTM). GTM will not provide quantitative numbers on actual users due to opt-in cookie preferences. Splunk is an example of a tool that could be used alongside GTM to give more precise insight
  • be aware that personally identifiable information (PII) has to be omitted when using GTM, for security purposes
  • consider using analytics reporting tools such as Google Data Studio for visualising needs-based service performance for key stakeholders
  • make contact with the data.gov.uk team for publishing KPIs.

11. Choose the right tools and technology

Tech assessor

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are planning to use some existing APIs and services to avoid rebuilding existing functionality from scratch
  • the team have built a working prototype using GOV.UK Design System patterns and the AngularJS framework that has allowed parts of the user journey to be tested and iterated during Alpha, and which has allowed the team to prove the utility of some Azure components in advance of their use in the Beta development.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the offerings and limitations of technologies already used elsewhere within Ofgem, as well as dependencies on components and access to data pools. In a future assessment, it would be useful to expand on how the team have considered options and how these options meet service and user needs
  • clarify the status of the various Ofgem “common platforms” and “shared services”, and how these will be used or re-used. There are two potential issues here: first, devolving responsibility for core parts of the NDRHI service to other teams may mean the service team is not fully empowered to fix issues that arise. Second, if NDRHI develops a new “shared service”, the team may be obliged to provide support and feature development for other teams across Ofgem, which could have a detrimental impact on delivery of NDRHI itself
  • consider other platform hosting offerings, as the choice of Azure-specific technologies could tie the service to a single third-party supplier, and whether there are open and/or more portable technologies that could be used in their place (for example, OAuth rather than Active Directory)
  • ensure the new service works for all users, including those without JavaScript-enabled browsers.

12. Make new source code open

Tech assessor

Decision

The service did not meet point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • understand the unequivocal requirement of both the Technology Code of Practice (point 3) and the Service Standard (point 12) to make all new source code open
  • make source code developed during the Alpha available in a public repository
  • commit to developing all new components of the Beta service in the open, except where there is a specific demonstrable reason to keep it private (e.g. the implementation of fraud detection algorithms).

13. Use and contribute to open standards, common components and patterns

Tech assessor

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are using GOV.UK Design System components where possible, and are committed to feeding back any new patterns they find they need to develop as part of this service
  • the team have opted to use open technologies such as Kubernetes and .NET Core
  • the team are considering either integrating with GOV.UK Notify for the new service, or re-using an existing Ofgem notifications service; either way, they will not build messaging functionality from scratch.

What the team needs to explore

Before their next assessment, the team needs to:

  • further explore if there are other open components and technologies that could be leveraged within the service, rather than proprietary products.

14. Operate a reliable service

Lead, design and technology assessors

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have identified the importance of technical resilience for the service, which has been included in the success factors. This is also a performance indicator that Ofgem needs to feed to BEIS.

Next Steps

Alpha reassessment

In order for the service to continue to the next phase of development, it must meet the Standard and get GDS spend approvals. The service must be re-assessed against the following points:

  • Standard point 4: Make the service simple to use
  • Standard point 8: Iterate and improve frequently
  • Standard point 12: Make new source code open

The panel recommends the team sits their next assessment in 2 to 3 weeks’ time. Speak to your Digital Engagement Manager to arrange it as soon as possible.

Updates to this page

Published 10 June 2021