Great British Insulation Scheme suppliers reporting service beta assessment


Service Standard assessment report

Great British Insulation Scheme suppliers reporting service

From: Department for Energy Security and Net Zero (on behalf of CDDO)
Assessment date: 29/11/23
Stage: Beta
Result: Not Met
Service provider: Ofgem

Previous assessment reports

Alpha report not published on GOV.UK – internal DESNZ assessment

Service description

The Great British Insulation Scheme (GBIS) is a scheme that mandates certain energy suppliers to deliver energy efficiency measures to low energy-efficiency homes in England, Wales and Scotland. The scheme is regulated under the Energy Company Obligation (ECO) legislation. In-scope suppliers will report to Ofgem monthly on the energy efficiency measures that have been completed (either by them or by third parties). At the end of the scheme, Ofgem will determine whether each supplier has met its obligation using the number of approved measures on the register.

The digital solution being developed will allow obligated energy suppliers to upload their completed customer energy efficiency measures to Ofgem each month, so that these measures can be assessed against the eligibility criteria of the scheme. Assessments will be carried out by Ofgem internal users and will be part automated and part manual.

At alpha, the service was assessed internally in DESNZ by a departmental panel, on the basis that each bulk upload was a single transaction. At beta it was decided that each measure reported should be considered a transaction, resulting in the service being reclassified as high volume and assessed by a cross-government panel.

Service users

This service is for users responsible for regulatory compliance within:

  • Large energy suppliers - large energy companies with the resources and capability to deliver GBIS or previous ECO schemes in-house with a dedicated team (for example, Scottish Power)

  • Medium energy suppliers - small to medium energy suppliers with limited resources and capability to deliver GBIS or previous ECO schemes by themselves (for example, Fox Energy)

  • Regulator, Ofgem - Internal Ofgem support/compliance analyst team

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • The team tested the service with small and medium suppliers and understood how to help suppliers know what they need to correct in their record.
  • They undertook an accessibility audit and have written their accessibility statement.
  • They introduced a customer satisfaction score.

What the team needs to explore

Before the next assessment, the team needs to:

  • Be able to explain what the end-to-end journey involves from the user’s perspective and explore in detail where pain points occur across the entire journey.
  • Build a holistic picture of how people work through the journey end to end and demonstrate how they access support when they encounter challenges or difficulties, so that user needs describe and address the journey from awareness to completion rather than a specific feature.

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has undertaken testing of the error measure search function and iterated designs based on feedback from users.
  • They have developed communications templates associated with the error measure search function.
  • They have applied a user-centred design approach to aspects of the service they considered within scope.

What the team needs to explore

Before the next assessment, the team needs to:

  • While the panel accepts that the service presented is the Minimum Viable Product (MVP), there are elements of the service that are not built and were not presented. The entire user journey must be available before the service considers a ‘Public’ launch.
  • There has been limited exploration of the creation and management of an account by the service team, as recommended in the Alpha Assessment report. The team showed improvements made to the GBIS Guidance page but testing of how GOV.UK One Login is integrated with the service and creating user profiles hasn’t taken place. The team have stated this testing is planned for early December.
  • Whilst the team have developed email correspondence templates, these haven’t been tested with users. Any correspondence / email notifications with users throughout the end-to-end journey should be tested and iterated as needed, before moving into ‘Public’ or ‘Live’.
  • There is limited evidence that the team have considered the experience of internal users, and how the proposed service will solve problems for these users. The team should work with internal users to understand their needs and iterate the service accordingly.

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team needs to explore

Before the next assessment, the team needs to:

  • The team haven’t sufficiently widened their scope to consider and test the full end-to-end journey. As such, there is a risk they don’t have full visibility of the needs and pain points of their users. The end-to-end journey may include aspects not owned by the GBIS service team, but they should be able to demonstrate how these aspects sit alongside the GBIS service, and how they’ve adapted the designs based on user feedback (if there is a need to).
  • The team displayed an understanding of the touchpoints throughout the end-to-end journey, but aspects owned by other teams haven’t been included in their testing. Whilst the service team don’t own these touchpoints, they should understand users’ experiences of these touchpoints, and collaborate with other teams to improve these where necessary.
  • The team shared a process flow that was limited to the steps that happen within the scope of their digital service. This artefact doesn’t represent the full end-to-end journey for users. The team should have artefacts that show and describe the full end-to-end service and provide context on where their service sits within that journey. Common approaches to these artefacts are service flows, blueprints and service maps.
  • The majority of the assessment focussed on the digital service, with limited detail of other channels that would be available to users throughout the end-to-end journey. The team should provide details of the support channels available to users throughout the end-to-end journey and have feedback from users on the effectiveness of these channels.

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • The team worked with users to test and iterate the design of the service.
  • They utilised established patterns from the GOV.UK design system when building the service.
  • They collaborated with the Ofgem UCD team to get feedback on designs and added recommendations to the ‘continuous improvement’ log.
  • They created a design backlog where details of future iterations/work are captured.

What the team needs to explore

Before the next assessment, the team needs to:

  • As recommended in the alpha assessment report, the team must ensure GDS guidelines are followed correctly. Areas of concern in this regard are linked to error summaries and error messaging, which aren’t consistent with the approach used in the GOV.UK design system. The team should review designs against the established patterns to ensure they’ve been implemented in the way documented in the design system. If adapting an existing pattern or using an alternative approach, provide evidence to justify this.
  • There has been no collaboration with the wider cross-government design community. Engaging with this community could help to raise awareness of any inconsistencies in the usage of design patterns and give an outside perspective on the design approach taken. Engaging with the design community for a design crit may be useful, specifically on areas of the design where there have been changes to the established patterns.
  • The team have had confirmation that their service won’t live on a GOV.UK domain, but the prototypes still use the GOV.UK styles. This could cause confusion for users as they navigate between Ofgem-branded screens and the service itself. Whilst it’s positive that the team have utilised established and proven design patterns for their service, there are specific guidelines for services such as GBIS in relation to styles: https://www.gov.uk/service-manual/design/making-your-service-look-like-govuk#if-your-service-isnt-on-govuk. It would make more sense to align with the digital design styles of Ofgem. Some elements of the GBIS end-to-end journey will use the GOV.UK styles (for example, GOV.UK One Login), and this is where testing of the end-to-end journey with users could highlight pain points and allow the service to take steps to reduce the impact of these.

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • The team increased the amount of user testing done.
  • They undertook an Accessibility Audit with DAC and are working through the recommendations.
  • They increased their understanding of digital literacy levels for external users.

What the team needs to explore

Before the next assessment, the team needs to:

  • Increase the amount of research done with users with access needs, even though the external user group may be capped at 75-100 people. It’s important that the team can build a way of addressing different needs through research outside its accessibility audit, and ideally beyond neurodiverse conditions.
  • Research with users who are familiar with the regulatory landscape but might be ‘new’ to the system: for example, users whose baseline digital skills meet the requirements of the role but whose familiarity with the role itself is low (such as a new employee). Best efforts should be made to find such users.
  • Resolve any WCAG A and AA issues highlighted in the accessibility audit before moving into ‘Public’.

Before the live assessment, the team needs to:

  • Most service guidance is provided in PDF format. Providing guidance in this format makes it difficult for users to find, poses accessibility issues and isn’t the recommended GDS approach: https://www.gov.uk/guidance/publishing-accessible-documents. The panel appreciates this is a department-related dependency, outside of the service team’s control. However, the team should work with Ofgem to understand timescales for adopting the standard GDS approach to guidance and collaborate on its development. Resolving this issue will deliver an improved service for GBIS and future Ofgem services.

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has identified a broad range of individuals/roles which typically sit outside of a scrum team and has found ways to keep these stakeholders engaged and informed.

What the team needs to explore

Before the next assessment, the team needs to:

  • The team do not have a performance analyst (see point 10). It is highly recommended that the team engage a specialist in this role to gain insight into the measures that can be taken, as well as confidence that the measures already identified meet the team’s needs in identifying areas of interest and iteration.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have considered agile principles and how these can be made more efficient and effective for the team and its broader members. The team has adopted LeSS and is working from a single backlog across multiple delivery teams.

What the team needs to explore

Before the next assessment, the team needs to:

  • The team are maintaining a pipeline of work, but need to ensure that feedback from deployments/iterations can be quickly turned into actionable improvements while maintaining the flow for all teams.

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • The team acknowledged the finding from the previous assessment that iteration of the service is required. They have also identified specific aspects of the service that they wish to investigate further.

What the team needs to explore

Before the next assessment, the team needs to:

  • As covered in point 10, the team must determine a mechanism for measuring the performance of the service against an agreed set of business standards.
  • Performance analytics should be used to drive additional design/research activity to ensure iterations improve the performance of the service and the user experience.
  • Delivery of the full scope of the service will be needed to ensure that the end-to-end service meets all needs of the users.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • The team made use of cloud-based security controls and applied appropriate measures to prevent loss of any sensitive data through controls such as external firewalls, secrets and configuration management, and managed identities for service and component access.
  • Threat modelling was carried out using the STRIDE methodology and findings were acted on.
  • The team has a plan to upgrade software components to the latest versions by January 2024, ensuring the service does not fall out of support.
  • Integration with the GOV.UK One Login solution is in place.
  • The team considered and implemented methodologies to secure document uploads.

What the team needs to explore

Before the next assessment, the team needs to:

  • Carefully revisit the manual notification approach to user offboarding, considering factors such as disabling accounts after periods of inactivity, account lockouts, and enhanced auditing and logging of user activity (a minimal sketch of an inactivity check follows this list).

  • Work closely with the Ofgem cloud team to ensure the components they use are always on the latest available versions, maintaining the optimum security posture.
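As a purely illustrative aid to the offboarding recommendation above, the sketch below shows one way an automated inactivity check could be expressed. The account shape, the 90-day threshold and the use of TypeScript are assumptions for illustration, not the team’s design.

```typescript
// Illustrative sketch only: flag accounts that have been inactive beyond an
// agreed threshold so they can be disabled automatically, rather than relying
// on manual notifications. The UserAccount shape and the 90-day limit are
// assumptions, not part of the assessed service.
interface UserAccount {
  id: string;
  lastSignInAt: Date;
  disabled: boolean;
}

const INACTIVITY_LIMIT_DAYS = 90; // threshold would need to be agreed with Ofgem security

function accountsToDisable(accounts: UserAccount[], now: Date = new Date()): UserAccount[] {
  const limitMs = INACTIVITY_LIMIT_DAYS * 24 * 60 * 60 * 1000;
  return accounts.filter(
    (account) => !account.disabled && now.getTime() - account.lastSignInAt.getTime() > limitMs
  );
}
```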

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have identified several KPIs to monitor usage of the service.
  • The team are implementing a CSAT/Feedback mechanism to capture valuable user feedback for review and service iteration.

What the team needs to explore

Before the next assessment, the team needs to:

  • The team do not have a Performance Analyst and are using a mix of their Business Analyst and Product Manager to fulfil this role. The team must ensure they have support from a performance analyst (even if this resource sits outside the direct team) to ensure they have identified and are correctly measuring and reporting the right data to assess the impact of the service during the beta phase.
  • The team must ensure that the agreed KPIs are tied back to the identified user needs of the service. This will allow them to ensure the service is meeting those user needs. A performance framework (as an example) is a good tool to use to allow the team to achieve this and ensure it is clear where the data for each metric is being sourced from.
  • Whilst the team have identified KPIs, it was unclear where, when and how they will be able to access data to assess and monitor the KPIs. The team are considering Google Analytics, Power BI and a variety of data sources but have not yet finalised the data capture and reporting points. The team must ensure that the mechanisms to capture and access the required data during beta are fully understood and in place prior to release of the service. As the beta phase is about learning how service users use the service in a controlled but ‘live’ scenario, the team may wish to request data more frequently than monthly to ensure they identify any potential pain points and iterations as early as possible within beta.
  • It was not clear how the team will be able to identify if users are struggling to access or use the service, although the team did mention reviewing telephony requests. The team must ensure all relevant data points (both online and offline) have been identified and the data availability, frequency and provision mechanisms for each channel are agreed, documented and in place prior to beta.
  • The team are planning to use Google Analytics data to understand in-service user behaviour. It is not yet known how many of the small target user base will accept cookies and therefore allow GA data to be collected (many large organisations may automatically block cookies on their devices). The team must establish the impact of cookie consent on their user base to determine whether GA data is representative enough to be used when iterating the service based on this data.
  • Key KPIs relate to the number of errors contained in each file upload, with the expectation that error volumes and reasons will reduce over time as suppliers correct common errors. The team must ensure a mechanism is in place to review both the type and volume of errors being reported by user group, so they can target guidance and iteration to support and prevent common and future errors (see the sketch after this list).
  • One of the primary aims of the service is to allow suppliers to monitor their progress against identified obligations. In order to do this, the team should investigate the viability of offering a ‘Summary’ snapshot dashboard for service users with high level usage info (for example, X measures submitted, Y measures approved, Z target measures required), if their research indicates a user need for this type of summary.
  • The team should investigate/consider monitoring service user onboarding and offboarding requests to ensure they have a clear picture of the number of actual service users. They should also consider reporting and management of users who do not interact with the service for a period of time to ensure only authorised users can access the service.
  • Whilst the portal provides detailed information concerning measures uploaded and their status, it was not clear how the process of re-uploading after correcting errors will affect what the user sees. The team should ensure that the information displayed to users is relevant and reflective of their current status, without users needing to search for specific items or required information.
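As an illustration only of the error-monitoring mechanism recommended above, the sketch below counts upload errors by supplier group and error type. The data shape, the group names and the example error type are assumptions for illustration, not the team’s data model or KPI definition.

```typescript
// Illustrative sketch only: aggregate upload validation errors by supplier
// group and error type so that common errors can be spotted and targeted
// with guidance. The UploadError shape and group names are assumptions.
interface UploadError {
  supplierGroup: 'large' | 'medium';
  errorType: string; // for example, 'missing-reference' (hypothetical)
}

function countErrorsByGroupAndType(errors: UploadError[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const error of errors) {
    const key = `${error.supplierGroup}:${error.errorType}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}
```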

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • The team made beneficial use of shared services and integrations and used a proxy-based pattern to interact with them.
  • There were numerous environments deployed and appropriate pipeline technology to progress and test artefacts and perform no downtime deployments in a safe manner.
  • Public cloud functionality was well utilised and appropriately managed by a separate team.
  • Rigorous testing at all levels was described by the team.
  • The solution was developed using modern architectural approaches and tools that were familiar within the Ofgem organisation.

What the team needs to explore

Before the next assessment, the team needs to:

  • Review the overall data capture plan. Working only with file submission and a web user interface adds validation complexity and unnecessary areas of risk, and reduces feedback efficiency compared with other methods of data ingestion such as an application programming interface (API). It is understood that the current strategic direction of Ofgem has not fully embraced APIs, that user feedback factored into building this system, and that an API would be considered once other priorities had been delivered. It is recommended, though, that this is investigated as soon as possible, as it appears to be a highly suitable use case for an API and, designed well, could bring an enhanced service to end users and overall benefits to Ofgem (a minimal sketch follows this list).
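For illustration only, the sketch below shows one shape a measure-submission API could take, giving suppliers immediate validation feedback per measure rather than on a whole monthly file. It assumes a hypothetical Express-based service; the route and field names (trustmarkReference, uprn) are invented for the example and do not reflect Ofgem’s actual stack or data model.

```typescript
// Illustrative sketch only, assuming a hypothetical Express-based API.
// A supplier posts one measure at a time and receives validation feedback
// immediately, instead of waiting for a monthly file upload to be processed.
import express from 'express';

const app = express();
app.use(express.json());

// Hypothetical endpoint and field names - not Ofgem's actual design.
app.post('/api/measures', (req, res) => {
  const measure = req.body ?? {};
  const errors: string[] = [];

  if (!measure.trustmarkReference) {
    errors.push('trustmarkReference is required');
  }
  if (!measure.uprn) {
    errors.push('uprn is required');
  }

  if (errors.length > 0) {
    res.status(400).json({ errors });
    return;
  }
  res.status(202).json({ status: 'received' });
});

app.listen(3000);
```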

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before the next assessment, the team needs to:

  • Consider coding in the open and the advantages it could bring.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • The team utilised established patterns from the GOV.UK design system when building the service.
  • The team have made use of the GOV.UK Notify and One Login platforms.
  • The team are utilising services such as Ordnance Survey to provide geospatial information and TrustMark to validate information.
  • The team provide a data dictionary to ensure that standard data formats are used.
  • The team utilise several common standards and patterns within the system.

What the team needs to explore

Before the next assessment, the team needs to:

  • Keep under review the use of CSV as the file format. Whilst it is a widely used and common pattern, if the service receives extremely large single-file submissions, processing efficiency and validation can easily be compromised (a streaming validation sketch follows this list).
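Purely as an illustration of one mitigation, the sketch below validates a large CSV submission line by line with a streaming reader, so the whole file never has to be held in memory. The expected column count and the use of Node’s readline module are assumptions; a production implementation would also need a proper CSV parser for quoted fields.

```typescript
// Illustrative sketch only: stream a large CSV file line by line so that very
// large submissions do not have to be loaded into memory in one go.
// The 12-column expectation is an assumption for illustration, and a real
// implementation would use a proper CSV parser to handle quoted commas.
import * as fs from 'fs';
import * as readline from 'readline';

async function validateCsv(path: string): Promise<string[]> {
  const errors: string[] = [];
  const reader = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity,
  });

  let lineNumber = 0;
  for await (const line of reader) {
    lineNumber += 1;
    if (lineNumber === 1) continue; // skip the header row
    const columns = line.split(',');
    if (columns.length !== 12) {
      errors.push(`Row ${lineNumber}: expected 12 columns, found ${columns.length}`);
    }
  }
  return errors;
}
```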

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • The team carefully considered the overall architecture and availability of the system following widely recommended highly available architectural patterns.
  • Data management and backups are well-managed, and data is suitably replicated.
  • Logging and monitoring have been designed to enhance overall observability of both application and infrastructure, and teams are in place to respond to and be able to fix issues quickly.

What the team needs to explore

Before the next assessment, the team needs to:

  • Review the cost and sustainability of the system. Whilst it is applauded that the system is built with multi-region failover, the team should question, given the windows of use and the nature of the platform, whether an always-active disaster recovery environment is required and is the most cost-effective and sustainable solution, given the system’s RTO (recovery time objective) and RPO (recovery point objective) requirements.

Next Steps

The service must undergo a re-assessment against the points of the Standard that were not met at this assessment (1-6, 8 and 10). The recommendations for the ‘met’ points are looking ahead to the public beta phase.

Updates to this page

Published 12 July 2024