Renewable Electricity Register alpha assessment report

The report from the alpha assessment of Ofgem's Renewable Electricity Register service on 9 January 2020.

From: Central Digital and Data Office
Assessment date: 9 January 2020
Stage: Alpha assessment
Result: Met
Service provider: Ofgem

Service description

A service aimed at supporting users in the administration of their obligations and activities within three government environmental schemes:

  • Renewables Obligation (RO)
  • Renewable Energy Guarantee of Origin (REGO)
  • ROO-FIT

To provide this service, we need to support users by:

  • maintaining a record of renewable electricity generating stations accredited within these schemes
  • validating data submissions for generated electricity
  • issuing Renewables Obligation Certificates (ROCs) and Renewable Energy Guarantees of Origin (REGOs) for eligible renewable electricity generated
  • ensuring compliance by obligated energy suppliers.

Service users

RO, REGO and ROO-FIT scheme participants, as well as Ofgem internal users who administer the schemes:

  • generators who own or operate renewable energy generating stations and claim support under the relevant schemes
  • agents who act on behalf of generators to claim support, completing all their tasks and interactions with Ofgem
  • traders who buy and sell ROCs
  • licensed electricity suppliers, who are obliged to present ROCs to Ofgem annually to demonstrate that a proportion of the electricity they have sold was generated from renewable sources.

Assessment overview

Overall, the panel was pleased to see an enthusiastic and passionate service team, and was impressed with the amount of work completed during the Alpha phase.

The service team clearly conveyed the potential for improvement on the current service in terms of opportunities to better meet user needs, provide a high quality user experience, and meet Ofgem’s business goals. The development of this service can be seen as an important part of the Government’s commitment to meet a net-zero target for greenhouse gas emissions.

The existing service encompasses several different schemes, a variety of users and needs, and a number of different user journeys, some with complex business logic. These factors increase the risk around developing the new service and migrating users during the Beta phase.

In the next phase of delivery, it is recommended that the team:

  • prioritise developing the new service for specific schemes, personas and/or journeys, and release these to users incrementally (instead of back-loading the migration / cut-over plan for the full functionality) - see point 5
  • ensure there is adequate user research resource in beta to collect usability testing feedback and explore assisted digital and accessibility needs in greater depth
  • if they plan to continue with the big-bang cutover approach, work immediately to make sure they have willing users to “double submit” their returns during the critical September/October testing period (a sketch of this double-submission check follows this list).
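
The “double submit” idea amounts to a reconciliation check between the two systems during dual running. As a minimal sketch, assuming hypothetical submission identifiers and result values (none of this reflects the team's actual tooling):

```python
# Minimal sketch of a dual-running reconciliation check. The function and
# the figures are hypothetical; real comparisons would cover the full
# submission, not a single certificate count.

def reconcile(submission_id: str, old_system: int, new_system: int) -> bool:
    """Return True if both systems issued the same number of certificates."""
    if old_system != new_system:
        print(f"{submission_id}: old={old_system}, new={new_system} - investigate")
        return False
    return True

# Example: a generator's September return submitted to both systems
assert reconcile("RO-2019-0001", 120, 120)
```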

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were able to demonstrate changes made to the service based on user research
  • the service has been tested with all user types, and plans are in place to test with external users of assistive technology and to test the service on different devices
  • the team has recognised the need for support, and has plans in place to provide assisted digital support where needed.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the service is tested on different devices, and that all journeys are tested with all types of users, including users of assistive technology
  • run research sessions with people’s real data, once it’s migrated from the old system.

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • a good plan is in place for ongoing user research
  • the team has a plan to explore assisted digital and accessibility needs in beta.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure there is a full time user researcher in the team throughout the beta development
  • explore more fully whether API/batch submission capability would meet a user need.

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the expected set of skills and team members was in place for the duration of the Alpha phase, with a clear separation of roles and a shared understanding of roles and responsibilities
  • the team have ambitious plans to evolve the existing service and create better outcomes for users, and have a Service Owner with clear vision for this
  • the team reports that they are empowered to make decisions based on their plan to meet user needs, and create a high quality service for users.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there is adequate User Research and Design support throughout the Beta phase, to be able to deliver the ambitious scope
  • make a plan for knowledge transfer and handover to ensure knowledge is not lost when supplier team members leave the service team. This could be especially important when bringing in new software developers given the ambitious scope of the Beta phase.

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are following a regular cadence of agile ceremonies which they use to prioritise, plan and deliver their work, and to share progress with the wider organisation (including stand-ups, sprint planning, show and tells and retrospectives)
  • the use of various tools and collaboration platforms (including Slack, Azure DevOps, Surface Hubs and Skype) has allowed the team to successfully incorporate elements of distributed working
  • the team is using good development practices, including pairing, code review and modelling.

What the team needs to explore

Before their next assessment, the team needs to:

  • work with stakeholders and the wider organisation to explore if any changes to the programme governance and reporting model are required to better support the agile development and running of the service
  • create an outcomes focussed roadmap for the Beta phase, with a backlog of user stories developed through iterative processes, and informed by user research and data.

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team completed 5-day User Research and Design sprints throughout the Alpha phase, with User Research sessions informing the scope of the following sprint, and weekly planning sessions to agree areas of focus
  • the Alpha phase has provided clear evidence of what didn’t work for users, and what has been retained and improved
  • the team makes collaborative prioritisation decisions on the next most important activity or area to explore.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how to deliver regular iterations and deployments to the service throughout the Beta phase. The team acknowledges that the current migration / cut-over plan for moving users to the new service is technically complex and high risk. Currently, this risk is entirely back-loaded into October, which made it difficult for the panel to have confidence in the team’s delivery capability. The team should re-explore opportunities to deploy smaller sections of the service for specific user groups earlier in the Beta phase, both to de-risk the migration / cut-over plan and to deliver value early and often, testing that the service meets user needs (a sketch of one incremental approach follows this list). This may be difficult for the team to achieve, but so would implementing the entire service in one go over a single weekend
  • ensure their use of progressive enhancement is appropriate, and that they are not delivering critical or important functionality that is not accessible to some users
  • consider the best way to migrate users to the new service with minimal impact and confusion. Currently, it is likely that every user will have to re-register over a period of a few days which could cause significant strain on Ofgem’s support capability
  • be wary of viewing the beta phase as implementing what was designed in the alpha phase. There is a lot to be learned through the process of building and testing something with users; it’s important that this continues in beta.
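
One possible shape for such an incremental release, sketched below as hypothetical Python (the flag store and routing function are illustrative, not the team's design), is to move one scheme at a time onto the new service while the others stay on the existing system:

```python
# Hypothetical sketch of scheme-by-scheme release using a simple feature
# flag. The scheme names are real; everything else is illustrative.

NEW_SERVICE_ENABLED = {
    "REGO": True,     # released early to a smaller group of users
    "RO": False,      # still served by the existing system
    "ROO-FIT": False,
}

def route_request(scheme: str) -> str:
    """Send users of migrated schemes to the new service, others to the old one."""
    return "new-service" if NEW_SERVICE_ENABLED.get(scheme, False) else "existing-service"

assert route_request("REGO") == "new-service"
assert route_request("RO") == "existing-service"
```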

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team showed that they had gone to considerable effort to evaluate fully whether it was better to build or to buy the service, before deciding that the amount of customisation required if they bought would create unnecessary complexity
  • the service team ran a number of technical spikes during the discovery phase to evaluate their planned approach for a number of systems, including identity, data modelling and the migration plan
  • the service team were able to clearly demonstrate the evaluation process they used on a single component (notifications), including demonstrating that they had considered other services before deciding that an existing Ofgem service met their needs.

What the team needs to explore

Before their next assessment, the team needs to:

  • increase the scope of their technical spikes and focus on how different components will integrate and work together to ensure the entire service will function as planned, rather than just individual components.

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team had a good understanding of the security and privacy threat model that the service will face
  • the service team could show some consideration of the potential fraud vectors they may face when the service goes live
  • the team are planning to build a transparent audit log that all users can see, to ensure data integrity in the system
  • the team have considered how they will introduce antivirus (AV) scanning into their document processing workflow, which (if not properly configured) could represent a significant vulnerability in their service (a sketch of one scanning approach follows this list).
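
As a minimal sketch of how such a scan could gate the workflow, assuming a running ClamAV daemon and the open source clamd Python package (the rejection behaviour is illustrative only):

```python
# Hypothetical sketch: reject any uploaded document that ClamAV flags,
# before it enters the processing workflow. Assumes a clamd daemon is
# listening on its default Unix socket.
import clamd

def scan_upload(file_obj) -> None:
    """Raise if the uploaded file (opened in binary mode) is not clean."""
    scanner = clamd.ClamdUnixSocket()
    status, signature = scanner.instream(file_obj)["stream"]
    if status != "OK":
        # e.g. quarantine the file and alert the security team
        raise ValueError(f"upload rejected: {signature}")

# Usage: with open(path, "rb") as f: scan_upload(f)
```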

What the team needs to explore

Before their next assessment, the team needs to:

  • more fully describe their approach to security and risk management
  • consider how they segregate audit and security monitoring from the operations team so that they cannot be tampered with, especially given the current plan to build certificate audit into the application rather than as a separately secured service
  • follow the NCSC guidance on safely importing data (a sketch of one validation step in this spirit follows this list): https://www.ncsc.gov.uk/guidance/pattern-safely-importing-data
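
A core idea in that guidance is to rebuild imported data into a known-good form rather than trusting it as received. As a minimal sketch, assuming submissions arrive as CSV and using hypothetical field names:

```python
# Hypothetical sketch of a transform-and-verify import step: each record
# is rebuilt from scratch, keeping only expected, re-typed fields, and
# anything that does not conform is rejected. Field names are illustrative.
import csv
import io

EXPECTED_FIELDS = {"station_id": str, "period": str, "mwh_generated": float}

def import_submission(raw_bytes: bytes) -> list:
    """Parse an uploaded CSV, keeping only validated, re-typed values."""
    reader = csv.DictReader(io.StringIO(raw_bytes.decode("utf-8")))
    clean = []
    for row in reader:
        # raises ValueError/KeyError on missing or malformed fields;
        # unexpected extra columns are silently dropped
        clean.append({f: cast(row[f]) for f, cast in EXPECTED_FIELDS.items()})
    return clean
```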

8. Make all new source code open

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has made considerable use of open source code and patterns provided by other public sector bodies, including MoJ and BEIS.

What the team needs to explore

Before their next assessment, the team needs to:

  • make all new source code open and reusable, and publish it under appropriate licences, so that their patterns and services can be shared with other organisations (see also point 9).

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has made extensive and consistent use of existing platforms, patterns and standards throughout the service
  • the service team has a strong focus on not rebuilding existing functionality, but reusing it where they can
  • the service team plans to share their own patterns and services with other teams within Ofgem
  • the microservice architecture that the service team have chosen will make adopting and changing open standards and common platforms much easier.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether API/batch submission functionality would meet a user need, especially considering the service will exist in some form for at least 20 years and the easiest time to add this functionality would be during Beta
  • make all new source code open so that they can share their own patterns and services with other organisations.

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team already has an effective deployment environment, and Ofgem has proven capability to develop and deliver in that environment
  • the service team understand the systems they intend to build and the testing requirements they will have for those systems
  • the team plan to augment their testing with automated tools for static scanning and for accessibility compliance
  • the team have a clear focus, shared with the rest of Ofgem, on increasing test coverage within their code.

What the team needs to explore

Before their next assessment, the team needs to:

  • re-arrange their delivery plan so that there is more testing with users on a live system earlier in the process, which will identify potential issues much more quickly
  • start testing with the new data model as soon as possible, so as to find potential areas where it may be less effective than the previous data model
  • start testing with the Ofgem design system equivalent as soon as possible, as there is a risk that some patterns they have used may not be compatible with Ofgem branding and design guidance
  • significantly increase testing plans if they intend to continue with the big bang migration approach, and ensure that testing time is not cut down to keep to the timeline if development is delayed
  • give more examples of the unhappy path at future assessments (a sketch of an unhappy-path test follows this list).
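
As an illustration of the kind of evidence the panel has in mind, the sketch below tests unhappy paths with pytest against a hypothetical validation function standing in for the service's real logic:

```python
# Hypothetical sketch: unhappy-path tests for a data submission journey.
import pytest

def validate_generation(mwh: float) -> float:
    """Placeholder for the service's real validation logic."""
    if mwh < 0:
        raise ValueError("generated electricity cannot be negative")
    return mwh

@pytest.mark.parametrize("bad_value", [-1.0, -0.01])
def test_rejects_negative_generation(bad_value):
    with pytest.raises(ValueError):
        validate_generation(bad_value)
```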

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team have already considered options for continuing to run the service if it goes offline
  • the service team were able to explain how users would be affected if their service was unavailable for any length of time.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to prepare and test offline plans for the service, including ensuring that Ofgem’s support capability would be able to handle the increased activity.

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have worked really hard to make good, considered design decisions throughout what is a big service; their approach during alpha started with the core of the user journey, and expanded from there
  • the language used in the service manages to be clear and straightforward, but uses the appropriate technical terms where users will be familiar with them
  • the team have created some nice interactions to make the process clear and faster, for example the calculator.

What the team needs to explore

Before their next assessment, the team needs to:

  • think about the onboarding journey more carefully
  • try to avoid having 6 green buttons on the start page, and needing to explain the mechanics of users vs. accounts
  • spend more time designing for users who will be claiming existing accounts, rather than completely new users (since users of the existing system will, overwhelmingly, be in the majority)
  • make it straightforward to claim an account and verify that data from the old system is what’s expected
  • think a bit more about the journey for small generators in Northern Ireland – can there be a simplified version of the service that meets just their needs?
  • make it clearer what is required as supporting evidence – it’s very open-ended – this guidance should appear where the user is entering the information.

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • while the service won’t be GOV.UK branded, the team has made good use of patterns and components from the GOV.UK Design System, and the panel recommends that this continues during beta.

What the team needs to explore

Before their next assessment, the team needs to:

  • engage with the GOV.UK Design System team to work out the best way to use components and page templates from the Design System without using the GOV.UK branding
  • do research to check that the chosen branding is perceived as official and trustworthy.

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team provided clear evidence of the benefits and improvements of the new service over the existing service and offline/paper-based channels.

What the team needs to explore

Before their next assessment, the team needs to:

  • test the new service with more users with a wider variety of assisted digital needs
  • create an assisted digital support model
  • continue to estimate and quantify the support requirement for the new service (for first-time users, migrated users and business as usual usage) and work with the relevant groups in Ofgem to manage these impacts
  • consider the resilience of the offline / paper-based processes and whether these need to be refreshed or tested (the team reported some of the offline / paper-based processes have not received a request for around 5 years).

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have used quantitative data from the existing service to inform their user research and design for the new service (despite this being difficult to get from the current system and incomplete in some areas)
  • the team have identified the data sources they’ll use, including Google Analytics and the service database.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that data will give insight into the full range of user journeys, not just the ‘happy paths’
  • demonstrate they have an ongoing roadmap for performance analysis, for example considering how data collected during the dual-running period can feed into the product backlog
  • keep up to date with the latest guidance around data protection, for example on the use of cookies.

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has followed a custom approach to identify the most important key performance indicators for the service (in addition to the 4 mandatory KPIs)
  • the key performance indicators include a balance of metrics measuring quality of outcome, quality of service and technical performance, and the team are developing baseline measures for these.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider running a ‘performance framework session’ (details shared with the service team) to further explore the relevant key performance indicators. The team has followed a good approach in this area. Running a performance framework session may help them to further develop their own thinking or highlight additional metrics to monitor
  • consider how they will use the insights generated from analysing the key performance indicators to continue to iterate and improve the service.

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have had initial contact with the Performance Platform team at GDS
  • the team has started to consider and plan how to report on the 4 mandatory key performance metrics, and additional metrics which are required to judge the success of the service.

What the team needs to explore

Before their next assessment, the team needs to:

  • make progress in agreeing the specific service definitions needed to calculate and report on the 4 mandatory key performance indicators (for example, the definition of a transaction and of transaction cost); a sketch of these calculations follows this list
  • confirm which data will be used for each of the KPIs, including a feature to measure user satisfaction, and identify opportunities for automation where possible.
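
The 4 mandatory key performance indicators for government services are cost per transaction, user satisfaction, completion rate and digital take-up. Once the definitions are agreed, the calculations reduce to simple ratios; the sketch below uses entirely made-up figures for illustration:

```python
# Hypothetical worked example of the 4 mandatory KPIs. All figures are
# invented placeholders, not Ofgem data.
completed = 9_000         # transactions completed digitally
started = 10_000          # transactions started
offline = 500             # transactions handled via offline/paper channels
running_cost = 45_000.0   # total cost of running the service (GBP)
satisfied = 7_200         # users rating the service satisfied or above
surveyed = 8_000          # users answering the satisfaction question

cost_per_transaction = running_cost / completed       # 5.00 GBP
completion_rate = completed / started                 # 0.90 (90%)
digital_take_up = completed / (completed + offline)   # ~0.947 (94.7%)
user_satisfaction = satisfied / surveyed              # 0.90 (90%)
```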

18. Test with the minister

Point 18 of the Standard does not apply at this stage.

Updates to this page

Published 24 January 2020