Check your energy deal alpha reassessment

The report from the alpha reassessment for Ofgem's check your energy deal service on 25 July 2017.

Service Standard assessment report

Check Your Energy Deal

From: Central Digital and Data Office
Assessment date: 25/07/2017
Stage: Alpha
Result: Met
Service provider: Ofgem

The service met the Standard because:

  • User needs have been explored and understood
  • There is strong evidence to suggest that the solution the team plan to explore in private beta will meet key user needs
  • The team have a good approach to service governance and agile delivery

About the service

Description

The service gives an authoritative and independent view of whether a user on a standard variable tariff has a bad energy deal and links to cheaper ones that they can switch to quickly and easily.

Service users

The users of this service are domestic energy customers.

Detail

Ofgem have a strong service team working on this service, and they have made considerable improvements since their last assessment. They have done enough to justify moving into private beta. In their approach to the private beta, the service team should ensure they follow the recommendations the panel have made in this report.

User needs

The service team have worked hard in alpha to understand why people stay on more expensive tariffs and the barriers to switching. They expressed what they learned in several clear and well-evidenced behavioural profiles, and have used these to work through a number of service concepts.

The team have tested their prototype with a large number of likely users, and have some evidence that the service they have designed will be usable. They tried different wordings and determined which were most effective. There was less evidence of trialling different interface options to achieve the same goal and iterating based on the findings.

The team has concentrated on testing and iterating the path for users who have been on an expensive tariff for 3+ years - their main targets. They have recently done pop-up and lab testing with other potential users who could save money but who could not use the primary service (because they are out of scope, so Ofgem has not received data about them in advance). However, this exercise seemed limited to seeking agreement with a proposed design suitable only for private beta - one incorporating a form to capture contact details for re-invitation later (though participants’ suitability was not checked). It is also clear that this path does not meet the needs of the many people who will never be eligible but could still save money immediately - something the panel previously asked to be addressed.

The team have made a good start by including 8 people with lower digital skills in their research and testing, among them a user with dyslexia and another with arthritis. Although family members usually act on their behalf, these users completed the user journey themselves.

They could do more focussed research with potential users who lack confidence using digital services, and who are therefore unlikely to use existing digital price comparison and supplier switching services. The same applies to research with people with access needs, to meet the public sector equality duty of eliminating the disadvantage caused by inaccessible services.

The inclusion of accessibility clauses in tenders and the intention to conduct an audit of the end-to-end journey were excellent.

Plans for beta have matured: it will be invitation-only, covering 11,000 EON customers in 5 Northampton postcodes. Ways to manage attempted usage by other people need to be developed further. Qualitative and quantitative research is planned around a geographically central Citizens Advice Bureau that has been contracted to provide assisted digital support. Feedback will also be sought via an inline satisfaction survey with telephone follow-up, supplemented by data about switchers from EON. The intention to research more users with access needs, including in their homes, is welcome.

The planned mass initial invitation to the 8-week private beta carries risks and limits experimentation, due to the short timescale and mainly one-off approach. Phasing communications would enable the service team to see how each iteration impacts the behaviour of users.

The team should also be alert to whether take-up by the sample is large enough and whether it represents the wider target population, and consider widening the scope of research as necessary.

Before private beta begins, the team need to learn whether the service is effective for likely users, including those out of scope, with a version that uses real tariffs and customer data. Research into issues such as whether to present information about cheap deals on the Ofgem site or a partner’s site should inform commercial discussions.

Team

The panel continue to be impressed with how the team have approached service governance and delivery. The team are co-located and work in fortnightly sprints. They conduct regular retrospectives, and had good examples of retrospective actions leading to positive changes in team structure and delivery approach. The product manager is empowered to make decisions about service design. Senior management and policy professionals participate in user research and showcases. Overall, the panel have confidence in the team’s ability to deliver.

The panel are pleased to see that the team now have a dedicated designer and content designer in post since the last assessment.

The panel were disappointed that they were unable to view the prototype before the assessment. The team should make a separate stable version of their prototype available to the panel in advance so that preparation for the assessment can be properly completed ahead of time. The panel would also like to see the team better prepared for future assessments, with documents and different prototype versions pre-loaded so that time is used effectively.

Technology

The team have taken a very lightweight approach to technology, prioritising building prototypes to learn from users. Beta will have a different architecture and software; the panel encourage this approach, and the team should continue to iterate through beta while continuing to use prototypes.

For beta, the team will need to integrate data from what may be a number of different energy suppliers, and had mature thoughts on reusing existing standards and methods for exchanging data with organisations external to Ofgem.

There is a tension in the design of the service between making the data easily available to users in order to nudge their behaviour, and making it easy for a scraper to obtain the data in bulk. The team has a good relationship with the Senior Information Risk Owner (SIRO). They have conducted a privacy impact assessment and threat-modelling on the data, and engaged a consultancy to assess how knowledge of a household’s current energy supplier and address could be used against an individual, and how making the data available in bulk could be used by a bad actor to identify individuals to target. The team sought and received guidance from the Information Commissioner’s Office (ICO) on the privacy implications of the service and how it should behave in light of the forthcoming General Data Protection Regulation (GDPR), in particular around retention of personal data.

The team offered some mitigations against bulk-scraping the service.
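The report does not describe the specific mitigations the team proposed. One common, illustrative approach (a sketch, not the team’s actual design) is per-client rate limiting with a token bucket, which allows normal interactive use while raising the cost of bulk scraping:

```python
import time


class TokenBucket:
    """Per-client rate limiter: permits bursts of up to `capacity`
    requests, then refills at `rate` tokens per second."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# One bucket per client identifier (e.g. session or source IP address).
buckets: dict = {}


def check_request(client_id: str) -> bool:
    """Return True if this client's request is within its rate limit."""
    bucket = buckets.setdefault(client_id, TokenBucket(capacity=5, rate=0.5))
    return bucket.allow()
```

The capacity and refill rate here are arbitrary placeholders; in practice they would be tuned so that a genuine user checking their own tariff is never throttled, while a scraper enumerating many households is.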

The list of regulated energy suppliers and other reference data used by the service should be made available as GOV.UK registers.

To meet point 8 at Alpha, we would expect the team to explain how they plan to make all new source code open and reusable. This was not done. The team has yet to publish any code as open source. They must do this to pass the next assessment.

Design

The team has made significant improvements to the design of the service since the previous assessment. The prototype more closely follows the Ofgem brand, and the team were better able to articulate which Ofgem standards they needed to follow, and which they would need to define themselves. There will need to be further improvement as the team progresses to building a beta service - both in iteration and in design quality. The desktop view in particular needs further attention.

Where Ofgem standards do not exist, the panel recommends the team adopt GOV.UK design patterns, which are well tested. These include patterns for form elements, large font sizes, and form validation.

Since the previous assessment the team had done some testing of unhappy paths of the service. The team were confident that users understood the signposting about not being eligible, though this appeared to relate to the mention of restricted postcodes. The panel were looking for the service team to demonstrate the viability of a public service that is accessible to only one third of households - but the team seemed to have focused on testing the unhappy paths for their private beta. As such, the panel still has concerns that long-term viability has not been demonstrated, and advise the team to do more testing in this area, particularly before considering public advertising through social media channels.

There may be alternative design solutions for users without data - such as integrating with a price comparison website in a less customised way - but the team have not considered or explored other options. Such a path may still get some users to switch, and presents less of a hard stop than telling them they’re not able to use the service.

The team had added a form allowing users to be notified when the service becomes available to them - but this is mostly relevant during private beta, as the service is unlikely to get data for the other 2/3 of households. The team should consider whether this will give users unrealistic expectations - or worse, delay them switching - whilst they wait to be contacted.

The team showed some examples of changes they had made, but did not seem well prepared to present and discuss them. Examples of iteration generally included showing or not showing a page, rather than exploring different solutions to a design problem. The panel would like the team to explore a wider variety of concepts in addition to testing with and without pages or lines of content.

The supplier picker page had improved significantly since the previous assessment - though the panel still questions whether it is needed. The team stated it was not needed for security reasons, though Ofgem lawyers ‘preferred’ it. Their stated reason for including the page was that users needed reassurance the results related to ‘their’ data. The team had tested with and without the page, and concluded it helped. The panel believes this is one possible solution - but that there may be others that don’t require the user to go through unnecessary steps and could be explored.

Significant details in the design of the service remain unresolved - including the deals and sign-up pages. The team does not know what information these pages need to contain, nor whether they’ll be hosted by the service, the price comparison partner, or the end utility. With such significant parts of the service still to be defined, the panel believes it is unrealistic to expect to build and test a production service including these details in two months.

Service design

This is a new digital service. There are no non-digital channels to support or transition from.

There is no Minister to test this service with, but it has been tested by the Ofgem Chief Operating Officer.

Analytics

There were no analytics in place on the prototype. This is not a concern because there are not yet any real users, but analytics should be in place for beta. The panel were pleased with the team’s ideas for how the success of the service will be measured. We hope to see the team gathering accurate data from suppliers on switches completed as well as initiated. The team have started engagement with the performance platform, but not to the standard required at an alpha assessment.

Recommendations

To pass the next assessment, the service team must:

Mandatory items that must be completed in order to meet the standard.

  • Demonstrate the viability of a service that won’t have data for 2/3 of users. This should include prototyping, testing and iterating on the unhappy paths of the service, or exploring how the service could support these users as well. This research should be independent from testing the private beta journey.
  • Ensure appropriate care is taken to secure and protect data submitted to the service.
  • Explore options for how handoff with 3rd party services will work.
  • Ensure all code is open source.
  • Set up a dashboard on the performance platform.
  • Find technical solutions to measure desired KPIs.
  • Conduct an accessibility audit and resolve any issues it identifies.
  • Conduct usability testing with users with access needs.

The service team should also:

These are non-mandatory items that would improve the service.

  • Consider planning for significantly more time to build and test the production service, define key journeys, and ensure the service is accessible.
  • Consider expanding their private beta to last for longer than the 8 weeks initially proposed (the panel feels that 6 months is a more realistic timeframe to complete several iterations, with user feedback and data).
  • Ensure access to private beta is restricted and that it is not findable by the public or search engines.
  • Investigate operating the list of regulated energy suppliers and other reference data as registers.
  • Explore whether the service can integrate with price comparison websites for the remaining 2/3 of users without data (showing a less customised experience).
  • Explore options for how user questions about how the service works may be answered.
  • Explore and test options for how users may save or otherwise transfer results.
  • Explore different ways that users can be reassured the data is theirs.

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Not met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 24 July 2018