Check your energy deal alpha assessment
The report from the alpha assessment for Ofgem's check your energy deal service on 9 May 2017.
Service Standard assessment report
Check your energy deal
From: Central Digital and Data Office
Assessment date: 9 May 2017
Stage: Alpha
Result: Not met
Service provider: Ofgem
To meet the Standard the service should:
- Demonstrate the viability of a public service that is only accessible to one third of households and explore the unhappy paths of the service
- Iterate on more areas of the HTML prototype and improve the quality of the design
- Seek extra assurance regarding the type of user data made available and how it can be made secure
About the service
Description
The service gives an authoritative and independent view of whether a user on a standard variable tariff has a bad energy deal and links to cheaper ones that they can switch to quickly and easily.
Service users
The users of this service are domestic energy customers.
Detail
The service team have conducted a thorough Discovery and begun their Alpha well. However, the panel feel that the team have come in for their Alpha assessment a couple of months too soon. They should spend more time in Alpha iterating on their prototype and exploring potential solutions so that they can de-risk their proposition and maximise the probability that their service succeeds in Beta.
User needs
The service team have worked hard in alpha to understand the reasons that people stay on more expensive tariffs and the barriers to switching. The team has expressed what it learned in a number of clear and well evidenced behavioural profiles. The team has used what it learned to work through a number of service concepts.
The team have tested their prototype with a large number of likely users, and have some evidence that the service they have designed will be usable. The team has only tested and iterated the path for users who have been on an expensive tariff for several years - their initial target user group. They have not tested the service with other likely users, who the panel expect will account for the majority of users visiting the service.
The team have made some efforts to include people with lower digital skills in their research and testing. However, they have done little focused research with potential users who lack confidence using digital services, and who are therefore unlikely to use existing digital price comparison and supplier switching services. The team did not have a clear plan for researching and testing their support model in beta.
The team have done no research with disabled people to understand current barriers to switching, or to explore how a digital service might help. This is a concern with respect to the public sector equality duty as the service could give a financial saving to people who can find and use the checker, and disadvantage those who cannot.
To learn whether the service is effective for likely users, the team will need to develop the service to use real customer data and real alternate tariffs. The team should do this through a limited private beta, before scaling to its currently planned geographic beta. Extra work is needed to understand the needs of users who Ofgem do not have data on before a private beta can begin.
Team
This is the first digital service Ofgem have worked on, and they should be commended for adopting agile delivery best practice in their approach to service governance and delivery.
The team are co-located and work in fortnightly sprints. They conduct regular retrospectives, and had good examples of retrospective actions leading to positive changes in team structure and delivery approach. The product manager is empowered to make decisions about service design. Senior management and policy professionals participate in user research and showcases. Overall, the panel have confidence in the team’s ability to deliver.
The team have lacked a dedicated designer so far, with the product manager filling a service design role. This has had a negative impact on the team’s ability to try out alternative concepts, with the HTML prototype evolving little over the course of the Alpha. They will need a dedicated designer and front-end developer to pass a Beta assessment. The team is also heavily reliant on consultants. The panel is pleased to see that the team plans to train a member of the Ofgem team in the product owner role, and hopes to see him playing a key role in the next assessment.
The panel were disappointed to be unable to see previous versions of the prototype during the assessment. In future assessments, the team should come prepared to demonstrate more clearly how they have iterated on their service.
Technology
The team have taken a very lightweight approach to technology, prioritising building prototypes in PHP and HTML to learn from users. The beta will have a different architecture and software, an approach the panel encourage; the team should continue to iterate this through beta as well as continuing to use prototypes.
For beta the team will need to integrate data from a number of different energy suppliers. The team had mature thoughts on reusing existing standards and methods for exchanging data with organisations external to Ofgem.
There is a tension in the design of the service between making the data easily available to users in order to nudge their behaviour, and making it easy for a scraper to obtain the data in bulk. The team has a good relationship with the Senior Information Risk Owner (SIRO), and has conducted an internal privacy impact assessment and threat modelling on the data.
The panel raised several significant concerns about personal data collection, storage, and the potential for abuse. More detailed comments were submitted directly to the service team.
The list of regulated energy suppliers and other reference data used by the service should be made available as GOV.UK registers.
The team has yet to publish any code as open source and must do so to meet point 8.
Design
The team have spent much of their discovery and alpha exploring different service propositions and doing contextual user research to understand the reasons people remain on costly standard variable rate tariffs. This work has enabled the team to identify a proposition that is more likely to result in users switching to a cheaper tariff, and has focused on reducing barriers to people considering switching.
The panel were pleased to see the consideration given to the different service propositions before beginning work on one in particular. Exploring radically different concepts is rarely seen, and gives confidence in the chosen proposition. However, there appeared to have been little substantive iteration on the chosen proposition, or exploration of the different ways it could be delivered.
The prototype demonstrated feels closer to a proof of concept demo than one that has been iterated to explore design options. Whilst it is likely heading in the right direction (supported by initial user research on propositions), the team have not explored the concept sufficiently to know enough to commence the beta phase. The prototype has usability issues and does not yet have sufficient design quality to feel like a real Ofgem service.
The team had been designing with a mobile-first methodology, and concentrated on mobile users in user research. This is an excellent strategy, ensuring important information is prioritised. However, the desktop experience felt forgotten, and the team acknowledged that limited work or research had been done on it. The team should ensure that user research covers a variety of use cases, and not rely entirely on user preference during research.
User data and failure paths
Initially, the service will only have data for approximately one third of households - those with standard variable rate tariffs for 3 years or more. Given that up to two thirds of households will not be able to use the service, the panel were disappointed to see this journey (failure path) had not been explored. The panel has concerns that not testing this journey means that the team has not learned about the user needs in this area or whether the overall proposition will work.
Additionally, the team indicated they planned to use social media and advertising to promote the service. The panel are concerned about how users may respond if two thirds of people are turned away - and what negative reaction this may provoke from users on social media.
Dedicated designer
During alpha the team had a designer working half time on design, whilst also covering front-end prototyping. Services should have a dedicated designer (points 3 and 13) working on the project. The lack of a dedicated designer has likely resulted in minimal iteration of the design, and little exploration of alternate concepts to take to research. The team indicated they had concerns about the handoff from the service to supplier pages, but no design exploration had occurred in this area.
Ofgem, branding, and design quality
The service is operated by Ofgem and as such does not need to follow the GOV.UK brand. However, the service will need to follow the Ofgem brand and feel like an official service (point 13). The team stated this would be the first citizen-facing service Ofgem have produced - however they conceded that no work had occurred yet to ensure the service followed the Ofgem brand, other than adding an ‘Ofgem’ logo to each page. The team needs to spend more time on the design to ensure that visual elements follow the Ofgem brand, whilst typography, layout, white space and other styles are improved to ensure the service feels official. The panel recommends the team engage with other non-GOV.UK teams (such as NHS, Petitions, ONS, Ofsted) to see how they have adopted GDS patterns for usability and accessibility whilst working with another brand.
Experimentation and iteration
The team demoed v4.1 of their prototype, and made previous prototypes available to the panel after the assessment. Beyond testing an alternate proposition, the panel did not see evidence that the team had significantly iterated the concepts or explored design options. The panel would have liked to have seen more experimentation during the alpha phase - it helps uncover needs and test assumptions. A dedicated designer would likely have challenged the team to try more ideas in research.
Areas to explore include:
- different ways to tell users they are on an expensive tariff
- different ways of answering user questions about how calculations are done
- different ways of presenting ‘deals’ and the associated savings they offer
- iteration on the day/week/month/% picker - which the panel believe may have usability issues
- concepts for easing the journey from the service to third-party suppliers
Supplier picker page
The team had recently added a ‘supplier picker’ page at the request of legal colleagues as a security measure. The panel are concerned about this addition as it was indicated this security measure could easily be bypassed by trying each option in turn. The team will need to consider carefully if the page is indeed required. The panel have significant concerns about the design and usability of the page, and note that it has not been usability tested or iterated. The team needs to iterate the design of the page and ensure the design is backed up by evidence from user research that users succeed first time.
Saving results
The panel would have also liked to have seen exploration into how results might be saved, transferred or forwarded. Even if these features aren’t possible for the first release, they should be explored as a priority. The team assumed that all users would happily change supplier straight away (including from a mobile device), and that these features were therefore not required. The panel would like to see these features explored to confirm the assumption that they’re not required.
The prototype currently does not include any of the supplier flow after the initial sign-up page. The panel recommends some usability testing is conducted with a fuller supplier signup page, including non-mobile optimised journeys. This will help the team learn about the fuller signup process, and whether users are likely to commit to it when presented with a long form.
Service design
This is a new digital service. There are no non-digital channels to support or transition from. The team have plans to conduct targeted social media and physical advertising during their public beta in order to encourage take-up of the service, and plan to engage with a third-party supplier who specialises in this area. This may be an effective way of reaching the service’s intended user base, but the team will need to measure this carefully to ensure the strategy works whilst only having data for one third of users, and does not cause a negative reaction from users who are not able to use the service.
There is currently no plan for what happens if the service is unavailable. Whilst the team have identified organisations to partner with to provide support to users with low digital skills, they did not have a plan in place for this. The team have not yet engaged with the Performance Platform. All of these gaps should be addressed before the team can pass a Beta assessment.
There is no Minister to test this service with, but it has been tested by the Ofgem Chief Operating Officer, and by colleagues at BEIS.
Analytics
There were no analytics in place on the prototype. This is not a concern because there are not yet any real users, but analytics should be in place for Beta. The panel were pleased with the team’s ideas for how the success of the service will be measured, and hope to see the team gathering accurate data from suppliers on switches completed as well as initiated. The team have not yet spoken to the Performance Platform about setting up a dashboard.
Recommendations
To pass the Alpha reassessment, the service team must:
- Present a revised plan for private beta that does not rely on making it publicly available, and greatly limits the volume of users.
- Consult the Information Commissioner’s Office (ICO) and seek their assurance that this service does not put personal data at risk.
- Demonstrate the viability of a public service that is only accessible to one third of households. This should include prototyping, testing and iterating on the unhappy paths of the service - including for users who Ofgem doesn’t have data for, or where the address isn’t found.
- Have a designer work full time on the service, ideally supported by a front-end developer.
- Iterate key design elements such as the picker page and deals summary to ensure information is clear and users understand what to do.
- Work on the design of the service to more closely align with the Ofgem brand, use GOV.UK patterns where relevant, and spend more time on the desktop view.
To pass the Beta assessment, the service team must:
- Significantly improve the quality of the design, ensuring it follows Ofgem’s brand guidelines and feels closer to an official service - both on mobile and desktop.
- Explore options for how handoff with third-party services will work.
- Ensure all code is open source.
- Plan for the service being taken offline.
- Set up a dashboard on the performance platform.
- Set up analytics on the service.
- Find technical solutions to measure desired KPIs.
- Focus more research on people who are less confident using digital services, and therefore unlikely to use existing price comparison and switching services.
- Conduct usability testing with users with access needs, validate the accessibility of the service through a third-party audit, and address any identified issues.
The service team should also:
- Investigate operating the list of regulated energy suppliers and other reference data as registers.
- Explore options for how user questions may be answered.
- Explore and test options for how users may save or otherwise transfer results.
- Extend usability testing to cover more of the utility sign-up process.
- Determine their initial approach to assessing assisted digital needs.
- Challenge the requirement from their legal team that a supplier picker page be included in the service.
Next Steps
In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.
Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.
Get advice and guidance
The team can get advice and guidance on the next stage of development by:
- searching the Government Service Design Manual
- asking the cross-government Slack community
- contacting the Service Assessment team