Find an Energy Certificate alpha assessment report
The assessment report for MHCLG's Find an Energy Certificate service on 4 April 2019.
From: Central Digital and Data Office
Assessment date: 4 April 2019
Stage: Alpha
Result: Not met
Service provider: Ministry of Housing, Communities and Local Government
Service description
The service allows homeowners and tenants to check the energy performance certificate (EPC) for their properties, energy efficiency assessors to lodge their assessments, and data consumers to extract insights from energy performance figures.
Service users
- Property Owners
- Tenants
- Energy Assessors
- Data Users - civil service economists, data scientists, analysts, academics, Energy Assessor Schemes
1. Understand user needs
Decision
The team did not meet point 1 of the Standard.
What the team needs to explore:
- it is evident from user research that the team have been able to identify most users of the service. However, the assessment panel felt that the number of users spoken to (34) is not enough to state that all user needs have been uncovered, or to provide breadth of insight
- the team have focused on iterating the service using unmoderated research tools. The panel would have liked to see more about what users are trying to do when they encounter the service, how they interact with the service face to face (moderated testing), where users currently go for help, and more about their thoughts, circumstances and behaviours
- it was good to see that interviews with user groups, focus groups and workshops were used as research tools, but there was less face-to-face research, which is important for uncovering user behaviours and pain points
- with regard to accessibility, whilst it is appreciated that the prototype has been tested with the Digital Accessibility Centre (DAC) and automated testing has been done, more is required, especially in beta, around testing with actual or potential users of the service who use assistive tools to go online
- the team have considered assisted digital and the potential help users may need; however, the panel felt that more research and testing is required with these users. Firstly, it is important to understand these users' needs and behaviours, and secondly, what they would do if they get stuck in the service or do not understand what to do
- the panel encourages the team to think about what their assisted digital (AD) triage process may look like, including any offline support, and to start working with whoever will provide this to make sure they can differentiate between AD and general queries. If the team has difficulty finding AD users or users who use assistive tools to go online, a recruitment agency could be used to recruit them.
2. Do ongoing user research
Decision
The team did not meet point 2 of the Standard.
What the team has done well:
Citizen/Assessors/Software Providers Research
- it is evident the team have carried out some good research to understand users and their needs. They have also produced good personas as a way to tell the story about their users and their needs. However, the number of users interviewed and tested does not seem enough to get in-depth insights from each user type, especially with regard to their behaviours and motivations.
What the team needs to explore:
- continue research and testing to get further insights from a larger pool of users, and also consider some moderated testing to obtain more meaningful in-depth insights
- ensure the pool of users also incorporates users who may need assisted digital support and users of assistive tools (accessibility)
- show a plan for ongoing continuous research
- the panel felt that the team had some plans for further research, but not an ongoing plan demonstrating what they will research and test, when, and with which user group(s)
- the panel would have liked to see a clear plan showing how the team will test the service with assisted digital (AD) users, users with access needs, and users who require no help.
3. Have a multidisciplinary team
Decision
The team did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the service team are co-located and demonstrated strong collaboration
- the Service Owner and the whole team understand the governance approach and they have regular interaction with those in the wider policy and project team to ensure work is on track
- the service team have used the experience of related roles within the team to support user research work, for example the data analyst has been able to input into this area of work.
What the team needs to explore
- ensure a designer is recruited to the team. The team has been without a designer up to this point, and this has impacted the development of the prototype. A designer will be able to review the design work undertaken so far and influence the shape of the service going forward in response to the user research undertaken
- redress the balance of contract staff and permanent civil servants going into the beta phase. At present 70% of the team is made up of contract staff. The assessment panel understand the digital capability journey the organisation is going through, but there is a high risk to delivery without a plan to redress this imbalance.
4. Use agile methods
Decision
The team met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the service team use a range of tools, including GitHub and Slack, to communicate effectively and to store artefacts of the service
- the service team undertake weekly show and tells to a wide audience to communicate the developments within the service
- the service team have developed, and continue to recognise, links into the wider data consumer area.
5. Iterate and improve frequently
Decision
The team met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the source code was regularly updated
- the team used GitHub's issue system to track its work and link it to source code iterations
- the team used git branches to mark the end of sprints (although tags would have been better; see the sketch below).
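As a minimal sketch of the tagging approach mentioned above (the tag name and sprint number are illustrative, not taken from the team's repository):

```sh
# Create an annotated tag on the commit that closes the sprint,
# instead of creating a branch for it
git tag -a sprint-5-end -m "End of sprint 5"

# Push the tag so it is visible to the whole team on GitHub
git push origin sprint-5-end
```

Unlike branches, tags mark a fixed point in history, so they record the end-of-sprint state without creating extra lines of development that later need to be merged or cleaned up.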
What the team needs to explore
Before their beta assessment, the team needs to:
- move the source code of the alpha on GitHub under MHCLG instead of NotBinary
- review the weekly sprints currently undertaken to ensure the team are allowing themselves sufficient time to feed in iterations arising from user research.
General notes about the tech part of the service
The technical part of an alpha assessment is concerned with the prototype and the way it was designed. But more importantly, it is also about finding out whether the delivery team has the skills and capacity to deliver a beta service.
This alpha prototype was implemented by an external supplier (NotBinary), and as of this writing it is not known if the same team will be designing the beta service. It seems that the current expectation is that some, if not all, members of the technical team will be replaced.
This makes any assessment of the current technical skills needed for beta likely to become irrelevant later. However, the current technical team have usefully shared a document of recommendations for the implementers of the beta service, which the tech assessor has used to produce the tech sections that follow.
The work needed to design a new user-facing service (including an API, and presumably an administrator interface), while seamlessly swapping the legacy backend of a high-traffic national service for a new implementation, is quite substantial. It seems very ambitious to expect that this will be done in one year.
With a new team starting the beta phase, part of that one year would be spent studying the recommendations of the current team in order to make their own informed decisions, before starting the actual implementation. The service team is encouraged to revise their time estimate, and to produce a more detailed plan for the technical implementation of the service that will help the beta team avoid spending too much time working out details. Even so, going live with the service and ending the contract with the current supplier in April 2020 remains unrealistic.
6. Evaluate tools and systems
Decision
The team met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the delivery team have demonstrated a good knowledge of the tools needed to create a digital service: the prototype was built with the GOV.UK prototype kit and deployed on Heroku, and the source code is publicly available on GitHub
- the options available to the team for Beta are outlined in the recommendations document, and demonstrate good knowledge of the technology used by modern web applications
- the team also demonstrated good judgement as to the technical decisions that will need to be made, taking into consideration that part of the work will be to replace a 10-year-old legacy service
- the team are aware of the tools and components offered by GDS, such as PaaS, and reassuringly don’t expect to solve hard problems, such as address matching, on their own
- the team has not started implementing the beta service during alpha, and is confident that the next team will be able to start the beta service from scratch.
7. Understand security and privacy issues
Decision
The team did not meet point 7 of the Standard.
What the team has done well
The panel was impressed that:
- privacy concerns are well understood by the team, unsurprisingly since the existing service has been running for many years. The team are also aware of current data protection concerns, including GDPR compliance.
What the team needs to explore
- the service team expect that the beta service will be hosted on GOV.UK Platform as a Service (PaaS). There is a risk that the team will then assume that all operational concerns, including security monitoring, will therefore be dealt with at the platform level. This is only true to a certain extent. Security concerns need to be evaluated beyond hosting considerations, and the team will have to define a threat model, evaluate risks and establish responsibilities. Defining anti-personas, as the team has done, is a good start, but technical decisions need to be made as a result of this and other security evaluations. The team has not demonstrated this knowledge to the panel.
8. Make all new source code open
Decision
The team met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the source code of the prototype is fully open and hosted on GitHub
- the team expects all future code to also be released as open source.
9. Use open standards and common platforms
Decision
The team met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team are aware of existing platforms offered to government services, like PaaS
- the prototype displaying the energy certificate as HTML (as well as PDF) indicates that the team understand the advantage of using open standards
- the team don’t expect to solve hard problems like address matching alone.
What the team needs to explore
Before their beta assessment, the team needs to:
- be aware that over-reliance on specific cloud platforms, like AWS, can result in a service that can no longer be considered 100% open
- keep in mind that the service may have to be migrated to another provider and build their infrastructure accordingly
- consider the possibility of exposing EPR data as a GOV.UK Register.
10. Test the end-to-end service
Decision
The team did not meet point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the prototype was submitted to the Digital Accessibility Centre, even though it is probably a bit early in the process.
What the team needs to explore
- demonstrate that they have tested the service not only with assistive technology, but also with all main browsers, to make sure the service works for every user
- show that they tested the prototype with various browsers, and that user testing sessions were conducted with the user's usual browser, on their own PC, tablet or phone.
11. Make a plan for being offline
Decision
The team did not meet point 11 of the Standard.
What the team needs to explore
- no specific considerations about the operation of the service, including maintenance or incidental interruptions, have been presented by the team. The panel infers that the team expect this aspect to be covered by what the hosting provider offers, as for security and recovery. Yet this point should still be addressed specifically, in particular to demonstrate how to rebuild the infrastructure if need be, and how to help users when interruptions happen.
12. Make sure users succeed first time
Decision
The team did not meet point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team has prototyped the potential service for assessors and service users, and decided to pivot away from building a web service for assessors, instead improving the API for assessor software
- the team has started to explore the issues around property addressing
- the team has improved some of the terminology used throughout the service, despite the lack of a content designer and an interaction/service designer.
What the team needs to explore
- explore what the service is and reframe the service around the need, not the objects (certificates)
- prototype diverse key screens and journeys as part of this exploration, and iterate these based on one-to-one research with users. Be bold in thinking about what citizens could use this energy performance data for, and try to present it in different ways that could be useful, including iterating or replacing the current certificate template
- explore more the data that is currently held within the service, and the needs and problems of exposing more of this publicly
- explore secondary uses of the data, such as being the unofficial source for property sizes. What needs to be built to support this data usage?
- have a clear idea of what should be built in the beta, and document this and the exploration, including all research carried out, for the future beta team.
In the beta, the team should:
- explore the accessibility needs of being an assessor and submitting an energy assessment. The assessor user group have separate needs and will on their own have over 100,000 interactions with the service
- carry out user research with people who have a diverse set of access needs as well as any more technical accessibility audits.
13. Make the user experience consistent with GOV.UK
Decision
The team met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has been using the prototype kit and design system in the alpha
- the team wants to move away from PDFs of certificates towards single, canonical URLs for properties.
What the team needs to explore
Before the beta assessment, the team needs to:
- iterate the content and terminology used, and maybe create an internal glossary for consistency
- work with comms teams in MHCLG and BEIS, and GOV.UK, to make sure content and information surrounding the service is consistent and useful
- make sure any non-standard components and patterns are fed back into the design system, including any research.
14. Encourage everyone to use the digital service
Decision
The team met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the service team have looked at the issues raised by users when using the help desk and intend to use a digital pamphlet to support users in navigating the service.
What the team needs to explore
Before their beta assessment, the team needs to:
- consider how the outcome of additional user research undertaken with those users needing support to use the service affects any digital take-up plan.
15. Collect performance data
Decision
The team met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- there is a nominated lead for analytics and they have been able to use data from the existing service to influence analysis of the service going forward
- the service team have identified tools to monitor the service, including Hotjar and Google Analytics.
What the team needs to explore
Before their beta assessment, the team needs to:
- ensure insights from assisted digital research are fed into the analysis going forward.
16. Identify performance indicators
Decision
The team met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the service team have considered the data sources and metrics used for the existing service and have identified a range of additional indicators which they will use to measure and support improvements. These include measures for service users, assessors, software providers and data consumers
- the service team have considered how they will measure the mandatory KPIs. User feedback will be collected via several channels, both quantitatively and qualitatively, including in-service feedback, help desk information and assessor forums. Completion rates will be focused on assessor lodgements.
What the team needs to explore
Before their beta assessment, the team needs to:
- review KPIs following any additional research which may change the service objectives
- establish benchmarks for additional KPIs.
17. Report performance data on the Performance Platform
Decision
The team met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the service team have registered with the Performance Platform and have a plan to publish a dashboard.
18. Test with the minister
Decision
The team met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- although point 18 of the Standard does not apply at this stage, the SRO/Digital Leader has been working closely with the team and supporting the service.