Apply for a breathing space alpha assessment report

The report from the alpha assessment of The Insolvency Service's Apply for a breathing space service on 14 May 2020.

From: Central Digital and Data Office
Assessment date: 14/05/2020
Stage: Alpha assessment
Result: Met
Service provider: The Insolvency Service

Service description

This is a new service that will enable Debt Advisers to enter individuals into a Breathing Space. Breathing Space will give someone in problem debt the right to legal protections from creditors by freezing enforcement action, interest, fees and charges for 60 days, in order to continue engaging with debt advice and identifying an appropriate long-term debt solution.

Service users

  • Debt Advisers (primary user)
  • Creditor organisations (primary user)
  • Individuals in ‘problem debt’ (secondary user - they will not have direct access to the service)
  • Mental Health Professionals (secondary user - no direct access to the service)
  • Insolvency Service staff (back end system user)

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the user research was comprehensive and inclusive, with a wide range of methodologies used to uncover user needs
  • it was clear from the personas and journeys created that the team have been able to uncover the needs of primary and secondary users in depth
  • the team spoke confidently about how users would transfer ownership of an account to another debt adviser if they were not happy
  • they were also able to talk about how users would start their journey and interact with the service
  • it was good to see that through user research they were able to evidence user pain points, for example the need for end users to receive notifications throughout the process for greater transparency
  • the team included users of assistive technology and users on the assisted digital spectrum in their research, and showed through internal workshops how these users would be supported in the process.

What the team needs to explore

Before their next assessment, the team needs to:

  • test the service journey with users of assistive technology
  • continue to research with end users to ensure their needs are addressed
  • continue to include users on the assisted digital spectrum in their research and ongoing testing.

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a very good plan in place for ongoing research
  • the team recognises it is imperative to continue researching with end users, to understand their needs and to ensure that changes made remove end-user pain points
  • the team understands that inclusive research must continue, as it is important that design changes address the needs of assisted digital users and users of assistive technology.

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a multi-disciplinary, agile team in place with a clear separation of roles, and understanding of roles and responsibilities
  • all members of the team are working full time on this project
  • in addition to the core team there is support from the SRO, solution architect and policy advisers.

What the team needs to explore

Before their next assessment, the team needs to:

  • moving into beta, the team is bringing on a supplier to fill the roles of developers, testers and a delivery manager. The team needs a clear plan for how the supplier will integrate with the existing team and how knowledge sharing and support will be handled
  • one of the user researchers on the team is on a contract that may end soon, and there is a risk that their extensive knowledge will be lost. The team must ensure that user research is well documented and that there is a sufficient handover and knowledge transfer should they leave the team
  • ensure there is adequate content design support throughout the beta phase.

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a mature use and good understanding of agile tools and techniques
  • they are working in 2-week sprints
  • they are using Trello to monitor progress, Jira to track and manage work, and Slack for day-to-day communication
  • the team showed that they are regularly engaging with their stakeholders, doing show and tells and giving updates to senior stakeholders.

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • this is a user-driven project, prioritisation is informed mainly by user research and it seems the team is empowered to take decisions on their service
  • the team demonstrated how user research feeds into their 2-week sprints
  • the team showed their plan for Beta and how user research will fit into their sprints and iterations.

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the tools and systems selected are open source whenever possible and are part of a mainstream technology ecosystem
  • the team made sure that the application is containerised in order to make it easier to migrate to another hosting stack should the need arise
  • the team chose an off-the-shelf CMS rather than building their own.

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that the service doesn’t rely on Microsoft technology so much that it becomes very costly should the agency decide to migrate to another provider
  • major technology choices were made so that the platform is aligned with the rest of the agency. While this is understandable, the team should remain aware that it doesn’t mean that those choices are the best in themselves. Therefore the team should encourage organisational change when the solution imposed is not the best.
  • even though they’re now open source, C#/ASP.NET technologies are not common in the open-source community. This might present challenges when it comes to hiring developers or getting help or sharing knowledge, especially with the Government’s developer community (see note below about the GOV.UK Design System).

While the panel applauds the comprehensive architecture and plan designed by the team, this plan should remain open to change and iteration. Dates change and user needs evolve, possibly leading to major changes in requirements and blueprints. The plan should always remain a living document in order to respond to change.

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has developed a sound security strategy by identifying possible threats, and planning for possible attacks
  • there is a process for quickly checking and applying security patches
  • the team is following NCSC advice on securing an online service.

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that security checks are carried out at all levels of the application stack: at the infrastructure level, but also at the application level. A process for fixing application bugs or security issues should be developed
  • similarly, a process should exist for dealing with data leaks or cyber attacks - for instance describing how to restart the service from scratch should a major attack occur on the existing setup.

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used GitHub to develop the prototype
  • the team intends to make its repository public soon
  • the code for the beta service will be hosted on GitHub and also made public.

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that the quality of the code made public is up to government standards
  • beyond just the code, the repository should reflect modern development practices, making good use of Git and GitHub features such as commit messages, feature branches, tags and releases
  • keep in mind that open source is not just about opening a repository, it’s also about sharing knowledge. The team should consider actively contributing its code where it could be useful to other projects in the open source community.

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have chosen an open source stack wherever possible
  • the service is hosted on the public cloud (Azure)
  • the team has considered several options and has made sensible choices as to which to choose (with the caveat that many of those choices were pushed by the global organisation, as mentioned above)
  • the team has used the GOV.UK Prototype Kit to design the alpha prototype
  • the team has chosen GOV.UK Notify.

What the team needs to explore

Before their next assessment, the team needs to:

  • advocate for organisational culture change around standards and platforms: advertise GOV.UK PaaS and GOV.UK Verify, which weren’t selected because of organisational choices
  • use the GOV.UK Design System to build the beta service. Even though using it with ASP might be a challenge, and the temptation will exist to try and replicate the GOV.UK design using custom stylesheets and markup, considerable effort has been put into the Design System to help make services accessible and device-independent.

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team have developed a comprehensive approach addressing all layers of testing (Unit, Integration, Acceptance, Security, etc)
  • as much testing as possible will be automated
  • a dedicated test manager and automation test engineer will join the team.

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that as the number of tests grows, automated testing doesn’t end up taking so much time that continuous integration becomes a painful process. This will require frequent reassessment of which tests are carried out at which point in the CI/CD process
  • avoid taking testing out of the hands of the developers. When dedicated testing staff join a team it is not uncommon that developers lose track of the testing of their software, as the testing staff take ownership of it. This can lead to less efficient bug tracking and fixing.
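One common way to keep the pipeline fast as the suite grows is to tier tests by cost and run the slower tiers less often. A minimal sketch, illustrated here in Python for brevity (the stage names, tier names and mapping are hypothetical, not the team's actual C#/.NET pipeline):

```python
# Sketch: map CI pipeline stages to the test tiers they run, so fast
# feedback stays fast as the suite grows. All names are illustrative.
TIERS = ["unit", "integration", "acceptance", "security"]

def tiers_for(stage):
    """Return which test tiers a given pipeline stage should run."""
    if stage == "commit":       # every push: fast unit tests only
        return ["unit"]
    if stage == "merge":        # merging to main: add integration tests
        return ["unit", "integration"]
    return list(TIERS)          # nightly or release: the full suite
```

Reassessing a mapping like this regularly, as recommended above, stops the per-commit stage from slowly accumulating expensive tests.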

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team have designed a three-stage strategy to deal with outages, covering reliability, operational readiness and an offline alternative if the system goes down.

What the team needs to explore

Before their next assessment, the team needs to:

  • not rely too much on the cloud provider to ensure the reliability of the platform: make sure it applies security patches regularly
  • make sure smoke tests are carried out from outside of the platform provider (through services like Pingdom) to make doubly sure that the team are aware of all possible problems, including Azure’s alerting system going offline.
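An external smoke test of this kind can be as simple as an HTTP check run from a machine outside the hosting platform. A minimal Python sketch (the health-check URL is a hypothetical placeholder, not the service's real endpoint):

```python
# Sketch: probe the service from outside its hosting platform, so an
# outage is noticed even if the provider's own alerting goes offline.
# The URL below is a hypothetical placeholder for illustration only.
import urllib.request

def service_is_up(url, timeout=10):
    """Return True if the URL responds with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.getcode() == 200
    except Exception:
        return False

if __name__ == "__main__":
    print(service_is_up("https://breathing-space.example.gov.uk/health"))
```

A check like this, scheduled from an external monitoring service, alerts the team independently of the cloud provider's own tooling.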

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the whole team demonstrated they clearly understood user needs across a broad range of users, both direct and indirect
  • the interaction design elements clearly utilised best practice from the design system, with clear evidence for deviations from the norm. We were particularly impressed to see the design being consistently tested and retested to build confidence that the website will fully meet users’ needs
  • the team were able to demonstrate a very good understanding of the journeys and interactions prior to the proposed digital service. This illustrated their awareness of the importance of understanding the context users are in and the impact that should have on the overall service design
  • the service team are aware of the tactical way in which users might wish to use the policy in different contexts and are making well-considered decisions in order to allow the website to enable this flexibility
  • the panel were astounded by the quality of the work presented and the confidence the team gave in being clear and open about the design challenges they have going forward.

What the team needs to explore

Before their next assessment, the team needs to:

  • the team has clear constraints set by policy, and as a result users can only access the service via a third party consultant. If this is not handled correctly, the resulting impact could be a high level of failure demand for the third party, which the team’s research identified as already struggling to meet service demand. The service team are aware of ways in which they may still meet user needs within the scope of the policy, however they have not prioritised exploring this with the same rigour they have applied to other areas of their work. Therefore the pass is conditional on further exploration of the expressed needs of those with problem debt. These include, but are not limited to:

  • the need to know service eligibility
  • the need to know information they will need to have to make an application
  • the need to know the status of their application
  • the need to know the end date of their breathing space

There is certainly an ethical consideration in conducting this kind of research with end users who are in a financial and mental health crisis. Therefore any future assessment panel should be flexible towards the team learning about these user needs via a proxy (for example via debt advisers), however we would still encourage the team to work as closely with end users as possible. Additionally, this approach will give the service team the capacity to address this issue within the scope of their existing research plans going forward into beta.

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have clearly been fully utilising the existing best practices within government and learning from end users where existing design patterns do not meet their needs
  • deviations made from standard practice had been well evidenced to the panel and had been retested to prove they worked for users
  • the panel were particularly impressed with the way in which the prototype has been set up to enable testing in a broad range of scenarios, clearly evidencing a strong user-centred design practice within the team.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to contribute back into the design system the lessons being learnt
  • while digital confidence within the existing proposed user base is high, this does not exempt the service team from ensuring the service meets the needs of users with access needs, so it is still vital that the team can demonstrate the service meets accessibility regulations. Additionally, when the service team expand the service to meet the needs of those with problem debt, they will need to design for users who are less confident online and may need to consider how the service delivers the same functionality in an offline setting - use of common platforms such as GOV.UK Notify will likely need to be considered more.

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified the entry points to the service so they can have an informed communications strategy
  • they are thinking of building an API for money advisers so that debt data can be submitted directly to the service, making it easier for users to use the service.
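Such an API would let a money adviser's case-management system submit debt data on a client's behalf. A minimal sketch of what a submission payload might look like (the field names and structure are invented for illustration; the team's actual API design may differ entirely):

```python
# Sketch: assemble a JSON payload that a money adviser's system might
# POST to a hypothetical breathing-space API. All field names are
# invented for illustration, not taken from the team's design.
import json

def build_submission(client_reference, debts):
    """Build an application payload from an adviser's case data."""
    return {
        "client_reference": client_reference,
        "debts": [
            {"creditor": creditor, "amount_pence": amount_pence}
            for creditor, amount_pence in debts
        ],
    }

payload = build_submission("CASE-0001", [("Example Utility Co", 45000)])
print(json.dumps(payload))
```

Accepting structured submissions like this removes manual re-keying for advisers, which is the ease-of-use benefit the team describes.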

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • as a new service the team don’t have a benchmark to work against for performance KPIs but they will be exploring the use of Google Analytics and other performance monitoring tools
  • the team intends to work closely with HMT, DWP, MaPS and the Insolvency Service to evolve a longer-term strategy to measure the impact of Breathing Space.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that data will give insight into the full range of user journeys
  • demonstrate they have an ongoing roadmap for performance analysis.

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team recognises the importance of measuring performance and will be exploring tools to understand volumes, starts and ends of submissions, and dropouts prior to a natural end date
  • they will also monitor user flows within the service and understand time taken to process and submit into Breathing Space.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how they will use the insights generated from analysing the key performance indicators to continue to iterate and improve the service.

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has an understanding of the Performance Platform and will be exploring how they could make use of it and what they would need to measure.

What the team needs to explore

Before their next assessment, the team needs to:

  • engage with the Performance Platform team, and consider and plan if and how they will measure and report on the 4 mandatory key performance metrics.

18. Test with the minister

(Does not apply for Alpha)


Next Steps

This service can now move into a private beta phase, following the recommendations outlined in the report. The service must pass their public beta assessment before launching their public beta.

The panel recommends this service sits a beta assessment in around 6 months’ time. If the team feels they’ve addressed the recommendations earlier than that, they can book it earlier.

Speak to your Digital Engagement Manager to arrange it as soon as possible.


Published 26 May 2020