Apply for a breathing space beta assessment report

The report from the Apply for a breathing space beta assessment on 4 March 2021.

From: Central Digital & Data Office (CDDO)
Assessment date: 04/03/2021
Stage: Beta
Result: Met
Service provider: The Insolvency Service

Previous assessment reports and reviews

Alpha assessment report

Service description

The service allows FCA-accredited debt advice providers to place their clients into a period of breathing space while they support them with problem debt advice and potential longer-term debt solutions. While in a breathing space, a debtor is protected from contact from a creditor in respect of any eligible debts, and interest and charges may be frozen. For debtors in receipt of mental health crisis treatment, the protection period is open-ended, lasting for the duration of the accredited treatment. The service sends notifications to creditors and facilitates eligibility reviews, notifications of sold-on debts, and the addition of any debts a creditor proposes should be protected if missed by the debt advice provider.

Service users

  • Debt advice organisations
  • Creditors
  • Debtors

1. Understand users and their needs

Assessed by: User research assessor

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has continued user research activities with a diverse range of users and has uncovered additional needs
  • the team identified through research that debtors need correspondence telling them when their breathing space starts and ends; the service now provides this, showing a need being met.

What the team needs to explore

Before their next assessment, the team needs to:

  • carry out usability testing with users of assistive technology; these users do not have to be users of the service. It is important to test the end-to-end journey with such users to ensure the service is accessible and intuitive.

2. Solve a whole problem for users

Lead assessor, particularly with design and research input

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team demonstrated a good understanding of the third-party services end users may be accessing
  • the team showed they had considered at what point in the user journey users might access this service; mapping this from both the end user's and the financial adviser's perspective made the holistic service very easy to understand.

What the team needs to explore

Before their next assessment, the team needs to:

  • learn more about how the different services that end users are eligible for sit alongside the breathing space service. For example, does applying for a breathing space make a user ineligible for other kinds of support during this difficult time?

3. Provide a joined-up experience across all channels

Lead assessor, particularly with design and research input

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had mapped all the touchpoints the user is likely to access throughout the end-to-end journey, demonstrating that a holistic approach was taken
  • although the digital service will only be delivered online, it was great to see the service team thinking about how they will upskill and train the money advisers.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider, as mentioned in other sections, how their user support channel will be accessed by users and delivered by the service team
  • check their assumptions about devices: the panel assumed the service will mostly be accessed on desktop devices, but if Google Analytics shows otherwise the team may need to consider how the service is accessed from other devices.

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed various design challenges, such as users having difficulty understanding the language and their eligibility, and gave examples of how they had been using components from the Design System, such as the notification banner, to solve these problems
  • the team clearly described examples of design work that is still outstanding and discussed ideas they may try in order to resolve these issues
  • the team demonstrated from research why they removed elements from the website that users would not find useful
  • overall, the team gave a very clear demonstration of the evolution of the design of the service, showing good evidence of it being based on user need.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work through the known content and interaction design issues
  • clarify responsibility for onboarding the FCA-accredited advisers; this seemed a little unclear, and the panel got the impression the team did not think it was in their remit. Given that users can only access the service via these advisers, getting them onboarded correctly and quickly is vital to the overall success of the service. In any follow-up assessment it would be good to show the panel how you ensured the smooth onboarding of these users, the kind of support you provided to them, any needs you learnt about that you had not considered before, and how this has affected take-up of the service
  • address the lack of analytics support; because of this gap it was unclear what the team's definition of a simple-to-use service might be. This is covered in more detail later in this report, but in any follow-up assessment we would like to understand how analytics and usage data inform the design of the service, and how the team knew the changes they had been making improved things for users.

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were clearly aware of, and working on fixing, accessibility issues, and demonstrated excellent knowledge of known accessibility issues across different roles in the team
  • the team intends to be fully WCAG compliant by the May launch date
  • given that the digital service is business-to-business, the team's research justifies not providing formal assisted digital support. However, see the comments below.

What the team needs to explore

Before their next assessment, the team needs to:

  • put some user support provision in place, despite not needing to offer assisted digital support. The service team should consider whether this will be delivered in house or outsourced. If outsourced, they will need a much better understanding of demand and of the kinds of problems users are having, and they may be required to provide scripts for this support. The team should also consider how this insight will be fed back into the iteration and design of the service.

6. Have a multidisciplinary team

Lead assessor

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a core multidisciplinary team in place, consisting of Product Owner, User Researcher, Interaction Designer, Technical Lead and Business Analyst
  • the team has support from other disciplines (SRO, Service Owner, Project Manager, Policy Advisers, Content Designer, Test Lead) and an external development team.
  • the same team has worked throughout Alpha and Beta and is expected to continue into public Beta
  • although the team is based in locations across the country, they have adapted to remote working making use of various communication tools.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there is ongoing support from Performance Analysts, who are currently not part of the core team.

7. Use agile ways of working

Lead assessor

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed very good understanding of agile methods and ways of working
  • they run daily stand-ups, retrospectives and sprint show and tells
  • the team showed they make good use of different communication and tracking tools: they use Slack and Microsoft Teams to communicate, and Trello and Jira to track progress and manage actions and issues
  • the team engage with stakeholders regularly by running external stakeholder show and tells and also provide updates to senior management at project boards and working groups.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that a Performance Analyst is part of the team's ways of working
  • ensure analytics work is prioritised correctly
  • ensure there is a process for analytics work informing improvements and further development work.

8. Iterate and improve frequently

Lead assessor

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team works in 2-week sprints
  • there is a weekly design meeting which feeds changes from user research into development
  • there is a plan for how user research will fit into sprints and iteration for public beta and live.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that analytics findings are fed into sprints and that the work is part of the iteration cycle.

9. Create a secure service which protects users’ privacy

Tech assessor

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team went through all the recommended steps, and beyond: external pen testing, thorough formal threat modelling, and implementation of NCSC guidelines
  • this is work usually expected at a live assessment, but the team has already carried it out.

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure the security infrastructure adapts to lessons learned during public Beta
  • explore and address new types of attack vectors, such as supply chain attacks, specifically around deployment and the use of external software packages (see the sketch after this list)
  • explore recovery processes for various types of attacks.
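A common first line of defence against supply chain attacks around deployment is to pin every third-party artefact to a known checksum and verify it before use. The sketch below is illustrative only, not a description of the team's pipeline: the file names and checksums are hypothetical, and the service itself is built on .NET rather than Python.

```python
# Illustrative sketch: verify downloaded build artefacts against a
# pinned allowlist of SHA-256 checksums before deployment.
# File names and checksums here are hypothetical placeholders.
import hashlib
import sys
from pathlib import Path

# In a real pipeline this allowlist would live in source control and
# change only through reviewed commits.
PINNED_CHECKSUMS = {
    "vendor/identity-client.tar.gz": "<expected-sha256-hex>",
}

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_all() -> bool:
    ok = True
    for name, expected in PINNED_CHECKSUMS.items():
        actual = sha256_of(Path(name))
        if actual != expected:
            print(f"MISMATCH for {name}: got {actual}")
            ok = False
    return ok

if __name__ == "__main__":
    sys.exit(0 if verify_all() else 1)
```

Package managers offer the same idea natively (for example, NuGet lock files in .NET, or pip's --require-hashes mode in Python), which is usually preferable to hand-rolled verification.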

10. Define what success looks like and publish performance data

Analytics or lead assessor

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • considerable work has gone into the Enterprise Reporting Platform which will be used to report aggregate service volumes, trend data and to meet predefined reporting requirements from key stakeholders
  • GTM tracking code has now been added to the service to support performance analysis and identification of pain points/areas for future iteration, exits, errors encountered and usage of the service. Some initial GA reporting dashboards have already been created by the team to monitor service usage
  • a performance framework has been developed for the service, with KPIs identified and prioritised for the day 1 launch
  • the team is aware of the need to publish service performance information and is considering which of the KPIs now identified should be published.

What the team needs to explore

Before their next assessment, the team needs to:

  • engage a Performance Analyst: the team does not currently include one, although they have the support of key professions via the Customer Insights Team. The team should engage with the Performance Analysts within that team to help guide them through the analytical considerations and requirements for a compliant service
  • agree which KPIs (in addition to the mandatory ones) show how the service is performing, and arrange publication of these KPIs as soon as practical
  • consider and agree reporting mechanisms and requirements from an ongoing service support, tracking and iteration perspective
  • be mindful that data received via Google Analytics (or equivalent) may not be representative of the complete user base because of cookie consent blocking, especially given the business user base (who may block cookies automatically). The team should identify how they can check that this data is representative of the whole user base before using it to inform future changes and iterations (see the sketch after this list)
  • ask for customer feedback more often: the service team will receive categorised customer feedback on a monthly basis from the Customer Insights Team. During beta, receiving the data (categorised or uncategorised) more frequently will allow the team to analyse feedback trends and identify areas for iteration and improvement on a more timely basis. The team may be able to obtain some of this data via GTM, so should continue to explore this opportunity
  • although this is a new service, investigate further to identify expected user volumes and to establish a baseline to monitor performance against. This baseline can change as insight is gained into the service and its use, but the team should be very clear when defining what good looks like for the service, how they can measure success, and how they will identify pain points for users.
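One simple way to sanity-check representativeness is to compare what Google Analytics reports for a page against server-side request counts, which do not depend on cookie consent. A minimal sketch, with hypothetical file formats, paths and figures (the team's actual data sources are not described in this report):

```python
# Illustrative coverage check: compare Google Analytics pageviews with
# server-side request counts for the same page and period. The log
# format, path and GA figure below are hypothetical.
import csv

def count_requests(log_csv: str, page: str) -> int:
    """Count requests for a page in an access-log export with a 'path' column."""
    with open(log_csv, newline="") as f:
        return sum(1 for row in csv.DictReader(f) if row["path"] == page)

def ga_coverage(ga_pageviews: int, server_requests: int) -> float:
    """Fraction of server-observed traffic that GA also recorded."""
    return ga_pageviews / server_requests if server_requests else 0.0

if __name__ == "__main__":
    server = count_requests("access_log.csv", "/breathing-space/start")
    ga = 1480  # hypothetical figure taken from a GA report
    print(f"GA saw {ga_coverage(ga, server):.0%} of {server} server requests")
```

If coverage is well below 100%, conclusions drawn from GA should be weighted accordingly or cross-checked against server-side measures.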

11. Choose the right tools and technology

Tech assessor

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service runs on the Azure cloud and the team are making use of its full functionality in terms of monitoring, scaling and deployment
  • the team followed the advice from the previous assessment and are setting up an external (non-Azure) monitoring and alerting system that runs smoke tests on the application (a minimal sketch follows this list)
  • the team uses modern tools such as Slack and Jira for internal team communication and for communication with external users
  • the team also uses modern software development tools, such as GitHub for code and GitLab for internal documentation.
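External smoke-test monitoring of this kind usually amounts to repeatedly exercising a known-good request from outside the hosting platform and alerting on failure. A minimal sketch, assuming a hypothetical health-check URL (the team's actual endpoints, thresholds and alerting wiring are not described in this report):

```python
# Minimal external smoke-test sketch: poll a health endpoint from
# outside the hosting platform and fail loudly if it misbehaves.
# The URL and thresholds are hypothetical.
import sys
import time
import urllib.request

HEALTH_URL = "https://www.example.service.gov.uk/health"  # hypothetical
TIMEOUT_SECONDS = 5
MAX_LATENCY_SECONDS = 2.0

def smoke_test() -> bool:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=TIMEOUT_SECONDS) as resp:
            elapsed = time.monotonic() - start
            if resp.status != 200:
                print(f"FAIL: status {resp.status}")
                return False
            if elapsed > MAX_LATENCY_SECONDS:
                print(f"FAIL: slow response ({elapsed:.2f}s)")
                return False
    except OSError as exc:  # also catches HTTP errors and timeouts
        print(f"FAIL: {exc}")
        return False
    print("OK")
    return True

if __name__ == "__main__":
    sys.exit(0 if smoke_test() else 1)
```

In practice this would run on a schedule from infrastructure independent of Azure, so that a platform-wide outage still triggers an alert.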

What the team needs to explore

Before their next assessment, the team needs to:

  • adapt their ways of working if public beta outcomes show that the current approach could be improved
  • make sure the source code lends itself to its dependencies changing. For instance, it should be easy to update to any new version of the GOV.UK Design System, and straightforward to adapt if the various external APIs (such as Notify) were to change
  • where suitable, abstract away dependencies (eg the Experian address lookup, GOV.UK Notify or Identity Server) to make it easy to change them should the need arise (see the sketch below).
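One common reading of this recommendation is the adapter pattern: the service depends on a small interface it owns, and the concrete client for Notify, the address lookup or Identity Server sits behind it, so a provider can be swapped without touching calling code. A minimal sketch in Python (the service itself is built on .NET, and the class names below are hypothetical; send_email_notification is the call offered by Notify's Python client):

```python
# Adapter-pattern sketch: the service codes against a small interface
# it owns, so the underlying provider (GOV.UK Notify here) can be
# swapped without touching calling code. Class names are hypothetical.
from abc import ABC, abstractmethod

class NotificationSender(ABC):
    """The interface the rest of the service depends on."""

    @abstractmethod
    def send_email(self, address: str, template_id: str, data: dict) -> None:
        ...

class NotifySender(NotificationSender):
    """Concrete adapter wrapping a GOV.UK Notify API client."""

    def __init__(self, client):
        self._client = client  # eg an instance of Notify's Python API client

    def send_email(self, address: str, template_id: str, data: dict) -> None:
        self._client.send_email_notification(
            email_address=address,
            template_id=template_id,
            personalisation=data,
        )

class RecordingSender(NotificationSender):
    """Drop-in replacement for tests, or a template for a new provider."""

    def __init__(self):
        self.sent: list[tuple[str, str, dict]] = []

    def send_email(self, address: str, template_id: str, data: dict) -> None:
        self.sent.append((address, template_id, data))
```

Either implementation can be passed wherever a NotificationSender is expected, which also makes the sending path easy to fake in tests.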

12. Make new source code open

Tech assessor

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

What the team needs to explore

Before their next assessment, the team needs to:

  • package any reusable code as components, so that it can be used by other services
  • flesh out the repository's description to make it clear what service it relates to, who owns the code, and so on. Opening source code is also a public relations exercise: it shows the taxpayer that their money is well spent on good services, as exemplified by their source code.

13. Use and contribute to open standards, common components and patterns

Tech assessor

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service uses the GOV.UK Design System and GOV.UK Notify
  • it also uses non-GOV.UK components and systems that are open source (eg .NET or Identity Server)
  • the team uses OpenAPI for API definitions.

What the team needs to explore

Before their next assessment, the team needs to:

  • contribute back to the cross-government front-end community any findings from using the Design System (and join the front-end channel on Slack)
  • continue using open technology whenever possible
  • blog about how the team designed the service; this would be a valuable contribution for other teams across government looking for an exemplar to take inspiration from.

14. Operate a reliable service

Lead, Design and Technology assessors.

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team presented a complete and sound approach to making the service reliable: traffic estimation, performance testing and stress testing (a sketch of the idea follows this list)
  • the team planned for team readiness in case of incidents
  • the team designed an outage plan consisting of 4 outage levels, specifying a management process for each.
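At its simplest, the performance-testing side of this means firing controlled concurrent traffic at a page and checking latency against a target. The sketch below is a toy illustration, not the team's tooling: real performance and stress testing would use a dedicated tool against a non-production environment, and the URL and volumes are hypothetical.

```python
# Toy load-test sketch: issue concurrent GET requests and report
# latency percentiles. URL and volumes are hypothetical; never run
# this against a production service without agreement.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://staging.example.service.gov.uk/start"  # hypothetical
REQUESTS = 50
WORKERS = 10

def timed_get(_: int) -> float:
    start = time.monotonic()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.monotonic() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        latencies = sorted(pool.map(timed_get, range(REQUESTS)))
    p95 = latencies[int(0.95 * (len(latencies) - 1))]
    print(f"median {statistics.median(latencies):.2f}s, p95 {p95:.2f}s")
```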

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure the public beta service performs better than the current private beta, which was sometimes slow and even showed errors when tested.

Next Steps

This service can now move into a public beta phase, subject to implementing the recommendations outlined in the report and getting approval from the GDS spend control team if approval is needed.

The service must pass a live assessment before:

  • reducing the team’s resource to a ‘business as usual’ team, or
  • removing the ‘beta’ banner from the service

The panel recommends this service sits a live assessment in around 9 to 12 months' time. Speak to your Digital Engagement Manager to arrange it as soon as possible.

This service now has permission to launch on a GOV.UK service domain with a Beta banner. These instructions explain how to set up your *.service.gov.uk domain.

Updates to this page

Published 10 June 2021