Apply for help arranging Child Maintenance beta assessment

The report for DWP's Apply for help arranging Child Maintenance beta assessment on 3 December 2020

Digital Service Standard assessment report

Apply for Help Arranging your Child Maintenance

From: Government Digital Service (GDS)
Assessment date: 3 December 2020
Stage: Beta
Result: Met
Service provider: The Department for Work and Pensions (DWP)

Previous assessment reports

Service description

Apply for help arranging Child Maintenance enables citizens to apply to the statutory Child Maintenance scheme when they cannot agree a private arrangement.

The aim of the product is to create an application process with a great user experience that provides a flexible and convenient channel for parents to apply for Child Maintenance, whilst also creating business efficiencies by reducing call-handling time.

Service users

The primary users of the service are receiving parents - the separated parent who has the main care of the child or children and should receive child maintenance from the other parent. Receiving parents make up the vast majority of users applying for child maintenance (97.5% of applications).

Secondary users of the service are:

  • paying parents - the separated parent who does not have main care of the child or children and will need to pay child maintenance to the other parent. Most paying parents have to respond to an application from a receiving parent although some paying parents apply to use the child maintenance service themselves (fewer than 2% of applications)
  • DWP Caseworkers - DWP staff members who process child maintenance applications, support users through making an application over the phone or online and deal with issues related to child maintenance applications, such as shared care disputes and parentage
  • support organisations - third party organisations such as Citizens Advice and Gingerbread that provide guidance and support to users and potential users of the child maintenance service

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has conducted high quality user research on a very sensitive issue with care, creating a detailed and nuanced view of the needs of the primary users of the service – receiving parents
  • the team uses a wide range of data sources and methods to generate their understanding of users, including interviews with users and caseworkers, diary studies, listening to recorded calls to the service and interviews with people who have used the beta service. They have distilled their insight into a small number of clear, high level needs common to all potential primary users of the service
  • the user research and testing has allowed the team to make real improvements for primary users of the service, as opposed to simply replicating the existing service (a phone call) online. Most impressively, this has ensured that the emotional need of receiving parents to feel safe is designed in across the service
  • the researcher in the team works closely with those redesigning other parts of the end-to-end service, sharing resources and findings across the ‘tribe’. The team has also shared evidence of potential issues with policy and legislation, for example the impact of the fee on service user behaviour, with colleagues working on a wider policy root and branch review of the Child Maintenance Service (CMS) system

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to gather and share evidence of user needs across the end-to-end service, focusing particularly on how and whether the assisted digital and wider support offer is meeting the needs of those who get stuck or need extensive help
  • gain greater understanding of the needs of paying parents and people who do not currently use the online service, using analytics to drive hypothesis-driven research
  • continue to gather evidence of the impact of the fee and the related questions on users who have experienced domestic abuse. It’s currently not clear whether or not having to answer these questions as part of applying to the CMS could be re-traumatising. Evidence around the user experience of filling in this part of the form could be an important piece of the evidence to consider when making decisions around the future of the fee
  • ensure methods are always appropriate to the research aims and subject matter, for example, ensure that any research that might touch on domestic abuse is covered in appropriately structured one-on-one sessions rather than focus groups. Focus groups for this kind of sensitive topic could be a negative experience for participants and are also likely to affect the quality of the research itself, as people can often be uncomfortable sharing in group sessions

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has taken a risk based approach to ongoing research and testing. It has spent most time testing pages and patterns that depart from the GOV.UK design system and trying to solve design challenges specific to this particular service, for example, how to ask about shared care in a way that is inclusive and accurate. The team is still working on perfecting this and plans to use the same design in the CMS calculator on GOV.UK
  • it has also used survey data, analytics and support calls to prioritise which elements of the service are performing well and which to continue to improve
  • the entry and exit points of the part of the service the team covers have been tested, and they have made changes to the content of the invitation email and confirmation email as a result
  • the team has been creative and pragmatic in recruitment, for example doing pop up testing in Jobcentres and recruiting via organisations who support service users who might otherwise be hard to recruit

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct future usability testing of online elements of the service with shared screens where possible. The team is currently working on this, as it is aware that it may be missing issues using its current workaround method
  • consider recruiting users through a wider variety of means. For example, it might be useful to recruit potential users who do not currently use the Jobcentre or engage with support organisations
  • when usability testing leads to changes to content, pages or journeys, build in another round of testing of the new version with users where possible - some features of the service appear to have been through only one round of testing with users

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the product is supported by an experienced multi-disciplinary team, comprising the appropriate agile roles, with the key roles responsible for the control and direction of the product all being permanent civil servants
  • there is good continuity throughout the roles, with most team members being in post since the alpha phase and expected to remain for the upcoming public beta phase. Where personnel changes are planned, for example to cater for maternity leave, appropriate replacements have been identified and onboarding activities are planned
  • there is good commitment from the business to increasing internal capability and reducing reliance on contingent labour, with real improvements, in the form of new apprenticeship roles, expected by January 2021
  • the team has worked well to address the governance issues identified during the last assessment, confirming significant improvement in this area and providing clear examples of where the team’s control and autonomy has resulted in positive changes to the experience of the paying parent user group
  • the team continues to engage with the wider Child Maintenance Service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to maintain the current good practices and flag any deterioration or new concerns

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has adjusted to the challenge of not being co-located during the COVID-19 lockdown and has plans to return to a co-located setup as soon as they are permitted
  • despite the challenges of lockdown, the team has continued to operate in fortnightly sprints and observe all the appropriate agile ceremonies
  • the team was able to share a clear roadmap for appropriate deliverables throughout the upcoming public beta phase and has a realistic set of criteria for determining when it will be appropriate to exit public beta

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work towards the above goals and flag any concerns that may impact delivery

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to provide clear examples of continuous improvement and their ability to meet changing user needs
  • the team has identified where they can improve their ability to iterate even further - by gaining the ability to check whether the eventual payment method for each case is the one indicated as preferred by users in response to the prompts in this product

What the team needs to explore

Before their next assessment, the team needs to:

  • engage with the wider service to track the outcome of the ‘preferred payment method’ choices by users of this product

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the right tools and technology for both frontend and backend
  • the team is using the right tools for cloud, testing, security and CI/CD
  • the team is using RESTful APIs

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using various security methods to protect data in transit and at rest
  • the team has engaged with the Data Protection Officer (DPO) and is adhering to GDPR guidelines
  • access to the data is controlled using an authorisation and access management system
  • the team has completed a Data Protection Impact Assessment (DPIA), an IT Health Check (ITHC) and a security risk assessment

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using open source tools for development of the UI, integration and the database
  • the team has published the code on GitLab and GitHub
  • the team uses open source tools across its testing activity

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using GOV.UK Pay, GOV.UK Notify, DWP Bank verification (Modulus Check), and DWP Address Verification
  • the team is adhering to open standards in its choice of tools and technologies
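The DWP Bank verification listed above relies on modulus checking of sort codes and account numbers. As an illustrative sketch only (this report does not describe DWP's actual implementation, and real UK bank account validation uses Vocalink's published weight tables, selected per sort-code range), a generic modulus-11 check works like this:

```python
# Illustrative modulus-11 bank detail check. The weights below are
# hypothetical placeholders, NOT Vocalink's real tables: in production,
# the weight table is chosen based on the sort-code range.

def modulus11_check(sort_code: str, account_number: str,
                    weights=(0, 0, 0, 0, 0, 0, 7, 5, 8, 9, 4, 3, 2, 1)) -> bool:
    """Return True if the weighted digit sum is divisible by 11."""
    # Strip separators such as hyphens before combining the 14 digits
    digits = [int(d) for d in (sort_code + account_number) if d.isdigit()]
    if len(digits) != 14:  # 6-digit sort code + 8-digit account number
        return False
    total = sum(d * w for d, w in zip(digits, weights))
    return total % 11 == 0
```

A passing check only confirms the digits are internally consistent; it does not prove the account exists or belongs to the applicant.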

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using appropriate test tools for security testing, microservice and API testing, and performance testing
  • the team is using open standard tools for testing
  • the team is using CI/CD for quicker integration and deployment

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has well-planned SLAs for alternative processes if the service is down
  • the team has the right resources to operate and run the service
  • the team has good plans for backup and recovery

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • consideration has been given to the outcome of manual and automated accessibility testing, and the service has been iterated upon to improve the accessibility of the service as a result
  • design and content decisions have been based on user research, particularly taking into account that half of the users of this service are victims of abuse. The panel was particularly impressed with the design of the ‘Quick exit’ button, which will protect users of this service. The work done by this service team in this area could now positively impact more users of GOV.UK if it does end up being taken forward into the Design System
  • consideration has been given to testing the assisted digital support model and research has been used to try to reduce the dropout rate for the service

What the team needs to explore

Before their next assessment, the team needs to:

  • build on the research already done to ascertain the name of the service and do some further testing of how well users understand it and whether it reflects the scope of this part of the service. See the content review for more on this
  • ensure that the service is signposted properly within the relevant pages on GOV.UK at the start and end of the user journey for this service, continuing to work collaboratively with the content and service designers on the existing service pages, and the Manage service pages, when they have been redeveloped, including the Portal pages
  • consider how well the proposed Start page is understood by users, especially with regard to fees information and the content priority on these pages. Content sketching may be useful as a way to visualise content priority here
  • ensure that the following recommendations from the alpha assessment regarding the £20 application fee are explored in full before putting the service live:
    • explore removing or changing the £20 application fee
    • consider with policy colleagues the impact of the application fee on the user journey

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has tried to implement the recommendation of the previous service assessment to continue with the work to remove the need for Title in the personal details collection screens or make the list of options inclusive. However, the panel was concerned that the team had encountered challenges from within the business which were preventing this recommendation from being implemented. This will be flagged with DWP assurance, who the team should liaise with when continuing to implement this recommendation
  • existing GOV.UK design patterns have been used throughout the service
  • the team is working closely and collaboratively with other teams on the CMS to ensure consistency
  • careful consideration has been made for how language is used in the service, based on user research, for example when asking users for information about the other parent’s address

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that all design patterns are based on the GOV.UK design system, paying particular attention to use of radio buttons and personal details re-checks, and that all language used is consistent with the style guide, for example childcare rather than child care. See the accessibility review for further information on which design elements need to be reviewed again
  • review the content to make sure it fits within the context of the journey. For example, the ‘You need to apply by phone’ page appears if a user selects ‘No’ or ‘I do not know which country the other parent lives in’, and they might need more context or background about how that selection marries up with the explanatory information on this page
  • ensure that the content within the service is consistent within each page. For example, the Start page links to a different service for Northern Ireland users, but the first eligibility question in the non-Northern Ireland service asks whether the applying parent lives in England, Scotland, Wales or Northern Ireland, and the user can select ‘Yes’ here and still proceed
  • ensure that form input styles are consistent with the GOV.UK design patterns, for example, all ‘Check answers’ content. The service asks users to enter their date of birth and gives a suggested format, which is consistent with the recommended design patterns - ensure that the date format is then also consistent with the ‘Check answers’ content formatting
  • ensure that the user satisfaction survey link is added to the beta banner for the start of the public beta phase

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has clear and realistic targets for encouraging uptake of the digital service throughout the public beta phase

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work towards these targets and flag any delivery concerns

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has collected extensive data from a number of sources, including Google Analytics, server logs, the CRM application and caseworkers
  • the team has been careful to consider and monitor its most vulnerable users
  • the team tracks the number of users who opt out of Google Analytics, which will confirm the validity of their statistics over time
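One way to track the analytics opt-out rate described above is to compare server-side request counts (which see every visit) with the pageviews recorded by Google Analytics. This is a hedged sketch of that comparison, not the team's actual implementation, and the figures in the usage note are invented:

```python
# Hypothetical sketch: estimate the share of traffic invisible to
# Google Analytics by comparing server-log pageviews (complete) with
# GA-recorded pageviews (excludes users who opted out or block tracking).

def opt_out_rate(server_log_pageviews: int, ga_pageviews: int) -> float:
    """Fraction of pageviews not captured by analytics."""
    if server_log_pageviews <= 0:
        raise ValueError("server log count must be positive")
    # GA can never see more than the server did; clamp at zero
    missing = max(server_log_pageviews - ga_pageviews, 0)
    return missing / server_log_pageviews
```

For example, 10,000 logged requests against 8,200 GA pageviews would suggest roughly 18% of visits are invisible to analytics, which is the kind of figure that confirms (or undermines) the validity of GA-based statistics over time.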

What the team needs to explore

The team relies on other applications for data and feeds data into other systems.

Before their next assessment, the team needs to:

  • explore using data quality statistics from the applications it exchanges data with, covering both the data it passes across and the data it receives

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has looked at a range of existing metrics to inform the KPIs
  • the team tracks the four mandatory KPIs - with a change to just one of them to align with the business priority around cost saving
  • the team has a number of live dashboards which are used to track key metrics
  • the dashboards closely align to project objectives and help inform on both the status of the service and future improvements. On one page, where the user has to generate a seven-digit PIN, the team was seeing a higher failure rate. They improved the layout of the page and fed the results to the relevant team

What the team needs to explore

Before their next assessment, the team needs to:

  • continue the work described under point 15 above

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard, as the team has reached out to the Performance Platform team.

What the team has done well

The panel was impressed that:

  • the team has a wide range of metrics ready to report to the Performance Platform once it is available

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has demonstrated the service to the relevant minister and has received positive feedback

Updates to this page

Published 5 September 2022