Apply for help arranging Child Maintenance alpha assessment
The report for DWP's Apply for help arranging Child Maintenance alpha assessment on 24 October 2019
Digital Service Standard assessment report
Apply for help arranging Child Maintenance
From: GDS
Assessment date: 24 October 2019
Stage: Alpha
Result: Met
Service provider: Department for Work and Pensions (DWP)
Service description
The Child Maintenance Service (CMS) enables parents to set up a financial arrangement to cover their child’s living costs if they cannot make a private, family-based arrangement.
Service users
Primary user group:
Receiving parents (parents who will be receiving the money from the other parent)
- in contact with the other parent
- vulnerable user (may have experienced domestic abuse)
- not in contact with the other parent
- high conflict
- low conflict
Secondary user group:
- Agents (DWP operational staff)
- Paying parents (those paying the money)
- People with care (not parents, but perhaps grandparents or guardians, who have care of the child)
1. Understand user needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team have focused on conducting research with users who have the highest conflict with their ex-partners and the least amount of contact, as these are the users with the strongest needs; this was expressed through the persona Michelle
- the team have done a lot of research aimed at understanding their users’ context by listening to calls through the contact centre, running a diary study, understanding the supportive roles of organisations and visiting contact centres
- the team have identified and conducted research in the areas of the country where their users have:
- low digital skills
- English as a second language
- the team are aware of the need to examine whether information collated through the digital service is usable by caseworkers and whether it adds to their current workload
What the team needs to explore
Before their next assessment, the team needs to:
- work towards producing a single, overarching user need that connects all current and potential users of the service
- consider how they can conduct end-to-end user research. Letters, emails and user expectations once users complete their application will become core areas of the user journey. Testing and iterating the end-to-end service can ensure that call volumes to the contact centre do not increase. The panel recommends testing these elements in future usability tests
- conduct further research with users with access needs. Although some testing with users with visual impairments has been conducted, further research with users with all types of access needs, such as visual, cognitive, physical and hearing impairments, must be done to ensure that the service is accessible
- consider all user groups. The team have rightly focused on conducting research with the users with the strongest needs, such as receiving parents. However, the team should also consider conducting research with paying parents, agents and users with cultural differences
2. Do ongoing user research
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team conducted 60 usability tests without having a recruitment budget. They showed creativity by recruiting via their contact centres, jobcentres and pop-up research, and through a range of domestic abuse organisations such as Women’s Aid, NIDAS and CAB
- the team gathered some insightful analytics at an early stage of the service development, which helped direct the focus of the research, such as:
- the split between Direct Pay and Collect & Pay
- the split between paying parents and receiving parents
- the volume of domestic abuse claimants
- in private beta, the team plans to use:
- further analytics
- analysis of survey responses
- retrospective interviews with users that have recently completed their journey
What the team needs to explore
Before their next assessment, the team needs to:
- understand users’ expectations of how long it will take DWP to contact their ex-partner and how long it will take for them to receive payment. Core questions still include:
- will users require application updates
- will users cause an increase in contact centre calls by seeking reassurance
- understand how users will manage with a payment reference, application reference, Government Gateway ID and a password
- understand differences related to user locations, for instance whether certain types of support are available only in some areas and whether levels of awareness vary
- understand if the service works to the same quality standard across all devices
- explore if the current method of determining the number of nights a child may spend with each parent works for users with the most complex childcare arrangements
- consider how to conduct live usability testing
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- a full multidisciplinary team is in place, co-located in the Newcastle service centre
- the mix of civil servants and contractors is weighted towards civil servants
- the team demonstrate strong agile work practices
What the team needs to explore
Before their next assessment, the team needs to:
- consider whether the governance in place is appropriate and not a burden on the service team
- consider how the service team will scale to support public beta
4. Use agile methods
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working in an agile way in a co-located space which encourages collaboration
- the team sought and are open to input from stakeholders and have engaged with policy colleagues to inform changes to legislation
What the team needs to explore
Before their next assessment, the team needs to:
- consider hosting cross-Child Maintenance Group show and tells
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team demonstrated a number of iterations to the service based on user research
What the team needs to explore
Before their next assessment, the team needs to:
- expand user research to a wider variety of users
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- a range of programming languages could have been used to develop the Child Maintenance service, and the team clearly explained the reasons for choosing Node.js as the development language
- the team has implemented comprehensive monitoring of the service, so the team is alerted when the service encounters issues; this monitoring is also linked to the automatic shuttering of the service (see the sketch after this list)
- the team is using existing backend services to implement the Child Maintenance service, with RESTful integration to access them
- the service is built on a platform where the service will auto scale to meet demand
- the team make use of the DWP Tech Radar to make decisions on the tools and technologies to be used in the development of the service
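The panel did not review the monitoring code itself. As a minimal, illustrative sketch of the pattern described above, assuming an Express-based Node.js service with an invented error tolerance and time window (none of the names or thresholds below are the team’s actual implementation):

```javascript
// Illustrative sketch only: error monitoring linked to a shutter flag.
// Express, the window size and the tolerance are assumptions, not DWP code.
const express = require('express');

const WINDOW_MS = 60 * 1000;  // rolling one-minute window (assumed)
const ERROR_TOLERANCE = 0.1;  // shutter if over 10% of requests fail (assumed)
let recentRequests = [];      // { time, failed } entries within the window
let shuttered = false;

function recordRequest(failed) {
  const now = Date.now();
  recentRequests.push({ time: now, failed });
  recentRequests = recentRequests.filter(r => now - r.time < WINDOW_MS);

  const failures = recentRequests.filter(r => r.failed).length;
  if (recentRequests.length >= 20 && failures / recentRequests.length > ERROR_TOLERANCE) {
    shuttered = true; // trip the shutter; an alert would also be raised here
  }
}

const app = express();

// Record the outcome of every request once its response has finished.
app.use((req, res, next) => {
  res.on('finish', () => recordRequest(res.statusCode >= 500));
  next();
});

module.exports = { app, isShuttered: () => shuttered };
```

In practice the alert would be raised through the team’s monitoring tooling rather than a local flag; the sketch only illustrates the link between error tolerances and shuttering.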
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the service is developed to allow the transition to AWS from Crown Hosting to be as straightforward as possible
- ensure the monitoring and logging continue to evolve to meet the service’s needs as the service grows
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team have a clear and well defined approach to security for the service
- the service team provided a clear explanation and presented a plan of how security is applied to the development of their service and how it sits at the forefront of their development process
- all data stored locally by the service is encrypted at rest (see the sketch after this list) and all communications traffic is secured by SDX. The service team also work closely with the security team to ensure the service meets all security standards
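The panel saw the security approach at a high level rather than the implementation. As an illustrative sketch only, field-level encryption at rest in a Node.js service could look like the following; the algorithm choice (AES-256-GCM) and the key handling via an environment variable are assumptions, not details confirmed by the team:

```javascript
// Illustrative sketch only: encrypting a field at rest with Node's crypto module.
// Key management (an environment variable) and AES-256-GCM are assumptions.
const crypto = require('crypto');

const key = Buffer.from(process.env.DATA_ENCRYPTION_KEY, 'hex'); // 32-byte key, supplied as hex (assumed)

function encryptField(plaintext) {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Store the IV, auth tag and ciphertext together so the field can be decrypted later.
  return Buffer.concat([iv, tag, ciphertext]).toString('base64');
}

function decryptField(stored) {
  const data = Buffer.from(stored, 'base64');
  const iv = data.subarray(0, 12);
  const tag = data.subarray(12, 28);
  const ciphertext = data.subarray(28);
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}

module.exports = { encryptField, decryptField };
```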
What the team needs to explore
Before their next assessment, the team needs to:
- integrate with Government Gateway for user sign-on to the service
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team clearly articulated their approach for making all of the source code open and reusable for the Child Maintenance Service.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the ‘coding in the open’ standard is followed for all future development
- ensure any private repositories that do not contain sensitive details are made public
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team uses common standards including REST, HTML and the CASA framework to develop their service
- the team identified services that can be reused rather than developed to implement their service; examples include Bank Wizard and the ATOS Payment Gateway
- the Child Maintenance service has integrated with the BPM middleware to handle the queueing and delivery of messages to Siebel CMS (see the sketch after this list)
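The panel did not examine the integration code, and the real BPM and Siebel interfaces were not shown in detail. The sketch below only illustrates the general shape of a RESTful hand-off to a queueing layer; the URL, authentication header and payload fields are hypothetical:

```javascript
// Illustrative sketch only: submitting an application message to a queueing
// endpoint over REST. The URL, auth token and payload shape are assumptions.
// Assumes the global fetch available in Node.js 18 or later.
async function queueApplication(application) {
  const response = await fetch('https://bpm.example.internal/queues/cms-applications', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.BPM_API_TOKEN}`,
    },
    body: JSON.stringify({
      applicationReference: application.reference,
      receivedAt: new Date().toISOString(),
      payload: application,
    }),
  });

  if (!response.ok) {
    throw new Error(`Queueing failed with status ${response.status}`);
  }
  return response.json(); // e.g. an acknowledgement containing a message id
}

module.exports = { queueApplication };
```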
What the team needs to explore
Before their next assessment, the team needs to:
- continue to use existing standards and platforms for the future development of the service
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team demonstrated that the Child Maintenance service implements a comprehensive testing strategy linked to their continuous integration pipeline, which automatically executes tests (see the sketch after this list)
- the team described their end-to-end approach to testing to ensure the service will meet the required testing standards
- the team uses several industry standard tools to test their service
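The panel did not review the test suite itself. As a hedged illustration of the kind of check the pipeline could execute automatically, the sketch below uses Node’s built-in test runner against a hypothetical validation helper (the helper and its rules are invented for illustration):

```javascript
// Illustrative sketch only: an automated test a CI pipeline could run.
// validatePostcode is a hypothetical example, not the team's code.
const test = require('node:test');
const assert = require('node:assert');

function validatePostcode(postcode) {
  // Simplified UK postcode check, for illustration only.
  return /^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$/i.test(postcode.trim());
}

test('accepts a well-formed postcode', () => {
  assert.strictEqual(validatePostcode('NE1 4XL'), true);
});

test('rejects an empty value', () => {
  assert.strictEqual(validatePostcode('  '), false);
});
```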
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the time taken to run the tests as part of continuous integration does not hinder the turnaround time for a build as the size of the service increases
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the Child Maintenance service has automatic shuttering which responds to errors and displays the shutter page based on tolerances (see the sketch after this list)
- the service can also be shuttered at a higher level if necessary, from Akamai
- the telephony route is available to users to apply to the Child Maintenance service, should the service be offline for any length of time
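The shuttering behaviour was demonstrated rather than walked through in code. A minimal sketch, assuming an Express middleware and the hypothetical shutter flag sketched under point 6, might look like this (file paths and module names are illustrative):

```javascript
// Illustrative sketch only: serving a shutter page when the shutter flag is set.
// isShuttered() is the hypothetical flag from the monitoring sketch under point 6.
const path = require('path');
const { isShuttered } = require('./monitoring'); // hypothetical module name

function shutterMiddleware(req, res, next) {
  if (isShuttered()) {
    // Respond with a 503 and the shutter page. Higher-level shuttering from
    // Akamai would intercept traffic before it reaches the service at all.
    return res.status(503).sendFile(path.join(__dirname, 'views', 'shutter.html'));
  }
  next();
}

module.exports = shutterMiddleware;
```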
12. Make sure users succeed first time
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team tested and iterated the GOV.UK start page and experimented with interrupt pages to make sure users understood they only need limited information about the other parent
- they are researching with users from diverse demographics, including users with low language and digital skills
- the team are carrying out research with users who have experienced abusive behaviours to understand their needs and make sure the design is inclusive and gives reassurance
- they are working with telephone and in-person support teams to understand the service they provide
What the team needs to explore
Before their next assessment, the team needs to:
- continue work to remove the question checking if the user has reported abusive behaviours
- work with internal staff to give accurate advice on the differences between Direct Pay and Collect and Pay
- monitor users’ behaviour when selecting Direct Pay or Collect and Pay and iterate if needed
- monitor how successful the ‘other parent’ data is and consider removing some of the questions
- work with colleagues who are reviewing the letters and have a plan for testing and making changes
- explore removing or changing the £20 application fee
13. Make the user experience consistent with GOV.UK
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the service is consistent with GOV.UK patterns and style
- the team is working with other services and departments on common patterns including ‘when can we call you’ and bank details
- the team researched sensitive language such as ‘domestic abuse’
What the team needs to explore
Before their next assessment, the team needs to:
- review and fix small inconsistencies in content and design, including:
- use curly apostrophes
- consider stacked or inline radio buttons
- content for users who need to apply by phone because they live overseas
- consider whether to use questions or statements on the ‘check your answers’ page
- continue to improve and iterate how users estimate the shared care arrangements
- explore how other services use the ‘add/edit a list’ pattern, and make sure the current design only uses one heading level 1 (H1)
- continue work to remove the need for title or make the list inclusive
- update the prototype so it meets the accessibility standards
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working closely with other service teams and operational colleagues to encourage take up of the digital service
What the team needs to explore
Before their next assessment, the team needs to:
- continue to look at ways to reduce friction between the telephony and digital channels
- consider with policy colleagues the impact of the application fee on the user journey
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working with dedicated performance analysts to identify and implement performance metrics
- the team were able to demonstrate solid plans and tools for collecting performance data during beta
- the service team were able to discuss how they had decided on what data they needed to capture and how it will enable continuous improvement of the service
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how the data collected during private beta has been used to iterate the service
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to demonstrate solid plans for collecting performance data during beta
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how the data collected is being used to iterate the service
- demonstrate how the service is adding value to government
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the service team were able to demonstrate that thorough planning had taken place to identify performance metrics
18. Test with the minister
Not applicable in the alpha phase.