Appeal a Benefit Decision beta assessment report
The report for the Appeal a Benefit Decision beta assessment on 2 July 2018
Service Standard assessment report
Appeal a Benefit Decision
From: Government Digital Service
Assessment date: 2 July 2018
Stage: Beta
Result: Met
Service provider: Her Majesty’s Courts and Tribunals Service (HMCTS)
The service met the Standard because:
- the service team demonstrated a strong understanding of user needs and has iterated the service based on those needs
- the service team has taken on recommendations from its alpha assessment, joining up two disjointed services into a single, more holistic service
- the service team has shared a good technology plan and documentation
About the service
Description
The ‘Appeal a Benefit Decision’ service enables benefit claimants who have received a mandatory reconsideration decision they do not agree with to submit their appeal online to the independent tribunal. These are claimants whose benefits have been stopped or reduced by the Department for Work and Pensions (DWP).
Once their appeal is submitted, they are kept updated by email and SMS notifications (if they sign up). They can also track their appeal online and access supporting content.
Service users
Benefit claimants who want to appeal a decision by DWP to reduce or remove their benefits, professional representatives who help claimants manage their appeals, friends and family of the benefit claimant registering themselves as representatives, HMCTS staff and the judiciary.
Detail
User needs
The panel was impressed with the level of research that has been conducted across the alpha and beta stages of the service. Consequently, the team demonstrated that it has a good understanding of the service’s users and their needs.
Research has involved conducting contextual interviews to understand needs and pain points, surveys to help segment the user base, and usability sessions to improve and iterate the service.
Although the majority of usability sessions have been conducted in the lab, the team has managed to engage with organisations that have enabled it to test the service with specific groups, for example Welfare Rights organisations, Citizens Advice Bureau and Law Centres. This has also enabled the team to produce well-developed personas for the service - seven in total, including two developed through the research conducted in private beta.
Although the team has conducted an accessibility audit through the Digital Accessibility Centre (DAC), difficulties with recruitment have meant that ongoing research with people who have access needs, and those at the lower end of the digital inclusion scale, has proved difficult. However, this is an area of research the team should be able to focus on once the reach of the service increases in public beta. The team should also look to further leverage its relationship with the Good Things Foundation to enable access to users across hard-to-reach groups, and to test the support model.
Overall, user research is well embedded in the team, with all members participating in research activities. This was evident in the team’s understanding of its users, its confidence in talking about the research conducted, and the problems that research has helped it address.
Team
The panel felt that the service had a strong team in place to develop the service during public beta. It was clear from the assessment that the service team has a strong agile culture, using consistent and effective tooling.
The service team seems to have a good balance in its multidisciplinary team across technology, user research, product and content roles. That said, the panel was concerned that the reliance on, and overall balance of, contractors and service contracts relative to civil servants is not sustainable. The panel understood that the overall HMCTS transformation programme has plans to address this, and suggests that demonstrable progress on addressing this balance is made during public beta.
The panel felt that the service team had a clear expectation of how to run the service in public beta. The panel suggests that the service team makes its plans and monitoring metrics for public beta clear and keeps them regularly updated.
Technology
Social Security and Child Support (SSCS) has the largest expected digital uptake across the Reform Programme and will be making Submit Your Appeal (SYA) available to citizens as part of the next major release. With the service already online for several months, the volume of traffic through Track Your Appeal (TYA) and SYA is expected to increase. There is concern over performance due to the new SYA functionality and increased usage of TYA.
The team consists of three backend developers, three frontend developers, one solution architect, three testers, two business analysts, one delivery manager and one product owner.
The team uses RESTful APIs for various components. These APIs need to reach a greater level of maturity, and the team plans to look into the API maturity model and adapt to it.
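As an illustration of the kind of resource-oriented design an API maturity model encourages, the sketch below shows a minimal Spring Boot endpoint in the Java stack the team describes. The Appeal resource, its fields and the in-memory store are hypothetical, standing in for the team’s real case data integration.

```java
import java.net.URI;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/appeals")
public class AppealController {

    // Hypothetical resource, standing in for the real case data model.
    public static class Appeal {
        private String id;
        private String benefitType;

        public Appeal() { }

        public Appeal(String id, String benefitType) {
            this.id = id;
            this.benefitType = benefitType;
        }

        public String getId() { return id; }
        public String getBenefitType() { return benefitType; }
        public void setId(String id) { this.id = id; }
        public void setBenefitType(String benefitType) { this.benefitType = benefitType; }
    }

    // In-memory store, standing in for CCD in this sketch.
    private final Map<String, Appeal> store = new ConcurrentHashMap<>();

    // GET /appeals/{id}: the URI names the resource, the HTTP verb carries the intent.
    @GetMapping("/{id}")
    public ResponseEntity<Appeal> get(@PathVariable String id) {
        Appeal appeal = store.get(id);
        if (appeal == null) {
            return ResponseEntity.notFound().build();
        }
        return ResponseEntity.ok(appeal);
    }

    // POST /appeals: creates the resource and returns 201 Created with its location.
    @PostMapping
    public ResponseEntity<Appeal> create(@RequestBody Appeal request) {
        String id = UUID.randomUUID().toString();
        Appeal saved = new Appeal(id, request.getBenefitType());
        store.put(id, saved);
        return ResponseEntity.created(URI.create("/appeals/" + id)).body(saved);
    }
}
```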
The following is in scope for the service:
- TYA front end
- SYA front end
- Tribunals Case API - creates the PDF for RDC and posts/gets data from Core Case Data (CCD)
- Notification Service
- CCD - by means of loading data into it
- idAM - by means of invoking other components
The following is out of scope:
- Case Loader - there are no changes to the way Case Loader works, and test environment performance raises no concerns at current use levels; the volume needs to be tested in production
- CCD - performance has been proven by the team separately
- email and SMS - these are handled by the GOV.UK Notify service
- Print Solution - physical printing is not required
The team was able to demonstrate that it works in an agile environment using open source tools. It uses Confluence and Jira for requirements management and source code collaboration.
The whole team participates in proposed component designs, which distributes knowledge around the team, and each design is peer reviewed by two other developers. Services are built with security rules enforced in the code, which helps with risk and vulnerability management.
The team uses Node.js and Express.js for frontend development, and Java 8 and Spring Boot for backend development. The database used by the service is PostgreSQL. The team uses SonarQube for continuous code quality inspection. The developers use Cucumber for testing and Gatling for load and performance testing. The team uses a branching strategy for development, making use of JUnit tests, smoke tests, integration tests and functional tests. Test-driven development uses CodeSafeJS, and the REST APIs are tested with REST Assured. The team uses Microsoft Azure cloud services.
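To give a flavour of the REST API testing described, the following is a minimal sketch of a JUnit smoke test using REST Assured; the base URI, the /appeals endpoint and the expected status value are hypothetical, for illustration only.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import io.restassured.RestAssured;
import org.junit.Before;
import org.junit.Test;

public class TrackYourAppealSmokeTest {

    @Before
    public void setUp() {
        // Hypothetical base URI, for illustration only.
        RestAssured.baseURI = "https://tya.example.gov.uk";
    }

    @Test
    public void returnsTheStatusOfAKnownAppeal() {
        given()
            .accept("application/json")
        .when()
            .get("/appeals/{id}", "1234567890")   // hypothetical endpoint
        .then()
            .statusCode(200)
            .body("status", equalTo("WITH_DWP")); // hypothetical status value
    }
}
```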
The team has shown plans and technology for providing accessibility for disabled people and users of assistive technology, but should recruit such users and provide more evidence at the next assessment.
Design
The service is logically structured, and the combination of submitting an appeal and tracking it online means that users are taken all the way from disagreeing with a DWP decision to their tribunal.
To make sure the entire journey works well (including the earlier steps of interacting with DWP), the team should work with GOV.UK and DWP to see whether it’s appropriate to create a step-by-step page guiding users through it.
The service uses a few patterns that aren’t yet in the GOV.UK Design System. These new patterns have been well researched, tested and iterated. The team should work with the Design System to make sure their work gets added to the public backlog repository to save other teams in government from duplicating their work, and continue to add new research findings there as they progress with their work.
Analytics
The team has a short list of Key Performance Indicators (KPIs) against which they intend to measure the success of the product, together with a longer list of other measures that could help in its development.
They explained why some of the KPIs they are using (for example quality) are not directly comparable with the metrics they’re collecting for the existing paper form, but that they were looking at how sampling may help with comparisons, which the panel would encourage.
The team does get data from the assisted digital contract holder, and plans to work with Citizens Advice to get data on the support they give to users.
Data on user satisfaction is collected after submissions are complete, but users are able to provide feedback throughout the process. The panel questions the decision to group ‘neutral’ ratings together with ‘very positive’ ratings, and would encourage a weighted measure instead.
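As a sketch of what a weighted measure could look like (the rating bands, counts and weights below are hypothetical, not the team’s data), each band contributes partial credit rather than ‘neutral’ being counted as fully positive:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SatisfactionScore {

    // Weighted average: each rating band contributes partial credit
    // according to its weight, rather than being grouped with another band.
    static double weightedScore(Map<String, Integer> counts, Map<String, Double> weights) {
        double total = 0;
        double weighted = 0;
        for (Map.Entry<String, Integer> entry : counts.entrySet()) {
            total += entry.getValue();
            weighted += entry.getValue() * weights.get(entry.getKey());
        }
        return total == 0 ? 0 : weighted / total;
    }

    public static void main(String[] args) {
        // Illustrative response counts - not real service data.
        Map<String, Integer> counts = new LinkedHashMap<>();
        counts.put("very positive", 120);
        counts.put("positive", 80);
        counts.put("neutral", 40);
        counts.put("negative", 15);
        counts.put("very negative", 5);

        // Illustrative weights on a 0-1 scale; neutral earns half credit.
        Map<String, Double> weights = new LinkedHashMap<>();
        weights.put("very positive", 1.0);
        weights.put("positive", 0.75);
        weights.put("neutral", 0.5);
        weights.put("negative", 0.25);
        weights.put("very negative", 0.0);

        // Prints 0.78 with the numbers above.
        System.out.printf("Weighted satisfaction: %.2f%n", weightedScore(counts, weights));
    }
}
```

On the same illustrative numbers, grouping ‘neutral’ with the positive ratings would report 0.92, while the weighted score is 0.78, which shows how much the grouping decision can flatter the result.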
The completion rate is currently very low – the team’s assumption is that potential users / intermediaries are testing the system but stopping before they would submit an appeal. The panel would like to see this investigated, and is interested in how this pattern will change in public beta.
Measuring completion rate and drop-off points becomes much more difficult when save and return is introduced, and the panel would like the team to make plans for how it will handle this.
The team has calculated cost per transaction across all channels, and has estimated the savings that expected digital take-up will create. There is resistance to publishing this figure, but it would be good to have a clearer understanding of why.
The Performance Platform is being developed but is not yet live, partly due to data formatting problems. This should be released before the service proceeds.
Overall, the panel was impressed with the current approach to analytics, and would expect this to continue and improve as more people use the service.
Recommendations
To pass the next assessment, the service team must:
- have a clear plan for how the service will be monitored and iterated to improve the user journey based on pain points
- work with GOV.UK and DWP to consider if a step-by-step guide is appropriate for users of this service
- contribute their new patterns and research to the GOV.UK Design System and continue to update it as they find out more in public beta research
- clearly establish positive impact on users before implementing a save and return feature
- conduct further usability testing with users who have access needs
- look into performance under the increased use of TYA and SYA
- provide evidence from accessibility testing and usage that the service supports the needs of users with access needs
- demonstrate progress in reducing the dependency on contractors, interim staff and service contracts, increasing the number of skilled permanent civil servants in key roles, particularly service design, user research and technical architecture
The service should spend a minimum of six to nine months in public beta, using a staged rollout approach. During this period the service team should look to hold four assessment workshops based around the user journey pain points.
The service team should also:
- workshop with Government Digital Service (GDS) six weeks before adding further benefit appeal categories
- ensure General Data Protection Regulation (GDPR) compliance by clearly identifying that users have given explicit permission for data sharing
- create a clear plan to increase HMCTS capability within the service team, enabling knowledge transfer and increasing civil service capability