Apply for Pension Credit beta assessment

The report for DWP's Apply for Pension Credit beta assessment on 15 July 2020

Service Standard assessment report

Apply for Pension Credit

From: Government Digital Service
Assessment date: 15 July 2020
Stage: Beta
Result: Not met
Service provider: Department for Work & Pensions (DWP)

Service description

The service helps pension-age citizens who do not have enough money to live on in retirement to claim Pension Credit (PC), a DWP benefit, through their channel of choice. The amount a citizen receives is based on how much other money they have in income, assets and savings. This service makes it easier and quicker for citizens to claim and reduces the amount of information they need to provide to DWP.

The current focus of Apply for Pension Credit is to provide an online journey for citizens to complement the existing telephony and postal channels. The ultimate goal of the service is to make it easier for users to get Pension Credit by transforming the end-to-end user journey across all channels, including the replacement of ageing legacy systems used by DWP advisers.

Service users

Primary users

  • Pension-age citizens who do not have enough money to live on

Secondary users

  • Power of attorney
  • Appointees including corporate appointees
  • Friends and family of pension-age citizens
  • Charities and advice bodies
  • DWP advisers and G4S staff who triage people who claim by telephone

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good understanding of their primary users and the barriers to claiming from previous research and has re-used it for this project
  • the team has a good understanding of internal and external secondary users and established links to work with them, for example, telephony advisers and charities
  • there is some understanding of how users have managed to apply during the pandemic through interviews with charities
  • there is some understanding of the issues users encounter in the new service, but this is only from interviews following the application being submitted
  • the team identified their riskiest assumptions and are monitoring how these are playing out as best they can - although understanding is limited due to the absence of usability testing

What the team needs to explore

Before their next assessment, the team needs to:

  • establish a way to undertake remote usability research - the panel acknowledges that organisational issues are preventing screen sharing, but this must be resolved urgently so the team can establish whether the service is meeting user needs and iterate effectively
  • review and update the users’ needs based on what is being learnt - the world has changed for users and this needs to be reflected
  • consider ways to establish the completion rate of the service for those completing it without help - these are the users most at risk of being disadvantaged, and more work is required to establish whether their needs are being met
  • ensure work is prioritised to focus on whether the new service is working - looking at parts of the business-as-usual process such as HRT and letters is important, but possibly not the most urgent task
  • make the best use of what is available to help them understand user experience until the usability testing issues are resolved, for example analysing and acting upon feedback responses and telephony enquiries

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done a great job of setting the right scope for the service to commit to and deliver, resulting in clear communication with stakeholders and the right expectations being set
  • the scope of delivering the online form is well defined, and the right team has been put in place with the right level of governance and communication to enable it to deliver its committed tasks

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure they have a view of the overall service that includes more than just the web form, for example understanding user needs from the perspective of the contact centre and the other agents involved
  • ensure that the availability of the service is communicated well enough that users can easily find and use it

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has some visibility of the other channels of the service and recognises that these need to be addressed in order to provide a unified service to users across all channels

What the team needs to explore

Before their next assessment, the team needs to:

  • establish whether users’ expectations about contact in the rest of the journey are met - the web form is the only online element, and it was unclear whether users understand that they will receive letters and need to report changes by telephone
  • understand whether users need a way to switch to the phone if they get stuck or have questions during the online journey

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service makes good use of standard GOV.UK patterns and components
  • eligibility is dealt with clearly at the start, although possibly not all eligibility - see below
  • in general, there is good separation of questions and use of one thing per page, but not always - see below

What the team needs to explore

Before their next assessment, the team needs to:

  • resolve the issues with usability testing so they can establish whether the service meets user needs and test improvements to the prototype
  • work with users completing the service without help to establish whether questions are understood and interpreted as intended
  • continue to look at what information from State Pension can be reused to lessen the burden on users
  • establish a way for users to leave and return to the service without losing progress
  • check and update things like font sizes and colours, as there have been some updates to the Design System
  • improve validation - for example, you can enter a date far in the past for ‘date of claim’
  • understand what factors might be causing applications from people who are not entitled to claim - some of these may be fixable in the service, others not
  • consider using the task list pattern, as the service is quite long
  • investigate ways to make the income and savings interaction clearer and simpler, as this aspect of the service is complicated
  • avoid certain patterns - some questions reveal multiple further questions on the page, for example ‘Are you an asylum seeker?’ - revealing should be kept to simple, short content
  • avoid potentially wasting users’ time - some eligibility questions, such as those about living abroad or coming to the UK, appear to be at the end - if so, they need to be brought to the start

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the service makes good use of standard GOV.UK Design System components and patterns that have been tested for accessibility - however, this alone does not guarantee the overall accessibility of the service, or its usability for disabled people

What the team needs to explore

Before their next assessment, the team needs to:

  • undergo an accessibility audit and share the results
  • test the service with disabled users

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team has the appropriate members and its morale seems to be very good - being able to bring the right skill sets together from the various teams within the department is something to be very proud of
  • a representative of the policy team was involved in the right engagements and brought along throughout the delivery of committed tasks in the team’s sprints
  • the team is supported by subject matter experts from operations and policy, and by a performance analyst who is shared with other teams and has visibility of performance KPIs and data across the wider teams - the team is also supported by a DevOps function that works across the retirement, bereavement and care area
  • the team brings any changes to its design authority - every component is standards-based and can be taken up by others in the team and support units - there is a technical guide that helps with onboarding new team members, and anyone in the team can deploy by following documented steps

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using standard agile ways of working, which means work handed over to other team members and teams can easily be picked up and delivered

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that what is being delivered, communicated and shared with stakeholders is consistently documented to a more mature level - some documentation is already very good, and the panel applauds this

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done a good job of making changes to the user journey based on the limited testing responses - the panel acknowledges that the situation was not ideal and that there have been multiple challenges, but the effort that has been put in is worth noting

What the team needs to explore

Before their next assessment, the team needs to:

  • iterate the service further based on the various ways of testing it with users - user research at this stage requires further technology improvements to make it happen - and ensure improvements are also made based on usability testing

9. Create a secure service which protects users’ privacy

The service team was planning to implement a penetration testing tool (OWASP ZAP), which was taken out because of reliability issues. Agile threat modelling is being initiated. The team is currently working with the SIRO, which involves sharing ITHC results. An external pentest was run and found no high-priority security threats. Any medium or low priority threats from the pentest are being dealt with, but some refer to the wider platform and so will need to be dealt with in coordination with others. A DPIA has been carried out, and there are 2 risks, on data retention and the right to be informed, that are being dealt with effectively (for joint claims, for example - but this is more a legal audit issue than a technology one).
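
As an illustration only, the sketch below shows one common way a ZAP baseline scan could be reintroduced as a pipeline step, using the official ZAP Docker image. The report does not describe how the team originally wired ZAP in, and the image name, target URL and report filename here are assumptions.

  # Minimal sketch only: run an OWASP ZAP baseline (passive) scan against a
  # test environment from a pipeline step. The target URL and report name are
  # placeholders, not the team's real configuration.
  import os
  import subprocess
  import sys

  TARGET = "https://test.example.gov.uk/apply-for-pension-credit"  # placeholder

  def run_zap_baseline(target: str) -> int:
      """Run the ZAP baseline scan via Docker and return its exit code."""
      result = subprocess.run(
          [
              "docker", "run", "--rm",
              "-v", f"{os.getcwd()}:/zap/wrk:rw",           # mount cwd for the report
              "owasp/zap2docker-stable", "zap-baseline.py",
              "-t", target,                                  # URL to scan
              "-r", "zap-baseline-report.html",              # HTML report in the mounted dir
          ],
          check=False,
      )
      return result.returncode  # non-zero if ZAP raised warnings or failures

  if __name__ == "__main__":
      sys.exit(run_zap_baseline(TARGET))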

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • there is engagement at corporate level with the appropriate risk management structure
  • recommendations from the peer review were addressed
  • there is wider thinking about legal issues beyond the purely technological aspects

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that the approach to security is ongoing and planned with the SIRO and other security officers

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are actively using performance data to help iterate changes to the service
  • the team ran a performance framework session to carefully determine the KPIs and metrics needed to understand whether the service is successful - the project team have full ownership of the performance framework and iterate it as new features are added to the service
  • the team have used performance data to help make changes to the service
  • while not a full member of the project team, the performance analyst has been dedicated to the project from the start - there is a good process in place where the team can feed hypotheses about how users are using the service to the performance analyst to investigate and feed back to the team
  • the team is collecting data for the GDS-mandated KPIs - they are still working on calculating the cost per transaction
  • the team is using Google Analytics with Google Tag Manager, managed by the DWP central analytics team
  • they are not passing any personally identifiable information (PII) through to analytics - an illustrative sketch of this kind of safeguard follows this list
  • the analytics implementation has been signed off by the project SIRO
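
As an illustration only - the report confirms that no PII is sent to analytics but not how that is achieved - the following minimal sketch shows one way page URLs could be scrubbed of identifying query parameters before being passed to an analytics tool. The parameter names are hypothetical.

  # Minimal, illustrative sketch: redact potentially identifying query
  # parameters from a page URL before it is sent to analytics. The parameter
  # names below are hypothetical examples, not the service's real fields.
  from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

  SENSITIVE_PARAMS = {"name", "nino", "postcode", "dob"}  # hypothetical

  def redact_page_path(url: str) -> str:
      """Return the URL with sensitive query parameter values replaced."""
      parts = urlsplit(url)
      query = [
          (key, "redacted" if key.lower() in SENSITIVE_PARAMS else value)
          for key, value in parse_qsl(parts.query, keep_blank_values=True)
      ]
      # Drop any fragment and rebuild the URL with the cleaned query string
      return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), ""))

  # Example: redact_page_path("https://example.gov.uk/check?postcode=SW1A1AA&step=2")
  # returns "https://example.gov.uk/check?postcode=redacted&step=2"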

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate the possibility of using additional data sources to calculate their completion rates for the service, considering the limitations of cookie consent and the reduction in the volume of performance data being sent to Google Analytics
  • map their success criteria within their performance framework back to their user needs

11. Choose the right tools and technology

The technology stack and tools for the delivery and maintenance of the code are modern and appropriate to the departmental tech stack.

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used well supported technologies that are well established within the department

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that there is a plan to keep supporting the tools for the lifetime of the service

12. Make new source code open

This service has an open source pipeline which includes risk assessment, peer reviews and security checking. At the moment the code is not open to the public as the Service Standard requires at this stage of development, but there is a pipeline for the internal GitLab repository. At some point the code will be pushed into the open, and there is currently a prototype on GitHub. The code will use the ISC Licence, which is what DWP recommends. The team has a licence checker in place to check the licences of dependencies. The team’s development behaviours support open source, but they are still working towards pushing to a public repository. The team uses DWP open source repositories such as the form builder, PDF generator and other components.
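
As an illustration only, a dependency licence check of the kind mentioned above might look like the sketch below, here assuming a Python environment and a hypothetical allow-list of permissive licences; the report does not describe the team's actual checker or its rules.

  # Minimal sketch, assuming a Python environment: flag installed dependencies
  # whose declared licence classifiers are not on an allow-list. The allow-list
  # here is a hypothetical example, not DWP's actual policy.
  from importlib import metadata

  ALLOWED = {"ISC License (ISCL)", "MIT License", "Apache Software License"}

  def check_licences():
      """Return (package, licences) pairs whose licences are not on the allow-list."""
      problems = []
      for dist in metadata.distributions():
          classifiers = dist.metadata.get_all("Classifier") or []
          licences = {
              c.split("::")[-1].strip() for c in classifiers if c.startswith("License")
          }
          if licences and not licences & ALLOWED:
              problems.append((dist.metadata["Name"], licences))
      return problems

  if __name__ == "__main__":
      for name, licences in check_licences():
          print(f"{name}: {', '.join(sorted(licences))} not in allow-list")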

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the approach to open source is built into the development and release model
  • the thinking around licensing and the choice of licence is consistent with other corporate guidelines
  • there is a roadmap to releasing the source code with good governance

What the team needs to explore

Before their next assessment, the team needs to:

  • actually release the code and make sure the releases are consistent with the code used in live

13. Use and contribute to open standards, common components and patterns

There are no common government platforms in use, but Notify is on the radar for future sprints. All infrastructure is defined as code in Terraform and the service is Dockerised; there are no bespoke AWS services, so everything is open standards compliant and can be moved easily without risk of lock-in. The service emulates the paper-based form and therefore does not verify people’s identities - but payments only go to applicants who already have a pension, so there is little risk of fabricated identities.

Decision

The service met point 13 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the adoption of Notify as the standard platform for sending letters and SMS, in order to reduce the footprint of legacy dependencies and move towards standard platforms - a minimal sketch of what this could involve follows
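
As an illustration only, sending an SMS and a letter through GOV.UK Notify with the official notifications-python-client library might look like the sketch below; the API key, template IDs and personalisation values are placeholders, and the choice of the Python client is an assumption.

  # Minimal sketch: send an SMS and a letter via GOV.UK Notify using the
  # official Python client. All IDs and personalisation values are placeholders.
  from notifications_python_client.notifications import NotificationsAPIClient

  client = NotificationsAPIClient("api-key-from-notify")  # placeholder API key

  # SMS confirming that a claim has been received (hypothetical template)
  client.send_sms_notification(
      phone_number="+447700900123",
      template_id="sms-template-id",
      personalisation={"reference": "PC-12345"},
  )

  # Letter using a Notify letter template; the address is supplied as
  # personalisation fields alongside any other template placeholders
  client.send_letter_notification(
      template_id="letter-template-id",
      personalisation={
          "address_line_1": "A N Example",
          "address_line_2": "1 Example Street",
          "address_line_3": "Exampletown",
          "postcode": "EX1 2MP",
          "reference": "PC-12345",
      },
  )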

14. Operate a reliable service

From a technology perspective, the service looks reliable. It processes about 250 to 300 claims a day, which the containerised approach handles well within capacity, and the team has designed the service so that it can handle spikes. DDoS protection is provided by Akamai and there are payload size limits. Database backups are taken nightly, so recovery is possible with only a limited number of requests lost. Monitoring is effective, using native Amazon monitoring and CloudWatch. Live service support is notified of incidents and the DevOps team is alerted, with notifications going to Slack. Hours of support are 08:00 to 18:00, in line with DevOps support, which was raised as an amber risk during the peer review.
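
As an illustration only, the alert routing described above often follows a standard pattern in which CloudWatch alarms publish to an SNS topic and a small Lambda function forwards a summary to a Slack incoming webhook; the sketch below shows that pattern, not the team's actual implementation, and the webhook environment variable is an assumption.

  # Minimal sketch of a common CloudWatch-to-Slack pattern: a Lambda function
  # subscribed to the SNS topic that CloudWatch alarms publish to forwards a
  # short summary to a Slack incoming webhook. Not the team's actual setup.
  import json
  import os
  import urllib.request

  def handler(event, context):
      """Forward CloudWatch alarm notifications delivered via SNS to Slack."""
      for record in event["Records"]:
          alarm = json.loads(record["Sns"]["Message"])
          text = (
              f"{alarm['AlarmName']} is {alarm['NewStateValue']}: "
              f"{alarm['NewStateReason']}"
          )
          request = urllib.request.Request(
              os.environ["SLACK_WEBHOOK_URL"],  # assumed configuration
              data=json.dumps({"text": text}).encode("utf-8"),
              headers={"Content-Type": "application/json"},
          )
          urllib.request.urlopen(request)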

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the architecture is solid and capacity well planned
  • there is live services/DevOps support based on alerts
  • appropriate monitoring for the tech stack is used

What the team needs to explore

Before their next assessment, the team needs to:

  • explore ways to make sure requests between the latest nightly backup and a data loss incident are recoverable. Although this might not always be possible, it would be good to see some evidence of what has been attempted
  • provide more evidence that 08:00 to 18:00 support is enough to make sure the service is reliable

Updates to this page

Published 24 January 2022