Money and Pensions Service pension dashboard

Service Standard assessment report

MaPS pension dashboard

Assessment date: 02/11/2022
Stage: Alpha
Result: Not Met
Service provider: DWP Money and Pensions Service (MaPS)

Service description

This service aims to solve the problem of people losing track of their workplace pensions when they have many of them. Research has found that people have an average of about 11 employers in their lifetime, and that around 1.6 million pensions end up unclaimed in the UK. This can contribute to pension poverty or reliance on state support in retirement. To reduce this risk, the ‘pensions dashboard’ concept was a 2019 manifesto commitment and is now written into policy, with the intent of allowing people to access their pension information in a single place online, in a clear and simple way. The policy also intends to increase awareness and understanding of saving for retirement, to deliver innovation within the pensions industry and to improve choice for end users.

Service users

This service is for all UK citizens who have had a workplace pension. Research has found that user engagement with pensions is likely to peak as individuals get older and as they go through significant life events, such as starting a new job or going through a divorce.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has conducted research with a large group of users, using mixed methods to create an evidence base for the service
  • a host of evidence-based personas has been created to show who the users are and what they need. The personas have been mapped onto a lifelong user journey that indicates the differing levels of pensions awareness across age ranges. At each stage of the journey, the team has indicated whether pension engagement is low or high and has used this information to help understand where and how to target the service
  • user needs are written clearly and are not solution specific, enabling the team to assess a range of solutions for each need. For example, “I need to know what pensions I have so that I don’t miss out” is a good, solution-agnostic user need
  • users with access needs and low digital skills have been involved in research regularly, so the team has developed a sound understanding of the extra support these users may need. The team plans to use the GOV.UK Prototype Kit from Beta onwards so that it can test the service with assistive technology and continue to understand the behaviours and needs of assistive technology users.

What the team needs to explore

Before the next assessment, the team needs to:

  • consider alternative recruitment avenues to broaden their understanding of users across a range of demographics. The team was previously recruiting online only; having recognised the drawbacks of that approach, it has now begun pop-up testing in libraries as a secondary means of finding users. This is a great start, and the panel suggests the team continues pop-up research in non-library settings, as well as recruiting through offline methods, to broaden its sample. Since most of the working population is eligible to use the service, the team should have no difficulty recruiting a broader range of users
  • use different recruitment routes. During the assessment, the team and panel discussed the potential of recruiting via internal means: both recruiting civil servants for user research and using cross-government contacts to find out whether there are existing user research panels or contact lists that can be drawn upon. The panel strongly suggests the team uses these approaches to help meet this recommendation.

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that the team:

  • has understood the problem space for end users - in particular, the ways that people engage with their pensions now and the common pain points
  • is engaging with the other organisations that provide pension services and information, including pension providers, other government services, regulators and industry groups
  • has understood a complex landscape, including a lot of existing work in this subject area and many legal and technical constraints
  • has aimed to bring many organisations together to create a journey that makes sense to users and meets them where they are.

What the team needs to explore

Before the next assessment, the team needs to:

  • consider whether they could reduce the scope of the service. The team appears to be developing all parts of the ecosystem at once, which has made it hard to ensure every part of the service meets every part of the Standard
  • show that they are empowered to challenge legislative and policy constraints and to advocate for users. Many factors that have a big impact on the service (service scope, technical architecture, where the ‘Find’ service is hosted, restrictions on offline journeys) seem to be decided elsewhere, and it is not clear that those decisions are always compatible with meeting user needs
  • demonstrate how this service fits into the existing pension service landscape, showing how it differs from others and how users will know which one they need. A key criterion for meeting this point of the Standard will be evidencing engagement with existing DWP pensions teams and having strategic discussions about signposting across and between the suite of government pension services
  • demonstrate what the identity checking part of the service will look like and show that it has been tested with real users in the context of this user journey. The panel expects evidence of a high success rate for the identity solution at the reassessment.

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that the team:

  • has tested the technical feasibility of the service with the data and dashboard providers to make sure that the journey can be joined up successfully across multiple organisations
  • is engaging with industry groups and pension providers, including on user research and sharing research findings
  • is working with a separate team in MoneyHelper which is updating the pensions guidance
  • has thought about how user support will be provided to users and shared across different organisations where necessary, and what support channels will work best.

What the team needs to explore

Before the next assessment, the team needs to:

  • consider the dashboard provider and data provider journeys in the same way as the end user journeys have been considered. For example, map out and demonstrate what their experience looks like, identifying:

  • the steps a provider will take to start providing data or a dashboard
  • what documentation they receive and how they receive it
  • what the touchpoints will be as they onboard, and then once they’re running a dashboard
  • what your team will need to do ‘backstage’ to facilitate each step

  • test the support routes to make sure they work for users. This includes involving user support staff in your work and developing and testing the support materials with them
  • agree where the ‘Find’ service will be hosted long-term based on what works best for users.

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that the team:

  • has explored different frontend design concepts for the service
  • is aware of how confusing pensions are to many users, and has factored this into design decisions such as the information presented on the dashboard
  • understands the barriers that identity checks bring to the journey, and is looking for a solution that can deliver maximum coverage and support offline checks
  • has worked with the dashboard providers to make sure they are also carrying out user research on the end-to-end journey.

What the team needs to explore

Before the next assessment, the team needs to:

  • demonstrate how it is designing the service to be simple to use for data and dashboard providers as well as end users
  • avoid using the GOV.UK crown, font and visual language in the ‘Find’ journey if the service is not actually part of GOV.UK - as set out in the service manual, this is misleading and likely to confuse users. You can use the design system patterns to make sure you’re building an accessible, usable service, but must not make it look like the service is part of GOV.UK if it’s not
  • do further research on the branding and visual languages used in the journey and the impact of the transition points, and challenge policy constraints if this is what’s best for the user
  • get regular access to content design expertise so your prototypes can include content design best practice from the start and allow you to focus your iteration efforts on the more unique parts of the service
  • test different starting points to the journey, not just the MoneyHelper landing page, to ensure the service journey and flow make sense to users arriving from various sources. This will enable the team to stress test the fragmented user journey and find creative ways to guide users into the ‘Find’ service from their chosen pension provider start pages
  • explore and test the name of the service. Although MoneyHelper is a known brand with existing pensions guidance, research shows that users are likely to search for verbs, which is why existing services such as ‘Check your State Pension’ and ‘Find Pension Contact Details’ are easy for users to find and attract high traffic.

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that the team:

  • has carried out user research with disabled users
  • has a plan for how it will make sure the service is accessible in the beta stage, including meeting the WCAG 2.1 AA requirements
  • has considered the types of users who may struggle to use the service - research with less digitally confident users in libraries in particular has given the team a very good understanding of the issues these users may face
  • plans to provide an assisted digital route via MoneyHelper’s existing relationship with Citizens Advice Bureau.

What the team needs to explore

Before the next assessment, the team needs to:

  • carry out usability testing with a wider range of disabled users
  • carry out usability testing on mobile devices
  • have an alternative option in place for those who cannot prove their identity using the solution the team selects, as this is likely to be a significant number of users
  • explain how they are making sure that users get an accessible experience and assisted digital support if they use a dashboard provider.

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • user research was combined with data analysis, which allowed for a more comprehensive understanding of users and their needs
  • there is a comprehensive handover of acquired knowledge to new members of the team
  • the service team had a good mix of skills for Alpha and is looking to recruit the roles the panel would expect a Beta team to have.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure there is stronger user research and service design leadership. The panel would have liked to see more work testing the riskiest service design assumptions before iterating the interaction design and developing the underpinning technology. This could mean evidencing that the team explored ways of solving the problem other than the MoneyHelper dashboard tool.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has clear governance arrangements that are consistent with the agile principles and is working together with relevant stakeholders
  • the team shared a clear agile development cycle with a clear plan of when the larger and smaller providers will be onboarded
  • members of the team were previously part of other pension services which allowed the team to have a good understanding of key stakeholders.

What the team needs to explore

Before the next assessment, the team needs to:

  • n/a

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • interaction designs were iterated based on feedback from users
  • the team shared a clear roadmap with key delivery milestones and plan for onboarding the pension providers.

What the team needs to explore

Before the next assessment, the team needs to:

  • show more iterations of distinctly different approaches to solving the problem. The team has focused on iterating the MoneyHelper dashboard design, which is something the panel would expect to see at a later stage of development. It was not clear to the panel how the team gathered evidence that this MVP is the best solution to take forward into the next phase of development
  • provide a clearer articulation of the value the service adds for the user. During the assessment it was clear that different user groups will have different expectations of the service; more work is required to establish who is most likely to get the most value from the service and why
  • develop a clearer plan for how this service will integrate with other existing pension services and how users will be able to distinguish which service they need to use.

9. Create a secure service which protects users’ privacy

Decision

The service did not meet point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team clearly identified and planned the handling of personally identifiable information (PII), using UMA rather than standard OAuth for federated authorisation
  • the data is further secured in that none is held in the service itself
  • the service is validated against the CSF by DWP, the major partner in the service, to ensure the required information assurance, including the management of risk. There is a security manager dedicated to the programme within MaPS and another within Capgemini, a major supplier; both hold relevant professional qualifications. A third security supplier, Vine Solutions, will provide technical assurance, and the NCSC has undertaken reviews at key stages of the development lifecycle
  • unauthorised requests to the dashboards are handled by the UMA protocol, allowing graceful failure (illustrated in the sketch after this list).
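
To make the graceful-failure behaviour above more concrete, the following is a minimal, illustrative sketch of a UMA 2.0-style exchange, in which an unauthorised request returns only a permission ticket that the dashboard client then exchanges for a requesting party token. This is not the programme's actual implementation: all URLs, credentials and token values are hypothetical placeholders.

```python
# A minimal, illustrative sketch of a UMA 2.0-style grant (not the programme's
# actual implementation). All URLs, credentials and token values below are
# hypothetical placeholders.
import requests

RESOURCE_URL = "https://data-provider.example/pensions"          # hypothetical
TOKEN_URL = "https://authorisation-server.example/oauth2/token"  # hypothetical

# 1. An unauthorised request fails gracefully: the resource server returns
#    HTTP 401 and a permission ticket in the WWW-Authenticate header, rather
#    than exposing any personal data.
first_attempt = requests.get(RESOURCE_URL, timeout=10)
www_authenticate = first_attempt.headers.get("WWW-Authenticate", "")
permission_ticket = "ticket-parsed-from-www-authenticate"  # parsing omitted

# 2. The dashboard client exchanges the ticket, together with the user's
#    claims, for a requesting party token (RPT) at the authorisation server.
rpt_response = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:uma-ticket",
        "ticket": permission_ticket,
        "claim_token": "hypothetical-identity-claims-jwt",
        "claim_token_format": "urn:ietf:params:oauth:token-type:jwt",
    },
    auth=("dashboard-client-id", "dashboard-client-secret"),  # hypothetical
    timeout=10,
)
rpt = rpt_response.json().get("access_token")

# 3. The request is retried with the RPT; only now is pension data returned.
authorised = requests.get(
    RESOURCE_URL, headers={"Authorization": f"Bearer {rpt}"}, timeout=10
)
print(authorised.status_code)
```

The key point for graceful failure is step 1: an unauthorised caller receives only a ticket and no pension data.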

What the team needs to explore

Before the next assessment, the team needs to:

  • produce a documented Data Protection Impact Assessment (DPIA)
  • establish a close relationship with the National Cyber Security Centre (NCSC). The programme should assess risk from all angles against the NCSC Cyber Assessment Framework (CAF), ensuring adherence by all arm’s length bodies (ALBs) to its objectives: a) managing security risk, b) protecting against cyber attack, c) detecting cyber security events, and d) minimising the impact of cyber security incidents.

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team works within a wider Analytical Group chaired by DWP
  • the team is considering building a user feedback component that will help them obtain some qualitative feedback from users
  • KPIs have been identified that will help measure technical performance and delivery of scope within budget.

What the team needs to explore

Before the next assessment, the team needs to:

  • develop a clear performance framework that will enable the service team to monitor user satisfaction and understand whether the service is solving the problem the users have
  • have a clear definition of what success means for each user group. For example, how will the team establish that the service is adding value for younger users?

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team will host the system in the cloud using standard hosting patterns
  • components and applications used across government have been incorporated, such as Salesforce and the Cross Government Central Complaints System.

What the team needs to explore

Before the next assessment, the team needs to:

  • further explore identity provision in Beta, particularly for users who lack common identity documents such as a driving licence
  • consider that, if an app uses Google APIs, there may be a requirement to complete a verification process before publishing the app.

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team needs to explore

Before the next assessment, the team needs to:

  • clarify which Open Source Initiative licence applies to released source code
  • during private beta, make clear the intention to publish source code to GitHub where possible for the benefit of the wider community.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team will host the system on a cloud platform using cloud services
  • the system will adhere to Web Content Accessibility Guidelines (WCAG) 2.1
  • the programme has elected to build a system that complies with the Government Digital Service standards.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure guidance is taken from the Data Standards Authority (DSA): APIs should be registered, and appropriate protocols and security controls, such as HTTPS and OAuth for resource access, should be implemented so that the system can be accessed via multiple user interfaces (see the sketch below).
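
As a purely illustrative example of the kind of HTTPS and OAuth-protected resource access referred to above, the sketch below uses an OAuth 2.0 client credentials grant. The endpoints, scope and client credentials are hypothetical placeholders, not the service's real APIs.

```python
# Minimal sketch, assuming OAuth 2.0 client credentials over HTTPS.
# Endpoint URLs, scope and client credentials are hypothetical placeholders.
import requests

TOKEN_URL = "https://auth.pensionsdashboard.example/oauth2/token"     # hypothetical
API_URL = "https://api.pensionsdashboard.example/v1/pension-records"  # hypothetical

# 1. Obtain an access token over HTTPS using the client credentials grant.
token_response = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "pension-records.read"},
    auth=("registered-client-id", "registered-client-secret"),  # hypothetical
    timeout=10,
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# 2. Call the registered API with the bearer token; the same pattern works for
#    any user interface (web, mobile app, partner dashboard) consuming the API.
api_response = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
api_response.raise_for_status()
print(api_response.json())
```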

14. Operate a reliable service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • denial of service (DoS) protection has been built in at 2 levels: 1) the AWS platform and 2) Salesforce
  • access management is handled with ForgeRock
  • PKI authentication for data movement is validated with certificates identifying the person moving the data (see the sketch after this list)
  • the service team has selected ServiceNow to orchestrate a portfolio of robust cloud-based applications to automate and manage enterprise cloud SaaS services.
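
As a purely illustrative example of the certificate-based authentication for data movement described above, the sketch below shows a mutual TLS request in which the client presents a certificate that would, in practice, be issued by the programme's PKI. The file paths, endpoint and payload are hypothetical placeholders.

```python
# Illustrative sketch of certificate-based (mutual TLS) authentication for a
# data transfer. File paths, endpoint and payload are hypothetical placeholders.
import requests

ENDPOINT = "https://data-transfer.pensionsdashboard.example/upload"  # hypothetical

response = requests.post(
    ENDPOINT,
    # The client presents its own certificate and private key, identifying the
    # party moving the data ...
    cert=("/etc/pki/dashboard-client.crt", "/etc/pki/dashboard-client.key"),
    # ... and only trusts servers whose certificates chain to the programme's
    # certificate authority.
    verify="/etc/pki/programme-ca.pem",
    files={"payload": ("pension-export.json", b'{"records": []}')},
    timeout=30,
)
response.raise_for_status()
```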

What the team needs to explore

Before the next assessment, the team needs to:

  • confirm that it is feasible for smaller providers to deliver all that is expected of them for the service to deliver on its promise. This is a critical dependency for the success of the service
  • provide more detail on monitoring and logging against the service’s objectives.

Next Steps

Reassessment

In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The team is encouraged to return for a reassessment within 3 to 6 months, or sooner, once it has addressed the recommendations in this report.

Published 8 July 2024