Money Helper Pensions Dashboard alpha assessment
Service Standard assessment report
Money Helper Pensions Dashboard
Assessment date: 10/07/2024
Stage: Alpha
Result: Amber
Service provider: Money & Pensions Service
Service description
The Money Helper Pensions Dashboard is a digital service to enable people to access their pension information in a single place online, in a clear and simple way, on a computer, tablet or mobile phone at a time of their choosing. The service will be available across all four nations of the UK under the MaPS MoneyHelper brand (HelpwrArian in Welsh). This is intended to support consumer engagement with, and understanding of, pensions and retirement planning.
Service users
A pensions dashboard user:
- is over the age of 18
- has one or more pensions from UK pension providers
- is in the accumulation stage (has not yet accessed any of their pensions, or has accessed one or some but not all of them)
- may have lost touch with one or more pensions
The team has further divided the users into these profiles:
A problem for tomorrow
- tend to be aged 26-55
- are hesitant to engage with their pensions because of low expectations
- may believe they do not need to know about their pensions yet
- the dashboard will help them learn about the state pension and understand how to improve on what they have in all their pensions
Time to make progress
- tend to be aged 35-55
- may have experienced a life event (for example the birth of a child, or divorce) that inspires them to get on top of their retirement planning
- occasionally engage with their pensions but feel they have more to learn
I’m on top of this
- tend to be aged 40-70
- have a strong knowledge of pensions or focus on alternatives to pensions for retirement planning
- may use the dashboard to check they are not missing anything or to see if there is anything new to learn
Things the service team has done well:
The service team walked through a significant amount of research, consisting of usability testing, interviews, concept testing, guerrilla testing and desk research.
The team created different user personas and segmented users into groups such as ‘A problem for tomorrow’, ‘Time to make progress’ and ‘I’m on top of this’. These groups were mapped to age brackets reflecting users’ background motivations and needs, to gauge how the service can fit into users’ lives. For instance, the ‘Time to make progress’ persona covers the 35-55 age bracket: users who may have experienced a life event that motivates them to look into their pensions and plan for retirement.
The service team built on previous research and conducted a discovery refresh which included the use of semi-structured interviews with potential users of the service.
The service team demonstrated how the user personas have evolved since discovery. For instance, the persona ‘life is hard’ evolved into ‘I’m looking for more control’. This shows the team was able to probe further, build on previous research and turn it into meaningful insight that informs the design of the service.
The service team made good use of desk research to formulate user personas, one of which took a mental health perspective and was very effective when walking through the design of the prototype. The team evidenced some comparative analysis, comparing pension services that are already live and identifying the gap this service can fill: for instance, collating all of a user’s pensions in a single system. In doing this, the team demonstrated that it understood users’ mental models and requirements for the service.
The service team has explored different user needs as they pertain to different user groups. The team worked collaboratively in workshops with a UCD team to prioritise user needs, and demonstrated a framework used to translate research insights into tangible design changes.
The service team demonstrated the user needs and grouped them into themes such as ‘trust’, ‘help and support’ and ‘comprehension’, which were used to guide the design of the service. For instance, the team presented that, from a security perspective, users wanted reassurance when using the service. This led the team to research branding as a way to test different design concepts for gaining users’ trust and assurance.
The service team highlighted the different types of users, such as primary and secondary users. Research into secondary users led to the theme ‘delegating access’ and generated insight from a secondary user’s perspective. The team is aware of the need for an assisted digital route for users with low digital confidence, and presented its intention to conduct research with these users by exploring non-digital channels such as libraries.
The team has developed high-level research plans for the beta phase, setting out what each phase of research will focus on, based on the areas that require further refinement, and including unmoderated testing in private beta.
It was good to see the team exploring wider journeys and, through iterations, what branding and simplified interactions would do for users. A lot of thought and change has clearly gone into the latest tested iterations, and the panel was encouraged by the iterative cycle approach. By focusing on testing with users, the team has been able to understand what is and is not needed. The team should continue this into private beta, with a focus on the “so what” for their users: “So what does this mean and what should I do now…”
It was good to hear about the team’s move away from custom tools towards common applications and components, such as its integration with GOV.UK One Login. The team should continue to feed back any research findings to the One Login team.
The team talked about doing competitor analysis across various countries and private sector offerings, taking inspiration from the design language used by products in the pensions and financial sector to inform how they could show information to users in a more valuable way. The team should continue to explore the accessibility of these options and, once it has developed some coded options, test them with assistive technology users as planned.
The team has focused on what it identified as its biggest risks, found through research and analysis of the previous project’s work. The team talked about the importance of keeping these at the forefront of testing, and it was clear that they have been used to inform design hypotheses.
The service does not store any sensitive information about users. The service uses GOV.UK One Login to authenticate users. The team has completed a Data Protection Impact Assessment. When the team decides to deliver a feature that allows users to share dashboards with other parties, a review of the security and privacy arrangements will need to be completed.
Communication with the pension providers is protected with mutual authentication (mTLS), and obfuscated URLs are used to fetch data from pension providers. The team needs to ensure that mTLS certificates issued to pension providers in a self-service manner are easy to rotate, and that pension providers are notified about certificates that are due to expire and have not been rotated.
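The report does not describe how that expiry monitoring is implemented; as a minimal, purely illustrative sketch (assuming issued certificates are kept as PEM files in a directory, and an assumed 30-day warning window), an expiry check could look like:

```python
from datetime import datetime, timezone
from pathlib import Path

from cryptography import x509  # third-party: pip install cryptography (>= 42 for the *_utc properties)

WARN_WINDOW_DAYS = 30  # assumed notification window; not specified in the report


def certs_needing_rotation(cert_dir: str) -> list[tuple[str, int]]:
    """Return (certificate file name, days until expiry) for provider
    certificates inside the warning window, i.e. those that need rotation
    or a reminder to the provider."""
    now = datetime.now(timezone.utc)
    expiring = []
    for pem in sorted(Path(cert_dir).glob("*.pem")):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        days_left = (cert.not_valid_after_utc - now).days
        if days_left <= WARN_WINDOW_DAYS:
            expiring.append((pem.name, days_left))
    return expiring


if __name__ == "__main__":
    # "provider-certs" is a hypothetical directory of issued mTLS certificates
    for name, days in certs_needing_rotation("provider-certs"):
        print(f"NOTIFY: {name} expires in {days} days and has not been rotated")
```

A job like this, run on a schedule, could feed whatever notification channel the team chooses for providers.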
Privileged Access Management is in place, and back-end admins/operators are required to use strong passwords and multi-factor authentication (MFA).
The team collaborates with the NCSC and with internal cyber security and SOC teams. A penetration test was completed, and all critical and high-risk findings were addressed. The team has started an engagement to complete a threat modelling exercise.
The service uses shared government components (GOV.UK Design System, GOV.UK One Login) and industry/open standards (PKI, OAuth). The team collaborates with pension providers across the sector on Open API specifications, in line with the proposed legislation.
The team has Business Continuity Plans (BCPs). These were tested recently and will be tested regularly. The team monitors the availability and uptime of internal components. The team wants to explore whether, and how, users should be notified when some pension providers are unavailable while “Find My Pension” searches run.
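How that notification would work is still an open question for the team. As a hypothetical sketch only (the provider list, health URLs and the HTTP-200 convention are all assumptions, not part of the service’s actual design), an availability check feeding such a notice could look like:

```python
import requests  # third-party: pip install requests

# Hypothetical provider endpoints; a real list would come from the
# service's own registry of connected pension providers.
PROVIDERS = {
    "Provider A": "https://provider-a.example/health",
    "Provider B": "https://provider-b.example/health",
}


def unavailable_providers(timeout: float = 3.0) -> list[str]:
    """Return the names of providers whose health endpoints are unreachable
    or do not answer HTTP 200 within the timeout."""
    down = []
    for name, url in PROVIDERS.items():
        try:
            if requests.get(url, timeout=timeout).status_code != 200:
                down.append(name)
        except requests.RequestException:
            down.append(name)
    return down
```

A “Find My Pension” search could then surface any returned names to the user as a notice that results from those providers may be incomplete.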
The tech team follows good development practices. The infrastructure is defined as code (IaC) and automated Continuous Integration / Continuous Deployment (CI/CD) pipelines are used across all the environments.
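The report does not name the team’s IaC tooling; to make the idea concrete, here is a deliberately tiny, hypothetical sketch using Pulumi’s Python SDK (the resource and output names are assumptions, and the real stack may use entirely different tools):

```python
# Minimal IaC sketch: declares a single storage bucket as code, so the
# CI/CD pipeline can create, update or destroy it reproducibly via the
# Pulumi CLI rather than operators changing infrastructure by hand.
import pulumi
import pulumi_aws as aws

# Hypothetical bucket for static dashboard assets.
assets = aws.s3.Bucket("dashboard-assets")

# Expose the generated bucket name as a stack output.
pulumi.export("assets_bucket", assets.id)
```

The point is not the specific tool: with any IaC approach, the pipeline applies the declared state to each environment, which is what makes deployments across all the environments repeatable.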
The team is aware of some technical constraints and requirements related to the technology selected, and will consider whether affected users need to be notified.
The team has spoken with other teams across departments and government, for example on the Welsh language. The team should use these connections to explore what work has been done by similar teams, to see if there are any learnings or patterns that could be a starting point for testing and designing some of its more complex features. This could be of benefit when it comes to displaying content, graphs and offline information for users, something HMRC and other departments have done a lot of work on previously.
The team has really taken on board the lessons from the previous alpha and its ways of working, and has made positive changes, particularly around silo working and the transition to a more collaborative approach. There are currently two projects under separate governance structures, which is creating conflicting priorities, and there is a desire to bring them together. The approach of having a single team delivering a single service, with a single backlog and a single view of priorities, should be explored. The team should also consider whether a single Service Owner would help solve this problem, as they would be the escalation point for the service’s priorities.
The team has reused information from the previous alpha as a starting point and refreshed where appropriate.
The team is being innovative in tackling the challenges of recruiting non-digital users for user research, but recognises it has a gap here, as well as with its “life is hard” persona, and will continue to explore options as it moves into beta.
The team has a well-defined problem statement, backed up with facts and figures from a user perspective. This is a good baseline for measuring whether the problem has been solved, and for the measures of success. Alpha has produced some good measures that will continue to be iterated, and a performance analyst is now part of the team to take this forward.
1. Understand users and their needs
Decision
The service was rated green for point 1 of the Standard.
2. Solve a whole problem for users
Decision
The service was rated green for point 2 of the Standard.
3. Provide a joined-up experience across all channels
Decision
The service was rated green for point 3 of the Standard.
4. Make the service simple to use
Decision
The service was rated green for point 4 of the Standard.
5. Make sure everyone can use the service
Decision
The service was rated green for point 5 of the Standard.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
7. Use agile ways of working
Decision
The service was rated green for point 7 of the Standard.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated amber for point 12 of the Standard.
During the assessment, we didn’t see evidence of:
- the team publishing non-sensitive code in public repositories; the team needs to think about which licensing agreement works best for the code it publishes
- the team considering how external parties could contribute improvements or report security concerns about the code developed
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Next Steps
This service can now move into a private beta phase, subject to addressing the amber point within three months and to CDDO spend approval.