SLC Sponsor Redesign Alpha Assessment
The report of the Student Loans Company's service alpha assessment on 03/02/2022
Service Standard assessment report
Sponsor re-design
From: Central Digital & Data Office (CDDO)
Assessment date: 03/02/2022
Stage: Alpha
Result: Not met
Service provider: Student Loans Company
Service description
SLC wishes to define a new, more efficient digital journey for sponsors online that enables students to be assessed correctly using their sponsors’ income. The service must reduce the burden on the sponsor and must determine the supporting policy and process simplification needed.
Sponsors should be asked to provide as few details as possible and student consent should be sought as part of the journey so that sponsor information can be obtained directly through new and improved data exchanges with other Government Departments, including HMRC and DWP. This will reduce the burden on our sponsors to provide complex, confusing and often error-prone financial information.
The new journey should eradicate as many manual touchpoints as possible for both sponsors and SLC and must increase the ease and accuracy of the assessment.
Service users
This service is for:
Sponsors who support student applications for student finance by providing their financial details to be used to assess the students’ entitlement to funding.
Sponsors can be:
- Parents/Guardians
- Parents’ partners
- Partners
Please note, a person can be both a student and a sponsor at the same time.
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has undertaken research sessions with a good range of users across discovery and alpha
- the team demonstrated a good understanding of pain points and issues with the current service
- research has been undertaken with accessibility and assisted digital users
What the team needs to explore
Before their next assessment, the team needs to:
- provide detailed plans for user research in the beta phase. This should include the user groups they intend to work with, methods and success measures
- demonstrate options they have considered for solving the user’s problems and why the chosen solution is the best one to meet the user’s needs. The team were very focused on replacing an existing system and the functionality of it rather than considering innovative and creative ways for helping users solve problems or complete tasks
- establish high-level user needs for all groups. Those presented were created in the context of a system replacement and focused on functionality. The team needs to demonstrate they understand the real-world needs of their users
- update their personas to remove demographic information and stock images. These can be distracting for stakeholders and off-putting for users who don’t recognise themselves. They also make the personas read as case studies, which they are not if they represent the combined experience of groups of people
- plan and undertake contextual, face-to-face research with users. It is important that the user research findings so far are validated by working with users in their real-world setting
- demonstrate their understanding of, and empathy for, the user’s journey through this and other services in the organisation. A service designer can help them to do this
- better understand and design for users’ support needs. The team seemed unsure where users went for help aside from the contact centre. Exploring this could uncover new users and needs to be accounted for in the new service
- better demonstrate how they intend to meet user needs they have uncovered in the design of the service. For example, users with cognitive impairments or low digital confidence need to see all the questions upfront. Leaving them on an offline journey might not be the best thing for them
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team will be working with both UCAS and GOV.UK to make sure content is consistent across both platforms
- the team is working with other service teams to ensure consistency across the services where they can
- they have worked with HMRC to reduce the questions being asked by using HMRC data
What the team needs to explore
Before their next assessment, the team needs to:
- continue working with the relevant service teams and other departments to ensure consistency across all channels
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team are making improvements across the whole service, including emails and guidance
- they are working with front line operations and this is contributing to the ongoing work
What the team needs to explore
Before their next assessment, the team needs to:
- document and map the assisted digital customer journey, particularly in the event of unexpected service outages or barriers to use. The proposed technical platform relies on JavaScript, and the team will need to demonstrate how they will work around this constraint to deliver a high-quality service
- continue building on being empowered as a team to set the direction of the service, the technology they use and how they solve the problem for the user. The assessors observed that the team may be limited by the technology chosen for them and is working within some constraints
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- there were good examples of iterative design presented, especially with the correspondence emails
- examples were provided of how the design of the service has evolved during alpha, such as the actions and alerts. These insights would be helpful to the wider government design community and should be shared on the relevant GitHub thread
- by using other data sources like HMRC the team have made the service simpler to use
What the team needs to explore
Before their next assessment, the team needs to:
- explore whether the “your account” banner is needed across all pages while completing the transaction. This is not a standard pattern and could confuse users completing the service. Demonstrate through analytics and user research that there is a need for this component
- in beta, show examples of error validation and content, how they have been tested and, where needed, improved
- amend the “check your answers” page, which is not in the correct format, before the next assessment, unless there is clear evidence of why it should be in a different format
- review the use of complicated, policy or legal wording, such as the “before you submit your application” page, which is very hard for the user to understand and is written in an unclear way
- review and demonstrate the user need for asking for the student’s date of birth to confirm the decision not to provide income details. The user is told it is for security reasons, but the same question is not asked in the journey where income details are provided
- put the data protection statement and privacy notice on the same page as the related question
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has plans in place for the beta for those who have JavaScript disabled
- the team tested with users who have assisted digital needs, those who use assistive technology and have low digital skills
What the team needs to explore
Before their next assessment, the team needs to:
- continue working at the wider organisation level to reduce or remove the reliance on JavaScript for the digital service to operate, in line with government guidance
- demonstrate in beta the end-to-end journey for the scenario where JavaScript is unavailable
- show that user research of the non-digital journey has taken place and that users were successful with that journey
- complete the external accessibility audit before the beta assessment, with the audit provided to the assessors (strongly recommended)
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team was fully resourced and working well, with a mixture of permanent and contracted individuals
- the team is already in place for private beta in preparation for the build, and has managed to maintain consistency in team make-up
- the team has regular ceremonies and is making best use of channels to support the inclusion of the dispersed multidisciplinary team in them
What the team needs to explore
Before their next assessment, the team needs to:
- work with related teams to provide a seamless end-to-end journey for users, including setting out how they are going to do that in private beta and how they will tackle both happy and unhappy paths
7. Use agile ways of working
Decision
The service did not meet point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has channels in place to support transparency and communication across the multidisciplinary team (MS Teams, Jira etc)
- the team uses its ceremonies to walk the boards and review user stories
What the team needs to explore
Before their next assessment, the team needs to:
- set out how iterative development happens in practice. Although the team talked about it, this was not clear: the product owner was identified as responsible for prioritising, but on what basis, with whom, and how frequently? The team needs to show how user research and other evidence are used as the basis for discussion and how changes get prioritised
- describe a typical sprint cycle: how goals are set, how feedback is assessed and decisions are made, and how and when stakeholders are engaged
- demonstrate where they are empowered to make decisions and if there are any constraints around that
- be clear about their focus for private beta
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team showed good examples of how the service had been iterated to reflect feedback and address user needs, for example setting out upfront what information would be needed for an application and introducing a status bar; iterations included content and design changes across the service, including emails to users
- improvements seem timely and frequent; the new service looks to avoid asking for information already held (by HMRC), as well as aligning guidance across different channels for consistency
What the team needs to explore
Before their next assessment, the team needs to:
- work with related teams to provide a seamless end-to-end student finance journey, including setting out how they are going to do that in private beta and how they will tackle both happy and unhappy paths
- be clear about the focus for private beta
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has engaged with SLC’s Information Security & Security Operations team and is well versed in online service security, including its principles and assessments
- the team is making sure SLC’s data is safe and secure in Salesforce’s cloud
What the team needs to explore
Before their next assessment, the team needs to:
- engage with the part of the organisation that deals with GDPR and offer simple ways for users to request access to, or modification of, their data
- carry out penetration testing of the service, preferably through a third party, and preferably on multiple occasions
- carry out a threat assessment to make sure risk is properly evaluated
- have a cookie policy and implement user consent according to data-protection rules
10. Define what success looks like and publish performance data
Decision
The service did not meet point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the 4 core KPIs had been explored including how they would be measured in private beta
What the team needs to explore
Before their next assessment, the team needs to:
- be clear how users will be invited to use the new service in private beta, that is, how they are going to capture that cohort
- consider how they will know if they have succeeded: how performance of the new service will be assessed against the current service (including baselines) and reviewed against user needs
- consider how they will use the measures to improve the service
- consider whether other measures are needed, as identified through user needs and research
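To make the baseline comparison concrete, here is a minimal sketch of measuring one core KPI (completion rate) against the current service. All figures are made up for illustration, not taken from the service.

```python
# Illustrative sketch only: the helper and every figure below are
# hypothetical, not the service's real performance data.

def completion_rate(completed: int, started: int) -> float:
    """Completion rate as a percentage of started transactions."""
    if started == 0:
        return 0.0
    return 100.0 * completed / started

# Baseline from the current service vs the private beta cohort
baseline = completion_rate(7300, 10000)   # 73.0
private_beta = completion_rate(430, 500)  # 86.0
improved = private_beta > baseline        # True
```

A baseline captured before private beta starts gives the team something concrete to review the new service against, alongside the other core KPIs.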
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has built a prototype and was able to iterate it frequently to keep up with changes during the alpha period
- the prototype followed the GOV.UK design even though it was built using Axure, which is not officially supported by the GOV.UK Design System
- the prototype will not be reused and the beta service is being built separately
- the DevOps platform and pipelines presented are sound and modern; however, it will be crucial to integrate automated testing as soon as development starts
What the team needs to explore
Before their next assessment, the team needs to:
- be aware that using Salesforce Lightning to build a high-quality service that complies with the Service Standard will be very difficult. While the panel understands that the choice of the Salesforce platform was an organisation-level decision, the unfortunate consequence is that it will be quite hard for the developers of this service to build it in a way that will pass the beta assessment
- take into account the recommendations from the failed Beta assessment for SLC’s “track application for student finance”, as most of them will apply here as well
- revisit misleading claims like “WCAG 2.0 allows sites to require JavaScript” (WCAG doesn’t allow or disallow any technology), “WebAIM states that JavaScript does not impact accessibility” (in fact it states that “There is no easy fix that can be applied to solve all accessibility problems associated with JavaScript”) and “disabling JavaScript is a conscious decision on the part of the Citizen” (it isn’t). It is indeed possible to build accessible services that make use of JavaScript, but it takes a good knowledge of progressive enhancement techniques to get it right, and the panel worries that Lightning will make that difficult
- consider an alternative to Lightning, such as a front-end app that talks to the SLC platform via an API and which adheres to the Service Standard while being flexible enough to allow the team to iterate frequently
- take full account of the impact of JavaScript dependency on the resilience and reliability of the service, and of the affordability and sustainability impact of the additional code complexity created
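The alternative suggested above, a thin front end talking to the SLC platform over an API, can be sketched with nothing but the Python standard library. Everything here is hypothetical, the route, page and form field included; the point is that a server-rendered page works with JavaScript disabled, and client-side script would only ever be layered on top as progressive enhancement.

```python
# Hypothetical sketch of a thin server-rendered front end. A real
# implementation would POST the form data on to the SLC platform's API;
# here the page itself is the point: it is plain HTML, so the journey
# still works for users with JavaScript disabled.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

FORM_PAGE = """<!DOCTYPE html>
<html lang="en">
<head><title>Provide your income details</title></head>
<body>
  <!-- Plain HTML form: submits and validates without any client-side JS -->
  <form method="post" action="/income">
    <label for="income">Taxable income for the last full tax year</label>
    <input id="income" name="income" type="text">
    <button type="submit">Continue</button>
  </form>
</body>
</html>
"""

class FrontEndHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = FORM_PAGE.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep request logging quiet in this sketch
```

A front end shaped like this can adopt GOV.UK Frontend markup directly and iterate frequently, independently of the platform behind the API.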
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team intends to publish its source code later this year, as a one-off first, then continuously
- SLC is “working in the open” with Salesforce on the platform’s GOV.UK frontend components, even though those components aren’t currently open-sourced
What the team needs to explore
Before their next assessment, the team needs to:
- publish as much source code as possible, including back-end code, even if it’s tightly bound to the Salesforce platform
- include open-source publication as part of the continuous integration pipeline so code updates are published automatically
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has expressed the intention to publish its source code and work in the open
- the service’s notifications will be handled by GOV.UK Notify
What the team needs to explore
Before their next assessment, the team needs to:
- contribute back to the wider cross-government design and front-end communities any insight gained during the design of this service (new components, accessibility findings, etc)
- use open standards, and propose a new open standard if there is not one that already meets their needs
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the alpha prototype was reliable enough to be used for user testing throughout alpha
- the beta service will be hosted in the cloud and not on-site, and managed by the Salesforce platform which should provide good reliability
What the team needs to explore
Before their next assessment, the team needs to:
- make sure that it doesn’t rely entirely on the platform to ensure the service’s reliability: application-level monitoring, smoke testing and alerting should be designed and controlled by the service team
- have a process in place to remedy any kind of incident, including being able to quickly deploy fixed versions of the application and communicating with users when things break
- carry out quality assurance testing regularly (refer to the Service Manual)
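As one way of keeping monitoring under the team's own control rather than relying entirely on the platform, application-level smoke checks can be small and platform-independent. The sketch below assumes a hypothetical JSON health endpoint and made-up alert levels; a real check would run on a schedule and feed an alerting channel the service team owns.

```python
# Hypothetical smoke-check sketch: the health endpoint format and the
# "ok"/"degraded"/"alert" levels are assumptions, not the service's own.
import json
import urllib.request

def classify_health(status_code: int, body: bytes) -> str:
    """Turn a health-endpoint response into an alert level."""
    if status_code != 200:
        return "alert"          # endpoint not answering correctly
    try:
        payload = json.loads(body)
    except ValueError:
        return "alert"          # endpoint up but response malformed
    # A "degraded" level lets the team act before users see failures
    return "ok" if payload.get("status") == "UP" else "degraded"

def smoke_check(url: str, timeout: float = 5.0) -> str:
    """Fetch the health endpoint and classify the result."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return classify_health(response.status, response.read())
    except OSError:
        return "alert"          # timeouts and connection errors need a human
```

Keeping the classification logic in the team's own code, rather than in the hosting platform's dashboard, means alert thresholds can be changed and redeployed as quickly as any other fix.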