Manage Your Loan alpha assessment

The report for the alpha assessment of Student Loans Company's Manage Your Loan service on 25 July 2018.

From: Central Digital and Data Office
Assessment date: 25 July 2018
Stage: Alpha
Result: Met
Service provider: Student Loans Company

The service met the Standard because:

  • the team know about their users and have done a good level of research with clear acknowledgement of assisted digital users
  • the team were able to demonstrate that they are a multidisciplinary, co-located team within SLC
  • the team showed how they have iterated their design and demonstrated good evidence for design choices made.

Description

The ‘Manage Your Loan’ service will allow customers at all stages of their student loan journey to view their balance, payment and repayment transactions and make repayments towards their loan.

Service users

All SLC customers who are in study or have graduated, and have taken out a student loan.

Detail

User needs

This team know a great deal about their users. They’ve evidently done a good amount of research, both in discovery and Alpha. They’ve also made good use of data from within the organisation. For example, they’ve spent time listening in to calls in their contact centre and involved contact centre staff in a persona workshop.

In Alpha the team have carried out 6 rounds of usability testing with 51 participants. They’ve identified different types of users, including students, repayers, part-time employees, self-employed people, and have endeavoured to carry out research with a wide range of users. They also identified regional differences in experience due to devolved administrations so have made good efforts to conduct research across the country as a result.

From this research they’ve developed an excellent understanding of their users, their needs and their current pain points and frustrations. Through their analysis they’ve identified 2 key factors that influence the needs of users who interact with their service: how much student loan they owe and whether or not they’re in employment. Each persona presented had clear user needs and distinct pain points. The team also displayed a good understanding of the life events that cause their users to be more or less active with the service, for example an annual bonus or earning under the threshold.

It was great to see that the team have involved users with access needs in their research. They’ve tested with users with visual impairments and dyslexia and presented some nice examples of usability issues they discovered and how they’ve addressed them. For example, they discovered that a drop-down menu of 26 letters in a security question was really difficult for users with dyslexia. As a result they’ve changed this to a text input.

We were also pleased to see that the team had been able to include users with lower digital confidence and literacy in their research, despite the challenge that this type of user is hard to recruit, as most university-educated people have higher digital skills and confidence. They were able to overcome this recruitment challenge by going into sixth form colleges to speak with 17 to 18 year olds.

The team have clearly learnt a great deal from these different rounds of research and presented 4-5 different iterations of their prototype and what they’d learnt and changed with each iteration.

It was clear that the whole team were involved in user research in Alpha. It was great to hear about how they’ve involved their QA testers in the accessibility testing - this was clearly an important learning point for them.

They’ve also involved other teams in the organisation in their user research, for example the marketing team and the ‘Apply’ team. This is crucial as lots of their research insights are wider than the scope of this digital transaction. There are also several teams that are doing work that closely relates to this service. The team should continue to work closely with other teams in the organisation in Private Beta so that they can ensure the wider user journey is consistent and meets user needs. They can do this by inviting them to observe research, take part in analysis and attend show and tells.

A gap in their research to date, however, is testing the end-to-end journey for candidates, from searching for the service in Google, to landing on a GOV.UK page and proceeding into the transaction. They should look to mock up and create usability tasks to test this early in Private Beta in order to understand how the service should fit into existing or new content on GOV.UK.

Team

The service team have demonstrated that they are a multidisciplinary, co-located team within SLC. The Alpha team consists of a delivery manager, product manager, user researcher, designer, content designer, technical designer, technical design lead, solutions architect, 3 BSAs, 9 developers, 2 QA testers and a test consultant. Roles will be added as the team moves into Beta, including more developer resource to cope with both the live running of the existing service and the new Beta service that is being developed.

The team need to ensure that dedicated resources are in place for Beta and that these resources are not pulled away to work on legacy systems; they should be dedicated to Beta development of the Manage Your Loan service.

The team are a combination of contractors, suppliers from Accenture and permanent members of staff. The service team’s ways of working facilitate knowledge transfer from suppliers to permanent staff through pair programming and working closely with the developers on the repayments team. This is a good way to transfer knowledge from contract and supplier staff while delivering the Alpha outputs.

The team should consider ensuring that the suppliers in Beta spend a percentage of their time upskilling SLC staff, a sure way to build internal digital capability. This is only a recommendation.

The service team are demonstrating good agile ways of working: two-week sprints which kick off on a Wednesday, iteration planning meetings, daily stand-ups and retrospectives. Show and tells are well attended by senior staff, and additional show and tells are held for the board that the Manage Your Loan service reports into. The SRO used to work at HMRC and has brought valuable knowledge into the team, which will prove invaluable given that the team are moving to more frequent data feeds of employment statuses from HMRC as of April 2019. The team have a memorandum of understanding and SLAs in place for working with HMRC. The team working on the HMRC integration are closely integrated, which is encouraging, and all other workstream dependencies are managed through a scrum of scrums. Policy staff are integrated into the delivery team, as are operations staff.

The service team need to ensure that the dependency on HMRC is managed closely, as a break in service would negatively affect users of the Manage Your Loan service.

The service team demonstrate a clear togetherness and the product owner is clearly empowered to prioritise the backlog. Any differences are resolved within the team, which is very encouraging. The team have a good rhythm to their sprints and test with users regularly: in Alpha they tested with 51 users across 6 testing sessions. Findings are always discussed among the team before any changes are made.

The team should be commended for sticking to agile principles and the organisation should support the team even if there are organisational changes occurring.

The assessors recommend that the team start to prioritise the backlog for the Beta developments.

Technology

The project team has decided to use Java with Spring Boot for the development of the application, with IntelliJ as their integrated development environment (IDE). The existing Oracle database will be accessed via ORDS, while partially completed transactions and new data from the service will be stored in MongoDB. RabbitMQ will be used to queue requests if Oracle is down. The team uses ThoughtWorks products such as GoCD for continuous integration and Mingle for backlog management of stories. SLC as an organisation uses Atlassian products; elsewhere, other teams use Jenkins for continuous integration, and products such as Confluence are used across the organisation.
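The queueing behaviour described above, where RabbitMQ buffers requests while Oracle is unavailable, is a store-and-forward fallback pattern. The following is an illustrative sketch only, not SLC’s actual code: the class and interface names are hypothetical, and in-memory stand-ins replace ORDS and RabbitMQ.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Illustrative store-and-forward fallback: try the primary store first,
// and queue the request for later replay if the primary is down.
public class FallbackWriter {
    // Hypothetical stand-in for the Oracle database fronted by ORDS.
    interface PrimaryStore {
        void save(String record) throws Exception; // throws when Oracle is down
    }

    private final PrimaryStore primary;
    // Hypothetical stand-in for a RabbitMQ queue buffering writes during an outage.
    private final Queue<String> retryQueue = new ArrayDeque<>();

    public FallbackWriter(PrimaryStore primary) {
        this.primary = primary;
    }

    // Returns true if the record reached the primary store, false if queued.
    public boolean write(String record) {
        try {
            primary.save(record);
            return true;
        } catch (Exception e) {
            retryQueue.add(record); // buffer until Oracle recovers
            return false;
        }
    }

    // Replay queued records once the primary store is back up.
    public int drain() {
        int replayed = 0;
        while (!retryQueue.isEmpty()) {
            try {
                primary.save(retryQueue.peek());
                retryQueue.poll();
                replayed++;
            } catch (Exception e) {
                break; // still down; stop and retry later
            }
        }
        return replayed;
    }

    public int queuedCount() {
        return retryQueue.size();
    }
}
```

In the real service the buffer would need to be durable (for example, RabbitMQ persistent messages on a durable queue) so that writes queued during an Oracle outage survive a broker restart.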

Source code is held in internal GitLab, as the team are not ready to code in the open due to the ‘logistics and strategy’ around data sharing; for example, the team pointed to concerns that they have secret configuration inside their code. It seems that the organisation does not support coding in the open, and the panel would encourage the team to work with senior stakeholders in the organisation to challenge this before returning for the Beta assessment.

Application and infrastructure penetration testing is to be done by an external organisation, which is good practice. The testing will include intrusion detection to be put in place for URLs. The team explained their use of TLS (HTTPS) for web-based products, and also mentioned Guardian for database protection. They will encrypt data at rest, with database-level encryption for some data and field-level encryption for others. The service team will carry out database monitoring for unauthorised access as well as day-to-day monitoring. Disaster recovery provision consists of 2 data centres in Glasgow with a mirror image. Cloud hosting is not supported by the organisation at the moment; on-premises hosting is dictated by the wider centralised technology team. However, the panel has been informed that there is a programme looking into moving applications to the cloud.

SLC have a call centre available to handle queries in case the website goes offline. The call centre also assists in taking card payments.

Design

The panel was impressed by clear iterative design, where different design approaches were taken into user research and iterated to find the best solution for user needs. We saw good evidence for prioritising the loan balance, making a payment and other elements on the home page.

It was great to hear about testing with users who have access needs, in particular issues around dyslexia and understanding of the loan interest.

The service is consistent with the design of GOV.UK and makes good use of patterns such as tables and typography.

The prototype was comprehensive and behaved realistically, a great basis for testing ideas with users.

As mentioned in the Recommendations section, it’s important to consider all the touchpoints your users may have. The team talked a lot about education, and this service may not be the only or the best place to do that. We discussed sharing your findings with other teams who might be better placed to give education and guidance.

Recommendations

To pass the next assessment the service team must:

  • provide a detailed plan for cloud migration of this service instead of hosting it on-premises
  • consider other sites and touchpoints where your users’ journey may connect with your service. For example, we discussed that people may have a need to understand their student loan payments when looking at their PAYE situation on HMRC pages
  • start coding in the open, as this is possible even if the organisation does not currently do it. It encourages best practice and collaboration, makes it easier to share standards, and improves government transparency. See link for more info
  • test the end-to-end journey for candidates, from searching for the service in Google, to landing on a GOV.UK page and proceeding into the transaction. Mock up and create usability tasks to test this early in Private Beta in order to understand how the service should fit into existing or new content on GOV.UK.

The service team should also:

  • consider ensuring that the suppliers in Beta spend a percentage of their time upskilling SLC staff, a sure way to build internal digital capability.


Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Do ongoing user research Met
3 Have a multidisciplinary team Met
4 Use agile methods Met
5 Iterate and improve frequently Met
6 Evaluate tools and systems Met
7 Understand security and privacy issues Met
8 Make all new source code open N/A
9 Use open standards and common platforms Met
10 Test the end-to-end service Met
11 Make a plan for being offline Met
12 Make sure users succeed first time Met
13 Make the user experience consistent with GOV.UK Met
14 Encourage everyone to use the digital service Met
15 Collect performance data Met
16 Identify performance indicators Met
17 Report performance data on the Performance Platform Met
18 Test with the minister Met

Updates to this page

Published 17 January 2019