Get help to retrain beta assessment report

The report for the Department for Education's (DfE) Get help to retrain beta assessment on 21 November 2019.

From: Central Digital and Data Office
Assessment date: 21/11/2019
Stage: Beta
Result: Not met
Service provider: Department for Education

Service description

The service allows users to create an action plan to help them identify another type of work they could move into and what training might be available nearby to support them in their application.

Service users

People aged 24 and over who are in employment, do not have a degree or equivalent level 6 qualification, and work in jobs at high risk of automation.

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had clearly focussed on iterating the service design based on testing with users and were able to point to a number of changes made as a result. On several occasions when the panel suggested alternative design options (e.g. incorporating salary options, returning to a task list), the team were able to confirm that these options had been tested but found to compromise key user needs
  • the team have recognised that a significant proportion of their users will have lower digital skills and have targeted participant recruitment to reflect this in their research and testing. As a result of this research they have scaled back functionality in the service so that it works for users with lower digital skills, rather than risk excluding these key groups with additional complexity
  • the team have not hidden from research findings that may be difficult to address because they lie outside the team’s control. A good example was the reaction to the term ‘apprenticeships’. Users found this term off-putting, as they associated it with younger people at a different life stage; the team have sought to avoid the term within their service and to manage the hand-off to the Apprenticeship service to minimise negative impacts for users

What the team needs to explore

Before their next assessment, the team needs to:

  • test the content more thoroughly. The service contains a lot of content which the team described as motivational, but there was little evidence of this actually being tested. We would expect user research to check comprehension and recall of text-heavy pages once users had left them, rather than while they were still on the page and able to re-read. At a higher level the team were able to articulate the goal of the motivational content (e.g. help people understand that this training won’t be like going back to school), but they didn’t then test that understanding at the end of sessions. The team should carry out this testing to see if the current content-heavy design is effective and alter it as necessary

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had identified that users were interested in training as part of a wider journey to find a new job. They had therefore taken steps to position their training offer as part of this journey and to integrate with the DWP Find a Job API
  • the team has shared governance with NCS and has committed to exploring how to line up their user journeys further in future. They also have some shared governance with DWP, and are working to integrate routes to the GHTR service into DWP user journeys where appropriate, including Universal Credit and Find a Job
  • the team had a shared board that included stakeholders from DWP and the National Careers Service, and seemed to be in regular conversation with these stakeholders
  • the team had identified that the word ‘apprenticeships’ also had negative connotations to their user group and were working with the apprenticeship service to explore whether another name might be used
  • the team has worked hard to create handovers from Get help to retrain to other services where their service does not meet users’ needs, even if they had not yet explored how other services might link back to their service
  • the team are working hard to address the significant challenge of delivering complex behavioural change policy in a way that complies with the Standard. This is a hard problem to address

What the team needs to explore

Before their next assessment, the team needs to:

  • map out the service landscape from the user’s perspective based on user insight, including overlapping products and services from DWP and NCS across all channels, and use this as a basis for clarifying the proposition and scope of their service, and for understanding how these services need to work better together or integrate so that all users across the problem space get what they need. This is particularly important for routing users into the right services, and helping users find the thing they need for the thing they are trying to do. The team presented good evidence that their target users found words like ‘career’ and ‘apprenticeship’ to be for younger users and hence off-putting. They also identified that their aim to provide career guidance to get users into better work was different from the Careers Service’s aim to provide impartial advice - however, this seems like a difference created by policy rather than a user problem. While the team is doing work to join up with services across DWP and NCS to route users, understanding how users perceive the different offers in this space would help them scope their proposition with more clarity so users know where to start. The team should also consider what will happen to users from their target group who do attempt to use the National Careers Service and are not funnelled into the correct type of support on the Get help to retrain service
  • really challenge themselves and policy stakeholders about the name of the service and the problem the service is trying to solve. The team explained that ‘Get help to retrain’ is testing well among users and helps prime users to understand that training will be part of the service, but the panel recommend that the team continue to test other service names that don’t include the word training to see if this impacted users when training was shown as an option. The name of the service is important in the context of other service offers in this space, to ensure users understand its purpose and value
  • do further work on how to help users realise their job is insecure and that they should use this service. If this falls within the remit of another service team, the panel recommends questioning that, as it is likely to be key to making this service a success
  • continue work to improve the relevancy of job recommendations and the viability of the jobs they are promoting to users, for example by exploring how to provide an indication of the size of the job market and whether it is growing

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • phone support was prominently advertised on the service and that the team had considered a range of reasons why a user might call that weren’t limited to the user not being able to use the digital service. Call centre operators were also able to provide career advice
  • the team are planning to test with users of the Universal Credit service, including what kinds of communication could encourage these users to use the Get help to retrain service
  • the team are planning to work with job centres to raise awareness of the National Retraining Scheme in that context
  • the team are working with the Find a job service to influence how they might route users to the NRS services where relevant

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to think hard about routes into their service and make sure they are making decisions about these routes based on clear evidence. The team should continue to build on the good work with trade unions and DWP, but they could also think about other routes in, such as redundancy content on GOV.UK (building on the insight they gathered around when users are motivated to take action) and Citizens Advice, so that users find the service at their point of need. The national marketing campaign shouldn’t be the main way users find out the service exists and how to find it
  • do more research and analysis into how this service supports the ongoing journey into training. The evidence gathered through their usability testing and analytics shows that the service is usable and that users are able to get through to training providers, but we don’t yet know if this means a user will go on to retrain. A slow rollout of this service would give the service team a chance to assess whether this approach to getting users into training is effective, and then make a more informed decision about whether to roll it out at scale. They may also see evidence that the service needs to join up differently with other parts of the NRS. At reassessment the panel would like to see more detail about how this part of the NRS fits into the wider scheme, and evidence for why this service is the best tool for the job

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked to reduce the burden on the user when doing their skills assessment by pre-populating checkboxes against skills. Use of checkboxes in this way works really well for the context, and encourages users to think differently about their skills. It feels like the flow this team has created works really well for the target user group
  • the team has worked with and integrated content from other departments such as the job descriptions and skills through APIs. This gives users the information they need in the place they need it without sending them out of the service and creating a broken user journey
  • the team has simplified the flow for the user, reducing the number of pages the user has to work through

What the team needs to explore

Before their next assessment, the team needs to:

  • test their service within the wider context of the service landscape and a real world problem for users
  • explore and test approaches to measuring outcomes other than capturing user data at the start of the journey. Having the data capture before users can enter the service feels like a potential barrier, and may also not meet DfE’s requirements, as users may enter incorrect data simply to get into the service. They should test whether having the data capture up front impacts dropout rates, and explore ways to capture data at different points - for example later in the journey, related to the “save for later” function, or before handing users over to training providers. They should also look at other ways of measuring impact, eg through feedback loops from training providers
  • review use of the task list pattern and the steps they include within it, the flow between sections of the service, and where they return users to when they save their progress and come back to the service later. The team highlighted that bringing users back to the task list made users feel like they were starting over again, but also that the stepped action plan page was perceived as very useful even though it duplicates some functionality from the task list (such as adding more jobs to your skills check). They could explore combining these two pages so that the action plan becomes the landing task list page for users, with the detail populating as the user goes through the steps of the service
  • redesign their approach to breadcrumbs, as they are applied incorrectly - eg “Home” links to the start page when on the task list, but “Home: Get help to retrain” links to the task list when a layer down
  • simplify how users can provide feedback. The team needs to find a different place for the survey, which is currently the final task in the task list. This isn’t something the user needs to do to get through the service so it shouldn’t be displayed as a step the user has to complete in order to succeed. There are also a lot of ways users can provide feedback, including 2 separate ways on each page within the service. Use of the “Is this page useful?” pattern copied from GOV.UK doesn’t feel appropriate in the context of a transactional flow and should be avoided (this pattern is also currently under review by the GOV.UK team), but the team are using the Beta banner feedback pattern which works better in this context
  • provide users with a back link in the question flow sections of the service

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the phone service goes beyond helping a user through the online service, and that low-confidence users testing the service were able to get through the flow without this support

What the team needs to explore

Before their next assessment, the team needs to:

  • create an accessibility statement and link to this from the footer throughout the service
  • test more with users who have accessibility needs - some of the issues listed in the accessibility review (see findings below) indicate that screen reader users would face difficulty when trying to access the service:

  • the cookie banner - users should not be able to tab through from the cookie banner to the page
  • page title ordering on the page - the page title should come before the service name
  • using unique Element IDs
  • keyboard access on pages such as the user data capture error message page and the action plan page, where users are currently unable to use the space bar, only enter
  • user data capture page - WAI-ARIA group is currently missing a name
  • use of HTML to format content - this should not be used
  • CSS disabled view of the action plan page - the action links appear before the relevant sections, which makes it difficult to understand the sequence

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the Service Owner and Product Owner were clearly engaged and interested in the work of the team and their policy. They had a strong knowledge of previous services that had failed to solve this problem and were able to talk confidently about how this service would be different as a result of really understanding user needs

What the team needs to explore

Before their next assessment, the team needs to:

  • identify a more sustainable delivery model. The team explained that a number of roles across the wider programme were filled by permanent civil servants, but aside from the Service Owner and the Product Owner none of the members of this service team were civil servants

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were releasing regularly on their live private beta, using feature flags to reduce and control the associated operational risks
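The report does not describe the team's actual feature flag implementation. As an illustration only, a minimal environment-variable-driven flag check in Ruby (the service's language; all names here are hypothetical) shows how a feature can ship dark and be switched on per environment without a new release:

```ruby
# Minimal feature flag sketch (hypothetical, not the team's actual code).
# Flags are read from environment variables such as FEATURE_ACTION_PLAN=enabled,
# so a change can be deployed disabled and turned on later, limiting
# the operational risk of each release.
class FeatureFlags
  def initialize(env = ENV)
    @env = env
  end

  # Returns true only when the named flag is explicitly enabled.
  def enabled?(name)
    @env["FEATURE_#{name.to_s.upcase}"] == "enabled"
  end
end

flags = FeatureFlags.new("FEATURE_ACTION_PLAN" => "enabled")
flags.enabled?(:action_plan)  # => true
flags.enabled?(:new_survey)   # => false
```

In practice teams often use a library or an admin-editable store rather than environment variables, but the principle of guarding each new code path behind a flag is the same.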

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a strong focus on security and has multi-level approval controls in place, as well as an ongoing plan for reassessment with DfE security when needed. The team recognised and assessed the risks with user data, and the panel was assured they will take the necessary precautions and steps to safeguard it as the service develops, particularly when they add functionality for administration. They have the ability, via Ruby, to encrypt the data at rest

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure they vet and review any new channel or feature that could be compromised by the fact that user data is not currently encrypted at rest. It is likely that such data will need to be encrypted at rest when new features come online that open an avenue for attack. The panel recommends the service team continue to work closely with DfE security to ensure all threats are mitigated
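Ruby's standard library supports the kind of at-rest encryption recommended above. As a hedged sketch (not the team's design), a field could be encrypted with AES-256-GCM from the stdlib OpenSSL bindings before being written to storage; in a real service the key would come from a secrets store, not be generated inline:

```ruby
require "openssl"
require "securerandom"

# Hypothetical sketch of encrypting a field at rest with AES-256-GCM.
# The key is generated inline for illustration only; a real deployment
# would load it from a managed secrets store.
KEY = SecureRandom.bytes(32)

def encrypt_at_rest(plaintext, key)
  cipher = OpenSSL::Cipher.new("aes-256-gcm").encrypt
  cipher.key = key
  iv = cipher.random_iv
  ciphertext = cipher.update(plaintext) + cipher.final
  # The auth tag lets decryption detect tampering with the stored record.
  { iv: iv, tag: cipher.auth_tag, data: ciphertext }
end

def decrypt_at_rest(record, key)
  cipher = OpenSSL::Cipher.new("aes-256-gcm").decrypt
  cipher.key = key
  cipher.iv = record[:iv]
  cipher.auth_tag = record[:tag]
  cipher.update(record[:data]) + cipher.final
end

record = encrypt_at_rest("user@example.com", KEY)
decrypt_at_rest(record, KEY)  # => "user@example.com"
```

The IV and auth tag are stored alongside the ciphertext; decryption fails loudly if any of the three is altered, which is the property that protects data if storage is compromised.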

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have been in touch with the Performance Platform team at GDS and have a plan to measure the mandatory KPIs. The reasoning for focussing on ‘reaching the action plan’ as the outcome for these KPIs makes measurement more manageable
  • the team are keen to use a self service model and are creating dashboards to increase access to data

What the team needs to explore

Before their next assessment, the team needs to:

  • run a performance framework session or similar activity to join up the performance data to the user needs
  • ensure the data are actionable - the panel felt there were a lot of metrics being recorded but it was unclear how some of these could lead to service improvements. The team should also explore how these different measures impact each other
  • think about hypotheses and expected results for some of the KPIs. Without targets, it will be hard for the team to know when to take action or investigate further. As an example, the panel want the team to think more around expectations for returning users - and the time periods involved. It is worth speaking to other services in DfE and across government for help with benchmark data

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has given a lot of thought to their selected technology stack and has iterated their decisions. When they found that a technology did not fully meet their needs they tried another, as was the case for the tool they selected for accessibility testing
  • the team made pragmatic decisions when it came to the architecture of their service and were able to fully articulate why they did so when challenged. While initially they were positioned to use a microservices architecture, they came to the conclusion that a simple, streamlined monolithic architecture served the service better, allowing the team to iterate quickly while still maintaining the ability to adapt and change if the complexity changes in the future
  • the team’s use of APIs and integrations to source data, and the work and strategy that went into de-risking their potential unavailability. The panel was particularly impressed with the service team’s work and close cooperation with NCS

What the team needs to explore

Before their next assessment, the team needs to:

  • keep a long view of their service as the programme grows. While the team are currently using a monolithic architecture, a microservices approach may be more appropriate in the future. The team should ensure that they are always able to pivot and adapt accordingly

12. Make new source code open

Decision

The service met point 12 of the Standard.

13. Use and contribute to common standards, components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using GOV.UK Notify and the GOV.UK Design System and its patterns, and has designed the service to draw on integrations with various government services. The Ruby codebase is drawn from a common shared framework in DfE, so all the development teams collaborate on a single shared asset

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a very thorough and well thought out approach, from responsive UI testing on both mobile and tablet devices to robust error handling in the event of integration outages. The health checks on the service allow it to be kept running 24/7, and the team are easily able to roll back if the need arises
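The report does not detail how the team's health checks are built. As a sketch only (dependency names are hypothetical), an aggregated health check in Ruby registers one cheap probe per integration and reports healthy only when every probe passes, so a monitor or load balancer can decide whether to route traffic or trigger a rollback:

```ruby
# Hypothetical health check aggregator - not the team's actual code.
# Each dependency registers a block; a failing or raising block marks
# the service unhealthy without crashing the health endpoint itself.
class HealthCheck
  def initialize
    @checks = {}
  end

  def register(name, &block)
    @checks[name] = block
  end

  # Runs every registered check and summarises the results.
  def status
    results = @checks.transform_values do |check|
      begin
        check.call ? "ok" : "failing"
      rescue StandardError
        "failing"
      end
    end
    { healthy: results.values.all?("ok"), checks: results }
  end
end

health = HealthCheck.new
health.register(:database) { true }        # e.g. a cheap SELECT 1
health.register(:find_a_job_api) { false } # e.g. a timed ping to the API

status = health.status
status[:healthy]            # => false (one integration check failed)
status[:checks][:database]  # => "ok"
```

Reporting per-dependency results, rather than a single boolean, also helps operators tell an integration outage apart from a fault in the service itself.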

Updates to this page

Published 4 February 2020