The (Digital) Apprenticeship Service - Beta Assessment

The report from the beta assessment of the Skills Funding Agency's (Digital) Apprenticeship Service on 15 December 2016.

Stage: Beta
Result: Met
Service provider: Skills Funding Agency

The service met the Standard because:

  • The service team have a good understanding of the needs of their service users and have developed detailed and representative personas to help develop the service.
  • The private beta includes a representative range of industry sectors.
  • The team have good coverage of the agile team roles and are working closely with policy to make the service easier for users.

About the service

Service Manager: Gary Tucker

Digital Leader: Alan Parnum

Description

The service allows apprenticeship levy paying employers to register, view, and spend levy funds on apprenticeship training.

Service users

The users of this service are employers (including training managers, and payroll and finance staff) at companies with an annual pay bill exceeding £3 million.

Detail

Lead Assessor: Thomas Moore

User needs

The service team have developed detailed personas covering the experience levels of service users (digital capability and apprenticeship knowledge), organisation size and motivations, and the degree to which service users engage with apprentices. The team have a good understanding of the total user base in terms of numbers, and have segmented this user base by user group (reflected in the personas). In addition to the key user group (employers), further personas have been developed for secondary users (payroll and finance staff) and for fraudsters. The profiles of internal users highlighted the importance of the relationship between different parts of employer organisations, for example between finance staff and HR, which means the service needs to offer multiple user accounts.

The team have engaged with users through existing BEIS and SFA channels, through blogs and newsletters, and through industry bodies (British Retail Consortium, Confederation of British Industry, and hospitality organisations). The team have worked to ensure that the private beta reflects a representative range of industry sectors.

The team are using a variety of research techniques, including face-to-face (in-situ) research, remote moderated research, and diary studies (unmoderated). The service team have not conducted user research in a research lab environment, but have access to a dedicated research facility in the SFA. Different members of the team attend test sessions and there is a good structure in place for sharing user feedback with the rest of the team. The panel recommends that testing also involves the use of a usability lab to enable real-time group observation of user feedback.

The team have mapped the digital capability of users using the digital inclusion scale, and understand the digital capability required to successfully complete the service, first time, unaided. The team estimate that 10% of users will need assisted digital (AD) support to use the service, with public sector users forming the highest proportion of AD users. The SFA are providing AD support routes through contracted support, and have the capability to monitor and iterate support to ensure it meets users' needs. An assisted digital plan has been mapped out, and the team confirmed that they are in contact with the call centre provider to define call scripts for the call centre operators; however, this has yet to happen.

User research with users who have accessibility needs has been very limited, and should be an area of focus for the team as it moves into public beta. The team are engaging with the Shaw Trust ahead of the service's public beta launch.

Team

The team is multidisciplinary and co-located. Contractors within the team actively up-skill civil service colleagues, and the clan structure the SFA operates provides further cross-team support for each discipline.

The service manager has a high level of autonomy over how the service is developed, and the team have a close working relationship with the policy teams who work on policy areas impacting the service. This close relationship has helped the team to simplify the service.

The team work in two-week sprints and are working towards a single product backlog.

Technology

The team continue to build on the technology platforms put in place by other elements of the wider service, and in particular run an all-cloud service. They have worked with HMRC to share data via APIs, with user consent, which keeps friction in the service low. The panel was glad to see this kind of collaboration between departments working well. The team acknowledged that the addition of two-factor authentication to Government Gateway may introduce challenges to their current user journeys, but had plans in place to meet this challenge.
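
To give a flavour of the pattern (a hedged sketch, not the team's actual integration), HMRC's API platform uses OAuth 2.0, so a consuming service exchanges a user-authorised code for an access token and then calls the API on the user's behalf. The levy endpoint path, credentials and employer reference below are placeholders:

    import requests

    # Hypothetical sketch of the OAuth 2.0 authorisation-code flow used to
    # consume an HMRC API with the user's consent. The token URL follows
    # HMRC's published convention; the levy endpoint path is a placeholder.
    TOKEN_URL = "https://api.service.hmrc.gov.uk/oauth/token"
    API_BASE = "https://api.service.hmrc.gov.uk"

    def exchange_code_for_token(client_id, client_secret, auth_code, redirect_uri):
        """Swap the authorisation code (granted when the employer consents)
        for an access token."""
        resp = requests.post(TOKEN_URL, data={
            "grant_type": "authorization_code",
            "client_id": client_id,
            "client_secret": client_secret,
            "code": auth_code,
            "redirect_uri": redirect_uri,
        })
        resp.raise_for_status()
        return resp.json()["access_token"]

    def fetch_levy_declarations(access_token, empref):
        """Call a (placeholder) levy declarations endpoint on the user's behalf."""
        resp = requests.get(
            f"{API_BASE}/apprenticeship-levy/epaye/{empref}/declarations",
            headers={
                "Authorization": f"Bearer {access_token}",
                "Accept": "application/vnd.hmrc.1.0+json",
            },
        )
        resp.raise_for_status()
        return resp.json()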

Password management doesn't follow the current National Cyber Security Centre (NCSC) password guidance (Password Guidance: Simplifying Your Approach), notably around changing passwords, technical controls versus complex passwords, and password strength meters. The team must review the service's current password management policy and bring it in line with current best practice.
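
The practical upshot of that guidance can be illustrated with a short sketch (hypothetical code, not the service's implementation): prefer length and deny-list checks over composition rules, and use technical controls such as throttling rather than forcing regular password changes.

    import time

    MIN_LENGTH = 8
    # Seed with known-common or breached passwords; a real deny list would be far larger.
    DENY_LIST = {"password", "password1", "qwerty123", "letmein"}

    def validate_password(password):
        """Return a list of reasons the password is unacceptable (empty if fine).
        Note: no composition rules and no strength meter, per the NCSC guidance."""
        problems = []
        if len(password) < MIN_LENGTH:
            problems.append("must be at least %d characters" % MIN_LENGTH)
        if password.lower() in DENY_LIST:
            problems.append("is too common")
        return problems

    class LoginThrottle:
        """Technical control in place of complexity rules: lock an account
        out briefly after repeated failures to blunt online guessing."""

        def __init__(self, max_attempts=5, lockout_seconds=300):
            self.max_attempts = max_attempts
            self.lockout_seconds = lockout_seconds
            self._failures = {}

        def allowed(self, account):
            cutoff = time.monotonic() - self.lockout_seconds
            recent = [t for t in self._failures.get(account, []) if t > cutoff]
            self._failures[account] = recent
            return len(recent) < self.max_attempts

        def record_failure(self, account):
            self._failures.setdefault(account, []).append(time.monotonic())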

The use of Hotjar to monitor user activity was of some concern to the panel. While there are options to anonymise user data, Hotjar is a SaaS product and the team were unclear on how the resulting data is stored.

The team operate an open source by default approach, sharing code (with the exception of sensitive code) through GitHub.

The service uses GOV.UK Notify to provide users with an access code as part of the account registration process.
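
For illustration, a minimal sketch of issuing such a code with GOV.UK Notify's Python client follows; the API key, template ID and personalisation field name are placeholders rather than the team's actual values:

    from notifications_python_client.notifications import NotificationsAPIClient

    # Placeholder credentials; in Notify, the email body and its
    # ((access_code)) field are defined in the template itself.
    client = NotificationsAPIClient("your-notify-api-key")

    response = client.send_email_notification(
        email_address="employer@example.com",
        template_id="f33517ff-2a88-4f6e-b855-c550268ce08a",  # placeholder template ID
        personalisation={"access_code": "123456"},
    )
    print(response["id"])  # notification ID, usable to check delivery status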

Design

The team demonstrated different approaches to the design of the service and explained how they were iterated, often starting with paper prototypes before moving on to HTML prototypes and then testing with the production service. The design and research team work one or two sprints ahead of the developers, understanding and prioritising user needs that haven't been addressed and working through the research cycle before new features are developed. The panel was impressed by the team's use of service design across the various apprenticeship services. The team demonstrated a thorough understanding of the portfolio and how each service fits within it. Working across the teams has allowed a consistent design language to develop.

The team have worked with the other apprenticeship services to create a consistent navigation pattern. Currently this combines tabs and breadcrumbs, which leads to some repetition. The panel advises the team to continue iterating the navigation pattern to ensure a simple and consistent experience. The registration flow has been through a number of iterations, focussed on getting users to understand Government Gateway requirements.

The team are working with the GDS content team, and are discussing routes to the service from GOV.UK. The team are looking at improving the account landing page to make it more task-focussed, and are planning to implement functionality to support bulk uploads of apprentice data by apprenticeship providers.

The team are designing for mobile-first using progressive enhancement, and there is no dependence on JavaScript for the service to work.

The current name of the service, “Apprenticeship service”, doesn’t follow the service naming guidance. The team stated that multiple variations have been tested; however, they are having difficulty finding a single verb to describe the different tasks the service offers. This is usually an indicator that the service is too far-reaching. The team will need to find an appropriate name for the service before it launches as a public beta.

The GOV.UK header will need to be implemented before the public beta launch.

Analytics

The team have developed a Performance Platform dashboard to display the four mandatory Key Performance Indicators (KPIs). The team are using Google Analytics and have set up funnels to track user journeys and identify potential pain points. The non-linear nature of the service makes identifying meaningful end-points in the user journey a challenge; the team have, however, given this due consideration. In addition to the four KPIs, the team are keen to monitor indicators of policy success. This will be exceptionally challenging, but the team are considering ways of using metrics to measure this.
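
As a simple illustration of the kind of funnel analysis described (the step names and counts below are invented, not the team's data), drop-off and the completion rate KPI can be computed from per-step totals:

    # Hypothetical funnel: users reaching each step of registration.
    funnel = [
        ("Start registration", 1000),
        ("Verify with Government Gateway", 720),
        ("Add PAYE scheme", 610),
        ("Account created", 560),
    ]

    # Drop-off between consecutive steps highlights potential pain points.
    for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
        print(f"{step} -> {next_step}: {1 - next_users / users:.0%} drop-off")

    # The completion rate KPI is end-of-funnel users over starters.
    print(f"Completion rate: {funnel[-1][1] / funnel[0][1]:.0%}")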

Recommendations

To pass the next assessment, the service team must:

  • Work with the call centre provider to define call scripts for call centre operators in order to assist AD users.

  • Increase user research and testing with users with accessibility needs.

  • Review password implementation and ensure approach follows NCSC guidance.

  • Review the use of Hotjar, and ensure user data is safeguarded.

  • Determine a name for the service ahead of its public beta launch.

  • Implement the GOV.UK header ahead of the service’s launch.

The service team should also:

  • Use a usability lab to enable real-time group observation of user feedback.

  • Explore different options for the account landing page to help users who are not familiar with the workflow.

  • Iterate the navigation patterns to ensure they are simple and consistent and document research findings on the design patterns hackpad.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 13 April 2017