Submit learner data (data collection) live service assessment report

From: Government Digital Service
Assessment date: 10 December 2020
Stage: Live
Result: Met
Service provider: DfE / ESFA

Previous assessment reports

Service description

This service collects learner and training/qualification level information from the Further Education (FE) sector. Data is submitted to ESFA to enable payments for some providers and build funding allocations for others.

This data covers learners who are at least 16 years old, not in schools or undertaking higher education, and comes from a range of training providers including FE colleges, private training providers and local authorities. It helps distribute around £8 billion of government funding.

This service validates each submission, then allows the data provider to cleanse the data before funding calculations are made. Once calculations are complete, a number of reports are made available that give detailed feedback on submissions.

Service users

There are over 2,500 unique organisations in England that use the service, with about 6,000 users in total. Core users are responsible for managing and consuming data for financial forecasting and business intelligence within the following providers:

  • private training providers
  • local authorities
  • colleges
  • universities
  • employer providers
  • National Careers Service

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has carefully documented the specific needs of their professional user group, and keeps the needs as a living list which comes under review at each stage of development; user needs are written in plain English to reflect real needs as opposed to development-defined user stories
  • the team has developed simple personas which explain the needs and concerns of different types of user
  • there is a deep understanding of users, in terms of their different orientations, their background, and the language used. As an example, the team was able to defend their decision to retain well-understood educational jargon with reference to their knowledge of users and the learner data system
  • the user researcher has made great efforts to identify and include users with assisted digital needs and access needs, and has carefully considered the barriers that might prevent people with these needs from volunteering for research activities

What the team needs to explore

Before their next assessment, the team needs to:

  • continue thinking about inclusion and what it means, in the context of a service that is typically used by office-based professionals
  • continue to consider the end-to-end journey

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the user researcher and her team have managed to complete a considerable volume of user research, using multiple methods, despite the challenges of the pandemic and the restrictions of working from home (for both the team and their users)
  • in particular, the researcher was working closely with a network of DfE researchers to share research opportunities and findings, to prevent duplication of effort and to limit the burden of research on a stressed user base
  • the team makes great use of a variety of feedback mechanisms, from regular user group meetings to helpdesk data; this thinking about feedback as an ecosystem has really enabled the team to understand their users in the round
  • the team has tested and iterated their designs on a long list of issues, including name change, error reporting and tile design options
  • there is a clear plan for live, and a commitment to continuous research and improvement

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has a multidisciplinary team with the necessary user-centred design (UCD) roles; the only role missing is a performance analyst, who is currently being recruited
  • the team has reduced the ratio of contractors to permanent staff since the last assessment, including in key roles like user researcher, despite the difficulty in hiring

What the team needs to explore

The team needs to:

  • ensure that when the team moves into business as usual (BAU), the necessary roles remain on the team to continue improving the service, that is, UCD roles alongside testers and developers
  • recruit a performance analyst for the team

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is showing good agile practices that enable it to iterate, make incremental improvements and prioritise work
  • they are making changes and improvements based on retrospectives
  • they are adapting their working practices to the challenges of remote working due to COVID-19

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has clear, structured processes to evaluate and prioritise work coming into the team from a variety of sources
  • they have incorporated what they learnt from user research and other parts of the organisation, like the help desk, to make improvements to their service
  • there are defined processes that align with agile working practices to define, build and test user stories
  • there is a team in place, with plans for continuous improvement when the team transitions to BAU

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses a modern development approach, making full use of the capabilities offered by the Azure platform and gaining many of the advantages of using a public cloud
  • likewise, the team has made use of the GOV.UK Design System and not tried to rewrite it to fit the development framework used

What the team needs to explore

Before their next assessment, the team needs to:

  • mitigate the risk of vendor lock-in that comes with fully embracing Azure, which might act against being able to change cloud providers should DfE consider doing so for commercial reasons; the team should make sure changing cloud platforms is not too difficult by identifying and documenting whether similar functionality exists elsewhere and what effort it would take to switch over

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team is well aware of the security aspects of the service, has evaluated possible disruption and fraud vectors, and has tools and processes in place to deal with the risks
  • regular IT health checks will be run against the system during its lifetime

What the team needs to explore

Before their next assessment, the team needs to:

  • add a privacy policy page

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the source code is publicly available at https://github.com/SkillsFundingAgency?q=dc-&type=&language=

What the team needs to explore

Before their next assessment, the team needs to:

  • provide a description of each repository, a link to the service itself, and some documentation for the reusable components
  • reduce the number of repositories; the panel doesn’t see why the service’s source code should be scattered over 104 repositories even if, as the team claims, it has many reusable components, as this makes the source code very hard to navigate, especially with obscure repository names and sparse descriptions
  • group all the repositories developed by the team in a GitHub project, remove the many empty repos, and document the links between the non-empty ones

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used an open technology stack: C#, ASP.NET, HTML, CSS and the GOV.UK Design System

What the team needs to explore

Before their next assessment, the team needs to:

  • consider working with the Design System team at GDS to share any insights from modifications they may have made to the components or new ones they’ve created as a result of user research

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has an effective deployment process
  • the team can create new environments quickly and easily
  • the service is designed in a way that makes A/B testing possible and easy (a minimal illustration of the technique follows this list)
  • the service was tested for accessibility
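
As an illustration of what designed-in A/B testing enables, here is a minimal sketch of one common technique: assigning users to experiment variants deterministically, so a returning user always sees the same variant. The names used (AbTest, AssignVariant, userId) are hypothetical and are not taken from the team's implementation.

    using System;
    using System.Security.Cryptography;
    using System.Text;

    // Hypothetical sketch: deterministic A/B bucketing. Hashing the user and
    // experiment together means each experiment splits the user base
    // independently, and a given user always lands in the same variant.
    public static class AbTest
    {
        public static string AssignVariant(string userId, string experimentName)
        {
            using var sha = SHA256.Create();
            byte[] hash = sha.ComputeHash(
                Encoding.UTF8.GetBytes($"{experimentName}:{userId}"));

            // The first byte of the hash gives an even 50/50 split.
            return hash[0] < 128 ? "A" : "B";
        }
    }

Recording the assigned variant alongside analytics events then lets completion rates for the two groups be compared directly.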

What the team needs to explore

Before their next assessment, the team needs to:

  • continue its efforts to recruit testers

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the system is properly monitored via the hosting platform
  • the team have built a dashboard to monitor the health and performance of the service
  • there is a process for the team to be alerted if the service goes down, and to be able to re-deploy it quickly when it is fixed
  • the team has a way to notify its users if the service goes down

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate third-party smoke testing, independent of the cloud platform, so that alerting will still happen even in the case of a global Azure failure (a minimal sketch follows)
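
To illustrate this recommendation, here is a minimal sketch of an external smoke test that could run from infrastructure outside Azure, for example a third-party monitoring provider or a scheduled job on another cloud. The URL and the Alert method are hypothetical placeholders, not the service's real endpoint or alerting mechanism.

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    // Hypothetical sketch: probe the service from outside its own platform so
    // an alert still fires even if Azure (and its built-in alerting) is down.
    public static class SmokeTest
    {
        public static async Task Main()
        {
            using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(10) };
            try
            {
                var response = await client.GetAsync("https://example-service.gov.uk/health");
                if (!response.IsSuccessStatusCode)
                    Alert($"Service returned HTTP {(int)response.StatusCode}");
            }
            catch (Exception ex)
            {
                // Timeouts and connection failures are caught here: exactly the
                // failure mode a check hosted on the same platform could miss.
                Alert($"Service unreachable: {ex.Message}");
            }
        }

        // Stand-in for a real paging or notification integration.
        private static void Alert(string message) =>
            Console.WriteLine($"ALERT: {message}");
    }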

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using research with users and analytics to inform their design decisions
  • they are working with other service teams in DfE on the end-to-end journey, sharing insights and problems that arise
  • the team is feeding their work into the policy team that is responsible for the error validation rules, intending to reduce the number of error rules and their burden on the user
  • weekly meetings are held with the help desk team to understand their needs and provider queries; information such as reports has been made readily available in the help desk screens so users can assist providers
  • the digital inclusion scale has been used in the research to understand their users’ abilities, and research has been conducted across desktop, laptop, tablet and different browsers
  • the service has an accessibility statement, and the team carried out a WCAG accessibility audit; it has been challenging to find users with access needs from within their user base, and the team is now looking outside the user base to find them
  • relevant content and links to other services are present in the side navigation throughout the service

What the team needs to explore

Before their next assessment, the team needs to:

  • test the service with users who have access needs, including, for example, users who have dyslexia, dyspraxia or dyscalculia
  • keep refining the use of analytics within the design process, particularly around the number of errors, tasks that have to be performed more than once, and uncompleted tasks

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has continued to use GOV.UK Design System patterns, for example the summary list and table components
  • the service is available across mobile, tablet and desktop devices
  • where the team has needed to use patterns from outside the GOV.UK Design System, namely the tile design presented after the start page, they have tested them with their user base to make sure they are clear and understandable

What the team needs to explore

Before their next assessment, the team needs to:

  • take account of the considerable domain knowledge of its user base; however, where possible the language used should follow the GOV.UK style guide and the recommendations from the content audit
  • improve the current enquiry and feedback forms, which have some intricate design patterns that are long and cumbersome, in line with the GOV.UK Design System and with help from the design community

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service can only be accessed digitally
  • the team has made changes to the service to make it easier to use and has shared data on how improvements have reduced helpdesk queries
  • the team has been able to make some concessions around data being uploaded during the COVID-19 pandemic, when many users of the service were working remotely and usual business operations in the education sector were significantly impacted

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have been collecting performance data throughout their service lifecycle from a range of sources including web analytics, transaction data, survey feedback and the service desk
  • the team established baselines from the historic service which they are comparing to Submit Learner Data performance. This allowed the team to demonstrate a significant improvement when the new service was introduced in the Beta phase and continued improvement since
  • the team is working in partnership with others in the Education and Skills Funding Agency to understand the end-to-end user journey, ask the right questions and minimise duplicated data collection
  • the team segments their performance data based on main user groups to understand changing trends in user behaviour which they can use to improve the service.

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a defined performance framework with clear outcomes, measures and goals. This includes the 4 mandatory measures alongside service-specific performance indicators
  • the team has embedded performance analytics into its research and development process. They are monitoring performance data to identify trends, issues and opportunities. When improvements are introduced they monitor performance data to ensure it achieves the target outcome and make changes to improve further
  • there is a customer experience team using Submit Learner Data insights and data to monitor and improve the end-to-end performance beyond this service’s transaction.

What the team needs to explore

Before their next assessment, the team needs to:

  • appoint a performance analyst to lead on service analytics and insight; the team has a plan to recruit a performance analyst dedicated to Submit Learner Data, which will enable them to further develop their performance analytics, particularly with regard to web analytics
  • consider how the performance indicators and priorities will change as the service transitions into the Live phase, ensuring performance data remains a core component of their continuous improvement cycle

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is collecting data on the 4 mandatory indicators and using this to improve their service; they have the capability to provide data to the Performance Platform when a performance dashboard is available
  • the service-specific performance indicators they have identified are actively monitored and used to improve the service including beyond the digital journey
  • the team have developed dashboards for internal users (such as support teams and administrators) to monitor service activity and proactively detect performance issues.

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has shared their service with the CEO

Updates to this page

Published 22 January 2021