Prevent E-Learning Alpha Assessment
Digital Service Standard assessment report
From: Central Digital and Data Office
Assessment date: 21/01/20
Stage: Alpha Assessment
Result: Not met
Service provider: Home Office
Service description
The service provides three training modules that educate users about the Prevent programme, how to make an informed referral, and how the Channel programme works. Users will also have the opportunity to undertake a shorter refresher module. Each module provides a knowledge check at the beginning and end to help measure learning. The Home Office provides the training at no cost to users to ensure consistency with government policy. The training is aimed at all public sector workers covered by the Prevent Duty. On completion of the training, users should have an awareness of where and how to refer individuals to the Prevent programme for further help. For the general public as well as the intended user base, the platform will also feature a basic landing page providing details about the Prevent programme.
Service users
Prevent is a key part of the Government’s counter-terrorism strategy (CONTEST). The Prevent statutory duty was introduced through the Counter Terrorism and Security Act (CTSA) 2015. Section 26 of the CTSA (2015) places a duty on certain bodies in the exercise of their functions to have ‘due regard to the need to prevent people from being drawn into terrorism’.
The Duty requires local authorities, schools, colleges, universities, health bodies, prisons and probation, and police to prevent people from being drawn into terrorism as part of their day-to-day work. The service is aimed at public sector workers from the key users listed above.
There are 5.42m public sector workers in the UK (16.5% of UK employment) – of these, 1.69m work in the NHS; 1.5m in education; 2.02m in local government (including 238,000 police); and 45,434 in prisons and probation. In contrast, the private sector employs 27.34m (source: ONS public sector employment, September 2019).
The specified authorities are required to ensure that staff have training that gives them the knowledge and confidence to identify individuals at risk of being drawn into all forms of terrorism, and to challenge extremist ideas which can be used to legitimise terrorism and are shared by terrorist groups.
1. Understand user needs
Decision
The service did not meet point 1 of the Standard
What the team has done well
The panel was impressed that:
- the team has strived to recruit participants with various impairments and access needs
- the team has conducted research with participants from various sectors
- the team has strived to define user needs and identify groups of users
What the team needs to explore
Before their next assessment, the team needs to:
- conduct research (discovery) with a sample of people from all the public sector groups that Prevent applies to, to understand their needs for customisation of the training material on the Prevent platform. This should help the team explore how the requirements and needs of users working in one sector (e.g. education) compare with those of users working in another (e.g. health)
- look at the landscape of training on Prevent provided by different organisations, aiming to understand:
  - users' routes into training not provided by the Home Office
  - where users can choose which training to take, what the incentives are for them to choose other training rather than the Home Office's – both from a service perspective (e.g. convenience, a better experience) and from a content perspective (e.g. content tailored to their context or profession)
  - the specific needs different professions and organisations have regarding training on Prevent
- redefine the user groups to make them robust and ensure they reflect the findings from all the research. The current user groups focus on the level of previous experience users have with the service rather than on the characteristics and needs of each group
- first identify the problems the end users are facing, then map out the current user journey
- continue to conduct research with people with access needs and impairments, but not invest in an accessibility audit until the service passes alpha
- consider how the platform could also serve communities, families and friends; the existing data indicates that these groups also make referrals through Prevent, so they would also benefit from this service
2. Do ongoing user research
Decision
The service did not meet point 2 of the Standard
What the team has done well
The panel was impressed that:
- the team has been conducting user research on various prototypes with a wide range of end users
- the team has completed a number of rounds of research within a relatively short amount of time
- the team has strived to incorporate user research findings in developing the prototypes
What the team needs to explore
Before their next assessment, the team needs to:
- get a better understanding of the landscape of end users and stakeholders, and understand the constraints that affect the service
- understand how the service the team is developing will link to other services
- map out the user journey/journeys for the service and the pain points based on findings from the further research
- explore how the service can help increase consistency in training nationwide, considering where it links with other existing training and its providers – for example, by better catering for specific needs so that users choose the Home Office's training, by linking up with organisation-specific training so that the Home Office's modules are embedded within it, or through a campaign and engagement strategy
- explore a wider range of alternative solutions, striving to focus on what fits the end users. The team should question whether a training service is even the best solution for the wider problems identified
- while it is good to do an accessibility audit, it is best to delay it until the team is confident that the prototype meets the end users' needs
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard
What the team has done well
The panel was impressed that:
- the team has a clear understanding of roles and responsibilities amongst their core team; there is a clear separation of roles
- the team currently has a policy lead embedded in the project performing the role of a service manager; the policy lead has started to engage the DDaT profession to upskill as a service manager going forward
- the team has expressed that they are empowered to make decisions about delivery plans to ensure that the work is meeting a user need
What the team needs to explore
Before their next assessment, the team needs to:
- consider the impact of upskilling the policy lead into the role of service manager, and consider bringing a product manager in-house as an additional resource to support the core team
- review the workload between service design and user research, consider recruiting an additional user researcher to help with the UR recommendations above
- produce a knowledge transfer and transition plan for Beta to ensure that the technical knowledge and understanding is kept internally. There is also an opportunity for Home Office UR and Design employees to be upskilled as part of this project that should be considered in Beta
4. Use agile methods
Decision
The service met point 4 of the Standard
What the team has done well
The panel was impressed that:
- the team is using a range of agile ceremonies to plan, prioritise and execute their work, including daily stand-ups, Kanban boards, sprint planning, sprint reviews and retrospectives
- the team is currently co-located which has enabled a more cohesive and collaborative environment
- the team is making good use of collaboration platforms such as Slack, Trello and Google Drive
What the team needs to explore
Before their next assessment, the team needs to:
- create an outcomes-focused roadmap for the Beta phase, informed by user research and data collection as part of the iterative process
- explain how user research has led to design options being discarded between the two prototypes currently being tested
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard
What the team has done well
The panel was impressed that:
- the team is iterating and testing the two prototypes with users. They are planning more user research and testing around understanding feedback on the two different front end designs
- the team has thought about different ways of delivering the content for the refresher course
- the team has thought about reducing the time it currently takes to complete the course to help users finish in one sitting
- the team is collaborating with content designers to iterate the training module materials, to ensure they are up to date and understood by users
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard
What the team has done well
The panel was impressed that:
- in determining the best way to proceed for their problem space, the service team investigated learning management system (LMS) and content management system (CMS) platforms to fully understand how each works, and their benefits and limitations. When they found that an LMS would not be suitable for them, they turned to CMS options and found that approach the ideal choice. In evaluating CMS solutions, they set themselves clear criteria: open source, security, accessibility, ease of content management, usage in other government solutions, and a headless CMS so that they have the ability to switch solutions in the future
- the service will include video content, and the service team looked at various ways of storing and streaming video. YouTube was found to tick many of the boxes they had, including accessibility, and the team will be doing further work to determine whether the transcription feature is suitable for the Welsh language or whether they will require translation and localisation
- the service team selected a technology stack and designed architecture that will allow them to be flexible in beta. If they find something doesn’t work for them they will have the ability to change direction easily and quickly
What the team needs to explore
Before their next assessment, the team needs to:
- the service team has selected the Next.js React framework, and concerns were raised about what the user journey would be like for users who are not using JavaScript. This needs to be looked at closely as the service team progresses, to ensure that all users will be able to use the service: this is not a small service, it is looking to accommodate some 5M+ users, and it must be fully accessible. The service team noted that this is something they will be exploring in beta, and the panel would like to stress that it should be closely scrutinised as development continues, and that the team should be ready to change direction if necessary (an illustrative sketch of one possible approach follows this list)
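One route the team could investigate in beta is server-side rendering combined with a plain HTML form fallback, so the core journey still works without client-side scripting. The sketch below is purely illustrative and assumes a hypothetical page, route, question text and field names; it is not the team's actual design.

```tsx
// pages/modules/[module]/check.tsx – illustrative sketch only; the file path,
// route and field names are assumptions, not the team's implementation.
import { GetServerSideProps } from 'next';

type Props = { moduleTitle: string };

// Server-side rendering sends complete HTML, so the page can be read and
// navigated even when client-side JavaScript is unavailable.
export const getServerSideProps: GetServerSideProps<Props> = async ({ params }) => ({
  props: { moduleTitle: String(params?.module ?? 'Prevent awareness') },
});

export default function KnowledgeCheck({ moduleTitle }: Props) {
  // A plain HTML form posts to a server route, so answers can be submitted
  // without any JavaScript; React can progressively enhance it where available.
  return (
    <form method="post" action="/api/knowledge-check">
      <h1>{moduleTitle}: knowledge check</h1>
      <fieldset>
        <legend>Example question text goes here</legend>
        <label>
          <input type="radio" name="answer" value="option-a" /> Option A
        </label>
        <label>
          <input type="radio" name="answer" value="option-b" /> Option B
        </label>
      </fieldset>
      <button type="submit">Continue</button>
    </form>
  );
}
```

Testing a build of the prototype with JavaScript disabled in the browser would quickly show whether this kind of fallback is working for the knowledge-check journey.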
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard
What the team has done well
The panel was impressed that:
- the service team is keenly aware of concerns around GDPR and securing personal data, and has taken the approach of asking what data is right for them to hold and what is actually required
- the service team has designed their architecture so that the user-facing layer cannot act maliciously on the system, and the internal layer has a high level of controls around it. The two user groups are kept separate from each other, ensuring there is no accidental overlap and mitigating potential threat vectors
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the service team plans to make the service open source and will be using GitHub in beta
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has made it a point to look for solutions already in use in government when available
- the service team has integrated common platforms into their planned beta architecture and will be using GOV.UK Notify (an illustrative sketch of such an integration follows this list)
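As a purely illustrative aside on the Notify integration, GOV.UK Notify's Node.js client (notifications-node-client) is typically called as below. The template, personalisation fields and the trigger (a completion email) are assumptions made for illustration, not the team's agreed design.

```ts
// notify.ts – minimal sketch of a GOV.UK Notify call using the official
// notifications-node-client; the template and trigger are illustrative only.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

// Hypothetical example: email a user once they complete a training module.
export async function sendCompletionEmail(emailAddress: string, moduleName: string) {
  await notifyClient.sendEmail(
    process.env.COMPLETION_TEMPLATE_ID as string, // template set up in Notify
    emailAddress,
    {
      personalisation: { module_name: moduleName },
      reference: `completion-${moduleName}`, // client reference for support queries
    },
  );
}
```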
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the service team’s technology stack selection, flexibility to change and plans for beta give them a solid foundation to build on
What the team needs to explore
Before their next assessment, the team needs to:
- while it was discussed during the panel session, it should be highlighted that end-to-end testing should cover the case of YouTube being offline and the service falling back to the backup hosted content, to ensure this does not cause any unforeseen problems (a sketch of one possible fallback approach follows)
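One possible shape for the backup behaviour is sketched below. The availability probe (loading the video's thumbnail) and the component API are assumptions made purely for illustration; the team's actual fallback mechanism may differ, but whatever it is, the end-to-end tests should exercise both branches.

```tsx
// VideoWithFallback.tsx – illustrative sketch, not the team's implementation.
import { useEffect, useState } from 'react';

type Props = { youtubeId: string; backupSrc: string; captionsSrc?: string };

export function VideoWithFallback({ youtubeId, backupSrc, captionsSrc }: Props) {
  const [useBackup, setUseBackup] = useState(false);

  useEffect(() => {
    // Rough availability probe: load the video's thumbnail image. If it fails
    // (YouTube down, or blocked on the user's network), switch to the backup
    // content hosted by the service itself.
    const probe = new Image();
    probe.onerror = () => setUseBackup(true);
    probe.src = `https://img.youtube.com/vi/${youtubeId}/hqdefault.jpg`;
  }, [youtubeId]);

  if (useBackup) {
    return (
      <video controls>
        <source src={backupSrc} type="video/mp4" />
        {captionsSrc ? <track kind="captions" src={captionsSrc} srcLang="en" /> : null}
      </video>
    );
  }

  return (
    <iframe
      title="Training video"
      src={`https://www.youtube-nocookie.com/embed/${youtubeId}`}
      allowFullScreen
    />
  );
}
```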
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the service team is thinking about all the right things and asking the right questions and acknowledged that plans would be fully implemented during beta
What the team needs to explore
Before their next assessment, the team needs to:
- explore other offline solutions alongside face-to-face training for when the digital service is down, and consider options for users who may not have the time to book a face-to-face session with trainers
12. Make sure users succeed first time
Decision
The service did not meet point 12 of the Standard.
Regarding the solution presented at the service assessment:
What the team has done well
The panel was impressed that:
- the team is invested in reducing the duration of the training modules to meet the current average time of a website visit, so users can complete the training in one sitting
- the team has made a considerable effort to make the training content as easy as possible for users to absorb and retain the key messages
What the team needs to explore
Before their next assessment, the team needs to:
- in line with the recommendations provided in point 2, it’s advisable that, before developing the presented solution further, the team explore different solutions to address the problems identified in user research, considering the wider context and how this service links to the existing Prevent training ecosystem
- we recommend the team focus their efforts on designing the service, and not just the training content, as we understand has been the case so far. During the assessment, the team presented just part of a single user journey as their prototype. As much as possible, they should prototype and test entire journeys, including how different users find and access the service and any steps that happen before and after the training modules (routing according to sector, issuing proof of completion, etc), and provide evidence that users are succeeding first time. The team should test the end-to-end service experience even if the training modules themselves are not all developed – aim for the minimum ‘condensed’ journey that could simulate the end-to-end experience within the practical and time constraints for testing it
- explore reviewing the service name to help all users find the service and understand what it does. The name ‘Prevent e-learning’ can be quite confusing for those unfamiliar with it, whilst ‘Support people when they need it most’ could apply to multiple government services. You can find guidance on naming services in the Service Manual
13. Make the user experience consistent with GOV.UK
N.B. The team presented two prototypes – one following GOV.UK style and an alternative style. The decision and feedback below apply to the GOV.UK prototype.
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team was able to successfully use the GOV.UK patterns and style in the e-learning context, creating a pleasant experience
- the team explored the use of experimental patterns such as Task List
What the team needs to explore
Before their next assessment, the team needs to:
- understand if there is a clear user need for tracking progress within the training modules and, if that is the case, explore and test ways to deliver it other than the step-by-step pattern (which cannot be used to track progress within a service). For example, this could be done using the Task List pattern, with users returning to the task list between module sections, where completed sections would appear as ‘complete’ as the user progresses (see the sketch after this list). It could also be done by developing a new pattern that is consistent with GOV.UK and that is validated by user research
- conduct user research considering the different use contexts and environments – for example, consider conducting the testing on mobile devices
- continue to focus on what works best for the end users rather than what people like, as is best practice for government services (see the guidance on writing for GOV.UK: https://www.gov.uk/guidance/content-design/writing-for-gov-uk)
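If research does show a clear need for progress tracking, one way a task-list style page between module sections might look is sketched below. The class names follow the experimental GOV.UK task list pattern, and the data shape and section names are assumptions for illustration only, not a confirmed design.

```tsx
// ModuleTaskList.tsx – illustrative sketch only; section names, data shape and
// the experimental task list class names are assumptions, not a confirmed design.
type Section = { title: string; href: string; completed: boolean };

export function ModuleTaskList({ sections }: { sections: Section[] }) {
  return (
    <ol className="app-task-list">
      {sections.map((section) => (
        <li className="app-task-list__item" key={section.href}>
          <a className="app-task-list__task-name" href={section.href}>
            {section.title}
          </a>
          {/* The user returns to this list after each section, so completed
              sections are visibly tagged without using the step-by-step pattern. */}
          <strong className="govuk-tag">
            {section.completed ? 'Completed' : 'Not started'}
          </strong>
        </li>
      ))}
    </ol>
  );
}
```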
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team has conducted extensive research into what is wrong with the existing service. Collating an extensive list of pain points with the current service has helped the team understand how to make the service work better for everyone
- the team has tested both prototypes at a digital accessibility centre and has fed the recommendations into their plans for Beta
- the team used multiple methods to recruit users with access needs
What the team needs to explore
Before their next assessment, the team needs to:
- test the service with a range of users within the assisted digital community
- demonstrate an understanding of how the current face to face service is meeting user needs and what opportunities there might be to improve the assisted digital experience in line with the new service
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team has recognised that the lack of data from the existing service has made it difficult to assess its effectiveness. They have a structured set of key performance indicators in a framework which helps to identify what data they need to collect
- the team has thought about performance and data analysis extensively, choosing to integrate the new solution with Google Analytics
- the team has made data accessible through open APIs
What the team needs to explore
Before their next assessment, the team needs to:
- collect data that provides insights into a wide range of users' experiences
- create a roadmap to indicate how data collected will be fed back into the product backlog for development
- explain how users' journeys through the whole service will be tracked, and identify any additional data needed at different touch points (an illustrative sketch follows this list)
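On journey tracking specifically, a minimal sketch of how touch points might be recorded with Google Analytics (which the team plans to integrate) is shown below. The event names and parameters are assumptions for illustration; the team will need to define these against their KPI framework.

```ts
// analytics.ts – illustrative sketch; event names and parameters are assumed
// for illustration and are not the team's agreed measurement plan.

// gtag is provided globally by the Google Analytics gtag.js snippet.
declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>,
): void;

type TouchPoint = 'landing_page' | 'module_started' | 'knowledge_check' | 'module_completed';

// Record each touch point in a user's journey so the end-to-end experience,
// not just module completion, can be analysed and fed back into the backlog.
export function trackTouchPoint(step: TouchPoint, moduleId?: string) {
  gtag('event', step, {
    module_id: moduleId,
  });
}
```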
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified what their short, mid and long term desired outcomes for the programme should be
- the team has planned how they will track uptake and user satisfaction, supported by a detailed journey logic map that shows how these KPIs will be achieved
- the team has a good understanding of what success measures they can track and what data is outside of their scope
What the team needs to explore
Before their next assessment, the team needs to:
- understand what other initiatives are working alongside Prevent, in order to have a better overview of how Prevent contributes to the overall shared outcome
- there are statistics on how many referrals were made under the Prevent programme; it would be good for the team to understand the success of the existing service and what worked well in the lead-up to achieving the desired outcomes
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team intends to integrate with the GOV.UK performance platform and has been in touch with the GDS team about the steps to do this during Beta
What the team needs to explore
Before their next assessment, the team needs to:
- provide the new service performance dashboard through continued engagement with the performance platform team
- explain what the data shows when compared against the existing service and how the data has helped inform changes
18. Test with the minister
Does not apply for Alpha