Prevent duty training: learn how to safeguard individuals vulnerable to radicalisation beta assessment

The report for the Prevent duty training: learn how to safeguard individuals vulnerable to radicalisation beta assessment on 15 April 2021

Digital Service Standard assessment report

Prevent duty training: learn how to safeguard individuals vulnerable to radicalisation (formerly called Prevent E-Learning)

From: Central Digital and Data Office
Assessment date: 15 April 2021
Stage: Beta
Result: Not met
Service provider: Home Office

Previous assessment reports

  • Alpha assessment report: January 2020 - Not met
  • Alpha reassessment report: June 2020 - Met

Service description

The service provides three training modules that aim to educate users about the Prevent programme, how to make an informed referral, and how the Channel programme works. Users will also have the opportunity to undertake a shorter refresher module. Each module provides a knowledge check at the beginning and end to help measure learning. The Home Office provides the training at no cost to users to ensure consistency with government policy. The training is aimed at all public sector workers under the Prevent duty. On completion of the training, users should have an awareness of where and how to refer individuals to the Prevent programme for further help. The platform will also feature a basic landing page providing details about the Prevent programme for the general public as well as the intended user base.

Service users

Prevent is a key part of the Government’s counter-terrorism strategy (CONTEST). The Prevent statutory duty was introduced through the Counter-Terrorism and Security Act (CTSA) 2015. Section 26 of the CTSA (2015) places a duty on certain bodies in the exercise of their functions to have ‘due regard to the need to prevent people from being drawn into terrorism’.

The duty requires local authorities, schools, colleges, universities, health bodies, prisons and probation, and police to prevent people from being drawn into terrorism as part of their day-to-day work. The service is aimed at public sector workers in the key sectors listed above.

There are 5.6m public sector workers in the UK (16.5% of UK employment). Of these, 1.78m work in the NHS; 1.49m in education; 2.01m in local government; 263,000 in the police; and 50,185 in prisons and probation (source: ONS public sector employment, December 2020).

The specified authorities are required to ensure that staff have training that gives them the knowledge and confidence to identify individuals at risk of being drawn into all forms of terrorism, and to challenge extremist ideas which can be used to legitimise terrorism and are shared by terrorist groups.

1. Understand user needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a sound understanding of users’ sectors, job roles and what they need from a training service
  • the team had made good use of personas as a tool for communicating about their users

What the team needs to explore

Before their next assessment, the team needs to:

  • better demonstrate what they know about their users beyond the functions of the service. For example, what are the high-level user needs the service meets? What motivation do users have for engaging with it? The user needs presented were mostly created by the team, for example that the service needs to be short, accurate and memorable
  • show a better understanding of the scale of the current and potential user base. The team found it difficult to answer questions about how many people use the current service and about their aspirations for take-up of the new service
  • ensure all user groups are represented and understood. The team answered questions about the needs of managers and training teams well, but these groups were not included in the presentation. These are likely to be people who can influence whether the service is used or not. The team needs to build upon what they learnt about managers and trainers during discovery and alpha to show whether the service is meeting their needs in beta
  • demonstrate why the service is the best solution to meet user needs. The team talked about the need to replace the old service as the motivation rather than why the evidence showed this route was the right one
  • show how they’re using their knowledge about why people don’t use the service to better meet user needs and increase uptake

2. Do ongoing user research

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have used an appropriate range of research methods
  • everyone has been involved in the research and analysis
  • the private beta sample was representative of the user group. However, the team referred to a male/female split which could be excluding non-binary, trans or intersex people
  • the riskiest assumptions and what had been done to address them was well presented
  • the team provided lots of examples of how the service had changed based on users’ experiences
  • there was good evidence that people could find and complete a training module
  • the team had vastly improved the regularity and range of work with users with accessibility needs, and this was reflected in changes to the design

What the team needs to explore

Before their next assessment, the team needs to:

  • test the end-to-end service for users who complete the first two modules on the new service and need to go on to complete the next two. It was unclear how users would understand what to do and where they could go for help. Opening up the service to public beta creates the potential for large numbers of people to fail at this point, meaning they cannot complete their task first time. As part of demonstrating the service works for everyone, this must be addressed as a priority
  • learn from the experiences of more people using the private beta service. The numbers who have completed it so far are a very small proportion of the existing and potential user group
  • include more detail in their user research plan for the next phase. Although the themes outlined were the right ones, it was unclear how working with public beta users, those exposed to the next private beta and the testing of new functionality would be managed
  • better demonstrate how research will contribute to team understanding of whether the project outcomes are achieved. It’s helpful to see that people feel the course length is appropriate or pass the knowledge checker. But how will research be used to show whether there is a positive impact on the overall Prevent agenda?

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • there’s a breadth of disciplines working well together on the team
  • user research capacity appears to have been strengthened between assessments, in line with the alpha recommendations
  • there was clear governance and the team said they were empowered to make decisions day to day
  • the plans to transfer knowledge and responsibility to the Shared Application Service team (SAS) looked thorough and well planned

What the team needs to explore

Before their next assessment, the team needs to:

  • (start to) recruit a Product Manager. Building on comments from the alpha assessment, the assessors keenly felt that this role/voice was absent. For example, the articulation of the service felt very policy driven, and the vision and user-centred purpose of the service didn’t shine through. We felt that this role/perspective would also help with some of the other gaps identified in this report. We would further recommend that the service recruit a Product Manager ahead of full public beta/live
  • share clearer evidence of how the team worked together. This is linked to a later point about a lack of understanding among the panel of the differences between the three teams (Homeland Security, LiveWork, Rehab) and how they interacted

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • they selected an appropriate methodology (Kanban) to fit the team’s needs and help get the work done
  • the team had found a way to work well together through the pandemic, making use of lots of different technology to keep communicating, collaborating and making progress
  • there was wide engagement and good collaboration with experts, interested stakeholders and other teams across government
  • there were many good examples of how research insights had helped to iterate and improve designs

What the team needs to explore

Before their next assessment, the team needs to:

  • explain more about the team’s structure and how they interact. We’d like to better understand how the three teams are set up and the ways they interact, prioritise and work together
  • consider more frequent team retrospectives to allow the people on the team to pause, reflect, review and iterate the ways they work together. Holding retros only at the major milestones sounded like too long a gap

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • there was lots of interesting evidence of designs evolving in reaction to user feedback (such as the sidebar/progress navigation element)
  • data from sources other than user research was playing a part. For example, the knowledge that users made contact to correct certificate spelling led to the introduction of the ‘check your details’ step into the user flow
  • there was a detailed plan on the route through private and public beta

What the team needs to explore

Before their next assessment, the team needs to:

  • evidence and test how users of the live beta will access the other course content - presumably via the existing live service (which is also being tested in private beta). We’d like to have heard a bit more about these journeys and how they’ll be managed
  • explain more about how the team takes insights and prioritises, relative to each other, all the things to fix, test and learn from - on reflection, the lead assessor would have liked to have heard a bit more about this

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team thoroughly investigated the market regarding use of a Learning Management System (LMS)
  • the team has valid reasons for selecting a Content Management System (CMS) over an LMS, and the CMS being used (Drupal) is already in use within the Home Office
  • the team is making use of existing Home Office components in AWS, such as the containerisation platform, the existing Git/Drone setup for CI/CD, and ACP for alarming and alerting

What the team needs to explore

Before their next assessment, the team needs to:

  • before going live, satisfy itself that the choice of a CMS over an LMS remains valid, especially given changes in the open-source market. It is not proposed that this be formally reviewed, simply that the team is content the CMS solution remains right for them

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has recognised the need for privacy of user data, has minimised the user data it needs to hold, and time-bounds the storage of that data
  • strong protocols are in use for administrative users, and other techniques (for example cookies with tags, and an IT health check (ITHC)) have been implemented
  • mutual authentication between APIs is in place (a general sketch of the technique follows this list)
  • a DPIA has been produced and registered within Home Office
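
As a brief illustration of the mutual authentication technique: the sketch below shows a client presenting its own certificate while verifying the server against a private certificate authority. This is a minimal sketch only; the report does not describe how the team’s implementation works, so the URL, certificate paths and library choice are assumptions.

```python
import requests

# Hypothetical values for illustration; the report does not describe
# the team's actual mutual TLS configuration.
API_URL = "https://api.example.internal/health"
CLIENT_CERT = ("client.crt", "client.key")  # this service's certificate and private key
CA_BUNDLE = "internal-ca.pem"               # CA that signed the peer service's certificate

# The client verifies the server against CA_BUNDLE, and the server can in turn
# verify the client via the presented client certificate - hence "mutual".
response = requests.get(API_URL, cert=CLIENT_CERT, verify=CA_BUNDLE, timeout=10)
response.raise_for_status()
```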

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the storage of certain media on an open platform (YouTube), and the resulting reliance on a third party (Google), remains valid and is properly documented and risk assessed
  • have a mechanism to ensure that media hosted on third-party platforms has not been tampered with before it is played to a user (one possible approach is sketched after this list)
  • consider giving a security architect higher visibility within the assessment process
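
One possible approach to the tamper check, assuming the team records a known-good SHA-256 digest for each asset when it is published, is to verify the fetched media against that digest before it is played. The URL and digest below are placeholders, and for platforms that do not expose raw files the check would instead apply to any media the team mirrors or serves itself.

```python
import hashlib
import urllib.request

# Placeholder values for illustration; the real digest would be recorded
# at publication time and stored somewhere the service trusts.
MEDIA_URL = "https://media.example.com/prevent-module-1.mp4"
EXPECTED_SHA256 = "replace-with-known-good-digest"

def media_is_untampered(url: str, expected_sha256: str) -> bool:
    """Fetch the asset and compare its SHA-256 digest with the recorded one."""
    digest = hashlib.sha256()
    with urllib.request.urlopen(url) as response:
        for chunk in iter(lambda: response.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

if not media_is_untampered(MEDIA_URL, EXPECTED_SHA256):
    raise RuntimeError("Media digest mismatch - do not play this asset")
```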

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is continuing to use existing Home Office processes and procedures for making source code open where possible, and is storing the code in an appropriate location (GitHub)

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made use of an appropriate technology (a CMS). The CMS in use, Drupal, is already extensively used and supported within the Home Office
  • the team is making use of existing services and patterns within the Home Office

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is making use of automated testing and that testing is plugged into the pipeline tools

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the “back doors” the team knows exist are fully removed, and tested as such, prior to go-live (a minimal test sketch follows this list)
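
One lightweight way to enforce this, assuming the back doors are HTTP routes, is an automated pipeline check asserting that each known test-only route is unreachable in the deployed service. The base URL and paths below are invented for illustration; the report does not name the actual back doors.

```python
import pytest
import requests

BASE_URL = "https://training.example.gov.uk"  # placeholder, not the real service URL

# Hypothetical test-only routes; substitute the back doors the team actually knows about.
BACK_DOOR_PATHS = ["/debug/login-as", "/test/reset-progress"]

@pytest.mark.parametrize("path", BACK_DOOR_PATHS)
def test_back_door_is_removed(path):
    """Each known back-door route should no longer be reachable in production."""
    response = requests.get(BASE_URL + path, allow_redirects=False, timeout=10)
    assert response.status_code == 404
```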

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • they had considered the user journey when the service goes down
  • the ‘this service is down’ message is proportionate
  • the team ensured that while the “front end” service may be up and available, media stored on other platforms may not be, in which case the user receives an appropriate message

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the use of a single AWS availability zone is appropriate for their needs and user expectations. It is understood that there may be little benefit in using multiple availability zones, and that the time taken to redeploy may create a negative user perception. This could be clarified by a better understanding of the “time to switch” (for example, is this seconds to minutes, or minutes to hours?). This was not well understood at the time and is an observation rather than a recommendation

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has researched and designed a new service name
  • the team is measuring how easily users progress through the service
  • the team is keen to find not just a solution, but the right solution (for example to the need to come back to a course without resorting to an account model)
  • some of the video content is tailored to the sector the user works in

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that users are getting the right information on the start page - it should tell users what the service will let them do rather than being an introduction to the general subject
  • explore whether the service name needs all the content in it that it has now - if the users who need the service already know “Prevent duty” and “Prevent training”, do they need the further explanatory part of the service name, or would that work ok in the start page text (this exploration may result in a name that begins with a verb)
  • make ambiguous questions clearer (for example “Enter your country”, which could mean a number of things - see the example country question), and ensure questions and answers match (for example in the self-assessment “how well do you understand” question)
  • ensure that users can recover from validation errors by using the standard error summary and error message patterns - this includes avoiding generic error messages
  • do a full check on how the service works and looks on mobile devices (for example the hint text on the job role selection screen)
  • find out the extent of difficulties with accessing course videos from workplace networks, and work out how to minimise or mitigate them (for example exploring whether all videos can be made suitable for YouTube’s “restricted mode”, or whether users might need warning in advance about this potential difficulty)
  • make sure there’s a clear way for users moving onto courses not yet included in the beta service to navigate from the beta service to the old service
  • discover where users are dropping out of the service and why

13. Make the user experience consistent with GOV.UK

Decision

The service did not meet point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made use of existing patterns and components
  • the team was confident enough to move away from a common pattern where it was not working well for their users - the panel would encourage the team to share the research findings as they relate to a known gap for the confirmation page pattern
  • the team has validated the user need to track progress within training modules, has designed a pattern that is working for their beta users, and is planning to continue iterating this pattern - the panel would encourage the team to share this pattern and research findings with the design system community

What the team needs to explore

Before their next assessment, the team needs to:

  • identify all the places where the service has deviated from the specifics of standard components and patterns, and ensure there is a validated user need for having done so (for example, using a blue secondary button rather than a grey one)
  • work out whether the check your answers stage for the initial country/role questions and the pre and post training knowledge check questions are meeting a genuine user need – how often do people get things wrong, and how important is it when they do
  • check capitalisation in the service name - usually only the first word would start with a capital, apart from an initiative name like “Prevent” (but in other places on GOV.UK it’s “Prevent duty” not “Prevent Duty”)
  • get a second content designer to do a full 2i (including proofreading, style check and plain English) of the beta service and prototype - this should be as soon as possible
  • have an arrangement in place for 2i-ing any changes to the service or course content on an ongoing basis, before the changes go live (this is standard practice for government services - for example, having a 2i buddy, or Home Office content designers operating a rota)

14. Encourage everyone to use the digital service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • there were lots of great examples of improving the accessibility of the service. Indeed, it was clear during the assessment that this issue was really important to the team
  • there were references to checking customer calls and thinking about how to solve these issues online (such as certificate typos)
  • there were plans to review the face-to-face offering of the service as the digital aspect evolves

What the team needs to explore

Before their next assessment, the team needs to:

  • talk about service promotion. It would be good to hear more about how they’ll promote the service to increase take-up. As previously mentioned, the service usage targets/ambitions felt slightly vague
  • consider the competition. We felt that some research and assessment of providers who offer similar training would help the team know who they’re up against and their comparative strengths and weaknesses. Plus, they’d get a sense of the choices their (potential) users are making when deciding where to do the courses
  • make as much progress as they can with the face-to-face work, share what they learn, and outline the plan for the remaining work
  • as previously mentioned, consider other users, such as line managers, and how their influence affects usage

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were at a good stage in their data collection development. For example, Google Analytics has been set up using GA4, and the team were able to explain in detail how the Google Analytics configuration has been set up to support the data protection impact assessment (DPIA)
  • the team used a mixture of quantitative and qualitative methods to create their KPIs, and were able to show how the methods support each other
  • they had mapped their KPIs and metrics for each stage of the E-learning journey

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how to measure assisted digital or face-to-face training

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has created a performance framework in order to measure how well the service meets user needs and policy intent
  • the team were able to explain how they are getting their data, as well as creating methods to understand data loss due to the cookie consent model
  • the team has created a dashboard to visualise their KPIs and metrics for internal use
  • the team is planning to use legacy GA metrics to baseline the performance of the new service when it is statistically meaningful to do so

What the team needs to explore

Before their next assessment, the team needs to:

  • whilst the team have mapped KPIs for each stage of the journey, it would be good to hear what the team could do to check that the service isn’t just being used as a tick-box exercise. If the service is trying to boost ability/confidence to safeguard, how could they track/measure that over time?
  • continue to use performance indicators to identify problem areas. Whilst the team have identified a cycle of KPI feedback, it might be beneficial to consider a more regular review of performance in the next phase of development
  • consider running an accessibility audit on the design of their dashboard in order to ensure that the dashboard is accessible to all colleagues

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has created a machine-readable CSV to show KPIs and other relevant metrics to share externally (an illustrative sketch follows)
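
For illustration only, an export of this kind can be produced with a few lines of code. The metric names and figures below are invented; the report does not list the fields in the team’s actual CSV.

```python
import csv

# Hypothetical KPI rows; substitute the team's real metrics and reporting periods.
kpis = [
    {"metric": "module_completion_rate", "period": "2021-03", "value": 0.82},
    {"metric": "knowledge_check_pass_rate", "period": "2021-03", "value": 0.91},
]

with open("kpis.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["metric", "period", "value"])
    writer.writeheader()
    writer.writerows(kpis)
```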

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are seeking to test with a senior stakeholder (Department Director)

What the team needs to explore

Before their next assessment, the team needs to:

  • get the session with the Department Director planned and booked in

Updates to this page

Published 11 June 2021