
Lessons Management Best Practice Guidance

Published 30 September 2024

Introduction

Purpose

1. In the civil contingencies resilience context, the purpose of learning lessons and capturing positive practices is to drive continual improvements in the way that individuals, teams, departments, organisations, and multi-agency partners anticipate, assess, prevent, prepare for, respond to, and recover from emergencies. Learning is therefore integral to managing risks.

2. Lessons identified from a range of sources can play a vital role in directing work to prevent the repetition of past mistakes, driving preparedness activity, and reducing losses in the event of disruption. Lessons can also be harnessed to help develop individual competencies, organisational/departmental capabilities, and enhance multi-agency collaboration.

3. As part of a participatory approach to emergency management, learning processes provide opportunities to consider the impacts of emergencies on individuals, and to build Community Resilience. This can help to empower the resilience community to understand, enable and integrate the capabilities of the public into emergency planning, response and recovery activity.[footnote 1]

Context

4. The importance of learning lessons is clearly articulated in legislation[footnote 2], reflected in national standards[footnote 3] and encouraged through good practice[footnote 4] across sectors and stakeholders.

5. In line with the core principles of the Resilience Framework[footnote 5], well managed lessons can support UK resilience within and across the government departments, organisations and partnerships that make up the UK resilience community[footnote 6] by:

  • informing a shared understanding of the civil contingencies risks we face
  • indicating and directing opportunities for prevention rather than cure
  • offering ‘whole of society’ learning benefits

6. This guidance reflects the UK’s guiding principles of Emergency Preparedness and Response. It is linked with the Resilience Framework (2022)[footnote 7] and the Exercising Best Practice Guide (2024).[footnote 8] A full list of linkages to be read in conjunction with this document is provided in Annex 1.

Figure 1: A Strategic Approach to Resilience - Core Principles

  • Shared and developed understanding of risk
  • Prevention rather than cure
  • Whole society endeavour

Scope

7. This complementary guidance is non-statutory and non-mandatory. Nothing within it places additional obligations on users. It has been designed to complement existing learning activities, to be used in conjunction with established lessons platforms, and to support continual improvement at national and local levels.

8. The UK’s approach to civil contingencies is based on the principle of subsidiarity, where decisions are taken at the lowest appropriate level (individual, community, locality, or national), with coordination at the highest necessary level. This guidance reflects that principle in the context of decisions relating to lessons.

9. The guidance acknowledges the importance of lesson sharing and/or tracking platforms. There are existing structures, platforms and processes being used to capture and share lessons identified from exercises and emergencies within and across the resilience community.[footnote 9] [footnote 10] The continued use and development of existing, contextualised systems to support the lessons management arrangements, such as JESIP’s Joint Organisational Learning platform (JOL Online), is encouraged.

Aim

10. The aim of this guidance is to inform, encourage, and equip senior leaders, central government departments, agencies, arm’s length bodies, and wider resilience professionals in the effective management of lessons.

11. The guidance has been developed with regard to existing good practice guidelines[footnote 11] [footnote 12], including the recently published Organisational Resilience Guidance for UK Government Departments, Agencies and Arm’s Length Bodies (ALBs).[footnote 13]

Objectives

12. This guidance has been developed to deliver against the following objectives:

  1. To support and strengthen the management of lessons in the civil contingencies resilience context.
  2. To support and inform the identification of evidence-based lessons, from a range of learning sources, including emergency exercises and incidents.
  3. To support and inform the prioritisation and practical implementation of learning in response to lessons identified.
  4. To support and inform the retention of implemented learning, through consolidation and embedding of change, within and across organisations.

Structure

13. In line with these objectives, the guidance comprises the following sections: Lessons Management; Lesson Identification; Lesson Prioritisation; Lesson Implementation; and Embedding Learning and Change.

Key Definitions

14. In the civil contingencies resilience context, the term ‘lesson’ can embody multiple, related terms and/or concepts. Core definitions are presented below for clarity. A full glossary of terms relating to this guidance can be found in Annex 2.

  • a ‘Lesson’ articulates an update in knowledge or understanding that has been gained through experience[footnote 14]
  • a ‘Lesson Identified’ refers to an evidenced conclusion, based on analysis of observations and insights. It describes a problem/issue, details a root cause, and sets out a course of action to achieve positive improvements in practice
  • a ‘Lesson Implemented’ refers to an identified lesson that has become ‘learned’ after being actively addressed through a lesson implementation process. It results in measurable changes in behaviour, and positive, evidenced improvements in practice
  • ‘Embedded Learning/Change’ refers to changes implemented in response to a lesson identified, that have since been integrated and consolidated in context, evidenced as retained over time, and remain consistently demonstrated in practice
  • a ‘Notable Practice’ refers to a positive or innovative action that was observed to achieve better than anticipated outcomes

Lessons Management

15. Lessons Management refers to a strategic, organised approach to, and oversight of, planned processes and procedures to achieve evidenced learning from experience in a continual, consistent manner. The purpose of Lessons Management is to successfully close the loop between identifying lessons and achieving positive, lasting improvements in practice.

Leadership, Responsibility and Accountability

Progressive good practice in organisational learning is a recognised hallmark of highly resilient organisations.[footnote 15] Top-down accountability and oversight, with visible senior leadership engagement and a communicated commitment to the management of lessons, are vital if measurable improvements in practice are to be realised and retained.[footnote 16]

16. Effective Lessons Management requires both strategic and operational commitments and engagement to achieve evidenced improvements in practice. Leadership, accountability, responsibility, and ownership for lessons, along with supporting organisational governance arrangements, are vital enablers of active learning. Insufficient authority for change, or lack of engagement from strategic decision-makers, can limit practical progress and wider organisational learning.

17. Lessons can be identified from a range of sources and stakeholders. Those involved in lessons management processes can include both statutory responder organisations, such as Local Authorities and emergency services, and non-statutory responders, such as Voluntary and Community Sector organisations. Specific involvement may vary depending on the scale or source of the learning event (e.g. national exercise or local incident), and the nature of its impacts.

18. With increasing lessons management maturity, organisations/departments can proactively capture learning from everyday operations, near-misses, and wider learning reviews. An example of a Lessons Management Maturity Matrix can be found in Annex 6.

The Lessons Management Framework

19. The Lessons Management Framework brings together and builds upon existing best practices from the UK and beyond. It provides a simple, visual overview of the four key processes involved in the effective learning of lessons. All processes sit within wider organisational learning and continual improvement activity.

20. The Framework has been designed to guide evidence-based lesson learning from the point of experience, through to longer-term embedding of change (see Figure 2). It can be applied flexibly to support Lessons Management arrangements in different contexts. It is informed by the well-known ‘Observation, Insight, Lesson Identified, Lesson Learned’ (OILL) methodology for learning lessons.[footnote 17] [footnote 18]

Figure 2: Lessons Management Framework

  • Embedding
  • Identification
  • Prioritisation
  • Implementation
  • Continual Improvement

Process Pillars

21. The four key processes involved in effective lessons management are: Lesson Identification; Lesson Prioritisation; Lesson Implementation; and Embedding Change. Their inclusion is informed by an understanding of where challenges in lessons work tend to occur.

Practical Steps

22. Each of the four processes has been broken down into practical steps, to add applied value, and help guide learning activity. These are set out in their respective sections below. Print-friendly ‘Aide Memoires’ that combine key processes and the practical steps detailed below can be found in Annex 7.

Lesson Identification

23. A ‘Lesson’ articulates an update in knowledge or understanding that has been gained through experience.[footnote 19] In the resilience context a ‘Lesson Identified’ goes further, being an evidenced conclusion, based on analysis of observations and insights from experience. It describes and documents a problem or issue, details a root cause, and succinctly sets out a course of corrective action to achieve positive improvements in practice.[footnote 20] [footnote 21]

24. It is important to consider how a range of relevant partners will be included and enabled to contribute views during the lessons capture process. Diverse perspectives from local and/or lived experiences of an event can lead to a more developed and objective understanding of key issues. Inclusive identification processes can also help to foster relationships with those who may have interests or equity in development and delivery of any onward actions.[footnote 22]

25. It is vital in every aspect of Lessons Management to acknowledge the sensitivity, candour, and care with which lessons identified from incidents must be handled. This is especially pertinent in cases where tragedy and trauma, including loss(es) to lives, livelihoods and/or the environment have been experienced.[footnote 23]

Spotlight: Culture

Fostering a strong culture of resilience within an organisation, department or partnership can lead to improved learning outcomes in practice. This involves an active commitment to learning from successes and failures. It also requires the generation of an open learning environment that respects a range of views and extends the opportunity to speak up when things do not go right, without fear of blame.[footnote 24] Creating a psychologically safe, non-judgemental space for people to share their experiences can be an important facilitator of honest, meaningful feedback. It also supports an increasingly objective, accurate lesson identification process.[footnote 25]

26. The end goal of the identification process is to capture high-quality[footnote 26] evidence-based lessons that can inform the generation of SMART (Specific, Measurable, Achievable, Relevant, and Time-bound)[footnote 27] recommendations for onward action. Identified challenges may also present opportunities for innovation, and to enhance resilience capabilities while working towards wider organisational aims and objectives.

27. The identification process can be visualised as a distinct, six-step process:

  1. Capture
  2. Analyse
  3. Identify
  4. Validate
  5. Report
  6. Share

Step 1: Capture

28. Organisations can prepare a contextualised Lesson Collection Plan to guide the process of capturing and collating observations from learning events. The purpose of the plan is to: outline collection procedures; communicate learning priorities; and set expectations about what, who, when, where, why, and how observations will be captured. Generic but flexible Lesson Collection Plans can be agreed and documented ahead of time. In the case of scheduled exercises, proportionate plans can be agreed during the exercise planning phase.[footnote 28]

29. Collection plans may also reference: specific capture methods and/or procedures; provisions for capturing positive practices; procedures for collaborative, multi-stakeholder capture; the use of shared templates across departments or agencies; collection timelines; and any relevant roles and responsibilities. They may also detail appropriate and proportionate thresholds for increasing or diversifying the scope and scale of lesson capture, according to the size, complexity or severity of impacts associated with an event.

30. Common collection methods include, but are not limited to, exercise observation forms, individual feedback forms, group and/or multi-agency debriefs, interviews, focus groups, After Action Reviews (AARs) and audits.

31. Actioning the Lesson Collection Plan soon after the event can support increased engagement and quality of input and returns. Once observations have been captured, they can be collated and stored, with due regard to data protection obligations and Freedom of Information requests.

Top Tip

A well-planned and reliably managed lesson identification process can withstand scrutiny, providing essential checks and balances to ensure accurate, auditable learning. The selection of two or more collection methods (e.g. individual electronic feedback forms and in-person group debriefs) can improve the credibility and quality of a lesson identified.[footnote 29] [footnote 30] It can also encourage diverse views, and provide a richer learning picture when bringing information into a shared space for analysis.

Step 2: Analyse

32. Practical analysis begins with an open, curious and questioning mindset that seeks to explore, examine and understand information gathered. The goal of good practice in this area is to bring a level of rigour and objectivity into the lesson identification process.

33. Analysis helps to ensure the right lessons are identified[footnote 31], and that any onward actions to resolve issues are not misinformed or misguided. It encourages a transition from subjective, assumed and/or unvalidated learning, towards increasingly clear, comprehensive, credible lessons identified.[footnote 32] [footnote 33]

34. The key is to agree on an appropriate and proportionate approach that homes in on the ‘what’ and the ‘why’ of any issues identified, rather than focussing exclusively on the ‘who’. Examples of some common analytical techniques, along with further reading and links to practical resources, are provided with the list of linked documents in Annex 1.

Spotlight: The Importance of Analysis

Analysis does not have to be onerous or exclusively academic. Across information collected, it can be helpful to look for three key things:[footnote 34]

  • Trends: patterns in the data. E.g. During the exercise, multiple observations highlighted issues with obtaining senior clearance for the release of information to the public, resulting in delayed communications.
  • Themes: topical learning areas, supported by trends. E.g. ‘strategic communication - warning and informing’.
  • Insights: deductions from multiple observations that shed light on root issues or contributing/causal factors.[footnote 35] E.g. Delayed communications were impacted by inconsistently documented and poorly understood processes and procedures for clearance of public communications by senior leaders. Communication plans did not make provision for the absence of key personnel. This allowed more time for mis- and dis-information to circulate on social media, compromising safety and complicating the response.

Step 3: Identify

35. Once reviewed and analysed, best practice advocates for identified lessons to be documented in a consistent, high-quality, auditable format. High-quality lessons are those that articulate evidenced justifications for amending existing ways of doing things. They clearly articulate the learning context, the problem encountered and a proposed action (or ‘prescription’) to correct the issue. International best practice defines a high-quality lesson as one that:[footnote 36] [footnote 37]

  • is based on evidence
  • is derived from analysis of observations and insights from the experience
  • concisely captures the context from which it is derived
  • considers and details any root cause(s)
  • clearly defines the issue or opportunity for improvement
  • has been validated by key stakeholders to ensure accuracy
  • proposes a viable pathway for onward action

36. A documented example of a high-quality lesson is provided in Annex 4.

Step 4: Validate

37. The validation step provides the opportunity for ‘check and challenge’, to ensure the accuracy and integrity of a lesson identified.[footnote 38] Approaching validation with an openness to challenge can help to mitigate any false consensus (i.e. group-think) concerning the root causes of identified issues. Validation can also build increased legitimacy, and therefore better buy-in for action, amongst key stakeholders.

38. To validate learning, it can be helpful to remember the following: ‘connect, check, challenge’. This signposts the need to connect with others involved in the original event who can support the validation process, to invite those validating to sense-check key learning points, and to maintain openness to challenge throughout. Once validated, those who identified lessons will be able to generate meaningful recommendations for action.

Making Recommendations

Recommendations are a proposed, viable course of action to reinforce a positive finding, or drive change and improvements. They are most useful when they:[footnote 39]

(a) propose an easily understood, practical course of action

(b) articulate any initial (precautionary) steps to resolve gaps

(c) are formulated in a SMART (Specific, Measurable, Achievable, Relevant, and Time-bound) manner.[footnote 40]

Example of a SMART recommendation

Relevant policy and procedures should be reviewed and updated within six weeks, based on a developed understanding of core issues with communication clearances at strategic and operational levels of the organisation. Necessary updates should be reviewed and agreed, before being communicated to staff and practised through small-scale table-top exercising. Successful updates in policy and procedure should then be passed to the Training and Exercise Team, to be included for testing in wider annual exercising activities. A short report, detailing findings and measurable actions taken to resolve clearance issues experienced in the exercise should be submitted to the Head of Communications by the end of the six-week period.

Step 5: Report

39. Proportionate reporting after a learning event, such as an emergency exercise or incident, provides a documented, auditable record of key learning, identified lessons, and corresponding recommendations.

40. Post-event reports generally include and detail:

  • an overview of the event (learning source and context)
  • steps taken to capture, analyse and validate lessons (methodology)
  • a list of high-quality, evidence-based lessons identified (problems/practices)
  • corresponding recommendations for onward review (proposed actions)

41. Clearly articulating the lessons, as well as recommendations, in the report offers an opportunity to retain learning in support of developing wider corporate/institutional memory during and after the implementation phase.

Step 6: Share

42. Having an agreed mechanism for sharing reports and/or key points of learning with relevant stakeholders can promote the transfer of knowledge to wider networks. Important benefits of shared learning include:

  • the fostering of a shared and developed understanding of risk
  • the development of communities of practice in Lessons Management
  • opening opportunities for meaningful solutions to be explored with stakeholders who may have interests or equity in the design and delivery of onward recommendations

Lesson Prioritisation

43. Lesson prioritisation involves the strategic review of identified lessons, to inform decisions about the order in which recommendations being taken forward for action will be addressed. Prioritisation can create a bridge between lesson identification and lesson implementation, providing an opportunity to direct sustainable pathways for change within and across stakeholders.

44. Adopting a risk-informed prioritisation process can strengthen the rationale for taking the most pertinent actions forward, while still allowing for near-term, resolvable issues (i.e., ‘quick wins’) to be actioned through business as usual activity.

45. The Lesson Prioritisation process can be broken down into six practical steps:

  1. Organise
  2. Appraise
  3. Assess
  4. Prioritise
  5. Assign
  6. Review

Step 1: Organise

46. Organising identified lessons in a consistent manner supports a systematic lesson prioritisation process. An organisation-wide Lessons Management Register (see Annex 5) or similar, can be used to create a common space and consistent, auditable format for recording and storing identified lessons and recommendations.

47. Within the register, each lesson can be given a unique identifier or code that can be used to track its status and associated actions going forward. Assigning a contextually relevant learning ‘theme’ (e.g., policy, human resources, training) and logging the identified risk (hazard/threat) it pertains to (e.g., malicious threats, flooding) are equally beneficial. These allow the register to be used for cross-referencing lessons, spotting common or recurring issues, and informing periodic Risk Register reviews.
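
To illustrate how register entries can be kept in a consistent, auditable format, the sketch below models a single register entry as a simple data structure. It is a minimal illustration only: the field names (lesson_id, theme, risk, status and so on) and the identifier format are assumptions for the example, not a prescribed schema, and Annex 5 remains the reference for the Lessons Management Register itself.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LessonRegisterEntry:
    """One row in an illustrative Lessons Management Register (field names are assumptions)."""
    lesson_id: str             # unique identifier or code used to track status and actions
    theme: str                 # contextually relevant learning theme, e.g. policy, training
    risk: str                  # hazard/threat the lesson pertains to, e.g. flooding
    summary: str               # concise statement of the lesson identified
    recommendation: str        # proposed course of action for onward review
    status: str = "Identified" # e.g. Identified, Prioritised, In progress, No Action
    lesson_owner: str = ""     # organisation/department that identified the issue
    date_logged: date = field(default_factory=date.today)

# Illustrative use: logging a lesson so it can be cross-referenced and tracked
entry = LessonRegisterEntry(
    lesson_id="2024-EX01-007",  # hypothetical identifier format
    theme="strategic communication - warning and informing",
    risk="flooding",
    summary="Clearance processes for public communications were inconsistently documented.",
    recommendation="Review and update communication clearance procedures within six weeks.",
)
print(entry.lesson_id, entry.status)
```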

Step 2: Appraise

48. Once organised, lessons can be reviewed and prioritised. The appraisal step allows for a final check and balance on the lessons going forward for action. During appraisal teams may consider whether the problem and cause are adequately evidenced and understood; if further information is needed; and whether any relevant or related work is ongoing. If for any reason lessons are not progressed at this stage, they can be marked as ‘No Action’, or for further review. Rationale for the decision can be documented on the Lessons Management Register.

Step 3: Assess

49. Assessing any risks associated with leaving an identified lesson unaddressed can be a helpful way to prioritise lessons and recommendations being taken forward for action. One way to do this is to consider the ‘likelihood and the impact of the issue occurring again’. This can foster the use of hindsight to consider: (a) the likelihood or probability that an issue may recur; and (b) the range of potential impacts it may have if it did.[footnote 41]

50. Examples of how likelihood and impact of recurrence can be assessed are provided in the three tables below. These have been informed by procedures set out in JESIP’s Joint Organisational Learning Guidance[footnote 42], national risk assessment methodologies[footnote 43], and the characteristics of community networks in the Community Resilience Development Framework.[footnote 44] Additional guidance on the concept and practices involved in risk management can be found in HM Government’s Orange Book.[footnote 45]

Likelihood of Recurrence

51. The likelihood of an identified issue recurring can be estimated using the categories and descriptors set out in Table 1 below. A score between 1 and 5 can be assigned accordingly. This will be combined with subsequent impact scores to determine priorities at the end of the process.

Table 1: Likelihood of an identified issue recurring

Score | Category | Description
5 | Probable | Likely to recur consistently unless action is taken
4 | Possible | Likely to recur (e.g., in certain conditions)
3 | Unlikely | Could recur
2 | Rare | May recur infrequently
1 | Manageable | Unlikely to recur; mitigating factors in place

52. Where the risk of an issue recurring is being assessed within a wider, integrated analysis of hazards/threats at national or local levels, practitioners can also defer to existing methodologies, such as the PHIA Probability Yardstick[footnote 46] designations detailed in the National Risk Register[footnote 47], for consistency in likelihood assessments.

Spotlight: The PHIA Probability Yardstick[footnote 48]

Percentage likelihood of recurrence

  • 0%-5%: Remote chance
  • 10%-20%: Highly unlikely
  • 25%-35%: Unlikely
  • 40%-50%: Realistic possibility
  • 55%-75%: Likely or probable
  • 80%-90%: Highly likely
  • 95%-100%: Almost certain

Impact of Recurrence

53. Considering the impact that an issue could have if it recurred in a similar or other response scenario is also an important factor in the lesson prioritisation process. Common thematic areas to consider when assessing impact are provided in Table 2. Descriptors and scoring suggestions for each impact area are provided in Table 3.

Table 2: Thematic areas for impact assessment

Impact Area | Brief Descriptor
A. Health and Safety | Impact of an issue on the ability to protect public and staff welfare.
B. Community Impact | Impact of an issue in terms of human aspects, including direct consequences for communities and vulnerable groups. Key community characteristics that can help reach an informed understanding of the likely risk and resilience of communities include:[footnote 49] social and demographic context; business context; economic context; physical assets, infrastructure and natural capital; and community social capital.
C. Organisational Capability | Impact on the organisation’s ability to respond to an incident.
D. Organisational Reputation | Impact on the reputation of the team, department and/or organisation.
E. Financial/Legal | Level of financial or legal impact and implications of the issue arising.
F. Systems and Networks | Impact of potential knock-on effects for other organisations, departments, or partners.
G. Organisational Priorities | Impact of an issue on strategic goals and objectives.

Table 3: Thematic Impact Assessment Scoring Criteria and Descriptors

Score | Impact Level | Example Descriptors[footnote 50]
5 | Critical | Critical failure in capability, probable fatality, subject to litigation, substantive costs.
4 | Major | Significant failure in capability, probable major injury, legislative breach, significant costs.
3 | Moderate | Capability must use alternative arrangements/methods to achieve objectives, non-life-threatening injury, legal implications, additional costs.
2 | Minor | Capability objectives achieved through usual arrangements or minimal intervention, possible minor injury, internally managed claim, minimal cost implications.
1 | Manageable | Capability objectives achieved in full through usual arrangements, mitigating factors already in place make injury unlikely, claims unlikely and additional costs unlikely.

54. Once the likelihood and potential impacts of an identified issue recurring and/or remaining unaddressed have been assessed, scores can be combined to inform priority actions.

Step 4: Prioritise

55. A Prioritisation Matrix can be used to determine priority action areas. An example matrix is provided below (see Figure 3). Priority Level descriptors can be defined in context, and aligned according to relevant terminology and/or business areas. Lessons and recommendations can then be arranged and reviewed in priority order.

Figure 3: Example Lesson Prioritisation Matrix

Impact + Likelihood

Priority 1
  • Critical (5) + Unlikely (3)
  • Critical (5) + Possible (4)
  • Critical (5) + Probable (5)
  • Major (4) + Unlikely (3)
  • Major (4) + Possible (4)
  • Major (4) + Probable (5)
Priority 2
  • Critical (5) + Rare (2)
  • Major (4) + Rare (2)
  • Moderate (3) + Unlikely (3)
  • Moderate (3) + Possible (4)
  • Moderate (3) + Probable (5)
Priority 3
  • Critical (5) + Manageable (1)
  • Major (4) + Manageable (1)
  • Moderate (3) + Manageable (1)
  • Moderate (3) + Rare (2)
  • Minor (2) + Unlikely (3)
  • Minor (2) + Possible (4)
  • Minor (2) + Probable (5)
Priority 4
  • Minor (2) + Manageable (1)
  • Minor (2) + Rare (2)
  • Manageable (1) + Rare (2)
  • Manageable (1) + Unlikely (3)
  • Manageable (1) + Possible (4)
  • Manageable (1) + Probable (5)
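
Where lessons are tracked digitally, the mapping in Figure 3 can be applied consistently by expressing it as a simple scoring function. The sketch below is a minimal, illustrative encoding of the example matrix, using the likelihood scores from Table 1 and the impact scores from Table 3. The Manageable (1) impact with Manageable (1) likelihood combination does not appear in the figure, so the sketch assumes Priority 4 for that case; that default is an assumption, not part of the guidance.

```python
def priority(impact: int, likelihood: int) -> int:
    """Return the priority level (1 = highest) for a lesson, following the
    example matrix in Figure 3.

    impact: 1 (Manageable) to 5 (Critical), as scored in Table 3.
    likelihood: 1 (Manageable) to 5 (Probable), as scored in Table 1.
    """
    if impact >= 4:                          # Critical or Major impact
        if likelihood >= 3:
            return 1
        return 2 if likelihood == 2 else 3
    if impact == 3:                          # Moderate impact
        return 2 if likelihood >= 3 else 3
    if impact == 2:                          # Minor impact
        return 3 if likelihood >= 3 else 4
    # Manageable impact: Figure 3 lists likelihood 2-5 as Priority 4; the
    # 1 + 1 combination is not shown, so Priority 4 is assumed here.
    return 4

# Example: a Major (4) impact issue that is Possible (4) to recur is Priority 1
assert priority(4, 4) == 1
assert priority(3, 2) == 3   # Moderate impact, Rare likelihood -> Priority 3
```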

Step 5: Assign

56. Ownership for onward implementation of priority actions can be agreed and assigned. It is important to be clear that original ‘Lesson Owners’ (i.e., the organisation/department that identified the issue) may not be the same as ‘Implementation Action Owners’, who take forward responsibility and accountability for implementing actions in response. This can be documented on the Lessons Management Register for clarity.

Step 6: Review

57. Levels of prioritisation can be periodically and responsively reviewed, given changes or updates triggered by: new or updated knowledge (e.g., similar lessons arising in other business areas, publication of inquiry reports and/or reviews); internal changes that may influence traction or progress (e.g., internal restructures, personnel changes, internal policy updates); and external changes that impact on the physical or operating environment (e.g., political, economic, social, technological, environmental and/or legal factors).

Lesson Implementation

58. Lesson Implementation is the process of positively and proactively using identified lessons to improve the emergency response, build resilience and seize identified opportunities. The process involves leading, planning and acting on issues, to prompt strategic review, direct activities, or change structures and behaviours. The goal of a Lesson Implementation project is to achieve measurable improvements in practice that can be evidenced through monitoring, evaluation and reporting systems. The ability to do this effectively is a hallmark of highly resilient organisations.[footnote 51]

59. An identified lesson is only described as implemented (or ‘learned’) when a planned implementation process to address the problem has been completed and evaluated as successful.

60. The Lesson Implementation process can be broken down into six practical steps:

  1. Lead
  2. Plan
  3. Act
  4. Monitor
  5. Evaluate
  6. Report

Step 1: Lead

61. Implementation is significantly influenced by leadership engagement. Before and during implementation, top-down leadership of, and a commitment to, learning lessons and implementing change is vital to ensure actions:

  • progress in a timely manner
  • are seen through to completion
  • deliver measurable improvements in practice

Without senior support, it can be difficult to achieve the necessary authority, resources, and traction for prioritised improvements. Strategic buy-in, good governance structures, and a positive learning culture are key facilitating factors.

Step 2: Plan

62. Responsibility, accountability, and ownership for lessons and respective actions are important enabling factors. Implementation Action Plans can be used to document an overview of senior reporting lines, set out the strategic implementation objectives, and articulate the design of the implementation in terms of: key actions/deliverables (outputs), timelines (milestones), and agreed measures for success (outcomes). The use of Key Performance Indicators (KPIs), or similar, can support this process.
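
As an illustration of how such a plan can be documented in a consistent, trackable form, the sketch below records the elements described above (objectives, deliverables, milestones, outcomes and KPIs) as a simple structure. The field names and example values are illustrative assumptions rather than a prescribed template; the example values are drawn from the SMART recommendation shown earlier.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImplementationActionPlan:
    """Illustrative structure for an Implementation Action Plan (field names are assumptions)."""
    lesson_id: str                   # links the plan back to the Lessons Management Register
    objective: str                   # strategic implementation objective
    senior_owner: str                # accountable senior reporting line
    actions: List[str] = field(default_factory=list)           # key actions/deliverables (outputs)
    milestones: List[str] = field(default_factory=list)        # timelines for delivery
    success_measures: List[str] = field(default_factory=list)  # agreed outcomes / KPIs

# Illustrative plan, based on the SMART recommendation example above
plan = ImplementationActionPlan(
    lesson_id="2024-EX01-007",  # hypothetical identifier
    objective="Clear, rehearsed clearance procedures for public communications",
    senior_owner="Head of Communications",
    actions=["Review and update clearance policy", "Run a small-scale table-top exercise"],
    milestones=["Policy updated by week 4", "Exercise complete by week 6"],
    success_measures=["Clearance decisions issued within agreed timescales during exercising"],
)
```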

63. Considering ‘why, what, who, when, where, and how’ (5W1H) can help to guide the structure of Implementation Action Plans. Welcoming diverse views when planning can help to ‘check and challenge’ the feasibility and suitability of a proposed implementation. It can also help to ensure planned changes will be fit for purpose, and beneficial for the foreseeable future.[footnote 52]

64. Use of the ‘Chilcot Checklist (PDF, 1,170KB)’ can also help to mitigate the risk of ‘group-think’ (false consensus) around practical, but potentially suboptimal, implementation options. As with identification, an inclusive implementation process can help to foster relationships with those who may have interests or equity in the development and delivery of acceptable, well-fitting solutions. Further resources to support planning can be found in Annex 1.

65. Complex and/or cross-cutting recommendations may require more than one owner to achieve change. In this case, deliverable actions on a single implementation plan can be broken down into clearly owned ‘chunks’ of work that contribute to a shared goal. When well-managed and coordinated, key action parts can positively influence outcomes for the whole.[footnote 53]

66. Once an Implementation Action Plan is agreed, an Implementation Action Tracker (Annex 5) can be set up to oversee progress and support periodic reporting updates.

Step 3: Act

67. Key actions can be delivered in line with the Lesson Implementation Plan. Leadership and line managers with oversight roles can support updated ways of working by establishing a strong direction and vision for the change, while managing those involved in the delivery with honesty and integrity.[footnote 54]

68. An understanding of the impact of change on staff (and/or volunteers) can be helpful. Change Management has been described as a process that involves an ‘unfreezing, moving, and refreezing of values, practices, and procedures within an organisation’.[footnote 55] When working to motivate change at the strategic level, consider how the following can support the process:

  • timely communication and information can help people understand the reasons for change (unfreeze)[footnote 56]
  • leading by example in attitude and actions can demonstrate a commitment to change and support delivery by inspiring wider buy-in (move)
  • providing opportunities to practise and rehearse new or updated ways of doing things (e.g., using bespoke, small-scale testing and exercising of key changes) can set people up for success before changes are fed into wider or larger-scale exercising, maximising learning outcomes (refreeze)

69. At the operational level, it is important to remember that individuals will need sufficient: capability (e.g., knowledge and skills); opportunity (e.g., a supportive environment); and motivation (e.g. understanding the goal/benefit) if behaviour changes are to be realised.[footnote 57] [footnote 58]

Step 4: Monitor

70. Monitoring key indicators and milestones can be carried out in line with the action plan. Monitoring provides an important opportunity to validate the effectiveness of methods being used to implement change. It can be used to answer two key questions while action is still in progress: ‘Is delivery on track?’ and ‘Is the implementation delivering the desired changes/impacts in practice?’. In some cases, monitoring can indicate a need to adapt or adjust a plan during implementation.

Step 5: Evaluate

71. Once all implemented changes have been completed, the objectives, outputs, outcomes and desired impacts set out in the Implementation Action Plan can be evaluated. The evaluation can be used to draw conclusions about the success of the implementation, and inform a final report. Helpful resources for designing monitoring and evaluation plans can be found in Annex 1.

Step 6: Report

72. A proportionate, documented report of the implementation detailing conclusions from the evaluation can provide an auditable record of actions taken to resolve lessons identified. This can be fed back into key accountable reporting lines, so that the effectiveness and impacts of learning and change can be reviewed, retained and recorded.

Embedding Learning and Change

73. The embedding process focuses on the retention of learning, and making positive changes ‘stick’ after they have been successfully implemented. This is practically achieved through continued efforts to press updated ways of working into common organisational practice, after initial implementation efforts have finished.

74. The purpose of an active embedding process is to prevent the recurrence of resolved issues in the future. Without active embedding, it is possible for people to revert to old ways of doing things, for learning to erode, and for the quality and consistency of updates in practice to be compromised.[footnote 59]

75. While it can be difficult to define the point at which a change is truly embedded (e.g. due to the dynamics of staff changes, organisational objectives and updates in the operating environment), progress in the retention of learning and change can be made, monitored and matured.

Embedded Learning

Learning can be described as embedded when:

  • a change implemented in response to a lesson identified goes on to be systematically consolidated and integrated within an organisation
  • updated behaviours and procedures are normalised and routinised in practice
  • retention of key learning can be evidenced over time
  • changes to practice continue to be consistently validated and assured in both simulated and real-world response scenarios, demonstrating a level of permanence

76. The embedding process can be guided by six steps. In the case of embedding, each step may need to be revisited and reviewed, depending on monitored progress as updated ways of working are being knitted into the fabric of a department/organisation.

  1. Plan
  2. Integrate
  3. Monitor
  4. Assure
  5. Review
  6. Mature

Step 1: Plan

77. The consolidation of learning and embedding of change are unlikely to occur organically. The deliberate design of a proportionate, strategic and systematic embedding process that maps out where focussed efforts are required to consolidate/retain positive improvements within the organisation (e.g. leadership, governance, competencies, training) can provide a roadmap for success.

Step 2: Integrate

78. Aligning embedding efforts with existing organisational systems, structures, schedules and reporting can minimise duplication and support a more integrated, sustainable approach to the consolidation of learning and change.

79. Consider how retention of the updated change can be practically encouraged, facilitated and endorsed at strategic and operational levels.[footnote 60] For example, integrating embedding efforts into existing areas, such as ‘Education’ (knowledge/training), practical ‘Equipping’ (skills/behaviours) and ‘Enabling’ (culture, operational environment) can help to maximise outcomes.

Step 3: Monitor

80. It is helpful to consider how embedding progress will be monitored and measured in context. Organisations/departments may wish to consider developing agreed metrics and indicators to help track the retention of implemented changes in practice. Example indicators are provided at the end of this section.

Step 4: Assure

81. The effectiveness of embedding efforts can be assessed and repeatedly consolidated through specific and/or routine activity to test and exercise the key knowledge, skills, and behaviours required in a response. This can help to provide assurance that change has taken root (and is bearing fruit) in individual and organisational performance. Wider activity, such as internal, external and/or peer reviews can also help to assure the retention of learning and change.

Step 5: Review

82. Opportunities to review consolidation of learning and embedding of changes can be aligned to follow production of post-exercise reports, scheduled audit activity, self-assessments, and/or strategic organisational reviews.

Step 6: Mature

83. Once efforts are underway, increasingly effective embedding processes can be developed as part of wider commitments to organisational learning and resilience.

84. Understanding how knowledge is retained in the organisational/departmental context, and how existing, embedded changes can be refined in response to new or updated learning, can encourage increasing maturity in lessons management. An example Lessons Maturity Matrix is available in Annex 6, to demonstrate how continual improvements can be fostered across the four key lessons process areas.

Example indicators of embedding progress

  • A plan to support consolidation and embedding is in place.
  • Change has been deliberately and meaningfully integrated into organisational governance, strategy, systems, and training.
  • The change is being communicated in a conscientious, consistent, and inclusive manner that maximises engagement opportunities.
  • The change in behaviour is occurring with consistency, accuracy, and fluency.
  • The change has been culturally accepted, assimilated, adopted, and normalised in practice. E.g., the change is now described as ‘the way we do things around here’.
  • The stability, durability and longevity of change continues to be validated through periodic assurance activity, indicating permanence.
  • Embedded change is socially supported and shared through Communities of Practice. E.g., through local learning/knowledge networks.
  1. HM Government, Community Resilience Development Framework, 2019 (PDF, 635KB) 

  2. Civil Contingencies Act 2004 

  3. National Resilience Standards for Local Resilience Forums (LRFs) 

  4. Emergency Preparedness 

  5. The UK Government Resilience Framework (PDF, 6MB) 

  6. JESIP Joint Organisational Learning Guidance (2017) (PDF, 646KB) 

  7. UK Resilience Framework 

  8. Exercise Best Practice Guidance 

  9. Treasury Minutes Government Response to the Committee of Public Accounts on the Forty-Third to the Forty-Eighth report from Session 2021-22 (PDF, 2.7MB) 

  10. UK Resilience Lessons Digest (PDF, 8.4MB) 

  11. ISO 22361: 2022 Security and Resilience – Crisis Management Guidelines 

  12. BS 65000: 2022 Organisational Resilience – Code of Practice 

  13. HM Government, Organisational Resilience Guidance for UK Government Departments, Agencies and ALBs (2024) 

  14. Adapted from: The NATO Lessons Learned Handbook Fourth Edition (2022) (PDF, 1,844KB) 

  15. BS 65000: 2022 Organizational Resilience – Code of Practice 

  16. HM Government, Organisational Resilience Guidance for UK Government Departments, Agencies and ALBs (2024) 

  17. The NATO Lessons Learned Handbook Fourth Edition (2022) (PDF, 1,844KB) 

  18. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019), p.11 (PDF, 1,212KB) 

  19. Adapted from: The NATO Lessons Learned Handbook Fourth Edition (2022) (PDF, 1,844KB) 

  20. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  21. The NATO Lessons Learned Handbook Fourth Edition (2022) (PDF, 1,844KB) 

  22. HM Government, Community Resilience Development Framework, 2019 (PDF, 635KB) 

  23. Hillsborough Charter 

  24. HM Government, 2024. Organisational Resilience Guidance for UK Government Departments, Agencies and Arm’s Length Bodies (ALBs) 

  25. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  26. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  27. Homeland Security Exercise and Evaluation Program (HSEEP) (2020) (PDF, 2.9MB) 6-1 

  28. Exercise Good Practice Guidance 2024 

  29. Spilsbury, M.J. et al., Lessons Learned from Evaluation: A Platform for Sharing Knowledge (UNEP) 

  30. Better Evaluation: Lessons Learnt; Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  31. The NATO Lessons Learned Handbook Fourth Edition (2022) (PDF, 1,844KB) 

  32. HM Government, Professional Development Framework for all-source intelligence assessment: The PHIA Common Analytical Standards, 2023 

  33. Patton, M.Q., Evaluation, Knowledge Management, Best Practices, and High-Quality Lessons Learned (PDF, 60.6KB) (2001), p.334 

  34. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  35. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  36. NATO, Joint Analysis and Lessons Learned Centre: The NATO Lessons Learned Handbook, Fourth Edition, 2022 (PDF, 1,844KB) 

  37. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  38. Exercise good practice guidance 

  39. Australian Institute for Disaster Resilience: Lessons Management Handbook (2019) (PDF, 1,212KB) 

  40. Homeland Security Exercise and Evaluation Program (HSEEP) (2020) (PDF, 2.9MB) 6-1 

  41. JESIP Joint Organisational Learning Guidance (2017) (PDF, 1,646KB) 

  42. JESIP Joint Organisational Learning Guidance (2017) (PDF, 1,646KB) 

  43. HM Government, National Risk Register, 2023 (PDF, 2.5MB) 

  44. HM Government, Community Resilience Development Framework, 2019 (PDF, 635KB) 

  45. HM Treasury, Orange Book: Management of risk - Principles and Concepts (PDF, 465KB), updated 2023. 

  46. Professional Head of Intelligence Assessment (PHIA), Professional Development Handbook (PDF, 6.6MB) 

  47. HM Government, National Risk Register, 2023 (PDF, 2.5MB) 

  48. The Professional Head of Intelligence Assessment (PHIA) Yardstick 

  49. HM Government, Community Resilience Development Framework, 2019 (PDF, 635KB) 

  50. JESIP Joint Organisational Learning Guidance (2017) (PDF, 1,646KB) p.25-26 

  51. BS 65000: 2022 Organisational Resilience – Code of Practice 

  52. Ministry of Defence, The Good Operation, 2017. p.9 (PDF, 1,170KB) 

  53. Homeland Security Exercise and Evaluation Program (HSEEP) (2020) (PDF, 2.9MB) 

  54. HM Government: Civil Service Capabilities Plan - Leading and Managing Change 

  55. Lewin, K., Field theory in social science. In: Poole, M.S. and Van de Ven, A.H., Handbook of Organizational Change, 73–107. Thousand Oaks, CA: Sage (2004) 

  56. Health and Safety Executive: Management Standards - Change 

  57. Public Health England: Achieving behaviour change A guide for national government (PDF, 2,039KB) 

  58. Achieving behaviour change: a guide for local government and partners, 2020 (PDF, 1,202KB) 

  59. Manchester Arena Inquiry: Volume 2-I Emergency Response (PDF, 6.9MB) 

  60. HM Government, National Protective Security Authority Embedding Security Behaviours: using the 5Es (2023)