Policy paper

FCDO evaluation strategy

Updated 4 September 2023

This was published under the 2019 to 2022 Johnson Conservative government

Foreword

This is the Foreign, Commonwealth and Development Office’s first Evaluation Strategy.

We are in a time of unprecedented global challenges. Our impact as a department relies on the generation and use of good quality evidence, including evaluation. Be it tackling the most complex international policy issues or working in fragile and testing environments around the world, evaluation identifies good practice, improves our performance and shows what works.

As a leader on the global stage, good evaluation enables us to respond to new priorities and changing contexts, contributing evidence that helps us to deliver the Foreign Secretary’s vision of “a new era for peace, security and prosperity.” This strategy builds on the good practice of our legacy organisations. It will guide decisions about evaluations across our diplomatic and development activities as we implement the UK’s international priorities set out in the 2021 Integrated Review and the International Development Strategy.

The strategy will help us to deliver our commitment to evidence, accountability and learning, and to establish the mechanisms we need to develop a robust evaluation system. This is a critical part of building an FCDO culture that is driven by evidence and makes the most of the wealth of expertise across the Department.

Permanent Under-Secretary, Sir Philip Barton, KCMG OBE

Change log for the FCDO Evaluation Strategy

All changes below were made at the May 2023 review:

  • simplified outcome wording for outcomes 2 and 3
  • revised milestones in years 2 and 3 (1.2, 1.5, 2.3, 3.4, 4.2, 4.3)
  • milestones or targets added (2.1, 2.2, 2.4, 4.1, 4.4, 4.5)
  • amended wording for workstream 4.5

Executive summary

Introduction

We know how well things work because of evaluation. Evaluation is the tool that tells us what works, how and why, so as to maximise the value of taxpayers’ money and better achieve our objectives. It identifies good practice, improves our performance and helps to target resources on activities that will have the most impact on the ground.

As a new organisation, the Foreign, Commonwealth and Development Office (FCDO) must develop an evaluation system which informs decision-making for policy, programming and strategy across a diverse, global portfolio. The Evaluation Strategy outlines our ambition to support decentralised evaluation, ensure its quality and maximise learning from it, as well as to improve mechanisms for identifying, prioritising and implementing central evaluation in critical areas.

There is a strong evidence ecosystem in FCDO, of which evaluation is an important part. This includes evidence generated through situation analysis, programme management, results monitoring, financial and risk management, data analysis, evidence synthesis and research and development. Within this spectrum of evidence tools, evaluation complements other evidence sources as part of a culture of continual improvement and offers a unique contribution in generating robust evidence of what works and why.

Evaluation may be undertaken for learning or accountability, and often these 2 purposes overlap. Evaluation, and the learning generated from it, can be applied to programmes, strategies, policies, portfolios or cross-cutting themes. FCDO draws on a wide range of evaluation tools to meet different information needs and purposes, and this strategy supports appropriate and proportionate application of the full spectrum of evaluation approaches in FCDO.

The UK government has a strong commitment to evaluation, as demonstrated by the establishment of the Evaluation Task Force, the recent National Audit Office review on evaluation and updates to the Government Social Research protocols. In addition, FCDO has scrutiny and accountability obligations as an Official Development Assistance (ODA) spending department, including regular review by the Independent Commission for Aid Impact (ICAI) and adherence to the 2015 International Development Act.

The ambition

The FCDO Evaluation Strategy sets the ambition for evaluation over the next 3 years (2022 to 2025). Its overarching goal is to advance and strengthen the practice, quality and use of evaluation so that FCDO’s strategy, policy and programming are more coherent, relevant, efficient, effective, and have greater impact.

In order to achieve this ambition, we will work towards 4 key outcomes. We have outlined the outcomes and their corresponding actions and workstreams in the table below. Annual milestones have been identified against each workstream to ensure we monitor our progress.

Outcomes

Outcome 1: Strategic evaluation evidence is produced and used in strategy, policy and programming

Relevant, timely, high-quality evaluation evidence is produced and used in areas of strategic importance for FCDO, HM Government and international partners.

Key actions and workstreams:

  • enhance oversight of evaluation investments
  • identify potential central evaluations through Evaluation Portfolio Assessments
  • co-design portfolios of evaluated interventions to address evidence gaps on priority themes
  • deliver demand-responsive thematic evaluation
  • support rigorous, experimental impact evaluation for high priority interventions

Outcome 2: High quality evaluation evidence is produced

Users have confidence in the findings generated from evaluations of FCDO interventions, policies and strategies.

Key actions and workstreams:

  • implement the FCDO Evaluation Policy
  • quality assure evaluation design and outputs
  • provide efficient access to good quality suppliers
  • develop and share guidance, templates and best practice

Outcome 3: Evaluation evidence is well communicated to support learning

Evaluation findings are accessible and actively communicated in a timely and useful way to inform future strategy, programme and policy design.

Key actions and workstreams:

  • improve management information of evaluations
  • generate evaluation insights: learning documents synthesising evaluative evidence
  • improve communication and visibility about evaluation support, methods and findings
  • share learning across HM Government and with international partners

Outcome 4: FCDO has an evaluative culture, the right evaluation expertise and capability

FCDO is sufficiently resourced with skilled advisers possessing up-to-date knowledge of evaluation, and minimum standards of evaluation literacy are mainstreamed across the organisation.

Key actions and workstreams:

  • develop organisational capability through individual skills and training
  • lead the Evaluation and Government Social Research Cadre
  • advise on deployment of evaluation staff
  • develop and test innovative methods for evaluation and evidence synthesis
  • advance approaches to support monitoring and evaluation activities on foreign policy and diplomatic priorities

The actions and milestones outlined in the strategy will be reviewed annually to ensure they reflect ongoing learning and adapt to organisational changes. This document was revised in June 2023.

Roles and responsibilities

Operationalising this strategy is the responsibility of the Head of the Evaluation Unit and the Head of the Evaluation and Government Social Research Profession. They are accountable for its delivery, supported by Evaluation Advisers and Programme Managers within the Evaluation Unit and by advisers working on research, monitoring, evaluation and learning across FCDO. In addition, the strategy depends on engagement and co-working with the following groups:

  • staff across the business, especially senior officials
  • Research and Evidence Directorate and the Technology and Analysis Directorate
  • the Investment and Delivery Committee
  • other government departments, especially the Joint Funds Unit and those spending ODA

Introduction

We know how well things work because of evaluation: evaluation is the tool that tells us what works and why, so as to maximise the value of taxpayers’ money and better achieve our objectives. It identifies good practice, improves our performance and helps to target resources on activities that will have the most impact on the ground.

As a new organisation, the Foreign, Commonwealth and Development Office (FCDO) must develop an evaluation system which informs decision-making across a diverse, global portfolio. This strategy outlines the ambition for evaluation in FCDO, and the steps required over the next 3 years to support the journey towards that ambition. It outlines specific actions that will enhance the practice, quality and use of evaluation across the business. It seeks to expand the proportionate use of evaluation across our development and diplomacy work, enabling assessment of both programme and non-programme areas.

This strategy builds on learning from evaluation approaches in the 2 legacy departments (Department for International Development and Foreign & Commonwealth Office) and a review of evaluation policies and strategies in peer government and donor organisations.

The strategy seeks to support, ensure quality and maximise learning from decentralised evaluation – recognising the need for teams spread across the globe to respond to their own context and evidence needs – while improving mechanisms for identifying, prioritising and implementing central evaluations in critical areas. The strategy recognises the different challenges, expertise and experience in evaluating diplomacy and development, and seeks to build on these to apply learning and evaluative approaches across disciplines.

This document is complemented by the FCDO Evaluation Policy, which ensures a common understanding of core evaluation principles and standards. The Evaluation Strategy includes details of how we will promote, monitor and enforce the minimum standards on ethics and quality in evaluation as outlined in the Evaluation Policy.

This is an FCDO-wide strategy. The Head of the Evaluation Unit and the Head of Profession for Evaluation and Government Social Research are accountable for its implementation and will report annually on its progress. Further details are outlined in Section 3.

This strategy covers the period from April 2022 to March 2025, in line with the current Spending Review period. Progress against the indicators outlined in this document will be reviewed annually, and targets revised as appropriate.

In this introduction, we outline the context and evidence system for evaluation in FCDO. In section 2, we specify our goal and 4 outcomes, and detail the activities and milestones against each of those outcomes. The final section of the document outlines the roles and responsibilities for implementing the strategy.

The need for evaluation in FCDO

The FCDO is committed to generating and using evidence. There is a strong evidence ecosystem in FCDO, of which evaluation is an important part. This includes tools such as rigorous evidence reviews, research and development programmes, research analyst reports, economic appraisals, country diagnostics and situation/context analysis to inform how we design strategies, policies and programmes.

Annual reviews and regular monitoring of results and risk help us track how we are delivering. Programme Completion Reports, After Action Reviews and evaluation help us assess how well we delivered and whether it made a difference. Within this spectrum of evidence tools, evaluation [footnote 1] complements other evidence sources as part of a culture of continual improvement and offers a unique contribution in generating robust evidence of what works and why.

Cross-government initiatives, such as the establishment of the Evaluation Task Force, [footnote 2] are promoting standards and expectations on evaluation across government departments, working closely with HM Treasury. These initiatives sit alongside FCDO’s scrutiny and accountability obligations as an Official Development Assistance (ODA) spending department, including regular review by the Independent Commission for Aid Impact (ICAI) and adherence to the 2015 International Development Act,[footnote 3] which mandates that “the Secretary of State must make arrangements for the independent evaluation of the extent to which ODA provided by the United Kingdom represents value for money”.

Using evaluation evidence

Evaluation may be undertaken for learning and/or accountability purposes, and often these 2 purposes overlap. Evaluation for learning can help to: manage risk and uncertainty; improve current strategies, policies or programmes by providing evidence to make better decisions; gain an understanding of what works, for whom and when, as evidence to inform future decisions.

Evaluation for accountability can help to generate evidence of how an intervention or policy has worked, for example its efficiency, effectiveness or impact, and supports a commitment to openness, scrutiny and challenge.

Ultimately, evaluation seeks to understand and improve our work and our impact on the ground, and therefore increase the value of our work and our spend. Evaluation can be used to evidence and strengthen our business and country plans, measure key areas under our outcome delivery plans, and contribute to the global public good.

Evaluation, and the learning generated from it, can be applied to programmes, strategies, policies, portfolios or cross-cutting themes. FCDO draws on a wide range of evaluation tools to meet different information needs and purposes. These include:

  1. programme level monitoring and evaluation, with FCDO requiring all programmes to have their progress monitored and for teams to consider whether they are suitable for evaluation

  2. portfolio monitoring and evaluation, drawing together data and learning on strategy, policy and programming across a directorate or country post

  3. strategic and thematic evaluations, managed centrally, which can assess policy, programming or strategy and draw out broader learning

  4. fully external and independent reviews, with a focus on accountability and conducted by the Independent Commission for Aid Impact

Evaluation should not be applied to every policy or programme, but rather where there is an explicit learning need. Consideration of whether evaluation is required should take into account the profile of the policy or programme, levels of uncertainty or risk, cost, and learning potential[footnote 4]. There is a wide range of evaluation approaches available, from the most rigorous experimental impact evaluations to light-touch internal learning. Deciding the right evaluation approach must consider:

  • the information needs
  • the feasibility of different approaches
  • the available resources and proportionality

Box 1: Lexicon

For this document and in line with HM Treasury’s Central Government guidance on evaluation (Magenta Book) and OECD Development Assistance Committee (DAC) definitions, we define evaluation as the systematic and objective assessment of an on-going or completed FCDO project, programme, strategy or policy, its design, implementation, and results. In FCDO, we use a range of different tools to measure our progress and results, including:

  • monitoring: systematic collection of data to measure progress against intended results
  • process evaluation: independent assessment of how delivery is implemented
  • impact evaluation: independent assessment of what difference the intervention has made, with impacts attributed to the intervention. This includes experimental/quasi-experimental and theory-based approaches
  • portfolio evaluation: evaluation of a group or portfolio of projects with similar aims or collectively contributing towards Directorate General/directorate/post outcome(s)
  • thematic evaluation: evaluation focusing on one or more themes that are relevant beyond a particular project or outcome and cut across countries, regions, and sectors

We promote experimental evaluation approaches where they are feasible and where the findings will promote useful learning. However, we recognise that in some instances they are not feasible or appropriate, and other methods may be more effective in producing the required results. This strategy supports appropriate and proportionate application of the full spectrum of evaluation approaches in FCDO.
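As a concrete illustration of the most rigorous end of this spectrum, the sketch below shows how an experimental impact evaluation estimates an average treatment effect from a simple two-arm randomised trial. It is a minimal, illustrative example using synthetic data, not FCDO code or a prescribed method.

```python
# Minimal sketch (illustrative only, not FCDO code): estimating the average
# treatment effect (ATE) in a simple two-arm randomised controlled trial.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical outcome data, eg an index of household wellbeing measured
# after the intervention in treatment and control communities.
control = rng.normal(loc=0.50, scale=0.10, size=200)
treated = rng.normal(loc=0.54, scale=0.10, size=200)

# Under randomisation, the difference in mean outcomes is an unbiased
# estimate of the average treatment effect.
ate = treated.mean() - control.mean()

# Welch's two-sample t-test: is the estimated effect distinguishable from zero?
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)

print(f"Estimated ATE: {ate:+.3f} (t = {t_stat:.2f}, p = {p_value:.3f})")
```

In practice, designs of this kind are scoped and quality assured by evaluation specialists, and theory-based approaches are used where randomisation is not feasible or appropriate.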

The ambition

The overarching goal of the Evaluation Strategy is to advance and strengthen the practice, quality and use of evaluation so that FCDO’s strategy, policy and programming are more coherent, relevant, efficient, effective, and have greater impact.

In order to achieve this ambition, we will work towards 4 key outcomes:

Outcome 1: Strategic evaluation evidence is produced and used in strategy, policy and programming

Relevant, timely, high-quality evaluation evidence is produced and used in areas of strategic importance for FCDO, HM Government (HMG) and international partners.

Outcome 2: High quality evaluation evidence is produced

Users have confidence in the findings generated from evaluation of FCDO interventions, policies and strategies.

Outcome 3: Evaluation evidence is well communicated to support learning

Evaluation findings are accessible and actively communicated in a timely and useful way to inform future strategy, programme and policy design.

Outcome 4: FCDO has an evaluative culture, the right evaluation expertise and capability

FCDO is sufficiently resourced with skilled advisers possessing up-to-date knowledge of evaluation, and minimum standards of evaluation literacy are mainstreamed across the organisation.

This requires an effective, coordinated evaluation system which combines approaches from both legacy departments to enable evaluative thinking and learning across the range of diplomatic and development requirements in FCDO. We will adopt a blended centralised and decentralised approach whereby the majority of evaluation activity and spend is decentralised across FCDO, complemented by centrally managed evaluations.

Outcome 1: Strategic evaluation evidence is produced and used in strategy, policy and programming

This outcome focuses on producing relevant, timely, high-quality evaluation evidence that is used in areas of strategic importance for FCDO, HM Government and international partners. We seek to ensure evaluation investments are proportionate and focused on generating new evidence that supports impactful and value for money programming and effective policy. This includes strengthening our ability to draw lessons and demonstrate the impact of our strategies, policies and programming within or across divisions, departments, geographies or themes. We will undertake an exercise to identify the strength of evidence supporting the 20 highest cost policy areas and programmes, which will inform our priority topics for evaluation.[footnote 5]

Box 2: Central, coordinated impact evaluations on community policing

The Metaketa Initiative tested community policing interventions in 6 developing countries, reaching an estimated 9 million people, and measuring impacts on: community attitudes toward the police; community cooperation with the police; police attitudes and behaviours; and crime. Randomised controlled trials were conducted in and coordinated across 6 sites (Brazil, Colombia, Liberia, Pakistan, the Philippines and Uganda), measuring interventions that were implemented by local police agencies and with common elements across all sites. The evaluation found that crime was not reduced, citizens’ attitudes did not improve, and interventions were often implemented unevenly and incompletely. The study concluded that national governments should proceed with caution if adopting the community policing model, unless structural constraints or operational practices within the police are also addressed.

The coordinated evaluations produced stronger, more robust evidence than an evidence synthesis from a similar number of uncoordinated evaluations might have. The evaluation helped FCDO better understand where and why such interventions work, and can be used to decide where and how to invest in community policing in the future. The results will be used to inform international evidence-driven policy guidelines.

The evaluation was one of 2 from FCDO included in the Evaluation Task Force’s top 10 featured evaluations on GOV.UK.

To achieve this outcome, we will focus on:

(i) enhancing oversight of evaluations commissioned by FCDO

(ii) strengthening the use of rigorous impact evaluation methods for FCDO activities

(iii) prioritising evaluation investments to meet needs across the range of FCDO development and diplomatic activities.

Identifying and prioritising centrally managed evaluations will be done through both top-down and bottom-up mechanisms, with the Evaluation Portfolio Assessments playing a critical role in identifying under-evaluated areas and topics. We will work in close collaboration with the Research and Evidence Directorate to ensure that our activities complement the portfolio of research and development activity that generates research and evaluation evidence on global evidence gaps. We will ensure gender and inclusion are considered in the selection and design of these evaluations.

Key actions and workstreams

Progress towards this outcome is dependent on a collaborative effort by FCDO’s central Evaluation Unit and advisers working on monitoring, evaluation, research and learning across the organisation. It will include 5 workstreams:

1.1. Enhance oversight of evaluation investments

The Evaluation Unit, working with advisers across the FCDO, will develop improved mechanisms for scrutiny of monitoring, evaluation and learning (MEL) plans in high resource business cases, as part of the existing Quality Assurance Unit processes. Such a mechanism will aim to:

(i) improve the quality and consistency of approach to MEL in programme proposals

(ii) make evaluation plans more strategic, with potential to generate and apply new learning, as well as being technically feasible

(iii) trigger additional support for impact evaluations that are deemed feasible, to maximise quality, value for money and learning. This will enable us to hold back or improve evaluations which are not feasible or not in FCDO’s strategic interest, as well as ensure that evaluations have sufficient technical support throughout their lifecycle

1.2. Identify potential central evaluations through Evaluation Portfolio Assessments

The Evaluation Unit will conduct an analysis of Directorate General (DG) portfolios and generate recommendations on areas where evaluations should and could be conducted. These Evaluation Portfolio Assessments will identify where evaluation could generate insights for high-level decision-making, particularly in areas that might otherwise have been missed. For example, the Evaluation Portfolio Assessment might include recommendations for evaluations on thematic or geographic policy outcomes, cross-cutting objectives, ways of delivering, or specific, innovative programmes. They will enable the identification of opportunities to evaluate non-programmatic activity (eg the effectiveness of diplomatic influencing or the implementation of foreign policy), or identify opportunities for rigorous impact evaluation of programmes, including within the 20 highest cost policy areas and programmes where evidence may be limited, or in areas identified as having low evidence in FCDO’s assessment of the ‘Best Buys’ within specific sectors.

Embedded advisers working on monitoring, evaluation and learning will provide expert support to this process, including shaping the final recommendations and supporting their uptake and implementation. The process will include consultation with relevant teams, including from FCDO’s Research and Evidence Directorate, to ensure coherence and value add of proposed evaluations. Recommendations will be put to relevant Directors to action as appropriate and with Evaluation Unit support. The full list of recommendations and agreed actions will be submitted to the Investment and Delivery Committee as part of the annual update on the Evaluation Strategy.

1.3. Co-design portfolios of evaluated interventions to address evidence gaps on priority themes

Over recent years, a number of programmes have addressed evidence gaps in areas of Ministerial and international priority by matching (1) a component supporting a portfolio of innovative interventions, designed and managed by a policy team, with (2) a component supporting a portfolio of corresponding evaluations (impact and process evaluations) and research (eg on prevalence and drivers of the issue in question), designed and managed by a team in Research and Evidence Directorate. Such collaborations will cover a range of thematic priorities including what works to prevent violence against women and girls, what works to prevent violence at scale and the Disability Inclusive Development programme. Programmes of this nature will continue to generate robust research and evaluation evidence to inform decision making within the FCDO and beyond.

1.4. Deliver demand-responsive thematic evaluation

The Evaluation Unit will manage a flexible resource, via the Evidence Fund, to support strategic and thematic evaluations. These evaluations will be identified via the Evaluation Portfolio Assessments but will also be accessible via bidding windows where staff across FCDO can bid for funds and technical support to enhance portfolio and thematic evaluation evidence generation. The Evidence Fund is a dynamic platform, led within the Research and Evidence Directorate, providing demand-responsive evidence and support for dissemination and uptake. We will work closely with ICAI to ensure FCDO-led evaluation adds value and avoids duplication with ICAI’s work.

1.5. Support rigorous, experimental impact evaluation for high priority interventions

Subject to necessary approvals, the Evaluation Unit will develop and implement a new programme to support experimental evaluations of strategic, innovative interventions, assessing what works to fill key evidence gaps, thereby increasing the impact of FCDO investments. Given the technical complexity and levels of investment required to do good experimental and quasi-experimental evaluations, the programme will maximise value for money by focussing on areas with the most potential for learning, identified through the Best Buys and other assessments of evidence gaps, and in consultation with relevant experts. This will include innovative approaches to robust impact evaluation, in areas such as diplomacy and influencing or insecure and fragile contexts. The programme will provide 3 key services:

(i) design, conduct and disseminate impact evaluations

(ii) undertake follow-up studies of existing impact evaluations to assess longer-term impact

(iii) provide technical support to ensure quality and develop FCDO capability on impact evaluations

We will monitor our progress against each of these activities on an annual basis. Table 1 outlines the key milestones.

Table 1: Key milestones for workstreams under Outcome 1

1.1 Enhance oversight of evaluation investments

  • By April 2023 (+1 year): criteria for assessing new evaluation investments developed and new mechanism fully tested.
  • By April 2024 (+2 years): approach for review rolled out across the evaluation advisory network.
  • By April 2025 (+3 years): system working, with MEL reviews complete on all QAU eligible business cases; formal feedback process in place.

1.2 Identify potential central evaluations through Evaluation Portfolio Assessments

  • By April 2023 (+1 year): Evaluation Portfolio Assessment (EPA) pilot complete and selected evaluations progressed.
  • By April 2024 (+2 years): Evaluation Unit resource restructured to scale the approach to other parts of FCDO; guidance and support documents produced for embedded Evaluation Advisers, applying to country portfolios as appropriate; EPAs completed in 1 to 2 Director General areas with thematic/geographic focus, with guidance updated as required.
  • By April 2025 (+3 years): EPAs completed in 1 to 2 Director General areas with thematic/geographic focus (3 to 5 cumulative assessments).

1.3 Co-design portfolios of evaluated interventions to address evidence gaps on priority themes

  • By April 2023 (+1 year): follow-up VAWG programme (What works to prevent violence at scale) contracted; first call for proposals for Humanitarian Protection impact evaluations (BEPAC).
  • By April 2024 (+2 years): first round of What works…at scale and BEPAC impact evaluations identified and under design.
  • By April 2025 (+3 years): impact evaluations being implemented, additional studies under design.

1.4 Deliver demand-responsive thematic evaluation

  • By April 2023 (+1 year): completion of 3 thematic evaluations, with 2 to 3 additional evaluations under design.
  • By April 2024 (+2 years): bidding window(s) run to fund and support thematic and portfolio evaluations.
  • By April 2025 (+3 years): target figures to be confirmed in the Evidence Fund logframe.

1.5 Support rigorous, experimental impact evaluation for high priority interventions

  • By April 2023 (+1 year): concept note and business case approved; supplier recruited.
  • By April 2024 (+2 years): supplier recruited and inception complete.
  • By April 2025 (+3 years): impact evaluations being implemented, additional studies under design as appropriate.

Outcome 2: High quality evaluation evidence is produced

We aim to ensure confidence in the findings generated from evaluation of FCDO interventions, policies and strategies. This is essential for our evaluations to be credible, trusted and used, and to maximise the impact of our findings in design and delivery of policies and programmes. Under this outcome, we aim for:

(i) an organisation-wide agreement of minimum principles and standards for evaluation and when they should be used

(ii) an increase in evaluations that meet expected quality standards in design and outputs

(iii) the use of pre-qualified suppliers in external evaluations to maximise quality and credibility

The Evaluation Policy defines evaluation as systematic (it uses recognised robust and replicable methodology which is appropriate to the evaluation question) and objective (it includes a level of independence from the strategy, policy or programme in question).[footnote 6] All evaluative activities should aim to meet core principles of being useful, credible, robust, proportionate and safe and ethical, including consideration of gender and inclusion in their design, implementation and analysis.

For programme spend, the Programme Operating Framework (PrOF) provides a clear set of rules to ensure we use evidence throughout design and implementation. The rules are supplemented by the Evaluation PrOF guide which includes criteria for staff to consider when deciding whether to evaluate. This includes consideration of strategic value and the broader evidence base to ensure that evaluation investments are targeted on evidence gaps within priority issues. However, deciding whether an evaluation is required and appropriate requires technical input from evaluation experts. Without such expertise there are risks that potential evaluations in key areas are missed, or that evaluations are conducted on areas or using approaches which do not generate useful learning.

Box 3: Credibility is vital for evaluations to influence change

The OECD specifies that “aid agencies should have an evaluation policy with clearly established guidelines and methods and with a clear definition of its role and responsibilities and its place in institutional structure”. Research conducted by DFID found that quality and rigour are essential: the credibility of an evaluation (including its quality) is a key component of the evaluation’s use and uptake, and the intrinsic quality of an evaluation is a key modifier of its potential value. Evaluation quality factors include: clarity about the evaluation purpose and objectives; the selection of a team with the right mix of skills and expertise; the appropriateness and ‘right rigour’ of the methodology; the robustness of the data and the analytical frameworks; ensuring findings are verifiable; the extent to which plural views are represented in the evaluation findings; and the use of the evaluation findings to draw useful conclusions and recommendations.


In refining how we ensure the quality of evaluations, FCDO takes into account recent updates to cross-government approaches to evaluation, including the updated Magenta Book (2020), the Government Social Research (GSR) publication protocol (2021) and the findings of the National Audit Office review on evaluation in government (2021).[footnote 7] Evaluations are aligned to FCDO data commitments, including the Inclusive Data Charter and policies on open access.

Key actions and workstreams

The main workstreams by which we will promote, support and ensure quality and credibility are:

2.1 Implement the FCDO Evaluation Policy

The FCDO Evaluation Policy seeks to ensure a common understanding of evaluation principles and standards. It provides a mechanism to promote, monitor and enforce minimum standards on evaluation as well as demonstrating our ongoing commitment to high quality, objective and transparent evaluation. It outlines standards required in evaluations, including those which assess activities outside of programme spend. The Evaluation Unit will promote the use of the policy for all evaluative activities, track adherence to the standards and follow-up with teams requiring support.

2.2 Quality assure evaluation design and outputs

Independent quality assurance helps to protect objectivity and credibility. It demonstrates to internal and external audiences that we have processes to ensure objectivity, thereby increasing trust in our evaluation outputs. Without quality assurance, there are risks of weak, misleading or even inaccurate evidence informing decision-making and impacting value for money. The Evaluation Quality Assurance and Learning Service (EQUALS 2) is an essential mechanism for ensuring evaluations meet minimum standards of quality throughout their lifecycle. Through use of an expert panel, it conducts quality assurance of evaluation terms of reference, inception, baseline and final reports.[footnote 8] The standards are set by FCDO and continually updated in line with the latest cross government and international guidance.

2.3 Provide efficient access to good quality suppliers

The new Global Evaluation Monitoring Framework Agreement (GEMFA) ensures the provision of efficient and effective expert services for the delivery of monitoring, evaluation and learning of FCDO diplomacy and development (as well as supporting other ODA-spending government departments). It gives access to pre-approved suppliers that provide high quality and internationally recognised evaluation and monitoring knowledge and skills, and can deliver MEL across the organisation’s strategies, policy areas and programmes across the globe. This also enables more control over the timing of evaluations – which is critical to uptake and use – by reducing the procurement timelines without compromising on quality.

2.4 Develop and share guidance, templates and best practice

The Evaluation Unit will be responsible for sourcing, promoting and developing guidance and templates on evaluation. This will include updates to the Evaluation PrOF guide, as well as guidance notes on specific methodological or evaluation management issues. It will include surfacing best practice within FCDO on new or challenging approaches, such as developing and managing country level portfolio MEL, in line with priority activities set out in Outcome 1 and learning products developed under Outcome 3.

We will monitor our progress against each of these activities on an annual basis. Table 2 outlines the key milestones.

Table 2: Key milestones for workstreams under Outcome 2

2.1 Implement the FCDO Evaluation Policy

  • By April 2023 (+1 year): policy agreed and published, accessible and actively promoted across FCDO.
  • By April 2024 (+2 years): policy reviewed to ensure relevance in line with any changes in the organisation.
  • By April 2025 (+3 years): policy reviewed and updated to ensure relevance in line with any changes in the organisation; redrafted for the next SR period as required.

2.2 Quality assure evaluation design and outputs

  • By April 2023 (+1 year): EQUALS 2 fully operational and used across the organisation.
  • By April 2024 (+2 years): demand for and user satisfaction with EQUALS 2 remains high [90% rate the quality of service as excellent or good].
  • By April 2025 (+3 years): demand for and user satisfaction with EQUALS 2 remains high [90% rate the quality of service as excellent or good]; model reviewed to ensure the approach is appropriate and sustainable.

2.3 Provide efficient access to good quality suppliers

  • By April 2023 (+1 year): GEMFA launched, monitored and promoted amongst Evaluation Advisers and staff working on programmes or evaluations.
  • By April 2024 (+2 years): framework actively managed over the SR period (including corrective action taken if suppliers are not performing).
  • By April 2025 (+3 years): framework actively managed over the SR period.

2.4 Develop and share guidance, templates [footnote 9] and best practice

  • By April 2023 (+1 year): PrOF guide updated; key guidance documents developed and shared; mechanism for responding to learning from EQUALS/GEMFA; best practice examples on portfolio MEL developed.
  • By April 2024 (+2 years): Evaluation and ToC PrOF guides updated and published; key guidance documents developed and shared, in response to learning from EQUALS and wider demand; foreign policy and influencing monitoring and evaluation toolkit developed and shared; M&E tools for programme staff developed by the Solutions Hub, including on MEL for adaptive programming; Outcome Confidence Rating rolled out to all ODA spending programmes.
  • By April 2025 (+3 years): PrOF guide updated; key guidance documents developed and shared, in response to learning from EQUALS and wider demand; good examples of monitoring foreign policy and influencing shared; Outcome Confidence Rating being used across the department.

Outcome 3: Evaluation evidence is well communicated to support learning

As a learning organisation, drawing out and sharing lessons from our evaluations is essential. Using evaluation findings and other evidence enhances our impact on the ground, efficiency in how we work, and the value for money of our investments. To support this, we must improve access to and knowledge about evaluations across the organisation. We will seek to:

(i) improve supply of and access to evaluations for all FCDO staff

(ii) promote and track the use of evaluation findings

(iii) encourage a learning environment, where evaluations can support reflection on both success and failure

Improving our repositories of evaluation findings, signposting teams to evaluations with similar themes or methods, and generating overviews and syntheses of existing evaluative evidence can help us understand what has or has not worked in multiple instances and contexts and will mitigate the risk that learning from an evaluation starts and ends with the team that commissioned it.

Box 4: Light touch review highlights key lessons on girls’ education outcomes

The Evaluation Unit commissioned a review to synthesise findings from recent evaluations of 10 FCDO-funded education programmes in Africa and Asia, including Phase 1 of the Girls’ Education Challenge. It found that a combination of activities targeting girls specifically, as well as activities targeting both boys and girls, is needed to improve education outcomes for girls. It highlighted emerging lessons on issues such as teacher practice, gender norms, targeting the most marginalised and measuring learning outcomes. Reviewing evaluation findings from across multiple interventions helps identify patterns and key lessons which apply across contexts, informing future decision making, resource allocation and programme design.

Key actions and workstreams

A combination of actions working coherently together will enable FCDO to promote learning from evaluation. This includes:

3.1. Improve management information of evaluations

We aim to have management information (MI) systems that provide a picture of all evaluations in FCDO and allow us to track each evaluation across its lifespan. This means enhancing interoperability and consistency across multiple data sources. Doing so will increase opportunities for knowledge sharing and collaboration across teams, enabling staff to easily access evaluations on particular topics or using particular methods. The data will inform central insights and visibility work, support recommendations for strategic evaluation (see Outcome 1) and provide a tool for tracking the implementation of, and compliance with, the FCDO Evaluation Policy (see Outcome 2). It will enable us to provide accurate updates to accountability and scrutiny bodies on existing and pipeline evaluations and help us to provide the right support to teams to ensure high-quality and transparent evaluations.
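Purely as an illustration of what tracking evaluations across their lifespan could look like in data terms, the sketch below defines a minimal record structure and a simple topic query. The field names and values are hypothetical, not FCDO’s actual MI schema.

```python
# Minimal sketch (hypothetical schema, not FCDO's actual MI system):
# one record per evaluation, trackable across its lifecycle and
# searchable by topic or method.
from dataclasses import dataclass
from typing import Optional

@dataclass
class EvaluationRecord:
    title: str
    directorate: str
    theme: str                        # eg "education", "security"
    method: str                       # eg "impact (experimental)", "thematic"
    stage: str                        # eg "design", "inception", "published"
    published_url: Optional[str] = None

register = [
    EvaluationRecord("Girls' education synthesis", "RED", "education",
                     "thematic", "published", "https://example.org/report"),
    EvaluationRecord("Community policing trials", "RED", "security",
                     "impact (experimental)", "final report"),
]

# Staff looking for prior evaluations on a topic can filter the register.
education = [r.title for r in register if r.theme == "education"]
print(education)
```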

3.2. Generate evaluation insights

Evaluation Insights are learning documents which synthesise evaluative evidence. They could cover areas such as: ethical processes within evaluations; evaluation supplier performance; thematic syntheses of existing evaluation studies; or improvement of evaluation methods. Insight reports will support the production of evaluation guidance and practical learning, including through related training and events. This work will be aligned with other sources of learning, such as ICAI reviews, and with internal evidence work such as FCDO’s ‘Best Buy’ papers, which summarise rigorous cost-effectiveness evidence to help increase the impact of FCDO policy and programming.

3.3. Improve communication and visibility

We will provide and promote accessible information on evaluations to the wider organisation, including those less familiar with evaluation evidence. An uptake and visibility plan will be developed and will include: keeping internal and external repositories of evaluations up to date; developing accessible summaries through an occasional newsletter/spotlight communication; convening and promoting evaluation seminars and events with specialist and wider FCDO audiences; and engaging with stakeholders across the FCDO to promote uptake of evaluation findings by linking to existing knowledge management and evidence communications processes. It will outline specific activities to engage senior leaders, and advocate for the proportionate application of evaluative tools in all directorates and posts. We will develop plans for generating and sharing case studies on how evaluations have had impact, as well as monitoring the uptake of evaluation findings in programme design and wider decision making. We will make use of internal communications channels (eg Teams, Yammer, the intranet) to highlight key findings of published evaluations to create and contribute to a culture of sharing and collaboration. This will be closely aligned to the work undertaken under Outcome 4.

3.4. Share learning across HM Government and with international partners

FCDO has much to share and much to learn from others working across government and across the development and diplomacy sectors. We will mobilise opportunities to disseminate results and learning from evaluations in these external forums, as well as promoting external learning events to FCDO staff. In particular, the Evaluation Unit and broader cadre will connect with those managing similar challenges, such as those working on international influencing activities or in other ODA-spending departments. Improved management information and knowledge management systems will also support sharing with external audiences. We will continue our engagement with key external networks such as the OECD Development Assistance Committee EvalNet, the COVID-19 Global Evaluation Coalition and the Multilateral Organisation Performance Assessment Network (MOPAN).

We will monitor our progress against each of these activities on an annual basis. Table 3 outlines the key milestones.

Table 3: Key milestones for workstreams under Outcome 3

3.1 Improve management information of evaluations

  • By April 2023 (+1 year): improved MI system developed and piloted; external evaluation repository up to date on GOV.UK and relevant internal libraries.
  • By April 2024 (+2 years): comprehensive, up-to-date evaluation database accessible to FCDO staff and being used by the Evaluation Unit and Evaluation Advisers; external evaluation repository regularly updated on GOV.UK and relevant internal libraries.
  • By April 2025 (+3 years): continuation of external evaluation repository, regularly updated on GOV.UK and relevant internal libraries, linked to the MI system to ensure it is searchable by geography and sector.

3.2 Generate Evaluation Insights

  • By April 2023 (+1 year): plan for insights work established and insights work underway.
  • By April 2024 (+2 years): forward look for insights developed; 2 to 3 Evaluation Insights products complete.
  • By April 2025 (+3 years): 4 to 6 Evaluation Insights products complete (cumulative).

3.3 Improve communication and visibility

  • By April 2023 (+1 year): uptake and visibility plan developed; communication products developed, including spotlight newsletters and evaluation impact case studies; 1 to 2 evaluation learning events; annual uptake survey designed and implemented.
  • By April 2024 (+2 years): uptake and visibility plan developed; communication products developed, including spotlight newsletters and evaluation impact case studies; 3 to 4 internal evaluation learning events (cumulative); data collected on uptake and use.
  • By April 2025 (+3 years): regular communication products, including spotlight newsletters and evaluation impact case studies; 5 to 6 learning events (cumulative); annual uptake survey implemented.

3.4 Share learning across HM Government and international partners

  • By April 2023 (+1 year): 1 to 2 evaluation learning events open to external participants and/or including external speakers; presentation of evaluation findings at 1 to 2 externally hosted events.
  • By April 2024 (+2 years): 1 to 2 evaluation learning events open to external participants and/or including external speakers (cumulative); presentation of evaluation findings at 1 to 2 externally hosted events (cumulative).
  • By April 2025 (+3 years): 3 to 4 evaluation learning events open to external participants and/or including external speakers (cumulative); presentation of evaluation findings at 3 to 4 externally hosted events (cumulative).

Outcome 4: FCDO has an evaluative culture, the right expertise and capability

To ensure that high-quality, timely, relevant evidence is generated and used, FCDO needs to ensure that:

(i) appropriate expertise is in place to produce and use evaluation evidence

(ii) FCDO decision making processes include consideration of evaluation and its evidence

(iii) FCDO’s culture and incentives encourage production, sourcing and use of evaluation evidence

Expertise is essential to assessing where evaluation is needed, identifying the appropriate evaluation approach and design, implementing evaluations to a high quality, and sharing the results and learning effectively.

We will work towards having sufficient evaluation skills across the diplomatic, policy, programme delivery and analytical professions to conduct and commission good quality monitoring and evaluation at FCDO. Across FCDO, distributed across the network of country posts and within central teams, there are accredited Evaluation Advisers and others working fully or partly on monitoring, evaluation and learning, with skills in the planning, generation and use of evaluation evidence. This pool of specialists must be sufficient in number to meet needs, with the flexibility to respond to emerging priorities and the scope to contribute to the wider evaluation system; advisers’ skills must also be maintained and updated. We must continue to generate innovative methodologies for hard to measure topics, complex contexts and adaptive approaches, generating evidence to fill critical gaps and improving our capability to use new methodologies going forward.

There is potential to build capability for all FCDO staff to find, appraise and use evaluation evidence, ensuring a minimum level of evaluation literacy across the organisation. This capability is particularly important to build in non-programming areas, where the production and use of evaluation is less institutionalised, and in areas which are more challenging to evaluate. We will provide a structured continuing professional development offer, both mainstreaming minimum evaluation standards and providing specialised training for experts.

To build demand for evaluation we will work to promote its value through demonstration: the Evaluation Portfolio Assessments and thematic evaluations under the Evidence Fund (Outcome 1) provide opportunities to profile how and why evaluation adds value. In addition, targeted deployment of adviser resource to emerging priorities can support the use of evaluation capability where it is needed most. Increasing the communications and visibility of evaluation evidence (Outcome 3) will contribute to putting a spotlight on evaluation’s usefulness and examples of the impact of evaluation evidence. These actions will promote the benefit of evaluation to FCDO staff, including senior leaders.

Box 5: New tools to evaluate where traditional measures are not possible

The use of geospatial data (including remotely sensed data from satellites and drones) has emerged as a powerful tool for evaluating programmes that cover large geographic or hard-to-reach areas. Using this information, however, can be expensive and often requires technical expertise and computing capacity for analysis. To help reduce this technical barrier to entry and to increase geospatial literacy among those newly commissioning, producing and using this type of research, the FCDO project CEDIL is developing an introductory, open-access, interactive learning module providing example datasets, code and guidance on using geospatial data to measure changes in climate-impacted areas such as agriculture and water availability. The module is available online and was presented to an FCDO audience through a virtual training session.
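As a flavour of this kind of analysis, the sketch below computes a simple vegetation change measure between two dates. It is a minimal, illustrative example using synthetic band values, not material from the CEDIL module; in practice the red and near-infrared bands would be read from real satellite imagery (for example with a raster library) and aligned to the evaluation’s area of interest.

```python
# Minimal sketch (illustrative only): measuring vegetation change from
# remotely sensed imagery, a common proxy for agricultural outcomes in
# geospatial impact evaluation.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index, bounded in [-1, 1]."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Hypothetical red/near-infrared reflectance grids at baseline and endline.
rng = np.random.default_rng(0)
red_t0, nir_t0 = rng.uniform(0.05, 0.3, (100, 100)), rng.uniform(0.2, 0.6, (100, 100))
red_t1, nir_t1 = rng.uniform(0.05, 0.3, (100, 100)), rng.uniform(0.2, 0.6, (100, 100))

# Mean NDVI change over the area of interest between the two dates.
change = ndvi(red_t1, nir_t1) - ndvi(red_t0, nir_t0)
print(f"Mean NDVI change: {change.mean():+.3f}")
```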

Key actions and workstreams

4.1. Develop organisational capability, individual skills and training

The Evaluation and Government Social Research (GSR) Head of Profession (HoP) will lead work to develop an organisational evaluation capability offer, which is aligned with and draws on relevant complementary frameworks such as Data Driven Diplomacy and Development (4D) and the cross-government Analysis Function capability offers. This organisational and individual capacity building will serve to mainstream minimum evaluation literacy across FCDO (a level 1 and level 2 offer), using primarily the International Academy and considering the right balance across different grades including senior leaders. It will also build, maintain and extend knowledge and experience for capability level 3 (Proficient) and level 4 (Expert). This will enable FCDO to develop a pipeline of relevant deployable expertise and ensure FCDO expert skills are up to date with the latest best practice and innovations. EQUALS 2 will provide short-term technical assistance as a backstop, ensuring technical coverage and high-quality evaluation in a broad range of specialisms across policies, geographies and methodologies.

4.2. Lead the Evaluation and Government Social Research (GSR) Cadre

The GSR Technical Competency Framework, developed by HM Treasury in conjunction with individual departments, sets the minimum standards needed to accredit to the Government Social Research profession. The aligned FCDO framework will form the basis for assessing our monitoring, evaluation and learning expertise through accreditation. Facilitated by the Evaluation and GSR HoP, FCDO advisers will be able to take advantage of the GSR Continuing Professional Development and cross-HMG networking opportunities on offer. The HoP will represent FCDO in cross-HMG GSR Board and Standing Meetings, ensuring FCDO is able to take full advantage of the resources on offer, including access to a larger pool of skilled personnel to fill staffing gaps.

4.3. Advise on deployment of evaluation staff

Decisions about the mix of skills and expertise in staffing are devolved to senior leadership within directorates and posts across the FCDO. A regular review of deployment of evaluation expertise across the business will be conducted, to identify gaps and advocate for appropriate coverage. FCDO will look at cross-government pipeline options such as the GSR mainstream campaign.

4.4. Develop and test innovative methods for evaluation and evidence synthesis through the Centre of Excellence for Development Impact and Learning (CEDIL)

FCDO has funded CEDIL since 2017 to develop and test innovative methods for evaluation and evidence synthesis in international development contexts, and to build the evidence base on research uptake and use in decision-making. It does this through evaluating complex interventions, enhancing evidence transferability, and producing learning on improving research uptake. CEDIL is pushing what is possible in evaluation, developing methods and sharing learning on them with FCDO, and building our capacity to measure more complex areas. As the programme ends, the FCDO will ensure that thematic and methodological learning continues to be publicly accessible and widely promoted.

We will continue to build networks and linkages with organisations active in the evaluation innovation space to formalise and promote cutting edge methods in the FCDO and to evolve our tools and approaches to measuring the hard to measure. This includes ongoing work to produce guidance on improved methods, developing learning products for the Evaluation Advisory cadres and directly supporting teams facing particularly complex evaluation challenges.

4.5. Advance approaches to support monitoring and evaluation activities on foreign policy and diplomatic priorities

The Evaluation Unit is leading a workstream to strengthen our approach to monitoring and evaluation of foreign policy and diplomatic priorities. The aims are to build on existing expertise and capabilities, generate tools and good practice on measuring impact, and share learning across FCDO including through internal and external cross HM Government working groups, the Evaluation Cadre and other specialist cadres.

We will monitor our progress against each of these activities on an annual basis. Table 4 outlines the key milestones.

Table 4: Key milestones for workstreams under Outcome 4

4.1 Develop organisational capability, individual skills and training

  • By April 2023 (+1 year): 4D and Analysis Function aligned FCDO evaluation capability offer developed (levels 1 to 4).
  • By April 2024 (+2 years): level 1 to 4 evaluation capability offer finalised, mirroring the Economics Community offer, and publicised widely across FCDO and other relevant government departments (eg those with ODA spend).
  • By April 2025 (+3 years): evaluation capability offer reviewed and revised.

4.2 Lead the Evaluation and Government Social Research Cadre

  • By April 2023 (+1 year): FCDO GSR accreditation round complete.
  • By April 2024 (+2 years): Professional Development Conference held for FY 2023 to 2024; successful expansion of GSR Mainstream recruitment within FCDO, seeing an increase in numbers placed; new CPD Speaker Series launched, with senior evaluation leaders from other relevant organisations sharing key learning as well as reflecting on career development; increased participation by FCDO GSR members in GSR development opportunities, including serving on cross-HM Government recruitment panels.
  • By April 2025 (+3 years): FCDO GSR accreditation round complete for FY 2024 to 2025; Professional Development Conference held for FY 2024 to 2025; further expansion of GSR Mainstream recruitment and exploration of other schemes (such as placement students), seeing a further increase in numbers placed within FCDO.

4.3 Advise on deployment of evaluation staff

  • By April 2023 (+1 year): mapping exercise of coverage across priority areas complete.
  • By April 2024 (+2 years): analysis of mapping presented to senior officials, continuing to feed into business planning and risk management processes.
  • By April 2025 (+3 years): target to be determined by March 2024, depending on organisational design and structures.

4.4 Develop and test innovative methods for evaluation and evidence synthesis

  • By April 2023 (+1 year): 16 user-oriented summary products produced and published; 9 FCDO or public facing learning events.
  • By April 2024 (+2 years): work plan developed for innovative methodological guidance and capacity building.
  • By April 2025 (+3 years): work plan delivered for innovative methodological guidance and capacity building.

4.5 Advance approaches to support M&E activities on foreign policy and diplomatic priorities

  • By April 2023 (+1 year): evidence base developed to inform design and roll out of diplomatic influencing training; technical advice to Foreign Secretary priority areas delivered.
  • By April 2024 (+2 years): refreshed workplan developed, including technical advice and MEL tools curated and shared with the evaluation and statistics cadres.
  • By April 2025 (+3 years): workplan delivered, including technical advice and MEL tools curated and shared with the Evaluation Advisory cadre.

Implementation of the strategy

Roles and responsibilities

Operationalising this strategy is the responsibility of the Head of the Evaluation Unit and the Head of the Evaluation and Government Social Research Profession. They are accountable for its delivery, supported directly by Evaluation Advisers and Programme Managers within the Evaluation Unit and by advisers working on monitoring, evaluation and learning across FCDO.

An efficient evaluation system is reliant upon 2-way communication between the centre and the decentralised teams situated across the FCDO global footprint. Embedded advisers are key to promoting and implementing evaluation activities which contribute towards the outcomes in this strategy. Advisers should be aware of and make use of the services and support offered by the Evaluation Unit and the Evaluation Head of Profession; the centre should provide strategic direction, guidance, backstopping and advocacy to support embedded advisers. As a minimum requirement, advisers will stay informed of central developments and engage in opportunities to advance their own capabilities. In line with other FCDO cadres, Evaluation Advisers are expected to use their professional expertise and contribute up to 10% of their time annually to meet key business objectives. They should also provide updates from their own area of work, including contributing to central MI, sharing case studies and evaluation impact, and contributing to learning and dissemination activities.

The strategy depends on effective engagement and co-working with the following groups:

Staff across the business, especially senior officials

We recognise the importance of buy-in from all parts of the office, especially senior leaders. Our workstreams will be designed to target senior leaders as necessary, advocating for appropriate use of evaluative tools to support decision-making and supporting development of evaluation capacity and capability within teams.

Research and Evidence Directorate and the Technology and Analysis Directorate

The Evaluation Unit will engage key research and data specialists on relevant workstreams to ensure learning and knowledge management activities are complementary. Ensuring a shared agenda with the Chief Scientific Adviser is essential to supporting evidence use; we will continue to collaborate with the Research and Evidence Directorate on a range of activities including implementation of the Evidence Fund, evidence mapping and synthesis activities, and generation of experimental and quasi-experimental evaluation evidence. The Heads of Profession for Statistics and Evaluation will continue a close working relationship to ensure support for advisers is coherent. The Evaluation Unit will also engage the FCDO Centre for Delivery and Centre for Data and Analysis to maximise join up and opportunities to collaborate on the continual review and refinement of the PrOF and other delivery initiatives.

Investment and Delivery Committee

We will seek steers and approval for our approach, ensuring it remains relevant and appropriate to organisational priorities. Endorsement from the Investment and Delivery Committee also communicates to others the importance FCDO places on evaluation and evidence.

Other government departments, especially the Joint Funds Unit and those spending ODA

We will engage with evaluation teams in other government departments, to share learning about approaches, methods and themes, and ensure coherence across shared interests. This is particularly relevant for departments with an international angle to their work, or where we have partnerships such as through the Joint Funds Unit. Given the additional accountability requirements on ODA, and the need for coherence across government, we will continue to ensure open communication channels, share good practice, and provide evaluation support services to departments with ODA-spend through EQUALS 2.

Tracking progress

A written update on progress against the strategy will be provided to the Investment and Delivery Committee on an annual basis.

This strategy is valid from 2022 to 2025. A new strategy will be developed and finalised in advance of the next Spending Review period, taking into account progress on this strategy and any changes to organisational design and priorities.

  1. For this document and in line with HM Treasury’s Central Government guidance on evaluation (Magenta Book) and OECD-DAC definitions, we define evaluation as the systematic and objective assessment of an on-going or completed FCDO project, programme, strategy or policy, its design, implementation, and results. 

  2. The Evaluation Task Force is a joint Cabinet Office-HM Treasury unit providing specialist support to ensure evidence and evaluation sit at the heart of spending decisions. https://www.gov.uk/government/organisations/evaluation-task-force 

  3. https://www.legislation.gov.uk/ukpga/2015/12/pdfs/ukpga_20150012_en.pdf (PDF, 764 KB) 

  4. The Evaluation Programme Operating Framework Guide, available internally for FCDO staff, offers further details on deciding whether to evaluate. 

  5. A condition of the 2021 Spending Review settlement requires the FCDO to provide HM Treasury and the Evaluation Task Force with an assessment of how much evidence supports the 20 highest cost policy areas/programmes, using the Nesta Standards of Evidence. 

  6. Based on the HMG Magenta book definition of evaluation and the OECD Development Assistance Committee (DAC) Network on Development Evaluation, we define evaluation as: a systematic and objective assessment of the design, implementation and outcomes of an intervention, programme or policy, or a portfolio of interventions, programmes or policies in a particular area or theme. It involves understanding the process behind implementation, and the effects or outcomes, for whom and why. It identifies what can be improved, and where appropriate, can estimate the overall impacts and cost-effectiveness. 

  7. Magenta Book 2020: https://www.gov.uk/government/publications/the-magenta-book; Government Social Research: Publication protocol updated 2021: https://www.gov.uk/government/publications/government-social-research-publication-protocols; National Audit Office review on evaluation in government 2021 https://www.nao.org.uk/report/evaluating-government-spending/ 

  8. This service is also available to other government departments for their ODA spend. EQUALS 2 is not always appropriate (eg where partners have their own QA systems, or for highly sensitive material); alternative mechanisms for independent QA are outlined in the policy. 

  9. Templates for encouraging best practice on evaluation products, which might include templates for terms of reference, management responses, evaluation reports etc.