Research and analysis

UKSPF place-level evaluation: methodology report

Published 4 April 2025

Executive summary

Introduction

Conducting place-level evaluation of UK Shared Prosperity Fund (UKSPF) activities enables us to take a more holistic approach to understanding how combinations of interventions work together, aligned with local strategies and delivered alongside other local growth funds. This technical document for the UKSPF place-level evaluation summarises the output of early evaluation design, including the selection of places for case studies, the process of developing evaluation plans and a summary of evaluation themes and research methods across all case studies. The report has been co-drafted by the Ministry of Housing, Communities and Local Government (MHCLG) and our contracted evaluation partners, Ipsos.

Case study selection

Over 250 Lead Local Authorities (LLAs) were identified as having received UKSPF funding. Areas were identified for the place-level evaluations through a two-stage process. Firstly, desk-level research was conducted to develop a long list of 80 LLAs. This list was refined through additional analysis of secondary data sources, a detailed review of Local Investment Plans, an evidence review of six-month delivery reports, LLA interviews and additional LLA workshops. Final places were selected using an element of purposive sampling based on the information above, to include a spread of places across the UK while considering levels of deprivation and other area characteristics (for example, urban/rural geography).

Evaluation design

Overarching research questions were designed to support impact, process and value for money evaluation in each case study. Each place was engaged in an evaluation design process to define additional local research questions, outline the unit of analysis to be explored in each case study, and identify suitable research methods and data availability for answering the research questions. This process included documentation reviews by the evaluation teams, who developed local theories of change and ran local evaluation design workshops.

Evaluation themes and methods

Specific themes and interventions to be covered by the evaluation were determined through a co-design process with case study places. The final selection includes a higher proportion of Communities and Place interventions, but also a spread across People and Skills and Supporting Local Business interventions. Contribution analysis will be the main method through which impact is evaluated, supported by a range of mixed-methods research including: quasi-experimental designs, beneficiary surveys and interviews, and delivery staff surveys and interviews. Cross-case study analysis will be conducted, using Qualitative Comparative Analysis, to identify what works well in different contexts across the case studies (at an aggregate level).

1. Introduction

1.1 Aims of this document

This is a methodological report for the UKSPF place-level evaluation, summarising the output of early evaluation design, including the selection of places for case studies and the process of developing evaluation plans. The report also summarises the evaluation themes and methodologies selected across all place evaluations. It contains only technical information and no research findings.

Find further information about the overarching UKSPF evaluation strategy.

1.2 Structure of UKSPF evaluation

Evaluation activity for UKSPF as a whole consists of 3 tiers:

  • programme level evaluation: this will explore the overall impact, primarily using contribution analysis, and a value for money evaluation of UKSPF

  • place-level evaluation: this will focus on producing up to 35 place-level case studies across the UK, to understand local delivery and impacts, and to generate robust evidence on how effective combinations of UKSPF-supported interventions within a locality work together to support local growth and communities

  • intervention-level evaluation: this will focus on evaluating nine types of interventions supported by UKSPF funding to generate robust evidence on what interventions work, or do not work, for whom and why across the three UKSPF investment priorities

In addition to these three tiers, some local authorities are also undertaking their own evaluations of UKSPF funded interventions.

1.3 Why place-level evaluation?

Conducting place-level evaluation of UKSPF activities enables us to take a more holistic approach to understanding how combinations of interventions work together, aligned with local strategies and delivered alongside other local growth funds. Evaluations can be designed with and for local places, to meet learning needs of both central and local government, including understanding the impact and importance of local contexts when implementing local growth policies. Not all areas receiving UKSPF funds could be involved in this place-level evaluation due to time and resource constraints. Therefore, a sample of places was selected, using the methodology described in section 2. This includes places within each constituent country of the UK.

This strand of the evaluation builds on and complements the other evaluation strands for UKSPF as described in the UKSPF strategy.

1.4 Approach to methodology

This report sets out the two strands of evaluation design: place selection and evaluation planning. The findings of the report are informed by the following data sources.

Case-study Selection Stage (June 2023 – March 2024)

  • Review of MHCLG management information and project documentation (e.g. Local Investment Plans (LIP))
  • Review and analysis of secondary data
  • Semi-structured qualitative interviews with five key programme stakeholders and 48 Lead Local Authorities (LLA)

Evaluation Planning Stage (April 2024 – November 2024)[footnote 1]

  • Review of local programme monitoring information and documentation
  • Review of secondary data including Census and other official statistics
  • Interviews with UKSPF leads at the LLA
  • A Theory of Change workshop per LLA, including (but not limited to) LLA representatives, delivery partners, local elected members and other local stakeholders

1.5 Limitations

Limitations in establishing suitable areas and methodologies for the place-level case studies include:

Case-study selection

  • Differences in data consistency across the 4 nations of the UK: The analysis to identify suitable places used information on places in each of the four nations of the UK. However, there are differences in the timing of data collection, the categories and the approaches used in each nation, particularly in the data analysed for the rural-urban categorisation and levels of deprivation for each area.

  • Challenges with qualitative research: Some LLAs did not respond to the request to participate in the interviews, and therefore it was not possible to collect information about their delivery and ability to take part in the place-level evaluation research.

  • Limitations of management information: The research team used information presented in the LIP and six-month progress reports. The progress reports related to delivery up until March 2023, and therefore held outdated information. Consultations with LLAs were used to obtain further information about the progress of delivery.

Evaluation Planning

  • Delays due to pre-election period: Contact with LLAs ceased during the pre-election period ahead of the General Election in July 2024, delaying the start of evaluation planning for the majority of places.

  • Changes to UKSPF delivery in 2025/26:

    • UKSPF funding for 2025-26 is at a reduced level compared to previous years. From April 2025, this will affect which interventions places choose to continue or stop. Additional keeping-in-touch activity and contribution analysis workshops, as described below, will help to mitigate the impact of these delivery changes on the evaluation.

    • The introduction of 6 new devolution deals will also impact the delivery of UKSPF from April 2025, affecting 11 of the place-level case studies. These 6 new Combined Authorities (CAs) will take on lead delivery responsibilities, while 2025-26 UKSPF technical guidance instructs lower tier LLAs that previously delivered funding to wind down their delivery. Decisions by new CAs on whether local programme teams continue, and whether some interventions are stopped or reduced in scope, will affect evaluation activity. To mitigate this, we are taking a flexible approach to the final fieldwork period, which focuses on impact evaluation. This will allow resource to be retargeted if substantial changes to the delivery of interventions are made locally after April 2025.

  • Differences in delivery of Theory of Change (ToC) workshops: Some workshops did not take place due to difficulty finding a time for all partners to attend. In these situations, separate conversations were held to assess and complete the ToC. Most workshops were held in person, but some were online or hybrid, at the request of the LLAs, which may have influenced the ability of stakeholders to contribute.

Place-level evidence

  • Generalisability: Because place-level evidence is drawn from a selection of places intended to represent wider UKSPF delivery, it will not be possible to fully generalise from the evidence produced to the whole UK. However, overarching research described below will attempt to identify common factors associated with different types of outcomes and impacts.
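The cross-case Qualitative Comparative Analysis mentioned in this report starts from a "truth table" that groups case studies by configurations of conditions and checks how consistently each configuration is associated with an outcome. The following is a minimal illustrative sketch in Python; the binary conditions and areas are entirely hypothetical, and the real evaluation will define its own conditions from the case-study evidence.

```python
from collections import defaultdict

# Hypothetical binary conditions and outcomes, for illustration only.
cases = [
    {"name": "Area A", "high_funding": 1, "experienced_team": 1, "urban": 1, "outcome": 1},
    {"name": "Area B", "high_funding": 1, "experienced_team": 0, "urban": 1, "outcome": 0},
    {"name": "Area C", "high_funding": 0, "experienced_team": 1, "urban": 0, "outcome": 1},
    {"name": "Area D", "high_funding": 0, "experienced_team": 0, "urban": 0, "outcome": 0},
    {"name": "Area E", "high_funding": 1, "experienced_team": 1, "urban": 0, "outcome": 1},
]

CONDITIONS = ["high_funding", "experienced_team", "urban"]

def truth_table(cases, conditions):
    """Group cases by their configuration of conditions and record,
    for each configuration, the share of cases with a positive outcome."""
    rows = defaultdict(list)
    for case in cases:
        config = tuple(case[c] for c in conditions)
        rows[config].append(case["outcome"])
    return {config: sum(outcomes) / len(outcomes) for config, outcomes in rows.items()}

for config, consistency in sorted(truth_table(cases, CONDITIONS).items()):
    print(config, consistency)
```

In a full QCA, configurations with high consistency would then be logically minimised to identify combinations of conditions associated with the outcome of interest.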

2. Case study selection

This section presents the approach used to select areas to be included in the place-level evaluation case studies. This includes a description of the data used, the approach used for selection and a description of the characteristics of the final case study areas.

Over 250 LLAs were identified as having received UKSPF funding.[footnote 2] The approach to identifying areas to participate in the place-level evaluations involved a two-stage process. Firstly, desk-level research was conducted to develop a long list of 80 LLAs. A short list was created by undertaking more detailed primary and secondary research as described below.

2.1 Data

Several types of data sources were used to assess and categorise UKSPF interventions. These included: management information about the expected and current performance of UKSPF interventions, the amount of funding areas received from non-UKSPF funds, local socio-demographic and local geographic data (including official statistics) and primary research with LLAs.

A full list of data sources can be found in Appendix A.

2.2 Approach to long listing

To move from the full list of over 250 LLAs to an initial long list of 80, the research team undertook desk-level research to assess and categorise the LLAs. Consultations between the research team and key stakeholders at MHCLG, including individuals on the policy team, monitoring team and delivery leads, were used to develop and agree a set of key criteria to inform the long list selection. The categories used are described below. For categories other than the “whether activity had commenced” category, a quota was used to ensure the 80 LLAs selected were representative of the LLAs receiving funding.

  • Whether delivery of UKSPF funded interventions had commenced: In most cases, if there was no evidence in the internal programme documentation that the LLA had started to deliver interventions by March 2023, the LLA was excluded from the long list. The reason for this is that to successfully evaluate UKSPF interventions in an area, there needs to be confidence that the area has delivered a sufficient level of activity and allowed a suitable length of time for outputs and outcomes to be achieved, to draw conclusions about what has worked well and why.

  • The level of UKSPF funding per capita: The level of UKSPF funding awarded to an area was divided by the population in each area, to account for differences in population size. UKSPF funding per capita was then divided into three categories – low (below £10 / capita); medium (between £10 and £30 / capita); and high (above £30 / capita). A further category of UKSPF funding was added – namely whether the LLA had received the minimum level of funding available (£1 million).

  • The level of funding from other MHCLG funds per capita: The level of funding to each local authority from other MHCLG funding streams (Levelling Up Fund, Towns Fund, Future High Streets Fund, Coastal Communities Fund, Community Ownership Fund, Local Growth Fund and Getting Building Fund) that could contribute to similar outcomes as UKSPF was calculated by dividing the total value awarded by the population in each area. The level of MHCLG funding per capita was then divided into four categories – low (below £100 / capita); medium (between £100 and £500 / capita); upper (between £500 and £1,000 / capita); and high (above £1,000 per capita).

  • Other complementary funding streams: Whether an area received funding from either the Department for Education’s Multiply programme or the Department for Environment, Food and Rural Affairs’ (DEFRA) Rural England Prosperity Fund (REPF) was also assessed. These funds were part of UKSPF and are expected to contribute to similar outcomes as the UKSPF, although they were delivered by departments other than MHCLG.

  • Type of lead local authority: There were 3 types of LLA – lower tier, unitary and combined (where the LLA covered a number of local authority areas). This category was included to ensure that the short-listed areas would include different types of local governance.

  • Rural urban classification: Six rural urban categories for the LLAs were used. These were taken from the Rural Urban Local Authority Classification presented by DEFRA for English local authorities. For local authorities in Scotland, Wales and Northern Ireland, the research team examined relevant data (such as relative population levels in urban and rural areas in each local authority) and assigned them a category in the Rural Urban Local Authority Classification. For combined authorities, the research team examined the rural urban classification of each local authority in the combined authority, their populations and assigned an appropriate category. The 6 rural urban categories used are described here.

  • Coastal community: Whether the LLA contains a coastal community (settlement on the coast) or a community on an estuary. This category was included as coastal communities often face different challenges to those further inland.

  • Former mining community: Whether the LLA area contained communities that were formerly supported by the mining industry – as these communities face different challenges to other areas.

  • Types of intervention being funded: The distribution of interventions across the three investment priorities and expected percentage of total funding directed to each type of intervention was assessed at a high level. This was used to identify areas that are focussing their UKSPF efforts on particular priorities and those that are spreading resources across multiple priorities.

  • Other evaluation activity taking place: Finally, areas were categorised by whether they had been selected as a case study for other Government evaluations, such as the Multiply programme or Towns Fund. Areas that had been selected for inclusion in the evaluation of other funds were not excluded from the long list, but this was noted to avoid overburden of local authorities.
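The per-capita funding banding described above is simple arithmetic. A minimal sketch follows, using the thresholds stated in the text; the function and constant names are our own illustrative choices, not part of the programme's tooling.

```python
def band_funding(per_capita: float, thresholds: list, top_label: str) -> str:
    """Return the first band whose upper threshold exceeds the per-capita value."""
    for upper, label in thresholds:
        if per_capita < upper:
            return label
    return top_label

# Banding rules as described in the report (GBP per capita).
UKSPF_BANDS = [(10, "low"), (30, "medium")]                      # above £30 -> "high"
MHCLG_BANDS = [(100, "low"), (500, "medium"), (1000, "upper")]   # above £1,000 -> "high"

def categorise_lla(ukspf_award: float, mhclg_award: float, population: int) -> dict:
    """Illustrative categorisation of one LLA. The £1 million minimum-award
    flag is recorded as a separate category, as in the report."""
    return {
        "ukspf_minimum_award": ukspf_award == 1_000_000,
        "ukspf_band": band_funding(ukspf_award / population, UKSPF_BANDS, "high"),
        "mhclg_band": band_funding(mhclg_award / population, MHCLG_BANDS, "high"),
    }

print(categorise_lla(ukspf_award=3_000_000, mhclg_award=60_000_000, population=150_000))
# -> {'ukspf_minimum_award': False, 'ukspf_band': 'medium', 'mhclg_band': 'medium'}
```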

The research team developed a database of all the characteristics described above and an average profile across the different categories. 222 LLAs (86%) provided evidence that delivery had started on UKSPF funded interventions. Taking these places, a long list was then developed which matched as closely as possible the average profile of UKSPF areas, while maintaining a geographic spread across the UK. Specific details of long listed areas are presented in Annex A, but in summary this included:

  • 32 with high levels of funding; 38 with medium levels and 12 areas with the minimum amount or low categories of UKSPF funding[footnote 3]

  • 29 LLAs that did not receive any ERDF funding (2014-2020); 6 with low levels of funding and 45 with medium or high levels of funding

  • 22 Combined authorities, 20 upper tier and 38 lower tier authorities

  • 40 largely urban LLAs and 40 largely rural LLAs

  • 26 LLAs that included former coal mining communities and 38 LLAs containing coastal communities

2.3 Approach to short listing

Further deep dive research into each of the 80 areas on the long list was undertaken and included:

  • additional analysis of secondary data sources, such as the levels of deprivation, employment patterns, and education statistics

  • a detailed review of the Local Investment Plan, identifying the key local challenges and contexts; local strategies that UKSPF funding would support; key planned interventions and what these aimed to achieve; governance and management structures; and staffing to deliver the UKSPF interventions

  • a review of the evidence in the 6-month reports, which details challenges faced, progress made and evaluation plans

  • LLA interviews, to further discuss planned interventions, progress made with delivery, challenges faced and their ability to support an evaluation. Not all LLAs from the long list were interviewed, with some LLAs not responding to a request to take part in an interview. LLAs that agreed to be interviewed were viewed as more amenable to the evaluation process than those that did not. This was taken into consideration during short listing, however, non-participation alone was not an exclusion criterion. Where interviews were completed, these were conducted via MS Teams and lasted approximately 60 minutes

  • three workshops across the evaluation team to ensure consistency of the findings from the data assessment, and to make recommendations about which areas would be suitable to include in the short list. The themes these workshops covered were:

    • interventions that are being delivered in each area – whether these are a continuation of interventions that are already working well; new, innovative interventions; or interventions combined with those from other MHCLG funding streams

    • wider local strategies at an LLA or regional level, including whether the LLA had combined resources with other LLAs to deliver interventions

    • staffing – the number of staff working on the UKSPF in each area, their level of experience and expertise in delivering funded programmes (for example ERDF), the approach to data collection and the governance structures in place

    • evaluation activity – the number and scale of other evaluations taking place in the LLA, including both local UKSPF evaluation and national evaluations (for example, inclusion in the Towns Fund evaluation)

A slightly different approach was taken when selecting the areas to short list in London and Northern Ireland.

  • London: The LIP for London was developed at a regional level. Therefore, the document review and primary interviews were conducted at the regional level, instead of with each borough. However, analysis of secondary data sources took place at borough level, ensuring that a variety of different characteristics were represented in the case studies within Greater London.

  • Northern Ireland: Documentation was available at the national level only. Following a documentation review and discussion with strategic and delivery leads across MHCLG, a decision was taken to include Northern Ireland as a whole in the evaluation, rather than breaking this into smaller geographic units. Later, a decision was taken to conduct a process evaluation only in NI.

Following this research and the workshops, a proposed short list of LLAs was developed. Selected areas were:

  • those where the research team assessed the LLAs had the ability to support an evaluation

  • a mix of areas continuing existing interventions and those introducing new interventions

  • those that together represented a variety of i) strategic approaches ii) funding levels iii) contexts and iv) geographic range across the whole UK

A final short list was agreed in consultation with MHCLG to ensure learning needs of the department were met.

2.4 Final place selection and characteristics

The final areas selected for the place-level case study research are presented in Table 1. Three local areas were selected in each English Government Office region, four areas in each of Scotland and Wales, and a single case study covering all of Northern Ireland. Further data presented here indicates the variety of areas within the final selection.  

It should be noted that, whilst the sampling of places took a rigorous approach, it retains an element of purposiveness, meaning the case studies do not necessarily represent the whole of the UK.

Table 1

East Midlands

  • Blaby
  • North East Derbyshire
  • Chesterfield

North West of England

  • Wyre
  • Liverpool City Region
  • Blackburn with Darwen [footnote 4]

West Midlands

  • Newcastle-under-Lyme
  • West Midlands Combined Authority
  • Tamworth

East of England

  • Uttlesford
  • Watford
  • Southend on Sea

South East of England

  • Hastings
  • Maidstone
  • Adur

Yorkshire and the Humber

  • East Riding of Yorkshire
  • South Yorkshire
  • North East Lincolnshire

North East of England

  • North of Tyne
  • County Durham
  • Gateshead

South West of England

  • South Hams
  • Torridge
  • Cornwall and the Isles of Scilly

Greater London

Scotland

  • City of Edinburgh
  • Highland
  • North Ayrshire
  • Glasgow City Region

Wales

  • North Wales
  • Cardiff capital / South East Wales
  • Swansea Bay / South West Wales
  • Mid and West Wales

Northern Ireland

  • One case study across the whole country

Table 2 demonstrates that in each region of the UK, selected case studies represent a wide range of levels of deprivation.

Table 2. Deprivation levels of each local area short listed

| Region/nation | Least deprived (top third) | Medium deprived (middle third) | Most deprived (bottom third) |
| --- | --- | --- | --- |
| Greater London | Y | Y | Y |
| East Midlands | Y | Y | Y |
| East of England | Y | Y | Y |
| North East | X | X | Y |
| North West | | Y | Y |
| South East | Y | Y | Y |
| South West | Y | Y | |
| West Midlands | | Y | Y |
| Yorkshire & Humber | | Y | Y |
| Scotland | Y | Y | Y |
| Wales | Y | Y | Y |

Cells with a Y indicate that an LLA has been selected in the GOR/nation and deprivation tertile; an X indicates no LLA was available to select in the tertile; and a blank cell indicates LLAs were available in the tertile and region but were not selected. Where areas were available but not selected, reasons included LLAs not responding to requests for scoping interviews or doubts about the ability of LLAs to support research. In Northern Ireland, localities have not been selected. Welsh and Scottish assessments rely on different data sources, with tertiles assessed within Wales and Scotland respectively: Scotland: Scottish Government (2020) Scottish Index of Multiple Deprivation; Wales: StatsWales (2019) Welsh Index of Multiple Deprivation.
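Tertile assignment of this kind is a simple rank split. The sketch below illustrates the idea with hypothetical areas and deprivation scores; the real assignments use the national deprivation indices cited in the note above.

```python
def assign_tertiles(scores: dict) -> dict:
    """Split areas into thirds by deprivation score (higher = more deprived)."""
    ordered = sorted(scores, key=scores.get)  # least deprived first
    n = len(ordered)
    labels = {}
    for i, area in enumerate(ordered):
        if i < n / 3:
            labels[area] = "least deprived (top third)"
        elif i < 2 * n / 3:
            labels[area] = "medium deprived (middle third)"
        else:
            labels[area] = "most deprived (bottom third)"
    return labels

# Hypothetical areas and scores, for illustration only.
print(assign_tertiles({"Area A": 12.0, "Area B": 25.5, "Area C": 40.1}))
```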

The funding characteristics of the selected case study areas are presented in Figure 1. This shows that the case study areas include a small number that have received a low level of UKSPF funding, with higher proportions receiving medium or high levels of UKSPF funding. This is in line with the proportions included in the long list[footnote 6]. Further, there is variety in the levels of other MHCLG funding the local areas are receiving, and the areas selected include those with Levelling Up Partnerships and Freeports in the area (as of January 2024).

Figure 1. Local area funding characteristics as of January 2024 for short listed areas

| Characteristic | Number of areas |
| --- | --- |
| Rural England Prosperity Funding | 12 |
| Freeports in area | 5 |
| Levelling Up Partnership in area | 4 |
| Other DLUHC funding - high | 5 |
| Other DLUHC funding - upper | 5 |
| Other DLUHC funding - medium | 21 |
| Other DLUHC funding - low | 8 |
| UKSPF funding - minimum (£1m) | 3 |
| UKSPF funding - high | 18 |
| UKSPF funding - medium | 14 |
| UKSPF funding - small | 4 |

For UKSPF funding: minimum - £1million; Small - below £10 per capita; Medium - £10 to £30 per capita; High – more than £30 per capita. For Other MHCLG funding: Low – below £100 per capita; Medium - £100 to £500 per capita; Upper - £500 to £1,000 per capita; High – more than £1,000 per capita.

The selected areas also display a variety of levels of previous regeneration funding (such as the Coastal Communities Fund, Coalfield Communities Landmarks Fund, or European Regional Development Fund), ranging from no or low levels of previous funding to high levels. The selection also displays a wide variety of geographic characteristics (alongside having areas in each region and devolved nation of the UK), including a variety of rural and urban locations, coastal communities and former mining communities (see Figure 2). Again, these proportions are in line with those from the long list average profile and the overall characteristics of areas receiving UKSPF funding.

Figure 2. Geographic characteristics of the local areas short listed

| Characteristic | Number of areas |
| --- | --- |
| Rural | 15 |
| Urban | 21 |
| Coastal | 13 |
| Ex-mining | 12 |

Four LLAs contain both coastal and former mining communities.

3. Evaluation design: Place-level evaluation methodology

Building on early methodological feasibility work undertaken during winter 2023/24, this chapter sets out the overarching evaluation approach taken across all case studies. This involves defining research questions, the unit of analysis to be explored in each case study, a methodological overview, and an outline of the data collection and data analysis strategies that guide the evaluations. Consistency of methodological design across all place evaluations has been balanced against the need to account for local priorities and the types of interventions being delivered, to develop bespoke evaluation designs for each place.

3.1 Overarching evaluation approach

A process, impact and economic evaluation will be undertaken in each of the case study areas:

  • the process evaluation will seek to answer questions on the relevance, efficiency and effectiveness of fund design and interventions undertaken. This includes the following high-level themes: fund design, implementation, delivery and monitoring
  • the impact evaluation will seek to measure and understand the outcomes and impacts of these interventions, as well as the mechanisms that facilitated these changes
  • the economic evaluation will focus on understanding the value for money (VfM) of interventions delivered locally, following the National Audit Office’s 4Es framework: economy, efficiency, effectiveness, and equity

Each place level evaluation is underpinned by a local theory of change. These were first developed during early scoping interviews and documentation reviews described above and finalised during ToC workshops held during the evaluation planning phase. Additional ToCs for each relevant investment priority theme will be developed through the evaluation to refine the contribution claims, to be assessed during impact evaluation.

Finally, to meet local learning needs and priorities, a series of bespoke and locally specific process and impact questions have been identified and refined within each evaluation. These have been designed using the ToCs described above.

3.1.1 Process evaluation framework

Table 3 presents an overarching process evaluation framework for all place-level case studies. This includes common process questions which are of interest in all places.

Alongside the process questions, the table outlines a number of example data sources that can be used to answer those questions. In brief, these sources are local data and documentation; local population research; stakeholder research and secondary data.

Table 3. Overarching process evaluation framework

| Evaluation question | Local data / documentation | Secondary data | Stakeholder research | Local population research |
| --- | --- | --- | --- | --- |
| Fund design | | | | |
| How was the final set of interventions selected? To what extent were existing interventions used? | yes | | yes | |
| To what extent were interventions funded by a combination of UKSPF and other public or match funding? | yes | | yes | |
| To what extent did the evidence used to underpin the design of the scheme reflect the needs of the local area? | yes | yes | yes | |
| How effective were the processes used to select interventions in ensuring local needs were addressed? | yes | | yes | |
| To what extent were all relevant stakeholders (with suitable experience and knowledge) included in the intervention selection decision-making process? | yes | | yes | |
| Fund implementation | | | | |
| How effective were supplier engagement activities in securing sufficient interest in delivering UKSPF interventions? | yes | | yes | |
| To what extent did UKSPF bring about engagement with new suppliers of services? | yes | | yes | |
| To what extent did the procurement processes used ensure that the services purchased offered value for money? | yes | | yes | |
| How effective were the contractual arrangements in ensuring that suppliers delivered the expected levels of activity? | yes | | yes | |
| How effectively did UKSPF learn lessons from and share information with the delivery teams for other local interventions (e.g. Towns Fund, Community Ownership Fund etc.)? | yes | | yes | |
| What challenges were faced during the design of UKSPF interventions and how were these overcome? | yes | | yes | |
| What aspects of UKSPF intervention design have worked particularly well and why? | | | yes | |
| Delivery of interventions | | | | |
| To what extent were the processes used to identify and engage with target populations successful? | yes | yes | yes | yes |
| How effective has the delivery been in catering to the target population? | yes | | yes | yes |
| What challenges were faced during the delivery of UKSPF interventions and how were these overcome? | yes | | yes | yes |
| What aspects of UKSPF intervention delivery have worked particularly well and why? | | | yes | yes |
| To what extent did the staff delivering UKSPF interventions have the required skills and experience to effectively deliver interventions? | | | yes | yes |
| To what extent did the staff managing and governing the UKSPF programme have the required skills and experience to effectively manage the programme? | | | yes | |
| How effectively did UKSPF resources/interventions combine with other interventions being delivered in the local area? | yes | yes | yes | |
| Data collection and monitoring | | | | |
| To what extent did suppliers provide data on interventions, outputs and outcomes in a timely manner? | yes | | yes | |
| What resource was required to quality assure monitoring data and did this change over time? | | | yes | |
| How effective were the processes used to provide data and information to DLUHC and how did this change over time? | | | yes | |
| What resource was required to provide monitoring data to DLUHC and did this change over time? How did this compare to information LLAs collected for themselves? | | | yes | |

3.1.2 Impact and VfM evaluation framework

Table 4 below presents an overarching impact and VfM evaluation framework for all place-level case studies. This includes shared impact questions which are of interest in all places, including questions regarding the value for money of delivery of UKSPF locally as well as specific interventions. These are mapped against potential data sources which include: LLA management information (both internal and data returned to MHCLG); primary research with beneficiaries and stakeholders and secondary data sources.

Table 4. Overarching impact and VfM evaluation framework

| Key evaluation question | Management information | Secondary data sources | Primary research - stakeholders | Primary research - beneficiaries |
| --- | --- | --- | --- | --- |
| To what extent did the UKSPF interventions achieve the outputs and outcomes set out in the Investment plan? | yes | yes | yes | yes |
| What factors have contributed (negatively and positively) towards the achievement of outputs and outcomes? | yes | | yes | yes |
| Over what time scale have outcomes (and impacts) been achieved? Can the outcomes be evaluated by 2025? | yes | yes | yes | yes |
| Over what timescale are longer-term impacts expected to be achieved, and are areas on course to achieve these? | | | yes | yes |
| How effectively has UKSPF funding been used in combination with other public funding to achieve outcomes? | yes | yes | yes | |
| How has UKSPF funding been used to secure private sector match funding? | yes | yes | yes | |
| Has the UKSPF funding offered value for money? | yes | yes | yes | |

3.2 Unit of analysis

There is a wide range of UKSPF funded activities and types of intervention underway across each place, delivered at a variety of geographies and targeting different demographic groups. In order to meet priority learning needs and ensure evaluation resources are targeted most appropriately, we have developed a “unit of analysis” approach.

For each case study, an assessment has been conducted about how evaluation activity should be targeted, including a geographic focus (evaluating across the whole or specific areas of the LLA) and intervention-level focus (evaluating all/most interventions or a subset of interventions). The final decision on the unit of analysis was made during the evaluation planning stage and is presented in each evaluation plan to help frame the evaluation scope and likely analytical methods undertaken to answer research questions. Table 5 presents the criteria for making decisions about the focus for the unit of analysis.

Table 5. Criteria for defining the unit of analysis for each place-level case study

Focus Criteria for decision
Geographic focus Whole LLA
Targeted areas of LLA
- Scale of funding
- Size of area
- Type of LLA (combined, unitary)
- Existing evaluation plans in area
- Variation across area (levels of deprivation)
- Local challenges in implementing evaluation in specific areas
- Geographic coverage of UKSPF interventions
Intervention focus All / most interventions
Subset of interventions
- Number of interventions funded
- Scale of funding for each intervention
- Target number of outcomes for each intervention
- Targeting of cohorts
- Timing of delivery
- Timing of when outcomes will be achieved
- Evaluations covering interventions (intervention level, Towns Fund, local project level)
- Ability to evaluate interventions

3.3 Methodological overview

In order to best answer our research questions and evaluate complex and interacting local delivery across a range of investment priorities, theory-based methods will form the basis for evaluation. This will primarily include using contribution analysis to assess the contribution of different inputs, interventions and activities to subsequent programme outcomes and impacts. The contribution analysis will be informed by a number of evidence sources and analytical techniques, discussed below in sections 3.4 and 3.5, which will help assess a number of contribution claims developed from key causal chains within the local theory of change.

3.4 Data collection strategies

As each evaluation is taking a theory-based approach, multiple strands of data collection and evidence will be needed to support assessment of the research questions.

Secondary data sources and programme MI will provide information about a wide range of outcomes of the UKSPF. However, this will not explain how and why these outcomes have (or have not) been achieved, and there will be some more qualitative outcomes (for example, improvements in confidence following a skills intervention) that are not already collected. Therefore, a suite of primary research is required. This will allow specific interventions to be evaluated with bespoke research tools. Quantitative data collection is likely to include surveys of beneficiaries, LLAs and delivery partners, as well as collection of financial data for economic evaluation. Qualitative data collection strategies will include a range of interviews (one-to-one or group), focus groups and observation days or sessions.

A stakeholder consultation strategy covering the types of stakeholders to be contacted, and the type of primary research to be undertaken has been developed during each place-level case study evaluation planning stage. This includes programme beneficiaries, LLA leads, delivery teams (within and outside the LLA), elected members (where relevant) and businesses or other groups unsuccessful in applying for funding (where relevant).

The research team also explored potential secondary data sources which could be used to support the place-level evaluations. Where feasible, these will be used to assess outcomes and impacts. However, this will often depend on the level of geography at which data is available and the time period of available data (including possible data lags). Many sources will still be able to provide useful contextual information to support the evaluation. In developing the evaluation plans, each data source has been assessed for its usefulness, credibility, robustness and proportionality.

3.4.1 Your Community, Your Say survey

The Your Community, Your Say (YCYS) survey was designed for evaluation activity across the UKSPF programme and is based on the Community Life Survey[footnote 7] (CLS). YCYS is being delivered for 20 place-level case studies. The survey is intended to gather local-level data at LA or neighbourhood level on the impact of delivery of local interventions, primarily related to Communities and Place.

YCYS will be delivered in the places identified in Table 6. For English places, priority was given to those which focused on Community and Place evaluation themes and had a higher number of contribution claims requiring this data. All Scottish and Welsh places were included automatically as Communities and Place interventions will be evaluated in all of them, and there is no CLS for comparable analysis. Details of how this is spatially targeted, including how the data will support evaluation of outcomes, can be found within place evaluation plans. Due to delayed delivery timings, there will only be one wave of YCYS in each place, limiting the ability to compare at a granular level over time. However, data from the boosted CLS will be used to triangulate against findings from this data.

Table 6. Target places for YCYS survey

England Wales Scotland
Tamworth North Wales City of Edinburgh
Durham Mid and West Wales Glasgow City Region
Liverpool City Region South East Wales Highland
Southend South West Wales North Ayrshire
North East Derbyshire    
Maidstone    
North East Lincolnshire    
North of Tyne    
East Riding of Yorkshire    
West Midlands Combined Authority    
Torridge    
Blackburn with Darwen    

3.5 Data analysis strategies

Each place-level evaluation will use contribution analysis as an overarching framework to evaluate the outcomes and impacts of UKSPF funded activity.

In each place, a high-level theory of change (or logic model) has been developed, presenting the expected chain of inputs, activities, outputs and outcomes for local UKSPF activity. Using this, multiple contribution claims have been developed which state hypotheses for how we expect individual interventions, or combinations of interventions, to achieve outcomes. For example, one hypothesis for how People & Skills interventions are expected to enable outcomes in Southend reads: Tailored UKSPF support for individuals to access courses or training has contributed to an increase in the number of formal qualifications or basic skills gained by supported individuals. Within the contribution framework, evidence that would support or refute each hypothesis is set out.
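As a minimal sketch, a contribution claim with its supporting and refuting evidence could be recorded as a simple structured record. The claim text below is the Southend example quoted above; the evidence items and field names are purely illustrative, not taken from any actual contribution framework.

```python
# Illustrative record for a contribution claim. The structure and evidence
# strings are hypothetical; only the claim text comes from the report.
from dataclasses import dataclass, field

@dataclass
class ContributionClaim:
    claim: str
    supporting_evidence: list = field(default_factory=list)
    refuting_evidence: list = field(default_factory=list)

claim = ContributionClaim(
    claim=("Tailored UKSPF support for individuals to access courses or "
           "training has contributed to an increase in the number of formal "
           "qualifications or basic skills gained by supported individuals."),
)

# Evidence gathered during fieldwork would be appended to the relevant list.
claim.supporting_evidence.append("MI shows qualification outputs above profile")
claim.refuting_evidence.append("Some beneficiaries attribute gains to other support")
print(len(claim.supporting_evidence), len(claim.refuting_evidence))  # 1 1
```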

To ensure consistency across all case studies, a set of rubrics will be developed to assess the quality and weight of different kinds of evidence in assessing the hypotheses. These may include triangulation of data sources, plausibility and representativeness. Each rubric will have a set of defined levels at which it is assessed (e.g. high, medium or low).
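The mechanics of a rubric assessment can be sketched as follows. This is an assumption-laden illustration: the criterion names (triangulation, plausibility, representativeness) come from the paragraph above, but the numeric mapping, thresholds and overall ratings are invented for the example and do not reflect the actual rubrics, which are still to be developed.

```python
# Hypothetical rubric scoring for a contribution claim. The level-to-number
# mapping and the thresholds below are illustrative assumptions only.
RUBRIC_LEVELS = {"low": 1, "medium": 2, "high": 3}

def assess_claim(scores):
    """Combine per-criterion rubric levels into an overall strength rating."""
    avg = sum(RUBRIC_LEVELS[level] for level in scores.values()) / len(scores)
    if avg >= 2.5:
        return "strongly supported"
    if avg >= 1.5:
        return "partially supported"
    return "not supported"

claim_scores = {
    "triangulation": "high",
    "plausibility": "medium",
    "representativeness": "high",
}
print(assess_claim(claim_scores))  # strongly supported
```

In practice the defined levels would be applied qualitatively by evaluators rather than averaged mechanically; the sketch only shows how defined levels make assessments comparable across case studies.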

Quasi-experimental evaluation

In addition to, and in support of, this theory-based method, each case study planning phase involved an assessment of whether, and which type of, Quasi-Experimental Design (QED) might be feasible. Factors considered included the scale of activity (funding and beneficiary population size of interventions), whether it would be feasible to collect or receive data at the appropriate level and for the appropriate time period, and whether the intervention had been, or would be, running for long enough to capture sufficient outcomes data. Further detail on the QED feasibility assessments can be found within each evaluation plan.

4. Summary of evaluation themes

This section provides an overview of the evaluation themes that have been identified in the case studies.

4.1 Key evaluation themes

The most prominent intervention themes selected for evaluation across all case studies are shown below, split by the 3 investment priorities. There are two additional cross-cutting themes identified as being common across many places, which deliver outcomes that do not lie within one specific investment priority.

Table 7. Evaluation themes by investment priority

Themes Count of themes Proportion of themes across all plans
Communities and place 189 45%
Supporting local businesses 136 33%
People and skills 93 22%

The split of evaluation themes differs slightly between England, Wales and Scotland, with Scotland having a more even focus across the investment priorities. Welsh and English places have a greater evaluation focus on the Communities and Place and Supporting Local Businesses priorities, with only 13% of Welsh evaluation themes covering People & Skills interventions[footnote 8].

Table 8. Evaluation themes by investment priority and country

Themes England Scotland Wales
Communities and place 45% 38% 49%
People and Skills 23% 29% 13%
Supporting local businesses 32% 32% 38%

Note: Analysis based on 336 English themes; 55 Scottish themes; 34 Welsh themes.

Table 9 shows the most prevalent evaluation themes selected within each investment priority, across all evaluation plans.

Table 9. Most prevalent evaluation themes within investment priorities[footnote 9]

Evaluation themes Proportion of all themes within investment priority (%)
Communities and place  
Perceptions of local place/facilities/community pride 16
Increased footfall (town centre/specific location) and visitor numbers 7
Community engagement, including using facilities/amenities 7
Volunteering 7
Supporting local businesses  
New business practices including: ways of working and innovation 15
New jobs 14
New businesses 13
Improved productivity/business performance 13
People and skills  
Improving skills 35
Employability (support) 12
Employment opportunities 9
Increased levels of economic activity 8

5. Summary of impact evaluation methodologies

This section provides an overview of the impact evaluation methodologies that are being utilised across the case studies.

5.1 Impact evaluation

5.1.1. Contribution analysis

As discussed above, theory-based evaluation will be the key methodology for understanding impact across the case studies. Contribution analysis (CA) will be the primary framework used. This method attempts to identify the contribution of different factors and influences to the outcomes and impacts we observe. CA is suitable here as direct attribution (including identifying causation) is not feasible given the complexity of delivery at a local level. Each of the case studies will undertake a CA by constructing a number of contribution claims based on key causal chains within the theories of change. Using triangulated data and a robust set of rubrics, the contribution claims will be assessed as supported or refuted.

A narrative for each evaluation will be constructed that summarises multi-source evidence to create an overview of how UKSPF has contributed to outcomes and impacts locally.

5.1.2. QED feasibility

At the evaluation planning stage, the feasibility of including QED methods in each local evaluation was assessed. 18 places indicated that some type of QED was feasible (primarily difference-in-differences analysis), although further scoping will be undertaken as evaluators build relationships locally. Business Support and Communities & Place interventions were the most common themes for the QEDs planned. We anticipate that the final number of QED evaluations will be lower than this due to the availability of data and the timing of evaluation activity. Counterfactual analysis will be attempted within 3 place evaluations, using Propensity Score Matching.
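The logic of a two-period difference-in-differences estimate can be sketched in a few lines. The outcome values below are made up for illustration and do not come from any UKSPF data; real analyses would use regression with controls rather than simple means.

```python
# Illustrative two-period difference-in-differences calculation.
# All numbers are hypothetical; this is not UKSPF data.

def did_estimate(treated_pre, treated_post, comparison_pre, comparison_post):
    """Return the difference-in-differences estimate of the treatment effect:
    the change in the treated group minus the change in the comparison group."""
    treated_change = treated_post - treated_pre
    comparison_change = comparison_post - comparison_pre
    return treated_change - comparison_change

# Hypothetical mean outcome (e.g. a footfall index) before and after delivery
# in a treated area and a comparison area.
effect = did_estimate(treated_pre=100.0, treated_post=112.0,
                      comparison_pre=98.0, comparison_post=103.0)
print(effect)  # 7.0
```

The estimate is only credible where the comparison area would plausibly have followed the same trend as the treated area absent the intervention, which is precisely what the feasibility assessments described above test.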

5.1.3. Caveats/limitations

  • A Randomised Controlled Trial (RCT) approach is not feasible in any area: The interventions within an area have already been defined and are being delivered, so there has been no opportunity to adapt intervention delivery plans to randomly allocate treatment and control groups within an area (for example, utilising a staggered approach to delivery to form a counterfactual case). Additionally, the delivery of UKSPF interventions is happening over a relatively short period of time, which does not allow sufficient time between cohorts to successfully implement a pipeline design.

  • There is limited opportunity to create counterfactual cases outside a local authority area: All local authority areas have received UKSPF funding, and most are implementing some form of interventions across all three investment priorities. However, in some cases it has been possible to identify specific areas within the LLA that are not receiving any comparable interventions. An assessment of whether a QED with a comparison group was feasible was undertaken in each area and included analysis of: the size of the population benefitting from an intervention; potential places not delivering a similar intervention, both within and external to the LLA; and whether secondary data sources could be used to support the impact assessment.

  • Interactions of UKSPF and other local growth funds: An LLA may have secured funding from other government sources (e.g. Levelling Up Fund, Towns Fund) and combined these with UKSPF funding to achieve outcomes. Although models could attempt to control for these other funds, it may be challenging to isolate the impact of UKSPF funding alone. The combination of different funding will be considered through the process and impact evaluation by undertaking interviews with relevant strategic and delivery staff to understand the overlapping impacts. Impact evaluation using contribution analysis will help draw this data together with other sources to attempt to identify the contribution of other funds’ activity to UKSPF outcomes.

6. Cross-case study analysis

In order to identify what works well in which contexts across the case studies (at an aggregate level), a Qualitative Comparative Analysis (QCA) will be undertaken. QCA is a method for analysing the contribution of different conditions (e.g. aspects of an intervention and the wider context) towards an outcome of interest, and can be used to understand the factors which are more likely to support the achievement of the programme’s outcomes of interest. The following steps will be taken in this analysis:[footnote 10]

  • an outcome of interest/research question will be identified
  • a theoretical framework will be produced which sets out the change under investigation and the hypothesis or theory of how this change came about, established by collaboration between analytical, policy and delivery partners
  • a set of potential explanatory factors which may influence the change will then be identified, including characteristics of a project whose presence or absence may contribute to outcomes
  • a set of cases that have targeted the achievement of the outcome of interest will be selected which can be used to explore configurations of the purported drivers with different results
  • each case will be scored against each of the factors, either zero or one for a ‘crisp set’ QCA, or scores of between zero and one for a ‘fuzzy set’ QCA
  • data will be analysed through specialist software (such as fsQCA), which takes a rigorous logic-based approach to identifying patterns across multiple case studies and factors or preconditions. This will help identify combinations of explanatory factors related to the delivery of UKSPF that have led to different outcomes.
  • outputs of the QCA will be reviewed against case study evidence and place-level ToCs to sense check the results. This stage may require seeking additional case material, revising ToCs and carrying out steps 2-6 again. In this sense, the QCA may be an iterative process for seeking multiple causal pathways to address a particular problem.
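The scoring and truth-table stages above can be sketched for a crisp-set QCA. The factor names, case scores and outcomes below are entirely hypothetical; real analyses would use specialist software such as fsQCA, as noted above, and many more cases.

```python
# Minimal crisp-set QCA sketch: each case is scored 0/1 on each explanatory
# factor, cases sharing a configuration are grouped into a truth table, and
# the share of positive outcomes per configuration is computed. All data here
# is invented for illustration.
from collections import defaultdict

cases = [
    {"rural": 1, "high_deprivation": 0, "match_funding": 1, "outcome": 1},
    {"rural": 1, "high_deprivation": 0, "match_funding": 1, "outcome": 1},
    {"rural": 0, "high_deprivation": 1, "match_funding": 0, "outcome": 0},
]
factors = ["rural", "high_deprivation", "match_funding"]

# Truth table: configuration of factor scores -> outcomes observed.
truth_table = defaultdict(list)
for case in cases:
    config = tuple(case[f] for f in factors)
    truth_table[config].append(case["outcome"])

# Consistency: the proportion of cases in a configuration showing the outcome.
for config, outcomes in truth_table.items():
    consistency = sum(outcomes) / len(outcomes)
    print(config, consistency)
```

A ‘fuzzy set’ QCA would follow the same structure but with membership scores between zero and one rather than binary values.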

This analysis will allow the identification of factors which have successfully supported UKSPF delivery across all areas, and factors, or combinations of factors, which have been successful in supporting UKSPF delivery in specific areas (for example, what factors have worked well in rural areas, or in more deprived areas).

7. Timeline

Case study research has been split into 2 tranches, with the timing of research activities and reporting staggered further within each tranche based on locally agreed timelines.

The following are key delivery points for the evaluation:  

  • Evaluation plans designed and delivered (completed December 2024)
  • Interim reports delivered, covering process evaluation research and early impact findings (completing by April 2025)
  • Contribution analysis & evaluation planning workshops (June 2025)
  • Final reporting, covering process, impact and value for money evaluation (Autumn to Winter 2025)
  • Overarching meta-analysis report (December 2025).

8. Appendix A – Long listing data sources

LLA Local Investment Plan

  • Level of UKSPF funding
  • Level of Multiply funding
  • Planned UKSPF interventions
  • Planned outputs and outcomes

LLA 6-month update

  • Level of spend to March 2023
  • Interventions provided to March 2023
  • Challenges faced with delivery

MHCLG internal data

  • Level of UKSPF funding
  • Level of Multiply funding
  • Level of Rural England Prosperity Fund funding
  • Level of other MHCLG funding
  • Evaluation activity taking place (e.g. Towns Fund evaluation, other UKSPF evaluation components)

Evidence review

  • Level of ERDF funding by area (2014-2020)
  • Coastal community in LLA
  • Former mining community in LLA

National Level Official Statistics

  • Levels of deprivation
  • Population statistics
  • Rural Urban classifications

Primary research with LLAs

  • Update on progress with delivery
  • Challenges faced with delivery
  • Staffing levels
  • Ability to support evaluation
  1. This stage included a lengthy delay as a result of the pre-election period ahead of the General Election in July 2024. 

  2. This is based on the areas that provided a 6-month report to MHCLG on UKSPF activity for March 2023. A total of 252 LAs were involved in UKSPF delivery overall due to changes in lead responsibilities during devolution. 

  3. The areas which received the minimum level of funding did not always fall into the low levels of funding per capita category – therefore the LLAs sum to more than 80 for funding categories. 

  4. Research not undertaken after the evaluation planning phase due to local capacity. 

  5. Research not undertaken after the evaluation scoping phase due to concerns about delivery timings. 

  6. The areas which received the minimum level of funding did not always fall into the low levels of funding per capita category – therefore the LLAs sum to more than the total number of places selected across the funding categories. 

  7. UKSPF evaluation activity has also funded a boost to the CLS between 2023/24 and 2025/26, to make the survey representative at an LA level, to support local level evaluation. Previously this has only been representative at a regional level. 

  8. Delivery of People & Skills interventions could only be started in England from 2024-25, but were available in Scotland, Wales and Northern Ireland from 2022-23. 

  9. Only the top 4 themes have been presented here for each investment priority, so totals do not sum to 100%. 

  10. This list is adapted from IEG (2022) QCA: Exploring Causal Links for Scaling Up Investments in Renewable Energy, though it is a common list of steps to all applications of QCA.