Research and analysis

Executive summary - Interim evaluation of MSGF and PTCR programmes

Published 8 October 2024

Introduction and overview

The Department for Culture, Media and Sport (DCMS) commissioned Deloitte in August 2023 to conduct an independent assessment of 2 key funding programmes implemented and delivered by DCMS:

  • the Multi-Sport Grassroots Facilities (MSGF) programme
  • the Park Tennis Court Renovation (PTCR) programme

Both the MSGF and PTCR programmes provide direct investment to build or upgrade grassroots facilities, aiming to boost activity levels and sports participation amongst local communities. In particular, their focus is on delivering projects that benefit those from under-represented groups, and those within communities impacted by higher-than-average levels of deprivation,[footnote 1] to ensure physical activity is accessible to all, regardless of background or location.

The Lionesses Futures Fund (LFF) aims to continue to support the growth in female sport participation across England; this funding has also been brought within the scope of the evaluation, and for the purposes of this report is considered a sub-section of the MSGF Programme.

This interim evaluation report builds on the initial feasibility study published by Ipsos UK in May 2023 and an evaluation plan finalised in December 2023. The details of the evaluation plan are outlined in this interim report and associated annexes.

The objective of the overarching evaluation of the programmes has been to monitor their outputs and outcomes, and assess their impact and value for money (VfM). The evaluation comprises: 

  • process evaluation: to understand whether programme activities have been implemented as intended and resulted in the desired outputs in an efficient and effective manner
  • impact evaluation: to understand the extent to which the programmes made a difference in the achievement of the expected outcomes
  • economic evaluation / value for money (VfM): to understand, in parallel with the process and impact evaluations, the benefits and costs of the programmes, and whether the use of resources over the course of implementation has been efficient, effective, and equitable

This interim report discusses emerging findings from the first round of fieldwork conducted from January to March 2024, providing particular insights and focus on the process evaluation, as well as early indications of outputs and outcomes as part of the impact evaluation.

Findings will be reported further over the course of the evaluation, with an additional interim report planned for March 2025 and a final report in March 2026. These reports will draw on additional data and undertake causal analysis, which has not yet been feasible for this interim report. This will allow outputs and outcomes to be explored in more depth, building on the findings of the impact and process evaluations, before being assessed as part of an economic evaluation.

The main aims of the evaluation of the programmes are to:

  • monitor the overall performance and progress of the 2 programmes
  • assess how the programmes are being implemented, and the extent to which they are meeting the demand-side and supply-side outcomes and driving sustained impact, to understand the government’s return on investment
  • investigate the existence of causal links between investment in grassroots sports facilities and improvement in participation and physical activity
  • identify lessons learned to inform current programme delivery and potential future programme design and implementation
  • demonstrate accountability and transparency in the allocation of public funding by assessing whether the intended impacts of the programmes have been achieved
  • assess the VfM that the programmes are providing to the taxpayer

Evaluation questions (EQs)

It was agreed during the evaluation planning stage that the following overarching research question would apply to this evaluation:

To what extent have the programmes delivered improvements to facilities in need of investment and created a positive impact on physical activity within these facilities in England, Scotland, Wales, and Northern Ireland?

From this, an evaluation framework has been developed, underpinned by a theory of change, to establish a number of evaluation questions that explore this overarching topic in further detail.

The core evaluation questions are shown below; a range of more detailed sub-questions sits beneath them:

  • EQ1: Have the new / improved facilities resulted in additional participation in sport at the facility and in local areas?
  • EQ2: Does the investment in facilities have an impact on participation levels from underrepresented groups and within deprived areas?
  • EQ3: Do the new / improved facilities increase awareness of sports, and / or improve the perception of activity in local communities (e.g. quality of life, pride in place, community cohesion) for individuals?
  • EQ4: Have the Programmes improved collaborative working and available evidence?
  • EQ5: Has the Lionesses Futures Fund achieved its intended outcomes?
  • EQ6: Has the Lionesses Futures Fund helped to create safe and welcoming spaces for women and girls to play?

Methodology, data and approach

This evaluation uses a mixed-methods approach to evaluate a diverse and representative cross-section of projects. This involves using a range of sources to capture comprehensive information for the process, impact and economic evaluations, to help understand how the 2 programmes have been delivered, whether they have achieved their intended outputs, outcomes and impacts, and whether they have done so in a way that delivers value for money.

This interim report has been supported by only one wave of data collection to date, conducted through February and March 2024.

Significant further data collection is planned across 2 further waves, in the financial years 2024 to 2025 and 2025 to 2026.

Primary data collection and fieldwork

The evaluation of the MSGF and PTCR programmes is underpinned by a range of primary data collection techniques. This has included a comprehensive plan of surveying across facilities, users and households for MSGF, in addition to case studies and interviews with a range of key stakeholders across both programmes.

A summary of the primary data sources that have been used to support this evaluation is outlined below, with further detail available in section 4.1 of the full report. 

Figure 1: Primary data collection (as of May 2024)

  • Stakeholder interviews: 34
  • Case studies: 10
  • Household survey: around 5,000 households
  • User survey: over 2,000 users
  • Facility survey: over 950 facilities

Programme overview

The MSGF programme will invest £320 million between 2021 and 2025.

Over the lifetime of the programme, funding is allocated across the UK as follows:

  • England: £279 million
  • Scotland: £20.1 million
  • Wales: £13.9 million
  • Northern Ireland: £7 million

As of February 2024, £176.3 million has been committed across the UK. Of this investment, the North West of England received the most funding per capita (relative to population size), at £5.80, while London received the least, at £0.51.

The PTCR programme will spend £30.3 million over the financial years 2022 to 2023 and 2023 to 2024, with DCMS investing £21.9 million and an additional £8.4 million from the Lawn Tennis Association (LTA). As of April 2024, 1,792 courts across 552 parks have been renovated as part of the programme in England. The region with the largest number of courts that received funding was London with 586, with Yorkshire and the Humber the region with the fewest funded courts. Additionally, 96 courts in Scotland were refurbished, along with 89 courts in Wales.

Process evaluation approach

This process evaluation explores whether the programme interventions have been implemented as intended and resulted in the desired outputs, as well as considering the extent to which the programmes have been delivered in an efficient and effective manner. This will examine issues including governance, communication, and delivery, with lessons learned for future refinement of the 2 programmes, and others across DCMS and the wider government. 

Laying strong foundations for the evaluation across all 3 aspects (process, impact and economic), through robust design and data collection processes, has been the initial focus of activity. Findings around implementation and process are more readily available at this stage due to the status of the programme’s delivery and the data collection that has been undertaken so far.

To date, the following data collection activities have supported this activity:

  • A review and assessment of programme documentation and monitoring data.
  • 34 interviews with key stakeholders involved in the delivery of the programmes, to help understand how well the processes and delivery of the programmes have functioned, from their launch to the present day.
  • 10 case studies of facilities in receipt of MSGF or PTCR funding, gathering first-hand insights from a range of stakeholders at each facility to investigate the impacts of the funding, in order to address the evaluation questions and to assess the degree to which the funding mechanisms functioned as intended and were efficient and effective for these sites.
  • Analysis of secondary data including the Active Lives Survey, Active Places Power and the Community Life Survey.
  • Analysis of the initial wave of data available from 3 key surveys covering the MSGF Programme:
    • Facility survey: a survey sent to both funded and unfunded facilities across the UK for completion by managers of the facility, which collected self-reported data from facility managers. The survey explores current and pre-funding levels of participation, as well as other outcomes. 259 responses were received from funded sites (a 62% response rate), and 288 responses received from unfunded sites (a 53% response rate).
    • User survey: a survey distributed to users by managers of funded and unfunded facilities across the UK, which explores the perceptions and views of users at an individual level. It provides granular descriptive findings to support and supplement the counterfactual impact assessment and other data sources. Facility users were purposively sampled using contact details held by facility managers. 2,222 responses were received.
    • Household survey: an online survey conducted using YouGov’s panel. Respondents were selected based on proximity to funded and unfunded sites. The purpose of the survey is to help to fill existing data gaps, particularly those that exist around pride in place and social cohesion. In addition to this, the survey covers themes of general wellbeing and physical activity at and outside funded and unfunded facilities. 5,128 responses were achieved. The data was weighted by age and gender to UK adult (over 18) targets. The data was also weighted by nation to match the distribution of facilities across England, Wales, Scotland, and Northern Ireland (a simple illustration of this type of weighting is sketched after this list).
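To illustrate what such weighting involves, the sketch below shows a minimal post-stratification calculation in Python. The age bands, target shares and column names are invented for the example and are not the targets or variables actually used for the household survey.

```python
import pandas as pd

# Hypothetical respondent-level extract from an online panel.
# Column names, age bands and values are illustrative only.
respondents = pd.DataFrame({
    "age_band": ["18-34", "35-54", "55+", "18-34", "55+", "35-54"],
    "gender":   ["F", "M", "F", "M", "M", "F"],
})

# Assumed population targets (shares of UK adults) for each age/gender cell.
# These figures are invented for the example, not official statistics.
targets = {
    ("18-34", "F"): 0.14, ("18-34", "M"): 0.14,
    ("35-54", "F"): 0.17, ("35-54", "M"): 0.17,
    ("55+", "F"): 0.20, ("55+", "M"): 0.18,
}

# Post-stratification: weight = population share / sample share for each cell.
sample_share = respondents.groupby(["age_band", "gender"]).size() / len(respondents)
respondents["weight"] = respondents.apply(
    lambda r: targets[(r["age_band"], r["gender"])] / sample_share[(r["age_band"], r["gender"])],
    axis=1,
)
print(respondents)
```

In practice, weights of this kind would be derived from published population estimates and typically refined across several dimensions at once (for example, age, gender and nation), rather than from a single two-way table as in this sketch.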

Impact evaluation approach

The chosen methodology for a counterfactual impact evaluation aligns with the methodology set out in the feasibility study.[footnote 2] This will seek to use quasi-experimental methods to establish if there is evidence of a causal effect from the programmes, relative to a scenario where funding was not granted toward multi-sport facilities or tennis courts. As part of this evaluation, a steering group has also been established to provide challenge and feedback on the impact evaluation methodology. 

The primary objective of this evaluation is to investigate the causal effect of programme funding on sports participation and physical activity, in an environment where certain facilities or courts have been allocated funding, and others have not. As the MSGF and PTCR programme designs do not allocate the funding randomly, but instead grant funding based on a structured selection process, a quasi-experimental method, using a Differences-in-Differences (DiD) model, is best suited to measure the effect of these policy interventions (termed as the “treatment”).

This method seeks to estimate the differences, if any, in the intended outcomes of the programmes between the “treated” groups (facilities that applied and were awarded the funding) and the “control” groups (facilities that applied for the funding but were not selected), where both groups were assessed based on the same selection criteria within each nation. This approach aligns with the impact evaluation approaches set out in the Magenta Book[footnote 3], as well as with the findings of the feasibility study conducted ahead of this evaluation.
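As a purely illustrative sketch of this type of specification, the Python example below estimates a simple two-period difference-in-differences model using the statsmodels library. The dataset, variable names and figures are hypothetical assumptions and do not represent the evaluation’s actual data, model or results.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative facility-level panel: one row per facility per period.
# Variable names and values are hypothetical, not evaluation data.
data = pd.DataFrame({
    "facility_id":   [1, 1, 2, 2, 3, 3, 4, 4],
    "funded":        [1, 1, 1, 1, 0, 0, 0, 0],   # "treated": applied for and awarded funding
    "post":          [0, 1, 0, 1, 0, 1, 0, 1],   # 0 = pre-intervention period, 1 = post-intervention
    "participation": [120, 180, 90, 150, 110, 125, 100, 112],  # e.g. weekly users
})

# Difference-in-differences specification: outcome regressed on treatment,
# period, and their interaction. The funded:post coefficient is the DiD
# estimate of the treatment effect, under the parallel-trends assumption.
model = smf.ols("participation ~ funded * post", data=data).fit()
print(model.params["funded:post"])
print(model.summary())
```

The coefficient on the funded:post interaction is the difference-in-differences estimate, and is only valid under the parallel-trends assumption; in practice, standard errors would typically also be clustered at the facility level.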

Given that only wave 1 data collection has been completed, and the second and third waves of survey data collection are planned for the financial years 2024 to 2025 and 2025 to 2026, the causal impact evaluation methodology outlined above will be applied at the next stage of the evaluation.

The data currently available from the first wave of data collection, in the form of the facility, user and household surveys as well as the LTA PTCR booking data for investments allocated pre-2022, is assessed descriptively. This includes drawing insights from self-reported estimates of participation through a pre- and post-programme comparison of funded and unfunded facilities. Self-reporting was determined to be the most pragmatic and feasible approach in the absence of administrative data on participation. The data is used to perform exploratory analysis and develop emerging results that provide an initial view of how the programmes have been performing, whilst awaiting further data collection before conducting the causal analysis.
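The descriptive pre- and post-programme comparison described above can be illustrated with the following minimal sketch, again in Python. The column names and figures are hypothetical and the calculation is indicative only; it makes no causal claim.

```python
import pandas as pd

# Hypothetical self-reported facility survey extract; columns and values
# are illustrative only.
survey = pd.DataFrame({
    "funded":            [1, 1, 1, 0, 0, 0],
    "participation_pre": [100, 80, 60, 90, 70, 110],   # self-reported, pre-funding
    "participation_now": [150, 95, 90, 95, 68, 115],   # self-reported, current
})

# Descriptive (non-causal) comparison: share of facilities reporting an
# increase, and average change, split by funding status.
survey["increased"] = survey["participation_now"] > survey["participation_pre"]
survey["change"] = survey["participation_now"] - survey["participation_pre"]
print(survey.groupby("funded")[["increased", "change"]].mean())
```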

Process evaluation: key emerging findings

The key emerging findings below are set out thematically and are broken down by programme; this grouping aims to clearly convey the extent to which the programmes have been delivered efficiently and effectively in key areas of design, implementation, and delivery. At this stage, the process evaluation does not consider the Lionesses Futures Fund, as the fund is still at an early phase of design and development.

The emerging findings from the process evaluation for the MSGF Programme should be considered by all delivery partners and DCMS, although there are instances where particular observations may be more relevant to specific delivery partners. These instances are highlighted below.

Multi-Sport Grassroots Facilities programme

Early programme development

Despite initially limited capacity and experience of delivering similar programmes, DCMS launched the programme in the financial year 2021 to 2022, quickly developing key relationships with stakeholders and progressing the design and delivery of the programme to enable funding to be allocated. (DCMS; England, Scotland, Wales, Northern Ireland)

Funding key performance indicators (KPIs)

KPIs used as part of the assessment criteria for awarding funding were perceived as a facilitator in driving the right behaviour from applicants and delivery partners.

They were generally understood and accepted, and encouraged funding to be delivered in focus areas to benefit those from under-represented groups and those within communities impacted by higher-than-average levels of deprivation. (DCMS; England, Scotland, Wales, Northern Ireland)

Deprivation and multi-sport

The KPIs related to deprivation and multi-sport could have been more clearly defined. For example, the level of granularity at which deprivation is defined sometimes caused issues for the selection of appropriate projects. Similarly, a lack of clarity around what constitutes a ‘multi-sport’ project, and whether ‘sport’ is to be differentiated from ‘physical activity’, has affected potential project selection. (DCMS; Scotland, Wales, Northern Ireland)

Application process

Application processes for funding were unique to each delivery partner, which led to a lack of consistency and comparability across nations. However, a standardised approach could present practical challenges given the differences between nations and delivery partners in levels of resource, nation size and the total amount of funding to deliver. Processes also differed substantially in complexity and length. In some cases, during the early phases of the programme, this meant applicants (particularly volunteers) felt they faced a burdensome application process. (Scotland, Wales, Northern Ireland)

Iterations and improvements have been made to application processes over time, with key learnings and insights shared effectively across delivery partners. (England, Scotland, Wales, Northern Ireland)

Panel representation

Decisions to award funding and the distribution of the funding have been conducted in different ways across nations. Whilst grant panels have iteratively improved in terms of transparency and diversity of membership, some delivery partners felt that more could be done to improve representation by ensuring greater inclusion of a wide range of perspectives and experiences in the decision-making process in all nations. (Scotland, Wales, Northern Ireland)

Collaboration

Overall, there has been strong communication and collaboration from all parties, and a clear willingness and enthusiasm to work together to achieve the best possible outcomes. Asks of delivery partners have been stretching at times, but stakeholders have been professional, polite, and proactive in rising to the challenge. (DCMS; England, Scotland, Wales, Northern Ireland)

Project delivery

Project delivery has been perceived to be effective, but improvements focused on additional technical expertise and flexibility around the allocation of funding could help improve the efficiency of future delivery. (DCMS; Scotland, Wales, Northern Ireland)

Programme monitoring data

Programme monitoring has significantly improved since the programme launched, and stakeholders engage with the regular processes of reporting and monitoring key updates in programme delivery. However, there are still ongoing issues with the quality and timeliness of data submission from delivery partners to DCMS, with consequences for the value and usability of this data for stakeholders, as well as creating an additional burden for DCMS and delivery partner staff in resolving data issues. (DCMS; England, Scotland, Wales, Northern Ireland)

Stakeholder relationships

Relationships with facilities and local communities have been improved and strengthened by the programme, and delivery partners have widened their networks and understanding of sports participation across the UK. (England, Scotland, Wales, Northern Ireland)

Achievement of outcomes

Stakeholders universally agree that participation and physical activity have increased at funded facilities, although they acknowledge that further causal analysis is required to determine additionality. For example, some DCMS and delivery partner staff suggested that the impacts may have been more significant for existing players, rather than encouraging new players.

Funding to date has sometimes focused on clubs with existing facilities rather than areas where no facilities previously existed. Investment in the latter could be a significant driver of further growth in participation, ensuring that those in areas with limited sports provision can get involved in physical activity and sport. (DCMS; England, Scotland, Wales, Northern Ireland)

Overall

Overall, whilst recognising this is an interim evaluation, the evidence suggests delivery of the programme has become iteratively more efficient from the financial year 2021 to 2022 to date, with many key learnings and improvements implemented.

Evidence also suggests effective delivery of the programme, but the extent to which this continues and improves is subject to ongoing delivery through the final programme phases during the financial year 2024 to 2025. (DCMS; England, Scotland, Wales, Northern Ireland)

Park Tennis Court Renovation programme

Early programme development

The initial development and design of the PTCR programme was less efficient than it might have been; internal collaboration with commercial colleagues in particular created delays. DCMS has since overcome these issues and improved its processes and planning.

Funding KPIs

The LTA used an appropriate process with relevant KPIs for identification and selection of sites to be renovated, with input from a range of stakeholders.

Technical assessments of projects and associated cost estimates underestimated the extent of required work and funding on a number of occasions. However, the LTA and DCMS were effective in appointing a new third party to manage the risks that inaccurate technical assessments posed to overall delivery. A funding shortfall remains for the programme, and discussions are ongoing on how to close this gap.

Collaboration

Communication and collaboration across stakeholders was a core strength of the programme’s delivery right from inception, and dedicated resource for the programme from both the LTA and DCMS has significantly improved the efficiency and effectiveness of delivery. Officials have also been professional, polite and proactive in the face of often stretching asks.

Project delivery

Project delivery has been perceived to be efficient and effective, with a high volume of court renovations completed within a constrained period of time. The LTA has used its experience and knowledge and adapted quickly to issues that have arisen. Delays have often resulted from stakeholders external to delivery of the programme and outside DCMS or LTA control.

Programme monitoring data

Programme monitoring has been straightforward and positive, and stakeholders have agreed on the accuracy and timeliness of the data being shared. While there have been issues in funding allocation and in reporting against targets, these have been dealt with appropriately.

Stakeholder relationships

LTA and DCMS staff believed relationships with a broad range of stakeholders had improved as a result of this funding, largely due to proactive communication and a transparent approach to decision-making.

Achievement of outcomes

Stakeholders acknowledged the positive effect of the funding in achieving an uptick in participation at the courts in receipt of this funding, whilst recognising that the causal link between the programme and overall participation impacts will be more challenging to determine.

There was an appreciation that the programme may need to go further than renovations alone, with further proactive initiatives, such as Free Park Tennis, seen as key to sustaining this uptick over the long term.

Overall

Overall, whilst recognising this is an interim evaluation, the evidence suggests initial challenges were overcome and the programme has been efficiently delivered. The effectiveness of the programme will be better understood with greater booking data collection and analysis.

Observations for further programme delivery and future programmes

As included at the end of each section of the process evaluation, key observations for both the remaining period of these specific programmes, and for prospective programmes of this type in the future, are as follows:

Table 1: Key considerations from the process evaluation

# Observations Applicability
1 Continue to champion and enable knowledge sharing amongst delivery partners, reviewing DCMS internal delivery processes, communication, and resourcing, to ensure teams are empowered and have the appropriate skills and experience. MSGF (DCMS & E/S/W/NI); PTCR; future programmes
2 Review the suitability of the ‘deprivation’ KPI and its geographical granularity. This could potentially better account for socio-economic variations within local authorities. MSGF (DCMS); future programmes    
3 Provide greater guidance to delivery partners and potential funding applicants on what constitutes a ‘multi-sport’ project, to give more clarity on what can be delivered. MSGF (S/W/NI)
4 Continue early engagement with future applicants, providing accessible and open feedback on potential applications and projects. MSGF (E/S/W/NI); PTCR; future programmes    
5 Consider a standardised application process and additional assessment guidance for future programmes, to enable consistent and comparable processes across nations. Future programmes    
6 Discuss required resourcing for delivery of programmes at an earlier stage and agree sufficient budget and resource allocation for stakeholders to deliver programme requirements effectively and efficiently. Future programmes    
7 Share guidance with stakeholders on ‘what good looks like’ with regards to the diversity of panel representation, as set out by relevant sports councils. Encourage regular review and refinement of panel membership to facilitate this. MSGF (DCMS & S/W/NI); future programmes    
8 Where possible, manage expectations around short-term asks, working with stakeholders to prepare common breakdowns and splits of data. Require stakeholders to improve internal reporting and quality assurance processes so that shared data is accurate, timely, and complete. MSGF (DCMS & E/S/W/NI); PTCR; future programmes    
9 Engage early with delivery partners to agree the resourcing, skills and experience needed to deliver internally. Continue the lessons learned sessions and champion knowledge sharing amongst delivery partners to improve delivery across all nations. MSGF (DCMS & E/S/W/NI); PTCR; future programmes
10 Review and streamline data capture, analysis and reporting practices, and consider the platform through which delivery partners and DCMS manage and oversee funding with efficiency and effectiveness of delivery at the core. MSGF (DCMS & E/S/W/NI); PTCR; future programmes    
11 Continue to conduct final assessments and decision-making through panel processes, to ensure a diverse range of views and opinions is taken into account when considering the merits of applications. MSGF (E/S/W/NI); future programmes    
12 Maintain close relationships with beneficiaries of funding through delivery partners and other stakeholders, to support longer-term understanding of impacts and outcomes of funding. MSGF (DCMS & E/S/W/NI); PTCR; future programmes    
13 Establish more consistent and comprehensive post-award assurance with beneficiaries of funding to enable better understanding of the achievement of intended objectives, outcomes, and impacts. MSGF (DCMS & E/S/W/NI); PTCR; future programmes    
14 Review approach to in-year allocations of funding and the ability to finance longer-term, larger projects that may proportionately benefit key under-represented target groups (e.g. women and girls). Future programmes    
15 Improve training and knowledge of the programme team staff in business case processes. Facilitate regular check-ins for staff across teams, particularly for new joiners and those with less experience of DCMS as an organisation. MSGF (DCMS); PTCR; future programmes    
16 Review the way in which technical resource is involved in capital investment programmes, and how to effectively manage and oversee third party procurements where relevant. MSGF (S/W/NI); PTCR; future programmes    
17 Engage earlier with local authorities and local government stakeholders to facilitate early buy-in, and identify potential risks and blockers to project delivery. PTCR; future programmes
18 Upskill and train staff in equality, diversity, and inclusion (EDI) matters to enable them to effectively maximise the impact of these projects by engaging broader user bases. MSGF (DCMS & E/S/W/NI); PTCR; future programmes    

Impact evaluation: progress update and key emerging findings

This section provides a summary of the progress made in the evaluation of the MSGF and PTCR programmes and the quantity of evidence collected up to March 2024.

This section also summarises the key emerging findings based on analysis of the available data with regards to the impacts of the MSGF and PTCR programmes to date. This includes descriptive analysis of the impacts of the programmes on participation, both overall and sustained, as well as the wider outcomes and impacts across local communities, such as mental and physical wellbeing and community cohesion.

Given the current stage of the evaluation and this being an interim report, further causal analysis will be undertaken once additional evidence is collected as part of the subsequent waves of primary data collection.

Multi-Sport Grassroots Facilities programme

Progress update

  • Good facility survey response rates achieved. The facility survey conducted achieved good response rates for both funded (62%) and unfunded (53%) facilities, enabling descriptive comparisons across key participation measures. Funded facilities reported higher overall participation and sustained participation rates than unfunded facilities.
  • Good response rates also achieved to the user survey. The user survey received 2,222 responses across the sample period across all nations, a substantial improvement on previous user survey activity. This was likely assisted by the integration of a prize draw into the surveys to incentivise responses. Responses provided important contextual understanding of participation, accessibility, and physical wellbeing of users at facilities.
  • Useful insights for wider impacts gathered from household survey. The first wave of a panel sample of 5,000 households near funded and unfunded facilities provided important findings for local community and wider impacts, including physical and mental wellbeing, community cohesion, social networks and pride in place.
  • Case studies highlighted some of the positives of the programme but also some of the challenges funded sites experienced. Case studies of 8 facilities in receipt of funding gave depth and insight into the views of facility managers and the real-world impacts of funding, although challenges arose in maximising participation of a wider range of facility stakeholders in this process.

Key emerging findings

  • 82% of funded facilities reported an increase in participation, in both direction and magnitude, since April 2021, compared with 65% of unfunded sites.
    Furthermore, a higher proportion of funded users across each of the nations reported an overall participation increase, with higher multi-sport participation too. This suggests evidence of strong participation at funded facilities. However, we do not yet know whether funding to date has supported participation at sites with existing facilities to a greater degree than those without any facility provision, or if selection of sites for early investment by delivery partners prioritised facilities with an existing user base. These are important points and will be tested through further analysis to understand the additionality of participation and the extent to which funding has impacted new and existing users of facilities. 

  • A higher proportion of funded facilities (50%) reported an increase in sustained participation since April 2021 relative to unfunded sites (39%).
    Additionally, findings from the user survey suggest there has been a significant uplift in sustained participation among users of funded facilities (90%) relative to unfunded facilities (79%). This suggests users at funded sites are increasingly maintaining their use of their local facility over the medium-term. Multiple future waves of data collection will also help assess the degree to which participation has remained sustained over time. 

  • Case studies of funded facilities across all nations surveyed showed significant uplifts in participation, particularly from younger people and women and girls, and presented numerous improvements in wider impacts and outcomes such as ‘pride in place’.
    This suggests the programme is generating positive impacts for under-represented groups and local communities. Facility managers also believed that rising participation will be sustained, and presented numerous examples of how programme funding has improved accessibility for under-represented groups and improved educational and environmental outcomes.

  • Additional future data collection aims to allow the generation of more granular insights regarding participation.
    After one wave of data collection, comprehensive conclusions regarding impacts and outcomes are limited, particularly when analysing breakdowns of participation, primarily due to limited sample sizes available at this stage given funding is still ongoing. Additional data collection plans are in place to inform future evaluation analysis and reports. 

Further detail on the data sources analysed and evidence against evaluation questions is provided in the matrix below (Table 2).

Next steps

  • Further understanding of attribution and causality in future reporting.
    The analysis presented in this interim report is descriptive in nature as outlined above. Future evaluation reports will look to assess whether the programme’s funding has had a statistically significant impact on participation. This is subject to future survey response rates and having sufficient quantity and quality of data across future waves to facilitate the Differences-in-Differences approach, but indications from the first wave of response rates and data collection are positive. 
  • Additional data collection plans are in place.
    The data collected on the impacts of the programme in further waves of quantitative and qualitative data collection will be used to inform future evaluation analysis and reports. Plans for future data collection include measures to reduce respondent attrition and to maximise response rates through incentive mechanisms that encourage participation.

Park Tennis Court Renovation programme

Progress update

  • More data to be shared over summer 2024. In the booking data shared to date, only 1 of 78 facilities has undergone court renovation, and less than 20% of the court bookings provided have occurred since 2022. The LTA expects significantly more data to become available over summer 2024 as courts finalise renovations and weather conditions improve.
  • Case studies qualitatively detailed the impacts of the funding on participation and wider impacts. Qualitative case study fieldwork covered 2 sites, one in Wales and one in England. Both sites reported positive impacts of funding on participation, along with anecdotal evidence of wider impacts. Challenges were encountered, especially in engaging with users of the funded park tennis court sites selected.

Key emerging findings

  • Insufficient data to highlight participation impacts of the programme at this stage.
    Booking data available at this stage contains insufficient observations to descriptively analyse participation impacts of the programme. This is due to grants for court renovations under the PTCR programme only commencing in quarter 2 of 2023, meaning that the required 12-month pre- and post- intervention dataset is not yet available. Causal analysis is not feasible at this stage, and given limitations of booking data, caution should be taken in inferring impacts and outcomes. Future reporting will aim to utilise a larger booking dataset to provide additional descriptive insights and enable causal inference.

  • A focus of future reporting is to cover the programme’s wider impacts.
    There is less readily available information on the wider impacts of the programme at this stage (for example, anecdotal evidence of environmental, educational or health outcomes), but this will be a focus of future evaluation reports and activity. Secondary data sources have provided contextual understanding of tennis participation more generally, through sources such as the Active Lives Survey and the LTA’s participation tracker, but the proportion of respondents participating in park tennis is limited.

Further detail on the data sources analysed and evidence against evaluation questions is provided in the matrix below (Table 3).

Summary tables - emerging findings

The matrices below (Table 2 and Table 3) set out the key data sources used to demonstrate impacts across both programmes, and provide high-level summaries of the emerging findings of each data source against the key evaluation themes related to the evaluation questions in section 3 of the full report.

These matrices show what this evaluation is able to understand about impacts at this stage, but also where data gaps currently exist and where further analysis and activity should focus in order to holistically assess the programmes’ impacts and outcomes.

Table 2: MSGF key emerging findings matrix – early stage impact evaluation

MSGF data source Overall participation Sustained participation Breakdown of participation Local community impacts Other impacts
Facility survey 82% of funded facilities reported an increase in participation, in both direction and magnitude, since April 2021, compared with 65% of unfunded sites.
Artificial grass pitch investments drove the highest reported increase in participation.
A higher proportion of funded facilities (50%) reported an increase in sustained participation since April 2021 relative to unfunded sites (39%). No clear trends yet on differences between funded and unfunded facilities across gender, geography, ethnic minority groups or disabled individuals. Further analysis will be undertaken as sample sizes increase through additional planned data collection. No clear trends yet between funding and its impacts on accessibility, both in terms of access by different groups or sports and operating hours of facilities. The programme aligns with government’s intention to address regional inequalities. Facility managers reported anecdotal evidence of improved environmental outcomes.
User survey User survey findings will not inform causal analysis, but descriptive analysis suggests that a similar proportion of funded users (98%) visit their local facility at least once a month relative to unfunded users (96%). There was a significant uplift in sustained participation among users of funded facilities (90%) relative to unfunded facilities (79%). A higher proportion of funded users across each of the nations reported an overall participation increase, with higher multi-sport participation too. N/A N/A
Household survey Household survey findings will not inform causal analysis, and the sample size of respondents using the facilities was small (<20%) and therefore comparative descriptive analysis was not presented. N/A N/A Households near funded and unfunded sites reported similar levels of wellbeing. Older and wealthier users tend to have better wellbeing and higher levels of life satisfaction. N/A
Case studies Funded sites reported experiencing or expecting to experience significant uplifts in participation. Facility managers suggested participation was expected to be sustained at their site, and that demand was increasing over time. Facility managers across all nations reported anecdotal growth in participation, particularly from younger people, and women and girls. Facility managers presented numerous examples of funding improving ‘pride in place’ in the local community and improved accessibility for under-represented groups. Facility managers gave anecdotal evidence that funding had facilitated improvements in educational and environmental outcomes.
Interviews Interviewees were confident that participation had improved, particularly those ‘closest to the pitch’. Significant uplifts in the women’s and girls’ game were also emphasised. However, further work is needed to understand the additionality of this participation. Mixed views were shared by stakeholders, although most were generally confident that the programme had led to increases in participation that would be sustained over the medium to long term. N/A Benefits to the community through improvements made to local clubs and facilities were anecdotally reported by interviewees across delivery partners as a significant positive of the programme. Improvement of inter-organisational relationships with DCMS, between the delivery partners, and between delivery partners and the local facilities and clubs.
Secondary data sources In the Active Lives Survey 2022-23, participation in football and general activity levels have remained stable over the last 12 months in England. N/A The Active Lives Survey indicates that the regional divide in activity levels is increasing in England. The Active Lives Survey shows no change in the measures of mental wellbeing in the last 12 months. N/A

Table 3: PTCR key findings matrix – early stage impact evaluation

PTCR data source Overall participation Sustained participation Breakdown of participation Local community impacts Other impacts
LTA booking data Available data covers a limited time period, for a limited number of court renovation types.
However, the number of bookings at funded and unfunded sites has increased since the programme began: 40% for funded courts, and a much higher increase for unfunded courts (due to a much lower baseline position). The number of players at funded courts is 300% higher than at unfunded courts, potentially driven by the larger capacity at funded sites.
It is not possible at this stage given data quality to determine indicative emerging impacts and outcomes of the programme. Further data collection is critical to improve this analysis.
Sustained participation in terms of growth in total visitation was highest in 2020. The average number of sustained visits was, however, similar across 2020 to 2023. Further data collection is critical to improve the understanding of these impacts. Bookings at funded sites by region are most concentrated in London, and within the least deprived areas. Unfunded courts registered a higher rate of increase in female bookings than funded courts.
Again, the above characteristics are likely a result of the skewed sample distribution of the data available to date, which limits the ability to compare impacts.
N/A N/A
Case studies Facility managers from case study sites reported significant increases in participation in tennis at the sites, including rapid growth driven significantly by an expanded coaching offer. Participation outcomes are believed by stakeholders to be sustained, although some uncertainty was noted due to poor weather at the sites. Case study activity covered a site in England and a site in Wales, and both reported similar positive impacts. N/A Funding has enabled an increase in coaching capacity and increased usage by local schools.
Interviews Stakeholders reported a positive effect of the programme on achieving increased participation at funded courts, whilst acknowledging the causal link between the programme and overall participation impacts will be more challenging to determine. LTA and DCMS staff were confident that the programme has encouraged both new and existing users to become regular users. Additional analysis is required on a larger dataset to understand this further. N/A N/A N/A
Secondary data sources The latest Active Lives Survey (November 2022-2023) indicates no significant change in the number of people playing tennis or in general physical activity levels over the last 12 months. N/A N/A LTA surveys have found park facilities to be more popular among female participants, therefore the PTCR programme is expected to have a long-term impact in addressing the current gender gap in tennis. N/A

Overall interim conclusions and next steps 

Multi-Sport Grassroots Facilities programme

Overall, as outlined above, additional primary data collection is still required to increase the amount of data available for analysis, primarily through surveys and programme monitoring data, in order to enable robust causal analysis of the impacts of the programme in future evaluation reports.

The findings and conclusions of the evaluation at this stage are limited in how far they can conclusively demonstrate the overall impacts and outcomes of the programme. However, the available evidence does imply positive impacts of the programme on overall participation, sustained participation and participation of under-represented groups. 

There are clear differences between funded facilities and unfunded facilities in comparative descriptive analysis, and this aligns with qualitative reporting from stakeholders on the impacts and benefits of this funding. Overall participation is notably higher when comparing funded and unfunded facilities, and women and girls are likely to be the groups that have benefitted most from this.

There will need to be significant work to understand these impacts in more detail, particularly the additionality of participation and the extent to which users who were not attending prior to investment are now attending facilities in receipt of funding. There will also be a continued focus on sustained participation, and on quantitatively understanding the extent to which funded facilities have been able to retain the participation of existing users above and beyond that of unfunded facilities.

Park Tennis Court Renovation programme

Overall, as outlined above, the key dataset (LTA booking data) that is mainly used to understand participation at funded and unfunded facilities contains insufficient observations at this stage to descriptively analyse participation impacts of the programme.

Whilst some analysis and breakdowns of data have been presented within this report, there is a lack of data available in a number of critical dimensions that will allow assessment of the programme.

This includes:

  • types of project (the current dataset is made up of almost exclusively online booking and gate installation projects, with very few court renovations)
  • a small number of projects since 2022 (only 10% of projects at the 287 courts were completed since 2022)
  • a smaller number of booking observations occurring since 2022 (less than 20% of the provided sample)
  • a limited number of facilities with available data (51 funded sites and 28 unfunded sites)

Case studies and interviews with key programme stakeholders have suggested positive impacts of the programme and effective targeting of courts in need of renovation, which aligns with other assessments conducted by the LTA on tennis participation and activity at facilities. There is less readily available information on wider impacts of the programme at this stage (for example anecdotal evidence of environmental, educational or health outcomes), but this will be a focus of future evaluation reports and activity.

In conclusion, in order to both descriptively and causally analyse participation impacts of the PTCR Programme, additional data is critical, across more project types, more recent time periods and at more courts. The LTA has plans in place to capture additional data and many more sites are expected to offer booking data in the coming months for use in future analysis.

Next steps

Focus of future evaluation activity

Future evaluation activity will primarily focus on improving the availability of data. This larger dataset can then be used to assess the key metrics that help capture and understand the impacts and outcomes of both programmes, with the intention of completing a robust impact evaluation.

This will help inform a causal assessment of the programmes’ impacts on participation and assist in assessing the degree of ‘additionality’ resulting from the funding. Understanding these impacts, including those beyond participation, in greater detail will also be critical for informing the economic evaluation, which will be covered in future reporting, enabling a comprehensive and robust overall assessment.

In addition to the collection of quantitative data across future waves of survey activity, further qualitative data from stakeholders following the completion of both programmes’ delivery schedules will also be collected, including additional case studies and depth interviews.

Finally, activity related to evaluation of the Lionesses Futures Fund will begin, from designing and developing additional primary data collection, through to qualitative data collection and analysis ahead of the next planned interim report. 

Additional evidence and analysis required

Additional primary data collection and secondary data analysis ahead of the next interim report includes:

  • Surveys: a second iteration of facility, user and household surveys will be undertaken. The appropriateness of particular questions and wording, as well as incentives and distribution methods, will be refined and reviewed ahead of distribution.
  • Case studies: a further 8 case studies will be conducted across MSGF and PTCR programmes. 
  • Interviews: further process evaluation interviews will take place with stakeholders from across both programmes, as they near and pass the completion points of delivery.
  • Programme monitoring data: significantly more programme monitoring data is expected to be available ahead of the next evaluation report, particularly for the PTCR programme, and this will heavily inform future impact analysis.
  1. A deprived area is defined as an area that falls within IMD deciles 1-5, according to the English indices of deprivation 2019.

  2. Referred to as ‘Option A’ in the feasibility study: Grassroots Sport Facilities Investment Programme: Impact Evaluation Feasibility Study 

  3. HM Treasury’s Magenta Book provides guidance on what to consider when designing an evaluation.