Consultation outcome

Summary of results and response to the DCMS Social Surveys consultation.

Updated 14 August 2024

1. Introduction

We conducted a consultation to consider options for the future structure of our social surveys [footnote 1], ensuring that meeting user needs and achieving value for money remain at the forefront of any survey design [footnote 2]. We recognise the importance of user research, and of statistics serving the wider public good, for stakeholders and users both inside and outside of government.

The Department for Culture, Media and Sport (DCMS) Social Surveys consultation ran from 29 February 2024 to 23 May 2024, and received 291 responses [footnote 3]. This document summarises the feedback received, and sets out DCMS’s response.

The DCMS Social Surveys consultation covered 5 key sections:

Section 1: This section looked to understand more about the respondents to DCMS’s Social Surveys consultation.

Section 2: This section looked to understand more about how users of the Participation Survey or Community Life Survey engaged with these surveys, and how they intend to use them in the future.

Section 3: This section looked to understand user needs and preferences, particularly in terms of which factors (frequency, geography, content, time series, cross-referencing) were most important to them.

Section 4: This section looked to explore potential changes by the Department for Science, Innovation and Technology (DSIT) to the current digital topic areas of the Participation Survey, and how such changes may impact survey users.

Section 5: This section aimed to gain an understanding of whether there was a desire for UK-wide survey data on the topics covered in the Participation Survey and the Community Life Survey, and to what extent the surveys’ current England-only coverage impacts user needs.

2. Who responded to the consultation?

We received 291 responses to this external consultation, of which 187 were from individuals and 36 from organisations (the remainder skipped this question or finished the survey before answering any questions). Responses were received from a range of organisations [footnote 4], including government departments, DCMS Arm’s Length Bodies (ALBs), private companies, charities/non-profit organisations, academic bodies/institutions, sector or industry bodies, and regional/local authorities.

Respondents came from a variety of sectors.[footnote 5] Of the 137 respondents to this multiple-choice question, the most popular responses were:

  • Community engagement (30%)
  • Wellbeing (27%)
  • Arts (25%)
  • Other (20%): ‘Other’ type-in responses included, for example, ‘Education’, ‘Health’, ‘Community’ and ‘Charity’
  • Museums and Galleries (18%)
  • Volunteering (9%)

The remaining sectors were each selected by fewer than 20 respondents (under 15% of the 137 respondents).

In parallel, we conducted an internal consultation with DCMS sector experts and held further conversations with stakeholders. This published summary and response considers the external consultation questionnaire only. We have considered all evidence, from both internal and external stakeholders, and used it as the basis of our recommendations for the future of these surveys, included at the end of this summary.

3. Overview of consultation results

In total, 38 respondents to the consultation have previously used DCMS social surveys. Of these, 31 had used the Participation Survey and 29 had used the Community Life Survey, with 22 of these respondents having previously used both surveys.

In total, 68 respondents indicated that they plan to use DCMS social surveys in the future. Of these, 60 respondents plan to use the Participation Survey more or the same as they do now, and 63 respondents plan to use the Community Life Survey more or the same as they do now. Therefore, 55 respondents plan to use both surveys more or the same as they currently do.

The consultation focused on five different factors, to understand users’ needs and priorities. These factors were: 

  1. Frequency: how often the survey data is collected and published
  2. Geography: at what geographical granularity the questions are available, for example at the local authority level 
  3. Question content and coverage: whether the current questionnaire topics meet user needs
  4. Comparability with time series: whether data can be reliably compared to past survey results
  5. Cross-referencing between the Participation Survey and Community Life Survey: whether or not users would find it useful to be able to cross-reference between the two surveys

For this consultation summary, we consider all 291 external responses. However, the vast majority of questions were asked only of respondents who indicated that they plan to use at least one of the surveys. All factor questions were asked only of users who plan to use one or both of the surveys in the future, or were not sure about future use (72 unique respondents: 65 selected the Participation Survey, 66 selected the Community Life Survey). These responses form the basis for the majority of the analysis in this summary.

4. User needs and priorities

4.1 Outputs and use

The following questions were asked of respondents who had used the survey in the past or were not sure about past use.

  • Of the outputs that respondents had made use of in the past for the Participation Survey:
    • Of 36 users, the most popular were annual data tables (58%), annual reports (58%), quarterly data tables (39%), and technical reports (33%).
    • Ad-hoc data tables on gov.uk and UK Data Service datasets both received less than 30% usage.
    • Of 35 users, the museums and galleries (63%), arts (57%), heritage (57%) and libraries (37%) chapters were the most popular.
    • The rest of the sections were less popular: major events (26%), live sports (14%) and the digital sections (31%).[footnote 6]
  • Of the outputs that respondents had made use of in the past for the Community Life Survey:
    • Of 34 users, the most popular were gov.uk data tables (53%), gov.uk reports (53%), and technical reports (32%). [footnote 7]
    • Ad-hoc data tables on gov.uk and UK Data Service datasets each received less than 20% usage.
    • Of 34 users, respondents were most interested in the volunteering and charitable giving chapter (68%), followed by engagement and social action (56%), wellbeing and loneliness (50%), neighbourhood and community (38%), and identity and social networks (38%).

4.2 Frequency

The following questions were asked of respondents who plan to use the survey in the future (more, less or the same as now), or don’t know.

  • The highest proportion of respondents for both surveys used survey data a few times a year, as opposed to the more frequent or less frequent options presented.

Table 1: Frequency of survey output use

Base: Participation Survey (36 responses), Community Life Survey (25 responses).

| Survey | More frequent than a few times a year (%) | A few times a year (%) | Less frequent than a few times a year (%) | Don’t know (%) |
| --- | --- | --- | --- | --- |
| Participation Survey | 25 | 33 | 19 | 22 |
| Community Life Survey | 20 | 60 | 20 | 0 |
  • For the Participation Survey, more respondents disagreed than agreed that their needs would be met with data every 3 years; for the Community Life Survey, responses were more evenly split
    • 21% agreed that their needs would be met with data every 3 years for the Participation Survey and 29% for the Community Life Survey
    • 41% disagreed for the Participation Survey and 27% for the Community Life Survey
    • 24% neither agreed nor disagreed for the Participation Survey and 26% for the Community Life Survey. The remainder did not know.
  • For both surveys, respondents were more evenly split on whether their needs would be met with data every 2 years
    • 24% agreed that their needs would be met with data every 2 years for the Participation Survey and 25% for the Community Life Survey
    • 27% disagreed for the Participation Survey and 26% for the Community Life Survey
    • 27% neither agreed nor disagreed for the Participation Survey and 29% for the Community Life Survey
  • For the Community Life Survey, a higher proportion of respondents agreed that their needs would be met if headline figures were published every year, with follow-up questions collected and published every 2 or 3 years. For the Participation Survey, fewer respondents agreed.
    • 19% agreed that their needs would be met with headline figures published every year and follow-up questions every 2 or 3 years for the Participation Survey, and 39% for the Community Life Survey
    • 29% disagreed for the Participation Survey and 20% for the Community Life Survey
    • 22% neither agreed nor disagreed for the Participation Survey and 24% for the Community Life Survey. These findings were similar for both organisations and individual respondents.
    • Examples of “headline figures”[footnote 8] and “follow-up” questions were provided, although respondents may still have interpreted these differently, so the results for this question should be read with that caveat.

Figure 1: Percentage of respondents who agreed or disagreed that their or their organisation’s needs would be met if the Participation Survey data was available at different frequencies.

Base: 62 responses.

Figure 2: Percentage of respondents who agreed or disagreed that their or their organisation’s needs would be met if the Community Life Survey data was available at different frequencies.

Base: 65 responses.

  • Several comments on frequency were made in the free text boxes. Respondents felt that the benefits of more frequent data included:
    • Ability for ongoing benchmarking and trend analysis
    • Timely and informed policy responses, enabling reporting to stakeholders and boards
    • Enhanced reporting and evaluation cycles, and up-to-date data for funding applications
    • Data credibility and relevance
    • Sector specific advantages 
  • Quarterly publications were noted as being important for some Participation Survey respondents, whereas annual data seemed to be a priority for some Community Life Survey respondents.

4.3 Geographical granularity

The following questions were asked of respondents who plan to use the survey in the future (more, less or the same as now), or don’t know.

  • Respondents had previously used the survey data either at the most granular level available or at the least granular: the England/National level. For the Participation Survey, the most granular level available is ITL2 (county level). For the Community Life Survey, it is ITL1 (regional) level. [footnote 9]

Figure 3: Proportion of respondents using different geographical granularities in the past for the Participation Survey.

Base: 34 respondents.

Figure 4: Proportion of respondents using different geographical granularities in the past for the Community Life Survey.

Base: 31 respondents.

  • Of 83 respondents, most plan to use data at local authority level (39%), once available for both surveys, or at England/National level (18%).

  • Of 63 respondents who had previously used or planned to use the Participation Survey in the future:

    • Only 3% of respondents disagreed that their needs would be met if the Participation Survey was only available at local authority level.
    • 30% disagreed that their needs would be met if the Participation Survey was available at ITL2.
    • 35% disagreed that their needs would be met if the Participation Survey was available at ITL1.
    • 39% disagreed that their needs would be met if the Participation Survey was available at national level.

Figure 5: Percentage of respondents who agreed or disagreed that their or their organisation’s needs would be met if the Participation Survey data was available at different geographical granularities.

Base: 60 responses.

  • Of the 59 respondents who had previously used or planned to use the Community Life Survey in the future:
    • No respondents disagreed that their needs would be met if the Community Life Survey was only available at local authority level.
    • 29% of respondents disagreed that their needs would be met if the Community Life Survey was available at ITL2.
    • 38% disagreed that their needs would be met if the Community Life Survey was available at ITL1.
    • 43% disagreed that their needs would be met if the Community Life Survey was available at national level.

Figure 6: Proportion of respondents who agreed or disagreed that their or their organisation’s needs would be met if the Community Life Survey data was available at different geographical granularities.

Base: 62 responses.

Headline data at differing geographical levels

  • 24% of Participation Survey respondents and 22% of Community Life Survey respondents disagreed that having headline data at local authority level, with the remainder at regional (ITL1) level, would meet their needs.

Figure 7: Proportion of respondents who agree, neither agree nor disagree, or disagree that their or their organisation’s needs would be met if the Participation and Community Life Survey data was available at local authority level for headline data and ITL1 for follow-up questions. [footnote 10]

Base: Participation Survey (63 responses), Community Life Survey (59 responses).

  • Several comments on geography were made in the free text boxes. The benefits of more granular geographical data included:
    • Local detail and use: local authority level data is crucial for certain organisations to help them understand specific community needs
    • Improved policy making and resource allocation: supports robust policy making and effective targeting of funding/investments, and effectiveness of interventions
    • Strategic planning and benchmarking: crucial for place-based programmes such as levelling up initiatives
    • Local insights
    • Effective planning and outcomes for policy and strategic decisions
    • Lower burden of local data collection for organisations and local authorities

4.4 Survey content

  • For both surveys, respondents reported that the current content generally meets their needs:
    • 53% of Participation Survey respondents and 54% of Community Life Survey respondents agreed or strongly agreed that the current content meets their needs.
    • 30% of Participation Survey respondents and 32% of Community Life Survey respondents neither agreed nor disagreed.
    • 6% of Participation Survey respondents and 6% of Community Life Survey respondents disagreed or strongly disagreed.

Figure 8: Proportion of respondents whose needs would or would not be met if the questionnaire content remained the same as it is now.

Base: Participation Survey (64 responses), Community Life Survey (63 responses).

  • Some respondents suggested additional topics to be included in the survey, such as health, more detailed volunteering questions, and children and young people (for example, including under-16s, as the Taking Part Survey did). More specific suggestions from individual responses included additional socioeconomic demographics, additional arts options, more detailed museum activities, understanding barriers in more detail, more detailed individual crafting activities, health impacts of participating, more social welfare and caring responsibility questions, gambling, and more specific area-based questions.
  • The questions that respondents rely on heavily were specific to their area of interest. For example, if a user was interested in heritage, they were likely to list most of the heritage questions. For the Community Life Survey, many respondents listed volunteering and charitable giving as very important to them. For the Participation Survey, some respondents stated that, across the various sectors, the questions on types of activity, frequency of engagement, and voluntary/free time activity were important to them.
  • Of the new questions added to the 2023/24 Participation Survey (89 respondents):
    • Pride in Place questions are the most likely to be used (75% of respondents said they would definitely or possibly use these questions).
    • 73% of respondents would definitely or possibly use questions on the environment.
    • 71% of respondents would definitely or possibly use questions on social prescribing.
    • 67% of respondents would definitely or possibly use further questions on arts and culture engagement.
  • Pride in Place and Life chances questions were added to the Community Life Survey for 2023/24 and 2024/25; 78% of respondents said they would definitely or possibly use them, while a further 17% did not know.

  • When asked if respondents currently use ACORN (a geodemographic segmentation of residential neighbourhoods):
    • Of 84 Participation Survey respondents, 13% currently use ACORN, whilst 30% did not know.
    • Of 81 Community Life Survey respondents, 26% currently use ACORN, whilst 26% did not know.
    • For those who currently use ACORN, the impacts of removing it included reduced awareness of demographics and profiles in communities, and making targeting and allocation of resources harder. Using an alternative segmentation measure was raised as an option if ACORN were removed, but respondents noted that an element of extra analysis would be missing and that IMD (the Index of Multiple Deprivation) is not the same as ACORN.

4.5 Time series

The following questions were asked of respondents who plan to use the survey in the future (more, less or the same as now), or don’t know.

  • For the Participation Survey, respondents generally agreed that an uninterrupted time series is essential to them (59% agreed, 28% neither agreed nor disagreed, and 2% disagreed).
  • For the Community Life Survey, more respondents agreed than disagreed that an uninterrupted time series is essential to them (24% agreed, 31% neither agreed nor disagreed, and 4% disagreed).
  • The benefits of having consistent time series listed by respondents in the free text box included:
    • Monitoring trends: Enabling organisations to monitor changes, and allowing for legacy and impacts of local and broader programmes/strategies
    • Evaluating policy effectiveness: facilitates assessment of whether observed trends are due to national trends or specific local actions
    • Adapting to community needs
    • Academic and sector research
  • The drawbacks of not having a consistent time series listed by respondents in the free text box included:
    • Difficulty in monitoring and evaluating programmes and policies: the Taking Part Survey has already had an interruption in data collection, which hinders the effectiveness of policy
    • Less effective resource allocation and funding

Figure 9: Proportion of respondents who agree, neither agree nor disagree, or disagree that an uninterrupted time series for the Participation Survey and Community Life Survey is essential to them or their organisation.

Base: Participation Survey (58 responses), Community Life Survey (58 responses).

4.6 Cross-referencing the Participation Survey with the Community Life Survey

The following questions were asked of respondents who plan to use the survey in the future (more, less or the same as now), or don’t know.

  • The majority (58%) of Participation Survey respondents strongly agreed or agreed that their needs would be better met if they could cross-reference data across the two surveys. A further 28% of respondents neither agreed nor disagreed.
  • The majority (62%) of Community Life Survey respondents strongly agreed or agreed that their needs would be better met if they could cross-reference data across the two surveys. A further 27% of respondents neither agreed nor disagreed.
  • The benefits of cross-referencing provided by respondents in the free text box included linking up the topics (for example, being able to produce cross-tabulations of participation rates with community value and volunteering), more accurately meeting community needs, and supporting programme evaluations.

Figure 10: Proportion of respondents who agree, neither agree nor disagree, or disagree that their or their organisation’s needs would be better met if they could cross-reference across the Participation Survey and Community Life Survey.

Base: Participation Survey (57 responses), Community Life Survey (55 responses).

4.7 Merging the Participation Survey and Community Life Survey

The following questions were asked of respondents who have used one or both of the surveys in the past, plan to use one or both of the surveys in the future (more, less or the same as now), or don’t know.

  • When asked about the impact of the surveys merging, 19% of 64 respondents said this would be strongly beneficial, 23% slightly beneficial, 19% no impact, 3% slightly detrimental and 6% strongly detrimental. However, the largest proportion, 30%, did not know.
  • The main benefits of combining the surveys were listed in the free text box by respondents as:
    • Enhanced data visibility and accessibility
    • Streamlined and improved survey quality
    • Improved insights, for example between participation in cultural activities and community engagement. In particular, it could be interesting for organisations that rely on volunteer engagement and want to understand the broader impacts of participation
    • Support for policy and strategy development by creating a holistic view
  • The main concerns about combining the surveys were listed in the free text box by respondents as:
    • Loss of specificity and granularity leading to a decrease of the survey’s usefulness
    • The survey becoming too long, unless it was restricted in length, which would most likely mean losing important questions
    • Impact on trend analysis
    • Coherence issues: a combined survey covering many different topics may confuse participants, with impacts such as a lower response rate

Figure 11: Impact of merging the Participation Survey and Community Life Survey on respondents

Base: 64 responses.

4.8 Priorities

The following questions were asked of respondents who plan to use the survey in the future (more, less or the same as now), or don’t know.

  • Respondents were asked to prioritise the five main factors (1. Questions at local authority level, 2. Survey coverage and content meeting needs, 3. Questions being asked every year, 4. Being able to compare the current year’s data with previous years’ data, 5. Cross-referencing across the surveys):
    • For the Participation Survey, respondents prioritised survey coverage and content meeting their needs and geographical granularity (questions at local authority level), closely followed by questions being asked every year. Being able to compare the data in a time series and cross-referencing across the surveys were not prioritised by many respondents.
    • From respondents’ explanations of their rankings, we can infer that they mainly prefer maintaining high-quality, relevant survey questions and ensuring local-level data, followed by the importance of annual data collection. Combining the surveys and advanced analytical capabilities were secondary to the need for comprehensive, detailed and timely data that can effectively support organisational decision-making and policy-making.

Figure 12: Participation Survey respondents’ prioritisation of factors average score (1 = highest priority, 5 = lowest).

Average score for each factor = the sum, across ranking positions 1 to 5, of the position number multiplied by the number of respondents who ranked the factor at that position, divided by the total number of respondents (48). This is expressed as a formula below.

Base: 48 responses.
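For clarity, the calculation described above can be written as a formula, where n_{f,p} is the number of respondents who ranked factor f at ranking position p, and N is the total number of respondents:

$$\bar{s}_f = \frac{\sum_{p=1}^{5} p \times n_{f,p}}{N}$$

As a worked illustration using hypothetical figures, if a factor were ranked 1st by 20 respondents, 2nd by 10, 3rd by 8, 4th by 6 and 5th by 4 (N = 48), its average score would be (1×20 + 2×10 + 3×8 + 4×6 + 5×4) / 48 = 108 / 48 = 2.25. A lower average score indicates a higher priority.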

  • For the Community Life Survey, the highest priorities reported were survey coverage and content and geographical granularity (local authority level), closely followed by questions being asked every year. As with the Participation Survey, the lowest priority was the ability to cross-reference across the Participation Survey and Community Life Survey.
  • The explanations for ranking this way were very similar to the explanations for the Participation Survey. 

Figure 13: Community Life Survey respondents’ prioritisation of factors average score (1 = highest priority, 5 = lowest).

Average score for each factor calculated as for Figure 12 (see the formula above), with total respondents N = 49.

Base: 49 responses.

  • Whilst overall the factors were prioritised in the same order for both surveys, there were some differences in the explanations given for the rankings:
  1. Coverage and content: Respondents of both surveys prioritised survey content meeting their needs. However, there was a slight indication that Community Life Survey respondents’ content needs aligned with broader metrics, such as social cohesion in general, and were less specific than Participation Survey data needs, which were more likely to relate to particular interests, for example, crafts or treasure.
  2. Geographical granularity of data: local authority level data was important for users of both surveys, but the Community Life Survey respondents highlighted this importance in free text boxes more, as a reason for their ranking. Some users outlined the importance of understanding social and community cohesion trends at a local level.
  3. Frequency of data collection: Participation Survey respondents emphasised the critical nature of frequency; whilst frequency was also important for Community Life Survey respondents, there was some flexibility in how often the questions were asked, provided depth and relevance were maintained.
  4. Time series: Maintaining consistency and methodology seemed slightly more important to Participation Survey respondents than Community Life Survey respondents.
  5. Cross-referencing between the surveys: Community Life Survey respondents seemed slightly more open to enabling cross analysis between both surveys to provide a more holistic view of community life, whereas Participation Survey respondents were more concerned about specificity and granularity of cultural participation being affected if a merge was to happen.

5. Digital sectors

To assess the impact of removing the digital questions, and what improvements could be made to the digital sections, we asked users of the digital sectors some specific questions. Fewer than 3% of respondents to this consultation provided a view here, so we are unable to report any findings robustly.

6. UK-wide survey data

The following question was asked of all respondents.

  • 45% of Participation Survey respondents are impacted by not having a UK-wide Participation Survey, 26% are not impacted, and 30% do not know if they are impacted.
  • 47% of Community Life Survey respondents are impacted by not having a UK-wide Community Life Survey, 26% are not impacted, and 27% do not know if they are impacted.

Figure 14: Percentage of Participation Survey and Community Life Survey respondents who felt that they or their organisation are impacted by the lack of a UK-wide survey. Responses range from strongly impacted and slightly impacted to not impacted and don’t know.

Base: Participation Survey (61 responses), Community Life Survey (62 responses).

  • Some respondents felt that UK-wide data would help meet some or more of their needs, for example in understanding how organisations work across the UK, especially since detailed sources are not currently as readily available for the Devolved Administrations. However, having this data was not a necessity for the majority of respondents.
  • Further comments around the lack of UK-wide survey data, for both surveys, included it being difficult to perform comprehensive comparisons for organisations that work across all four nations. This was noted as leading to impacts such as limiting insights, sharing of best practices, evaluation of policy impacts, and potentially perpetuating false hierarchies.
    • Specific to the Participation Survey, respondents reported impacts on their ability to understand participation in niche activities, and inconsistency across the nations affects their ability to identify areas of need and support drivers for cultural work.
    • Specific to the Community Life Survey, respondents commented on the England-centric focus that leads to inaccuracies when applied to a UK context. The lack of UK-wide data makes it difficult to evaluate the influence of different policy environments in the devolved administrations. 
  • When asked if respondents would like more signposting to similar data sources for the devolved administrations, more said they would: 
    • Of 48 Participation Survey respondents, 65% said they would, and 35% said they wouldn’t.
    • Of 52 Community Life Survey respondents, 63% said they would, and 37% said they wouldn’t.
  • When asked if respondents had used any Devolved Administration surveys in the past, of the 25 who filled in the text box, 28% indicated use and 20% indicated that they would like to. Some organisations, whose members span all four nations, regularly use the surveys, finding them essential for comparison purposes. Others have not used the surveys, but recognise their potential usefulness for comparative analyses. Several respondents highlighted barriers including:
    • Inconsistencies in question phrasing, making comparisons challenging
    • Reliance on partner organisations for such data
    • Lack of awareness of what is available
    • A perception that the surveys are not relevant to their work.

7. DCMS response

The responses to the external consultation reflected a variety of user needs, with some consistent trends and priorities. Along with the internal consultation results and comments from other stakeholders, such as DCMS’s Arm’s Length Bodies (ALBs) and public bodies, we have been considering the future of the two surveys.

Based on the results of this consultation, we understand that the current frequency (annual), geography (local authority level data every few years), content (current questionnaire topics) and time series (questions consistent with past years) are all working for users and generally meeting their needs. We are therefore aware of the importance of not making significant changes to the user experience of the survey outputs and data.

We will continue to review and make adjustments to our processes, but any changes that affect the user experience will be communicated to users via gov.uk. Any adjustments will not affect the 2024/25 surveys.

One potential change explored through this consultation was a merger of our two social surveys, to support coherence and drive value. Respondents were largely positive about this idea, but did not prioritise it above the other factors: comparability, granularity, frequency, and content. We will therefore continue to explore and refine options for a potential merger, whilst being mindful of any implications for the other factors.

While we make these decisions, and in the future, we will aim to review our publications regularly, and we are always interested in feedback from respondents. To comment on any of our outputs, please contact us at evidence@dcms.gov.uk. To comment on the specific surveys mentioned in this consultation, please contact us at participationsurvey@dcms.gov.uk or communitylifesurvey@dcms.gov.uk.

8. Annex A: Glossary

| Term | Definition |
| --- | --- |
| LA level | Local authority (LA) level: 317 LA districts in England, for example Kent County Council or the London Borough of Camden. |
| ITL2 | 33 counties and groups of counties in England, for example East Yorkshire and Northern Lincolnshire, or Outer London. |
| ITL1 | 9 regions in England, for example North West England, or London. |
| ACORN | A geodemographic segmentation measure of the UK’s population. |
  1. DCMS’s Social Surveys are the Participation Survey and the Community Life Survey. The Participation Survey collects data on adult (16+) engagement in the cultural sectors, major events, sport and gambling, and the digital sectors (until 2025/26). The Community Life Survey collects data on community engagement, volunteering and social cohesion from adults (16+) in England.

  2. Any changes would not impact the 2023/24 or 2024/25 Participation Survey or Community Life Survey. 

  3. 291 includes non-usable responses, for example where only a few questions had been answered at the start. 

  4. Five of the organisations chose not to name their organisation. 

  5. We interpret this as the area respondents work within or are interested in. 

  6. The digital sections were asked as separate categories, and each received between 6% and 23%.

  7. The Community Life Survey had not released quarterly publications at the time the consultation started, so this option was not in the list.

  8. In the survey, the examples of headline and more detailed data given were as follows. Headline data: Participation Survey: percentage of people who visited a museum or gallery; Community Life Survey: percentage of people who formally volunteered. Follow-up / more detailed questions: Participation Survey: frequency of engagement; Community Life Survey: sector volunteered in.

  9. Local authority level data is being made available after the consultation period (for the Participation Survey in July 2024 and the Community Life Survey in September 2024).

  10. The option ‘I do not use or plan to use this survey’ was provided because this question (and similar ones) was asked of users of either or both surveys, so respondents may use only one.