Consultation on the future of the Taking Part Survey: summary of responses
Updated 1 September 2020
1. Respondents to the survey
Annex A provides summary details of respondents to the consultation. Responses were received from interested individuals, organisations and some DCMS Arm’s Length Bodies. Internal DCMS teams were not encouraged to respond to the consultation (as their views are being captured via alternative engagement routes) and therefore their views are not represented in the data.
2. Use of Taking Part survey outputs
Around two-fifths (42%) of respondents had not used any Taking Part outputs. Respondents who had used Taking Part outputs were most likely to have used the published reports and data tables (74%), with fewer making use of ad hoc data publications (26%), archived cross-sectional data (24%) or archived longitudinal or panel data (11%). It is, however, clear that some users rely on the archived data alone.
Over three-quarters of respondents who were data users did not provide detail on which survey measures they found most useful or what the data was used to inform. Where data users did provide feedback, their interests were clearly mixed: some respondents were interested in levels of participation in (or engagement with) a particular activity, whereas others had a much broader interest. Respondents were also interested in understanding differences in participation between sections of the community (e.g. by socio-demographics or geographical region) and in drawing comparisons between different DCMS sectors and the populations that did or did not engage.
Respondents cited a number of reasons for using the data. These included:
- To analyse the relationship between participation/engagement and socio-demographic characteristics
- To identify barriers to engagement
- To understand sector performance and monitoring trends
- To improve understanding/academic research
- To identify the need for intervention
- For regional benchmarking
- To evidence funding proposals and resource prioritisation
- To inform work on wellbeing, in particular for young people
3. Gaps in the survey coverage
Around 12% of respondents provided thoughts on where they feel the survey has gaps in coverage that, if filled, would help address their evidence needs. Where gaps were identified, respondents were keen to use additional findings to: increase their understanding of their own evidence base in relation to the scope of their organisation (including to make the case for funding); benchmark or baseline other analytical activity, e.g. evaluations; explore any links between engagement in DCMS sectors and pro-social behaviours; and fill wider gaps in the evidence base to enable prioritisation of resources. Gaps identified in responses included:
Survey content
- A review into what is classed as cultural participation and the survey amended to reflect findings
- Inclusion of motivation, benefits sought and barriers to engagement at a more granular activity level
- Capture of engagement in culture via community based assets
- More granular identity questions e.g. religion, neighbourhood attachment, voting preferences
- Participation in Arts/Culture beyond/after ‘attendance’ at an event
- Appropriate youth wellbeing and loneliness measures
- Participation in key cultural events
- Inclusion of community engagement and civic activities
- Expansion of coverage of engagement with archives, including with primary-aged children and outside of archive premises (e.g. online)
Geographical granularity
- More granular data at a local level e.g. Local Authority
- Regional data on digital engagement
Analysis
- Further analysis of participation and engagement by socio-demographic characteristics
- Further analysis according to frequency of engagement
4. Impact of changes to methodology
Although no changes to methodology were proposed, respondents were asked what impact a change to the mode of the survey, a change to the frequency or regularity of the survey, or discontinuation of the survey would have on their evidence needs. Approximately 12% of respondents provided views on each in turn.
4.1 Impact of a change in survey mode
Many of the respondents who gave views could see value in considering telephone or online approaches to conducting the survey, noting successes in other surveys which had moved from face-to-face to alternative modes. Some respondents were particularly in favour of a change in survey mode provided it was managed carefully, with a clear assessment of feasibility and dual-mode running during the transition, and allowed for a robust expansion of the sample size to provide more localised data.
Respondents raised concerns around digital exclusion and the need to ensure that the survey remained representative of the population and an accurate measure of participation in sectors, noting that some measures (e.g. digital engagement in culture) would be affected by an online-only survey.
Where given, views on the appropriateness of an alternative mode for collecting data from young people were mixed, with a suggestion that an online mode may benefit youth participation but hinder data collection from younger children. Although survey continuity was mentioned in responses, on the whole it did not present as a key concern across respondents.
4.2 Impact of a change in survey frequency or regularity
Where respondents gave views, the consensus was that changing the continuous nature of the survey, or reducing the frequency of reporting to less often than every 12 months, would limit the usefulness of the survey for measuring participation in key events and in areas where ‘what’s on offer’ can change quickly, e.g. the digital and technology space.
However, some respondents did note that there may be merit in exploring a less frequent survey if this allowed a greater sample size of adults and young people, enabling more granular breakdowns of the data.
4.3 Impact of discontinuation of the survey
On the whole, where views were given, respondents expressed concern at the prospect of the survey being discontinued. Reasons given included the absence of other data to fill the evidence gaps that the survey addresses and, where alternative data sources (e.g. other surveys) do exist, that discontinuation of Taking Part would prevent analysis alongside participation in other DCMS sectors.
Some respondents did note that the survey, in its current form, did not meet all of their evidence needs, highlighting in particular the need for better measures of passive engagement with arts and culture.
5. Next steps
As mentioned above, due to the emergence of the Covid-19 pandemic during the consultation period and its effect on potential respondents’ capacity to respond, DCMS does not intend to take any immediate action in response to the views captured during this consultation alone.
Findings will be fed into a broader strategic survey review. Any future proposals for changes to the Taking Part survey will be consulted upon in line with the requirements for Official and National Statistics.
6. Annex A
6.1 Consultation respondents
120 consultation responses were included in the analysis: 19 responses from organisations and 101 responses from individuals. Around two-fifths (42%) of respondents had not used any Taking Part outputs to date, and few of these respondents went on to identify evidence needs which they felt Taking Part was missing.
Of the 19 responses ‘on behalf of an organisation’, it is suspected that there is a small amount of duplication, with 2 organisations appearing more than once in the data. These responses are not identical and therefore all responses have been treated as valid.
Organisations were asked to select the sector(s) in which they work from a pre-defined list. Responses were received from organisations working in most DCMS sectors, with the exception of tourism. However, due to the sample size it was not possible to break down any findings by sector.