Taking Part web panel COVID-19 Survey Technical Note
Updated 16 September 2020
1. Background
Ipsos MORI was commissioned to carry out three surveys on the Taking Part web panel for the Department for Digital, Culture, Media and Sport. The surveys collected information about respondents’ participation in leisure, cultural and sporting activities, and acquisition of digital skills during the COVID-19 coronavirus pandemic in England. The surveys were launched in the first weeks of May, June, and July 2020, and all three used the same sampling methods and data collection instruments.
For more general information about the Taking Part web panel, see the Technical Report for the Taking Part web panel (2016-19).
2. Sample
The samples for the three surveys were drawn from the Taking Part web panel. Panellists were recruited from the Taking Part face-to-face cross-sectional survey, which uses a random probability sampling methodology. For further details of sampling for the face-to-face survey and recruitment of the panel, see the Technical Report for the Taking Part web panel (2016-19).
Panel members were eligible to be sampled if they met two conditions on 5th May 2020, when the first sample was drawn:
(i) they had completed registration for the panel (which requires a valid email address)
(ii) their face-to-face interview had taken place within the last two years.
A random sample of 1,698 panellists was drawn independently for each survey. Sample sizes were set with the aim of achieving approximately 1,000 complete interviews for each survey, based on a predicted response rate of 59%. The predicted response rate was derived from responses to previous surveys on the web panel within a month of launch.
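The arithmetic behind these figures can be checked with a short illustrative sketch, applying the 59% predicted response rate to the target of 1,000 completes and to the 1,698 panellists actually drawn:

```python
import math

# Worked check of the sample-size figures quoted above (illustrative only).
target_completes = 1000          # target complete interviews per survey
predicted_response_rate = 0.59   # predicted from earlier panel surveys

# Minimum sample needed to expect ~1,000 completes at a 59% response rate
required_sample = math.ceil(target_completes / predicted_response_rate)

# Completes expected from the 1,698 panellists actually drawn
expected_completes = round(1698 * predicted_response_rate)
```

The required sample works out at 1,695, so the 1,698 drawn gives a small margin and an expectation of roughly 1,002 completes.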
The samples were stratified by age/gender and region. Age and gender were derived from Taking Part face-to-face survey data. Age was incremented based on how long ago the face-to-face interview took place and was grouped into bands (16 to 24, 25 to 34, 35 to 44, 45 to 54, 55 to 64, 65 to 74, and 75 or older). The grouped age measure was then divided into male and female to give 14 age and gender groups in total. Region was derived from the postcodes held (and regularly updated) within the web panel database – these postcodes were matched to the ONS National Statistics Postcode Directory.
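The derivation of the 14 strata from age and gender can be sketched as follows; the function and band names are hypothetical and not taken from the study scripts:

```python
# Illustrative derivation of the 14 age/gender strata described above.
# Seven age bands crossed with two genders gives 14 groups.
AGE_BANDS = [(16, 24), (25, 34), (35, 44), (45, 54), (55, 64), (65, 74), (75, None)]

def age_gender_group(age, gender):
    """Return a stratum label such as '25-34 female' for an eligible adult."""
    for lo, hi in AGE_BANDS:
        if age >= lo and (hi is None or age <= hi):
            band = f"{lo}-{hi}" if hi is not None else f"{lo}+"
            return f"{band} {gender}"
    return None  # under 16: not part of the eligible panel
```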
The sizes of the strata were initially set to match the corresponding population proportions in the mid-year 2019 ONS population estimates. These were then adjusted where there were not enough eligible panellists for a given combination of age/gender and region.
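The allocation step described above (stratum targets proportional to population shares, adjusted where too few panellists were available) might be sketched as below; this is a generic illustration, not the allocation code used for the study, and all names are hypothetical:

```python
# Sketch of stratum allocation: targets proportional to population shares,
# capped at the eligible panellists available, with any shortfall
# redistributed one panellist at a time to strata with spare capacity.
def allocate(total_n, pop_props, available):
    targets = {s: round(total_n * p) for s, p in pop_props.items()}
    alloc = {s: min(targets[s], available[s]) for s in targets}
    shortfall = total_n - sum(alloc.values())
    while shortfall > 0:
        spare = [s for s in alloc if available[s] > alloc[s]]
        if not spare:
            break  # not enough eligible panellists overall
        for s in spare:
            if shortfall == 0:
                break
            alloc[s] += 1
            shortfall -= 1
    return alloc
```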
3. Data collection instrument
The same data collection instrument was used for each of the three surveys. The questionnaire was designed by DCMS and scripted in line with best practice on questionnaire design for online surveys; it was device-agnostic and intended to take around 15 minutes to complete. Longer lists were randomised and, where used, grids were displayed vertically on mobile devices. Script checking and testing was undertaken on both PCs and mobile devices.
The questionnaire covered four topic areas, which are summarised below. See the questionnaire documentation for further details.
3.1 Participation in cultural and sporting activities, charitable giving, and voluntary work
This section asked about things respondents might have done while Government restrictions on movement and gatherings were in place in England following the outbreak of the COVID-19 coronavirus. Follow-up questions probed for changes in behaviour since restrictions were introduced and anticipated behaviour changes in the future.
3.2 Skills, news, and the internet
This section asked about any skills that the respondent may have developed recently, with follow-up questions probing how and why. It also asked about engagement with news about the pandemic and the respondent’s satisfaction with their access to the internet.
3.3 Subjective wellbeing
This section comprised eight standardised personal well-being questions developed by the Office for National Statistics.
3.4 Background / demographics
This section asked about the impact of the pandemic on the respondent’s health and employment status, as well as collecting demographic information for use in developing the weights.
4. Fieldwork
Sampled panellists were invited to take part by email, after which the survey link remained open for 15 days. After five and then ten days, sample members who had not yet completed the survey were sent reminder emails. Emails were addressed to panellists by name, and the subject lines and contents varied with each email to encourage participation.
Panellists were offered a conditional incentive of 250 points (equivalent to £2.50) for completing the survey. This is the standard incentive for completing questionnaires on the Taking Part web panel, as described in the Technical Report for the Taking Part web panel (2016-19).
Fieldwork took place over the periods 6-21 May 2020 (Survey 1), 3-18 June 2020 (Survey 2), and 1-16 July 2020 (Survey 3). Response to each survey is summarised in Table 1.
Some respondents completed the survey across multiple waves, with 1,456 completing one wave, 550 completing two waves, and 169 completing all three waves.
Table 1. A summary of response to each survey.
| | Survey 1 (May 2020) | Survey 2 (June 2020) | Survey 3 (July 2020) |
|---|---|---|---|
| Sampled | 1,698 | 1,698 | 1,698 |
| Not invited* | 5 | 14 | 15 |
| % of sampled not invited | 0.3% | 0.8% | 0.9% |
| Invited | 1,693 | 1,684 | 1,683 |
| % of sampled invited | 99.7% | 99.2% | 99.1% |
| Productive | 1,052 | 1,035 | 976 |
| % productive of invited | 62.1% | 61.5% | 58.0% |
| Unproductive | 641 | 649 | 707 |
| Not started | 596 | 610 | 680 |
| Broken off | 45 | 39 | 27 |
| % unproductive of invited | 37.9% | 38.5% | 42.0% |
*Panellists may not be invited for one of two main reasons: either the invitation email bounced back, or the panellist was withdrawn from the sample before fieldwork started because they opted out of the panel or indicated that they could not complete any online surveys in the near future.
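The response percentages in Table 1 can be recomputed directly from the counts; the short check below uses only figures quoted in the table:

```python
# Recomputation of the '% productive of invited' row in Table 1.
surveys = {
    "Survey 1": {"sampled": 1698, "not_invited": 5,  "productive": 1052},
    "Survey 2": {"sampled": 1698, "not_invited": 14, "productive": 1035},
    "Survey 3": {"sampled": 1698, "not_invited": 15, "productive": 976},
}

rates = {}
for name, s in surveys.items():
    invited = s["sampled"] - s["not_invited"]       # e.g. 1698 - 5 = 1693
    rates[name] = round(100 * s["productive"] / invited, 1)
```

This reproduces the published rates of 62.1%, 61.5%, and 58.0%.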
5. Data processing and weighting
A separate SPSS dataset was produced for each survey. Each contained a respondent serial number and completion status variable, answers to all survey questions, paradata collected by the instrument (e.g., information about the respondent's device), and relevant sampling variables and weights. The data underwent a series of checks, cleaning, and quality assurance procedures, including consistency checks of variable names and labels against the questionnaire documentation and routing checks.
Calibration weighting was carried out for each survey so that the weighted profiles for key measures matched those of the population where available (i.e., for age/gender and region) or the estimates from the last two years of the cross-sectional survey combined (Years 14 and 15, covering April 2018 to March 2020)[footnote 1].
The age and gender groups were derived from survey responses so that they were up to date. For the few participants who had missing data, responses from the original cross-sectional survey were used instead. Age was then grouped and split by gender as described in the Sample section. Region had already been derived as part of the sampling process, so was not missing in any cases. The corresponding population counts for the age and gender groups and region were taken from the mid-year 2019 ONS population estimates.
In order to obtain the cross-sectional Taking Part estimates for Years 14 and 15, the two years of the survey were merged and weighted national estimates were obtained for: ethnicity (white or not); tenure (owned outright, buying with a mortgage, other); number of adults in the household (1, 2, 3 or more); and number of children in the household (0, 1, 2 or more). When carrying out calibration weighting, it is crucial that the individual-level measures are fully consistent with the target estimates, including being collected using the same approach and at a similar time point. To ensure this was the case, the cross-sectional measures were merged into the dataset for the web panel participants and used for the calibration weighting, rather than the equivalent measures collected in the web panel itself.
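Calibration weighting of this kind is commonly implemented by raking (iterative proportional fitting), which repeatedly scales the weights so that each weighted margin matches its target. The sketch below is a generic illustration under that assumption, not the production weighting code used for these surveys:

```python
import numpy as np

# Minimal raking (iterative proportional fitting) sketch: adjust respondent
# weights until weighted category proportions match target proportions for
# every calibration variable (e.g. age/gender, region, tenure).
def rake(categories, targets, max_iter=100, tol=1e-10):
    """categories: dict var -> array of category codes, one per respondent.
    targets: dict var -> dict code -> target population proportion."""
    n = len(next(iter(categories.values())))
    w = np.ones(n)
    for _ in range(max_iter):
        before = w.copy()
        for var, codes in categories.items():
            total = w.sum()
            for code, target_prop in targets[var].items():
                mask = codes == code
                current = w[mask].sum()
                if current > 0:
                    # Scale this category so its weighted share hits the target
                    w[mask] *= target_prop * total / current
        if np.abs(w - before).max() < tol:
            break
    return w * n / w.sum()  # rescale so the mean weight is 1
```

Each pass matches one variable's margin exactly while slightly disturbing the others, so the loop iterates to convergence.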
1. Because of the eligibility criteria laid out in the Sample section, all eligible panellists, and therefore all sample members, had completed their face-to-face interviews in one of these years.