Official Statistics

Tri-service reserves continuous attitude survey 2024: background quality report

Published 18 July 2024

Contact

The Responsible Statistician for the Reserves Continuous Attitude Survey (ResCAS1) is the Head of the Analysis Directorate Surveys Team. Email: Analysis-Surveys-Enquiries@mod.gov.uk.

Introduction

2.1 Tri-Service Reserves Continuous Attitude Survey (ResCAS).

The 2024 ResCAS is the eleventh time Tri-Service Reserves questions have been included in the single Service Reserves surveys to produce Tri-Service Reserves results. However, following substantial changes to the Army Reserve survey questionnaire distribution methodology and changes to the Army and RAF target populations, the 2014 Tri-Service results are not comparable to the 2015 to 2024 Tri-Service results, and no comparisons to the 2014 results have been made in the statistical report. The ResCAS is one of the main ways the Department gathers information on the views and experiences of our Reserve Forces personnel. The information from this survey helps shape policies for training, support, and the terms and conditions of service.

2.2 Brief History

For many years prior to the introduction of Tri-Service reserve survey questions in 2014, the Maritime Reserve (Royal Navy Reserve and Royal Marines Reserve), Army Reserve and Royal Air Force (RAF) Reserve conducted individual Continuous Attitude Surveys (CASs). These informed single Service personnel policy development. However, the individual nature of each made it difficult to obtain a picture of issues affecting the whole Reserve Force or to compare data across the Services, and meant that each was published at a different time. Following direction from the Reserve Forces & Cadets (RFC) team, the requirement for consistent and comparable whole Reserve Forces survey results was defined. The remit was to draw upon the expertise gathered in the single Services to produce whole Reserve Force results. This was named the Tri-Service Reserves Continuous Attitude Survey (ResCAS).

The Reserves Continuous Attitude Survey (ResCAS), specifically relating to the Tri-Service questionnaire items, is designed and delivered through the collaboration of MOD occupational psychologists, researchers, and statisticians to reflect the People policy user requirements. The single Services administer their Reserves surveys and collate the responses. The Analysis Surveys team then produce analysis and tabulations based on the results to the Tri-Service questions and write and publish the Tri-Service ResCAS report. Defence People Team: Research and Evidence (DPTRE) and the RFC team in Head Office, in collaboration with Occupational Psychologists in the single Services and statisticians in Analysis Surveys, are responsible for consulting with their stakeholders and for deciding on the content of the Tri-Service questions to meet policy user requirements.

The aim of the ResCAS is to assess and monitor the attitudes of Reserve Forces personnel across the Royal Navy Reserve (RNR), Royal Marines Reserve (RMR), Army Reserve and Royal Air Force (RAF) Reserve on a variety of topics including pay and allowances, support, training, and equipment (see Section 2 for a full list of the topic areas). The statistics are used to help identify where measures are needed to influence motivation, capabilities, and retention in the Reserve Forces and to inform policy development and assessment. The surveys are conducted annually to allow attitudes to be tracked over time. The Service Chiefs and the Ministry of Defence (MOD) place a high value on the attitude data gathered from Service personnel. They are a vital means of understanding how Reserve Forces personnel feel about key issues. The information is used to inform personnel policy teams. Since 2014 the ResCAS has been published as an Official Statistic.

For the 2024 survey, fieldwork was conducted between January 2024 and April 2024 for the Maritime, Army and RAF Reserves.

Statistical Processing

ResCAS is an annual survey for which there are nine main stages. Each of these stages is briefly described below.

Stage 1: Questionnaire design

For the 2024 ResCAS, Tri-Service questions were agreed by DPTRE and RFC with each of the single Services. There are three separate questionnaires, one for each Service - RNR/RMR, Army Reserve, and RAF Reserve. Any single Service questions included in the single Service Reserves Surveys are outside the scope of this Tri-Service report. The Reserves Continuous Attitude Survey results, published as an Official Statistic, are only concerned with questions that are asked on a Tri-Service basis.

Stage 2: Sample design, selection and cleaning

The Maritime Reserve (RNR/RMR) ran a census of all its Volunteer Reserves, and the RAF Reserve ran a census of all its Volunteer Reserve and Regular Reserves; however, the Tri-Service ResCAS results are only concerned with the responses from Volunteer Reserves (the RAF conduct further analysis using the responses from RAF Regular Reserves). The total number of questionnaires sent out to Maritime Reservists was 2,619 and to RAF Volunteer Reservists was 2,788. The Army Reserve survey took a disproportionate stratified random sample of 15,060 Army Volunteer Reservists, excluding Non-Regular Permanent Staff (NRPS) and deployed reservists. The Army sample is stratified by rank group in an attempt to achieve a sufficient number of responses in each of the rank groups. The Army sample was designed to provide sufficient responses to yield estimates with a margin of error of plus or minus 3% for each of the following groups: Officers - Maj and above; Officers - Capt and below; Soldiers - Sgt and above; Soldiers - Cpl and below.
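
To illustrate the scale of responses implied by a plus or minus 3% margin of error, the sketch below works through the standard sample size calculation. It is an illustration only, assuming a 95% confidence level, worst-case variance (p = 0.5) and simple random sampling within a stratum; the stratum population shown is hypothetical and the actual Army design calculations may differ.

```python
import math

def required_responses(stratum_population, margin=0.03, z=1.96, p=0.5):
    """Approximate responses needed in a stratum for a given margin of error.

    Uses n0 = z^2 * p * (1 - p) / margin^2 with a finite population
    correction. Illustrative only: assumes simple random sampling within
    the stratum and the worst-case variance (p = 0.5).
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2             # about 1,067 for +/-3% at 95%
    return math.ceil(n0 / (1 + (n0 - 1) / stratum_population))

# Hypothetical rank-group stratum of 5,000 reservists
print(required_responses(5000))  # roughly 880 responses needed
```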

Stage 3: Survey distribution and communications

Maritime, Army and RAF Reserves data collection took place between January and April 2024.

RAF Reservists were able to complete an online self-completion questionnaire. Continuing the RAF’s move away from paper questionnaires in recent years, the availability of RAF paper questionnaires was extremely limited, and all RAF responses this year were received via the online survey.

Army Reservists were sent a paper questionnaire and pre-paid envelope to their unit address using contact details recorded on the Joint Personnel Administration (JPA) system. The paper questionnaires provided an option to complete the survey online. 15,060 Army Reservists were sent the survey and 3,649 usable surveys were returned, a response rate of 24%. The Army sample excluded Special Forces, Mobilised, Unposted List, Army Reserves Reinforcement Group and anyone who had not received pay in the preceding 6-month period.

Maritime Reservists were able to complete an online self-completion questionnaire via a generic web link distributed from their Unit to their Defence Gateway addresses. Links were also posted on all Units’ Defence Gateway pages, which can be accessed by the unit’s ship’s company. This follows a move away from paper questionnaires in the Maritime Reserve, with only a limited number of paper questionnaires having been available in 2020. In 2024, the Maritime Reservists’ response rate (8%) fell from 17% in 2023. This is likely to be due, at least in part, to the Officer responsible for survey distribution and marketing leaving his post one week into the ResCAS in-field window. This post was filled after the ResCAS in-field period had ended.

Stage 4: Data input

The Army Reserves use an external contractor to input paper survey responses. Online survey responses were downloaded and the Tri-Service question data was sent to the Analysis Surveys team for collating, data cleansing and analysis.

Stage 5: Data validation

All three Services’ responses to the Tri-Service Reserves survey questions are combined into a single data set by statisticians in the Analysis Surveys team.

Many questions are recoded to simplify the interpretation of the output. For example, all 5-point Likert scale responses are recoded into a 3-point positive, neutral, negative scale. Responses are weighted by rank and Service; this accounts for bias caused by differing levels of response. Finally, the data are transferred into SPSS.
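
For illustration only, the sketch below shows the kind of recoding described above. The question name and response codes are hypothetical; the actual recoding is carried out on the combined Tri-Service data set.

```python
import pandas as pd

# Hypothetical responses to a 5-point Likert item
# (1 = strongly agree ... 5 = strongly disagree)
responses = pd.DataFrame({"q_satisfaction": [1, 2, 3, 4, 5, 2, 4]})

# Collapse the 5-point scale to the 3-point positive / neutral / negative scale
recode_map = {1: "positive", 2: "positive", 3: "neutral",
              4: "negative", 5: "negative"}
responses["q_satisfaction_3pt"] = responses["q_satisfaction"].map(recode_map)

print(responses["q_satisfaction_3pt"].value_counts())
```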

Stage 6: Production of table of results

Results are produced in SPSS using Complex Samples to ensure percentage estimates and any corresponding standard errors are correctly weighted.
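
The SPSS Complex Samples syntax itself is not reproduced here. As an illustration of the underlying calculation, the sketch below computes a weighted percentage and an approximate standard error using a simplified linearisation approach; it ignores design features such as strata and finite population corrections that the production software accounts for, and all data shown are hypothetical.

```python
import numpy as np

def weighted_percentage(answers, weights, category="positive"):
    """Weighted percentage answering `category`, with an approximate
    standard error from a simple linearisation (with-replacement)
    approximation. Illustrative only; the production analysis uses
    SPSS Complex Samples with the full sample design."""
    y = (np.asarray(answers) == category).astype(float)
    w = np.asarray(weights, dtype=float)
    p = np.sum(w * y) / np.sum(w)                  # weighted proportion
    z = w * (y - p)                                # linearised residuals
    n = len(w)
    var = n / (n - 1) * np.sum(z ** 2) / np.sum(w) ** 2
    return 100 * p, 100 * np.sqrt(var)

# Hypothetical recoded answers and rank-by-Service weights
answers = ["positive", "neutral", "positive", "negative", "positive"]
weights = [1.2, 0.8, 1.5, 1.0, 0.9]
estimate, std_error = weighted_percentage(answers, weights)
print(f"{estimate:.1f}% (SE {std_error:.1f} percentage points)")
```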

Stage 7: Production of key findings

The results for each section are analysed and summaries of the key points and figures are collated into the Main Report.

Stage 8: Quality Assurance

There are several stages of both automated and manual validation built into the data cleaning process.

Each section of statistics undergoes several layers of scrutiny. These include cross-checking by at least one other member of the Analysis Surveys team, as well as checking carried out by DPTRE and single Service psychologists and researchers.

Stage 9: Publication

ResCAS is an Official Statistic and is produced and published in line with the Code of Practice for Official Statistics2. 24-hour pre-release access is given to those listed on the published ResCAS pre-release access list3. ResCAS is published on the Statistics at MOD pages of the GOV.UK website.

Quality Management

4.1 Quality Assurance

The MOD’s quality management process for Official Statistics consists of three elements: (1) Regularly monitoring and assessing quality risk via an annual assessment; (2) Providing a mechanism for reporting and reviewing revisions and corrections to Official Statistics; (3) Ensuring Background Quality Reports (BQRs) are published alongside reports and are updated regularly.

4.2 Quality Assessment

The most recent internal quality assessment of ResCAS was carried out in November 2023. The quality risk of ResCAS was assessed as medium. The main risks to quality relate to accuracy due to low response rates. This is discussed further in the accuracy section of this BQR.

Relevance

The principal users of the ResCAS publication are the HQ Defence People policy teams and the single Service policy teams. Other users of these results include the Armed Forces Pay Review Body, and results are also reported in the Service Complaints Ombudsman’s Annual Report. The statistical information is used to inform and measure Reserve Forces personnel strategy and policy, so it is important that stakeholder requirements are represented.

ResCAS 2024 captures information on the following topic areas:

  • Life in the Reserves.
  • Reasons for joining, staying, and leaving.
  • Pay, allowances and admin support.
  • Kit and equipment.
  • Mobilisation.
  • Training.
  • Career progression.
  • Perception of Reserves.
  • Family support.
  • Your civilian employment.
  • Fairness at Work.
  • About you (includes demographic and personal background questions).
  • Well-being in the Reserves.
  • Managing Change.

The information can also be used to answer parliamentary questions and Freedom of Information requests. The information in the ResCAS can be used by the general public and media to monitor the effectiveness of MOD programmes and by parliament to help hold the MOD to account.

The survey is confidential (see the Confidentiality and Security section). The ResCAS statistics published in the Tri-Service report are shown by Service to reflect differences in the roles and experiences of reservists in each of the Services. Requests for additional breakdowns of the ResCAS data would currently be considered on an ad-hoc basis by the Analysis Surveys team.

ResCAS 2024 is the eleventh year that Tri-Service questions have been asked in all single-Service Reserves Surveys. However, following substantial changes to the Army Reserve survey questionnaire distribution methodology and changes to the Army and RAF ResCAS target populations, the 2015 to 2024 Tri-Service results are not comparable to the 2014 Tri-Service results and no comparisons to the 2014 results have been made in the statistical report. Comparisons to previous survey results since 2015 have been made in the report and the availability of trend results is expected to be of particular interest and use to stakeholders.

There is currently a time lag of about 6½ months between the survey first going into the field and the publication of the Tri-Service ResCAS report. This lag may reduce the relevance of the results, as opinions may have changed in the intervening period, although substantively important changes would not be expected over such a relatively short time.

5.1 User Needs

The Analysis Surveys team work closely with the main customer and survey sponsor for the Tri-Service element of the Reserves Surveys, the Defence People Research and Evidence (DPTRE) team, the Reserve Forces and Cadets team, and occupational psychologists and researchers from across the department so that ResCAS reflects policy user requirements. DPTRE lead steering and working groups to agree their policy user requirements and the Tri-Service questionnaire items.

Internal uses of ResCAS for decision making about policies, programmes and projects:

The main aim of each of the Services is to establish a force fully capable of meeting the demands of the future. To help achieve this, ResCAS provides performance indicators on factors that may impact Reservists’ satisfaction with Service life in general and their willingness to serve, Reservists’ capability and availability to serve, and their integration with Regulars. ResCAS questions provide intelligence on the current state of affairs. ResCAS also has an extensive list of questions relating to why volunteer Reservists join, stay, and why some decide to leave. These questions help to formulate recruitment and retention strategies.

External uses of ResCAS:

Service Families Federations. Service Families Federations exist to give Service families an independent voice and work with Senior Officials, including the Minister and Service Heads, to help improve the lives of Service families. The Service Families Federations use ResCAS statistics as a source of evidence when representing the situation of serving personnel and their families. A range of ResCAS statistics have been reported on Service Families Federation websites, often accompanied by a link to the full ResCAS report.

The Armed Forces Pay Review Body. The AFPRB uses ResCAS results to inform their recommendations.

The Service Complaints Ombudsman. ResCAS results are reported in the SCO’s Annual Report.

The media have reported ResCAS statistics on Reservists’ integration with Regulars.

ResCAS statistics can be used by students to facilitate academic research.

Accuracy & Reliability

6.1 Overall Accuracy

The Maritime Reserve (RNR/RMR) and the RAF Reserve ran a census of their Volunteer Reserves, while the Army Reserve survey used a disproportionate stratified random sample of 15,060 Army Volunteer Reservists, excluding Non-Regular Permanent Staff (NRPS) and deployed reservists. The Army sample is stratified by rank group in an attempt to achieve a sufficient number of responses for a margin of error of plus or minus 3% in each of the following rank groups: Officers - Maj and above; Officers - Capt and below; Soldiers - Sgt and above; Soldiers - Cpl and below. The Army sample design allows the Army to conduct further rank analysis that is currently outside the scope of the Tri-Service ResCAS report.

In 2024, the Maritime Reservists’ response rate (8%) fell from 17% in 2023. This is likely to be due, at least in part, to the Officer responsible for survey distribution and marketing leaving his post one week into the ResCAS in-field window. This post was filled after the ResCAS in-field period had ended.

Survey estimates are published by Service for each of the Maritime (RNR/RMR) Volunteer Reserves, Army Volunteer Reserves, and RAF Volunteer Reserves and by Officers and Other Ranks within each Reserve Service.

The ResCAS raw data is passed through a range of automatic and manual validation and editing routines. The data sets from each of the surveys are combined into a single data set. To ensure results are representative of the Volunteer Reserve populations, Analysis Surveys weight responses to correct for any bias introduced by differing levels of response. The responses are weighted broadly by rank and Service. Full details are provided in the methodology section of the report.

Analysis Surveys analyse the data using SPSS Complex Samples. This software produces weighted estimates and corresponding standard errors.

6.2 Sampling/Non-sampling errors

As the ResCAS does not achieve a 100% participation rate (the overall participation rate achieved in ResCAS 2024 was 23%) there is always the risk that those who returned questionnaires have differing views from those who did not. We assume that all non-response is Missing At Random (MAR). This means we have assumed that those people who did not return their questionnaires have (on average) the same perceptions and attitudes as those who did respond.

Weighting helps to make the ResCAS percentages as representative as possible of the Volunteer Reserves populations. The Services/Ranks which are under-represented in the dataset are given more weight so that they represent more of the people in their group who did not respond. Conversely, groups that are over-represented in the dataset are given less weight. Weighting assumes that all non-response is Missing At Random (MAR). This means we have assumed that all those people who did not respond within their Service/Rank strata have (on average) the same perceptions and attitudes as those who did respond. If those who did not respond have different attitudes to those who did respond then the observations in this report will be biased and will not represent the attitudes of all Volunteer Reserves personnel; rather, our observations would only represent the views of the responding population.
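
The sketch below illustrates the weighting principle described above: each respondent in a Service/rank stratum is scaled up to represent the whole stratum, so under-represented groups receive larger weights. The strata and counts shown are hypothetical and are not taken from the survey.

```python
# Hypothetical population and respondent counts by Service/rank stratum
population = {("Army", "Officers"): 4000, ("Army", "Other Ranks"): 11000}
respondents = {("Army", "Officers"): 1200, ("Army", "Other Ranks"): 2400}

# Weight = stratum population / stratum respondents, so each respondent
# "stands in" for the non-respondents in their stratum (the MAR assumption)
weights = {stratum: population[stratum] / respondents[stratum]
           for stratum in population}

print(weights)
# {('Army', 'Officers'): 3.33..., ('Army', 'Other Ranks'): 4.58...}
```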

The ResCAS is designed to give an up-to-date snapshot of the attitudes and perceptions of our Volunteer Reserve personnel. While the ResCAS is reported on an annual basis, it should be remembered that these attitudes and perceptions are liable to change within the calendar year, for example as a result of events or due to the time of year at which the responses were collected (a seasonality effect). The ResCAS timeline is driven by user reporting requirements.

All statistical results are checked by at least two Analysis Surveys staff following a clear checking process. The statistics are further checked by at least one of the psychologists on the working group. Analysis Surveys do not show any statistics where the responding group size is less than 30. This is to prevent the publication of unreliable statistical information and to prevent disclosure of information about individuals.
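
As a minimal sketch of the suppression rule described above (the function and threshold names below are illustrative, not the production code):

```python
def publishable(estimate, respondent_count, threshold=30):
    """Return the estimate only if enough respondents underpin it;
    otherwise suppress it (returned as None)."""
    return estimate if respondent_count >= threshold else None

print(publishable(62.0, 45))  # 62.0 -> published
print(publishable(58.0, 12))  # None -> suppressed
```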

6.3 Data Revisions

Data revisions are handled in accordance with the MOD’s Official Statistics Revisions and Corrections Policy.

Timeliness and Punctuality

7.1 Timeline

The ResCAS takes approximately 6½ months from questionnaires being distributed to publication of the ResCAS report. This time lag may reduce the relevance of the results, as opinions may have changed in the period.

A general limitation of the ResCAS is that it is a snapshot of attitudes and perceptions at the time respondents answer the questionnaire. People’s attitudes and perceptions may systematically change throughout the year in response to events or because of some seasonality effect. For the 2024 report, data collection took place from January 2024 to April 2024 for the Maritime, Army and RAF Reserves.

7.2 Punctuality

The publication date is pre-announced on the GOV.UK Official Statistics Release Calendar. All pre-announced publication deadlines have been met.

Coherence and Comparability

8.1 Coherence

ResCAS is the definitive source of Tri-Service attitudinal data about Volunteer Reservists’ own experiences and perceptions of working in the Reserves. There are no other Tri-Service data sources that collect the same attitudinal information with which to ensure coherence.

8.2 Comparability over Time

This is the eleventh time Tri-Service Reserve survey questions have been included in each of the single Service Reserves Surveys. However, following substantial changes to the Army Reserve survey questionnaire distribution methodology and changes to the Army and RAF target populations, the 2015 to 2024 Tri-Service results are not comparable to the 2014 Tri-Service results and no comparisons to the 2014 results have been made in the statistical report. Specifically, the Army Reserves survey moved from primarily distributing its questionnaire as a pull-out in the Army Reserve Quarterly Magazine (ARQ), open to all Army Reservists, in 2014 to sending questionnaires to a random sample of Volunteer Reservists excluding NRPS from 2015 onwards. To better align the target populations, the decision was made, from ResCAS 2015, to only include Volunteer Reserves excluding the NRPS in the Army. Time series comparisons are available from the 2015 results onwards, and trend results are expected to continue to be developed in the future.

Accessibility and Clarity

9.1 Accessibility

All ResCAS publications were available in PDF format up until the 2023 release (the 2022 report was published as an accessible PDF). Since 2023, the ResCAS main report has been published as an accessible webpage. The main report, copies of the statistical tables in Excel format and other accompanying documents can be found in the Tri-Service Reserves Continuous Attitude Survey collection on the GOV.UK website.

9.2 Clarity

In addition to this Quality Report, the ResCAS report contains a key points section that summarises the main ResCAS findings; an introduction section that provides a brief background to ResCAS and definitions of terms used in the report; and a methodology section that provides users with details of the methodology, including the target population, information on the sample, respondents, weighting, and the notations and definitions used.

An ODS version of the tables with detailed results is made available to accommodate different user preferences. These include tables showing margins of error for each estimate. Relevant footnotes are shown below the tables to indicate any filters that have been applied to the data, data quality issues or time series comparisons.

Trade-offs between Output Quality Components

The main trade-off is between timeliness and quality. While the report has a key points section, there is limited contextual or explanatory text to accompany the statistical charts. The report limits itself to examining trends over time and differences by Reserve Service and by Officers and Other Ranks, with a very limited breakdown of some of the ‘Fairness in the Reserves’ questions by ethnic group and gender. No other demographic breakdowns of questions are considered in the report, and there is no attempt to cross-tabulate ResCAS questions against each other, which might provide additional insights. This is so that the basic statistical information can be made available to policy users and the public as soon as possible in a clear, accessible format. Additional analysis for policy users is available on request, and external requests for further information would be considered under the usual FOI rules.

A general limitation of the ResCAS is that it is a snapshot of attitudes and perceptions at the time respondents answer the questionnaire. People’s attitudes and perceptions may systematically change throughout the year in response to events or because of some seasonality effect. For the 2024 report, data collection took place from January 2024 to April 2024 for the Maritime, Army and RAF Reserves.

Cost and Respondent Burden

Costs are closely monitored, and the Surveys team and the working group strive to balance quality and timeliness against costs. The sample size is calculated to be the most efficient in order to meet the levels of precision outlined in Section 6.

Response to ResCAS is voluntary. Participant information is provided within the questionnaire to encourage informed consent. Most respondents complete the survey within 30 minutes.

Respondent burden is minimised by obtaining demographic information about respondents from the Joint Personnel Administration (JPA) database rather than asking respondents these questions in the questionnaire. This also helps to minimise costs.

Confidentiality and Security

12.1 Confidentiality - Policy

ResCAS is a confidential survey rather than anonymous. The paper survey contains a unique barcode that can only be linked to an individual’s unique Service number by the Surveys ResCAS team and the external contractor responsible for data input. Only a small number of individuals in the team have access to the person-level data including the unique identifier. In addition, a small number of named individuals in the single Services and an approved contractor have access to record-level data stripped of the unique identifier. No person from any respondent’s Chain of Command is able to access individual level data. Data Protection Impact Assessments and Data Access Agreements are in place to minimise risk to confidentiality, in accordance with the Data Protection Act.

12.2 Confidentiality - Data Treatment

The Defence Statistics Disclosure and Confidentiality Policy is followed. Only aggregated results are provided to anyone not directly involved with the analysis. These results are only presented for groups containing at least 30 respondents.4

12.3 Security

All staff involved in the ResCAS production process adhere to the MOD and Civil Service data protection regulations. In addition, all members of the working group have to follow the relevant codes of practice for their professional groups: the Government Statistical Service (GSS) and Government Social Research (GSR). All data is stored, accessed, and analysed using the MOD’s secure IT system.

References

1. ResCAS: https://www.gov.uk/government/collections/tri-service-reserves-continuous-attitude-survey-index
2. Code of Practice for Official Statistics: http://www.statisticsauthority.gov.uk/assessment/code-of-practice/code-of-practice-for-official-statistics.pdf
3. ResCAS pre-release access list: https://www.gov.uk/government/statistics/defence-statistics-pre-release-access-list
4. Defence Statistics policies: https://www.gov.uk/government/publications/defence-statistics-policies

Last Updated: July 2024