Background quality report UK armed forces: October 2022
Updated 11 October 2024
This Background Quality Report (BQR) covers the quality of all the Official Statistics on UK Armed Forces personnel and applications produced by Analysis (Tri-Service).
It provides background information on our statistical outputs and indicates the quality of the data used to produce them, detailing the strengths, weaknesses and methods used in our publications.
Analysis Tri-Service Statistical Outputs:
- UK Full-time Armed Forces Statistics
- UK Reserve Forces Statistics
- UK Cadet Forces Statistics
- Armed Forces Applications Statistics
Updated 15 December 2022
Background Quality Report: UK Full-time Armed Forces Statistics
1. Contact
The Analysis (Tri-Service) Head of Branch is the Responsible Statistician for these statistics.
Contact details are:
Analysis (Tri-Service)
Ministry of Defence
Floor 3 Zone M
Main Building, Whitehall
London SW1A 2HB
E-mail: Analysis-Tri-Hd@mod.gov.uk
Website: Statistics at MOD
2. Introduction & Statistical Presentation
The Ministry of Defence (MOD) publishes a wide range of Armed Forces personnel statistics. The main purposes of these statistics are to inform policy and decision making within the Department, to measure the performance of the Ministry of Defence against Government and Parliament targets, and to inform general debate in Government, Parliament and the wider public.
These personnel statistics are primarily counts of the number of Service personnel in the Armed Forces (or ‘strengths’), the number of personnel joining (intake) and numbers of personnel leaving (outflow) all of which are reported by various categories of interest and at differing levels of detail.
‘Strength’ counts are reported against Workforce Requirement figures for the Trained (RN/RM & RAF) and Trade Trained (Army) UK Full-time Armed Forces, which enables surpluses and deficits to be calculated.
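For illustration only, the surplus or deficit is the difference between the reported strength and the Workforce Requirement; the figures in the sketch below are hypothetical.

```python
def surplus_or_deficit(strength: int, workforce_requirement: int) -> int:
    """Positive value indicates a surplus against the requirement; negative indicates a deficit."""
    return strength - workforce_requirement

# Hypothetical example: a Trade Trained strength of 29,090 against a requirement
# of 30,450 gives a deficit of 1,360 personnel.
print(surplus_or_deficit(29_090, 30_450))  # -1360
```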
This background quality report covers the primary military personnel statistics published on the Analysis (Tri-Service) website on GOV.UK.
The publications we produce which contain statistics on UK Full-time Armed Forces personnel are:
- Quarterly Service Personnel Statistics
- Biannual Diversity Statistics
- Annual Location Statistics
On 29 June 2016, the MOD announced that the Army will in future plan to use Regular and Reserve Phase 1 trained personnel in response to crises within the UK. Following this, the term ‘Trained Strength’ includes all Army personnel trained in the core function of their Service (i.e. those who have completed Phase 1 training). The MOD consulted on these changes and the resultant impact on Analysis (Tri-Service) publications, and a consultation response was published on 7 November 2016.
From the 1 October 2016 edition of Monthly Service Personnel Statistics onwards, Army personnel who have completed Phase 1 training (basic Service training) but not Phase 2 training (trade training) are considered Trained personnel. This change enables the Army to meet the Strategic Defence and Security Review (SDSR) 15 commitment to improve support to UK resilience. The Trained Strength definition for the Royal Navy, RAF, Maritime Reserve and RAF Reserves has not changed, reflecting the requirement for their personnel to complete Phase 2 training to be able to fulfil the core function of their respective Services. Army personnel who have completed Phase 2 training are now called ‘Trade Trained’. This population aligns with the old definition of trained personnel, maintaining the continuity of the statistical time series, and will continue to be counted against the Workforce Requirement and the SDSR target for 2020.
Previous reports containing statistics on UK Full-time Armed Forces personnel can be found on the archived Analysis (Tri-Service) website on the National Archives site:
- UK Armed Forces Monthly Personnel Report (MPR)
- UK Armed Forces Quarterly Personnel Report (QPR)
- UK Armed Forces Annual Personnel Report (APR)
- UK Regular Forces Diversity Dashboard
- UK Defence Statistics Bulletin 2.01 (excluding reserve forces)
- UK Defence Statistics Bulletin 2.03 (excluding location, civilian and compensation statistics)
- Tri-Service publications (TSP 1 – TSP 10)
These historic reports can be found on the archived Analysis (Tri-Service) website on the National Archives site.
3. Statistical Processing
3.1 Source data
The statistics are principally derived from the Department’s Joint Personnel Administration (JPA) system, which is used for the administration of all Armed Forces personnel, supplemented by information from single Services’ management systems and other centrally managed databases. Prior to the introduction of JPA in 2006/07, legacy single Service administration systems were used to produce the statistics.
3.2 Frequency of data collection
Extracts are taken from JPA each month and stored on separate databases to form a time series.
3.3 Data collection
The extracts are taken six calendar days after the end of the month and the situation as at the first of the month is calculated. This ensures most late reporting is captured.
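For illustration only, the sketch below shows one way the position as at the first of the month could be derived from an extract taken after the month end; the record layout and field names are assumptions for the example and do not reflect the actual JPA schema.

```python
from datetime import date

def strength_at(records: list[dict], snapshot: date) -> int:
    """Count personnel in post at the snapshot date.

    Each record is assumed to carry a joining date and an optional exit date,
    so an extract taken six days after the month end can still report the
    position as at the first of the month, capturing most late reporting.
    """
    return sum(
        1
        for r in records
        if r["joined"] <= snapshot
        and (r.get("exited") is None or r["exited"] > snapshot)
    )

# Hypothetical extract taken on 7 July, reporting the position as at 1 July
extract = [
    {"joined": date(2020, 3, 1)},                               # still serving
    {"joined": date(2019, 5, 1), "exited": date(2022, 6, 20)},  # left before 1 July
    {"joined": date(2021, 1, 1), "exited": date(2022, 7, 4)},   # left after 1 July, still counted
]
print(strength_at(extract, date(2022, 7, 1)))  # 2
```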
3.4 Data validation
The data goes through a series of automatic validation checks and edits to ensure its basic quality, and a series of derived fields is calculated.
The data is then made available to Analysis (Tri-Service) single Service manpower branches. They undertake a wide range of validation checks and implement specialist editing rules using their expert knowledge and experience as well as data obtained from other sources within the Department.
3.5 Data compilation
Once the data is confirmed as being accurate the database is queried to produce the range of tables published. These tables undergo several layers of scrutiny to ensure the outputs are accurate and consistent.
The statistics are counts of military personnel by a range of categories, including breakdowns by: Navy, Army and RAF; officer and other ranks; trained and untrained.
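As an illustration of this compilation step, the sketch below produces counts by Service, officer/other rank and trained status; the field names and category labels are assumptions for the example rather than the actual database structure.

```python
from collections import Counter

def compile_strength_table(records: list[dict]) -> Counter:
    """Count personnel by Service, rank type and trained status."""
    return Counter(
        (r["service"], r["rank_type"], r["trained_status"]) for r in records
    )

# Hypothetical records
records = [
    {"service": "Army", "rank_type": "Officer", "trained_status": "Trained"},
    {"service": "Army", "rank_type": "Other Ranks", "trained_status": "Untrained"},
    {"service": "RAF", "rank_type": "Other Ranks", "trained_status": "Trained"},
]
for (service, rank, status), count in compile_strength_table(records).items():
    print(service, rank, status, count)
```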
4. Quality Management
This section briefly describes the overarching processes in place to manage quality (e.g. annual risk assessments) and outlines the results of recent quality assessments.
4.1 Quality Assurance
The MOD’s quality management process for Official Statistics consists of three elements: (1) regularly monitoring and assessing quality risk via an annual assessment; (2) providing a mechanism for reporting and reviewing revisions/corrections to Official Statistics; (3) ensuring BQRs are published alongside reports and are updated regularly.
4.2 Quality Assessment
At the time of the last Quality Assessment in November 2021, the Quarterly SPS received ‘N/A’ as the overall summary in terms of quality risk.
5. Relevance
This section is about the degree to which the statistical product meets user needs in both coverage and content.
Analysis (Tri-Service) frequently meets with customers within the Department to discuss data, results, interpretation, and any changes to requirements. They also seek feedback from a wider range of internal and external customers.
We have made our own assessment of what these statistics could be used for using the categorisation in the UKSA paper The Use Made of Statistics.
We believe the statistics could be used as follows:
i. Informing the general public’s choices:
a. about the performance of government and public bodies
ii. Government decision making about policies, and associated decisions about related programmes and projects:
b. policy monitoring
The underlying data also allow for:
iii. Government decision making about policies, and associated decisions about related programmes and projects:
c. policy making
iv. Facilitating academic research.
The Service Personnel Report should be used as the authoritative source of Regulars’ strengths and flows statistics.
The MOD has recently consulted on changes to the definition of Army trained strength and the resultant impact on Analysis (Tri-Service) publications, and a consultation response was published on 7 November 2016. Changes have been implemented into Monthly Service Personnel Statistics from the 1 October 2016 publication onwards.
A formal consultation on Changes to MOD Armed Forces Personnel Statistics ran from 20 March to 16 April 2015 and from 27 May to 18 June 2015. It followed an internal review of the content of MOD Tri-Service publications and sought further user views on proposed changes to publications.
Detailed information on previous consultations can be found via the National Archives
Users are also encouraged to provide feedback on statistics produced by Analysis (Tri-Service) and also to sign up to the mailing list for their publication of interest, to receive updates to the statistics or to be made aware of any changes: Analysis-Tri-Service@mod.gov.uk
The principal customers and stakeholders for the Tri-Service publications are within the People and Chief of Defence Personnel areas of the Ministry of Defence. The statistics are used to inform and measure Service personnel strategy in areas such as pay & allowances, equality & diversity and overall troop numbers, and in particular the Future Reserves 2020 Programme (FR20). They are also used to answer parliamentary questions and Freedom of Information requests, and to inform internal monitoring and the regular monthly reporting to the Defence Board through the Defence Board Management Information.
For detail on pre-release access to Analysis (Tri-Service) publications please see the Analysis (Tri-Service) pre-release access list for the most up to date list of roles receiving pre-release access.
People in the roles with access receive pre-release access to the publication up to 24 hours in advance of publication.
The coverage of these statistics is close to, if not equal to, all Full-time Armed Forces personnel, Full Time Reserve Service personnel and Gurkhas. The Service Personnel Statistics publication includes statistics on the Reserve Forces, more details of which are covered in the report on Reserve Forces information. There are no known unmet user needs.
6. Accuracy and Reliability
This section is about the differences between the estimates and the unknown true values.
6.1 Overall Quality
All personnel in the Regular Armed Forces must be recorded on JPA in order for them to receive their pay. Therefore the overall strength figures are accurate. However, more detailed breakdowns relying on the information recorded for each individual can be less accurate due to variable quality of the data entered for these different fields.
The variation in quality is partly due to key information required for managing individuals being recorded and updated centrally, whereas other information is left to the individual to complete through a self-service tool. There is also a reasonable amount of late reporting which can adversely impact the statistics, particularly for exits and changes of individuals’ status from untrained to trained. Obtaining the extract on the sixth calendar day and then calculating the strength at the first of the month overcomes much of this late reporting.
The monthly datasets are passed through a range of automatic and manual validation and editing routines in order to make the key fields as accurate as possible, often drawing upon alternative data sources. A range of detailed breakdowns are produced; these are compared with the previous month’s outputs and discrepancies are examined. The detailed tables are used by the single Services (Navy, Army, and Air) to manage their personnel and inform policy and strategy.
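As an illustration of that month-on-month comparison, the sketch below flags category counts that move by more than a chosen tolerance from the previous month; the tolerance and figures are hypothetical and for the example only.

```python
def flag_discrepancies(current: dict, previous: dict, tolerance: float = 0.05) -> list[str]:
    """Return categories whose counts changed by more than `tolerance` since last month."""
    flagged = []
    for category, prev_count in previous.items():
        curr_count = current.get(category, 0)
        if prev_count and abs(curr_count - prev_count) / prev_count > tolerance:
            flagged.append(category)
    return flagged

# Hypothetical breakdowns for two consecutive months
previous = {"Army Officers": 13_200, "Army Other Ranks": 62_500}
current = {"Army Officers": 13_150, "Army Other Ranks": 58_900}
print(flag_discrepancies(current, previous))  # ['Army Other Ranks']
```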
Regular feedback ensures Analysis (Tri-Service) staff are kept abreast of any changes or potential issues with the data and statistics, which is fed into the data validation and editing process.
6.2 Data Revisions
Data revisions are handled in accordance with the MOD’s Official Statistics Revisions and Corrections Policy. Due to processing errors, it has been necessary to perform numerous revisions to Regulars’ statistics in various editions of the Quarterly SPS.
7. Timeliness and Punctuality
This section reports on the time gap between publication and the reference period (timeliness) and the gap between planned and actual publication dates (punctuality).
7.1 Timeliness
The quarterly extracts are taken on the sixth calendar day. The editing and production process usually takes around 2 weeks at the single Service level. It then takes a further 2-4 weeks to compile these data at a tri-Service level.
7.2 Punctuality
The following table provides an example of the timeline for 1 July 2022 Statistics.
Publication | Situation Date | Publication Date |
---|---|---|
Service Personnel Statistics | 1 July 22 | 15 September 22 |
Historic and planned publication dates can be found on the Publication Release Dates section of the GOV.UK Statistics at MOD webpage and on the UK National Statistics Publication Hub.
8. Coherence and Comparability
Analysis (Tri-Service) published statistics on UK Armed Forces are the definitive personnel statistics in the MOD. There are no other publicly available regular publications on the numbers of UK Armed Forces with which to ensure coherence. Within the MOD direct queries of the Joint Personnel Administration system will produce slightly different numbers due to timing and quality issues.
The UK Armed Forces personnel statistics are not always directly comparable with other countries’ statistics due to definitional differences of what constitutes an Armed Force. In some countries, particularly in Europe, part of the domestic police force is included in the Armed Forces. Similarly, there are not always direct equivalents to the Royal Navy / Royal Marines, Army, and Royal Air Force in other countries.
The total number of Service personnel is comparable across time; however, breakdowns into smaller categories are generally not comparable due to changes in the physical and financial structures of the MOD. The introduction of JPA in 2006/07 affected the availability of certain statistics, as some information (divorce rates for the RAF, for instance) was available under the legacy systems but not on JPA. This change led to a revision of what and how much can be published; for example, some categories of outflow had to be combined.
Annual editions of UK Defence Statistics compendium dating back to 1992, plus historic Tri-Service publications dating back to 2002, are available in the National Archives.
9. Accessibility and Clarity
This section reports on: the ease with which users are able to access the data, the format in which the data are available, and the availability of supporting information (accessibility); and the quality and sufficiency of the metadata, illustrations and accompanying advice (clarity).
Current publications consist of detailed Excel tables containing a historic time-series of statistics and a PDF report containing commentary, graphs and tables on trends in the statistics. The commentary in our reports identifies and analyses the key changes in the data and provides summary statistics and policy context. Graphs, tables and other data visualisation methods are used to further explain these trends.
Previous Analysis (Tri-Service) personnel reports are published on GOV.UK and are available as PDFs or Excel value copies. Other formats may be possible for Analysis (Tri-Service) to produce on request.
All Analysis (Tri-Service) publications that use these data can be found under the “Military” and “Combined military and civilian” sections under the “Personnel statistics” heading on the MOD National and Official Statistics by topic webpage.
They can also be accessed via the Statistics release calendar.
Copies of the reports are also placed in the House of Commons library.
Data revisions are handled in accordance with the MOD’s Official Statistics Revisions and Corrections Policy.
10. Trade-offs between Output Quality Components
This section reports the extent to which different aspects of quality are balanced against each other.
The main trade-off is between timeliness and quality. To ensure statistics are timely the editing and validation process is restricted to around two weeks and a significant amount of automatic editing is utilised. Spending more time investigating every suspect individual personnel record could marginally improve quality at a detailed trade/rank level but is unlikely to impact the aggregated statistics published in our reports.
11. Cost and Respondent Burden
This section is about the effectiveness, efficiency and economy of the statistical output.
Personnel Statistics and Analysis has six branches dedicated to producing information relating to manpower and providing analysis and advice; the majority of time is spent on adding value through analysing, forecasting and answering ad hoc enquiries, as well as producing the National Statistics.
There is very little respondent burden as the majority of the data is automatically obtained from administrative systems. However, this is supplemented with small amounts of data as well as input from other areas within the MOD.
12. Confidentiality, Transparency and Security
This section is about the procedures and policy used to ensure sound confidentiality, security and transparent practices.
12.1 Confidentiality
All published outputs are counts of individuals in particular groupings. Where there are possible disclosure issues in reporting protected characteristics, outputs are rounded according to Analysis (Tri-Service) rounding policy, which prevents disclosure of information on individuals.
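By way of illustration, the sketch below rounds counts to a fixed base before publication; rounding to the nearest 10 is an assumption for the example and is not necessarily the base specified in the Analysis (Tri-Service) rounding policy.

```python
def round_for_publication(count: int, base: int = 10) -> int:
    """Round a count to the nearest multiple of `base` to reduce disclosure risk."""
    return base * round(count / base)

# Hypothetical counts before and after rounding
for count in [3, 17, 151]:
    print(count, "->", round_for_publication(count))
# 3 -> 0, 17 -> 20, 151 -> 150
```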
Disclosure control is conducted on all statistical information provided by the MOD to safeguard the confidentiality of individuals.
12.2 Transparency
The reports provide commentary on the key features of the outputs and identify any issues or caveats to the data. This quality report provides further information on the method, production process and quality of the output.
12.3 Security
All staff involved in the statistical production process adhere to all MOD, Civil Service and data protection regulations. The data is stored, accessed and analysed using the MOD’s restricted network and IT systems, and access to raw data is password protected and approval for access is granted only by the Head of Personnel Statistics.
The Analysis (Tri-Service) website can be accessed here: Statistics at MOD
Updated: 15 December 2022
Background Quality Report: UK Reserve Forces Statistics
1. Contact
The Analysis (Tri-Service) Head of Branch is the Responsible Statistician for these statistics.
Analysis (Tri-Service)
Ministry of Defence
Floor 3 Zone M
Main Building, Whitehall
London SW1A 2HB
E-mail: Analysis-Tri-Hd@mod.gov.uk
Website: Statistics at MOD
2. Introduction & Statistical Presentation
Analysis (Tri-Service) publishes a large range of Armed Forces personnel statistics, mainly to inform policy and decision making within the Department. The statistics are also used to measure performance against MOD, Government and Parliament targets and to inform general debate in Government, Parliament and the wider public. These statistics form part of that portfolio of Service personnel statistics and report on the UK Reserve Forces, including regular reserves, volunteer reserves, sponsored reserves and the University Service Units. They are counts of personnel numbers broken down by various categories.
This background quality report covers the principal military personnel statistics published on Statistics at MOD:
- Service Personnel Statistics (published quarterly)
- Diversity Statistics (published biannually)
On 29 June 2016, the MOD announced that the Army will in future plan to use Regular and Reserve Phase 1 trained personnel in response to crises within the UK. Following this, the term ‘Trained Strength’ includes all Army personnel trained in the core function of their Service (i.e. those who have completed Phase 1 training). The MOD consulted on these changes and the resultant impact on Analysis (Tri-Service) publications, and a consultation response was published on 7 November 2016.
From the 1 October 2016 edition of Monthly Service Personnel Statistics onwards, Army personnel who have completed Phase 1 training (basic Service training) but not Phase 2 training (trade training) are considered Trained personnel. This change enables the Army to meet the SDSR 15 commitment to improve support to UK resilience. The Trained Strength definition for the Royal Navy, RAF, Maritime Reserve and RAF Reserves has not changed, reflecting the requirement for their personnel to complete Phase 2 training to be able to fulfil the core function of their respective Services. Army personnel who have completed Phase 2 training are now called ‘Trade Trained’. This population aligns with the old definition of trained personnel, maintaining the continuity of the statistical time series, and will continue to be counted against the Workforce Requirement and the SDSR target for 2020.
3. Statistical Processing
3.1 Source Data
The statistics are principally derived from the Department’s Joint Personnel Administration (JPA) system, which is used for the administration of all Armed Forces personnel, supplemented by information from single Services’ management systems and other centrally managed databases. Prior to the introduction of JPA in 2006/07, legacy single Service administration systems were used to produce the statistics.
3.2 Frequency of data collection
Extracts are taken from JPA each month and stored on separate databases to form a time series.
3.3 Data collection
The extracts are taken six calendar days after the end of the month and the situation as at the first of the month is calculated. This ensures most late reporting is captured.
3.4 Data validation
The data goes through a series of automatic validation checks and edits to ensure its basic quality, and a series of derived fields is calculated.
The data is then made available to Analysis (Tri-Service) single Service manpower branches. They undertake a wide range of validation checks and implement specialist editing rules using their expert knowledge and experience as well as data obtained from other sources within the Department.
3.5 Data compilation
Once the data is confirmed as being accurate the database is queried to produce the range of tables published. These tables undergo several layers of scrutiny to ensure the outputs are accurate and consistent.
The statistics are counts of military reserve personnel by a range of categories, including breakdowns by: Navy, Army and RAF; regular and volunteer reserves; officer and other ranks; trained and untrained. There are also counts of personnel leaving and joining certain parts of the reserve forces.
Between 2007 and 2012, Naval Service volunteer reserve data were provided direct to Analysis (Tri-Service) by the unit responsible for administering those forces. In 2012 this database was closed and records were transferred on to JPA; however, the JPA data were not considered robust enough in time for the April 2012 publication and therefore the information was again sourced from the administration unit. Following work to assess the quality of the JPA data, the data processing and reporting methods were brought into line with those in the other Services (and other Defence Statistics personnel data). As a result, Analysis (Tri-Service) has been able to retain monthly extracts for statistical purposes since October 2012 and can now report from this source.
4. Quality Management
This section briefly describes the overarching processes in place to manage quality (e.g. annual risk assessments) and outlines the results of recent quality assessments.
4.1 Quality Assurance
The MOD’s quality management process for Official Statistics consists of three elements: (1) regularly monitoring and assessing quality risk via an annual assessment; (2) providing a mechanism for reporting and reviewing revisions/corrections to Official Statistics; (3) ensuring BQRs are published alongside reports and are updated regularly.
4.2 Quality Assessment
At the time of the last Quality Assessment in November 2021, the Quarterly SPS received ‘N/A’ as the overall summary in terms of quality risk.
5. Relevance
This section is about the degree to which the statistical product meets user needs in both coverage and content.
Analysis (Tri-Service) frequently meets with customers within the Department to discuss data, results, interpretation and any changes to requirements. They also seek feedback from a wider range of internal and external customers. A key use of these statistics is to allow the Department and the public to assess how the Department is progressing under FR20, which is a top level Departmental programme.
We have made our own assessment of what these statistics could be used for using the categorisation in the UKSA paper The Use Made of Statistics.
We believe the statistics could be used as follows:
i. Informing the general public’s choices:
a. about the performance of government and public bodies
ii. Government decision making about policies, and associated decisions about related programmes and projects:
b. policy monitoring
The underlying data also allow for:
iii. Government decision making about policies, and associated decisions about related programmes and projects:
c. policy making
iv. Facilitating academic research.
The Service Personnel Report should be used as the authoritative source of Reserves’ strengths and flows statistics.
The MOD has recently consulted on changes to the definition of Army trained strength and the resultant impact on Analysis (Tri-Service) publications, and a consultation response was published on 7 November 2016. Changes have been implemented into Monthly Service Personnel Statistics from the 1 October 2016 publication onwards.
A formal consultation on Changes to MOD Armed Forces Personnel Statistics ran from 20 March to 16 April 2015 and from 27 May to 18 June 2015. It followed an internal review of the content of MOD Tri-Service publications and sought further user views on proposed changes to publications.
Detailed information on previous consultations can be found via the National Archives
Users are also encouraged to provide feedback on statistics produced by Analysis (Tri-Service) and also to sign up to the mailing list for their publication of interest, to receive updates to the statistics or to be made aware of any changes: Analysis-Tri-Service@mod.gov.uk
The principal customers for the Tri-Service publications are within the People area of the Ministry of Defence. They are used to inform and measure Service personnel strategy in areas such as pay & allowances and overall troop numbers, and in particular the Future Reserves 2020 Programme (FR20). This product is also used to answer parliamentary questions and Freedom of Information requests. The publications are also used to inform the monthly Defence Board Management Information.
From 2013, reserve strengths and FR20 population strengths and flows information were added to the Quarterly Personnel Report due to the increased attention on reserve personnel in the media and public.
In 2013 TSP7 Reserve Forces and Cadets publication was re-developed in consultation with internal users and suppliers to ensure that it reflected the correct force structures and therefore could be used to monitor the Department’s progress against FR20. Information on the population most relevant to the FR20 key personnel targets was added – including the definition of that population and information on how many have completed training. In addition, a greater range of detail on reserve personnel was added, including age and ethnicity, to provide further background information on these forces to help inform policy.
From 2015 Reserves statistics are being reported on a quarterly basis in the Service Personnel Report. This increase in frequency further reflects the increased public focus on the reserve forces. In addition to this, Reserves diversity statistics previously reported in TSP 7 Reserve Forces and Cadets were moved to the Diversity Statistics publication.
Data availability is not complete for dates prior to April 2012 due to the necessary changes made to reflect modern force structures and the fact that the Department has not retained some information (i.e. Naval Service reserves data and some ex-Regular reserves data). There is no possibility of restoring the whole time series.
Monthly extracts of strengths from JPA were retained from 1 April 2012 for the Army Reserve and 1 October 2012 for the Maritime Reserve and RAF Reserves. It is therefore not possible to report Future Reserves 2020 flows prior to this period.
6. Accuracy and Reliability
This section is about the differences between the estimates and the unknown true values.
6.1 Overall Accuracy
All personnel in the Regular Armed Forces must be recorded on JPA in order for them to receive their pay, and this is also the case for the vast majority of volunteer reserves (although for a small number it may not be). There is anecdotal evidence that records can take some time to be updated, although the Department has committed resources to ensuring that JPA is brought and kept up to date in each of the three Services. The data on the overall numbers of volunteer reserves are therefore considered to be reasonably accurate.
The Department has devoted resources at various levels to improving the quality and coverage of volunteer reserves data on JPA over the previous few years. This work has resulted in greater confidence in the statistics published in this report, and in its expansion to include information that was not published before 2013 (e.g. trained status for volunteer reserves).
Monthly datasets are passed through a range of automatic and manual validation and editing routines in order to make the key fields as accurate as possible, often drawing upon alternative data sources. Analysis (Tri-Service) dedicated reserve forces analysts have worked with producers to ensure that information is recorded and processed in line with agreed rules and definitions, and that data are retained and stored appropriately for statistical purposes. Analysis (Tri-Service) monitors data and outputs and will query apparent anomalies with producers.
There remain variations in quality which are partly due to differences in the recording of information - some key information is required for managing individuals, whereas other information is left to the individual to complete through a self-service tool. This is believed to be a particular problem for reservists, whose attendance at ongoing training is intermittent, and in many cases at locations where access to the system is difficult. This may impact the coverage of certain fields such as ethnic origin and nationality. Analysis (Tri-Service) monitors coverage in these fields and will only publish figures where coverage is above an appropriate threshold.
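A minimal sketch of such a coverage check is shown below; the 90% threshold and field values are illustrative assumptions only, as the actual threshold used is not stated here.

```python
def coverage(records: list[dict], field: str) -> float:
    """Proportion of records with a usable (non-missing) value for `field`."""
    known = sum(1 for r in records if r.get(field) not in (None, "", "Unknown"))
    return known / len(records) if records else 0.0

def publishable(records: list[dict], field: str, threshold: float = 0.90) -> bool:
    """Publish a breakdown only when coverage of the underlying field is high enough."""
    return coverage(records, field) >= threshold

# Hypothetical reserve records with self-reported ethnicity
records = [{"ethnicity": "White"}, {"ethnicity": None},
           {"ethnicity": "Asian"}, {"ethnicity": "Black"}]
print(coverage(records, "ethnicity"))     # 0.75
print(publishable(records, "ethnicity"))  # False: below the illustrative 90% threshold
```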
There is also a reasonable amount of late reporting which can adversely impact the statistics, especially for areas such as exits and changes to trained status. Obtaining the extract on the sixth calendar day and then calculating the strength at the first of the month overcomes some of this late reporting.
Another source of variation in data quality is the development and use of standalone systems (i.e. outside the JPA system), and manual recording of personnel data within some units, particularly affecting the data between 2007 and 2012. For this reason, Naval Service Volunteer Reserve data up to and including April 2012 are considered estimates and are likely to remain so as there is no alternative data source. During 2013, considerable effort was made to use JPA as the primary source of personnel management information, with the result that all data for volunteer and regular reserve personnel are now sourced from JPA.
6.2 Data Revisions
Data revisions are handled in accordance with the MOD’s Official Statistics Revisions and Corrections Policy.
Due to processing errors, it has been necessary to perform numerous revisions to Reserves’ statistics in various editions of the Quarterly SPS, specifically those published after May 2021, and most recently the January 2022 edition.
7. Timeliness and Punctuality
This section reports on the time gap between publication and the reference period (timeliness) and the gap between planned and actual publication dates (punctuality).
7.1 Timeliness
The quarterly extracts are taken on the sixth calendar day. The editing and production process usually takes around 2 weeks at the single Service level. It then takes a further 2-4 weeks to compile these data at a tri-Service level.
7.2 Punctuality
The following table provides an example of the timeline for 1 July 2022 Statistics.
Publication | Situation Date | Publication Date |
---|---|---|
Service Personnel Statistics | 1 July 22 | 15 September 22 |
Historic and planned publication dates can be found on the Publication Release Dates section of the GOV.UK Statistics at MOD webpage and on the UK National Statistics Publication Hub
8. Coherence and Comparability
This section examines: the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar (coherence); and the degree to which data can be compared over time and domain (comparability).
The Analysis (Tri-Service) statistics on UK Reserve Forces are the definitive personnel statistics in the MOD. Volunteer reserve statistics broken down by reserve type, trained status and officer / other rank are published quarterly in the Service Personnel Report. There are no other publicly available regular publications on the numbers of UK Reserve Forces with which to ensure coherence. Within the MOD direct queries of the Joint Personnel Administration system will produce markedly different numbers due to timing and quality issues.
The UK Reserve Forces personnel statistics are not always directly comparable with other countries’ statistics due to definitional differences in what constitutes a Reserve Force.
Annual editions of UK Defence Statistics since 1992 and historic Tri-Service publications back to 2001 are available at Statistics at MOD by following the link to “archived statistics”. The total number of Service personnel is comparable across time, but the breakdowns are generally not comparable due to changes to the physical and financial structures of the MOD. The introduction of JPA across 2006/07 also impacted the statistics, reducing availability, accuracy and coverage due to a lack of investment in JPA for reserve personnel, as the Regular forces were prioritised. Work during 2012 and 2013, as outlined above, resulted in changes to the statistics, both in data processing and definitions. Analysis (Tri-Service) has endeavoured to match historical data to the current definitions; where this has not been possible the tables are marked with a break in series.
9. Accessibility and Clarity
The reports are published at Statistics at MOD and are currently available in Excel spreadsheet and PDF format. They can be found by first following the “statistics by topic” link and then under the “Personnel statistics” heading. They can also be accessed via the Statistics release calendar or through an internet search engine such as Google.
10. Trade-offs between Output Quality Components
This section reports the extent to which different aspects of quality are balanced against each other
The main trade-off is between timeliness and quality. To ensure statistics are timely the editing and validation process is restricted to around three weeks. We do not have the resources to investigate further so we publish to a level that we feel is appropriate. Self-reported data are most often published as-is.
Due to the nature of the reserve forces as outlined above, the data are unlikely to be as robust as that for their regular counterparts. For example, information on whether individuals are considered trained or not is wholly reliant on Unit Admins updating the relevant data in a correct and timely fashion. Great effort has been made by the Department to educate Unit Admin staff, however they are often volunteer reservists themselves and subject to time and access pressures. The information is published acknowledging that it may never be 100% accurate but it represents the best information available, and there has been considerable improvement since 2012.
11. Cost and Respondent Burden
This section is about the effectiveness, efficiency and economy of the statistical output.
DS has four branches dedicated to producing information relating to personnel and providing analysis and advice. However, the majority of time is spent on adding value through analysing, forecasting and answering ad hoc queries rather than producing the National Statistics per se. Some of Analysis (Tri-Service)’s other branches (there are approximately 15) provide support to the manpower branches. Three full-time equivalent statistical officers (one of whom is Army-focused) are currently dedicated to producing and compiling reserve forces information.
There is some respondent burden as not all data are automatically obtained from administrative systems.
12. Confidentiality and Security
This section is about the procedures and policy used to ensure sound confidentiality, security and transparent practices.
12.1 Confidentiality – Policy and Data Treatment
Disclosure control is conducted on all statistical information provided by the MOD to safeguard the confidentiality of individuals.
All published outputs are counts of individuals in particular groupings. Where there are possible disclosure issues in reporting protected characteristics, outputs are rounded according to Analysis (Tri-Service) rounding policy, which prevents disclosure of information on individuals.
12.2 Security
All staff involved in the production process comply with the Data Protection Act; all MOD, Civil Service and data protection regulations are adhered to. The data is stored, accessed and analysed using the MOD’s restricted network and IT systems, and access to raw data is password protected.
The Analysis (Tri-Service) website can be accessed here: Statistics at MOD
Updated: 15 December 2022
Background Quality Report: MoD Sponsored Cadet Force Statistics
1. Contact
The Analysis (Tri-Service) Head of Branch is the Responsible Statistician for these statistics.
Analysis (Tri-Service)
Ministry of Defence
Floor 3 Zone M
Main Building, Whitehall
London SW1A 2HB
E-mail: Analysis-Tri-Hd@mod.gov.uk
Website: Statistics at MOD
2. Introduction & Statistical Presentation
The Ministry of Defence (MOD) publishes a wide range of personnel statistics. The main purposes of these statistics are to inform policy and decision making within the Department, to measure the performance of the Ministry of Defence against Government and Parliament targets, and to inform general debate in Government, Parliament and the wider public.
This background quality report covers the cadet forces statistics published on the Analysis (Tri-Service) website on GOV.UK: Statistics at MOD
Historic reports can be found on the archived Analysis (Tri-Service) website on the National Archives site.
3. Statistical Processing
3.1 Source Data
Personnel statistics are derived from legacy single Service administration systems.
Cadet data are provided to Analysis (Tri-Service) by Reserve Forces and Cadets and are sourced from the Cadet Management Information System.
3.2 Frequency of data collection
Extracts are taken from the Cadet Management Information System annually.
3.3 Data collection
The extracts are taken six calendar days after the end of April and the situation as at the first of April is calculated. This ensures most late reporting is captured.
3.4 Data Validation
The data goes through a series of automatic validation checks based on previous corrections.
The data is then made available to experts in each service where they undertake a range of checks using their expert knowledge and experience as well as data obtained from other sources within the Department.
These tables undergo several rounds of checking and scrutiny to ensure the outputs are accurate and consistent, before being published on an annual basis.
3.5 Data Compilation
Once the data is confirmed as being accurate the database is queried to produce the range of tables published. These tables undergo several layers of scrutiny to ensure the outputs are accurate and consistent.
These statistics are primarily counts of strengths (numbers of personnel at the 1st of April), broken down into the following populations / characteristics of interest:
- Community Cadets or Combined Cadet Force
- Service (Royal Navy/Royal Marines, Army, and Royal Air Force)
- Cadets and Cadet Forces Adult Volunteers
- Gender
- Age
4. Quality Management
This section briefly describes the overarching processes in place to manage quality (e.g. annual risk assessments) and outlines the results of recent quality assessments.
4.1 Quality Assurance
The MOD’s quality management process for Official Statistics consists of three elements: (1) regularly monitoring and assessing quality risk via an annual assessment; (2) providing a mechanism for reporting and reviewing revisions/corrections to Official Statistics; (3) ensuring BQRs are published alongside reports and are updated regularly.
4.2 Quality Assessment
At the time of the last Quality Assessment in November 2021, the MOD Sponsored Cadet Forces publication received ‘N/A’ as the overall summary in terms of quality risk.
5. Relevance
This section is about the degree to which the statistical product meets user needs in both coverage and content.
Analysis (Tri-Service) frequently meets with customers within the Department to discuss data, results, interpretation, and any changes to requirements. They also seek feedback from a wider range of internal and external customers.
We have made our own assessment of what these statistics could be used for using the categorisation in the UKSA paper The Use Made of Statistics.
We believe the statistics could be used as follows:
i. Informing the general public’s choices:
a. about the performance of government and public bodies
ii. Government decision making about policies, and associated decisions about related programmes and projects:
b. policy monitoring
The underlying data also allow for:
iii. Government decision making about policies, and associated decisions about related programmes and projects:
c. policy making
iv. Facilitating academic research.
Detailed information on previous consultations can be found via the National Archives
Users are also encouraged to provide feedback on statistics produced by Analysis (Tri-Service) and also to sign up to the mailing list for their publication of interest, to receive updates to the statistics or to be made aware of any changes: Analysis-Tri-Service@mod.gov.uk
The principal stakeholders for these statistics are within the Chief of Defence Personnel area of the Ministry of Defence. They are also used to answer parliamentary questions and Freedom of Information requests.
For detail on pre-release access to Analysis (Tri-Service) publications please see the Analysis (Tri-Service) pre-release access list for the most up to date list of roles receiving pre-release access.
People in the roles with access receive pre-release access to the publication up to 24 hours in advance of publication.
These statistics were originally developed in close consultation with stakeholders. The Quarterly, Monthly and Annual Personnel Reports and Bulletins were reviewed by the UK Statistics Authority (UKSA) in 2013 to ensure they met the requirements of a National Statistic. This review led to the inclusion of more commentary and where possible references to relevant policy to provide greater context to the figures reported.
There are no known unmet user needs.
6. Accuracy and Reliability
This section is about the differences between the estimates and the unknown true values.
Regular feedback ensures Analysis (Tri-Service) staff are kept abreast of any changes or potential issues with the data and statistics, which is fed into the data validation and editing process.
7. Timeliness and Punctuality
This section reports on the time gap between publication and the reference period (timeliness) and the gap between planned and actual publication dates (punctuality).
The annual extracts are taken on the first working day of April. The editing and production process usually takes about three weeks at the single Service level. It then takes a further four weeks to compile these data at a Tri-Service level and publish them as National Statistics.
In 2017, the report was delayed from May until July to ensure the quality of the publication.
In 2020 and 2021, the report was delayed from May until June as a precautionary measure to mitigate the impact of COVID-19.
Historic and planned publication dates can be found at Statistics at MOD and on the UK National Statistics Publication Hub.
8. Coherence and Comparability
This section examines: the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar (coherence); and the degree to which data can be compared over time and domain (comparability).
Analysis (Tri-Service) published statistics on UK Armed Forces are the definitive personnel statistics in the MOD. There are no other available data sets with which to ensure coherence.
Annual editions of UK Defence Statistics compendium dating back to 1992, plus historic Tri-Service publication TSP07 dating back to 2002, are available in the National Archives.
9. Accessibility and Clarity
This section reports on: the ease with which users are able to access the data, the format in which the data are available, and the availability of supporting information (accessibility); and the quality and sufficiency of the metadata, illustrations and accompanying advice (clarity).
Current publications consist of detailed Excel tables containing a historic time-series of statistics, a PDF report containing commentary, graphs and tables on trends in the statistics and an accessible HTML version of the publication. The commentary in our reports identifies and analyses the key changes in the data and provides summary statistics and policy context. Graphs, tables and other data visualisation methods are used to further explain these trends.
Previous Analysis (Tri-Service) personnel reports are published on GOV.UK and are available as PDFs or Excel value copies. Other formats may be possible for Analysis (Tri-Service) to produce on request.
All Analysis (Tri-Service) publications that use these data can be found under the “Military” and “Combined military and civilian” sections under the “Personnel statistics” heading on the MOD National and Official Statistics by topic webpage.
They can also be accessed via the Statistics release calendar.
Copies of the reports are also placed in the House of Commons library.
10. Trade-offs between Output Quality Components
This section reports the extent to which different aspects of quality are balanced against each other.
The main trade-off is between timeliness and quality. To ensure statistics are timely the editing and validation process is restricted to around two weeks and a significant amount of automatic editing is utilised. Spending more time investigating every suspect individual personnel record could marginally improve quality at a detailed trade/rank level but is unlikely to impact the aggregated statistics published in our reports.
The COVID-19 pandemic resulted in the suspension of face-to-face cadet activity in 2020 and 2021, which severely disrupted engagement with Cadets and Adult Volunteers. This directly impacted the recording and processing of applications to join each respective Cadet Force. Some administrative processes are likely to have been delayed due to COVID-19.
11. Cost and Respondent Burden
This section is about the effectiveness, efficiency and economy of the statistical output.
Analysis (Tri-Service) has four branches dedicated to producing information relating to manpower and providing analysis and advice; the majority of time is spent on adding value through analysing, forecasting and answering ad hoc enquiries, as well as producing the National Statistics.
There is very little respondent burden as the majority of the data is automatically obtained from administrative systems. However, this is supplemented with small amounts of data as well as input from other areas within the MOD.
12. Confidentiality, Transparency and Security
This section is about the procedures and policy used to ensure sound confidentiality, security and transparent practices.
12.1 Confidentiality – Data Treatment
All published outputs are counts of individuals in particular groupings. Where there are possible disclosure issues in reporting protected characteristics, outputs are rounded according to Analysis (Tri-Service) rounding policy, which prevents disclosure of information on individuals.
Disclosure control is conducted on all statistical information provided by the MOD to safeguard the confidentiality of individuals.
12.2 Transparency
The reports provide commentary on the key features of the outputs and identify any issues or caveats to the data. This quality report provides further information on the method, production process and quality of the output.
12.3 Security
All staff involved in the statistical production process adhere to all MOD, Civil Service and data protection regulations. The data is stored, accessed and analysed using the MOD’s restricted network and IT systems, and access to raw data is password protected and approval for access is granted only by the Head of Personnel Statistics.
References
The Analysis (Tri-Service) website can be accessed at Statistics at MOD
Updated: 15 December 2022
Background Quality Report: Armed Forces Application Statistics
1. Contact
The Analysis (Tri-Service) Head of Branch is the Responsible Statistician for these statistics.
Analysis (Tri-Service)
Ministry of Defence
Floor 3 Zone M
Main Building, Whitehall
London SW1A 2HB
E-mail: Analysis-Tri-Hd@mod.gov.uk
Website: Statistics at MOD
2. Introduction & Statistical Presentation
2.1 Overview
Analysis Tri-Service publishes a large range of Armed Forces (AF) personnel statistics, mainly to inform policy and decision making within the Department. The statistics are also used to measure performance against MOD, Government and Parliament targets and to inform general debate in Government, Parliament and the wider public.
This background quality report covers the Official Statistics specifically concerning the number of applications to serve in the AF received by each of the three Services. Application statistics were published quarterly until a new recruitment system, the Defence Recruitment System (DRS), was introduced. Application data since 1 October 2017 have not been available as Official Statistics. We have verified and applied quality assurance to the received data, and data at quarterly points from 1 July 2019 onwards are available in the Service Personnel Statistics publication.
An application is the first formal submission for scrutiny; it differs from an applicant, since an applicant may submit more than one application. Whilst application counts for each Service are based on online applications submitted by an individual and accepted by the Defence Recruitment System (DRS), work is ongoing to verify that application processes and definitions are consistent, and we would recommend that numbers should not be aggregated to show Armed Forces totals. This is discussed in more detail in the Statistical Processing section below.
3. Statistical Processing
3.1 Source Data
The Defence Recruitment System (DRS) is where recruitment data is currently held and is used to support the Armed Forces recruitment process.
3.2 Frequency of Data Collection
Record level data is provided on a quarterly basis to Analysis Tri-Service by the single Services’ (sS) recruitment teams from DRS.
3.3 Data Collection
Record level data provided by the sS recruitment teams are stored separately from the live database to provide a time series of historical data.
3.4 Data Validation
Analysis Tri-Service verifies, processes and collates the data into the format required for publication in accordance with UK Statistics Authority (UKSA) guidelines.
3.5 Data Compilation
DRS is an administrative data source since the data it holds were not primarily collected with statistical purposes in mind. The National Statistician’s Office (NSO) provides guiding principles around the use of administrative data for statistical purposes[footnote 1], with a particular focus on the statistician’s role in assuring and communicating the quality of administrative data used to produce official statistics. The guidance aims to ensure the requirements of the Code of Practice for Official Statistics, and the expectations of users, are met. It references the requirements of Protocol 3 of the Code of Practice, which specifically addresses the use of administrative sources for statistical purposes.
NSO guidance has been adhered to in the assessment of the data, and in subsequent work with data suppliers with regard to data validation. Further details regarding the work conducted can be found in the “Accuracy and Reliability” section below.
The statistics are counts of the number of applications to serve in the AF received by each of the three Services, providing a breakdown by Service, Regular and Volunteer Reserve Forces, and by Officer and Other Rank.
Due to subtle differences in the recruitment application processes, sS numbers should not be aggregated to show total AF applications. The table below sets out the differences in the three Services’ approaches:
Service | Definition |
---|---|
RN/RM | An application is defined as an online application submitted by an individual and accepted by Defence Recruitment System (DRS). |
Army | An application is an online application submitted by an individual and accepted by Defence Recruitment System (DRS). For an application to be submitted, the contact needs to have successfully registered on the Army website. Registrations are recorded by the RRP. |
RAF | An application is defined as an online application submitted by an individual and accepted by Defence Recruitment System (DRS). |
4. Quality Management
This section briefly describes the overarching processes in place to manage quality (e.g. annual risk assessments) and outlines the results of recent quality assessments.
4.1 Quality Assurance
The MOD’s quality management process for Official Statistics consists of three elements: (1) regularly monitoring and assessing quality risk via an annual assessment; (2) providing a mechanism for reporting and reviewing revisions/corrections to Official Statistics; (3) ensuring BQRs are published alongside reports and are updated regularly.
4.2 Quality Assessment
At the time of the last Quality Assessment in November 2021, the Quarterly SPS received ‘N/A’ as the overall summary in terms of quality risk.
5. Relevance
This section is about the degree to which the statistical product meets current and potential users’ needs in both coverage and content.
Analysis Tri-Service frequently meets with customers within the Department to discuss data, results, interpretation and any changes to requirements. They also seek feedback from a wider range of internal and external customers.
Users are also encouraged to provide feedback on Analysis Tri-Service through the annual consultation meetings.
Due to the high-profile nature of applications statistics, this information serves the Department’s need for consistent, reliable Official Statistics on which to base its press releases and responses to other queries.
We have made our own assessment of what these statistics could be used for using the categorisation in the UKSA paper The Use Made of Statistics. We believe the statistics could be used as follows:
i. Informing the general public’s choices:
a. about the performance of government and public bodies
ii. Government decision making about policies, and associated decisions about related programmes and projects:
b. policy monitoring
The underlying data also allow for:
iii. Government decision making about policies, and associated decisions about related programmes and projects:
c. policy making
iv. Facilitating academic research.
6. Accuracy and Reliability
This section is about the differences between the estimates and the unknown true values.
6.1 Overall Accuracy
Initially, three stages of recruitment were identified during the recruitment process and data scoping phase. These were: initial “contacts” made by individuals wishing to garner information about joining the AF; formal “applications” submitted to join the AF; and “potential entrants”, who have passed the selection criteria to join the AF and have received an offer of employment.
Subsequently it was recognised that there is often no distinct contact step, since candidates may go directly to the application stage, and applications submitted online cannot be linked back to a contact record. It is, therefore, not possible to identify a complete cohort of “contacts”. In addition, information pertaining to potential entrants is not generally collected by the sS. However, gathering applications data remained a feasible and appropriate recommendation.
The quality of applications data was assessed in terms of fitness for purpose and whether the conditions required for production as Official Statistics were met.
Data are collected on DRS to manage recruitment into the AF. All new entrants are required to formally apply and complete various stages of the recruitment process, and DRS is the live system which collates this information. The overall number of applications received is, therefore, judged to be sufficiently accurate, although in some circumstances manual input is required and some time lags are apparent.
The single Service Recruiting Teams have checked and verified that the applications data collated by Analysis (Tri-Service) are consistent with their internal products and accurate.
At the macro level, the existence and potential for data errors is currently assessed as not having a disproportionate impact on the quality of the resulting statistics. There is a potential for duplicate applications to be present in the published statistics as all applications have been counted, and multiple applications from the same applicant are accepted by the system. Applications used to test the application system have been removed as they are not true applications.
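As a purely illustrative sketch of this cleansing step (not the production process), the example below removes test applications and flags, but retains, potential duplicates; the field names `is_test`, `applicant_id` and `service` are hypothetical and do not reflect the actual DRS extract schema.

```python
# Illustrative sketch only: hypothetical field names, not the actual DRS extract.
import pandas as pd

def cleanse_applications(df: pd.DataFrame) -> pd.DataFrame:
    """Remove test applications and flag (but keep) potential duplicates."""
    # Drop records submitted purely to test the application system
    # (assumes a hypothetical boolean 'is_test' marker in the extract).
    df = df[~df["is_test"]].copy()

    # Multiple applications from the same applicant are accepted by the
    # system, so flag repeats rather than removing them from the counts.
    df["possible_duplicate"] = df.duplicated(
        subset=["applicant_id", "service"], keep="first"
    )
    return df
```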
The incentive and opportunity to distort the data supplied are considered minimal: DRS is maintained for operational purposes, with statistical use a secondary benefit, and the data are extracted directly from DRS via management information interfaces.
Once the sS recruitment teams have applied their specialist knowledge to extract the data in accordance with their specific definitions of an application, Analysis Tri-Service independently processes and collates the data into the format required for publication in accordance with UKSA guidelines.
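Purely as an illustration of this collation step, the sketch below counts record-level applications by the breakdowns used in the publication (Service, Regular/Reserve, Officer/Other Rank); the column names are hypothetical and the real process is carried out by Analysis Tri-Service to UKSA guidelines.

```python
# Illustrative sketch only: hypothetical column names.
import pandas as pd

def collate_for_publication(df: pd.DataFrame) -> pd.DataFrame:
    """Count applications by Service, Regular/Reserve and Officer/Other Rank."""
    return (
        df.groupby(["service", "regular_or_reserve", "officer_or_other_rank"])
          .size()
          .reset_index(name="applications")
    )
```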
6.2 Data Revisions
Data revisions are handled in accordance with the MOD’s Official Statistics Revisions and Corrections Policy.
The RAF applications figures in the 1 July 2021 and 1 October 2021 editions of the Quarterly Service Personnel Statistics have been revised due to a processing error.
7. Timeliness and Punctuality
This section reports on the time gap between publication and the reference period (timeliness) and the gap between planned and actual publication dates (punctuality).
The sS provide record-level data to Analysis Tri-Service on a quarterly basis, while each of the sS receives its extracts from DRS on a monthly basis.
The verification, processing and collating exercise takes Analysis Tri-Service approximately two weeks to complete. The statistics are published in the Service Personnel Statistics publication at the quarterly points, with a reporting lag of one quarter. There is a break in the time series between 1 October 2017 and 1 July 2018 due to the change from TAFMIS to DRS and the need to ensure consistency of the start date for reintroduced data between the sS.
Navy/RAF applications figures have not been updated in the 1 July 2022 and 1 October 2022 editions of the Quarterly Service Personnel Statistics. This is due to technical issues encountered while migrating the applications data to a new system; the full 12-month time series is incomplete and requires extensive quality control and further validation.
Army applications figures were updated in the 1 July 2022 edition of the Quarterly Service Personnel Statistics but not in the 1 October 2022 edition; the updated figures remain provisional and are subject to change. There are currently issues with the supply of the underlying data; all parties are working together to resolve these issues, and it is planned to publish the figures in a future edition.
8. Coherence and Comparability
This section examines: the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar (coherence); and the degree to which data can be compared over time and domain (comparability).
Analysis (Tri-Service) teams are the definitive source of UK AF statistics in the MOD. There are no other publicly available regular publications on the number of applications to serve in the AF received by each of the three Services with which to ensure coherence. However, these data may be publicly available in responses to Parliamentary Questions and Freedom of Information requests.
Due to the subtle differences in the recruitment application processes, single Service numbers should not be aggregated to show total AF applications. Further details about the different definitions can be found in the “Methodology and Production” section above.
In addition, the number of applications received does not directly relate to the intake figures Analysis Tri-Service compiles since there is a time-lag between an application being received and an individual successfully being taken onto untrained strength.
9. Accessibility and Clarity
This section reports on: the ease with which users are able to access the data, the format in which the data are available, and the availability of supporting information (accessibility); and the quality and sufficiency of the metadata, illustrations and accompanying advice (clarity).
The commentary reports were first published on 13 August 2015 and are available in PDF/HTML format; detailed tables of time series data in Excel format can be found via the same link.
10. Trade-offs between Output Quality Components
This section reports the extent to which different aspects of quality are balanced against each other.
Trade-offs exist between timeliness and quality; in addition, resource constraints in the training delivery organisations mean it is not always possible to correct known data issues.
To ensure statistics are timely, the editing and validation process is restricted to around two weeks. Spending more time investigating every suspect individual personnel record could marginally improve quality at a detailed trade/rank level but is unlikely to impact the aggregated statistics published. Data may include duplicate applications as all valid applications based on the definitions provided are counted. Duplicate applications are defined as multiple applications from a single applicant, intentional or otherwise. Data cleansing in regard to applications submitted to test the system has been undertaken and identified applications have been removed.
11. Cost and Respondent Burden
This section is about the effectiveness, efficiency and economy of the statistical output.
Analysis Directorate has four branches dedicated to producing information relating to personnel and providing analysis and advice; the majority of time is spent on adding value through analysing, forecasting and answering ad hoc enquiries, as well as producing the National Statistics.
There is very little respondent burden as the majority of the data is automatically obtained from administrative systems. However, this is supplemented with small amounts of data as well as input from other areas within the MOD.
12. Confidentiality, Transparency and Security
This section is about the procedures and policy used to ensure sound confidentiality, security and transparent practices.
12.1 Confidentiality
All published outputs are counts of individuals in particular groupings. The outputs are rounded according to Analysis Tri-Service’s rounding policy which reflects the degree of accuracy of the outputs and prevents disclosure of information on individuals. See the rounding policy on the Analysis Tri-Service policies page here: MOD statistics: policies
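As a purely illustrative sketch of disclosure-control rounding, assuming for illustration only that counts are rounded to the nearest 10 (the actual rules are those set out in the published rounding policy):

```python
# Illustrative sketch only: assumes rounding to the nearest 10 for illustration;
# the published Analysis Tri-Service rounding policy defines the actual rules.
def round_for_disclosure_control(count: int, base: int = 10) -> int:
    """Round a published count to the nearest multiple of `base`."""
    return base * round(count / base)

# Example: under this assumption, a cell count of 1,237 would be published as 1,240.
```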
12.2 Transparency
The reports provide commentary on the key features of the outputs and identify any issues or caveats to the data. This quality report provides further information on the method, production process and quality of the output.
12.3 Security
All staff involved in the statistical production process adhere to all MOD, Civil Service and data protection regulations. The data are stored, accessed and analysed using the MOD’s restricted network and IT systems; access to raw data is password protected, and approval for access is granted only by the Head of Personnel Statistics.
The Analysis (Tri-Service) website can be accessed here: Statistics at MOD
Updated: 15 December 2022
- National Statistician’s Office (2014), Using Administrative Data: Good Practice Guidance for Statisticians; UK Statistics Authority (2012), Principles to guide the Statistics Authority’s assessment of quality assurance practices relating to statistics produced from administrative data ↩