Fire and rescue incident statistics: Methodology and quality report
Updated 17 October 2024
Applies to England
First published 13 August 2020
1. Background to the statistics
The source of the data used for Home Office statistical publications involving incidents attended by fire and rescue services (FRSs) is the online Incident Recording System (IRS) which was introduced in April 2009. Full details of the questions and categories used in the recording of incidents attended by FRSs in the IRS are available in the document ‘IRS Questions and Lists’. This can be downloaded from our Guidance page.
1.1 Publications
The IRS data is used to produce seven publications per year, across the following series:
- Fire and rescue incident statistics, quarterly; presents trends in fires, fire-related fatalities and fire casualties, false alarms and non-fire incidents attended by fire and rescue services
- Detailed analysis of fires, annual; presents detailed statistics on fires attended by fire and rescue services, and fire-related fatalities and non-fatal casualties in those fires, including analyses of the causes of fires and smoke alarm operation
- Detailed analysis of non-fire incidents, annual; presents detailed statistics on non-fire incidents attended, for example emergency medical responding, flooding and road traffic collisions, and on the fatalities and casualties in those incidents
- Fire and rescue response times, annual; focuses on trends in average response times to different types of fires
The publication calendar for the fire statistics collection can be found on the GOV.UK website.
1.2 Quality Assurance (QA) Process
Data is collected in real time as firefighters respond to incidents and enter information into the IRS. Some fields are updated on a continuous basis as fire and rescue investigations proceed and new information is obtained.
The quality assurance processes in place are focussed on the accurate capture of data, consistency of recording, and the accurate transfer of processed data into a range of publications and published tables. If the underlying data is inaccurate then this has a significant impact on all its potential uses. Data that are widely used and that inform important and high-profile decisions will receive the highest level of QA. Other data will undergo a more limited, but proportional, level of QA. This ensures the data is fit-for-purpose in terms of the individual uses of each dataset.
In 2015 the UK Statistics Authority (UKSA) published a regulatory standard for the quality assurance of administrative data. To assess the quality of the data provided for this release, the Home Office has followed that standard. The standard is supported by an Administrative Data Quality Assurance Toolkit, which provides useful guidance on the practices that producers can adopt to assure the quality of the data they use. This section draws on that guidance, recognising that quality assurance of administrative data is more than just checking that the figures add up. It is an ongoing, iterative process to assess the data’s fitness to serve their purpose. It covers the entire statistical production process and involves monitoring data quality over time and reporting on variations in that quality.
An assessment of the level of risk based on the Quality Assurance Toolkit is as follows:
Risk/Profile Matrix
- Statistical Series: Fire and rescue incident statistics
- Administrative Source: IRS
- Data Quality Concern: Low
- Public Interest: High
- Matrix Classification: Medium Risk [A2]
The publication of Fire and rescue incident statistics can be considered high profile, as there is significant mainstream media interest, with moderate economic and/or political sensitivity. Data quality is considered a low concern given that the data are checked by providers and then quality assured in detail by the statisticians responsible for the publication, who perform further validation and checks, spotting and correcting any errors. These checks involve comparisons with data previously provided, published data and historical data.
Overall, the Fire and rescue incident statistics have been assessed as A2: Medium Risk. This is mainly driven by the high-profile nature of the figures.
There is a clear Memorandum of Understanding between the Home Office and FRSs outlining the terms and purpose of what data are provided.
Quality assurance by FRSs
The online IRS makes the following automatic checks:
- only the applicable questions are asked
- all dates and/or times are complete and in the correct format
- dates and/or times are in a valid order
- only appropriate options are displayed
The IRS cannot check that all the data is ‘logically correct’, because the range of possible incidents is too wide and there are many incidents which are unlikely but do occur. For example, chip pan fires tend to start in kitchens, but if a kitchen was being refurbished, it is possible to have a chip pan fire in a dining room. Examples like these can only be checked by having a process for quality control and assurance by the FRSs, which differs between FRSs.
For this reason, the IRS has a workflow where an incident is first ‘Recorded’ by the officer in charge (OIC) and then ‘Published’ after checking by the FRS quality assurance team, who carry out a check of the information being submitted. Only once the data has been ‘Published’ will it be quality assured by the Home Office.
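The kind of automatic completeness, format and ordering checks listed above can be illustrated with a minimal sketch in Python (the IRS applies equivalent rules within the online form itself; the field names and date format here are hypothetical):

```python
from datetime import datetime

DATETIME_FORMAT = "%d/%m/%Y %H:%M"  # assumed format for this sketch only

def validate_times(record: dict) -> list:
    """Return validation messages for the date/time fields of an incident record."""
    errors = []
    parsed = {}
    # Completeness and format checks (hypothetical field names).
    for field in ("time_of_call", "time_arrived", "time_closed"):
        value = record.get(field)
        if not value:
            errors.append(f"{field} is missing")
            continue
        try:
            parsed[field] = datetime.strptime(value, DATETIME_FORMAT)
        except ValueError:
            errors.append(f"{field} is not in the expected format")
    # Ordering checks: times must be in a valid order (call, then arrival, then close).
    if "time_of_call" in parsed and "time_arrived" in parsed:
        if parsed["time_arrived"] < parsed["time_of_call"]:
            errors.append("arrival time is before the time of call")
    if "time_arrived" in parsed and "time_closed" in parsed:
        if parsed["time_closed"] < parsed["time_arrived"]:
            errors.append("closing time is before the arrival time")
    return errors

# Example: an out-of-order arrival time is flagged.
print(validate_times({"time_of_call": "01/06/2024 14:30",
                      "time_arrived": "01/06/2024 14:10",
                      "time_closed": "01/06/2024 15:00"}))
```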
FRSs have been provided with a priority list of fields that should be checked, and the rationale for checking:
For all incidents, the following fields should be checked:
- What type of property was involved? The property type is a key field used for reporting and analysis. The IRS has many property types (305), which allows detailed analysis; however, the large number of options can make it difficult to find the correct one.
- Address and location information. It is imperative that the incident location is recorded as accurately as possible; the OIC should confirm that the address (if applicable) and location are accurate even if the data has been pre-populated. It may often be necessary to alter the location received from the mobilising system, e.g. for outdoor fires, skip fires or fires at large premises. This can be done by confirming the location on the map.
- Casualty information. Incidents involving casualties are of particular importance, so the OIC should check carefully that all the details provided are correct. Incidents involving casualties represent less than five per cent of all incidents.
In addition, for primary fires:
- Reason alarm system did not function as intended
- What was the cause of the fire?
- What was the source of ignition? This question has many options (81) and so care must be taken to ensure the correct option is selected.
- What item was ignited first? This question has many options (47) and so care must be taken to ensure the correct option is selected.
- What was the item, if any, that was mainly responsible for the spread of the fire? This question has many options (47) and so care must be taken to ensure the correct option is selected.
- What type of room/compartment did the fire start in (Location of Origin)? This question has many options (up to 53) and so care must be taken to ensure the correct option is selected. This data is used for reporting and analysis.
In addition, for special services: Special Services Incident type
In addition, for false alarms: What was the reason for the False Alarm?
Quality assurance by Home Office statisticians
Data received by the Home Office undergo a quality assurance process to ensure the data is fit-for-purpose and published to the highest possible standard. Any data quality issues are flagged and subsequently resolved with FRSs.
Home Office statisticians run some specific checks during a ‘monthly monitoring’ process. This looks for data gaps and runs variance checks to identify figures that seem unusually large or small compared with figures for the same month in the previous year, or unusual patterns in the data. A large increase or decrease is flagged to the FRS, which is asked for an explanation, for example unusually hot, dry weather leading to more outdoor fires than usual. In some instances, this leads to FRSs uploading further incidents or becoming aware of the need to delete duplicates. Where a significant number of incidents are missing, the tables are footnoted accordingly. For example, in May 2018, it was footnoted that the figures for Suffolk in April 2017 to March 2018 Q3 were a known underestimate because records had been relatively slow to be added to the IRS. The revisions policy ensures that these records were included in subsequent tables.
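As an illustration only, the year-on-year variance check described above might look something like the following sketch in Python (the FRS names, counts and the 20 per cent threshold are assumptions, not the actual figures or tolerances used):

```python
import pandas as pd

# Illustrative monthly incident counts by FRS; in practice these come from the IRS.
counts = pd.DataFrame({
    "frs": ["FRS A", "FRS A", "FRS B", "FRS B"],
    "month": ["2018-07", "2019-07", "2018-07", "2019-07"],
    "incidents": [1200, 1650, 900, 910],
})

by_frs = counts.pivot(index="frs", columns="month", values="incidents")
by_frs["change_pct"] = (by_frs["2019-07"] - by_frs["2018-07"]) / by_frs["2018-07"] * 100

# Flag any FRS whose total for the month differs from the same month last year by
# more than the (assumed) threshold; flagged FRSs are asked for an explanation.
flagged = by_frs[by_frs["change_pct"].abs() > 20]
print(flagged)
```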
When the quarterly snapshot is taken for analysis purposes, further checks are carried out. For example, the additional comments field is accessed to check for fatalities that are marked as not due to the fire (e.g. where a person has died in a road traffic collision and the vehicle catches fire afterwards) and the coding checked.
So that the reliability of FRSs’ data can be ensured, an annual reconciliation exercise is conducted: each FRS is asked to check its own data systems for the number of incidents, fires, false alarms, non-fire incidents, non-fatal casualties and fire-related fatalities for the current and previous financial year, to confirm the numbers and to supply revised figures where necessary. In addition, FRSs are asked to compare their fatality figures against another data source, for example fire investigation reports, to ensure there are none missing. Details of any known data quality issues are included in the relevant part of the bulletin and/or data tables.
Where an error is spotted, Home Office statisticians consider whether this is likely to be an FRS-specific issue, or whether other FRSs should be notified to ensure consistency of data between FRSs. Recent examples of issues spotted, and the action subsequently taken, include the way FRSs deal with incidents that take place ‘over the border’, which led to Home Office statisticians sending a circular to all FRS data leads to clarify the process.
Home Office statisticians extract information for reports and ad hoc analyses from the IRS using SQL. Results are exported to Excel as an intermediate step before preparing the final published product.
The SQL code is often complex and so needs to be well documented, with considerable attention to detail. As part of the processing of data, the SQL code includes routines to produce frequencies and data extracts of existing and derived variables. This acts as a double-check for potential errors occurring as a result of recoding and aggregation, mitigating the risk of errors being introduced in the processing stage.
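The production routines are written in SQL, but the frequency cross-check they perform can be illustrated with a sketch on an extracted dataset (Python; the field names and recoding are hypothetical):

```python
import pandas as pd

# Hypothetical extract: an existing IRS field and a derived reporting group.
extract = pd.DataFrame({
    "property_type": ["House - single occupancy", "Purpose built flat", "Car", "Grassland"],
})
recode = {
    "House - single occupancy": "Dwelling",
    "Purpose built flat": "Dwelling",
    "Car": "Road vehicle",
    "Grassland": "Outdoor",
}
extract["property_group"] = extract["property_type"].map(recode)

# Frequencies of the existing and derived variables act as a double-check: a
# category missing from the recode would appear as a gap in the derived
# frequencies and as a failure of the completeness check below.
print(extract["property_type"].value_counts())
print(extract["property_group"].value_counts())
assert extract["property_group"].notna().all(), "some property types were not recoded"
```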
Once the Excel tables are produced for the publication, they are independently checked by a second member of the team against the raw data. The checker also works through a clear checklist for each table: running standard variance checks (for example, is this a significant increase from last year?); checking that totals are consistent within and between tables and across products; and checking that totals, percentages and other figures within tables are calculated correctly, that hyperlinks work, and that formulae and drop-down lists update correctly. Each check is systematically signed off when it has been completed.
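A minimal sketch of the within-table and between-table consistency checks and the standard variance check (Python; the table structures and figures are illustrative, not real published values):

```python
import pandas as pd

# Hypothetical published tables reduced to DataFrames for illustration.
fires_by_type = pd.DataFrame({
    "fire_type": ["Primary", "Secondary", "Chimney"],
    "count": [70_000, 80_000, 3_000],
})
incidents_by_group = pd.DataFrame({
    "group": ["Fires", "False alarms", "Non-fire incidents"],
    "count": [153_000, 230_000, 170_000],
})

# Between-table check: fire types should sum to the fires total in the other table.
fires_total = fires_by_type["count"].sum()
fires_reported = incidents_by_group.loc[incidents_by_group["group"] == "Fires", "count"].item()
assert fires_total == fires_reported, "fire totals are inconsistent between tables"

# Standard variance check against the previous year's figure (illustrative threshold).
previous_year_fires = 150_000
change_pct = (fires_total - previous_year_fires) / previous_year_fires * 100
if abs(change_pct) > 10:
    print(f"Large year-on-year change in fires ({change_pct:.1f}%): investigate before sign-off")
```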
Home Office statisticians are responsible for checking that the commentary appropriately describes the trends seen in the data, is not biased, and is consistent with the tables. Reports and tables are checked for accessibility using tailored guidelines and a checklist drawn up from the Web Content Accessibility Guidelines.
Reports are peer reviewed and signed off at a senior level prior to publication.
The data underpinning publications is held in a snapshot form so that the content of publications can be replicated and that the option remains for additional historical scrutiny and analysis if required.
Feedback received may include queries on the data itself, but more often will ask for reasons behind levels or changes in the data, in order to understand trends or unusual figures. It is rare for errors to be made in publications. When made, these are corrected either immediately or in the next release (depending on severity and frequency) in line with our revisions policy.
National Statistics status
National Statistics status means that our statistics meet the highest standards of trustworthiness, quality and public value, and it is our responsibility to maintain compliance with these standards.
The designation of these statistics as National Statistics was confirmed in June 2012 following a full assessment against the Code of Practice by the then UK Statistics Authority.
Since the review we have continued to comply with the Code of Practice for Statistics and have made the following improvements:
- reviewing the content, labelling and timing of outputs in 2017, leading to the introduction of the quarterly headline statistic publications, a new publication on detailed analysis of non-fire incidents, and the renaming of publications to better describe their content
- liaising with IRS users at the Fire Statistics User Group meetings to enhance our understanding of how the statistics and recording system are used so that they remain relevant and suitable for their purpose
- undertaking a user consultation for our Response times publication, to clarify the incidents included, and to ask for feedback on the publication generally
- automating more of our table production to reduce the opportunity for manual error
- improving the GOV.UK publication page by collating all the data tables in one place and ensuring the previous versions of tables are available on an archive page
- investigating and improving coding and checking of particular fields, e.g. domestic appliance brands
- introduction of raw data sheets within the Excel data tables with drop-down selection boxes to enable users to run their own pivot tables
- introduction of incident level datasets (ILDs) to enable users to run their own more detailed analysis
- improving guidance supplied to FRSs by encouraging them to email the IRS Helpdesk or the Fire Statistics inbox and opening a dialogue regarding the current guidance and specific, perhaps unusual, incidents, for example very large wildfires
1.3 Revisions policy
Non-Scheduled Revisions
The Home Office corrects and revises data in accordance with its ‘Statement of compliance with the Code of Practice for Official Statistics’.
Where a substantial error has occurred as a result of the compilation, imputation or dissemination process, the statistical release, live tables and other accompanying releases will be updated with a correction notice as soon as is practical.
Scheduled Revisions
Changes to the data sources used in the releases are incorporated in the next scheduled release of data.
The IRS is a continually updated database, with FRSs adding incidents daily. The figures in the published releases refer to records of incidents that were submitted to the IRS by a specific date, when a snapshot of the database was taken for the purpose of analysis. As such, we note in the release that the statistics published may not match those held locally by FRSs and revisions may occur in the future. This is particularly the case for statistics with relatively small numbers, such as fire-related fatalities. For instance, a coroner’s report may mean the initial view taken by the FRS needs to be revised; this can take many months, or even years.
The statistical releases using IRS data are set up to take revisions to data from year ending March 2011 onwards.
Table 1 shows the extent of revisions made to the number of incidents attended by FRSs in England, for recent years, by quarter. It compares the data extracted from the IRS on 15 March 2020 (and published on 14 May 2020) with the same set of data extracted on 14 June 2020 (and published on 13 August 2020). This analysis confirmed that the extent of further amendment to IRS records is minimal on an annual basis and at England level, giving users confidence that the published statistics provide a sufficiently accurate measure of incidents.
Table 1 The extent of revisions of the total number of incidents attended by FRSs in England
Quarter | Total incidents as at 15 March 2020 | Total incidents as at 14 June 2020 | Absolute difference | Percentage increase |
---|---|---|---|---|
2016/17 Apr, May, June | 137,038 | 137,041 | 3 | 0.00% |
2016/17 July, Aug, Sept | 154,248 | 154,249 | 1 | 0.00% |
2016/17 Oct, Nov, Dec | 143,490 | 143,491 | 1 | 0.00% |
2016/17 Jan, Feb, Mar | 125,721 | 125,723 | 2 | 0.00% |
2017/18 Apr, May, June | 153,272 | 153,279 | 7 | 0.00% |
2017/18 July, Aug, Sept | 146,628 | 146,632 | 4 | 0.00% |
2017/18 Oct, Nov, Dec | 140,516 | 140,528 | 12 | 0.01% |
2017/18 Jan, Feb, Mar | 126,036 | 126,036 | 0 | 0.00% |
2018/19 Apr, May, June | 146,183 | 146,201 | 18 | 0.01% |
2018/19 July, Aug, Sept | 171,691 | 171,710 | 19 | 0.01% |
2018/19 Oct, Nov, Dec | 134,203 | 134,214 | 11 | 0.01% |
2018/19 Jan, Feb, Mar | 124,249 | 124,266 | 17 | 0.01% |
2019/20 Apr, May, June | 143,515 | 143,578 | 63 | 0.04% |
2019/20 July, Aug, Sept | 153,343 | 153,614 | 271 | 0.18% |
2019/20 Oct, Nov, Dec | 134,652 | 135,258 | 606 | 0.45% |
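The final two columns of Table 1 follow directly from the two snapshot counts; for example, for October to December 2019:

```python
# Revision between the two snapshots for 2019/20 Oct, Nov, Dec (Table 1).
march_snapshot = 134_652  # total incidents as at 15 March 2020
june_snapshot = 135_258   # total incidents as at 14 June 2020

absolute_difference = june_snapshot - march_snapshot             # 606
percentage_increase = absolute_difference / march_snapshot * 100
print(f"{absolute_difference} incidents, {percentage_increase:.2f}% increase")  # 0.45%
```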
2. Quality summary
2.1 Relevance
The degree to which the statistical product meets user needs in both coverage and content.
The statistics provide information on incidents attended by FRSs in England. Some information is available and published back to April 1981 to March 1982, although consistent and comparable information is not available for many variables until April 2009 to March 2010.
In addition to the quarterly and annual reports, a series of over 40 data tables are published in Excel format on GOV.UK. These tables are a rich source of trend data and are broken down by fire and rescue authority and geographical categories, such as urban or rural and Metropolitan or non-Metropolitan.
The data is used extensively by policy colleagues to inform decisions and advise ministers. They have been used, for example, to inform the fire reform agenda and analysis on fires in purpose-built flats following the Grenfell Tower fire. They continue to feed into the Dame Hackitt review and MHCLG Grenfell-related work. In addition, FRSs use the data to monitor and benchmark performance, to make strategic decisions, for planning and risk assessment and to identify the most vulnerable groups to target for fire prevention activity. Other interest and uses of this data are outlined in the section on Uses and Users.
We review our data collections and outputs to ensure that they are relevant, collect reliable data and meet user needs. More details are in section 2.7 on Assessment of User Needs and Perceptions.
The IRS is difficult (and costly) to maintain and inflexible to change due to its age: even minor adjustments to the data collected (e.g. adding electronic cigarettes or laptops as a cause of fire) are prohibitively costly. The IRS and Fire Statistics teams maintain a list of all user data requests and these will feed into any future review of the data collection to ensure that any updated data collection will meet user requirements.
The content, labelling and timing of outputs was reviewed in 2017 and the main change was the introduction of the quarterly headline statistic publications and the renaming of publications to better describe their content. We also introduced a new publication, Detailed analysis of non-fire incidents attended by fire and rescue services in England, to reflect the fact that, in the previous two years, there was an increase in the number of non-fire incidents attended by FRSs thus justifying more in-depth analysis, as is already done for fire incidents. The annual publication covers topics of interest on a rolling basis.
Uses and users
The statistics produced in the series are used by a range of users to monitor trends in the incidents attended by FRSs in England.
We believe the uses of fire statistics are:
- informing the general public – the statistics are used by both national and local media, which in turn informs the public about trends in incidents and fatalities and non-fatal casualties in those incidents; information on the statistics is also routinely requested through Parliamentary Questions and Freedom of Information requests
- policy making and monitoring – the statistics are used by policy areas to monitor the state of the FRSs and to provide context and evidence for policies; the data is also used to inform discussion around the allocation of fire and rescue resources, and to provide advice to Ministers
- fire and rescue services – comparisons and benchmarking
- third parties – the statistics are used by a range of third parties, eg third sector groups, or fire research academics
- informing public marketing campaigns – statistics are used to measure the effectiveness of campaigns by individual FRSs or nationally, for example Fire Kills
- the Office for National Statistics – uses the figures in its work on productivity
- inspections and auditing – HM Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) uses the data when carrying out inspections on the size and composition of FRSs
- national and international comparisons – as well as allowing for comparisons between FRSs in England, comparisons may be made with other countries in the UK, or internationally; caution should be taken when making comparisons between datasets, as they may not be directly comparable due to differences in both what data is collected and how it is collected
We believe the users of fire statistics are:
- Ministers
- Members of Parliament
- Fire and rescue authorities and services
- other colleagues within the Home Office
- other government departments, for example the Ministry of Housing, Communities and Local Government; the Department for Business, Energy and Industrial Strategy; the Forestry Commission; and the Office for National Statistics
- HMICFRS
- trade unions
- journalists
- Chartered Institute of Public Finance and Accountancy
- Local Government Association
- individual citizens and private companies
- students, academics and universities
- charities
2.2 Accuracy and Reliability
The proximity between an estimate and the unknown true value.
Fire and rescue incident data is collected by obtaining a data extract from the IRS. All FRSs enter data onto the national Home Office system using either:
- independent electronic systems that interface with, and transmit completed records to, the IRS
- the web forms facility provided by the national IRS
Information about the incident attended is input by a member of the attending fire crew and then quality assured by their line manager. The IRS consists of up to 175 questions, not all of which are asked for every incident. The IRS has online data entry with in-built validation rules, which ensures that basic validation errors are avoided. However, there are likely to be some inaccuracies in the data due to reporting or keying errors, such as misclassification or missing cases. There is more detail on the QA process in the previous section. Following the transition to the online IRS in 2009, the main types of error in the data are thought to be recording and classification errors. The level of missing data on fields is very low; such missing data is reported as unknown and therefore no grossing, imputation or other estimation methods are used. The Home Office provides a dedicated helpline for telephone and email queries.
As discussed in the section on quality assurance, the Fire Statistics team run some specific checks before publishing the data.
Accuracy can be broken down into sampling and non-sampling error. The data requested and provided by FRSs are not required under legislation, but we aim to achieve a 100 per cent response for all fire and rescue collections, thereby reducing sampling error to a minimum. In order to ensure the IRS data is complete, Home Office statisticians carry out monthly monitoring of the number of incidents submitted, as discussed in the quality assurance section.
Non-sampling error includes areas such as coverage error, non-response error, measurement error and processing error.
We aim to reduce non-sampling error through the provision of guidance about the data collections and the definitions of the data items. There are validation checks within the IRS to ensure that data is of good quality and fit-for-purpose.
The IRS is a continually updated database, with FRSs adding incidents daily. The figures in the published releases refer to records of incidents that were submitted to the IRS by a specific date, when a snapshot of the database was taken for the purpose of analysis. The date on which the extract is taken from the IRS and the release is produced is chosen to be far enough after the end of the period to allow data for the end of the period to enter the system, but not so late that the usefulness of the figures is reduced. It is noted in the release that the statistics published may not match those held locally by FRSs and revisions may occur in the future.
2.3 Timeliness and Punctuality
Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.
There is a trade-off between timeliness and the other quality dimensions, in particular accuracy, accessibility and clarity. It is important to ensure that the Fire Statistics releases have adequate processes to ensure accuracy of the dataset and produce clear publication tables and apply appropriate disclosure control to the ILDs released.
To provide timely data to users, key headline figures from the IRS are published in a quarterly report, usually around four months after the end of the quarter. For example, the year ending December 2019 data were published on 14 May 2020. Care is taken to analyse data from well-regulated snapshots so as to ensure consistency of reporting.
The publication date for quarterly reports and further annual reports is pre-announced on the research and statistics release calendar. A provisional date is added around three to six months before publication, and this is confirmed around one to three months before publication.
Releases covering more detailed analysis of fires and non-fire incidents are both published later in the financial year.
FRSs are given one calendar month from the date of the incident to complete and quality assure the record, and to upload to the Home Office. The data is then quality assured by Home Office Statisticians, tables and commentary are prepared, and they are then quality assured and prepared for publication.
In accordance with the Pre-release Access to Official Statistics Order 2008, ministers and eligible staff are given pre-release access to Home Office statistics 24 hours before release. A list of people given pre-release access is published alongside the relevant release.
The data production and publication schedules are kept under review and take into account user needs when considering the timeliness of future data releases.
2.4 Accessibility and Clarity
Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data is available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.
The Fire statistics webpages are accessible from the Home Office statistics launch page. They are published in an accessible, orderly, pre-announced manner on the GOV.UK website at 9:30am on the day of publication. An RSS feed alerts registered users to this publication. All releases are available to download for free.
The outputs aim to provide a balance of commentary, summary tables and charts. The aim is to ‘tell the story’ in the output, without the output becoming overly long and complicated.
The publication is available in PDF and HTML format and includes email contact details for sending queries and feedback to the production team. Additional detailed data tables are available in Excel spreadsheet format, and the ILDs are available as ODS files.
The format used is in line with Home Office departmental guidance. It aims to make outputs clear for the audience and all outputs adhere to the accessibility policy. Key users of the publication are informed of the statistics on the day of their release. Further information regarding the statistics can be obtained by contacting the relevant staff detailed on the release via: firestatistics@homeoffice.gov.uk
The Fire Statistics team receives a large number of requests for advice and information about the IRS from other government departments and outside the government as well as for ad-hoc analysis by Home Office colleagues. The team also responds to Parliamentary Questions and Freedom of Information requests.
The data published on the Fire statistics collection pages of GOV.UK are subject to rights detailed in the Open Government Licence v3.0: ‘All content is available under the open government licence version 3, except where otherwise stated’.
The statistics are taken directly from the source data that is collected for administrative purposes with little manipulation between source and publication.
2.5 Coherence and Comparability
Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar.
Many of the statistics covered by this report are the only source of official data on their subject. The fire data is collected from all providers on the same form with accompanying guidance and definitions. This ensures consistency across the different types of data provider.
Comparability is the degree to which data can be compared over time and domain.
The statistics are taken directly from the 45 FRSs in England (44 from 1 April 2021 when Hampshire and Isle of Wight merged), some of which use the Home Office’s national data collection tool and some of which use their own bespoke systems. All FRSs use the same question set and guidance documents.
Following user feedback, the reports have included more findings on long-term trends and information on changes in definitions over time.
Comparison over time
Fire and rescue incident data saw a major change in the method of reporting in April 2009. Reports of incidents were previously collected using an FDR (Fire Data Report) form whereas, after this date, incidents were input onto the new online IRS. This change may have led to differences in the reporting patterns of certain types of incident. This is shown in the outputs by adding footnotes and comments where applicable.
Devolved administration data sources
Wales and Scotland use the Home Office IRS data collection tool. However, strategies, policies and approaches can differ between countries and this will affect the comparability of statistical outputs in some cases. The Home Office statistical releases present Welsh and Scottish incident data for headline tables.
Northern Ireland fire statistics are published by the Northern Ireland Fire and Rescue Service using data from a system similar to the Incident Recording System, which means that they are not directly comparable to English, Welsh and Scottish data.
Geographies below England level
The majority of IRS data tables are published at fire and rescue authority level, and FRS is included in the ILDs where it is not disclosive. The provision of clear guidance and definitions and the extensive validation carried out help to ensure that the data is consistent across FRSs. Frequent queries are received on the availability of IRS data for geographies below FRS level, such as Lower Layer Super Output Areas (LSOA) or including specific x-y co-ordinates. Home Office statisticians have concerns about the potential disclosure of individuals if IRS datasets containing lower geography variables were released with no restrictions. LSOA is included in some of the ILDs where the combination of other supplied variables is not disclosive. More information on this can be found in the ILD Project Overview.
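The report does not set out the exact disclosure rules applied, but the general idea behind checking whether a combination of variables is disclosive can be sketched as a simple small-count check (Python; the threshold, field names and example records are illustrative assumptions):

```python
import pandas as pd

# Illustrative incident-level records; not real data.
ild = pd.DataFrame({
    "lsoa": ["E01000001", "E01000001", "E01000002", "E01000003"],
    "incident_type": ["Dwelling fire", "Dwelling fire", "Dwelling fire", "Vehicle fire"],
})

THRESHOLD = 3  # assumed minimum cell size for this sketch

# Count incidents for each combination of geography and incident type; small
# cells would need suppression, coarser geography or removal of a variable.
cell_counts = ild.groupby(["lsoa", "incident_type"]).size()
risky_cells = cell_counts[cell_counts < THRESHOLD]
print(risky_cells)
```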
Restricted access to the lower geography variables is available by providing datasets for specific research under a data sharing agreement.
Harmonisation of statistics inputs and outputs
A cross-governmental programme of work is currently underway looking into standardising inputs and outputs for use in National Statistics. This is known as harmonisation. The Government Statistical Service published a Harmonisation Strategy in 2019. Its aim is to make it easier for users to draw clearer and more robust comparisons between data sources. The IRS adopts harmonised questions where possible, and harmonisation will be part of any ongoing changes to the system.
2.6 Trade-Off Between Output Quality Components
Trade-offs are the extent to which different aspects of quality are balanced against each other.
The statistics are produced and released according to a pre-announced timetable.
As discussed previously, the timetable for publication is designed to provide users with the best balance of timeliness and accuracy. At the time of production of the publication, some records may still be incomplete, and compliance checks by FRSs may result in the subsequent alteration of incidents. This is particularly the case for statistics with relatively small numbers, such as fire-related fatalities. This is pointed out in the publications.
2.7 Assessment of User Needs and Perceptions
The processes for finding out about users and uses, and their views on the statistical products.
The Fire Statistics team work closely with key customers and stakeholders in the Home Office to keep track of developments in fire and rescue policy, and continuously review the coverage and content of publications to ensure that they meet the needs of users.
We also consult our users on fire data collection issues as part of an ongoing exercise covering all fire statistics in order to better understand user requirements and priorities for the future. As part of this, Home Office policy colleagues, FRSs and others have provided information on how they use fire statistics as discussed in the earlier section on Uses and Users. We encourage feedback on all our outputs and data collections. Contact details are available at the bottom of each publication for users to get in touch if they have a query.
Stakeholder engagement
Home Office statisticians have worked closely with stakeholders to ensure that as much IRS data as possible is shared, upholding the principle of collect once and use many times.
We also collaborate with the fire research team in the Home Office, sharing knowledge across the teams and working on various queries (internal and external) together. We quality assure each other’s statistics and research products on a routine basis.
We use networks such as the National Fire Chiefs’ Council (NFCC), the Fire Statistics User Group plus specific fire networks to reach other users.
The Central Local Information Partnership (CLIP) Fire sub-group is the main governance group of fire statistics produced by the Home Office. The group meets on an ad hoc basis and provides an opportunity for data providers to share their views on fire data requirements, including identifying new priorities and areas for improvement.
We welcome feedback from HMICFRS, the Welsh and Scottish Governments and FRSs on the data collection and engage on a year-round basis where necessary to answer queries and provide advice on the collection.
The Home Office has also established a Fire and Rescue Data Board which aims to improve senior level oversight of central fire and rescue data and systems and to provide a forum for discussion on how to optimise the use and sharing of data across government and key stakeholders – leading to more sustainable management of the data. It meets three times a year and consists of senior level representatives from NFCC, HMICFRS, the Office for National Statistics and the Home Office.
User consultations
Broader consultations are conducted when appropriate, for example when significantly changing the provision or coverage of the published data, or revising the methodology used. These generally involve contacting known users of the published statistics to ask specific questions or request feedback. These questions are also published on the website adjacent or linked to the publication in order to capture users with whom we have had no previous contact. The results of user consultations are published on the website.
In 2018 a user survey of Home Office statistical releases was carried out as part of the development of a new statistical template for pdf releases. This was sent to all key stakeholders and a link was placed on each Home Office statistical release landing page on GOV.UK. The survey asked for views on, and the importance of, presentation, dissemination and improvements to publications. There were ten responses to the online consultation which, although giving a range of opinions, showed that more users wanted a concise statistical release rather than an all-encompassing one, and greater use of visualisation. Four respondents were not able to find what they were looking for in the release.
A consultation paper was published alongside the April 2017 to March 2018 Fire response times statistical release in January 2019. The consultation invited comments on whether to include incidents marked as ‘heat and/or smoke damage only’ in future publications. It also asked if respondents had any feedback on the response times to fires publication generally. There were 10 responses, mainly on the specific question, but there were also suggestions for changes to the publication which were reviewed for the following year’s release.
In addition to carrying out surveys of users, Home Office statisticians aim to engage with users on an ad-hoc basis to ensure that the statistics remain as relevant as possible. Examples have included:
- talking to key users about how to use the new style of tables, including having a single URL
- running workshops on ILDs and how to use them
- disseminating information about the ILDs via the Knowledge Hub, a “free-to-use digital tool for the global public service community”, and asking for feedback