Official Statistics

Monthly official statistics - background quality report 22 August 2024

Published 22 August 2024

Applies to England

1. Introduction

This background quality report assesses the quality of monthly official statistics for the Planning Inspectorate using the European Statistical System (ESS) Quality Assurance Framework (QAF). This is the method recommended by the Government Statistical Service (GSS) Quality Strategy. Statistics are of good quality when they are fit for their intended use.

The ESS QAF measures the quality of statistical outputs against the following dimensions:

  • relevance

  • accuracy and reliability

  • timeliness

  • accessibility and clarity

  • comparability and coherence

The GSS also recommends assessment against 3 other principles in the ESS QAF. These are:

  • trade-offs between output quality components

  • confidentiality and transparency

  • balance between performance, cost and respondent burden

These dimensions and principles cut across the three pillars of trustworthiness, quality and value in the Code of Practice for Statistics.

This quality assessment covers the monthly statistical release, which provides summary information on appeals. Appeals represent the highest volume of the Planning Inspectorate's work in terms of number of cases.

These statistics are produced each month to allow anyone to see how the Planning Inspectorate is performing. The focus is on timeliness, as that is an area in which stakeholders have an interest. Also included is information on the decisions that have been made and on the number of Inspectors available to make those decisions.

2. Background and Context

The Planning Inspectorate’s job is to make decisions and provide recommendations and advice on a range of land use planning-related issues across England. This is done in a fair, open and timely way.

The Planning Inspectorate deals with planning appeals, national infrastructure planning applications, examinations of local plans and other planning-related and specialist casework in England.

The Planning Inspectorate is an executive agency, sponsored by the Ministry of Housing, Communities and Local Government.

3. Methodology and Production

From the November 2022 release onwards, the Planning Inspectorate has applied a policy of taking all data extracts for Official Statistics on or after the third working day of the following month.
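
As an illustration of this policy, the sketch below computes the third working day of a month. It is a minimal sketch only: it assumes working days are Monday to Friday and ignores bank holidays, which the actual extraction schedule may also need to account for.

```python
from datetime import date, timedelta

def third_working_day(year: int, month: int) -> date:
    """Return the third working day of a month.

    Illustrative sketch only: assumes working days are Monday to Friday
    and ignores bank holidays.
    """
    day = date(year, month, 1)
    working_days = 0
    while True:
        if day.weekday() < 5:  # Monday = 0 ... Friday = 4
            working_days += 1
            if working_days == 3:
                return day
        day += timedelta(days=1)

# For statistics covering July 2024, the earliest extraction date under this
# policy would be the third working day of August 2024.
print(third_working_day(2024, 8))  # 2024-08-05 (a Monday)
```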

The casework statistics provided in this publication have used data from:

  • Horizon – The casework management system used for processing appeals casework. This has been used to produce the statistics on our casework. Analysis is based on data extracted from this system on:

  • number of decisions – 6th August 2024

  • number of receipts – 6th August 2024

  • mean, median, standard deviation – 6th August 2024

  • open cases – 13th August 2024

  • SAP HR – The previous Human Resources system database used to store all information regarding members of staff. This data source has been used to provide statistics on the number of inspectors up to January 2024. Analysis is based on data extracted from SAP on 6th February 2024

  • Employee Central – The Human Resources system database used to store all information regarding members of staff. This data source has been used to provide statistics on the number of inspectors for February 2024 and later. Analysis is based on data extracted from Employee Central on 6th August 2024

  • Spreadsheets – some of the casework data, for Tree Preservation Orders, High Hedges appeals and Hedgerow appeals, is also extracted from source MS Excel spreadsheets. This data has been used in conjunction with Horizon data to calculate performance data; extracted on 6th August 2024

Within the publication there is a focus on three different types of casework (an illustrative grouping is sketched after the list below):

  1. Planning covers s78 planning appeals, Householder appeals, Commercial appeals, s20 Listed Building appeals, Advertisement appeals, s106 Planning Obligation appeals and Called In Planning Applications.

  2. Enforcement covers s174 Enforcement appeals, s39 Enforcement Listed Building appeals and Lawful Development Certificate appeals.

  3. Specialist casework includes Common Land, Rights of Way orders (including Schedule 14 cases), Purchase orders, Tree Preservation Orders, High Hedges appeals, Hedgerow appeals, Wayleave, Compulsory Purchase Orders, Secretary of State, Transport, Environmental Permitting Appeals and Coastal Access. Additional casework types have been added to this category over time.
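
To illustrate how this grouping might be represented when processing case-level data, the sketch below maps case types to the three publication categories. It is a sketch only: the dictionary and lookup function are assumptions for illustration, and the case type strings are taken from the list above rather than from the Inspectorate's actual casework systems.

```python
# Illustrative grouping of case types into the three publication categories
# described above. The structure and names are assumptions, not the
# Inspectorate's actual schema.
CASEWORK_CATEGORIES = {
    "Planning": [
        "s78 planning appeal", "Householder appeal", "Commercial appeal",
        "s20 Listed Building appeal", "Advertisement appeal",
        "s106 Planning Obligation appeal", "Called In Planning Application",
    ],
    "Enforcement": [
        "s174 Enforcement appeal", "s39 Enforcement Listed Building appeal",
        "Lawful Development Certificate appeal",
    ],
    "Specialist": [
        "Common Land", "Rights of Way order", "Purchase order",
        "Tree Preservation Order", "High Hedges appeal", "Hedgerow appeal",
        "Wayleave", "Compulsory Purchase Order", "Secretary of State",
        "Transport", "Environmental Permitting Appeal", "Coastal Access",
    ],
}

def category_for(case_type: str) -> str:
    """Return the publication category for a case type, or 'Unknown'."""
    for category, case_types in CASEWORK_CATEGORIES.items():
        if case_type in case_types:
            return category
    return "Unknown"

print(category_for("Householder appeal"))  # Planning
```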

4. Relevance

The Planning Inspectorate has proactively decided to produce these statistics monthly to better meet user needs. We welcome feedback and will continue to develop the statistics over time to ensure we continue to meet user needs.

The release can be used to answer press queries, parliamentary questions and Freedom of Information requests. The report is also useful for internal customers to support evidence-based decisions and to support discussions with external stakeholders.

5. Accuracy and Reliability

The Planning Inspectorate uses administrative data from operational delivery systems to compile these statistics. As these data come from live systems, there are occasions when the data change. Data used in the publication are based on data recorded in these systems at the time of extraction.

The possible changes that could occur in these statistics include:

  • Data entry error – Some data may be entered in a form that is incomplete or in a format that cannot be processed. An example of this is that there are occasionally errors in date fields; these are highlighted in internal data quality reports and the Inspectorate is working to improve the quality of data that supports this publication.

  • On occasion, the categorisation of cases may change, e.g. the procedure type can change, and this will be recorded differently in the latest monthly statistics compared with previous versions.

  • Delays in updating records on operational systems mean that changes may apply to data older than the latest month released.

This information and the associated data collection methods will be quality assured to develop a longer-term solution to collecting these statistics. Definitions of what constitutes an event differ according to the type of casework.

When data is extracted from source systems, data processing can mean that values vary. For example, open cases data is processed using a snapshot method, which is separate from the way data on closed cases is processed. This has led to inconsistencies in trends, where the data do not balance. The Coherence and Comparability section below has more information on the impact on open cases data.

Where data is published on events, instances have been found where an event date is recorded for cases that do not require an event, as those cases are dealt with on the basis of the documentation submitted alone.

There are instances where case records indicate a case has been closed and a decision (such as whether the appeal has been dismissed or allowed) has been recorded, but no date has been entered. It is not clear whether the decision has been added in error, or the date omitted in error. Any such case record is excluded from the counts of the number of decisions (which use the month of the decision), which may give an under-estimate. This applies to fewer than 100 cases received in a year, in the context of over 17,000 decisions a year. Further work is required to automatically identify these cases and have any errors amended.
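
A minimal sketch of this exclusion rule is shown below, assuming the extract is held in a pandas DataFrame. The column names case_ref, decision and decision_date are hypothetical, not the Inspectorate's actual field names.

```python
import pandas as pd

# Hypothetical extract: 'decision' holds the outcome (e.g. Allowed/Dismissed);
# 'decision_date' may be missing where the date was omitted in error.
cases = pd.DataFrame({
    "case_ref": ["A1", "A2", "A3"],
    "decision": ["Allowed", "Dismissed", "Allowed"],
    "decision_date": pd.to_datetime(["2024-07-03", None, "2024-07-22"]),
})

# Decision counts use the month of the decision, so records with a decision
# recorded but no decision date are excluded, which may slightly under-estimate
# the monthly totals.
decided = cases.dropna(subset=["decision_date"])
decisions_by_month = decided.groupby(decided["decision_date"].dt.to_period("M")).size()
print(decisions_by_month)
```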

One of the main measures in the report is the number of decisions in a given time period. Also given is a count of the number of closed cases. This count is considerably higher as it includes cases where an appeal is withdrawn, notice is withdrawn, or the appeal is turned away.

Issues affecting the statistics this month are as follows:

  • There are 44 cases in the open cases measures (Tables 2 & 10) that do not have a procedure recorded against them; the specific reasons for this are not known. It has been established that these are all specialist cases and are either Hedgerow, Rights of Way or Tree Preservation Order cases. Further work is needed to determine how best to resolve this issue, which is thought to be caused by a delay in determining the procedure.

  • Data separating open cases into those that have and have not yet had an event is not available for the end of July. We are working to fix this problem.

  • There is an issue with the valid to decision time for Tree Preservation Order (TPO) cases. These are currently not being included in the timeliness calculations.

  • We are investigating an issue with our processing of withdrawn and turned away cases that may mean some of these closed cases are being included in decision counts, contrary to our usual definition of ‘decided’. This may result in decision counts being revised in future publications. We believe that this affects fewer than 1% of decisions reported for the last year.

6. Timeliness and Punctuality

Figures are published monthly within a month of the end of the reporting period. This is to allow time to produce the statistics while ensuring they are timely for users.

The release date for this publication was pre-announced in the Planning Inspectorate's Calendar of Upcoming Releases section on GOV.UK. A 12-month release calendar is also provided on the GOV.UK website, with a specific release date given at least four weeks in advance where practicable.

7. Accessibility and Clarity

The statistics are published on the GOV.UK website. The publication is available from 09:30 hours on the day of release.

Figures from the statistics are separately available in MS Excel format for users to download. This allows for use in individual research and reports.

8. Coherence and Comparability

The publication includes trends over a 12-month period to allow comparisons over time. Where significant changes are observed in the statistics, these are explained.

For most of the data in this publication there is only one source of data and therefore it is not possible to cross-reference it with another data source – but it is possible to compare each month's data with what was published the previous month. We have highlighted in the statistics any values which have changed by more than five (when measuring the number of decisions or cases) or by more than 0.5 weeks (for the mean, median or standard deviation of weeks).

The number in brackets is the difference between this month’s value and the previous month’s value.
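
The sketch below illustrates this flagging rule: counts that move by more than five, or timeliness measures that move by more than 0.5 weeks, between publications are highlighted. The absolute values are invented for illustration; only the thresholds, and the bracketed differences shown in the example output, correspond to figures reported below.

```python
# Sketch of the revision-flagging rule described above. Counts that change by
# more than 5, or weeks-based measures that change by more than 0.5, between
# this month's publication and the previous one are flagged.
COUNT_THRESHOLD = 5
WEEKS_THRESHOLD = 0.5

def flag_revisions(previous: dict, current: dict, threshold: float) -> dict:
    """Return {period: difference} for values that changed by more than threshold."""
    return {
        period: round(current[period] - previous[period], 1)
        for period in current
        if period in previous and abs(current[period] - previous[period]) > threshold
    }

# Invented absolute values; only the differences (83 and 95) match the
# bracketed 'Closed' figures reported for Table 2 below.
previous_closed = {"2024-05": 410, "2024-06": 395}
current_closed = {"2024-05": 493, "2024-06": 490}
print(flag_revisions(previous_closed, current_closed, COUNT_THRESHOLD))
# {'2024-05': 83, '2024-06': 95}
```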

There have been changes in Table 1:

  • Events held: March (20) and June 2024 (27)

There have been changes in Table 2:

  • Received: January (12), February (8), March (8), April (22), May (29) and June 2024 (21)

  • Closed: November (6) and December 2023 (12), January (23), February (31), March (39), April (60), May (83) and June 2024 (95)

There have been changes in Table 6:

  • Inquiries Valid to decision (Median weeks): October 2023 (1.3), March (2.6) and May 2024 (0.8)

  • Inquiries Valid to decision (Mean weeks): March (1.0) and May 2024 (1.0)

There have been changes in Annex A:

Planning Measure

  • Valid to decision median weeks: Inquiries March (1.1) and May 2024 (0.6)

  • Valid to decision mean weeks: Inquiries October 2023 (0.6), March (1.0) and May 2024 (0.8)

  • Valid to decision mean weeks: Inquiries February (28), March (6.0) and May 2024 (30.1)

  • Standard Deviation: Inquiries February (25.6), March (2.0) and May 2024 (30.1)

Linked cases

In some cases, the “lead” case has had a decision date added, but linked cases (which can be referred to as “child” cases), which have their decision issued at the same time, have not had the required fields updated. This makes the number of open cases appear higher than it should and distorts the timeliness figures. From the October 2023 publication onwards, we changed the method for counting events. Previously, only a single event would be counted for a group of linked appeals. We now attribute an event to each appeal in the group, bringing the methodology in line with that for counting decisions.
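
A minimal sketch of this change in counting is shown below, using an invented group of linked appeals.

```python
# Sketch of the change in event counting for linked appeals described above.
# A group of linked appeals shares one event; from the October 2023 publication
# the event is attributed to every appeal in the group rather than counted once.
# Group identifiers and sizes are illustrative only.
linked_groups = {
    "lead-001": ["lead-001", "child-002", "child-003"],  # three linked appeals
    "lead-010": ["lead-010"],                             # an unlinked appeal
}

events_old_method = len(linked_groups)                                   # one event per group
events_new_method = sum(len(appeals) for appeals in linked_groups.values())  # one per appeal

print(events_old_method, events_new_method)  # 2 4
```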

Open cases

An in-depth investigation has been carried out into anomalies between the snapshot data used to count the number of open cases and the data captured about how many appeals have been received and closed during a month. If the data were accurate and consistent, it should balance even though it comes from different sources – but it does not (a balance-check sketch follows the list below). The reasons for the data not balancing are:

  • There are instances where case records indicate a case has been closed (due to the processing data entered onto our operational systems) but no date has been entered. Such a case is therefore excluded from the open cases snapshot but is not counted in the closed cases measure.

  • As noted above, delays in registering appeals onto relevant systems mean that the open cases figure can increase in a particular month, but the receipts are recorded from potentially several months before. The Inspectorate is investigating how to improve the quality of this data by updating older snapshots when cases received in earlier months are added late, to provide a more accurate estimate of open cases.

  • There are delays with registering Tree Preservation Orders that are affecting the open cases measure.

  • There is an issue with the exact date on which snapshot data is captured (a snapshot taken on the last day of the month excludes cases registered on that day).

  • Withdrawn or closed cases being re-opened for consideration of awarding costs.
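
The balance check referred to above can be expressed as a simple identity: open cases at the end of a month should equal open cases at the start of the month plus receipts minus closures. The sketch below uses invented figures to show how a non-zero imbalance would surface; the values are not taken from the publication.

```python
# Illustrative balance check for open cases (all figures invented).
open_at_start_of_month = 10_000
received_in_month = 1_500
closed_in_month = 1_450
open_at_end_of_month_snapshot = 10_070  # from a separate snapshot extract

expected_open = open_at_start_of_month + received_in_month - closed_in_month
imbalance = open_at_end_of_month_snapshot - expected_open

# A non-zero imbalance signals one of the issues listed above, such as missing
# closure dates, late registrations or the snapshot cut-off date.
print(expected_open, imbalance)  # 10050 20
```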

Inspectors

The Planning Inspectorate employs five categories of decision makers: Salaried Planning Inspectors, Fixed Term Planning Inspectors, Appeal Planning Officers, Apprentice HEOs and Planning Appeal Decision suppliers (formerly known as Non-Salaried Inspectors (NSIs)). Appeal Planning Officers make recommendations on planning appeals to Salaried Inspectors; the recommendations are signed off by Inspectors who have delegated authority to make decisions on behalf of the Secretary of State (SoS). The Apprentice HEO role is an initiative started by the Planning Inspectorate to increase diversity in the planning profession.

The monthly statistical release reports the headcount and full time equivalent of Salaried Planning Inspectors only.

9. Trade-offs between Output Quality Components

Where possible, the cost to Government of producing these statistics has been minimised by using data already collated for operational delivery purposes. The main sources of data used for compiling these statistics are the casework management systems, Horizon and PICASO. These are large administrative databases and, as such, data quality and completeness vary across fields.

These statistics are produced each month, less than a month after the period on which they report. This provides limited time for checking the quality of the data; this trade-off has been made to provide users with timely information. Quality improvement remains a key focus area, in which improvement is continuously sought.

10. Quality Assurance

Data feeding the publications undergoes quality checks to ensure the correct data has been extracted and the appropriate filters have been applied. Subsequently, the layout and presentation of the data in the statistical release is reviewed by multiple members of the Data and Performance team to ensure that the data is presented appropriately and can be correctly interpreted by the user.

11. Assessment of User Needs and Perceptions

Publication of this report has been in response to requests for information from the media and the general public about the Planning Inspectorate’s performance. This report also contributes to the Planning Inspectorate’s commitment to release information where possible.

The Planning Inspectorate invites users to provide feedback on any of its publications or reports using the contact information within the publication.

12. Performance, Cost and Respondent Burden

The production of the Monthly Official Statistics requires less than one FTE per annum.

The report uses administrative data sources already collected by the Planning Inspectorate. As such, there is no respondent burden, and the main cost is the production of the statistics including quality assurance and data interpretation.

13. Confidentiality, Transparency and Security

The Data and Performance team involved in the production of these Official Statistics have completed the government-wide Responsible for Information training and understand their responsibilities under the Data Protection Act and the Code of Practice for Statistics.

The Data and Performance team adhere to the principles and protocols laid out in the Code of Practice for Statistics and comply with pre-release access arrangements. The Pre-Release Access list for our publications is available on the GOV.UK website.

14. Contact Details

The Planning Inspectorate welcomes feedback on our statistical products. If you have any comments or questions about this publication or about our statistics in general, you can contact us as follows:

Media enquiries 0303 444 5004 email press.office@planninginspectorate.gov.uk

Public enquiries email statistics@planninginspectorate.gov.uk

15. Official Statistics Designation

The Planning Inspectorate Monthly Statistical bulletin is designated as Official Statistics. The bulletin, and this associated Background Quality Report, are produced according to the principles of trustworthiness, quality and value. We intend to review our complete Official Statistics offering to assess how well it meets user needs. If you would like to provide feedback to contribute to this, please contact: statistics@planninginspectorate.gov.uk

Our statistical practice is regulated by the Office for Statistics Regulation (OSR).

OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics that all producers of official statistics should adhere to.

You are welcome to contact us directly with any comments about how we meet these standards.

Alternatively, you can contact OSR by emailing regulation@statistics.gov.uk or via the OSR website.