Monthly Official Statistics - Background Quality Report
Published 22 April 2021
Applies to England
1. Introduction
This background quality report assesses the quality of monthly official statistics for the Planning Inspectorate using the European Statistical System (ESS) Quality Assurance Framework (QAF). This is the method recommended by the Government Statistical Service (GSS) Quality Strategy. Statistics are of good quality when they are fit for their intended use.
The ESS QAF measures the quality of statistical outputs against the dimensions of:
- relevance
- accuracy and reliability
- timeliness
- accessibility and clarity
- comparability and coherence
The GSS also recommends assessment against 3 other principles in the ESS QAF. These are:
- trade-offs between output quality components
- confidentiality and transparency
- balance between performance, cost and respondent burden
These dimensions and principles cut across the three pillars of trustworthiness, quality and value in the Code of Practice for Statistics. This quality assessment covers the monthly statistical release, which provides summary information on appeals; appeals represent the highest volume (in terms of number of cases) of the work of the Planning Inspectorate.
These statistics are produced each month to allow anyone to see how the Planning Inspectorate are performing. The focus is on timeliness, as that is an area in which stakeholders have an interest. Also included is information on the decisions that have been made, and on the number of Inspectors available to make those decisions.
2. Background and Context
The Planning Inspectorate’s job is to make decisions and provide recommendations and advice on a range of land use planning-related issues across England and Wales. This is done in a fair, open and timely way.
The Planning Inspectorate deals with planning appeals, national infrastructure planning applications, examinations of local plans and other planning-related and specialist casework in England and Wales.
The Planning Inspectorate is an executive agency, sponsored by the Ministry of Housing, Communities & Local Government and the Welsh Government.
3. Methodology and Production
The statistics in this publication use data from:
- The casework management systems used for processing appeals casework: Horizon, Picaso and the Inspector Scheduling System. These have been used to produce the statistics on our casework. Analysis is based on data extracted from these systems on:
- number of decisions – 14th April 2021
- mean, median, standard deviation and open cases – 14th April 2021
- SAP HR – the Human Resources database used to store all information about members of staff. This data source has been used to provide statistics on the number of Inspectors. Analysis is based on data extracted from SAP on 14th April 2021.
- Spreadsheets – some of the casework data, for Tree Preservation Orders, High Hedges appeals and Hedgerow appeals, is extracted from source MS Excel spreadsheets. This data has been used in conjunction with Horizon data to calculate performance statistics, and was extracted on 14th April 2021.
- Virtual Events Spreadsheet. This data has been used to provide statistics on the number of virtual events. Analysis is based on data extracted from this spreadsheet on 16th April 2021.
- Virtual Events Timesheet data. This data has been used to count the number of Local Plan examinations held virtually and comes from data recorded by Planning Inspectors on the time they spent on casework. This data was extracted on 16th April 2021.
Within the publication there is a focus on three different types of casework:
- Planning covers s78 planning appeals, Householder appeals, Commercial appeals, s20 Listed Building appeals, Advertisement appeals, s106 Planning Obligation appeals and Called In Planning Applications.
- Enforcement covers s174 Enforcement appeals, s39 Enforcement Listed Building appeals and Lawful Development Certificate appeals.
- Specialist casework includes Common Land, Rights of Way orders, Purchase Orders, Tree Preservation Orders, High Hedges appeals and Hedgerow appeals.
4. Relevance
The Planning Inspectorate has proactively decided to produce these statistics monthly to better meet user needs. As this is a new publication, we welcome feedback and will continue to develop the statistics over time to ensure we continue to meet user needs.
The release can be used to answer press queries, parliamentary questions and Freedom of Information requests. The report is also useful for internal customers to support evidence-based decisions and to support discussions with external stakeholders.
5. Accuracy and Reliability
The Planning Inspectorate use administrative data from operational delivery systems to compile these statistics. As these data come from live systems, there are occasions when this data changes. Data used in the publication is based on data recorded in these systems at the time of extraction.
The possible changes that could occur in these statistics include:
- Data entry error – Some data may be entered in a form that is incomplete or in a format that cannot be processed. An example of this is that there are occasionally errors in date fields; these are highlighted in internal data quality reports and the Inspectorate is working to improve the quality of data that supports this publication.
- On occasion the categorisation of cases may change; for example, the procedure type can change, and this will be recorded differently in the latest monthly statistics compared with previous versions.
- Delays in updating records on operational systems mean that changes may apply to data older than the latest month released.
This information and the associated data collection methods will be quality assured to develop a longer-term solution for collecting these statistics. Definitions of what constitutes an event are being refined, as this differs according to the type of casework. Whilst this work is in progress, these numbers should be treated as provisional.
Where we publish data on events, we have found instances where an event date is recorded for cases that do not require either a physical or virtual event (these cases are dealt with based only on the documentation submitted). However, operational systems demand that an event date is recorded, and this may therefore slightly inflate the events data.
We are aware that, whilst we extract data from the same source systems, the way we process data can vary. For example, we process data on open cases using a snapshot method, which is separate from the way we process data on closed cases. This can lead to minor inconsistencies in trends.
One of the main measures in the report is the number of decisions in a given time period. We also give a count of the number of closed cases. This count is considerably higher because it includes cases where an appeal is withdrawn, a notice is withdrawn or the appeal is turned away, as the sketch below illustrates.
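The sketch below is a minimal illustration of this distinction, using hypothetical case records and outcome labels rather than the categories used in our operational systems:

```python
# A minimal sketch with hypothetical outcome labels: closed cases include
# withdrawals and turn-aways as well as decided cases, so the closed-cases
# count is higher than the decisions count.
closed_cases = [
    {"case_id": "A1", "outcome": "decision_issued"},
    {"case_id": "A2", "outcome": "decision_issued"},
    {"case_id": "A3", "outcome": "appeal_withdrawn"},
    {"case_id": "A4", "outcome": "notice_withdrawn"},
    {"case_id": "A5", "outcome": "turned_away"},
]

decisions = [case for case in closed_cases if case["outcome"] == "decision_issued"]

print(len(closed_cases))  # 5 cases closed in the period
print(len(decisions))     # only 2 of those closures are decisions
```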
6. Virtual Events data
There are concerns with the accuracy of data on virtual events. Data for virtual events is recorded on an Excel spreadsheet. However, it appears that not all events are being recorded through the prescribed process.
The table below shows the difference between the current method and an alternative method that uses data collected on all events and the respective event dates. This comparison is possible because every hearing and inquiry at present is a virtual event. For February 21, around 33% of events were not counted by the current method used for virtual events. The proportion of events not counted falls to around 25% for March 21.
It should be noted that this table does not cover all casework types for which virtual events are held; it applies only to the high-volume casework types of s78 planning appeals and enforcement appeals using the hearing or inquiry procedure.
Events data will be further explored in the hope of providing more reliable information in the future. In parallel, a new data capture procedure is in place to give a more stable footing for future data.
Table: Virtual Events by month; current and alternative methods for counting number of events
Method | Jun-20 | Jul-20 | Aug-20 | Sep-20 | Oct-20 | Nov-20 | Dec-20 | Jan-21 | Feb-21 | Mar-21 |
Current method | 12 | 18 | 25 | 56 | 67 | 78 | 71 | 74 | 70 | 78 |
Alternative method | 19 | 20 | 27 | 55 | 66 | 90 | 81 | 97 | 103 | 103 |
Additional events under alternative method | 7 | 2 | 2 | -1 | -1 | 12 | 10 | 23 | 33 | 25 |
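As a check on the figures quoted above, the shares of events missed by the current method can be recomputed from the two method rows of the table; a minimal sketch:

```python
# Recomputing the comparison above: events missed by the current method,
# as a proportion of the alternative (more complete) count.
current = {"Feb-21": 70, "Mar-21": 78}
alternative = {"Feb-21": 103, "Mar-21": 103}

for month in current:
    missed = alternative[month] - current[month]
    share = missed / alternative[month]
    print(f"{month}: {missed} events missed ({share:.0%})")

# Feb-21: 33 events missed (32%) – consistent with the 'around 33%' above
# Mar-21: 25 events missed (24%) – consistent with the 'around 25%' above
```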
The section on Coherence and Comparability gives details of where an examination of this month’s data, against last month’s, has highlighted previously unidentified data quality issues.
7. Timeliness and Punctuality
Figures are published monthly within a month of the end of the reporting period. This is to allow time to produce the statistics while ensuring they are timely for users.
The release date for this publication was pre-announced in the Planning Inspectorate's Calendar of Upcoming Releases section on GOV.UK. A 12-month release calendar is also provided on the GOV.UK website, with a specific release date given at least four weeks in advance where practicable.
8. Accessibility and Clarity
Our statistics are published on the GOV.UK website. The publication is available from 0930 hours on the day of release.
Figures from the statistics are separately available in MS Excel format for users to download. This allows for use in individual research and reports.
9. Coherence and Comparability
The publication includes trends over a 12-month period to allow comparisons over time. Where significant changes are observed in the statistics, these are explained.
For most of the data in this publication there is only one source of data, so it is not possible to cross-reference against another data source; it is, however, possible to compare each month's data with what was published the previous month. We have highlighted in the statistics any values which have changed by more than five (when measuring numbers of decisions or cases) or by more than 0.5 weeks (for the mean, median or standard deviation of weeks); a sketch of this check is given below.
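The following is a minimal sketch of that check, using hypothetical figures; the thresholds are those described above:

```python
# A minimal sketch of the month-on-month revision check: flag any value that
# changes by more than 5 (counts of decisions or cases) or by more than
# 0.5 weeks (mean, median or standard deviation of weeks).
COUNT_THRESHOLD = 5
WEEKS_THRESHOLD = 0.5

def flag_revision(previous: float, latest: float, weeks_measure: bool) -> bool:
    """Return True if the change between publications should be highlighted."""
    threshold = WEEKS_THRESHOLD if weeks_measure else COUNT_THRESHOLD
    return abs(latest - previous) > threshold

# Hypothetical examples:
print(flag_revision(250, 240, weeks_measure=False))   # True: 10 fewer cases
print(flag_revision(20.1, 20.4, weeks_measure=True))  # False: only 0.3 weeks
```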
Issues with the data identified this month are as follows:
- There are fewer events reported in April 21 compared to March 21 for the following months: December 20 (10 fewer), January 21 (15 fewer) and February 21 (32 fewer). This data is shown in Table 1. Other months show changes of five or fewer. It is not clear what has caused this, other than that the data come from a live operational system where users make frequent changes.
- There have been changes to the number of appeals closed: 10 fewer in February 21 than previously reported, the specific reasons for which are not known (Table 2).
- There are fewer open cases reported in April 21 compared to March 21 for the following months: December 20 (43 fewer), January 21 (44 fewer) and February 21 (47 fewer). In each of these months, 39 of the cases were enforcement appeals that have subsequently been recorded as ‘Closed – opened in error’. Due to the nature of how data is extracted, these ‘errors’ were removed from previous snapshot data.
- There have been small changes to the timeliness data in Table 7 and Annex B. These impact only Specialist cases; the specific reasons for the changes are not known.
- There are 50 cases in the open cases measures (Tables 2 & 10) that do not have a procedure recorded against them, the specific reasons for which are not known.
10. Trade-offs between Output Quality Components
Where possible, the cost to Government of producing these statistics has been minimised by using data already collated for operational delivery purposes. The main sources of data used for compiling these statistics are the casework management systems Horizon and Picaso (Picaso is no longer a live operational casework system). These are large administrative databases and, as such, data across fields varies in quality and completeness.
These statistics are produced each month, less than a month after the period on which they report. This provides limited time for checking the quality of the data; the trade-off is made to give users timely information. Quality improvement is a key focus area in which improvement is continuously sought.
11. Quality Assurance
Data feeding the publication undergoes quality checks to ensure the correct data has been extracted and the appropriate filters have been applied. Subsequently, the layout and presentation of the data in the statistical release is reviewed by multiple members of the Data and Performance team to ensure that the data is presented appropriately and interpreted correctly by the user.
12. Assessment of User Needs and Perceptions
Publication of this report has been in response to requests for information from the media and the general public about the Planning Inspectorate’s performance. This report also contributes to the Planning Inspectorate’s commitment to release information where possible.
The Planning Inspectorate invite users to provide feedback on any of their publications or reports using the contact information within the publication.
13. Performance, Cost and Respondent Burden
The production of the Monthly Official Statistic requires less than one FTE per annum.
The report uses administrative data sources already collected by the Planning Inspectorate. As such, there is no respondent burden, and the main cost is the production of the statistics including quality assurance and data interpretation.
14. Confidentiality, Transparency and Security
The Data and Performance team involved in the production of these Official Statistics have completed the government-wide Responsible for Information training and understand their responsibilities under the Data Protection Act and the Code of Practice for Statistics.
The Data and Performance team adhere to the principles and protocols laid out in the Code of Practice for Statistics and comply with pre-release access arrangements. The Pre-Release Access list for our publications is available on the GOV.UK website.
15. Contact Details
The Planning Inspectorate welcome feedback on our statistical products. If you have any comments or questions about this publication or about our statistics in general, you can contact us as follows:
Media enquiries: 0303 444 5004
Media email: [email protected]
Public enquiries: [email protected]
Please note we are currently reviewing our statistics with a view to making them as clear and helpful as possible for users. We would be delighted if you could contact us via the address below with any views on this approach; particularly on what content would be most useful and why.
email [email protected]