Guidance

SLC Statistics: Quality Guidelines - from November 2024

Updated 28 November 2024

Introduction

The UK Statistics Authority’s (UKSA) Code of Practice for Statistics is built on three pillars: Trustworthiness, Quality and Value. These Quality Guidelines have been developed by the Student Loans Company (SLC) to comply with the Quality pillar.

Statistical Quality

Quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not materially misleading. Quality requires skilled professional judgement about collecting, preparing, analysing and publishing statistics and data in ways that meet the needs of people who want to use the statistics.

Under the Quality pillar, SLC is committed to adhering to the following principles:

Q1: Suitable data sources - Statistics should be based on the most appropriate data to meet intended uses. The impact of any data limitations for use should be assessed, minimised and explained.

  • Q1.1 “Statistics should be based on data sources that are appropriate for the intended uses. The data sources should be based on definitions and concepts that are suitable approximations of what the statistics aim to measure, or that can be processed to become suitable for producing the statistics.”

  • Q1.2 “Statistics producers should establish and maintain constructive relationships with those involved in the collection, recording, supply, linking and quality assurance of data, wherever possible.”

  • Q1.3 “A clear statement of data requirements should be shared with the organisations that provide that data, setting out decisions on timing, definitions and format of data supply, and explaining how and why the data will be used.”

  • Q1.4 “Source data should be coherent across different levels of aggregation, consistent over time, and comparable between geographical areas, whenever possible.”

  • Q1.5 “The nature of data sources used, and how and why they were selected, should be explained. Potential bias, uncertainty and possible distortive effects in the source data should be identified and the extent of any impact on the statistics should be clearly reported.”

  • Q1.6 “The causes of limitations in data sources should be identified and addressed where possible. Statistics producers should be open about the extent to which limitations can be overcome and the impact on the statistics.”

  • Q1.7 “The impact of changes in the circumstances and context of a data source on the statistics over time should be evaluated. Reasons for any lack of consistency and related implications for use should be clearly explained to users.”

Q2: Sound methods - Producers of statistics and data should use the best available methods and recognised standards and be open about their decisions.

  • Q2.1 “Methods and processes should be based on national or international good practice, scientific principles, or established professional consensus.”

  • Q2.2 “Statistics, data and metadata should be compiled using recognised standards, classifications and definitions. They should be harmonised to be consistent and coherent with related statistics and data where possible. Users should be provided with reasons for deviations from these standards and explanations of any related implications for use.”

  • Q2.3 “Statistics producers should be transparent about methods used, giving the reasons for their selection. The level of detail of the explanation should be proportionate to the complexity of the methods chosen and reflect the needs of different types of users and uses.”

  • Q2.4 “Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation.”

  • Q2.5 “Producers of statistics and data should provide users with advance notice about changes to methods, explaining why the changes are being made. A consistent time series should be produced, with back series provided where possible. Users should be made aware of the nature and extent of the change.”

  • Q2.6 “Statistics producers should collaborate with topic and methods experts and producers of related statistics and data wherever possible.”

Q3: Assured quality - Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely.

  • Q3.1 “Statistics should be produced to a level of quality that meets users’ needs. The strengths and limitations of the statistics and data should be considered in relation to different uses, and clearly explained alongside the statistics.”

  • Q3.2 “Quality assurance arrangements should be proportionate to the nature of the quality issues and the importance of the statistics in serving the public good. Statistics producers should be transparent about the quality assurance approach taken throughout the preparation of the statistics. The risk and impact of quality issues on statistics and data should be minimised to an acceptable level for the intended uses.”

  • Q3.3 “The quality of the statistics and data, including their accuracy and reliability, coherence and comparability, and timeliness and punctuality, should be monitored and reported regularly. Statistics should be validated through comparison with other relevant statistics and data sources. The extent and nature of any uncertainty in the estimates should be clearly explained.”

  • Q3.4 “Scheduled revisions, or unscheduled corrections that result from errors, should be explained alongside the statistics, being clear on the scale, nature, cause and impact.”

  • Q3.5 “Systematic and periodic reviews on the strengths and limitations in the data and methods should be undertaken. Statistics producers should be open in addressing the issues identified and be transparent about their decisions on whether to act.”

Quality Planning and Continuous Improvement

Following each publication, SLC reviews what improvements are appropriate, drawing on user engagement, lessons learned and any changes that have occurred in the preceding period.