Participation Survey Quality Report

Updated 9 November 2023

Applies to England

1. Participation Survey Quality Report

1.1 About the Participation Survey

The Participation Survey is a continuous push-to-web survey of adults aged 16 and over in England, with paper surveys available for those who are not digitally engaged. It has been running since October 2021 and is a main evidence source for DCMS and its sectors, providing statistically representative national, regional and local estimates of adult engagement with the DCMS sectors. The survey’s main objectives are to:

  • Provide a central, reliable evidence source that can be used to analyse cultural, digital, and live sporting engagement, providing a clear picture of why people do or do not engage.
  • Provide data at a county level to meet user needs, including providing evidence for the levelling up agenda.
  • Underpin further research on driving engagement and the value and benefits of engagement.

More information on the Participation Survey can be found on the DCMS website.

1.2 Publications

The Participation Survey data is used to produce four publications per year:

  • April - June quarterly release
  • July - September quarterly release
  • October - December quarterly release
  • January - March quarterly release and annual release (these are one publication combined)

The publication calendar for the Participation Survey statistics collection can be found on the GOV.UK website. As well as these publications, some of the data from the survey are used in the DCMS Outcome Delivery Plan metrics (outcome 4).

1.3 Quality control

As Official Statistics, Participation Survey statistics are produced to the high professional standards set out in the Code of Practice for Statistics, and many quality control measures are in place to ensure the integrity of the data. These are covered in this document, and in more detail in the Technical reports published alongside the statistical releases.

1.4 Quality Assurance

Quality assurance checks are carried out by our contractor and within DCMS. Quality checks carried out by the contractor are available in the Technical report published alongside the statistical releases.

To ensure an independent evaluation of the work, production of the analysis and report is typically carried out by one member of staff, and quality assurance is completed by at least one other. Finalised figures are disseminated in OpenDocument Format tables and a written headline report (annually), published on GOV.UK. These are produced by the survey team in DCMS. Before publishing, a quality assurer checks the data tables and the report to minimise errors. Checks are recorded in a QA log, through which comments can be fed back and actioned. The quality assurer also makes sure any statements made about the figures (for example, regarding trends) are supported by the analysis, and checks for spelling and grammatical errors.

Proofreading and publication checks are done at the final stage, including:

  • checking the figures in the publication match the published tables
  • checking the footnote numbering is correct
  • making sure hyperlinks work
  • checking chart/table numbers are in the correct order
  • ensuring the publication is signed off by DCMS Head of Profession for Statistics and DCMS Head of Division
  • contacting press office to ensure they are aware of the release date
  • checking the published GOV.UK page again after publishing

Once the publication is released, DCMS reviews the processes and procedures followed via a wash-up meeting. This usually occurs a week after the publication release date and discusses:

  • what went well and what issues were encountered
  • what improvements can be made for next time
  • what feedback we have received from engaging with users

1.5 Revisions policy

Non-Scheduled Revisions

The DCMS corrects and revises data in accordance with its Statement of compliance with the Code of Practice for Official Statistics.

Where a substantial error has occurred as a result of the compilation or dissemination process, the statistical release, data tables and other accompanying releases will be updated with a correction notice as soon as is practical.

1.6 Quality Overview of the Participation Survey

The Participation Survey produces Official Statistics, and as such is held to the quality standards required by the UK Statistics Authority.

The following section outlines the quality of the statistics from the Participation Survey using the European Statistical System’s (ESS) Dimensions of Quality criteria for assessing the fitness for purpose of statistical outputs.

1.7 Relevance

The Participation Survey is the successor to the Taking Part Survey, which ran for 16 years. In 2020 and 2021, DCMS undertook in-depth reviews of its survey sources. It was concluded that, to meet the evidence needs for data on participation in DCMS sectors during COVID-19 recovery, a new adult Participation Survey would need to be designed and commissioned. This would allow data to be collected within a landscape of changing social contact restrictions. Findings from the review on the need for more regular and geographically granular data were fed into the design of an interim adult Participation Survey.

Through this review, user engagement, and the running of the Taking Part Survey over the past 16 years, DCMS has gained insight into how participation data are used and what users need from them, and has been able to identify many key users of participation data for DCMS sectors. The users of these statistics fall into five broad categories:

1. Ministers and other political figures

2. Policy and other professionals in DCMS and other Government departments

3. Industries and their representative bodies

4. Charitable organisations

5. Academics

The broad uses of the statistics are:

  • Policy making and monitoring – the statistics are used by policy areas to monitor the state of engagement in their area and to provide context and evidence for policies; the data is also used to inform discussion around the allocation of resources, and to provide advice to Ministers.
  • Informing the general public – the statistics are used by both national and local media, which in turn informs the public about engagement in cultural, digital and sporting activities.
  • Comparisons and benchmarking
  • Developing the evidence base

The Participation Survey differs in some respects from its predecessor, the Taking Part Survey, which was not directly meeting all the needs of DCMS and other users, and which used a face-to-face methodology that was susceptible to the changing environment during the pandemic. These differences are explored further in a comparison paper between the two surveys.

Every year the Participation Survey is reviewed to assess whether any questions need to be added, removed or amended to ensure the survey meets user needs as far as possible. Any new questions undergo in-depth pilot testing to check that they are fit for purpose and understood by respondents. Recent examples are included in the 2021/22 Pilot Report.

The Participation Survey annual dataset is freely accessible from the UK Data Service for users who wish to explore micro-level data for themselves.

As this is a new survey series, we will continue to build our user networks and to deepen our understanding of users’ needs and how well our outputs meet them. This will be an ongoing process, and we will feed user needs into the development of the survey.

1.8 Accuracy & Reliability

As discussed in the section on quality assurance, the Survey Statistics team run some specific checks before publishing the data. The participation statistics are based on survey data and, as with all data from surveys, there will be an associated error margin surrounding these estimates, that is, a sampling error. Sampling error is the error caused by observing a sample (as in a survey) instead of the whole population (as in a census). While each sample is designed to produce the “best” estimate of the true population value, a number of equal-sized samples covering the population would generally produce varying population estimates. This means we cannot say an estimate of, for example, 20% is totally accurate for the whole population. Our best estimates, from the survey sample, suggest that the figure is 20%, but due to the degree of error the true population figure could perhaps be 18% or 23%. This is not an issue with the quality of the data or analysis, rather it is an inherent principle when using survey data to inform estimates.
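The effect of sampling error described above can be illustrated with a standard confidence interval for a proportion. This is a hypothetical sketch: the sample size, and the assumption of a simple random sample with no design effect, are invented for illustration and are not the survey's actual figures or methodology.

```python
import math

# Illustrative 95% confidence interval for a survey estimate of a
# proportion, assuming a simple random sample. Real survey estimates
# would also apply a design effect for weighting and clustering,
# which is omitted here for simplicity.
p = 0.20   # estimated proportion (e.g. 20% engagement)
n = 2500   # assumed number of respondents (hypothetical)
z = 1.96   # z-score for a 95% confidence level

standard_error = math.sqrt(p * (1 - p) / n)
lower = p - z * standard_error
upper = p + z * standard_error

print(f"Estimate: {p:.0%}, 95% CI: {lower:.1%} to {upper:.1%}")
# prints "Estimate: 20%, 95% CI: 18.4% to 21.6%"
```

A larger sample narrows the interval, which is one reason boosted samples (such as the 2023/24 Arts Council England boost) support estimates at finer geographies.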

Accuracy can be broken down into sampling and non-sampling error. Completion of the survey is voluntary, but we aim to achieve a representative response from all adults, thereby minimising sampling error. To ensure the survey data is representative, our contractor regularly monitors the response rate, as discussed in the quality assurance section. Fieldwork is monitored on a weekly basis at ITL level as well as at an overall level, to track progress against targets and to enable future batches of the sample to be selected appropriately. Further information is available in the ‘Sampling’ section of the Technical report published alongside each statistical release.

Non-sampling error includes coverage error, non-response error, measurement error and processing error. We aim to reduce non-sampling error by providing guidance about the survey and how to complete it, as well as FAQs and definitions of the terms used. There are validation checks within the survey to ensure that data is of good quality and fit for purpose. Despite these quality control and assurance procedures, the survey is based on self-reporting, which may not reflect actual behaviour; disadvantages of relying on self-reporting include inaccurate recall and social desirability bias. While there are extensive validation checks in place to minimise such errors, it is not possible to eliminate them entirely. There are also limitations on the amount of data that can be collected. Although the Participation Survey collects a rich level of detail about engagement with DCMS sectors, it is not possible to capture extensive information about each engagement, mainly because of space in the survey but also because of the limits on respondents’ ability to accurately recall events in detail, especially up to 12 months before the survey.

1.9 Timeliness and Punctuality

There is a trade-off between timeliness and the other quality dimensions, in particular accuracy, accessibility and clarity. It is important that Participation Survey statistical releases have adequate processes in place to ensure the accuracy of the dataset, to produce clear publication tables, and to apply appropriate disclosure control to the data released.

The Participation Survey is run throughout the year. To provide timely data to stakeholders, we aim to publish a summary set of statistics (in open data format tables) within 3 months of the end of the data collection period.

  • April to June fieldwork: published in September
  • July to September fieldwork: published in December
  • October to December fieldwork: published in March
  • January to March fieldwork: published in July (combined with the annual publication for the survey year)

These publication dates are provided in the DCMS Official Statistics calendar, which provides a 12-month rolling publication schedule. If there is a need to change a planned publication date, this is updated in the official statistics calendar and a change note is added to the GOV.UK release calendar. The data production and publication schedules are kept under review and take into account user needs when considering the timeliness of future data releases.

In accordance with the Pre-release Access to Official Statistics Order 2008, ministers and eligible staff are given pre-release access to DCMS statistics 24 hours before release. A list of people given pre-release access is published alongside the relevant release.

1.10 Comparability and Coherence

The Participation Survey data are comparable across the 9 English regions and between counties in England. The survey is not representative at local authority level, except in 2023/24, when the survey was boosted by Arts Council England, meaning the sample size was significantly larger and estimates at local authority level are representative. Where possible, the Participation Survey uses harmonised question sets and definitions that are consistent with ONS and other government surveys. This makes our data on common areas such as demographics (for example, ethnicity) and wellbeing comparable with a wide range of other government surveys.

Prior to the launch of the Participation Survey in October 2021, two rounds of cognitive testing were conducted to ensure that the questions were answered in the manner expected. This is covered in the Pilot Report.

Since 2005, the Taking Part Survey (TPS) had been DCMS’ flagship survey, collecting data on how adults and children engage with DCMS’ sectors. The emergence of the COVID-19 pandemic prevented face-to-face fieldwork taking place in the 2020/21 (year 16) survey year and therefore caused an unavoidable break in the survey time series. The Participation Survey was introduced in October 2021, building on new user needs as well as advances in methodology that provided better value for money and were more resilient during the COVID-19 pandemic. There are therefore differences in mode, questionnaire content, sampling approach and methodology between the two surveys, as well as real-world changes (such as COVID-19). As a result, a direct comparison of the Taking Part Survey and the Participation Survey is not considered feasible. Further information is available in the comparability document.

In 2021, a survey measure mapping exercise was undertaken to identify surveys that contain measures relevant to DCMS sectors. The mapping exercise identified 68 surveys of varying quality, with the most robust surveys generally being conducted by government departments, universities, arms-length bodies or international research groups. Robust surveys such as Sport England’s Active Lives Survey remain integral to the evidence base for specific sectors; however, no surveys were identified that were commissioned on a regular basis for the sole purpose of measuring arts and culture attendance and participation. Q15 in the FAQ document outlines some of the other sources which provide contextual information for some of the DCMS sectors during the COVID-19 pandemic. Please note that differences in findings are expected, given that the mode of data collection, time period and sampling frame vary.

1.11 Accessibility and Clarity

The survey is based on a representative sample of all adults in England. The address sample is designed to yield a respondent sample that is representative with respect to neighbourhood deprivation level and age group within each of the 33 ITL2 regions in England. All selected addresses were sent an initial invitation letter containing the following information:

  • A brief description of the survey
  • The URL of the survey website (used to access the online script)
  • A QR code that can be scanned to access the online survey
  • Log-in details for the required number of household members
  • An explanation that participants will receive a £10 shopping voucher
  • Information about how to contact the contractor in case of any queries

The reverse of the letter featured responses to a series of Frequently Asked Questions.
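The sample design described above, in which addresses are drawn so that the respondent sample reflects deprivation level and age group within each region, can be sketched as a proportional stratified allocation. This is a hypothetical illustration: the strata, population counts and sample size below are invented, and the survey's actual allocation method may differ.

```python
# Illustrative proportional stratified allocation of an address sample.
# Strata (region x deprivation level) and all counts are invented
# examples, not the Participation Survey's real sample design.

population = {  # addresses per (region, deprivation) stratum
    ("ITL2-A", "high deprivation"): 120_000,
    ("ITL2-A", "low deprivation"):  180_000,
    ("ITL2-B", "high deprivation"):  90_000,
    ("ITL2-B", "low deprivation"):  210_000,
}

total_sample = 6_000  # assumed total addresses to issue (hypothetical)
total_pop = sum(population.values())

# Each stratum is issued addresses in proportion to its population share.
allocation = {
    stratum: round(total_sample * size / total_pop)
    for stratum, size in population.items()
}

for stratum, n in allocation.items():
    print(stratum, n)
```

In practice a design like this is refined with oversampling of smaller or harder-to-reach strata, and weighting at the analysis stage corrects for remaining imbalance in who responds.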

As well as the online survey, respondents were given the option to complete a paper questionnaire, which consisted of an abridged version of the online survey. Each letter informed respondents that they could request a paper questionnaire by contacting the contractor using the email address or freephone telephone number provided. In addition, two copies of the paper questionnaires were included with the second reminder letter. A round of piloting was conducted to ensure the process was accessible and clear to respondents. This is summarised in the Pilot Report.

The Participation Survey web pages are accessible from the DCMS statistics launch page. They are published in an accessible, orderly, pre-announced manner on the GOV.UK website at 9:30am on the day of publication. An RSS feed alerts registered users to this publication. All releases are available to download for free.

The outputs aim to provide a balance of commentary, summary tables and charts. The aim is to ‘tell the story’ in the output, without the output becoming overly long and complicated. The publication is available in HTML format and includes email contact details for sending queries and feedback to the production team. Additional, detailed data tables are available in ODS files.

The format used is in line with DCMS departmental guidance. It aims to make outputs clear for the audience and all outputs adhere to the accessibility policy. Key users of the publication are informed of the statistics on the day of their release. Further information regarding the statistics can be obtained by contacting the relevant staff detailed on the release via: [email protected].

1.12 Trade-Off Between Output Quality Components

Trade-offs are the extent to which different aspects of quality are balanced against each other. The statistics are produced and released according to a pre-announced timetable.

As discussed previously, the timetable for publication is designed to provide users with the best balance of timeliness and accuracy. Whilst the fieldwork periods end in March, June, September, and December, completed surveys may be returned after the end of the quarter, so we need to balance the need to publish quarterly statistics promptly with capturing as many completed responses for the quarter as possible. There may be instances where completed responses for a quarter are not captured in the quarterly results but are subsequently included in the annual statistics. These will be noted in the correct quarter in the datasets available for stakeholders to use.

1.13 Assessment of User Needs and Perceptions

The DCMS Survey team works closely with key customers and stakeholders in DCMS to keep track of developments in policy, and continuously review the coverage and content of publications to ensure that they meet the needs of users. We also consult our external users, including ALBs and academics, on their uses and needs of participation statistics to better understand user requirements and priorities for the future. We encourage feedback on all our outputs. Contact details are available at the bottom of each publication for users to get in touch if they have a query.

Broader consultations are conducted when appropriate, for example when significantly changing a survey, the methodology, or the provision or coverage of the published data. These generally involve contacting known users of the published statistics to ask specific questions or request feedback. These questions are also published on the website or linked to the publication in order to capture users with whom we have had no previous contact. The results of user consultations are published on the website. The most recent user consultation ran from February to May 2024 and considered the survey needs of the department, covering the Participation Survey and the Community Life Survey.