Official Statistics

Cyber security breaches survey 2023: technical report

Published 19 April 2023

Introduction

This Technical Annex provides the technical details of the Cyber Security Breaches Survey 2023. It covers the quantitative survey (fieldwork carried out in winter 2022 and early 2023) and the qualitative element (carried out in early 2023), and includes copies of the main survey instruments (in the appendices) to aid interpretation of the findings.

The annex supplements a main Statistical Release published by the Department for Science, Innovation and Technology (DSIT), covering this year’s results for businesses and charities.

A separate Education Institutions Findings Annex, available on the same GOV.UK page, covers the findings for schools, colleges and universities.

This research was previously commissioned by the former Department for Digital, Culture, Media and Sport (DCMS). In February 2023, the parts of UK government responsible for cyber security policy moved to the new department, DSIT. Since 2023, the study has also been co-funded and co-developed by the Home Office.

The Cyber Security Breaches Survey is a research study of UK cyber resilience, aligning with the National Cyber Strategy. It is primarily used to inform government policy on cyber security, with the aim of making UK cyberspace a secure place to do business. The study explores the cyber security policies, processes and approaches of businesses, charities and education institutions. It also considers the different cyber attacks and cyber crimes these organisations face, as well as how they are impacted and respond.

For this latest release, the quantitative survey was carried out in winter 2022/23 and the qualitative element in early 2023.

Responsible analyst:

Emma Johns

Responsible statistician:

Maddy Ell

Statistical enquiries:

[email protected]

Chapter 1: Overview

1.1 Summary of methodology

As in previous years, there were two strands to the Cyber Security Breaches Survey:

  • We undertook a random probability telephone and online survey of 2,263 UK businesses, 1,174 UK registered charities and 554 education institutions from 27 September 2022 to 18 January 2023. The data for businesses and charities have been weighted to be statistically representative of these two populations.

  • We carried out 44 in-depth interviews between December 2022 and January 2023, to gain further qualitative insights from some of the organisations that answered the survey.

Sole traders and public-sector organisations were outside the scope of the survey. In addition, businesses with no IT capacity or online presence were deemed ineligible. These exclusions are consistent with previous years.

This year’s survey sees more substantial changes to aspects of the survey methodology than in previous years (see Section 1.4). Nevertheless, the core approach – data collected from organisations via a random-probability survey, predominantly conducted by telephone – is unchanged, and therefore the survey is still considered comparable across years.

1.2 Strengths and limitations of the survey overall

While there have been other surveys about cyber security in organisations in recent years, these have often been less applicable to the typical UK business or charity for several methodological reasons, including:

  • focusing on larger organisations employing cyber security or IT professionals, at the expense of small organisations (with under 50 staff) that typically do not employ a professional in this role – these small organisations make up the overwhelming majority of the business and charity populations

  • covering several countries alongside the UK, which leads to a small sample size of UK organisations

  • using partially representative sampling or online-only data collection methods.

By contrast, the Cyber Security Breaches Survey series is intended to be statistically representative of UK businesses of all sizes and all relevant sectors, and of UK registered charities in all income bands.

The 2023 survey shares the same strengths as previous surveys in the series:

  • the use of random probability sampling and interviewing to avoid selection bias

  • the inclusion of micro and small businesses, and low-income charities, which ensures that the respective findings are not disproportionately skewed towards larger organisations

  • a data collection approach predominantly conducted by telephone, which aims to also include businesses and charities with less of an online presence (compared to online-only surveys)

  • a comprehensive attempt to obtain accurate frequency and cost data from respondents, giving respondents flexibility in how they can answer (e.g. allowing numeric and banded amounts), and sending them a follow-up online survey to validate answers given in telephone interviews

  • a consideration of the cost of an organisation’s most disruptive cyber security breach or attack beyond the immediate direct costs (i.e. explicitly asking respondents to consider longer-term direct costs, staff time costs, as well as other indirect costs, while giving a description of what might be included within each of these cost categories).

At the same time, while this survey aims to produce the most representative, accurate and reliable data possible with the resources available, it should be acknowledged that there are inevitable limitations of the data, as with any survey project. The following might be considered the main limitations:

  • Organisations can only tell us about the cyber security breaches or attacks that they have detected. There may be other breaches or attacks affecting organisations, but which are not identified as such by their systems or by staff, such as a virus or other malicious code that has so far gone unnoticed. Therefore, the survey may tend to systematically underestimate the real level of breaches or attacks. This equally applies to the cyber crime and cyber-facilitated fraud prevalence and scale estimates, given that these types of crimes emanate from cyber security breaches and attacks.

  • The business survey intends to represent businesses of all sizes. As the BEIS Business Population Estimates 2022 show, the UK business population is predominantly made up of micro and small businesses (respectively 82% and 15% of all enterprises). This presents a challenge – these businesses, due to their smaller scale and resource limitations, typically have a less mature cyber security profile. This may limit the insights this study in isolation can generate into the more sophisticated cyber security issues and challenges facing the UK’s large business population, and the kinds of high-impact cyber security incidents that appear in the news and media. Nevertheless, the study design attempts to balance this by boosting survey responses among medium and large businesses (and high-income charities) and by focusing on larger organisations in the qualitative strand. Moreover, DSIT undertakes a separate survey series focused on larger organisations, the Cyber Security Longitudinal Survey, partly to address this limitation.

  • Organisations may be inclined to give answers that reflect favourably on them in surveys about cyber security (a form of social desirability bias), given the common perceptions of reputational damage associated with cyber security incidents. Furthermore, organisations that have suffered from more substantial cyber security incidents may be less inclined to take part because of this. This may result in surveys like this one undercounting the true extent and cost of cyber security incidents, although we have no direct evidence of this (for example from cognitive testing). Moreover, we make a concerted effort to overcome this in the administration of the survey. We make it clear to respondents, across a range of communication materials, that their answers are confidential and anonymous.

  • A significant challenge remains in terms of designing a methodology that accurately captures the financial implications of cyber security incidents, given that survey findings necessarily depend on self-reported costs from organisations. As previous years’ findings and government research from 2020 on the full cost of cyber security breaches suggest, there is no consistent framework across organisations at present that supports them to understand and monitor their costs, and many organisations do not actively monitor these costs at all. Moreover, we consciously opted not to ask about certain long-term indirect costs (see Section 2.1), as it was unrealistic to collect accurate figures for these areas in a single survey. In addition, a survey based on a sample such as this one may miss some of the most financially damaging cyber security incidents, which affect a very small number of UK organisations in a very extreme way. This implies that respondents may underestimate the true economic cost of their most disruptive breaches or attacks in the survey, and that our averaged results may miss critical cases within the population. This particular challenge also applies to the cost of cyber crime estimates.

  • The total populations of further and higher education institutions are very small (305 and 175 respectively, at the time of the survey). This limits the ability to achieve relatively high sample sizes among these groups. It results in much higher margins of error for the survey estimates for these groups, compared to businesses, charities and schools.

1.3 Strengths and limitations of the new cyber crime statistics

This year’s survey produces statistics on cyber crime, and on fraud that occurs as a result of cyber crime (i.e. cyber-facilitated fraud), in UK organisations for the first time. This addition was overseen by both DSIT and the Home Office. It includes estimates for:

  • the prevalence of cyber crime, i.e. how many organisations are affected by it

  • the nature of these cyber crimes

  • the scale of cyber crimes, i.e. the number of times each organisation is impacted, and estimates for the total number of cyber crimes against UK organisations

  • the financial cost of cyber crime

  • a similar set of statistics with regards to frauds that occur as a result of cyber crime (cyber-facilitated fraud), but excluding financial cost estimates for these frauds, which could not be reported this year (explained in Section 2.7).

The survey approaches these estimates in a similar way to existing official estimates of crime against individuals. This includes police-recorded crime as well as the estimates from the general public Crime Survey for England and Wales (CSEW), both of which follow the Home Office Counting Rules. The approach to this can be considered highly robust in various ways:

  • Comprehensiveness – the questionnaire was set up to measure multiple types of cyber crime, relating to ransomware, viruses and other malware, unauthorised access to data, online takeovers, denial of service and phishing. Cyber-facilitated fraud is counted separately, as a different category of crime.

  • Isolating criminal acts – the survey asks a series of questions to establish whether the cyber security breaches or attacks that organisations have experienced are crimes. It systematically aims to exclude cyber attacks that were stopped by software and breaches where the organisation was not deliberately targeted (e.g. accidental accessing of confidential data by employees). It only includes phishing attacks in cases where organisations confirmed either that employees responded in some way (e.g. by opening an attachment) or that the email contained personal data relating to the recipient – and where no other crime followed.

  • Avoidance of double-counting – the survey aims to establish whether a criminal act was the final event, separate from other cyber incidents, or part of a wider chain of events leading to a subsequent and different category of cyber incident. For example, instances of unauthorised access may have led to subsequent events, such as ransomware, other malware or cyber-facilitated fraud. In these instances, only the final event is recorded as a crime. This avoids double-counting, in line with the Home Office Counting Rules.

However, as this is the first year these questions have been asked and there is no baseline for comparison, users should be relatively cautious when interpreting these statistics. They should ideally be considered alongside other, related evidence on computer misuse in organisations, such as the statistics for retail and wholesale premises collected in the 2021 Commercial Victimisation Survey (CVS). The CVS and the Cyber Security Breaches Survey are not directly comparable – the former represents one business sector and is sampled at an establishment level, while this study represents all sectors and comes from a sample of enterprises. However, it does highlight that 7% of retail and wholesale premises were victims of computer misuse in 2021, and that most of these events were phishing-related. This matches some of the findings in this latest Cyber Security Breaches Survey.

The Crime Survey for England and Wales (CSEW) also estimated c.690,000 computer misuse offences experienced by the general public (in England and Wales) in the 12 months up to the end of September 2022. Again, the CSEW and Cyber Security Breaches Survey are not directly comparable – the CSEW does not look at crime against organisations and excludes Scotland and Northern Ireland. However, it does provide a benchmark for the scale of cyber crime against individuals in England and Wales, to help contextualise the equivalent results for UK organisations in this survey.

The data collection will be subject to refinement in future years. For this year, the results have the following limitations:

  • Each individual type of cyber crime is a relatively rare event. As such, some of the questions have very low sample sizes, and it is not possible to report statistically reliable results for each cyber crime type (e.g. on the number of ransomware-related crimes or hacking-related crimes). The Statistical Release focuses instead on reporting aggregated statistics for all cyber crimes among businesses and charities overall (in terms of prevalence, scale and cost) and breaking these down where possible. For education institutions, where the sample sizes are smaller, it is only possible to report cyber crime prevalence figures for primary schools, colleges and universities, and both prevalence and scale figures (but not cost estimates) for secondary schools.

  • These new questions put a high cognitive load on respondents, given that they required individuals to recall very specific information about individual cyber incidents. The questions were cognitively tested and piloted, so we are confident that the majority of respondents understood and answered the questions as intended. However, a review of the data highlighted contradictions in the responses for a minority of respondents, for example when comparing their answers on cyber security breaches or attacks (from the older questions) and cyber crimes (from the new questions). Where there were contradictions, Ipsos attempted a second call to organisations (where permission was given for a call back for quality control purposes) to check the answers, and collect new answers if necessary. More information on this is provided in Section 2.4, including how we have handled contradictions where no call back was possible.

  • This year, cyber crime specifically to do with successful hacking of online bank accounts (filtered from TYPE4) has not been covered. This would constitute a hacking-related cyber crime, but this year’s cyber crime questions do not measure the prevalence and scale of these specific events. This is likely to be included in subsequent years. Given that just 4% of businesses and 1% of charities identified cyber security breaches or attacks relating to hacking or attempted hacking of online bank accounts, we expect this omission to result in a relatively minor underreporting of cyber crimes this year. This does not affect the estimates for fraud that occurred as a result of hacking of online bank accounts (which covers 23% of the businesses that experienced cyber-facilitated fraud).

1.4 Methodology changes from previous waves

One of the objectives of the survey is to understand how approaches to cyber security and the cost of breaches are evolving over time. Therefore, the methodology is intended to be as comparable as possible to previous surveys in the series.

As the core approach – data collected from organisations via a random-probability survey, predominantly conducted by telephone – is unchanged, we continue to make comparisons to previous years. Nevertheless, this year, we made more substantial changes to aspects of the methodology than in previous years. The noteworthy changes were as follows:

  • In order to explore the new topic of cyber crime, the sample sizes for the quantitative survey were increased this year, including a higher number of medium firms (277, vs. 149 in 2022), large firms (199, vs. 134 in 2022), and high-income charities (358, vs. 138 in 2022) specifically. In total, we interviewed 3,991 respondents across all types of organisation (vs. 2,157 in 2022). We also undertook a higher number of qualitative follow-up interviews (44, vs. 35 in 2022), to allow us to explore more topics. The remaining changes discussed in the following bullet points were, in part, necessary to be able to deliver these larger sample sizes.

  • We changed the sample frame for businesses from the Inter-Departmental Business Register (IDBR) to the Market Location business database. This was done to improve the overall sample quality, accuracy and telephone coverage. We do not expect this specific adaptation to have made a meaningful impact on trend findings – more explanation of this is provided in Section 2.3. The sample frames for charities and education institutions were consistent with previous years (see Section 2.3).

  • We adopted a multimode data collection approach, allowing organisations to take part partially or fully online as well as by phone. This matches the approach taken in other random probability business surveys since the COVID-19 pandemic (e.g. the cyber skills labour market study) and reflects the increasing need to offer organisations the flexibility to respond online under hybrid working. More details are in Section 2.4.

  • For businesses and charities, we substantially increased the use of split-sampling – where certain questions are asked of only a random half of the sample. We also restricted various questions to larger organisations (medium and large businesses, and high-income charities). Both changes were made to keep the questionnaire length comparable to previous years. These changes have resulted in some primary and secondary schools not being asked the full set of questions this year. The questions affected, including the impact on the schools samples, are covered in more detail in Section 2.1.

  • We removed the COSTA and COSTB questions, which estimated the total cost of all cyber security breaches or attacks organisations have faced in the last 12 months. This was decided after the pilot survey, when it was apparent that the questionnaire length needed to be further reduced. In previous years, these statistics have been relatively prominent in the summary and conclusions of the main Statistical Release, indicating that the overall cost of cyber security breaches or attacks is substantial for the average organisation. There are now a set of new questions estimating the total cost of cyber crime. All questionnaire changes and additions are comprehensively covered in Section 2.1.

The following points cover other major changes or additions to the study that have been made in previous years:

  • The agriculture, forestry and fishing sector was included in the business sample for the first time in 2022. This is a small sector, accounting for 3.6% of all UK businesses. Its inclusion has a negligible impact on the comparability of findings across years.

  • The charities sample was added in 2018, while the education institutions sample was added in 2020. The initial scope of the school and college samples was expanded from 2021 to include institutions in Wales, Scotland and Northern Ireland, as well as England.

  • The government’s 10 Steps to Cyber Security guidance was refreshed between the 2021 and 2022 studies. As such, the way the 10 steps mapped to the questionnaire changed, and this section of the Statistical Release is not comparable to releases pre-2022.

  • In 2021, we substantially changed the way we collect data on the costs of breaches in the survey, as part of a reflection on findings from a separate 2020 DCMS (now DSIT) research study on the full cost of cyber security breaches. These changes mean we cannot make direct comparisons between data from 2021 onwards and previous years. We can, however, still comment on whether the broad patterns in the data are consistent with previous years, for example the differences between smaller and larger businesses, as well as charities.

1.5 Comparability to the pre-2016 Information Security Breaches Surveys

From 2012 to 2015, the government commissioned and published annual Information Security Breaches Surveys.[footnote 1] While these surveys covered similar topics to the Cyber Security Breaches Survey series, they employed a radically different methodology, with a self-selecting online sample weighted more towards large businesses. Moreover, the question wording and order is different for both sets of surveys. This means that comparisons between surveys from both series are not possible.

1.6 Margins of error

The survey results for businesses and charities are weighted to be representative of the respective UK population profiles for these organisations. The education institution samples are unweighted, but these groups are included as simple random samples, i.e. without any disproportionate stratification. As such, they are also considered to be representative samples. Therefore, it is theoretically possible to extrapolate survey responses to the wider population (with the exception of the financial cost data, as explained at the end of this section).

We recommend accounting for the margin of error in any extrapolated results. Table 1.1 shows the overall margins of error (MoE) for the sampled groups in the survey, for different survey estimates.

As a worked example, the overall business sample this year has a margin of error range of ±1.4 to ±2.4 percentage points, based on a 95% confidence interval calculation. That is to say, if we were to conduct this survey 100 times (each time with a different sample of the business population), we would expect the results to be within 1.4 to 2.4 percentage points of the results we achieved here in 95 out of those 100 cases. The range illustrates that survey results closer to 50% tend to have higher margins of error. If 90% of surveyed businesses said cyber security is a high priority for their senior management, this result would have a margin of error of ±1.4 percentage points, whereas if only 50% said this, the margin of error would be ±2.4 percentage points. The margins of error are calculated using the effective sample sizes (which take into account survey weighting).

For reference, we have also included MoE calculations for the split-sampled questions, where the business and charities samples are roughly half of the total. In these cases we have used the lower of the two split-samples. For example, where the business questions are split-sampled, some questions were asked to a randomly selected 1,152 business respondents (out of the total 2,263) whereas some questions were asked to the remaining 1,111. We have calculated the MoE for the 1,111.

Table 1.1: Margins of error (MoE) for each sample group for different survey estimates (in percentage points)

Sample group | Sample size | Effective sample size | 10% or 90% estimate | 30% or 70% estimate | 50% estimate
Businesses | 2,263 | 1,702 | ±1.4 | ±2.2 | ±2.4
Businesses – split-sampled | 1,111 | 849 | ±2.0 | ±3.1 | ±3.4
Charities | 1,174 | 808 | ±2.1 | ±3.2 | ±3.4
Charities – split-sampled | 570 | 392 | ±3.0 | ±4.5 | ±4.9
Primary schools | 241 | 241 | ±3.8 | ±5.8 | ±6.3
Secondary schools | 217 | 217 | ±4.0 | ±6.1 | ±6.7
Further education | 44 | 44 | ±8.2 | ±12.5 | ±13.7
Higher education | 52 | 52 | ±6.8 | ±10.4 | ±11.4
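
For readers who want to reproduce or extend these calculations, the sketch below (in Python) shows how the figures in Table 1.1 can be approximated. It assumes the standard 95% confidence interval formula for a proportion, applied to the effective sample sizes; the published figures may differ slightly through rounding, and the further and higher education rows are narrower than this simple formula would give, consistent with a finite population correction for those very small populations (not applied in this sketch).

```python
import math

def margin_of_error(p: float, effective_n: float, z: float = 1.96) -> float:
    """Approximate 95% margin of error, in percentage points, for a survey
    estimate p (e.g. 0.5 for a 50% estimate) with the given effective sample size."""
    return 100 * z * math.sqrt(p * (1 - p) / effective_n)

# Effective sample sizes taken from Table 1.1
groups = {"Businesses": 1702, "Charities": 808, "Primary schools": 241}

for name, n_eff in groups.items():
    moes = [round(margin_of_error(p, n_eff), 1) for p in (0.1, 0.3, 0.5)]
    print(name, moes)   # Businesses -> [1.4, 2.2, 2.4], as in Table 1.1

# Where weights are applied, the effective sample size can be approximated
# with the Kish formula: (sum of weights)^2 / (sum of squared weights).
```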

1.7 Extrapolating results to the wider population

The total population sizes for each of these sample groups are as follows:

  • 1,477,900 UK businesses (according to the BEIS Business Population Estimates 2022)

  • 201,325 UK registered charities (combining the lists of registered charities across the 3 UK charity regulator databases, laid out in Section 2.3)

  • 20,783 primary schools (including free schools, academies, Local Authority-maintained schools and special schools covering children aged 5 to 11)

  • 3,886 secondary schools (including free schools, academies, Local Authority-maintained schools and special schools covering children aged 11+)

  • 305 further education colleges

  • 175 universities.

As the samples for each group are statistically representative, it is theoretically possible to extrapolate survey results to the overall population. This applies both to the standard percentage estimates across the survey, and to the estimates of the scale of cyber crime and cyber-facilitated fraud (where the Statistical Release does include extrapolated estimates for the business and charity populations).

We recommend restricting any extrapolation of results to the overall business and charity populations rather than to any subgroups within these populations (e.g. large businesses, or construction businesses). The sample sizes for these subgroups in our survey are much smaller than the overall sample sizes, and consequently have much higher margins of error. Similarly, the sample sizes for education institutions are small and have relatively high margins of error.

Any extrapolated results should be clearly labelled as estimates and, ideally, should be calibrated against other sources of evidence.
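
As an illustration of how such an extrapolation works in practice, the short sketch below uses an invented survey percentage (the 32% figure is hypothetical, not a survey result) together with the business population total above and the corresponding margin of error from Table 1.1.

```python
# Illustrative extrapolation only: the 32% figure is hypothetical.
uk_businesses = 1_477_900   # BEIS Business Population Estimates 2022
estimate = 0.32             # hypothetical survey percentage
moe = 0.022                 # c. +/-2.2 percentage points for a ~30% estimate (Table 1.1)

central = estimate * uk_businesses
low = (estimate - moe) * uk_businesses
high = (estimate + moe) * uk_businesses
print(f"Approximately {central:,.0f} UK businesses ({low:,.0f} to {high:,.0f})")
```

Any figure produced in this way should be presented as an estimate with its associated range and, as noted below, the financial cost estimates should not be extrapolated at all.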

We specifically do not consider the financial cost estimates from this survey to be suitable for this sort of extrapolation (e.g. to produce a total cost of cyber incidents, cyber crime or cyber-facilitated fraud for the UK economy). These estimates tend to have a high level of statistical standard error, so the margins of error for any extrapolated cost estimate are likely to be very wide, limiting the value of such an estimate.

If you wish to use extrapolated Cyber Security Breaches Survey data as part of your analysis or reporting, then we would encourage you to contact DSIT via the evidence mailbox. As of publication, the DCMS mailbox remains the correct address, but this may change to a DSIT-specific email address next year: [email protected].

Chapter 2: Survey approach technical details

2.1 Survey and questionnaire development

The questionnaire content is largely driven by the Cyber Resilience team at DSIT, alongside the Home Office (which has co-funded the study since 2023). They ensure that the focus aligns with the National Cyber Strategy, to provide evidence on UK cyber resilience, and influence future government policy and other interventions in this space.

Ipsos developed the questionnaire and all other survey instruments (e.g. the interview script and briefing materials) with DSIT (formerly DCMS) and the Home Office. The involvement of the Home Office in this year’s study reflects the addition of the cyber crime questions. DSIT had final approval of the questionnaire. A full copy is available in Appendix A.

Development for this year’s survey took place over three stages from July to September 2022:

  • stakeholder engagement via email with industry representatives, and via a workshop with government representatives (alongside ongoing input over email and further meetings)

  • cognitive testing interviews with 10 organisations (businesses, charities and schools)

  • a pilot survey, consisting of 40 interviews (22 businesses, 14 charities and 4 schools).

Stakeholder engagement

Each year, Ipsos has consulted a range of industry stakeholders, to ensure that the Cyber Security Breaches Survey continues to explore the most important trends and themes that organisations are grappling with when it comes to cyber security. This includes the Association of British Insurers (ABI), techUK and the Institute of Chartered Accountants in England and Wales (ICAEW), who were consulted this year and agreed to endorse the survey. Similarly, DSIT, alongside the Home Office as co-funders this year, have consulted a range of stakeholders across government, such as the National Cyber Security Centre (NCSC) and the Office for National Statistics (ONS).

There were two meetings with government stakeholders serving different purposes. Firstly, Ipsos met with DSIT and Home Office colleagues in July 2022 to brainstorm the challenges around defining cyber crime, covering different types of cyber crimes and avoiding double-counting of crimes. Secondly, Ipsos facilitated a wider session in August 2022 with stakeholders from DSIT, the Home Office, the NCSC, and the devolved governments of Scotland, Wales and Northern Ireland. As part of this meeting, Ipsos shared a list of potential questions to cut from the survey. This list was then discussed and validated by stakeholders. Lower priority questions from this list were ultimately cut from the survey following the pilot stage of fieldwork – these questions are listed towards the end of Section 2.1.

Separately, Ipsos and DSIT jointly held meetings with two stakeholders that had relationships with cyber security professionals in the further and higher education sectors – Jisc (a membership organisation of individuals in digital roles within the further and higher education sectors) and UCISA (formerly known as the Universities and Colleges Information Systems Association). These organisations subsequently encouraged their members and contacts to take part in the survey, promoting the online survey link created by Ipsos (see Section 2.4).

Finally, Ipsos engaged Professor Steven Furnell from the University of Nottingham in July 2022 to review how the questionnaire was mapped to the government’s 10 Steps to Cyber Security guidance, and suggest a more accurate and robust mapping. More detail on the eventual changes made to the mapping is in Section 2.7.

Defining cyber crime

The most significant change to the questionnaire this year was the addition of questions to measure the prevalence, nature, scale and financial cost of cyber crime, and of the frauds that occur as a result of cyber crime. These were jointly developed by Ipsos, DSIT and the Home Office.

In addition to the brainstorming meeting in July 2022, Ipsos, DSIT and the Home Office carried out an extensive scoping exercise to inform these questions. This included a review of existing survey approaches to measuring computer-related crimes, focusing on two major surveys – the Commercial Victimisation Survey (CVS), which measures crime against business premises, and the Crime Survey for England and Wales (CSEW), which measures crime against individuals. Both surveys include questions measuring cyber crime, as defined in the Computer Misuse Act 1990. Both studies acted as a basis for the development of the new questions, but both had drawbacks. For instance, the codes in the CVS were too general and not explicitly related to cyber crime, and the CSEW wording was aimed at individuals and not businesses. The relevant CSEW module was also unfeasibly long to replicate, taking around 20 minutes.

DSIT and the Home Office also consulted the Office for National Statistics (ONS) for guidance on the inclusion of these new questions. The ONS provided useful guidance on the computer misuse questions in the CSEW, and the ways in which these adhered to the Home Office Counting Rules.

From this scoping exercise, we identified the following types of cyber crimes to be mapped:

  • ransomware that breached an organisation’s defences (i.e. it was not stopped by software)

  • other computer viruses or malware that breached an organisation’s defences

  • denial of service attacks that breached an organisation’s defences and were carried out intentionally, including attacks that led to extortion

  • hacking attacks that were carried out intentionally, including attacks that led to extortion

  • phishing attacks that individuals responded to (e.g. by opening an attachment) or that contained personal data about the recipient, and did not lead to any further crimes being committed.

In addition, the Home Office requested the inclusion of further questions to explore the extent to which fraud occurred as a result of cyber crime. Fraud is a separate category of crime from cyber crime (which is defined in the Computer Misuse Act), but can occur following a cyber crime. This includes, for example, criminals stealing money from an organisation by:

  • hacking into an organisation’s IT system to obtain bank details and using these to transfer money to their own account

  • sending false invoices via a compromised organisation email account and getting the recipient to pay these.

Routing respondents into the cyber crime questions

Ipsos mapped the current questionnaire to the Computer Misuse Act and the CSEW coding log, to help understand how the existing Cyber Security Breaches Survey questions could help identify victims of crime. This mapping exercise highlighted that the existing TYPE question – the question that measures instances of cyber security breaches and attacks – was the most appropriate for routing potential crime victims.

However, TYPE was not enough on its own. Some of the cyber security breaches and attacks reported in Chapter 4 would not constitute cyber crimes in the context of this survey. For example, some attempted attacks will not have penetrated an organisation’s cyber defences and some, such as online impersonation, would be beyond the scope of the Computer Misuse Act. Therefore, this question was a starting point for a new set of questions on cyber crime.

Routing into the crime-related questions on ransomware, viruses or other malware, denial of service attacks and phishing attacks was relatively straightforward – each of these areas mapped one-to-one to a single, equivalent answer code at TYPE. Routing for hacking attacks and cyber-facilitated fraud was less straightforward. The following codes at TYPE were ultimately used to route organisations to the hacking-related cyber crime questions:

  • unauthorised accessing of files or networks by staff, even if accidental

  • unauthorised accessing of files or networks by students

  • unauthorised accessing of files or networks by people outside your organisation

  • unauthorised listening into video conferences or instant messaging

  • takeovers or attempts to take over your website, social media accounts or email accounts.

There was an additional code that mentioned hacking (“hacking or attempted hacking of online bank accounts”). Successful hacking of online bank accounts would constitute a crime in one of two different ways:

  • if criminals used the hacked bank details to transfer money to their own account, this would count as cyber-facilitated fraud

  • if no money was stolen, a successful hack would count as a hacking-related cyber crime.

This year’s cyber crime questions do measure the prevalence and scale of the first of these, i.e. hacking resulting in fraud (as identified by the organisation). However, they do not measure the second of these, i.e. hacking-related cyber crime (where no fraud has occurred). This will be included in subsequent years.

When it came to fully mapping cyber-facilitated fraud, there was no single code at TYPE that captured all types of activity that could lead to fraud. Therefore, we ultimately decided to route all organisations that identified any kind of cyber security breach into the fraud questions. However, in reporting the cyber-facilitated fraud statistics, we have excluded instances where organisations said fraud was enabled by online impersonation. This is because impersonation is not a cyber crime under the Computer Misuse Act. More detail on this exclusion is included in the discussion on SPSS derived variables in Section 2.7.
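
To summarise the routing described above, the sketch below expresses it as a simple lookup. This is an illustration rather than the actual survey script: the labels paraphrase the TYPE answer codes, and the module names are invented. It shows which codes fed each set of cyber crime questions, that any organisation reporting a breach or attack was routed to the fraud questions, and that hacking of online bank accounts and fraud enabled solely by online impersonation sit outside the cyber crime counts this year.

```python
# Illustrative mapping of (paraphrased) TYPE answer codes to cyber crime modules.
# This is not the survey script; names are for illustration only.
ROUTING = {
    "ransomware": "ransomware",
    "other viruses or malware": "viruses/malware",
    "denial of service attacks": "denial of service",
    "phishing attacks": "phishing",
    "unauthorised access by staff (even if accidental)": "hacking",
    "unauthorised access by students": "hacking",
    "unauthorised access by people outside the organisation": "hacking",
    "unauthorised listening into video conferences or instant messaging": "hacking",
    "takeovers or attempted takeovers of websites, social media or email accounts": "hacking",
    # "hacking or attempted hacking of online bank accounts" is deliberately
    # absent: it is not routed to the hacking module this year.
}

def modules_for(type_codes: list[str]) -> set[str]:
    modules = {ROUTING[code] for code in type_codes if code in ROUTING}
    if type_codes:
        # All organisations reporting any breach or attack are asked the
        # cyber-facilitated fraud questions; fraud enabled only by online
        # impersonation is excluded later, at the reporting stage.
        modules.add("cyber-facilitated fraud")
    return modules
```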

Drafting of cyber crime questions routed from TYPE

To help draft questions, Ipsos used the CSEW coding log and Home Office guidance on the Computer Misuse Act. From these, we devised a generalised step-by-step process for each type of cyber incident. This step-by-step process was then used to draft full questions.

  • For all incident types, the survey asks the number of times a cyber incident occurred (not necessarily a crime).

  • For all incident types, when identifying cyber crime, we adhere to the Home Office Counting Rules, and avoid double-counting of crimes. We establish if organisations experienced these kinds of cyber incidents as isolated events, or part of a wider chain of events. If they were part of a chain, we have taken the approach of only counting the final event (i.e. type of offence) in the chain.

  • In the case of fraud, all identified cyber-facilitated fraud events were considered crimes in their own right (as fraud is a distinct category of crime).

  • In the case of phishing, where phishing attacks were the final event, the survey also establishes whether these were attacks that individuals responded to (e.g. by opening an attachment) or that contained personal data about the recipient – these are then counted as crimes. This approach excludes mass phishing emails that were ignored by staff.

  • For ransomware, viruses and other malware, and denial of service attacks, the survey asks if these were stopped by software or not. Ransomware, and viruses and other malware, are both counted as cyber crimes if they breached an organisation’s defences.

  • Hacking incidents were split into two streams – unauthorised access and online takeovers. Unauthorised access, as well as denial of service attacks, are only counted as crimes if respondents can confirm that their organisation was deliberately targeted (i.e. to clarify that it was not just a case of accidental access by a member of staff, or an instance of high internet traffic affecting multiple organisations). Online takeovers are counted as crimes when they were successful (i.e. excluding attempted takeovers that did not breach defences).

Through this set of questions, the survey attempts to measure the prevalence of each type of crime (i.e. the percentage of the population experiencing it) and the scale of each type (i.e. the frequency with which it is experienced among the affected population).
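
The rules above can also be summarised in pseudocode form. The sketch below is a simplified illustration, not the survey script: the field names are invented, and it only captures the core conditions (defences breached, deliberate targeting, how staff responded to phishing, and counting only the final event in a chain).

```python
# Simplified illustration of the counting rules described above.
# Field names are hypothetical; the survey applies these rules through
# separate questions for each incident type.

def is_crime(incident: dict) -> bool:
    kind = incident["type"]
    if kind == "fraud":
        return True  # cyber-facilitated fraud is a crime in its own right
    if kind in ("ransomware", "other malware"):
        return incident["breached_defences"]
    if kind == "denial of service":
        return incident["breached_defences"] and incident["deliberately_targeted"]
    if kind == "unauthorised access":
        return incident["deliberately_targeted"]
    if kind == "online takeover":
        return incident["successful"]
    if kind == "phishing":
        # Counted only if staff responded in some way or the email
        # contained personal data about the recipient.
        return incident["responded_to"] or incident["contained_personal_data"]
    return False

def crimes_in_chain(chain: list[dict]) -> int:
    """Count only the final event in a chain of linked incidents, in line
    with the Home Office Counting Rules, to avoid double-counting."""
    return 1 if chain and is_crime(chain[-1]) else 0
```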

Beyond this, the survey also attempts to capture:

  • the financial cost of all events counted as cyber crimes in the previous 12 months

  • the amount of ransom demanded in ransomware cyber crimes and the amount paid out

  • instances of extortion in the case of cyber crimes relating to denial of service attacks, unauthorised access and online takeovers, and the scale (i.e. frequency) of this

  • the cyber incidents that preceded the final event in a chain.

It should be noted that not all the results from these questions have been reported in the main Statistical Release, since sample sizes are too low in some cases, or the survey did not collect enough instances to report (e.g. there were no instances of respondents paying a ransom).

Wider questionnaire changes (beyond cyber crime questions)

In order to introduce the cyber crime section without resulting in an unfeasibly long average interview length (which would increase the burden on respondents), other sections of the questionnaire needed to be revised or removed.

Before the pilot, we streamlined the questionnaire by deleting a question specifically on awareness of the Cyber Aware campaign (CYBERAWARE), and instead asked about the campaign as a code within the next question (SCHEME – already covering awareness of government schemes, information and guidance on cyber security).

Further deletions were made after the pilot stage, noted later in this section.

Another approach to reduce length was to reduce the sample sizes for various questions by split-sampling – that is, asking a random half of the sample alternating questions. The increased overall sample sizes for businesses and charities meant that this approach did not overly affect the margins of error for the overall business and charity estimates, although it did limit analysis of sectoral differences at these questions. This approach was applied to the majority of the 2023 questionnaire, meaning that only a small number of core questions continued to be asked of the full sample. These were:

  • organisation and respondent characteristics (TITLE, TYPEX, SIZEA and SIZEB)

  • questions that are heavily used in the derived variables showing whether organisations have made progress against the 10 Steps guidance (MANAGE, IDENT, RULES, TRAINED, SUPPLYRISK, INCIDCONTENT) – see Section 2.7

  • questions measuring the experience and impact of cyber security breaches, and incident response (TYPE through to IMPACT and DISRUPTA through to INCIDACTION), with the exception of the COSTA and COSTB questions, which were deleted post-pilot

  • the cyber crime questions (asked of all respondents that identified any cyber security breaches or attacks, routed from the TYPE question without any split-sampling).

The following questions were restricted to larger organisations (medium and large businesses, and high-income charities). Previous years of the survey showed that larger organisations are more likely to take these kinds of actions, and this is consequently where the policy interest lies:

  • UPDATE – how often senior managers are updated on cyber security actions

  • STRATEGY and STRATINT – whether organisations have a cyber security strategy

  • CORPORATE and CORPRISK – whether organisations’ annual reports include any content on cyber risks faced by the organisation.

Outside the cyber crime section, the NOREPORT question, last asked in 2017, was reinstated in 2023.

Beyond this, only one new question was added in 2023, at INCIDCONTENT. This was intended to capture whether organisations used an NCSC-approved incident response company. This question was added after the cognitive testing stage, so could not be tested. As it was phrased in the past tense, whereas the other codes at INCIDCONTENT and the question wording referred to what organisations do or would do (i.e. present and conditional tense), we believe the answers to this question may be difficult to interpret. Therefore, this question has been included in the SPSS dataset, but not reported in the Statistical Release.

Points to note for education samples in 2023 as a result of questionnaire changes

Two issues have affected the education samples at a small number of questions this year, as a consequence of some of the changes noted in the previous subsection:

  • PRIORITY, INSUREX, INFO, SCHEME, COMPLY and RANSOM were all split-sampled this year for businesses and charities. For the education institutions, the intention was to ask these questions of the full samples. However, a script issue meant that, for all interviews carried out up to 19 October, these questions were also split-sampled for education institutions. This accounts for 10% of all education institution interviews. As a result, these questions are reported in the Education Institutions Findings Annex, but with sample sizes slightly lower than the full education samples.

  • UPDATE, STRATEGY and STRATINT were restricted this year to be asked of medium and large businesses and high-income charities. The survey script also withheld these questions from primary and secondary schools that said they had fewer than 50 staff members in the survey. As such, the results from these questions cannot be reported for schools, as we do not have a complete, representative sample of data. The script will be amended next year to rectify this issue. The 2023 data for colleges and universities is unaffected – the full samples were asked these questions.

Cognitive testing

The Ipsos research team carried out 10 cognitive testing interviews with businesses, charities and schools. These interviews focused on the new cyber crime questions.

We recruited all participants by telephone. The sample source was organisations that took part in the previous iteration of the survey and gave permission to be recontacted for subsequent research on cyber security over the next 12 months. We applied recruitment quotas and offered a £50 incentive[footnote 2] to ensure participation from different-sized organisations across the country, covering a range of sectors. In addition, we asked our recruitment partner to screen all participants to ensure they had experienced at least 1 of the relevant cyber incidents – achieving a minimum quota of 2 organisations covering each incident type – so that they would be able to answer the cyber crime questions.

The cognitive testing highlighted several issues with the original, longer-form crime questions. The broad challenges to address included:

  • respondents struggling to recall an exact or accurate number of cyber incidents, particularly if there was no internal incident log

  • conflation of individual versus commercial cyber incidents

  • double-counting of the same incidents

  • confusing wording

  • long preambles, leaving respondents waiting a long time before they could give an answer

  • the time taken to get through the crime questions, and the risk of abandonment if questions were seen to be too lengthy or repetitive.

As a result, Ipsos reviewed and simplified the wording of the cyber crime preambles and questions wherever possible. Some questions were also split into two. One of the constraints with this was the agreed structure of the questions, which still required respondents to answer a negatively framed question in places (e.g. asking how many incidents were not stopped by software, rather than how many were stopped). Therefore, while the new questions were greatly simplified, they were still relatively long and placed a higher cognitive load on respondents than other questions in the survey.

Pilot survey

The pilot survey was used to:

  • test the questionnaire Computer-Assisted Telephone Interviewing (CATI) script

  • time the questionnaire

  • test the usefulness of the interviewer briefing materials

  • test the quality and eligibility of the sample (by calculating the proportion of the dialled sample that ended up containing usable leads).

Ipsos interviewers carried out all the pilot fieldwork by phone between 23 and 27 September 2022. Again, we applied quotas to ensure the pilot covered different-sized businesses from a range of sectors, charities with different incomes and from different countries, and the various education institutions we intended to survey in the main fieldwork. This was with one exception – we excluded any higher and further education samples, as the populations are so small (making the available sample precious). We carried out 40 interviews, breaking down as:

  • 22 businesses

  • 14 charities

  • 4 schools (2 primary schools and 2 secondary schools).

The pilot sample came from the same sample frames used for the main stage survey (see next section). In total, we randomly selected 720 business leads, 959 charity leads and 478 schools.

Following the same approach as last year, the pilot was used as a soft launch of the main fieldwork. While quotas were initially applied to achieve these 40 interviews, the remaining pilot sample was subsumed into the main survey and fully worked alongside the other sample batches, following a strict random probability approach. Moreover, there were no substantial post-pilot changes to the questionnaire other than the deletions. Therefore, the 40 pilot interviews were counted as part of the final data.

The average interview length for the pilot was just over 26 minutes, which was above target for the main stage (22 minutes). Following the pilot stage, we deleted the following questions to reduce the average interview length. These all reflected questions flagged as lower priorities in the August 2022 stakeholder meeting:

  • ONLINE – mapping organisations’ digital footprints (e.g. use of online bank accounts or network-connected devices)

  • MOBILE – covering use of personal devices

  • CLAIM – covering cyber insurance claims made (among those with cyber insurance)

  • STRATEXT and STRATREV – whether those with cyber security strategies had had them reviewed by third parties, and whether this was a specific review of the cyber strategy or a wider review covering more than the cyber strategy

  • COSTA and COSTB – collecting numeric or banded estimates for the total financial cost of all cyber security breaches identified in the last 12 months.

2.2 Survey microsite and GOV.UK page

As in previous years, a publicly accessible Ipsos microsite (still active as of April 2023) and a similar GOV.UK page were again used to provide reassurance that the survey was legitimate and provide more information before respondents agreed to take part.

Interviewers could refer to both pages at the start of the telephone call, while the reassurance emails sent out from the CATI script (to organisations that wanted more information) included a link to the GOV.UK page.

2.3 Sampling

Business population and sample frame

The target population of businesses largely matched those included in all the previous surveys in this series, i.e. private companies or non-profit organisations[footnote 3] with more than one person on the payroll.

The survey is designed to represent enterprises (i.e. the whole organisation) rather than establishments (i.e. local or regional offices or sites). This reflects that multi-site organisations will typically have connected digital devices and will therefore deal with cyber security centrally.

The sample frame for businesses was the Market Location database which covers businesses in all sectors across the UK at the enterprise level. This marks a change from the sample frame used in all previous years of the survey – the Inter-Departmental Business Register (IDBR). In recent years, the low telephone coverage of the IDBR sample and overall low quality (in terms of inaccurate or outdated business characteristics and contact details), relative to commercial business databases, made it an increasingly unfeasible sample frame for this survey.

Given the much higher target number of interviews this year, DSIT and Ipsos jointly agreed to move away from the IDBR to an alternative option. The Market Location database was considered the best option. It includes c.790,000 businesses with 2-9 employees. It is compiled from a mix of public business directories, Companies House data and call centre activity. It is not only a clean database requiring less processing than the IDBR, but also higher quality – over 10,000 calls are made daily to validate numbers, with each record (telephone, email and senior contact name) having been validated within a rolling 12-month period.

Market Location is also superior to the alternative Dun and Bradstreet (D&B) and Experian databases. The latter are, primarily, credit files. D&B is maintained by a global team. Market Location is, by contrast, a dedicated UK business database, built and monitored by a UK team for market research purposes. Ipsos also spoke to the Market Location team ahead of fieldwork and received assurances that no changes to the data sources are planned, which might otherwise affect the comparability of samples in future iterations of the survey.

As previously mentioned, the core approach to the survey – data collected from organisations via a random-probability survey, predominantly conducted by telephone – has not changed. Furthermore, the achieved sample includes a very diverse mix of businesses by size, sector and region. As such, we do not expect the move to Market Location to have a meaningful impact on trend findings. However, we cannot definitively rule out that any change in the business findings this year may be due to an unobservable difference in the sample frames between years. As a cautionary measure, where we have charts showing changes over time in the main Statistical Release, we use a dotted line to show the trend from 2022 to 2023.

Exclusions from the Market Location sample

With the exception of universities, public sector organisations are typically subject to government-set minimum standards on cyber security. Moreover, the focus of the primary sample in the survey was to provide evidence on businesses’ engagement, to inform future policy for this audience. Public sector organisations (Standard Industrial Classification, or SIC, 2007 category O) were therefore considered outside of the scope of the survey and excluded from the sample selection.

In line with the previous year, businesses listed as having just 1 employee were eligible to take part in the survey (only 0-employee businesses were excluded entirely). However, given that many businesses listed as having 1 employee on business databases turn out to have 0 employees, the sampling was only done on businesses listed as having 2 or more employees. This helped to avoid an unreasonably high ineligibility rate during fieldwork.

Charity population and sample frames (including limitations)

The target population of charities was all UK registered charities. The sample frames were the charity regulator databases in each UK country:

  • the Charity Commission for England and Wales

  • the Scottish Charity Regulator (OSCR)

  • the Charity Commission for Northern Ireland

In England and Wales, and in Scotland, the respective charity regulator databases contain a comprehensive list of registered charities. DSIT was granted full access to the non-public OSCR database, including telephone numbers, meaning we could sample from the full list of Scotland-based charities, rather than just those for which we were able to find telephone numbers. This process has since changed, and subsequent iterations of the survey will involve requesting a random sample of the OSCR database rather than the full database.

The Charity Commission for Northern Ireland does not yet have a comprehensive list of established charities, but has been registering charities and building its list over the past few years. Alternative sample frames for Northern Ireland, such as the Experian and Dun & Bradstreet business directories (which also include charities), have been considered in previous years, and ruled out, because they do not contain essential information on charity income for sampling, and cannot guarantee up-to-date charity information.

Therefore, while the Charity Commission for Northern Ireland database was the best sample frame for this survey, samples drawn from it cannot be considered truly random samples of all Northern Ireland charities at present. This year, there were 6,880 registered charities on the Northern Ireland database, compared to 6,438 in the 2022 survey and 6,190 in the 2021 survey.

Education institutions population and sample frame

The education institutions sample frame was compiled from the relevant state education institution databases.

Given the significant differences in size and management approaches between different types of education institutions, we split the sample frame into four independent groups:

  • 20,783 primary schools (including free schools, academies, Local Authority-maintained schools and special schools covering children aged 5 to 11)

  • 3,886 secondary schools (including free schools, academies, Local Authority-maintained schools and special schools covering children aged 11+)

  • 305 further education colleges

  • 175 universities.

In order to avoid disclosure, we do not include any information about the specific school type (beyond fitting into the primary or secondary school bracket) in the published data or SPSS file.

Business sample selection

In total, 45,599 businesses were selected from the Market Location database for the 2023 survey. This is substantially lower than the 84,174 selected in 2022 from the IDBR, and reflects that all Market Location records come with a telephone number (vs. c.25% of IDBR records).

The business sample was proportionately stratified by region, and disproportionately stratified by size and sector. An entirely proportionately stratified sample would not allow sufficient subgroup analysis by size and sector. For example, it would effectively exclude all medium and large businesses from the selected sample, as they make up a very small proportion of all UK businesses – according to the Business Population Estimates 2022, published by the former Department for Business, Energy and Industrial Strategy (BEIS). Therefore, we set disproportionate sample targets for micro (1 to 9 staff), small (10 to 49 staff), medium (50 to 249 staff) and large (250 or more staff) businesses. We also boosted specific sectors, to ensure we could report findings for the same sector subgroups that were used in the 2022 report. The boosted sectors included:

  • manufacturing (SIC C)

  • information and communications (SIC J)

  • financial and insurance (SIC K)

  • health, social work or social care (SIC Q).

Post-survey weighting corrected for the disproportionate stratification (see Section 2.6).
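
To make the rationale concrete, the short sketch below (in Python) contrasts a purely proportionate allocation of interviews with a disproportionate one. All of the population shares and targets in it are hypothetical and are not the figures used in the actual sample design.

```python
# Illustrative only: hypothetical population shares and an arbitrary overall
# target of 2,263 interviews, showing why a proportionate allocation was not used.
population_share = {          # approximate shape of the business population (hypothetical)
    "micro (1-9)": 0.82,
    "small (10-49)": 0.15,
    "medium (50-249)": 0.025,
    "large (250+)": 0.005,
}
target_interviews = 2263

proportionate = {band: round(share * target_interviews)
                 for band, share in population_share.items()}

# Disproportionate targets (again hypothetical) guarantee enough medium and large
# interviews for subgroup analysis; post-survey weighting later corrects the skew.
disproportionate = {
    "micro (1-9)": 1100,
    "small (10-49)": 600,
    "medium (50-249)": 350,
    "large (250+)": 213,
}

for band in population_share:
    print(f"{band:<16} proportionate: {proportionate[band]:>5}   "
          f"disproportionate: {disproportionate[band]:>5}")
```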

Table 2.1 breaks down the selected business sample by size and sector.

Table 2.1: Pre-cleaning selected business sample by size and sector

SIC 2007 letter[footnote 4] Sector description Micro (1–9 staff) Small (10-49 staff) Medium (50–249 staff) Large (250+ staff) Total
A Agriculture, forestry or fishing 950 117 69 85 1,160
B, C, D, E Utilities or production (including manufacturing) 1,183 369 833 1,127 3,512
F Construction 5,629 123 318 163 6,233
G Retail or wholesale (including vehicle sales and repairs) 3,867 817 829 626 6,139
H Transport or storage 1,365 146 212 306 2,029
I Food or hospitality 3,822 651 531 208 5,212
J Information or communications 1,515 286 609 394 2,804
K Finance or insurance 1,673 736 632 444 3,485
L, N Administration or real estate 3,859 241 593 866 5,559
M Professional, scientific or technical 2,743 281 647 622 4,293
P Education 220 52 42 48 362
Q Health, social care or social work 879 341 393 317 1,930
R, S Entertainment, service or membership organisations 2,232 271 171 207 2,881
  Total 29,937 4,431 5,879 5,352 45,599

Charity and education institution sample selection

The charity sample was proportionately stratified by country and disproportionately stratified by income band, using the respective charity regulator databases to profile the population. This used the same reasoning as for businesses – without this disproportionate stratification, analysis by income band would not be possible as hardly any high-income charities would be in the selected sample. In addition, having fewer high-income charities in the sample would be likely to reduce the variance in responses, as high-income charities tend to take more action on cyber security than low-income ones. This would have raised the margins of error in the survey estimates.

As the entirety of the three charity regulator databases was used for sample selection, there was no restriction on the amount of charity sample that could be used, so no equivalent to Table 2.1 is shown for charities.

Similarly, the entirety of the state education institution databases was available for sample selection, so no equivalent table is shown for education institutions.

Sample telephone tracing and cleaning

Not all the original sample was usable; a simplified sketch of the cleaning rules appears after the list below. In total:

  • 3,409 of the 45,599 original Market Location records were excluded because they had an invalid telephone number (i.e. the number was either in an incorrect format, too long, too short, had an invalid string, or was a number which would charge the respondent when called), because they were flagged as part of Ipsos’ non-contact list (organisations that have requested no further contact), or because they were based outside the UK.

  • 30,829 of the 201,325 charities had no valid telephone numbers, were flagged as being on the non-contact list, or were duplicate records (i.e. with the same number appearing twice)

  • 4,015 of the 25,149 education institutions had no valid telephone number or were duplicate records.
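
As an illustration of the cleaning described above, the sketch below (Python, with hypothetical records and deliberately simplified validation) drops records with invalid or premium-rate numbers, duplicate numbers, and records flagged for no further contact. The real checks applied by Ipsos are more extensive.

```python
import re

# Hypothetical sample records: (organisation id, telephone number, no-contact flag)
records = [
    ("org-001", "020 7946 0000", False),
    ("org-002", "020 7946 0000", False),   # duplicate number
    ("org-003", "12345",         False),   # too short
    ("org-004", "0909 879 0000", False),   # premium-rate prefix (would charge respondent)
    ("org-005", "0161 496 0000", True),    # on the no-further-contact list
]

PREMIUM_PREFIXES = ("09",)  # simplified: the real screening covers more cases

def usable(number: str) -> bool:
    digits = re.sub(r"\D", "", number)
    if not 10 <= len(digits) <= 11:          # wrong length
        return False
    if digits.startswith(PREMIUM_PREFIXES):  # would charge the respondent
        return False
    return True

seen = set()
clean = []
for org_id, number, no_contact in records:
    digits = re.sub(r"\D", "", number)
    if no_contact or not usable(number) or digits in seen:
        continue
    seen.add(digits)
    clean.append((org_id, number))

print(clean)  # only org-001 survives in this toy example
```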

In previous years, the very low telephone coverage in the IDBR sample required substantial automated telephone tracing (i.e. matching the sample frame data to a commercial business database to add missing contact details where possible). Given the near-full telephone coverage of the Market Location sample this year, such telephone tracing was not required. Nevertheless, Ipsos did undertake significant sample improvement work, using their sampling partners to match the samples to data from organisations’ websites, publicly available LinkedIn pages and other social media, and Companies House data. This added the names and job titles of relevant individuals within the business, as well as email addresses where available, to maximise our ability to get past gatekeepers (e.g. receptionists) and reach the appropriate individual in each organisation.

At the same time as this survey, Ipsos was also carrying out another survey with a potentially overlapping sample of businesses and charities – the DSIT cyber skills labour market survey. We therefore flagged overlapping sample leads across surveys, so telephone interviewers could avoid contacting the same organisations in quick succession for both surveys, and minimise the burden on respondents.

Following cleaning to remove unusable or duplicate numbers, the usable business sample amounted to:

  • 42,190 Market Location records

  • 170,496 charities (with exclusions mainly due to the high prevalence of duplicate numbers in this sample frame)

  • 21,153 education institutions.

Table 2.2 breaks down the usable business leads by size and sector. As this shows, telephone coverage was typically much greater among the medium and large businesses in the sample frame than among micro and small businesses. This has been a common pattern across years. In part, it reflects the greater stability of the medium and large business population, where firms tend to be older and are less likely to have recently changed their telephone numbers.

Table 2.2: Post-cleaning available business sample by size and sector (sample volumes and as a percentage of originally selected sample)

Each cell shows the post-cleaning sample volume, with the percentage of the originally selected sample retained shown in brackets.

SIC 2007 letter Sector description Micro (1–9 staff) Small (10–49 staff) Medium (50–249 staff) Large (250+ staff) Total
A Agriculture, forestry or fishing 920 (97%) 113 (97%) 66 (96%) 22 (26%) 1,121 (97%)
B, C, D, E Utilities or production (including manufacturing) 1,137 (96%) 350 (95%) 789 (95%) 1,027 (91%) 3,303 (94%)
F Construction 5,384 (96%) 116 (94%) 289 (91%) 139 (85%) 5,928 (95%)
G Retail or wholesale (including vehicle sales and repairs) 3,760 (97%) 775 (95%) 772 (93%) 542 (87%) 5,849 (95%)
H Transport or storage 1,303 (95%) 129 (88%) 193 (91%) 265 (87%) 1,890 (93%)
I Food or hospitality 3,692 (97%) 620 (95%) 510 (96%) 193 (93%) 5,015 (96%)
J Information or communications 1,389 (92%) 261 (91%) 541 (89%) 323 (82%) 2,514 (90%)
K Finance or insurance 1,573 (94%) 640 (87%) 511 (81%) 322 (73%) 3,046 (87%)
L, N Administration or real estate 3,350 (87%) 230 (95%) 535 (90%) 724 (84%) 4,839 (87%)
M Professional, scientific or technical 2,613 (95%) 264 (94%) 573 (89%) 507 (82%) 3,957 (92%)
P Education 209 (95%) 49 (94%) 30 (71%) 42 (88%) 330 (91%)
Q Health, social care or social work 808 (92%) 312 (91%) 357 (91%) 273 (86%) 1,750 (91%)
R, S Entertainment, service or membership organisations 2,072 (93%) 254 (94%) 160 (94%) 162 (78%) 2,648 (92%)
  Total 28,210 (94%) 4,113 (93%) 5,326 (91%) 4,541 (85%) 42,190 (93%)

Sample batches

For businesses and charities, the usable sample for the main stage survey was randomly allocated into batches. The first batch, excluding pilot sample, had 24,587 business records and 5,324 charity records.

The selection counts were modelled according to two criteria (a simplified sketch of the calculation follows the list):

  • If a particular size band, industry sector or (in the case of charities) income band had a higher interview target based on the disproportionate stratification, we selected more records to reflect that higher target.

  • Equally, if a particular size band, industry sector or income band had historically achieved lower response rates, we selected more records to reflect these lower response rate expectations. The response rate expectations were modelled on how other recent DSIT cyber surveys using these same sample frames had performed.
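
A minimal sketch of this calculation is shown below. The strata, targets and response rate assumptions are all hypothetical; the point is simply that the number of records released is the remaining interview target divided by the expected response rate for that group.

```python
# Illustrative sketch of the batch selection logic described above, using
# hypothetical targets and hypothetical modelled response rates.
remaining_target = {        # interviews still needed in each (hypothetical) stratum
    "micro": 400,
    "medium": 120,
    "finance boost": 80,
}
expected_response_rate = {  # modelled from earlier surveys (hypothetical values)
    "micro": 0.07,
    "medium": 0.10,
    "finance boost": 0.06,
}

records_to_release = {
    stratum: round(remaining_target[stratum] / expected_response_rate[stratum])
    for stratum in remaining_target
}
print(records_to_release)
# {'micro': 5714, 'medium': 1200, 'finance boost': 1333}
```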

For primary and secondary schools, we selected simple random sample batches of each group. In the first batch, this amounted to 1,099 primary schools and 1,241 secondary schools.

The colleges and higher education institutions sample was released in full at the start of fieldwork (i.e. we carried out a census of these groups, only excluding records where there was no valid telephone number, or numbers were duplicated).

Subsequent sample batches were selected according to the same criteria, updated with the remaining interview targets and response rates achieved up to that point. Across all sample groups, four batches (in addition to the pilot batch) were released throughout fieldwork. We aimed to maximise the response rate by fully exhausting the existing sample batches before releasing additional records. This aim was balanced against the need to meet interview targets, particularly for boosted sample groups (without setting specific interview quotas).

Over the course of fieldwork, we used (including for the pilot):

  • 32,819 Market Location records

  • 7,612 charity records

  • 2,139 primary schools

  • 2,678 secondary schools

  • 266 further education colleges

  • 146 higher education institutions.

That is to say, we did not use all the available (and usable) records for businesses, charities, primary schools and secondary schools. The remaining records were held in reserve.

2.4 Fieldwork

Ipsos carried out all main stage fieldwork from 29 September 2022 (following a 2-day pause after the pilot fieldwork) to 18 January 2023, i.e. a fieldwork period of just over 14 weeks.[footnote 5] This is just over a week longer than the fieldwork period for the previous two iterations of the study, and reflects the additional time needed to fully work through the much larger sample volumes this year (given the higher interview targets).

In total, we completed interviews with 3,991 organisations:

  • 2,263 businesses

  • 1,174 charities

  • 241 primary schools

  • 217 secondary schools

  • 44 further education colleges

  • 52 higher education institutions.

The average interview length was c.24 minutes across all groups.

Multimode data collection

The previous years’ surveys were conducted by telephone only, using Computer-Assisted Telephone Interviewing (CATI). As part of a range of measures this year to help improve the survey sample coverage and response rate, we implemented multimode surveying after the pilot stage, allowing respondents to take part either by telephone or online.

In practical terms, the multimode methodology worked as follows for businesses, charities, and primary and secondary schools:

  • All initial contact with organisations took place by phone, with Ipsos telephone interviewers calling organisations in line with previous years.

  • Where organisations requested more information before deciding to take part, interviewers could send out an information and reassurance email. This email contained a unique link for each organisation to complete the survey entirely or partially online. The interviewers explained this ahead of sending out each email.

  • Beyond the initial phone call to establish contact and explain the survey, the respondents that completed the survey online had no interaction with an Ipsos interviewer when answering the questions but were instead routed through an online questionnaire, with each question appearing on a separate screen.

For further and higher education institutions, a further option was available. Ipsos created an open link to the online survey to be disseminated by Jisc and UCISA – representative bodies for individuals working in IT and cyber roles in colleges and universities – to their members. In total, 21 colleges and 31 universities took part in the survey via this open link (and these are included in the total completed interviews mentioned at the start of Section 2.4).

Over the course of fieldwork, we sent 2 reminder emails to those that had started but not finished the survey online. In total, 715 interviews were completed using the online survey option, which represents 18% of the 3,991 total interviews. Table 2.3 shows how this is split across the different sample groups:

Table 2.3: Data collection mode by sample group

Sample group Telephone interviews Online interviews Percentage conducted online
Businesses 1,857 406 18%
Charities 960 214 18%
Primary schools 234 7 3%
Secondary schools 203 14 6%
Further education 18 26 59%
Higher education 11 41 79%

Ipsos made the following efforts to monitor and maintain the quality of the online interviews, and reduce the possibility of mode differences in the responses:

  • We took a best-practice approach to multimode questionnaire design, where the format of each question was similar across modes (e.g. using collapsible grids for statements online, rather than showing all statements at once). However, it should be noted that long pre-coded questions like INFO, GOVTACT, NOREPORT, REPORTB and PREVENT were unavoidably different across modes. INFO was asked unprompted by telephone but as a prompted list online. This is standard practice in multimode questionnaires, but typically means that online respondents are inclined to give a wider range of responses (as they see a list of possible responses in front of them). For the 2023 survey, the remaining long pre-coded questions were asked as open text boxes in the online survey, although this will be changed in subsequent years to match the approach used at INFO. This does not necessarily mean that either the telephone or online responses are wrong at any of these questions. However, it does mean that a small note of caution should be applied when comparing results for individual answer codes before and since 2023.

  • We validated that online respondents were the appropriate individuals from the organisation via the TITLE question (which requests job titles).

  • We checked online responses to ensure respondents were not speeding through the interview or “straightlining” (i.e. answering “don’t know” or the top answer code in the list to every question); a simplified sketch of this type of check follows the list.

  • We monitored responses at key questions (e.g. PRIORITY and TYPE) by mode at regular intervals during fieldwork. This was repeated as part of the final data processing, with a focus on the questions where there had been a change between 2022 and 2023. Where there were differences by mode at these questions, we checked whether these could be explained by the different profile of respondents answering online rather than by phone. These checks suggest that any changes over time cannot be attributed to mode differences. For example, for PRIORITY, the drop in prioritisation of cyber security in 2023 is apparent in both the telephone sample and the online sample. This gives us confidence that the findings remain comparable to previous years despite the addition of online interviews.
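
The sketch below illustrates the kind of speeding and straightlining check described in the list above, using hypothetical interview lengths, answers and cut-offs; it is not the exact rule set Ipsos applied.

```python
import statistics

# Hypothetical online completes: interview length (minutes) and the answers
# given at a set of single-code questions. Field names are illustrative only.
completes = [
    {"id": "r1", "minutes": 23.5, "answers": [2, 1, 3, 2, 4, 1]},
    {"id": "r2", "minutes": 6.0,  "answers": [1, 1, 1, 1, 1, 1]},  # fast and straightlined
    {"id": "r3", "minutes": 19.0, "answers": [3, 2, 2, 4, 1, 2]},
]

median_length = statistics.median(c["minutes"] for c in completes)
SPEED_THRESHOLD = 0.5  # flag anyone under half the median length (illustrative cut-off)

def flagged(c):
    speeding = c["minutes"] < SPEED_THRESHOLD * median_length
    straightlining = len(set(c["answers"])) == 1  # identical code at every question
    return speeding or straightlining

for c in completes:
    if flagged(c):
        print(f"{c['id']}: review before accepting into the final data")
# r2: review before accepting into the final data
```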

Fieldwork preparation

Prior to fieldwork, the Ipsos research team briefed the telephone interviewing team in a video call, attended by DSIT colleagues. They also received:

  • written briefing materials about all aspects of the survey

  • a copy of the questionnaire and other survey instruments.

Screening of respondents (for telephone interviews)

Telephone interviewers screened all sampled organisations at the beginning of the call to identify the right individual to take part and to ensure the organisation was eligible for the survey. At this point, the following organisations would have been removed as ineligible:

  • organisations that identified themselves as sole traders with no other employees on the payroll

  • organisations that identified themselves as part of the public sector.

In previous years, organisations that claimed to have no computer, website or other online presence were also screened out. This type of ineligibility has dwindled in recent years, and we expect that most organisations making this claim are, in fact, simply using it as a way of declining to take part. Therefore, this year, this reason for not taking part was simply recorded as a refusal.

As this was a survey of enterprises rather than establishments, interviewers also confirmed that they had called through to the UK head office or site of the organisation.

At this point, interviewers specifically asked for the senior individual with the most responsibility for cyber security in the organisation. The interviewer briefing materials included written guidance on likely job roles and job titles for these individuals, which would differ based on the type and size of the organisation.

For UK businesses that were part of a multinational group, interviewers requested to speak to the relevant person in the UK who dealt with cyber security at the company level. In any instances where a multinational group had different registered companies in Great Britain and in Northern Ireland, both companies were considered eligible.

Franchisees with the same company name but different trading addresses were also all considered eligible as separate independent respondents.

Random probability approach and maximising participation

We adopted random probability interviewing to minimise selection bias. The overall aim of this approach is to have a known outcome for every piece of sample loaded. For this survey, we used an approach comparable to other robust business surveys:

  • Each organisation loaded in the main survey sample was called either a minimum of 7 times, or until an interview was achieved, a refusal given, or information obtained to make a judgment on the eligibility of that contact. In practice, our approach exceeded these minimum requirements – any records marked as reaching the maximum number of tries had in fact been called 10 times or more.

  • Each piece of sample was called at different times of the day, throughout the working week, to make every possible attempt to achieve an interview. Evening and weekend interviews were also offered if the respondent preferred these times.

We took several steps to maximise participation in the survey and reduce non-response bias:

  • The survey had its own web page on GOV.UK and the Ipsos microsite, to let organisations know that the contact from Ipsos was genuine. The web pages included appropriate Privacy Notices on processing of personal data, and the data rights of participants, following the introduction of GDPR in May 2018.

  • Interviewers could send a reassurance email to prospective respondents if the respondent requested this. This included a link to the GOV.UK page to confirm the legitimacy of the survey, a link to the relevant Privacy Notice and an option to unsubscribe (by replying to the message and requesting this).

  • Ipsos set up an email inbox and free (0800) phone number for respondents to be able to contact to set up appointments or, in the case of the phone number, take part there and then in interviews. Where we had email addresses on the sample for organisations, we also sent five warm-up and reminder emails across the course of fieldwork to let organisations know that an Ipsos interviewer would attempt to call them, and give them the opportunity to opt in by arranging an appointment. These emails also asked organisations to check the contact details we had for them and to send us better contact details if necessary. They were tailored to the type of organisation, with each email featuring a different subject line and key message to encourage participation.

  • The survey was endorsed by the Institute of Chartered Accountants in England and Wales (ICAEW), the Association of British Insurers (ABI), the Charity Commission for England and Wales, the Charity Commission for Northern Ireland and techUK. In practice, this meant that these organisations allowed their identity and logos to be used in the survey introduction and on the microsite, to encourage organisations to take part.

  • As an extra encouragement, we offered to email respondents a copy of last year’s infographic summaries, and a help card listing the range of government guidance on cyber security, following their interview.

  • Specifically, to encourage participation from colleges and universities, DSIT and Ipsos jointly worked with Jisc and UCISA. These organisations contacted their members, which include IT and cyber security professionals in the further and higher education sectors, to proactively ask them to take part in the survey via the open link. Ipsos created a promotional PowerPoint deck explaining the survey to support this.

  • The multimode surveying approach aimed to offer organisations much greater flexibility as to how they could take part. Given the challenges of reaching appropriate individuals in hybrid working environments, this approach was felt to have greatly improved the final sample composition, over and above what a telephone survey alone could achieve.

Fieldwork monitoring

Ipsos is a member of the Interviewer Quality Control Scheme recognised by the Market Research Society. In accordance with this scheme, the field supervisor on this project listened in to at least 10% of the interviews and checked the on-screen data entry for these interviews.

Recontact survey to clarify responses at cyber crime questions

During the main fieldwork period, we noticed that a minority of organisations were providing seemingly contradictory responses when comparing the core cyber incident question (TYPE) with the new cyber crime questions. These contradictions can be summarised as follows:

  • organisations saying at the “COUNT” questions (e.g. RANSCOUNT) that they had incurred zero instances of this type of cyber incident, despite saying at TYPE that they had incurred this kind of breach or attack in the previous 12 months

  • organisations saying at the “FIN” questions that a particular kind of cyber incident was not the final event (e.g. ransomware at RANSFIN), but not identifying another category of cyber incident that followed it at the “CONT” questions (e.g. if ransomware was followed by unauthorised access, this should have been indicated at HACKCONT).

In the latter instances, this meant that no final event was identified for certain cyber incidents. Therefore, it is possible that some cyber crimes were not counted.

A potential explanation for some contradictions, beyond respondent errors in terms of recall, is that organisations may consider cyber security “breaches or attacks” to be different from cyber “incidents”. For example, an organisation may have experienced a ransomware attack, but not considered it as an “incident” if it had a negligible impact on the organisation’s day-to-day operations. While this specific issue did not emerge in cognitive testing, it will be explored in the questionnaire development next year.

Ipsos and the Home Office developed a recontact survey for respondents that had one of these types of contradictions. This applied to 146 respondents out of the 1,637 identifying any cyber security breaches or attacks at TYPE (i.e. 9% of those who answered the cyber crime questions). This recontact survey focused on revalidating the cyber crime data with alternative question wording, while also offering respondents the chance to explain why they may have given different answers in the main survey.

In total, 67 of these 146 respondents took part in the recontact survey (with the remainder being unavailable or refusing). A summary of the responses is as follows:

  • All but 8 respondents changed their responses from the original interview, suggesting that the wording of the original questions may have been confusing for them.

  • Among those that changed their responses, 3 initially still contradicted themselves by saying they had not incurred the cyber incidents they mentioned at TYPE in the original interview. However, they changed their answers after prompting from the interviewer that cyber security breaches or attacks that seemingly had no impact on their organisation still counted. This highlights that the varying terminology of cyber security breaches, attacks, incidents and cyber crimes may be differently interpreted by some organisations – for example, an unsuccessful attack may not be considered an “incident”.

  • Where respondents did change their answers, we have edited these in the final SPSS data to reflect the revalidated responses. However, the revalidated responses did not resolve all the apparent contradictions in the data, and these contradictions have therefore been left as they are in the final SPSS data. Ipsos discussed the possibility of removing all the remaining contradictory responses from the data with DSIT and the Home Office, but opted not to do this, as it would have meant potentially removing legitimate answers as well as contradictory ones.

  • Where responses changed, they had a negligible impact on the cyber crime estimates reported in the main Statistical Release. In most cases, respondents did not change their answers in such a way that a new cyber crime was recorded, or a cyber crime from the original interview was removed.

2.5 Fieldwork outcomes and response rate

We monitored fieldwork outcomes and response rates throughout fieldwork, and interviewers were given regular guidance on how to avoid common reasons for refusal. Table 2.4 shows the final outcomes, the response rate and the response rate adjusted for unusable or ineligible records, for businesses and charities. The approach for calculating these figures is covered later in this section.

Table 2.4: Fieldwork outcomes and response rate calculations for businesses and charities

Outcome Businesses Charities
Total selected from original sample frame 45,599 201,325
Sample without contact details or duplicates post-cleaning 3,409 30,829
Net: total sample with contact details 42,190 170,496
Sample with contact details left in reserve 9,371 162,884
Net: total sample used (i.e. excluding any left in reserve) 32,819 7,612
Unresponsive numbers 23,340 4,722
Refusals 2,911 486
Unusable leads with working numbers 3,185 1,186
Unusable numbers 593 107
Ineligible leads – established during screener 231 36
Incomplete interviews 146 51
Net: completed interviews 2,413 1,024
Expected eligibility of screened respondents 92% 97%
Response rate 7% 13%
Response rate adjusted for unusable or ineligible records 9% 17%

The fieldwork outcomes for state education institutions are shown in Table 2.5.

Table 2.5: Fieldwork outcomes and response rate calculations for state education institutions

Outcome Primary schools Secondary schools Further education Higher education
Total selected from original sample frame 20,783 3,886 305 175
Sample without contact details or duplicates post-cleaning 3,375 572 39 29
Net: total sample with contact details 17,408 3,314 266 146
Sample with contact details left in reserve 15,269 636 0 0
Net: total sample used (i.e. excluding any left in reserve) 2,139 2,678 266 146
Unresponsive numbers 1,642 2,200 185 73
Ineligible leads – established during screener 135 137 16 7
Refusals 88 68 18 9
Unusable leads with working numbers 15 32 3 4
Unusable numbers 5 8 0 0
Incomplete interviews 13 16 0 1
Net: completed interviews 241 217 44 52
Expected eligibility of screened respondents 98% 97% 100% 100%
Response rate 11% 8% 17% 36%
Response rate adjusted for unusable or ineligible records 12% 9% 18% 39%

Notes on response rate calculations

The following points explain the specific calculations and assumptions involved in producing these response rates; a worked example using the business figures follows the list:

  • Response rate = completed interviews / total sample used

  • Response rate adjusted for unusable or ineligible records = completed interviews / (completed interviews + incomplete interviews + refusals expected to be eligible + any remaining unresponsive numbers expected to be eligible)

  • Refusals exclude “soft” refusals. This is where the respondent was hesitant about taking part, so our interviewers backed away and avoided a definitive refusal.

  • Unusable leads with working numbers are where there was communication difficulty making it impossible to carry out the survey (e.g. a bad line, or language difficulty), as well as numbers called 10 or more times over fieldwork without ever being picked up.

  • Unusable numbers are where the number was in a valid format, so was loaded into the main survey sample batches, but which turned out to be wrong numbers, fax numbers, household numbers or disconnected.

  • Unresponsive numbers account for sample that had a working telephone number, but where the respondent was unreachable or unavailable for an interview during the fieldwork period, so eligibility could not be assessed.
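
As a worked example, the sketch below applies the two formulas to the business figures in Table 2.4. It assumes, as described above, that the expected eligibility rate is applied to both refusals and unresponsive numbers; on that assumption the calculation reproduces the published 7% and 9% figures.

```python
# Business figures from Table 2.4
total_sample_used = 32819
completed = 2413
incomplete = 146
refusals = 2911
unresponsive = 23340
expected_eligibility = 0.92   # expected eligibility of screened respondents

response_rate = completed / total_sample_used
adjusted = completed / (
    completed
    + incomplete
    + refusals * expected_eligibility       # refusals expected to be eligible
    + unresponsive * expected_eligibility   # unresponsive numbers expected to be eligible
)

print(f"Response rate: {response_rate:.0%}")        # 7%
print(f"Adjusted response rate: {adjusted:.0%}")    # 9%
```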

Response rates post-COVID-19 and expected negligible impact on the survey reliability

The adjusted response rates for all the sampled groups, other than higher education institutions, are lower than in earlier iterations of this study that took place before the COVID-19 pandemic. For example, the adjusted response rates for the last pre-pandemic survey in this series were 27% for businesses and 45% for charities.

The lower response rates compared to historic years are likely to be due to a combination of unique circumstances, including:

  • the hybrid working conditions adopted by many organisations since the pandemic

  • the ongoing challenge of declining response rates in telephone survey fieldwork in general, including in business surveys specifically.

More generally, there has been an increasing awareness of cyber security, potentially making businesses more reticent to take part in surveys on this topic.

Furthermore, the increase in the survey length from c.17 minutes in 2020 and earlier iterations, to c.24 minutes this year is also expected to have reduced the response rate – interviewers must mention the average length to respondents when they introduce the survey, and respondents are naturally less inclined to take part in longer interviews.

To a lesser extent, the existence of another DSIT organisational survey on cyber security, the Cyber Security Longitudinal Survey (CSLS), may have impacted the performance of this survey. Ipsos also undertook fieldwork for the CSLS. The CSLS fieldwork took place earlier, between March and July 2022. Organisations that took part in the CSLS were excluded from the sample for the Cyber Security Breaches Survey. However, organisations that were contacted for that survey but opted not to take part may also have been resampled and contacted anew for the Cyber Security Breaches Survey, and been less likely to take part as a result.

However, it is important to remember that response rates are not a direct measure of non-response bias in a survey, but only a measure of the potential for non-response bias to exist. Previous research into response rates, mainly with consumer surveys, has indicated that they are often poorly correlated with non-response bias.[footnote 6]

2.6 Data processing and weighting

Editing and data validation

There were a number of logic checks in the CATI script, which checked the consistency and likely accuracy of answers estimating costs and time spent dealing with breaches. If respondents gave unusually high or low answers at these questions relative to the size of their organisation, the interviewer would read out the response they had just recorded and double-check this is what the respondent meant to say. In addition, respondents overwhelmingly revalidated their answers at the cost questions in the online follow-up survey. This meant that, typically, minimal work was needed to manually edit the data post fieldwork.

Nonetheless, individual outliers in the data can heavily affect cyber breach cost estimates. Therefore, the research team manually checked the final data for outliers and recalculated the estimates without these outliers, in order to check the impact that they were having on answers. This year, we edited two outlier responses:

  • One respondent had given a total cost estimate at RANSCOSTA (the total cost of all the ransomware cyber crimes in the last 12 months) of £1.5 million. However, they also gave a combined total cost estimate for their single most disruptive cyber security breach or attack at the DAMAGE questions (e.g. DAMAGEDIRS) as £1.25 million. It is highly unlikely that the organisation would have had two extremely costly cyber incidents in the same year, so we assumed these were intended to be the same incident, and capped the answer at RANSCOSTA at £1.25 million.

  • One respondent had given a total cost estimate at RANSCOSTA of £100,000. However, they also gave a combined total cost estimate for their most disruptive breach or attack at the DAMAGE questions in the low £100s. As the DAMAGE questions are more well-established, and aim to break down multiple aspects of the cost of breaches, we assumed the £100,000 estimate was an error and edited this to a “don’t know” response.

The final SPSS data uploaded to the UK Data Archive excludes outlier responses.

Coding

The verbatim responses to unprompted questions could be coded as “other” by interviewers when they did not appear to fit into the predefined code frame. These “other” responses were coded manually by Ipsos’ coding team, and where possible, were assigned to codes in the existing code frame. It was also possible for new codes to be added where enough respondents – 10% or more – had given a similar answer outside of the existing code frame. The Ipsos research team verified the accuracy of the coding, by checking and approving each new code proposed.

It should be noted that the code list for INFO, GOVTACT, NOREPORT, REPORTB and PREVENT is getting relatively long, with additions of new codes each year that are only used by very small handfuls of respondents. These codeframes will be reviewed in the next iteration of the survey to rationalise them and avoid overly long and impractical lists of codes.

We did not undertake SIC coding. Instead, the SIC 2007 codes already held on the sample frame were used to assign businesses to a sector for weighting and analysis purposes. The pilot survey in 2017 had overwhelmingly found the SIC 2007 codes in the sample to be accurate, so this practice was carried forward to subsequent surveys.

Weighting

The education institutions samples are unweighted. Since they were sampled through a simple random sample approach, there were no sample skews to be corrected through weighting.

For the business and charities samples, we applied random iterative method (rim) weighting for two reasons. Firstly, to account for non-response bias where possible. Secondly, to account for the disproportionate sampling approaches, which purposely skewed the achieved business sample by size and sector, and the charities sample by income band. The weighting makes the data representative of the actual UK business and registered charities populations.

Rim weighting is a standard approach for business surveys of this nature, because it allows the sample to be weighted to represent a wider population using multiple variables. In cases where the weighting variables are strongly correlated with each other, it is potentially less effective than other methods, such as cell weighting. However, this is not the case here.

Non-interlocking rim weighting by size band and sector was undertaken for businesses. We did not weight by region, primarily because region is not considered to be an important determining factor for attitudes and behaviours around cyber security. Moreover, the final weighted data are already closely aligned with the business population region profile. The population profile data came from the BEIS Business Population Estimates 2022.

Non-interlocking rim weighting by income band and country was undertaken for charities. The population profile data for these came from the respective charity regulator databases.

For both businesses and charities, interlocking weighting was also possible, but was ruled out as it would have potentially resulted in very large weights. This would have reduced the statistical power of the survey results, without making any considerable difference to the weighted percentage scores at each question.
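
Rim weighting is an iterative proportional fitting (“raking”) procedure: the weights are repeatedly adjusted so that each margin in turn matches its population target, until all margins converge. The sketch below shows the core of the algorithm in Python for two non-interlocking margins, using a tiny hypothetical sample and hypothetical targets; it illustrates the technique and is not the production weighting specification.

```python
import pandas as pd

# Tiny hypothetical sample and hypothetical population targets for two margins.
sample = pd.DataFrame({
    "size":   ["micro", "micro", "small", "medium", "large", "micro"],
    "sector": ["retail", "finance", "retail", "finance", "retail", "retail"],
})
targets = {
    "size":   {"micro": 0.82, "small": 0.15, "medium": 0.02, "large": 0.01},
    "sector": {"retail": 0.70, "finance": 0.30},
}

weights = pd.Series(1.0, index=sample.index)

for _ in range(50):                                  # iterate until margins converge
    for var, target in targets.items():
        observed = weights.groupby(sample[var]).sum() / weights.sum()
        adjustment = {cat: target[cat] / observed.get(cat, 1.0) for cat in target}
        weights = weights * sample[var].map(adjustment).fillna(1.0)

# Weighted margins now (approximately) match the target profiles.
print((weights.groupby(sample["size"]).sum() / weights.sum()).round(3))
print((weights.groupby(sample["sector"]).sum() / weights.sum()).round(3))
```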

Table 2.6 and Table 2.7 show the unweighted and weighted profiles of the final data. The percentages are rounded, so do not always sum to 100%.

Table 2.6: Unweighted and weighted sample profiles for business interviews

Size Unweighted % Weighted %
Micro (1–9 staff) 61% 82%
Small (10–49 staff) 18% 15%
Medium (50–249 staff) 12% 3%
Large (250+ staff) 9% 1%
Sector Unweighted % Weighted %
Agriculture, forestry or fishing 2% 4%
Administration or real estate 11% 12%
Construction 11% 13%
Education 1% 2%
Entertainment, service or membership organisations 6% 7%
Finance or insurance 8% 2%
Food or hospitality 10% 10%
Health, social care or social work 4% 4%
Information or communications 7% 5%
Professional, scientific or technical 13% 13%
Retail or wholesale (including vehicle sales and repairs) 15% 17%
Transport or storage 4% 4%
Utilities or production (including manufacturing) 9% 7%

Table 2.7: Unweighted and weighted sample profiles for charity interviews

Income band Unweighted % Weighted %
£0 to under £10,000 22% 38%
£10,000 to under £100,000 18% 30%
£100,000 to under £500,000 18% 13%
£500,000 to under £5 million 18% 5%
£5 million or more 16% 1%
Unknown income 13% 13%
Country Unweighted % Weighted %
England and Wales 83% 84%
Northern Ireland 7% 4%
Scotland 10% 13%

2.7 SPSS data uploaded to UK Data Archive

A de-identified SPSS dataset from this survey is being published on the UK Data Archive to enable further analysis. The variables are largely consistent with those in the previously archived datasets (from 2018 to 2022), outside of new questions and deleted questions.

A new variable, HALF, has been included this year. It shows, for the questions that were split-sampled, how the business and charity samples were split (into halves A and B). Beyond this, the rest of this section covers all major differences from the 2022 SPSS file.

List of changes to old variables in the SPSS file

The following SPSS variable is no longer comparable with previous years due to significant changes in question wording (covered earlier in Section 2.1):

  • IDENT5.

The following questions, which were present in the 2022 SPSS data, were removed from the survey questionnaire, but we have kept the variables with blank data in the latest SPSS file. This preserves the numeric ordering of variables in the file (e.g. since there is a GOVTACT23 variable, we have kept GOVTACT22 blank rather than deleting it). The common reason for no longer requiring these variables is a desire to avoid adding too many codes to the already long codeframes for questions like INFO, GOVTACT, REPORTB and PREVENT. Many of the codes added in previous years for a small handful of respondents are no longer used, or can be back-coded to existing codes. We have relabelled these variables to make it clear they are no longer being used:

  • GOVTACT variables 22, 27 and 36

  • REPORTB variables 2, 4, 13, 19, 25, 27, 29, 31, 33, 34, 37-39, 41-42 and 44

  • PREVENT variables 48-52 and 54.

By contrast, some codes at these questions have been reinstated from previous years, as they were mentioned by respondents this year but not mentioned last year:

  • INFO24

  • PREVENT variables 17-19, 21, 22, 27-28, 31.

In 2022, there was a proliferation of codes at INFO that were only mentioned by a handful of respondents. This year, we have attempted to rationalise the code list, to remove codes that are unlikely to add substantial insight. This has meant removing several of the high-numbered INFO variables, reducing the highest number from INFO101 in 2022, to INFO83 in 2023. As a result, there are some discrepancies between the 2022 and 2023 SPSS files. Table 2.8 can be used for cross-referencing.

Table 2.8: Changes in labelling of the INFO variables between 2022 and 2023

Variable description 2022 variable name 2023 variable name
London Grid for Learning (LGFL) INFO82 INFO67
UK Cyber Security Council INFO83 INFO68
Centre for the Protection of National Infrastructure (CPNI) INFO68 INFO69
Cyber Crime Unit/Regional Crime Unit/Centre INFO85 INFO70
PCI Compliance INFO86 INFO71
Scottish Business Resilience Centre INFO87 INFO72
Cyber resilience centre/service INFO89 INFO73
Data protection officer/agency INFO90 INFO74
Independent/professional advisors INFO93 INFO75
Manufacturers INFO94 INFO76
Other businesses/organisations/schools INFO96 INFO77
Social media INFO97 INFO78
Web designer INFO99 INFO79
Church/Diocese INFO100 INFO80

NOREPORT is also a long pre-coded question but has not required amending. It was reinstated for the first time since the 2017 survey.

Revised mapping of 10 Steps guidance

As noted in Section 2.1, Ipsos engaged Professor Steven Furnell from the University of Nottingham in July 2022 to review how the questionnaire was mapped to the government’s 10 Steps to Cyber Security guidance, and suggest a more accurate and robust mapping. The rationale for changing the mapping was as follows:

  • ensuring that the Steps are kept mutually exclusive (so an organisation cannot automatically fulfil one step by fulfilling another)

  • setting an appropriately high bar for fulfilling each Step, while also ensuring that only particular types of organisations can fulfil a Step

  • streamlining the derived variables to avoid grouping several questions together where a single question would suffice.

The revised mapping of the 10 Steps to specific survey questions versus the 2022 mapping is summarised in Table 2.9. It highlights that all but two of these Steps (Steps 3 and 10) are no longer comparable to the 2022 survey (which itself was amended from the 2021 survey). A sketch showing how one of the revised Steps translates into a derived variable follows the table.

Table 2.9: New and previous mapping of the questionnaire to the 10 Steps to Cyber Security guidance

Step in SPSS Previous step description and mapping Current step description and mapping
Step1 Risk management – organisations at least annually update senior managers on cyber security actions and have or do at least 2 of the following:
- a cyber security policy or strategy
- adhere to Cyber Essentials or Cyber Essentials Plus
- have undertaken a cyber security risk assessment
- have cyber insurance (either a specific or non-specific policy)
- have undertaken cyber security vulnerability audits
- have an incident response plan
- have taken actions to manage the cyber risks from their immediate suppliers or wider supply chain
Risk management – organisations have undertaken a cyber security risk assessment (IDENT4)
Step2 Engagement and training – staff receive cyber security training, or the organisation has undertaken mock phishing exercises Engagement and training – staff receive cyber security training (TRAINED)
Step3 Asset management – organisations have a list of their critical assets Asset management – organisations have a list of their critical assets (MANAGE8)
Step4 Architecture and configuration – organisations have configured firewalls and at least 1 of the following:
- secure configurations, i.e. security controls on company devices
- a policy around what staff are permitted to do on company devices
Architecture and configuration – organisations have at least 3 of the following:
- up-to-date malware protection (RULES2)
- firewalls that cover your entire IT network, as well as individual devices (RULES3)
- restricting IT admin and access rights to specific users (RULES4)
- security controls on organisation-owned devices (e.g. laptops) (RULES7)
- only allowing access via organisation-owned devices (RULES8)
- separate WiFi networks for staff and for visitors (RULES9)
- specific rules for storing and moving personal data files securely (RULES15)
- a virtual private network, or VPN, for staff connecting remotely (RULES18)
Step5 Vulnerability management – organisations have a patching policy and at least 1 of the following:
- have undertaken cyber security vulnerability audits
- have undertaken penetration testing
- updated anti-malware
- a cyber security policy covering Software as a Service (SaaS)
Vulnerability management – organisations have a policy to apply software security updates within 14 days (RULES1)
Step6 Identity and access management – organisations have or do at least 1 of the following:
- restrict admin rights to specific users
- a password policy
- two-factor authentication (2FA)
Identity and access management – organisations have any requirement for two-factor authentication when people access the organisation’s network, or for applications they use (RULES20)
Step7 Data security – organisations have cloud backups or other kinds of backups, and at least 1 of the following:
- rules covering secure personal data transfers
- a cyber security policy covering removable storage
- a cyber security policy covering how to store data
Data security – organisations have cloud backups (RULES13) or other kinds of backups (RULES14)
Step8 Logging and monitoring – organisations fulfil one of the following criteria:
- use security monitoring tools
- they have a log of breaches and have had a breach
Logging and monitoring – organisations fulfil at least 1 of the following criteria:
- used specific tools designed for security monitoring, such as Intrusion Detection Systems (IDENT11)
- any monitoring of user activity (RULES5)
Step9 Incident management – organisations have at least 1 of the following:
- an incident response plan
- formal debriefs for cyber security incidents
Incident management – organisations have a formal incident response plan (INCIDCONTENT1) or at least 3 of the following:
- written guidance on who to notify of breaches (INCIDCONTENT2)
- roles or responsibilities assigned to specific individuals during or after an incident (INCIDCONTENT3)
- external communications and public engagement plans (INCIDCONTENT6)
- guidance around when to report incidents externally, e.g. to regulators or insurers (INCIDCONTENT11)
Step10 Supply chain security – organisations have taken actions to manage the cyber risks from their immediate suppliers or wider supply chain Supply chain security – organisations have taken actions to manage the cyber risks from their immediate suppliers (SUPPLYRISK1) or wider supply chain (SUPPLYRISK2)
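
As an illustration of how the current mapping translates into derived variables, the sketch below derives Step 4 (architecture and configuration) as “at least 3 of the listed RULES items”, using a small hypothetical dataset. The RULES variable names follow Table 2.9, but the 1/0 coding and the data are assumptions made for the example.

```python
import pandas as pd

# Hypothetical respondent data: 1 = rule in place, 0 = not in place.
df = pd.DataFrame({
    "RULES2":  [1, 0, 1],   # up-to-date malware protection
    "RULES3":  [1, 0, 1],   # network and device firewalls
    "RULES4":  [1, 0, 0],   # restricted IT admin and access rights
    "RULES7":  [0, 1, 0],   # security controls on organisation-owned devices
    "RULES8":  [0, 0, 0],   # access only via organisation-owned devices
    "RULES9":  [0, 0, 1],   # separate staff and visitor WiFi networks
    "RULES15": [0, 0, 0],   # rules for storing and moving personal data securely
    "RULES18": [0, 1, 0],   # VPN for staff connecting remotely
})

step4_items = ["RULES2", "RULES3", "RULES4", "RULES7",
               "RULES8", "RULES9", "RULES15", "RULES18"]

# Step 4 is fulfilled when at least 3 of the listed items are in place.
df["Step4"] = (df[step4_items].sum(axis=1) >= 3).astype(int)
print(df["Step4"].tolist())   # [1, 0, 1]
```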

Organisation size variables

There are two organisation size variables: a numeric variable (SIZEA) and a banded variable (SIZEB). The banded variable in the SPSS file does not include the highest band from the questionnaire (1,000 or more employees) because no analysis is carried out on this group (due to low sample sizes). Instead, it is merged into an overall large business (250 or more employees) size band, which is used across the published report.

Sector grouping before the 2019 survey

In the SPSS datasets for 2016 to 2018, an alternative sector variable (sector_comb1) was included. This variable grouped some sectors together in a different way, and was less granular than the updated sector variable (sector_comb2):

  • “education” and “health, social care or social work” were merged together, rather than being analysed separately

  • “information or communications” and “utilities” were merged together, whereas now “utilities” and “manufacturing” are merged together.

The previous grouping reflected how we used to report on sector differences before the 2019 survey. As this legacy variable has not been used in the report for the last two years, we have stopped including it in the SPSS dataset, in favour of the updated sector variable.

Derived financial cost estimates for cyber security breaches and attacks

For the questions in the survey estimating the financial costs of an organisation’s most disruptive breach or attack (DAMAGEDIRSX, DAMAGEDIRLX, DAMAGESTAFFX, DAMAGEINDX), respondents were asked to give either an approximate numeric response or, if they did not know, then a banded response. The vast majority of those who gave a response gave numeric responses (after excluding refusals and those saying there was no cost incurred).

We agreed with DSIT from the outset of the survey that for those who gave banded responses, a numeric response would be imputed, in line with all previous surveys in the series. This ensures that no survey data goes unused and also allows for larger sample sizes for these questions.

To impute numeric responses, syntax was applied to the SPSS dataset which:

  • calculated the mean amount within a banded range for respondents who had given numeric responses (e.g. a £200 mean amount for everyone giving an answer between £100 and £500)

  • applied this mean amount as the imputed value for all respondents who gave the equivalent banded response (i.e. £200 would be the imputed mean amount for everyone not giving a numeric response but saying “£100 to less than £500” as a banded response).

A common alternative approach is to take the mid-point of each banded response and use that as the imputed value (i.e. £300 for everyone saying “£100 to less than £500”). We decided against doing this for these specific questions, given that the mean responses within a banded range have tended to cluster towards the bottom of the band over the years. This suggested that imputing values based on mid-points would slightly overestimate the true values across respondents.
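
The sketch below illustrates this band-mean imputation on hypothetical cost data: exact responses are assigned to bands, the mean exact response within each band is calculated, and that mean is imputed for respondents who only gave the banded answer. The variable names and band edges are illustrative, not the survey’s.

```python
import numpy as np
import pandas as pd

# Hypothetical responses to a cost question: an exact figure where given,
# otherwise a banded answer.
df = pd.DataFrame({
    "cost_numeric": [150, 120, 480, np.nan, 2000, np.nan],
    "cost_band":    [None, None, None, "£100 to less than £500",
                     None, "£500 to less than £5,000"],
})

# Assign each numeric response to the band it falls in (simplified band edges).
bins = [0, 100, 500, 5000, np.inf]
labels = ["Under £100", "£100 to less than £500",
          "£500 to less than £5,000", "£5,000 or more"]
df["numeric_band"] = pd.cut(df["cost_numeric"], bins=bins, labels=labels, right=False)

# Mean of the numeric answers within each observed band...
band_means = (df.groupby("numeric_band", observed=True)["cost_numeric"]
              .mean().to_dict())

# ...imputed for respondents who only gave the banded answer
# (here the banded-only respondents receive 250 and 2000 respectively).
df["cost_imputed"] = df["cost_numeric"]
missing = df["cost_numeric"].isna()
df.loc[missing, "cost_imputed"] = df.loc[missing, "cost_band"].map(band_means)

print(df[["cost_band", "cost_imputed"]])
```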

Derived cyber crime estimates (including numeric and financial cost estimates)

There are a range of new derived variables in this year’s SPSS file based on the new cyber crime questions. Here is a brief description of each derived variable:

  • Cybercrime_all – the percentage of organisations that have experienced any cyber crime (i.e. excluding cyber-facilitated fraud)

  • Cybercrime_allsum – the total number of cyber crimes experienced (i.e. excluding cyber-facilitated fraud), rebased to only be amongst those that experienced cyber crimes

  • Cybercrime_notphish – the percentage of organisations that have experienced any cyber crime other than phishing (still excluding cyber-facilitated fraud)

  • Cybercrime_notphishsum – the total number of cyber crimes experienced, other than phishing (still excluding cyber-facilitated fraud), rebased to only be amongst those that experienced these cyber crimes

  • Cybercrime_rans – the percentage of organisations that have experienced cyber crime relating to ransomware

  • Cybercrime_ranssum – the total number of cyber crimes experienced relating to ransomware, rebased to only be amongst those that experienced these cyber crimes

  • Cybercrime_virus – the percentage of organisations that have experienced cyber crime relating to viruses or other malware

  • Cybercrime_virussum – the total number of cyber crimes experienced relating to viruses or other malware, rebased to only be amongst those that experienced these cyber crimes

  • Cybercrime_hack – the percentage of organisations that have experienced cyber crime relating to hacking

  • Cybercrime_hacksum – the total number of cyber crimes experienced relating to hacking, rebased to only be amongst those that experienced these cyber crimes

  • Cybercrime_dos – the percentage of organisations that have experienced cyber crime relating to denial of service attacks

  • Cybercrime_dossum – the total number of cyber crimes experienced relating to denial of service attacks, rebased to only be amongst those that experienced these cyber crimes

  • crime_fraud – the percentage of organisations that have experienced fraud as a result of cyber crime

  • crime_fraudsum – the total number of frauds experienced as a result of cyber crime, rebased to only be amongst those that experienced these frauds

  • Cybercrime_phish – the percentage of organisations that have experienced cyber crime relating to phishing

  • Cybercrime_phishsum – the total number of cyber crimes experienced relating to phishing, rebased to only be amongst those that experienced these cyber crimes

  • Extortion – the percentage of organisations that have experienced any extortion (among those experiencing cyber crimes relating to unauthorised access, online takeovers or denial of service)

  • Extortion_sum – the total number of extortion events, rebased to only be amongst those that experienced cyber crimes relating to unauthorised access, online takeovers or denial of service

  • hacksumcost_bands – the total cost of criminal hacking and online takeovers in the last 12 months, rebased to only be amongst those that provided a cost estimate for any relevant cyber crime experienced

  • notfraudcost_bands – the total cost of all cyber crimes (i.e. excluding cyber-facilitated fraud), rebased to only be amongst those that provided a cost estimate for any relevant cyber crime experienced

  • fraudcost_bands – the total cost of fraud that occurred as a result of cyber crime

  • crimecost_bands – the total cost of all crimes (including cyber-facilitated fraud), rebased to only be amongst those that provided a cost estimate for any crime experienced.

For the numeric and financial cost estimates for cyber crime, respondents were also able to give a banded response if they could not provide an exact answer. We have opted to impute the numeric or financial value for these questions by taking the mid-point of each banded response (or the specific value mentioned in the top band). This is different from the cyber incident cost estimates, which impute the average value within the band. The sample of cyber crime cost estimates is much smaller, so there is not enough data to impute reliable average values within bands; the mid-point values were therefore the only practical option.
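
The equivalent mid-point imputation is a simple band-to-value lookup, sketched below with hypothetical band definitions. The open-ended top band takes the specific value mentioned in its label rather than a mid-point.

```python
# Hypothetical band definitions for a cyber crime cost question.
band_values = {
    "Under £100": 50,                   # mid-point of £0 to £100
    "£100 to less than £500": 300,      # mid-point of the band
    "£500 to less than £5,000": 2750,   # mid-point of the band
    "£5,000 or more": 5000,             # open-ended top band: use the stated value
}

responses = ["£100 to less than £500", "£5,000 or more", "Under £100"]
imputed = [band_values[band] for band in responses]
print(imputed)   # [300, 5000, 50]
```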

Post-survey editing of derived variables covering cyber-facilitated fraud

The fraud-related derived variables, crime_fraud, crime_fraudsum and fraudcost_bands, come from the equivalent survey questions (FRAUD, FRAUDCOUNT_NUM and FRAUDCOSTA). During the reporting for this year, Ipsos, DSIT and the Home Office agreed to exclude instances where fraud was only preceded by impersonation and not by any other cyber security breach or attack. This is because impersonation is not a cyber crime, and the reported estimates intend to show fraud that occurs only as a result of cyber crime. The derived variables take account of this, and therefore exclude some of the data from the equivalent survey questions. Next year, the fraud-related questions will be refined to avoid the need for post-survey editing of the data.

Non-reporting of financial cost estimates for cyber-facilitated fraud

As a result of the post-survey editing of the cyber-facilitated fraud data, Ipsos, DSIT and the Home Office agreed not to report any cost estimates for cyber-facilitated fraud in the main Statistical Release this year. This is because the costs that organisations were asked to encompass at the FRAUDCOSTA question (which were transferred to the fraudcost_bands derived variable) explicitly included the cost of fraud resulting from online impersonation. As previously mentioned, online impersonation does not constitute a cyber crime in the context of this survey, and it was not possible to separate out online impersonation and cyber crime costs with the data available.

We have left this question and the associated derived variable in the SPSS file, but users should note that the cost data collected do not necessarily reflect only the cost of fraud that occurs as a result of cyber crime.

Redaction of financial cost estimates in published SPSS data

No numeric cost variables will be included in the published SPSS dataset, both for the cyber incident (DAMAGE) questions and the crime (COSTA) questions. This was agreed with DSIT to prevent any possibility of individual organisations being identified. Instead, all variables related to spending and cost figures will be banded, including the imputed values (laid out in the previous section). These banded variables include:

  • damagedirsx_bands

  • damagedirlx_bands

  • damagestaffx_bands

  • damageindx_bands

  • ransdema

  • ranspaya

  • ranscosta

  • viruscosta

  • hackcosta

  • tkvrcosta

  • doscosta

  • fraudcosta

  • hacksumcost_bands

  • notfraudcost_bands

  • fraudcost_bands

  • crimecost_bands.

In addition, the following merged or derived variables will be included:

  • merged region (region_comb), which includes collapsed region groupings to ensure that no individual respondent can be identified

  • a merged sector variable (sector_comb2), which matches the sector groupings used in the 2020 and 2019 main reports.

No region groupings are included for the education institution data, to avoid the risk of these schools, colleges or universities being identified.

Missing values

We have treated missing values consistently each year (a minimal sketch of the cost-variable handling follows the list):

  • For all non-cost data, only respondents that did not answer a question are treated as missing, and allocated a value of -1. That means that all responses, including “don’t know” (a value of -97) and “refused” responses (-99) are counted in the base and in any descriptive statistics.

  • For all cost data, i.e. damagedirs through to cost_bands, the “don’t know” (-97) and “refused” (-99) responses are treated as missing. Practically, this means that any analysis run on these variables systematically excludes “don’t know” and “refused” responses from the base. In other words, this kind of analysis (e.g. analysis to show the mean cost or median cost) only uses the respondents that have given a numeric or banded cost.
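
In practice, this means recoding -97 and -99 to system-missing before any cost statistics are calculated. A minimal sketch of the equivalent operation in Python is shown below; the values are illustrative.

```python
import numpy as np
import pandas as pd

# Hypothetical cost variable: numeric answers plus -97 (don't know) and -99 (refused).
costs = pd.Series([250, -97, 1200, -99, 0, 4800])

# Treat "don't know" and "refused" as missing for cost analysis only.
costs_clean = costs.replace({-97: np.nan, -99: np.nan})

print(costs_clean.mean())     # 1562.5 - based only on the 4 usable responses
print(costs_clean.median())   # 725.0
```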

Rounding differences between the SPSS dataset and published data

If running analysis on weighted data in SPSS, users must be aware that the default setting of the SPSS crosstabs command does not handle non-integer weighting in the same way as typical survey data tables.[footnote 7] Users may, therefore, see very minor differences in results between the SPSS dataset and the percentages in the main release, which consistently use the survey data tables. These should be differences of no more than one percentage point, and only occur on rare occasions.
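
The toy sketch below illustrates the general mechanism, under the assumption that the difference arises from rounding weighted counts to whole cases before calculating percentages; it is not a reproduction of the SPSS crosstabs command itself, and the tiny base here exaggerates the gap well beyond what would be seen at the survey's real base sizes.

```python
import pandas as pd

# Hypothetical weighted responses to a yes/no question.
df = pd.DataFrame({
    "answer": ["Yes", "Yes", "No", "No"],
    "weight": [0.6, 0.8, 1.2, 1.4],
})

# Survey-table style: sum the exact (non-integer) weights before percentaging.
exact = df.groupby("answer")["weight"].sum()        # No = 2.6, Yes = 1.4
print((exact / exact.sum() * 100).round(1))         # No 65.0%, Yes 35.0%

# Rounded-count style: round weighted counts to whole cases first.
rounded = exact.round()                             # No = 3, Yes = 1
print((rounded / rounded.sum() * 100).round(1))     # No 75.0%, Yes 25.0%
```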

Chapter 3: Qualitative approach technical details

The qualitative strand of this research covered all the sampled groups from the survey, except for further education colleges (these were also not included in qualitative interviewing in previous years). This reflected the expanded size of this year’s study, where we conducted 44 in-depth interviews overall (vs. 30 in 2022).

3.1 Sampling

We took the sample for all 44 in-depth interviews from the quantitative survey. We asked respondents during the survey whether they would be willing to be recontacted specifically to take part in a further 60-minute interview on the same topic. Table 3.1 shows the proportion of respondents from each group that agreed to recontact, the total recontact sample available, and the qualitative interviews undertaken with each group.

Table 3.1: Summary of qualitative sample counts and interviews

| Sample group | Achieved quantitative interviews | Permission for recontact | Recontact sample | Achieved qualitative interviews |
| --- | --- | --- | --- | --- |
| Businesses | 2,263 | 69% | 1,551 | 20 |
| Charities | 1,174 | 71% | 836 | 9 |
| Primary schools | 241 | 59% | 142 | 2 |
| Secondary schools | 217 | 60% | 130 | 6 |
| Higher education | 52 | 35% | 18 | 7 |

3.2 Recruitment quotas and screening

We carried out recruitment for the qualitative element by email and telephone, using the contact details collected in the survey, and via a specialist business recruiter. To encourage participation, we offered a £50 high street voucher, or a £50 donation made to a charity on the participant’s behalf.

We used recruitment quotas to ensure that interviews included a mix of different sizes, sectors and regions for businesses, and different charitable areas, income bands and countries for charities. We also had further quotas based on the responses in the quantitative survey, reflecting the topics to be discussed in the interviews. These ensured we spoke to a range of organisations that had:

  • a formal cyber security strategy

  • adopted specific cyber security standards or accreditations

  • formally reviewed supply chain cyber security risks (including for immediate suppliers and their wider supply chain)

  • some form of incident response planning

  • referenced their cyber security risks in a corporate annual report

  • used Managed Service Providers or other Digital Service Providers

  • identified cyber security breaches.

These were all administered as soft rather than hard quotas. This meant that the recruiter aimed to recruit a minimum number of participants in each group, and could exceed these minimums, rather than having to reach a fixed number of each type of respondent.

We also briefed the recruiter to carry out a further qualitative screening process of participants, to check that they felt capable of discussing at least some of the broad topic areas covered in the topic guide (laid out in the following section). The recruiter probed participants’ job titles and roles, and gave them further information about the topic areas over email. The intention was to screen out organisations that might have been willing to take part but would have had little to say on these topics.

3.3 Fieldwork

The Ipsos research team carried out all fieldwork in December 2022 and January 2023. We conducted the 44 interviews through a mix of telephone and Microsoft Teams calls. Interviews lasted around 60 minutes on average.

DSIT and the Home Office originally laid out their topics of interest for the 2023 study. Ipsos then drafted the interview topic guide around these topics, which was reviewed and approved by both departments. The qualitative topic guide has changed each year much more substantially than the quantitative questionnaire, in order to respond to the new findings that emerge from each year’s quantitative survey. The intention is for the qualitative research to explore new topics that were not necessarily as big or salient in previous years, as well as to look more in depth at the answers that organisations gave in this year’s survey. This year, the guide covered the following broad thematic areas:

  • perception of cyber security risk (including risks from supply chains and Digital Service Providers)

  • corporate annual reporting on cyber security risks

  • the impact of recent events and the economic environment (e.g. rising inflation and costs, supply chain issues and the Ukraine war) on approaches to cyber security

  • cyber security leadership and governance at the board level

  • incident response approaches.

There was not enough time in each interview to ask about all these topics, so we used a modular topic guide design, where the researcher doing the interview would know beforehand to only focus on a selection of these areas. Across the course of fieldwork, the core research team reviewed the notes from each interview and gave the fieldwork team guidance on which topics needed further coverage in the remaining interviews. This ensured we asked about each of these areas in a wide range of interviews, with at least 4 interviews covering each topic.

A full reproduction of the topic guide is available in Appendix B.

Tables 3.2 and 3.3 show a profile of the 20 interviewed businesses by size and sector.

Table 3.2: Sector profile of businesses in follow-up qualitative stage

| SIC 2007 letter | Sector description | Total |
| --- | --- | --- |
| A | Agriculture, forestry or fishing | 0 |
| B, C, D, E | Utilities or production (including manufacturing) | 5 |
| F | Construction | 0 |
| G | Retail or wholesale (including vehicle sales and repairs) | 3 |
| H | Transport or storage | 1 |
| I | Food or hospitality | 1 |
| J | Information or communications | 1 |
| K | Finance or insurance | 1 |
| L, N | Administration or real estate | 0 |
| M | Professional, scientific or technical | 4 |
| P | Education (excluding state education institutions) | 0 |
| Q | Health, social care or social work | 2 |
| R, S | Entertainment, service or membership organisations | 2 |
|  | Total | 20 |

Table 3.3: Size profile of businesses (by number of staff) in follow-up qualitative stage

| Size band | Total |
| --- | --- |
| Micro or small (1–49 staff) | 7 |
| Medium (50–249 staff) | 6 |
| Large (250+ staff) | 7 |

Table 3.4 shows a profile of the 9 interviewed charities by income band.

Table 3.4: Size profile of charities (by income band) in follow-up qualitative stage

| Income band | Total |
| --- | --- |
| £100,000 to under £500,000 | 1 |
| £500,000 to under £5 million | 4 |
| £5 million or more | 4 |

3.4 Analysis

Throughout fieldwork, the core research team discussed interim findings and outlined areas to focus on in subsequent interviews. Specifically, we held two face-to-face analysis meetings with the entire fieldwork team – one halfway through fieldwork and one towards the end of fieldwork. In these sessions, researchers discussed the findings from individual interviews, and we drew out emerging key themes, recurring findings and other patterns across the interviews. DSIT and the Home Office attended a separate analysis session during the latter part of fieldwork and helped identify what they saw as the most important findings, as well as areas worth exploring further in the remaining interviews.

We also recorded all interviews and summarised them in an Excel notes template, which categorised findings by topic area and the research questions within that topic area. The research team reviewed these notes, and also listened back to recordings, to identify the examples and verbatim quotes to include in the main report.

Chapter 4: Research burden

The Government Statistical Service (GSS) has a policy of monitoring and reducing the statistical survey burden on participants where possible; the burden imposed should be proportionate to the benefits arising from the use of the statistics. As a producer of statistics, DSIT is committed to monitoring and reducing the burden on those providing their information, and on those involved in collecting, recording and supplying data.

This section calculates the research compliance cost, in terms of the time cost on respondents, imposed by both the quantitative survey and qualitative fieldwork.

  • The quantitative survey had 3,991 respondents and the average (mean) survey length was 24 minutes. There was a further recontact survey of 68 respondents, lasting an average of 5 minutes. Therefore the research compliance cost for the quantitative survey this year was [(3,991 × 24 minutes) + (68 × 5 minutes) = 96,124 minutes ≈ 1,602 hours].

  • The qualitative research had 44 respondents and the average interview length was 60 minutes. Respondents completed the qualitative interviews in addition to the quantitative survey. The research compliance cost for the qualitative strand this year was [44 × 60 minutes = 44 hours].

In total, the compliance cost for the Cyber Security Breaches Survey 2023 was approximately 1,646 hours.
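
As a quick check on the arithmetic above, this minimal sketch recomputes the totals from the figures given in the bullets:

```python
# Research compliance cost in hours, using the figures stated above.
quant_minutes = 3_991 * 24 + 68 * 5   # main survey + recontact survey
qual_minutes = 44 * 60                # in-depth qualitative interviews
total_hours = (quant_minutes + qual_minutes) / 60
print(round(quant_minutes / 60), round(qual_minutes / 60), round(total_hours))
# -> 1602 44 1646
```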

Steps taken to minimise the research burden

Across both strands of fieldwork, we took the following steps to minimise the research burden on respondents:

  • making it clear that all participation was voluntary

  • informing respondents of the average time it takes to complete an interview at the start of the survey call, during recruitment for the qualitative research and again at the start of the qualitative interview

  • confirming that respondents were happy to continue if the interviews went over this average time

  • split-sampling certain questions – that is, asking them of a random half of respondents – to reduce the overall interview length

  • offering to carry out interviews at the times convenient for respondents, including evenings and weekends where requested

  • offering an online interview instead of a telephone one, according to the respondent’s preferences.

The study also adheres to Government Social Research Professional Guidance on ethics.

Appendix A: Questionnaire


ASK ALL

Q1B.TITLE

What is your job title?

CATI: PROMPT TO CODE, INCLUDING SENIORITY AND IF RELATED DIRECTLY TO CYBER SECURITY OR NOT

SINGLE CODE

Directly related to cyber security

Chief Information Officer (CIO)

Chief Information Security Officer (CISO)

Director of Security

Head of Cyber Security/Information Security

Other cyber security role WRITE IN

Directly related to IT

Senior IT role (e.g. IT director)

Non-senior IT role (e.g. IT manager, technician, administrator)

Not related to cyber security/IT – senior management level

Business owner

Chief Executive (CEO)/Managing Director (MD)

Chief Operations Officer (COO)/Operations Director

Finance Director/Controller

Headteacher

Trustee/treasurer/on trustee board

Other senior management role (e.g. director)

Partner

Chair

Not related to cyber security/IT – non-senior management level

General/office manager (not a director/trustee)

PA/secretary/admin

Teacher (not in senior management)

Other non-senior role


ASK IF BUSINESS

Q5X.TYPEX

Would you classify your organisation as … ?

READ OUT

INTERVIEWER NOTE: IF THEY HAVE A SOCIAL PURPOSE BUT STILL MAKE A PROFIT (E.G. PRIVATE PROVIDER OF HEALTH OR SOCIAL CARE) CODE AS CODE 1

SINGLE CODE

Mainly seeking to make a profit

A social enterprise

A charity or voluntary sector organisation

DO NOT READ OUT: Don’t know


ASK ALL

Q4.SIZEA

Including yourself, how many [IF BUSINESS/EDUCATION: employees/IF CHARITY: employees, volunteers and trustees] work for your organisation across the UK as a whole?

ADD IF NECESSARY: [IF BUSINESS/EDUCATION: By that [CATI I/WEB we] mean both full-time and part-time employees on your payroll, as well as any working proprietors or owners./IF CHARITY: By that [CATI I/WEB we] mean both full-time and part-time employees on your payroll, as well as people who regularly volunteer for your organisation.]

PROBE FOR BEST ESTIMATE BEFORE CODING DK

WRITE IN RANGE 2–500,000 (SOFT CHECK IF >99,999)

SINGLE CODE

Respondent is sole trader CLOSE SURVEY

Don’t know


ASK IF DON’T KNOW SIZE OF ORGANISATION (SIZEA CODE DK)

Q5.SIZEB

Which of these best represents the number of [IF BUSINESS/EDUCATION: employees/IF CHARITY: employees, volunteers and trustees] working for your organisation across the UK as a whole, including yourself?

PROBE FULLY

SINGLE CODE

Under 10

10–49

50–249

250–999

1,000 or more

DO NOT READ OUT: Don’t know


READ OUT/SHOW TO ALL

For the rest of the survey, we will be asking about cyber security. By this, we mean any strategy, processes, practices or technologies that organisations have in place to secure their networks, computers, programs or the data they hold from damage, attack or unauthorised access.


ASK HALF B IF BUSINESS/CHARITY, OR ALL IF EDUCATION

Q9.PRIORITY

How high or low a priority is cyber security to your organisation’s [IF BUSINESS: directors/IF CHARITY: trustees/IF EDUCATION: governors] or senior management? Is it …

READ OUT

SINGLE CODE

REVERSE SCALE EXCEPT FOR LAST CODE

Very high

Fairly high

Fairly low

Very low

DO NOT READ OUT: Don’t know


ASK IF MEDIUM OR LARGE BUSINESS, HIGH-INCOME CHARITY OR EDUCATION

Q11.UPDATE

Approximately how often, if at all, are your organisation’s [IF BUSINESS: directors/IF CHARITY: trustees/IF EDUCATION: governors] or senior management given an update on any actions taken around cyber security? Is it …

READ OUT

IF EDUCATION: INTERVIEWER NOTE: FOR EDUCATION INSTITUTIONS, “EVERY TERM” MEANS QUARTERLY

SINGLE CODE

REVERSE SCALE EXCEPT FOR LAST 2 CODES

Never

Less than once a year

Annually

Quarterly

Monthly

Weekly

Daily

DO NOT READ OUT: Each time there is a breach or attack

DO NOT READ OUT: Don’t know


ASK HALF A IF BUSINESS/CHARITY, OR ALL IF EDUCATION

Q23X.INSUREX

There are general insurance policies that provide cover for cyber security breaches or attacks, among other things. There are also specific insurance policies that are solely for this purpose. Which of the following best describes your situation?

READ OUT

SINGLE CODE

We have a specific cyber security insurance policy

We have cyber security cover as part of a broader insurance policy

We are not insured against cyber security breaches or attacks

DO NOT READ OUT: Don’t know


ASK HALF B IF BUSINESS/CHARITY, OR ALL IF EDUCATION

Q24.INFO

In the last 12 months, from where, if anywhere, have you sought information, advice or guidance on the cyber security threats that your organisation faces?

DO NOT READ OUT

INTERVIEWER NOTE: IF “GOVERNMENT”, THEN PROBE WHERE EXACTLY

PROBE FULLY (“ANYWHERE ELSE?”)

WRITE IN TEXT BOX IF WEB

MULTICODE IF CATI

Government/public sector

Government’s 10 Steps to Cyber Security guidance

Government’s Cyber Aware website/materials

Government’s Cyber Essentials materials

Government intelligence services (e.g. GCHQ)

GOV.UK/Government website (excluding NCSC website)

Cyber Resilience Centres

Action Fraud

Government – other WRITE IN

National Cyber Security Centre (NCSC) website/offline

Police

Regulator (e.g. Financial Conduct Authority) – but excluding Charity Commission

Charity related

Association of Chief Executives of Voluntary Organisations (ACEVO)

Charity Commission (England and Wales, Scotland or Northern Ireland)

Charity Finance Group (CFG)

Community Accountants

Community Voluntary Services (CVS)

Institute of Fundraising (IOF)

National Council For Voluntary Organisations (NCVO)

Other local infrastructure body

Other national infrastructure body

Education related

Jisc/the Janet network

Department for Education (DfE)

Ofsted

Secure Schools programme

Teachers’ unions (e.g. NASUWT, NEU or NUT)

Other specific organisations

Cyber Security Information Sharing Partnership (CISP)

Professional/trade/industry/volunteering association

Security bodies (e.g. ISF or IISP)

Security product vendors (e.g. AVG, Kaspersky etc)

UK Cyber Security Council

Internal

Within your organisation – senior management/board

Within your organisation – other colleagues or experts

External

Auditors/accountants

Bank/business bank/bank’s IT staff

External security/IT consultants/cyber security providers

Internet Service Provider

LinkedIn

Newspapers/media

Online searching generally/Google

Specialist IT blogs/forums/websites

Other (non-government) WRITE IN

SINGLE CODE

Nowhere

Don’t know


ASK HALF B IF BUSINESS/CHARITY, OR ALL IF EDUCATION

Q24D.SCHEME

There are various Government campaigns, schemes, information and guidance on cyber security. Which, if any, of the following have you heard of?

READ OUT

ASK AS A GRID

RANDOMISE LIST

  1. The Cyber Essentials scheme
  2. The 10 Steps to Cyber Security
  3. IF MICRO OR SMALL BUSINESS: Any Small Business Guides, such as the Small Business Guide to Cyber Security, or the Small Business Guide to Response and Recovery
  4. IF MEDIUM OR LARGE BUSINESS, CHARITY OR EDUCATION: The Cyber Security Board Toolkit
  5. IF CHARITY: The Cyber Security Small Charity Guide
  6. Cyber Aware Campaign

SINGLE CODE PER ROW

Yes

No

DO NOT READ OUT: Don’t know


ASK IF BUSINESS/CHARITY AND SEEN OR HEARD GOVERNMENT GUIDANCE

Q24E.GOVTACT

What, if anything, have you changed or implemented at your organisation after seeing or hearing any government campaigns or guidance on cyber security?

DO NOT READ OUT

PROBE FULLY (“ANYTHING ELSE?”)

WRITE IN TEXT BOX IF WEB

MULTICODE IF CATI

Governance changes

Increased spending

Changed nature of the business/activities

New/updated business continuity plans

New/updated cyber policies

New checks for suppliers/contractors

New procurement processes, e.g. for devices/IT

New risk assessments

Increased senior management oversight/involvement

Technical changes

Changed/updated firewall/system configurations

Changed user admin/access rights

Increased monitoring

New/updated antivirus/anti-malware software

Other new software/tools (not antivirus/anti-malware)

Penetration testing

People/training changes

Outsourced cyber security/hired external provider

Recruited new staff

Staff training/communications

Vetting staff/extra vetting

Other WRITE IN

SINGLE CODE

Nothing done

Only heard about guidance, not read it

Don’t know

READ OUT/SHOW TO ALL

Now we would like to ask some questions about your current cyber security processes and procedures. Just to reassure you, we are not looking for a “right” or “wrong” answer. If you don’t do or have the things we’re asking about, just say so and we’ll move on.


ASK ALL

Q29.MANAGE

Which of the following governance or risk management arrangements, if any, do you have in place? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

[IF BUSINESS: Board members/IF CHARITY: Trustees/IF EDUCATION: A governor or senior manager] with responsibility for cyber security

An outsourced provider that manages your cyber security

A formal policy or policies in place covering cyber security risks

A Business Continuity Plan that covers cyber security

A written list of the most critical data, systems or assets that your organisation wants to protect

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK HALF B IF BUSINESS/CHARITY, OR ALL IF EDUCATION

Q29A.COMPLY

Which of the following standards or accreditations, if any, does your organisation adhere to? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST BUT KEEP CODES 4 AND 5 TOGETHER

ISO 27001

The Payment Card Industry Data Security Standard, or PCI DSS

Any National Institute of Standards and Technology (NIST) standards

IF HEARD OF CYBER ESSENTIALS: The Cyber Essentials standard

IF HEARD OF CYBER ESSENTIALS: The Cyber Essentials Plus standard

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK ALL

Q30.IDENT

And which of the following, if any, have you done over the last 12 months to identify cyber security risks to your organisation? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

A cyber security vulnerability audit

A risk assessment covering cyber security risks

Used or invested in threat intelligence

Used specific tools designed for security monitoring, such as Intrusion Detection Systems

Penetration testing

Testing staff awareness and response (e.g. via mock phishing exercises)

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK IF CARRIED OUT AN AUDIT

Q30A.AUDIT

Were any cyber security audits carried out internally by staff, by an external contractor, or both?

DO NOT PROMPT IF CATI. IF WEB SHOW LIST.

SINGLE CODE

Only internally by staff

Only by an external contractor

Both internal and external

Don’t know


ASK ALL

Q31.RULES

And which of the following rules or controls, if any, do you have in place? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

CODE 11 MUST FOLLOW CODE 10

A policy to apply software security updates within 14 days

Up-to-date malware protection

Firewalls that cover your entire IT network, as well as individual devices

Restricting IT admin and access rights to specific users

Any monitoring of user activity

Specific rules for storing and moving personal data files securely

Security controls on company-owned devices (e.g. laptops)

Only allowing access via company-owned devices

Separate WiFi networks for staff and for visitors

Backing up data securely via a cloud service

Backing up data securely via other means

A password policy that ensures users set strong passwords

A virtual private network, or VPN, for staff connecting remotely

An agreed process for staff to follow when they identify a fraudulent email or malicious website

Any requirement for two-factor authentication when people access your network, or for applications they use

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK IF HAVE POLICIES

Q32.POLICY

Which of the following aspects, if any, are covered within your cyber security-related policy, or policies? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

What can be stored on removable devices (e.g. USB sticks)

Remote or mobile working (e.g. from home)

What staff are permitted to do on your organisation’s IT devices

Use of personally-owned devices for business activities

Use of cloud computing

Use of network-connected devices, sometimes called smart devices

Use of Software as a Service, or SaaS

How you’re supposed to store data

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK IF HAVE POLICIES

Q33A.REVIEW

When were any of your policies or documentation for cyber security last created, updated, or reviewed to make sure they were up-to-date?

PROBE FULLY

INTERVIEWER NOTE: IF NEVER UPDATED OR REVIEWED, ANSWER IS WHEN POLICIES WERE CREATED

IF WEB SHOW LIST

SINGLE CODE

Within the last 3 months

3 to under 6 months ago

6 to under 12 months ago

12 to under 24 months ago

24 months ago or earlier

DO NOT READ OUT: Don’t know


ASK ALL

Q33B.TRAINED

In the last 12 months, have you carried out any cyber security training or awareness raising sessions specifically for any [IF BUSINESS/EDUCATION: staff/IF CHARITY: staff or volunteers] who are not directly involved in cyber security?

SINGLE CODE

Yes

No

Don’t know


ASK IF MEDIUM OR LARGE BUSINESS, HIGH-INCOME CHARITY OR EDUCATION

Q33D.STRATEGY

Does your organisation have a formal cyber security strategy, i.e. a document that underpins all your policies and processes?

SINGLE CODE

Yes

No

Don’t know


ASK IF HAVE A STRATEGY

Q33E.STRATINT

In the last 12 months, has this strategy been reviewed by your organisation’s [IF BUSINESS: directors/IF CHARITY: trustees/IF EDUCATION: governors] or senior management?

SINGLE CODE

REVERSE SCALE EXCEPT FOR LAST CODE

Yes

No

DO NOT READ OUT: Don’t know


ASK IF MEDIUM OR LARGE BUSINESS, OR HIGH-INCOME CHARITY

Q33H.CORPORATE

This next section is about how cyber security is discussed in any publicly available annual reports of your organisation’s activities.

Firstly, did your organisation publish an annual report in the last 12 months?

SINGLE CODE

Yes

No

Don’t know


ASK IF HAVE AN ANNUAL REPORT

Q33I.CORPRISK

Did your latest annual report cover any cyber security risks faced by your organisation?

SINGLE CODE

Yes

No

Don’t know


READ OUT/SHOW TO ALL BUSINESSES

The next question is about suppliers. This is not just security or IT suppliers. It includes any immediate suppliers that directly provide goods or services to your organisation. We also ask about your wider supply chain, i.e. your suppliers’ suppliers.


READ OUT/SHOW TO ALL CHARITIES OR EDUCATION

The next question is about third-party organisations you work with. This includes any immediate suppliers that directly provide goods or services to your organisation, or partners such as local authorities. We also ask about your wider supply chain, i.e. your suppliers’ suppliers.


ASK ALL

Q45B.SUPPLYRISK

Has your organisation carried out any work to formally review the following?

READ OUT

ASK AS A GRID

  1. The potential cyber security risks presented by your immediate suppliers [IF CHARITY/EDUCATION: or partners]
  2. The potential cyber security risks presented by your wider supply chain, i.e. your suppliers’ suppliers

SINGLE CODE

Yes

No

DO NOT READ OUT: Don’t know


ASK IF BUSINESS OR CHARITY AND REVIEWED ANY SUPPLY CHAIN RISKS

Q45D.BARRIER

Which of the following, if any, have made it difficult for your organisation to manage any cyber security risks from your supply chain [IF CHARITY/EDUCATION: or partners]? IF WEB: Please select all that apply.

READ OUT

MULTICODE

RANDOMISE LIST

Lack of time or money to dedicate to this

Lack of skills to be able to check suppliers [IF CHARITY/EDUCATION: or partners] in this way

Not knowing what kinds of checks to carry out

Not knowing which suppliers [IF CHARITY/EDUCATION: or partners] to check

We can’t get the necessary information from suppliers [IF CHARITY/EDUCATION: or partners] to carry out checks

It’s not a priority when working with suppliers [IF CHARITY/EDUCATION: or partners]

SINGLE CODE

NOT PART OF RANDOMISATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK ALL

Q53A.TYPE

Have any of the following happened to your organisation in the last 12 months, or not? IF WEB: Please select all that apply.

READ OUT

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

MULTICODE

ROTATE LIST

CODE 2 MUST FOLLOW CODE 1

CODES 7, 8 AND 9 TO STAY IN ORDER

Computers becoming infected with ransomware

Computers becoming infected with other malware (e.g. viruses or spyware)

Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

Hacking or attempted hacking of online bank accounts

People impersonating your organisation in emails or online

Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

Unauthorised accessing of files or networks by staff, even if accidental

IF EDUCATION: Unauthorised accessing of files or networks by students

Unauthorised accessing of files or networks by people [IF BUSINESS/CHARITY: outside your organisation/IF EDUCATION: other than staff or students]

Unauthorised listening into video conferences or instant messaging

Takeovers or attempts to take over your website, social media accounts or email accounts

MULTICODE

NOT PART OF ROTATION

Any other types of cyber security breaches or attacks

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these

DO NOT READ OUT: Prefer not to say


ASK IF ANY BREACHES OR ATTACKS

Q54.FREQ

Approximately, how often in the last 12 months did you experience any of the cyber security breaches or attacks you mentioned? Was it …

READ OUT

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

SINGLE CODE

Once only

More than once but less than once a month

Roughly once a month

Roughly once a week

Roughly once a day

Several times a day

DO NOT READ OUT: Don’t know

DO NOT READ OUT: Prefer not to say


ASK IF ANY BREACHES OR ATTACKS

Q56A.OUTCOME

Thinking of all the cyber security breaches or attacks experienced in the last 12 months, which, if any, of the following happened as a result? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

CODE 4 MUST FOLLOW CODE 3

CODE 7 MUST FOLLOW CODE 6

Software or systems were corrupted or damaged

Personal data (e.g. on [IF BUSINESS: customers or staff/IF CHARITY: beneficiaries, donors, volunteers or staff/IF EDUCATION: students or staff]) was altered, destroyed or taken

Permanent loss of files (other than personal data)

Temporary loss of access to files or networks

Lost or stolen assets, trade secrets or intellectual property

Money was stolen

Money was paid as a ransom

Your website, applications or online services were taken down or made slower

Lost access to any third-party services you rely on

Physical devices or equipment were damaged or corrupted

Compromised accounts or systems used for illicit purposes (e.g. launching attacks)

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK IF ANY BREACHES OR ATTACKS

Q57.IMPACT

And have any of these breaches or attacks impacted your organisation in any of the following ways, or not? IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

CODE 4 MUST FOLLOW CODE 3

Stopped staff from carrying out their day-to-day work

Loss of [IF BUSINESS: revenue or share value/ELSE: income]

Additional staff time to deal with the breach or attack, or to inform [IF BUSINESS: customers/IF CHARITY: beneficiaries/IF EDUCATION: students, parents] or stakeholders

Any other repair or recovery costs

New measures needed to prevent or protect against future breaches or attacks

Fines from regulators or authorities, or associated legal costs

Reputational damage

IF BUSINESS/CHARITY: Prevented provision of goods or services to [IF BUSINESS: customers/IF CHARITY: beneficiaries or service users]

Discouraged you from carrying out a future business activity you were intending to do

Complaints from [IF BUSINESS: customers/IF CHARITY: beneficiaries or stakeholders/IF EDUCATION: students or parents]

IF BUSINESS/CHARITY: Goodwill compensation or discounts given to customers

SINGLE CODE

NOT PART OF ROTATION

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


READ OUT/SHOW IF ANY BREACHES OR ATTACKS

We are now going to ask you some questions about the breaches you have experienced. We understand that several events or breaches may form part of the same incident in which several different types of breach happen. In these cases we would only like you to consider the final event in a sequence. For instance, if you were the victim of a phishing attack that led to your computer becoming infected with a virus, we would only like you to consider the virus when answering these questions.


READ OUT/SHOW IF EXPERIENCED RANSOMWARE

You said you experienced at least one incident where computers became infected with ransomware. We are now going to ask you some questions to determine the nature of the attack.

IF EXPERIENCED VIRUSES OR OTHER MALWARE: Please exclude any events relating to malware that did not involve a ransomware attack from your answers to this set of questions.


ASK IF EXPERIENCED RANSOMWARE

Q83A.RANSCOUNT

How many times in the last 12 months did computers become infected with ransomware?

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY RANSOMWARE EVENTS EXPERIENCED

Q83B.RANSCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SAY THEY EXPERIENCED NO RANSOMWARE EVENTS

Q83B2.RANSNONE

You previously stated you experienced at least one ransomware attack in the last 12 months. Are you sure you experienced none?

SINGLE CODE

Yes

No REASK RANSCOUNT

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE RANSOMWARE EVENT

Q83C.RANSFIN

We understand that several events or breaches may form part of the same incident in which several different types of breach happen. [IF MORE THAN ONE RANSOMWARE EVENT: We would now only like you to consider ransomware infections which were the final event in the incident.]

[IF MORE THAN ONE RANSOMWARE EVENT: How many of the [INSERT RESPONSE FROM RANSCOUNT/RANSCOUNTDK] times your computer became infected with ransomware in the last 12 months was it the final event in an incident?]

[IF ONE RANSOMWARE EVENT: Was the time your computer became infected with ransomware in the last 12 months the final event in an incident?]

By this we mean it did not lead to any subsequent events such as hacking or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN RANSCOUNT OR RANSCOUNTDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY RANSOMWARE FINAL EVENTS EXPERIENCED

Q83D.RANSFINDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE RANSOMWARE FINAL EVENT

Q83E.RANSSOFT

Ransomware attacks can be stopped upon infection by internal or third party software before they make an impact. We’d now like you to consider the number of times you’ve experienced ransomware infection that has not been stopped by such software.

[IF MORE THAN ONE RANSOMWARE FINAL EVENT: How many of the [INSERT RESPONSE FROM RANSFIN/RANSFINDK] times where the ransomware infection was the final event in an incident was it not stopped by any internal or third party provided software?]

[IF ONE RANSOMWARE FINAL EVENT: Was the time your computer became infected with ransomware in the last 12 months not stopped by internal or third party provided software?]

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN RANSFIN OR RANSFINDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY SUCCESSFUL RANSOMWARE FINAL EVENTS EXPERIENCED

Q83F.RANSSOFTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL RANSOMWARE FINAL EVENTS EXPERIENCED

Q83G.RANSCONT

[IF MORE THAN ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: Thinking of the wider circumstances and causes of the ransomware attacks you have experienced that were not stopped by software in the last 12 months, how many were preceded by … ?]

[IF ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: Thinking of the wider circumstances and causes of the ransomware attack you experienced that was not stopped by software in the last 12 months, was it preceded by … ?]

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN RANSSOFT OR RANSSOFTDK.

IF CODE 2 AT TYPE: Computers becoming infected with other malware (e.g. viruses or spyware)

IF CODE 3 AT TYPE: Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

IF CODE 4 AT TYPE: Hacking or attempted hacking of online bank accounts

IF CODE 6 AT TYPE: Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

IF CODES 7-9 AT TYPE: Unauthorised accessing of files or networks by staff or [IF BUSINESS/CHARITY: people outside your organisation/IF EDUCATION: students, or people other than staff or students,] even if accidental

IF CODE 10 AT TYPE: Unauthorised listening into video conferences or instant messaging

IF CODE 11 AT TYPE: Takeovers or attempts to take over your website, social media accounts or email accounts


ASK IF SUCCESSFUL RANSOMWARE FINAL EVENTS EXPERIENCED

Q83H.RANSDEMA

How much was the total [IF MORE THAN ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: demanded in ransoms for all ransomware incidents experienced in the last 12 months] [IF ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: ransom demanded for this ransomware incident]?

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No ransom demanded

Don’t know

Prefer not to say


ASK IF DON’T KNOW RANSOM DEMANDED

Q83I.RANSDEMB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL RANSOM DEMANDED

Q83J.RANSPAYYN

And did you pay any of this amount to the perpetrators?

IF YES PROMPT TO CODE

SINGLE CODE

Yes, totally

Yes, partially

No

Don’t know


ASK IF PARTIALLY PAID RANSOM

Q83K.RANSPAYA

How much did you end up paying in ransoms to the perpetrators for [IF MORE THAN ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: all ransomware incidents experienced in the last 12 months] [IF ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: this ransomware incident]?

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL RANSOM PAID

Q83L.RANSPAYB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED

Q83M.RANSCOSTA

How much do you think [IF MORE THAN ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: all ransomware incidents experienced that were the final events in an incident and not stopped by software in the last 12 months have] [IF ONE SUCCESSFUL RANSOMWARE FINAL EVENT EXPERIENCED: this ransomware incident that was the final event in an incident and not stopped by software has] cost your organisation financially in total?

Please include any costs such as legal fees, staff time spent on recovery and investigation, lost revenue, costs of any external contractors to help resolve the issue, insurance excess, or any costs related to damaged files or software, or purchasing any new software.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No cost incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL COST OF SUCCESSFUL RANSOMWARE FINAL EVENTS

Q83N.RANSCOSTB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


READ OUT/SHOW IF EXPERIENCED VIRUSES OR OTHER MALWARE

You said you experienced at least one incident where computers became infected with malware such as viruses or spyware. We are now going to ask you some questions to determine the nature of the attack.

IF EXPERIENCED RANSOMWARE: Please exclude any event relating to ransomware in your answers for this set of questions.


ASK IF EXPERIENCED VIRUSES OR OTHER MALWARE

Q84A.VIRUSCOUNT

How many times in the last 12 months did computers become infected with malware such as viruses or spyware?

If multiple computers were infected through a single attack this counts as one incident.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY VIRUSES OR OTHER MALWARE EVENTS EXPERIENCED

Q84B.VIRUSCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SAY THEY EXPERIENCED NO VIRUS OR OTHER MALWARE EVENTS

Q84B2.VIRUSNONE

You previously stated you experienced at least one incident where you became infected with viruses or other malware in the last 12 months. Are you sure you experienced none?

SINGLE CODE

Yes

No REASK VIRUSCOUNT

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE VIRUS OR OTHER MALWARE EVENT

Q84C.VIRUSFIN

We understand that several events may form part of the same incident in which several different types of breach happen.

[IF MORE THAN ONE VIRUS OR OTHER MALWARE EVENT: We would now only like you to consider malware infections such as viruses or spyware which were the final event in the incident. How many of the [INSERT RESPONSE FROM VIRUSCOUNT/VIRUSCOUNTDK] times your computer became infected with malware in the last 12 months was it the final event in an incident?]

[IF ONE VIRUS OR OTHER MALWARE EVENT: Was the time your computer became infected with malware in the last 12 months the final event in an incident?]

By this we mean it did not lead to any subsequent events such as hacking or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN VIRUSCOUNT OR VIRUSCOUNTDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY VIRUS OR OTHER MALWARE FINAL EVENTS EXPERIENCED

Q84D.VIRUSFINDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE VIRUS OR OTHER MALWARE FINAL EVENT

Q84E.VIRUSSOFT

Malware attacks, such as viruses or spyware, can be stopped upon infection by internal or third party software before they make an impact. We’d now like you to consider when you’ve experienced malware infection that has not been stopped by such software.

[IF MORE THAN ONE VIRUS OR OTHER MALWARE FINAL EVENT: How many of the [INSERT RESPONSE FROM VIRUSFIN/VIRUSFINDK] times where the malware infection was the final event in an incident was it not stopped by any internal or third party provided software?]

[IF ONE VIRUS OR OTHER MALWARE FINAL EVENT: Was the time your computer became infected with malware in the last 12 months not stopped by internal or third party provided software?]

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN VIRUSFIN OR VIRUSFINDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENTS EXPERIENCED

Q84F.VIRUSSOFTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENTS EXPERIENCED

Q84G.VIRUSCONT

[IF MORE THAN ONE SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENT EXPERIENCED: Thinking of the wider circumstances and causes of when your computers became infected with malware in the last 12 months, how many were preceded by … ?]

[IF ONE SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENT EXPERIENCED: Thinking of the wider circumstances and causes of when your computers became infected with malware in the last 12 months, was it preceded by … ?]

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN VIRUSCOUNT OR VIRUSCOUNTDK.

IF CODE 1 AT TYPE: Computers becoming infected with ransomware

IF CODE 3 AT TYPE: Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

IF CODE 4 AT TYPE: Hacking or attempted hacking of online bank accounts

IF CODE 6 AT TYPE: Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

IF CODES 7-9 AT TYPE: Unauthorised accessing of files or networks by staff or [IF BUSINESS/CHARITY: people outside your organisation/IF EDUCATION: students, or people other than staff or students,] even if accidental

IF CODE 10 AT TYPE: Unauthorised listening into video conferences or instant messaging

IF CODE 11 AT TYPE: Takeovers or attempts to take over your website, social media accounts or email accounts

SINGLE CODE

Don’t know

Prefer not to say


ASK IF SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENT EXPERIENCED

Q84I.VIRUSCOSTA

How much do you think [IF MORE THAN ONE SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENT EXPERIENCED: all malware incidents experienced that were the final events in an incident and not stopped by software in the last 12 months have] [IF ONE SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENT EXPERIENCED: this malware incident that was the final event in an incident and not stopped by software has] cost your organisation financially in total?

Please include any costs such as legal fees, staff time spent on recovery and investigation, lost revenue, costs of any external contractors to help resolve the issue, insurance excess, or any costs related to damaged files or software, or purchasing any new software.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No cost incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL COST OF SUCCESSFUL VIRUS OR OTHER MALWARE FINAL EVENTS

Q84J.VIRUSCOSTB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


READ OUT/SHOW IF EXPERIENCED UNAUTHORISED ACCESS

You said you experienced someone accessing files, networks, social media, websites or email accounts or listening to conference calls without authorisation on at least one occasion in the last 12 months. We are now going to ask you some questions about the nature of these breaches.


ASK IF EXPERIENCED UNAUTHORISED ACCESS

Q85A.HACKCOUNT

How many times in the last 12 months has someone accessed your files or networks, social media, or listened to conference calls without authorisation?

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY UNAUTHORISED ACCESS EVENTS EXPERIENCED

Q85B.HACKCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SAY THEY EXPERIENCED NO UNAUTHORISED ACCESS EVENTS

Q85B2.HACKNONE

You previously stated you experienced at least one incident where someone accessed your files or networks, social media, or listened to conference calls without authorisation in the last 12 months. Are you sure you experienced none?

SINGLE CODE

Yes

No REASK HACKCOUNT

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE UNAUTHORISED ACCESS EVENT

Q85C.HACKFIN

We understand that several events may form part of the same incident in which several different types of breach happen.

[IF MORE THAN ONE UNAUTHORISED ACCESS EVENT: We would now only like you to consider events where someone accessed your files, networks or listened to conference calls without authorisation which were the final event in the incident. How many of the [INSERT RESPONSE FROM HACKCOUNT/HACKCOUNTDK] times someone accessed your files, networks or listened to conference calls without authorisation in the last 12 months was it the final event in an incident?]

[IF ONE UNAUTHORISED ACCESS EVENT: Was the time someone accessed your files, networks or listened to conference calls without authorisation in the last 12 months the final event in an incident?]

By this we mean it did not lead to any subsequent events such as a virus or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN HACKCOUNT OR HACKCOUNTDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY UNAUTHORISED ACCESS FINAL EVENTS EXPERIENCED

Q85D.HACKFINDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE UNAUTHORISED ACCESS FINAL EVENT

Q85E.HACKSIV

Some events where unauthorised access is gained might be accidental, whereas for others you might be deliberately targeted. We’d now like you to only consider unauthorised access where your organisation has been deliberately targeted.

By your organisation being deliberately targeted we mean events where your organisation’s files or networks were hacked into in a targeted manner. For instance, if an employee has accidentally used a machine or accessed a file they did not have permission to use, this would not count. However, where someone has knowingly gained unauthorised access into your network drives by getting login details via a phishing attack, this would count.

[IF MORE THAN ONE UNAUTHORISED ACCESS FINAL EVENT: How many of the [INSERT RESPONSE FROM HACKFIN/HACKFINDK] times where unauthorised access was the final event in an incident were you deliberately targeted as an organisation?]

[IF ONE UNAUTHORISED ACCESS FINAL EVENT: Were you deliberately targeted as an organisation when someone gained unauthorised access?]

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY INTENTIONAL UNAUTHORISED ACCESS FINAL EVENTS EXPERIENCED

Q85F.HACKSIVDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF INTENTIONAL UNAUTHORISED ACCESS FINAL EVENTS EXPERIENCED

Q85G.HACKCONT

Thinking of the wider circumstances and causes of when unauthorised access was obtained in the last 12 months, [IF MORE THAN ONE INTENTIONAL UNAUTHORISED ACCESS FINAL EVENT EXPERIENCED: how many were] [IF ONE INTENTIONAL UNAUTHORISED ACCESS FINAL EVENT EXPERIENCED: was it] preceded by … ?

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN HACKSIV OR HACKSIVDK.

IF CODE 1 AT TYPE: Computers becoming infected with ransomware

IF CODE 2 AT TYPE: Computers becoming infected with other malware (e.g. viruses or spyware)

IF CODE 3 AT TYPE: Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

IF CODE 4 AT TYPE: Hacking or attempted hacking of online bank accounts

IF CODE 6 AT TYPE: Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

IF CODE 11 AT TYPE: Takeovers or attempts to take over your website, social media accounts or email accounts

SINGLE CODE

Don’t know

Prefer not to say


ASK IF INTENTIONAL UNAUTHORISED ACCESS FINAL EVENTS EXPERIENCED

Q85H.HACKEXTCOUNT

When unauthorised access was obtained in the last 12 months, [IF MORE THAN ONE INTENTIONAL UNAUTHORISED ACCESS FINAL EVENT EXPERIENCED: how many times] did this involve a fee being demanded to end the unauthorised access?

Please exclude any unauthorised access events that led to any subsequent events, such as a virus or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN HACKSIV OR HACKSIVDK.

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY INTENTIONAL UNAUTHORISED ACCESS FINAL EVENTS INVOLVED A FEE

Q85I.HACKEXTCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SHOW CODES EQUAL TO OR LESS THAN HACKSIV OR HACKSIVDK.

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF INTENTIONAL UNAUTHORISED ACCESS FINAL EVENT EXPERIENCED

Q85J.HACKCOSTA

How much do you think [IF MORE THAN ONE INTENTIONAL UNAUTHORISED ACCESS FINAL EVENT EXPERIENCED: all incidents where your files or networks were accessed or conference calls listened to without authorisation, where your organisation was targeted deliberately and/or maliciously in the last 12 months, have] [IF ONE INTENTIONAL UNAUTHORISED ACCESS FINAL EVENT EXPERIENCED: this incident where your files or networks were accessed or conference calls listened to, where your organisation was targeted deliberately and/or maliciously in the last 12 months, has] cost your organisation financially?

Please include any costs such as legal fees, staff time spent on recovery and investigation, lost revenue, costs of any external contractors to help resolve the issue, insurance excess, or any costs related to damaged files or software, or purchasing any new software.

Please exclude any unauthorised access events that led to any subsequent events, such as a virus or fraud.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No cost incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL COST OF INTENTIONAL UNAUTHORISED ACCESS FINAL EVENTS

Q85K.HACKCOSTB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


READ OUT/SHOW IF EXPERIENCED ONLINE TAKEOVERS

You said you experienced someone trying to take over websites, social media or email accounts on at least one occasion in the last 12 months. We are now going to ask you some questions about the nature of these breaches.

IF EXPERIENCED UNAUTHORISED ACCESS: We are referring explicitly to people outside your organisation accessing and taking over your websites, social media or email accounts. Please consider these as separate events to those where networks or files were accessed, or video or conference calls were listened to without authorisation.


ASK IF EXPERIENCED ONLINE TAKEOVERS

Q86A.TKVRCOUNT

How many times in the last 12 months has someone tried to take over websites, social media or email accounts?

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY ONLINE TAKEOVER EVENTS EXPERIENCED

Q86B.TKVRCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SAY THEY EXPERIENCED NO ONLINE TAKEOVER EVENTS

Q86B2.TKVRNONE

You previously stated you experienced at least one incident where someone tried to take over websites, social media or email accounts in the last 12 months. Are you sure you experienced none?

SINGLE CODE

Yes

No REASK TKVRCOUNT

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE ONLINE TAKEOVER EVENT

Q86C.TKVRSUC

[IF MORE THAN ONE ONLINE TAKEOVER EVENT: And how many times have these attempted takeovers been successful?]

[IF ONE ONLINE TAKEOVER EVENT: And was the attempted takeover successful?]

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN TKVRCOUNT OR TKVRCOUNTDK.

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY ONLINE TAKEOVER EVENTS WERE SUCCESSFUL

Q86D.TKVRSUCDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE SUCCESSFUL ONLINE TAKEOVER EVENT

Q86E.TKVRFIN

We understand that several events may form part of the same incident in which several different types of breach happen.

[IF MORE THAN ONE SUCCESSFUL ONLINE TAKEOVER EVENT: We would now only like you to consider events where someone tried to take over websites, social media or email accounts which were the final event in the incident. How many of the [INSERT RESPONSE FROM TKVRSUC/TKVRSUCDK] times someone tried to take over websites, social media or email accounts in the last 12 months was it the final event in an incident?]

[IF ONE SUCCESSFUL ONLINE TAKEOVER EVENT: Was the time someone tried to take over websites, social media or email accounts in the last 12 months the final event in an incident?]

By this we mean it did not lead to any subsequent events such as a virus or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN TKVRSUC OR TKVRSUCDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY SUCCESSFUL ONLINE TAKEOVER FINAL EVENTS EXPERIENCED

Q86F.TKVRFINDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL ONLINE TAKEOVER FINAL EVENTS EXPERIENCED

Q86G.TKVRCONT

[IF MORE THAN ONE SUCCESSFUL ONLINE TAKEOVER FINAL EVENT: Thinking of the wider circumstances and causes of when someone tried to take over websites, social media or email accounts in the last 12 months, how many were preceded by … ?]

[IF ONE SUCCESSFUL ONLINE TAKEOVER FINAL EVENT: Thinking of the wider circumstances and causes of when someone tried to take over websites, social media or email accounts in the last 12 months, was it preceded by … ?]

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN TKVRFIN OR TKVRFINDK.

IF CODE 1 AT TYPE: Computers becoming infected with ransomware

IF CODE 2 AT TYPE: Computers becoming infected with other malware (e.g. viruses or spyware)

IF CODE 3 AT TYPE: Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

IF CODE 4 AT TYPE: Hacking or attempted hacking of online bank accounts

IF CODE 6 AT TYPE: Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

IF CODES 7-9 AT TYPE: Unauthorised accessing of files or networks by staff or [IF BUSINESS/CHARITY: people outside your organisation/IF EDUCATION: students, or people other than staff or students,] even if accidental

IF CODE 10 AT TYPE: Unauthorised listening into video conferences or instant messaging

SINGLE CODE

Don’t know

Prefer not to say


ASK IF SUCCESSFUL ONLINE TAKEOVER FINAL EVENTS EXPERIENCED

Q86H.TKVREXTCOUNT

When a takeover of websites, social media accounts or email accounts was successful in the last 12 months, [IF MORE THAN ONE SUCCESSFUL ONLINE TAKEOVER FINAL EVENT EXPERIENCED: how many times] did this involve a fee being demanded to end the takeover?

Please exclude any takeover events that led to any subsequent events, such as a virus or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN TKVRFIN OR TKVRFINDK.

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY SUCCESSFUL ONLINE TAKEOVER FINAL EVENTS INVOLVED A FEE

Q86I.TKVREXTCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SHOW CODES EQUAL TO OR LESS THAN TKVRFIN OR TKVRFINDK

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL ONLINE TAKEOVER FINAL EVENT EXPERIENCED

Q86J.TKVRCOSTA

How much do you think [IF MORE THAN ONE SUCCESSFUL ONLINE TAKEOVER FINAL EVENT EXPERIENCED: all incidents where someone tried to take over websites, social media or email accounts, where your organisation was targeted deliberately in the last 12 months, have] [IF ONE SUCCESSFUL ONLINE TAKEOVER FINAL EVENT EXPERIENCED: this incident where someone tried to take over websites, social media or email accounts, where your organisation was targeted deliberately in the last 12 months, has] cost your organisation financially?

Please include any costs such as legal fees, staff time spent on recovery and investigation, lost revenue, costs of any external contractors to help resolve the issue, insurance excess, or any costs related to damaged files or software or purchasing any new software.

Please exclude any takeover events that led to any subsequent events, such as a virus or fraud.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No cost incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL COST OF SUCCESSFUL ONLINE TAKEOVER FINAL EVENTS

Q86K.TKVRCOSTB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


READ OUT/SHOW IF EXPERIENCED DENIAL OF SERVICE

You said you experienced one or more Denial of Service attacks in the last 12 months. By this we mean attacks that tried to slow or take down your website, applications or online services. We are now going to ask you some questions about the nature of these breaches.


ASK IF EXPERIENCED DENIAL OF SERVICE

Q87A.DOSCOUNT

How many times in the last 12 months did your organisation experience a Denial of Service attack?

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY DENIAL OF SERVICE EVENTS EXPERIENCED

Q87B.DOSCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SAY THEY EXPERIENCED NO DENIAL OF SERVICE EVENTS

Q87B2.DOSNONE

You previously stated you experienced at least one denial of service attack in the last 12 months. Are you sure you experienced none?

SINGLE CODE

Yes

No REASK DOSCOUNT

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE DENIAL OF SERVICE EVENT

Q87C.DOSFIN

We understand that several events may form part of the same incident in which several different types of breach happen.

[IF MORE THAN ONE DENIAL OF SERVICE EVENT: We would now only like you to consider incidents where Denial of Service attacks were the final event in the incident. How many of the [INSERT RESPONSE FROM DOSCOUNT/DOSCOUNTDK] Denial of Service attacks in the last 12 months was it the final event in an incident?]

[IF ONE DENIAL OF SERVICE EVENT: Was the Denial of Service attack you experienced in the last 12 months the final event in an incident?]

By this we mean it did not lead to any subsequent events such as a virus, hacking or fraud.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN DOSCOUNT OR DOSCOUNTDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY DENIAL OF SERVICE FINAL EVENTS EXPERIENCED

Q87D.DOSFINDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE DENIAL OF SERVICE FINAL EVENT

Q87E.DOSSOFT

Denial of service attacks can be stopped by internal or third party software before they make an impact. We’d now like you to consider when you’ve experienced a denial of service attack that has not been stopped by such software.

[IF MORE THAN ONE DENIAL OF SERVICE FINAL EVENT: How many of the [INSERT RESPONSE FROM DOSFIN/DOSFINDK] times where the denial of service attack was the final event in an incident was it not stopped by any internal or third party provided software?]

[IF ONE DENIAL OF SERVICE FINAL EVENT: Was the time your organisation experienced a denial of service attack in the last 12 months where it was the final event in an incident not stopped by internal or third party provided software?]

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN DOSFIN OR DOSFINDK.

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY SUCCESSFUL DENIAL OF SERVICE FINAL EVENTS EXPERIENCED

Q87F.DOSSOFTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE SUCCESSFUL DENIAL OF SERVICE FINAL EVENT

Q87G.DOSSIV

Some denial of service events might be accidental, whereas for others you might be deliberately targeted. We’d now like you to only consider denial of service attacks where your organisation has been deliberately targeted.

By being targeted deliberately we mean that you are certain the attack was carried out with malicious intent specifically at your organisation. For instance, where the website service is denied because it is experiencing high traffic, this would not count as a targeted denial of service attack. However, where a group or an individual has deliberately overloaded your systems to cause them to crash, this would count as a targeted denial of service attack.

[IF MORE THAN ONE SUCCESSFUL DENIAL OF SERVICE FINAL EVENT: How many of the [INSERT RESPONSE FROM DOSSOFT/DOSSOFTDK] times where the denial of service attack was the final event in an incident and not stopped by software do you know that you were deliberately targeted as an organisation?]

[IF ONE SUCCESSFUL DENIAL OF SERVICE FINAL EVENT: Do you know if you were deliberately targeted as an organisation during the denial of service attack?]

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN DOSSOFT OR DOSSOFTDK.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

No/none

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENTS EXPERIENCED

Q87H.DOSSIVDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENTS EXPERIENCED

Q87I.DOSCONT

Thinking of the wider circumstances and causes of when [IF MORE THAN ONE SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENT EXPERIENCED: there have been Denial of Service attacks in the last 12 months, how many were] [IF ONE SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENT EXPERIENCED: there was a Denial of Service attack in the last 12 months, was it] preceded by … ?

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN DOSSIV OR DOSSIVDK.

IF CODE 1 AT TYPE: Computers becoming infected with ransomware

IF CODE 2 AT TYPE: Computers becoming infected with other malware (e.g. viruses or spyware)

IF CODE 4 AT TYPE: Hacking or attempted hacking of online bank accounts

IF CODE 6 AT TYPE: Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

IF CODES 7-9 AT TYPE: Unauthorised accessing of files or networks by staff or [IF BUSINESS/CHARITY: people outside your organisation/IF EDUCATION: students, or people other than staff or students,] even if accidental

IF CODE 10 AT TYPE: Unauthorised listening into video conferences or instant messaging

IF CODE 11 AT TYPE: Takeovers or attempts to take over your website, social media accounts or email accounts

SINGLE CODE

Don’t know

Prefer not to say


ASK IF SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENTS EXPERIENCED

Q87J.DOSEXTCOUNT

And [IF MORE THAN ONE SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENT EXPERIENCED: how many times] did this involve a fee being demanded to end the attack?

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. MUST BE EQUAL TO OR LESS THAN DOSSIV OR DOSSIVDK.

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY DOS ATTACKS INVOLVED A FEE BEING DEMANDED (DOSEXTCOUNT CODE DK)

Q87K.DOSEXTCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SHOW CODES EQUAL TO OR LESS THAN DOSSIV OR DOSSIVDK

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENTS EXPERIENCED

Q87L.DOSCOSTA

How much do you think [IF MORE THAN ONE SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENT EXPERIENCED: all incidents where there was a Denial of Service attack where your organisation was targeted deliberately and/or maliciously in the last 12 months have] [IF ONE SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENT EXPERIENCED: this incident where there was a Denial of Service attack where your organisation was targeted deliberately and/or maliciously in the last 12 months has] cost your organisation financially?

Please include any costs such as legal fees, staff time spent on recovery and investigation, lost revenue, costs of any external contractors to help resolve the issue, insurance excess, or any costs related to damaged files or software or purchasing any new software.

Please exclude any attempts where the attack was stopped by internal software, or where the Denial of Service attack led to any subsequent events, such as a virus, hacking incident or fraud.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No cost incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL COST OF SUCCESSFUL AND INTENTIONAL DENIAL OF SERVICE FINAL EVENTS

Q87M.DOSCOSTB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know

READ OUT/SHOW IF ANY BREACHES OR ATTACKS

We’d now like to ask you about your experience of fraud. Just to remind you, anything you say in this interview is confidential and anonymous.


ASK IF ANY BREACHES OR ATTACKS

Q88A.FRAUD

In the last 12 months was your organisation a victim of fraud which was enabled by a cyber security breach?

By fraud we mean when a person or organisation is acting dishonestly, with the intent of making a gain (often financial), at the loss of another person or organisation. This could include impersonation of your organisation, obtaining bank account details to access and/or steal money from your account, using false invoices or bank details to ask for payments.

SINGLE CODE

Yes

No

Don’t know


ASK IF EXPERIENCED FRAUD

Q88B.FRAUDCOUNT

How many times in the last 12 months were you a victim of fraud which was enabled by a cyber security breach?

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY FRAUDS EXPERIENCED

Q88C.FRAUDCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know


ASK IF CAN RECALL AT LEAST ONE FRAUD EXPERIENCED

Q88D.FRAUDCONT

Thinking of the wider circumstances and causes of when you experienced cyber enabled fraud in the last 12 months, [IF MORE THAN ONE FRAUD EXPERIENCED: how many were] [IF ONE FRAUD EXPERIENCED: was it] preceded by … ?

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN FRAUDCOUNT OR FRAUDCOUNTDK.

IF CODE 1 AT TYPE: Computers becoming infected with ransomware

IF CODE 2 AT TYPE: Computers becoming infected with other malware (e.g. viruses or spyware)

IF CODE 3 AT TYPE: Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

IF CODE 4 AT TYPE: Hacking or attempted hacking of online bank accounts

IF CODE 5 AT TYPE: People impersonating your organisation in emails or online

IF CODE 6 AT TYPE: Phishing attacks, i.e. staff receiving fraudulent emails, or arriving at fraudulent websites

IF CODES 7-9 AT TYPE: Unauthorised accessing of files or networks by staff or [IF BUSINESS/CHARITY: people outside your organisation/IF EDUCATION: students, or people other than staff or students,] even if accidental

IF CODE 10 AT TYPE: Unauthorised listening into video conferences or instant messaging

IF CODE 11 AT TYPE: Takeovers or attempts to take over your website, social media accounts or email accounts

SINGLE CODE

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE FRAUD EXPERIENCED

Q88E.FRAUDCOSTA

How much do you think [IF MORE THAN ONE FRAUD EXPERIENCED: all incidents where you experienced fraud enabled by a cyber security breach in the last 12 months have] [IF ONE FRAUD EXPERIENCED: this incident where you experienced fraud enabled by a cyber security breach in the last 12 months has] cost your organisation financially?

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£30,000,000

IF SMALL (SIZEA CODE<50 OR SIZEB CODES 1–2): SOFT CHECK IF >£9,999

IF MEDIUM (SIZEA 49<CODE<250 OR SIZEB CODE 3): SOFT CHECK IF <£100 OR >£99,999

IF LARGE (SIZEA 249<CODE OR [SIZEB CODES 4–5 OR DK]): SOFT CHECK IF <£1,000 OR >£99,999

SINGLE CODE

No cost incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW TOTAL COST OF FRAUD

Q88F.FRAUDCOSTB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


READ OUT/SHOW IF EXPERIENCED PHISHING

You said you experienced at least one phishing attack in the last 12 months. We’d now like to ask you some questions about your experience of phishing.


ASK IF EXPERIENCED PHISHING

Q89A.PHISHCOUNT

How many times in the last 12 months did you experience a phishing attack?

WRITE IN RANGE 1-100,000

SOFT CHECK IF >500

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY PHISHING EVENTS EXPERIENCED

Q89B.PHISHCOUNTDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know

SINGLE CODE

Don’t know

Prefer not to say


ASK IF SAY THEY EXPERIENCED NO PHISHING EVENTS

Q89B2.PHISHNONE

You previously stated you experienced at least one phishing attack in the last 12 months. Are you sure you experienced none?

SINGLE CODE

Yes

No REASK PHISHCOUNT

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE PHISHING EVENT

Q89C.PHISHENG

[IF MORE THAN ONE PHISHING EVENT: For how many of the [INSERT RESPONSE FROM PHISHCOUNT/PHISHCOUNTDK] phishing attacks you experienced did someone, such as an employee, within your organisation respond in some way?]

[IF ONE PHISHING EVENT: When your organisation experienced a phishing attack in the last 12 months did someone, such as an employee, within your organisation respond in some way?]

By responding we mean if someone clicked a link, opened an attachment, downloaded a file, or provided information in response. Please exclude any attacks that led to any subsequent events, such as a virus, hacking incident or fraud. Please also exclude any where nobody clicked or engaged with the link in the phishing email. If multiple computers were targeted through a single attack this counts as one incident.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY PHISHING EVENTS WERE ENGAGED WITH

Q89D.PHISHENGDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know

SINGLE CODE

Don’t know

Prefer not to say


ASK IF CAN RECALL AT LEAST ONE PHISHING EVENT, AND THIS NUMBER IS MORE THAN THE NUMBER OF PHISHING EVENTS ENGAGED WITH

Q89E.PHISHCON

[IF MORE THAN ONE PHISHING EVENT ENGAGED WITH: Excluding the number of times someone in your organisation responded to a phishing attack, how many times in the last 12 months did you experience a phishing attack that contained personal details of the recipient?]

[IF ONE PHISHING EVENT ENGAGED WITH: When your organisation experienced a phishing attack in the last 12 months did the communication contain any personal details of the recipient?]

By this we mean it contains their full name or any identifiable information such as an address or date of birth.

WRITE IN RANGE 1-1,000

SOFT CHECK IF >50

CODE IN NUMBER. HARD CHECK. SINGLE CODES MUST BE EQUAL TO OR LESS THAN PHISHCOUNT OR PHISHCOUNTDK MINUS COUNT FROM PHISHENG OR PHISHENGDK.

SINGLE CODE

None

Don’t know

Prefer not to say


ASK IF DON’T KNOW HOW MANY NON-ENGAGED PHISHING EVENTS HAD PERSONAL CONTACT DETAILS

Q89F.PHISHCONDK

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Only once

Two or three times

Four or five times

Six to ten times

Eleven to twenty times

Twenty-one to fifty times

Fifty-one to one hundred times

More than one hundred times

DO NOT READ OUT: Don’t know

SINGLE CODE

Don’t know

Prefer not to say


READ OUT/SHOW IF MORE THAN ONE TYPE OF BREACH OR ATTACK EXPERIENCED

Now [IF CATI I/IF WEB we] would like you to think about the one cyber security breach, or related series of breaches or attacks, that caused the most disruption to your organisation in the last 12 months.


ASK IF BUSINESS OR CHARITY AND MORE THAN ONE TYPE OF BREACH OR ATTACK EXPERIENCED

Q64A.DISRUPTA

What kind of breach was this?

PROMPT TO CODE IF NECESSARY

INTERVIEWER NOTE: IF MORE THAN ONE CODE APPLIES, ASK RESPONDENT WHICH ONE OF THESE THEY THINK STARTED OFF THE BREACH OR ATTACK

SINGLE CODE

CODES MENTIONED AT TYPE

Computers becoming infected with ransomware

Computers becoming infected with other malware (e.g. viruses or spyware)

Denial of service attacks, i.e. attacks that try to slow or take down your website, applications or online services

Hacking or attempted hacking of online bank accounts

People impersonating your organisation in emails or online

Phishing attacks, i.e. staff receiving fraudulent emails or arriving at fraudulent websites

Unauthorised accessing of files or networks by staff, even if accidental

Unauthorised accessing of files or networks by students

Unauthorised accessing of files or networks by people [IF BUSINESS/CHARITY: outside your organisation/IF EDUCATION: other than staff or students]

Unauthorised listening into video conferences or instant messaging

Takeovers or attempts to take over your website, social media accounts or email accounts

Any other types of cyber security breaches or attacks

DO NOT READ OUT: Don’t know


READ OUT/SHOW IF BUSINESS/CHARITY AND EXPERIENCED ONE TYPE OF BREACH OR ATTACK MORE THAN ONCE

You mentioned you had experienced [INSERT RESPONSE FROM TYPE] on more than one occasion. Now I would like you to think about the one instance of this that caused the most disruption to your organisation in the last 12 months.


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q71.RESTORE

How long, if any time at all, did it take to restore business operations back to normal after the breach or attack was identified? Was it …

PROBE FULLY

SINGLE CODE

No time at all

Less than a day

Between a day and under a week

Between a week and under a month

One month or more

DO NOT READ OUT: Still not back to normal

DO NOT READ OUT: Don’t know


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q76.REPORTA

Was this breach or attack reported to anyone outside your organisation, or not?

SINGLE CODE

Yes

No

Don’t know


ASK IF NOT REPORTED

Q77A.NOREPORT

What were the reasons for not reporting this breach or attack?

DO NOT READ OUT

PROBE FULLY (“ANYTHING ELSE?”)

WRITE IN TEXT BOX IF WEB

MULTICODE IF CATI

Breach/impact not significant enough

Breach was not criminal

Don’t know who to report to

No benefit to our business

Not obliged/required to report breaches

Reporting won’t make a difference

Too soon/haven’t had enough time

Worried about reputational damage

Other WRITE IN

SINGLE CODE

Don’t know


ASK IF REPORTED

Q77.REPORTB

Who was this breach or attack reported to?

DO NOT READ OUT

PROBE FULLY (“ANYONE ELSE?”)

WRITE IN TEXT BOX IF WEB

MULTICODE IF CATI

Action Fraud

Antivirus company

Bank, building society or credit card company

Centre for the Protection of National Infrastructure (CPNI)

CERT UK (the national computer emergency response team)

Cifas (the UK fraud prevention service)

Charity Commission

Clients/customers

Cyber Security Information Sharing Partnership (CISP)

Information Commissioner’s Office (ICO)

Internet/Network Service Provider

National Cyber Security Centre (NCSC)

National Crime Agency (NCA)

Outsourced cyber security provider

Police

Professional/trade/industry association

Regulator (e.g. Financial Conduct Authority)

Suppliers

Was publicly declared

Website administrator

Other government agency

Other WRITE IN

SINGLE CODE

Don’t know


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q78.PREVENT

What, if anything, have you done since this breach or attack to prevent or protect your organisation from further breaches like this?

DO NOT READ OUT

PROBE FULLY (“ANYTHING ELSE?”)

WRITE IN TEXT BOX IF WEB

MULTICODE IF CATI

Governance changes

Increased spending

Changed nature of the business/activities

New/updated business continuity plans

New/updated cyber policies

New checks for suppliers/contractors

New procurement processes, e.g. for devices/IT

New risk assessments

Increased senior management oversight/involvement

Purchased cyber insurance

Technical changes

Changed/updated firewall/system configurations

Changed user admin/access rights

Increased monitoring

New/updated antivirus/anti-malware software

Other new software/tools (not antivirus/anti-malware)

Penetration testing

People/training changes

Outsourced cyber security/hired external provider

Recruited new staff

Staff training/communications

Vetting staff/extra vetting

Other WRITE IN

SINGLE CODE

Nothing done

Don’t know


READ OUT/SHOW IF ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

We are now going to ask you about the approximate costs of this particular breach or attack.


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q78K.DAMAGEDIRS

What was the approximate value of any external payments made when the incident was being dealt with? This includes:

  • any payments to external IT consultants or contractors to investigate or fix the problem

  • any payments to the attackers, or money they stole.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£999,999

SOFT CHECK IF >£9,999

SINGLE CODE

No cost of this kind incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW DIRECT RESULT COST OF THIS CYBER SECURITY BREACH OR ATTACK

Q78L.DAMAGEDIRSB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q78M.DAMAGEDIRL

What was the approximate value of any external payments made in the aftermath of the incident? This includes:

  • any payments to external IT consultants or contractors to run audits, risk assessments or training

  • the cost of new or upgraded software or systems

  • recruitment costs if you had to hire someone new

  • any legal fees, insurance excess, fines, compensation or PR costs related to the incident.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£999,999

SOFT CHECK IF >£9,999

SINGLE CODE

No cost of this kind incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW DIRECT RESULT COST OF THIS CYBER SECURITY BREACH OR ATTACK

Q78N.DAMAGEDIRLB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q78O.DAMAGESTAFF

What was the approximate cost of the staff time spent dealing with the incident? This is how much staff would have been paid for the time they spent investigating or fixing the problem. Please include this cost even if this was part of this staff member’s job.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£999,999

SOFT CHECK IF >£9,999

SINGLE CODE

No cost of this kind incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW DIRECT RESULT COST OF THIS CYBER SECURITY BREACH OR ATTACK

Q78P.DAMAGESTAFFB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


ASK IF BUSINESS/CHARITY AND ONLY ONE TYPE OF BREACH OR ATTACK EXPERIENCED OR IF CAN CONSIDER A PARTICULAR BREACH OR ATTACK

Q78Q.DAMAGEIND

What was the approximate value of any damage or disruption during the incident? This includes:

  • the cost of any time when staff could not do their jobs

  • the value of lost files or intellectual property

  • the cost of any devices or equipment that needed replacing.

PROBE FOR BEST ESTIMATE BEFORE CODING DK

REASSURE ABOUT CONFIDENTIALITY AND ANONYMISATION BEFORE CODING REF

WRITE IN RANGE £1–£999,999

SOFT CHECK IF >£9,999

SINGLE CODE

No cost of this kind incurred

Don’t know

Prefer not to say


ASK IF DON’T KNOW DIRECT RESULT COST OF THIS CYBER SECURITY BREACH OR ATTACK

Q78R.DAMAGEINDB

Was it approximately … ?

PROMPT TO CODE

SINGLE CODE

Less than £100

£100 to less than £500

£500 to less than £1,000

£1,000 to less than £5,000

£5,000 to less than £10,000

£10,000 to less than £20,000

£20,000 to less than £50,000

£50,000 to less than £100,000

£100,000 to less than £500,000

£500,000 to less than £1 million

£1 million to less than £5 million

£5 million or more

DO NOT READ OUT: Don’t know


ASK ALL

Q63A.INCIDCONTENT

Which of the following, if any, do you have in place for when you experience a cyber security incident? By this, we mean any breach or attack that requires a response from your organisation. IF WEB: Please select all that apply.

READ OUT

MULTICODE

ROTATE LIST

Written guidance on who to notify

Roles or responsibilities assigned to specific individuals during or after an incident

External communications and public engagement plans

A formal incident response plan

Guidance around when to report incidents externally, e.g. to regulators or insurers

Used an NCSC approved incident response company

SINGLE CODE

DO NOT READ OUT: Don’t know

DO NOT READ OUT: None of these


ASK ALL

Q63B.INCIDACTION

IF ANY BREACHES OR ATTACKS: Which of the following, if any, do you do when you experience a cyber security incident?

IF NO BREACHES OR ATTACKS: Which of the following, if any, do you plan to do if you experience a cyber security incident?

READ OUT

ASK AS A GRID

RANDOMISE LIST

  1. Keep an internal record of incidents
  2. Attempt to identify the source of the incident
  3. Make an assessment of the scale and impact of the incident
  4. Formal debriefs or discussions to log any lessons learnt
  5. Inform your [IF BUSINESS: directors/IF CHARITY: trustees/IF EDUCATION: governors] or senior management of the incident
  6. Inform a regulator of the incident when required
  7. ASK IF HAVE CYBER INSURANCE: Inform your cyber insurance provider of the incident

SINGLE CODE

Yes

No

DO NOT READ OUT: Don’t know

DO NOT READ OUT: Depends on the severity/nature of the incident


ASK HALF A IF BUSINESS/CHARITY, OR ALL IF EDUCATION

Q63C.RANSOM

In the case of ransomware attacks, does your organisation make it a rule or policy not to make ransomware payments?

SINGLE CODE

Yes

No

Don’t know

Appendix B: Topic guide

Perception of cyber security risk (ASK ALL)

SKIP IF THEY ARE PRESSED FOR TIME

Briefly, what would you say are the top 2-3 cyber security priorities for your organisation right now?

Impact of recent events

How has awareness of cyber resilience and cyber risks (e.g. cyber espionage, IP theft, data breaches, hacks and leaks, impact on reputation) changed over the last 12 months?

  • What prompted these changes, if any?

  • What impact have these changes had, if any?

  • Have recent geopolitical events (e.g. Russia/Ukraine, China) impacted your awareness of or attitudes towards cyber resilience and cyber risks?

  • Are you doing anything differently as a result of these geopolitical events?

  • Have they impacted on your ability to spot cyber security breaches?

How do you see cyber resilience and cyber risks changing over the next 12 months?

Cyber security leadership and governance – part A (ASK ALL)

Developing and implementing cyber security strategy

Can you briefly summarise your organisation’s overarching approach to cyber security? Why do you take this approach?

  • How do you formalise this approach, if at all?

  • How does the board monitor and update your approach to cyber security?

  • IF HAVE CYBER SECURITY STRATEGY AT Q33D: What is contained in your cyber security strategy?

  • Who has responsibility for it (PROBE ON BOARD)? How is it updated?

How does your organisation budget for cyber security? Who decides the budget? How frequently is it reviewed? PROBE:

  • How proactive/reactive are spending decisions? How much is in response to incidents? How much of it is planned?

  • Is it cyber security-specific, or part of a wider team budget (e.g. IT)? How well does cyber security get prioritised within this wider budget?

  • What do you need to do to justify any spending (e.g. a business case)? How easy is it to produce this? Do you have a way to demonstrate return on investment, i.e. investing £xx in cyber security saved losses of £xx?

Are any areas of cyber security held back by budget constraints? If you had extra money right now for cyber security, where would it go?

How has spending changed over the last few years? Has it trended upwards/downwards? What has driven this?

Have current economic conditions (e.g. rising inflation, cost of living crisis) impacted on how cyber security is viewed within the business? In what ways?

  • How has it impacted cyber security? What impact has it had on your budgets? How has it affected how you are able to implement robust cyber security measures?

How do you expect the predicted UK recession and global economic uncertainty will impact views of cyber security in your organisation going forward?

  • How do you see spending changing over the next 12 months?

  • Do you expect to see a reduction/increase in the company’s revenue/profits and how do you expect this to impact cyber security spending?

Roles and responsibilities

PRIORITY: The next few questions are about roles and responsibilities of board members

How closely are board members (directors, trustees etc.) and your executive team (CEOs etc.) involved in cyber security decisions? PROBE INVOLVEMENT IN:

  • Deciding what your cyber security priorities/critical assets are

  • Spending decisions (including staffing and outsourcing)

  • Incident response

How frequently is cyber security discussed at the Board or at senior management meetings (e.g. ad hoc, standing agenda item)?

Do you receive any support or advice from external experts around cyber security? If so, from who (e.g. outsourced providers, auditors)? Why do you need to use these?

How would you rate board members’ understanding of cyber security issues and reports?

IF TIME: The next few questions are about roles and responsibilities of the wider team

Who is responsible in your organisation for managing and protecting key digital assets and data?

Who is responsible for cyber security and how do they report to the board?

What is the broad structure of the cyber security team? What are the specific roles and responsibilities?

What interaction is there, if any, between cyber security and wider digital asset management?

Embedding cyber security in risk management

How well are your organisation’s cyber security needs understood by …

  • Members of the Board (IF RELEVANT)

  • Members of the executive team

How well do they understand your approach? PROBE:

  • What bits do they understand well/less well? What further support would you want to see from them on cyber security?

  • How do cyber security risks fit into wider risk management?

  • Does your organisation make a distinction between fraud, cyber crime and cyber attacks perpetrated by states? How does this affect the response?

  • How frequently do you assess the threat environment, technology developments and your capabilities?

  • What do you think other organisations like yours could learn from the way your board or executive team approaches cyber security?

Promoting cyber resilience culture

Across your wider staff, how do you go about embedding good behaviour and practice when it comes to cyber security? What is most effective? PROBE RELATIVE IMPORTANCE OF:

  • Building relationships between staff and cyber/IT teams

  • Awareness raising and training (any good examples)?

  • Exercises and feedback

[SEE SURVEY RESPONSE AT Q33B] What training, if any, have board members and wider staff had in cyber security? What training was for board members and what training was for wider staff?

  • How was this delivered (e.g. face-to-face training, online modules, testing, phishing exercises, any innovative methods)? How frequently?

  • How effective was it? How do you measure effectiveness?

  • Have you made any changes to practice or policy as a result of cyber security training?

Incident planning

How prepared would you say your wider staff are for a major cyber attack or cyber incident, if it took place tomorrow?

You mentioned in the survey that you have plans in place if you experience a cyber security incident [SEE SURVEY RESPONSE AT Q63A/B].

  • What types of cyber security incidents does this cover?

  • How do you go about preparing against cyber security incidents?

  • Who is responsible for signing these off? What guidance or advice do you receive on this, and from whom?

How often, if at all, do you run simulation exercises, penetration testing or scenario testing?

  • Do all staff do this, or only certain groups of staff? Why is this?

What’s the biggest challenge in this planning for incidents? What do you find hard? PROBE:

  • Willingness/pushback from staff

  • Skills of staff/senior management

  • Time/capacity of staff

  • Hybrid working

  • Budgets/training budgets

Incident response (ON ROTATION, LISTED IN THE SAMPLE PROFILE)

[SEE SURVEY RESPONSES TO Q63A / Q63B FOR CONTEXT AND REPHRASE QUESTIONS ACCORDINGLY]

We’d now like to move away from incident planning and ask you about incident response in the aftermath of a breach.

What is the first thing you would do when you notice any suspicious activity?

How do you assess the severity / potential impact of an incident or breach?

Would you be able to briefly walk us through the journey of how you typically respond to an impactful breach?

  • In the survey you said you [SEE SURVEY RESPONSES TO Q63A / Q63B] in response to a breach. Is this consistent across all breaches? What drives you to take action? What other actions do you take?

  • How has your actual response to a severe breach differed to what was planned? How did this impact overall response? IF TALKING ABOUT RANSOMWARE: Did it involve making payments?

  • What do you do once the breach has been contained? Why is this?

  • In what stage and type of incident would you involve the Board or senior management?

Under what circumstances would you not report a cyber security incident or breach? Why not?

Do you have a policy or rules around post-incident reviews with the board and management?

How do you think your approach to incident response can be improved? What kind of support would you need in helping to address this?

What types of cyber security incident would lead to you alerting any of the following bodies or groups:

  • A regulator

  • Your bank or insurance company

  • The police or a related body like Action Fraud – what kinds of breaches do you think they are interested in hearing about? Had you heard of Action Fraud before (https://www.actionfraud.police.uk/)?

  • The National Cyber Security Centre – what kinds of breaches do you think they are interested in hearing about?

  • Your customers, investors or suppliers

Do you think other organisations in your sector take the same approach?

Have you previously reported breaches to any of the bodies or groups mentioned above?

  • IF YES: Please talk me through the breach and the decision behind reporting it.

  • What were the advantages and disadvantages of reporting?

  • How did you decide who to report the breach to? Did you receive any third party advice on actions to take?

  • IF NOT REPORTED A BREACH: Please talk me through what informed this decision (e.g. cost / benefit, third party influence).

Have you ever had a serious breach you didn’t report to anyone externally? IF YES: Please talk me through the breach and why you didn’t report it.

Corporate annual reporting on digital strategy and risks (ON ROTATION, LISTED IN THE SAMPLE PROFILE)

In the survey you said that your organisation included a section on your digital strategy and risks in your last annual report.

What is the main purpose of this section? Who is the intended audience?

Could you outline the content of this section? What information was included? PROBE:

  • Digital or cyber strategy, risk assessments, governance arrangements, technical settings, training, supply chains, incidents

  • Would you typically exclude any of these areas? What’s the rationale behind that?

How is this section compiled and edited? PROBE:

  • Who writes it? Who decides on content and focus? Who approves it?

  • How much involvement does the board/executive team have over this section of the report? How much is left to the cyber/IT team alone?

  • Was the cyber security content compiled and edited in the same way as the rest of the document or differently? How/why?

How does this reporting compare to your competitors/others in your sector? Would you consider your organisation behind or ahead of others in terms of what you publish?

Digital Service Providers (ON ROTATION, LISTED IN THE SAMPLE PROFILE)

This last section is about Digital Service Providers, or DSPs, that manage a suite of IT services like your network, cloud computing and applications. In the survey, you said your organisation used one or more DSPs. This may include Managed Service Providers (MSPs).

What do(es) your DSP(s) provide? Is it a software package or a service? How essential are they to your continuity of production/service?

What were the factors involved in choosing your DSP(s)?

  • Was cyber security one of the considerations? IF YES: How much of a priority would you say this was compared to other factors (e.g. price, reliability, word of mouth)?

How much of a risk do you think your DSP(s) poses to your organisation’s cyber security?

  • Have you discussed this with them? How willing are they to discuss it/share information on their cyber security?

  • Does your contract with your DSP say anything about cyber security? What’s covered?

Who is responsible for cyber security between them and you, when it comes to their service? E.g. for incident response?

Cyber security practices

Standards and accreditation decisions (IF HAVE ACCREDITATIONS)

Tell us about the external cyber security standards and accreditations your organisation has adopted.

  • What made you decide to apply for this? PROBE: internal pressure (e.g. board members), external pressure/requirements from clients, investors, insurance providers, for branding/marketing, etc.

  • What made you choose this standard over others? PROBE: ISO 27001, Cyber Essentials, Cyber Essentials Plus, NIST

  • What involvement did your board/executive team have in this? How well do they understand this standard and what it means?

  • How has this standard improved your cyber security? What changes did you have to make to meet this standard, if any?

Have your cyber security standards and accreditations helped you win contracts or new business?

Have you heard of the NCSC’s Cyber Assessment Framework? Have you used this in your organisation? What has your experience been?

Supply chain decisions (IF HAVE SUPPLY CHAIN MANAGEMENT)

How do you go about managing cyber security risks from your wider supply chains? How systematic/formalised is this process?

  • Do you use software for this?

Who is responsible? How engaged are your board/executive team in this?

How would you rate your awareness/monitoring of the risks? PROBE: Do you know which suppliers have access to your IT systems? Which ones are essential to your continuity of production/service?

How often do you talk to your suppliers about cyber security? How do you ensure they are aware of their responsibilities?

How would you react to suppliers if they had cyber security incidents affecting you? Would you expect to provide any support?

Do you supply to other businesses or organisations?

  • IF PARTICIPANT IS A SUPPLIER: Do you know your cyber security contractual obligations as a supplier?

  • What do you have to report to your client businesses? How do you monitor and evidence this?

  • If you experienced a cyber security incident, how would this impact the businesses / organisations that you supply? Would you expect to receive any support?

Has anything changed in terms of how you look at cyber security risks from your supply chains in the last 2 years? Has it got any more/less important?

Is there more you would want to do about assessing or monitoring supply chains? What are the barriers to doing this?

Summary and wrap-up

Thinking about all the challenges we talked about, are there any areas that you think your organisation could improve on, or could focus on more?

What’s the most important thing you think we have talked about?

Appendix C: Further information

  1. The Department for Science, Innovation and Technology and the Home Office would like to thank the following people for their work in developing and carrying out the survey, and in compiling the Statistical Release and annexes.
  • Harry Williams, Ipsos

  • Finlay Procter, Ipsos

  • Jamie Douglas, Ipsos

  • Shahil Parmar, Ipsos

  • Nick Coleman, Ipsos

  • Jayesh Navin Shah, Ipsos

  • Professor Steven Furnell, University of Nottingham.

  2. The Cyber Security Breaches Survey was first published in 2016 as a research report, and became an Official Statistic in 2017. The previous reports can be found at https://www.gov.uk/government/collections/cyber-security-breaches-survey. This includes the full report and the technical and methodological information for each year.

  3. The responsible DSIT analyst for this release is Emma Johns. The responsible statistician is Maddy Ell. For enquiries on this release, from an official statistics perspective, please contact DSIT at [email protected]. As of publication, the DCMS mailbox remains the correct address, but this may change to a DSIT-specific email address next year.

  4. For general enquiries contact: 020 7215 1000

  5. The Cyber Security Breaches Survey is an official statistics publication and has been produced to the standards set out in the Code of Practice for Official Statistics. For more information, see https://www.statisticsauthority.gov.uk/code-of-practice/. Details of the pre-release access arrangements for this dataset have been published alongside this release.

  6. This work was carried out in accordance with the requirements of the international quality standard for Market Research, ISO 20252.

  1. See https://www.gov.uk/government/publications/information-security-breaches-survey-2015 for the final survey in this series. This was preceded by earlier surveys in 2014, 2013 and 2012. We reiterate that these surveys are not representative of all UK businesses and are not comparable to the Cyber Security Breaches Survey series. 

  2. This was administered either as a high street voucher or as a charity donation, as the participant preferred. 

  3. These are organisations that work for a social purpose, but are not registered as charities, so not regulated by the UK’s charity regulators. 

  4. SIC sectors here and in subsequent tables in this report have been combined into the sector groupings used in the main report. 

  5. This excludes the two weeks around the Christmas and New Year bank holidays, during which there was minimal fieldwork conducted. 

  6. See, for example, Groves and Peytcheva (2008) “The Impact of Nonresponse Rates on Nonresponse Bias: A Meta-Analysis”, Public Opinion Quarterly (available at: https://academic.oup.com/poq/article-abstract/72/2/167/1920564) and Sturgis, Williams, Brunton-Smith and Moore (2016) “Fieldwork Effort, Response Rate, and the Distribution of Survey Outcomes: A Multilevel Meta-analysis”, Public Opinion Quarterly (available at: https://academic.oup.com/poq/issue/81/2). 

  7. The default SPSS setting is to round cell counts and then calculate percentages based on integers.