Organizations Fail to Test Cybersecurity Incident Response Plans

The vast majority of organizations remain unprepared to respond properly to cybersecurity incidents, with 77% of respondents indicating they do not have a cybersecurity incident response plan applied consistently across the enterprise, according to The 2019 Cyber Resilient Organization, a study conducted by the Ponemon Institute on behalf of IBM.

While studies show that companies that respond quickly and efficiently enough to contain a cyberattack within 30 days save, on average, more than $1 million on the total cost of a data breach, shortfalls in proper incident response planning have remained consistent over the study's four years.

Of the organizations surveyed that do have a plan in place, more than half (54%) do not test their plans regularly, which can leave them less prepared to effectively manage the complex processes and coordination that must take place in the wake of an attack.

The difficulty cybersecurity teams face in implementing a cybersecurity incident response plan has also affected businesses' compliance with the General Data Protection Regulation (GDPR). Nearly half of respondents (46%) say their organization has yet to achieve full compliance with GDPR, even as the regulation's first anniversary quickly approaches.

"Failing to plan is a plan to fail when it comes to responding to a cybersecurity incident. These plans need to be stress tested regularly and need full support from the board to invest in the necessary people, processes and technologies to sustain such a program," said Ted Julian, VP of Product Management and Co-Founder, IBM Resilient. "When proper planning is paired with investments in automation, we see companies able to save millions of dollars during a breach."

Other takeaways from the study include:

Automation Still Emerging

For the first time, this year's study measured the impact of automation on cyber resilience. In the context of this research, automation refers to enabling security technologies that augment or replace human intervention in the identification and containment of cyber exploits or breaches. These technologies depend upon artificial intelligence, machine learning, analytics and orchestration.

When asked whether their organization leveraged automation, only 23% of respondents said they were significant users, while 77% reported that their organizations use automation only moderately, insignificantly or not at all. Organizations that use automation extensively rate their ability to prevent (69% vs. 53%), detect (76% vs. 53%), respond to (68% vs. 53%) and contain (74% vs. 49%) a cyberattack higher than the overall sample of respondents.

According to the 2018 Cost of a Data Breach Study, this limited uptake of automation is a missed opportunity to strengthen cyber resilience: organizations that fully deployed security automation saved $1.5 million on the total cost of a data breach, while organizations that did not leverage automation realized a much higher total cost.

Skills Gap Still Impacting Cyber Resilience

The cybersecurity skills gap appears to be further undermining cyber resilience, as organizations reported that a lack of staffing hindered their ability to properly manage resources and needs. Survey participants said they lack the headcount to properly maintain and test their incident response plans and are facing 10 to 20 open positions on their cybersecurity teams. In fact, only 30% of respondents reported that cybersecurity staffing is sufficient to achieve a high level of cyber resilience, and 75% rate the difficulty of hiring and retaining skilled cybersecurity personnel as moderately high to high.

Adding to the skills challenge, nearly half of respondents (48%) said their organization deploys too many separate security tools, ultimately increasing operational complexity and reducing visibility into overall security posture.

Privacy Growing as a Priority

Organizations are finally acknowledging that collaboration between privacy and cybersecurity teams can improve cyber resilience, with 62% indicating that aligning these teams is essential to achieving resilience. Most respondents believe the privacy role is becoming increasingly important, especially with the emergence of new regulations like GDPR and the California Consumer Privacy Act, and are prioritizing data protection when making IT buying decisions.

When asked what the top factor was in justifying cybersecurity spend, 56% of respondents said information loss or theft. This rings especially true as consumers are demanding businesses do more to actively protect their data. According to a recent survey by IBM, 78% of respondents say a company's ability to keep their data private is extremely important, and only 20% completely trust organizations they interact with to maintain the privacy of their data.

In addition, most respondents reported employing a dedicated privacy leader, with 73% stating they have a Chief Privacy Officer, further evidence that data privacy has become a top priority for organizations.

Methodology: Conducted by the Ponemon Institute and sponsored by IBM Resilient, "The 2019 Cyber Resilient Organization" is the fourth annual benchmark study on cyber resilience – an organization's ability to maintain its core purpose and integrity in the face of cyberattacks. The global survey features insight from more than 3,600 security and IT professionals around the world, including the United States, Canada, the United Kingdom, France, Germany, Brazil, Australia, the Middle East and Asia Pacific.
