Cost of Poor Software Quality in US Exceeds $2 Trillion
February 09, 2021

The cost of poor software quality (CPSQ) in the US in 2020 was approximately $2.08 trillion, according to The Cost of Poor Software Quality In the US: A 2020 Report from the Consortium for Information & Software Quality (CISQ), co-sponsored by Synopsys.

This includes poor software quality resulting from software failures, unsuccessful development projects, legacy system problems, technical debt and cybercrime enabled by exploitable weaknesses and vulnerabilities in software.

"As organizations undertake major digital transformations, software-based innovation and development rapidly expands," said report author, Herb Krasner. "The result is a balancing act, trying to deliver value at high speed without sacrificing quality. However, software quality typically lags behind other objectives in most organizations. That lack of primary attention to quality comes at a steep cost."

Key findings from the report include:

Operational software failure

Operational software failure is the leading driver of total CPSQ, estimated at $1.56 trillion. Fixing defects after software is released into operation is roughly 10X costlier than finding and fixing them beforehand.

This figure represents a 22% increase since 2018, and it may well be an underestimate given the meteoric rise in cybersecurity failures and the fact that many failures go unreported.

Cybercrimes enabled by exploitable weaknesses and vulnerabilities in software are by far the largest growth area of the last two years. The underlying cause is primarily unmitigated software flaws.

The report's first recommendation is to prevent defects from occurring as early as possible, when they are relatively cheap to fix. Its second is to isolate, mitigate, and correct failures as quickly as possible to limit the damage.
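To make the second recommendation concrete, the sketch below shows one common failure-isolation pattern, a circuit breaker, which stops calling a failing dependency so a single fault cannot cascade through a system. This is a minimal illustration, not something prescribed by the report, and the `fetch_inventory` call mentioned in the closing comment is hypothetical.

```python
import time

class CircuitBreaker:
    """Fail fast when a dependency keeps failing, limiting cascade damage.

    A minimal single-threaded sketch; real implementations add locking,
    metrics, and per-dependency tuning.
    """

    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures  # consecutive failures before opening
        self.reset_after = reset_after    # seconds to wait before retrying
        self.failures = 0
        self.opened_at = None             # time the breaker last tripped

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                # Open: fail fast instead of hammering the broken service.
                raise RuntimeError("circuit open: dependency unavailable")
            self.opened_at = None         # half-open: allow one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip (or re-trip) the breaker
            raise
        self.failures = 0                 # any success closes the breaker
        return result

# Hypothetical usage: breaker.call(fetch_inventory, sku) raises immediately
# while the breaker is open, so callers degrade gracefully instead of hanging.
```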

Unsuccessful development projects

Unsuccessful development projects, the next largest growth area of CPSQ, account for an estimated $260 billion.

This figure has risen 46% since 2018, while the project failure rate has held steady at roughly 19% for over a decade.

The underlying causes are varied, but one consistent theme has been the lack of attention to quality.

The report states: "It is amazing how many IT projects just assume that 'quality happens.' The best way to focus a project on quality is to properly define what quality means for that specific project and then focus on achieving measurable results against stated quality objectives."
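As one way to read "measurable results against stated quality objectives," the sketch below checks a few project metrics against declared thresholds and exits non-zero on any miss, so a build pipeline could block a release. The metric names and numbers are invented for the example, not taken from the report.

```python
# Hypothetical per-project quality objectives and their measured values.
OBJECTIVES = {
    "test_coverage_pct":     {"measured": 81.0, "minimum": 80.0},
    "defects_per_kloc":      {"measured": 0.6,  "maximum": 1.0},
    "open_critical_defects": {"measured": 2,    "maximum": 0},
}

def evaluate(objectives):
    """Return a list of human-readable misses against stated objectives."""
    misses = []
    for name, o in objectives.items():
        if "minimum" in o and o["measured"] < o["minimum"]:
            misses.append(f"{name}: {o['measured']} < minimum {o['minimum']}")
        if "maximum" in o and o["measured"] > o["maximum"]:
            misses.append(f"{name}: {o['measured']} > maximum {o['maximum']}")
    return misses

if __name__ == "__main__":
    missed = evaluate(OBJECTIVES)
    for line in missed:
        print("FAILED", line)
    # A non-zero exit code lets a CI pipeline refuse to ship the build.
    raise SystemExit(1 if missed else 0)
```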

Research suggests that success rates go up dramatically with Agile and DevOps methodologies, which minimize decision latency.

Legacy software

The operation and maintenance of legacy software contributed $520 billion to the CPSQ.

While this is down from $635 billion in 2018, it still represents nearly a third of total US IT expenditure in 2020.

The report explains: "CPSQ in legacy systems is harder to address because such systems automate core business functions and modernization is not always straightforward. After decades of operation, they may have become less efficient, less secure, unstable, incompatible with newer technologies and systems, and more difficult to support due to loss of knowledge and/or increased complexity or loss of vendor support. In many cases, they represent a single point of failure risk to the business."

The report recommends strategies for improving quality that focus on overcoming the lack of understanding of how a legacy system works internally. Any tool that helps identify weaknesses, vulnerabilities, failure symptoms, defects, and improvement targets will be useful.
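As one example of such a tool, the sketch below walks a source file's syntax tree and flags functions whose branch count, a crude proxy for cyclomatic complexity, suggests a maintenance hotspot. It assumes, purely for illustration, a legacy codebase written in Python; commercial analyzers cover many more languages and weakness types.

```python
import ast
import sys

# Node types that add a decision point; a rough cyclomatic-complexity proxy.
BRANCHES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp, ast.ExceptHandler)

def hotspots(path, threshold=10):
    """Yield (function name, score) for functions above the threshold."""
    with open(path, encoding="utf-8") as f:
        tree = ast.parse(f.read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            score = 1 + sum(isinstance(n, BRANCHES) for n in ast.walk(node))
            if score >= threshold:
                yield node.name, score

if __name__ == "__main__":
    # Usage: python hotspots.py legacy_module.py other_module.py
    for path in sys.argv[1:]:
        for name, score in hotspots(path):
            print(f"{path}: {name} (complexity ~{score})")
```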

Conclusion

"As poor software quality persists on an upward trajectory, the solution remains the same: prevention is still the best medicine. It's important to build secure, high-quality software that addresses weaknesses and vulnerabilities as close to the source as possible," said Joe Jarzombek, Director for Government and Critical Infrastructure Programs at Synopsys. "This limits the potential damage and cost to resolve issues. It reduces the cost of ownership and makes software-controlled capabilities more resilient to attempts of cyber exploitation."

Methodologies such as Agile and DevOps have supported the evolution of software development, in which developers apply enhancements as small, incremental changes that are tested and committed to production daily, hourly, or even moment by moment. This results in higher velocity and more responsive development cycles, but not necessarily better quality.

As DevSecOps aims to improve the security mechanisms around high-velocity software development, the emergence of DevQualOps encompasses activities that assure an appropriate level of quality across the Agile, DevOps, and DevSecOps lifecycle.
