Cost of Poor Software Quality in US Exceeds $2 Trillion
February 09, 2021

The cost of poor software quality (CPSQ) in the US in 2020 was approximately $2.08 trillion, according to The Cost of Poor Software Quality in the US: A 2020 Report from the Consortium for Information & Software Quality (CISQ), co-sponsored by Synopsys.

This includes poor software quality resulting from software failures, unsuccessful development projects, legacy system problems, technical debt and cybercrime enabled by exploitable weaknesses and vulnerabilities in software.

"As organizations undertake major digital transformations, software-based innovation and development rapidly expands," said report author, Herb Krasner. "The result is a balancing act, trying to deliver value at high speed without sacrificing quality. However, software quality typically lags behind other objectives in most organizations. That lack of primary attention to quality comes at a steep cost."

Key findings from the report include:

Operational software failure

Operational software failure is the leading driver of total CPSQ, estimated at $1.56 trillion, roughly ten times the cost of finding and fixing the same defects before releasing software into operation.

This figure represents a 22% increase since 2018, and it is likely conservative given the meteoric rise in cybersecurity failures and the fact that many failures go unreported.

Cybercrime enabled by exploitable weaknesses and vulnerabilities in software has been by far the largest growth area over the last two years. The underlying cause is primarily unmitigated software flaws.

The report's first recommendation is to prevent defects as early as possible, when they are relatively cheap to fix. The second is to isolate, mitigate, and correct failures as quickly as possible to limit the damage.
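To make that first recommendation concrete, below is a minimal sketch of a pre-release quality gate. It is illustrative only and not taken from the CISQ report: the tool choices (ruff for static analysis, pytest for unit tests) and the script itself are assumptions standing in for whatever analyzers and test suites a team already runs.

```python
# A minimal sketch of a "shift-left" quality gate, illustrating the report's
# first recommendation: catch defects before release, where they are cheapest
# to fix. The specific tools (ruff, pytest) are assumptions for illustration;
# any static analyzer and test runner could fill these roles.
import subprocess
import sys

# Each check must pass before code moves toward production.
CHECKS = [
    ["ruff", "check", "."],     # static analysis: flags many defect classes pre-merge
    ["pytest", "--maxfail=1"],  # unit tests: stop at the first regression
]

def main() -> int:
    for cmd in CHECKS:
        try:
            result = subprocess.run(cmd)
        except FileNotFoundError:
            print(f"Tool not installed: {cmd[0]}")
            return 1
        if result.returncode != 0:
            # Failing here is the "cheap" fix point the report describes.
            print(f"Quality gate failed: {' '.join(cmd)}")
            return result.returncode
    print("All quality gates passed.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Run in CI on every commit, a gate like this surfaces defects minutes after they are introduced rather than in operation, where the report finds they become roughly ten times more expensive.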

Unsuccessful development projects

Unsuccessful development projects, the next largest growth area of CPSQ, account for an estimated $260 billion.

This figure has risen by 46% since 2018. There has been a steady project failure rate of ~19% for over a decade.

The underlying causes are varied, but one consistent theme has been the lack of attention to quality.

The report states: "It is amazing how many IT projects just assume that 'quality happens.' The best way to focus a project on quality is to properly define what quality means for that specific project and then focus on achieving measurable results against stated quality objectives."

Research suggests that success rates rise dramatically when Agile and DevOps methodologies are used, largely because they minimize decision latency.

Legacy software

The operation and maintenance of legacy software contributed $520 billion to the CPSQ.

While this is down from $635 billion in 2018, it still represents nearly a third of total US IT expenditure in 2020.

The report explains: "CPSQ in legacy systems is harder to address because such systems automate core business functions and modernization is not always straightforward. After decades of operation, they may have become less efficient, less secure, unstable, incompatible with newer technologies and systems, and more difficult to support due to loss of knowledge and/or increased complexity or loss of vendor support. In many cases, they represent a single point of failure risk to the business."

The report notes that strategies for improving legacy software quality center on overcoming the lack of understanding and knowledge of how the system works internally. Any tool that helps identify weaknesses, vulnerabilities, failure symptoms, defects, and improvement targets will be useful.

Conclusion

"As poor software quality persists on an upward trajectory, the solution remains the same: prevention is still the best medicine. It's important to build secure, high-quality software that addresses weaknesses and vulnerabilities as close to the source as possible," said Joe Jarzombek, Director for Government and Critical Infrastructure Programs at Synopsys. "This limits the potential damage and cost to resolve issues. It reduces the cost of ownership and makes software-controlled capabilities more resilient to attempts of cyber exploitation."

Methodologies such as Agile and DevOps have supported the evolution of software development, with developers applying enhancements as small, incremental changes that are tested and committed into production daily, hourly, or even moment by moment. This results in higher velocity and more responsive development cycles, but not necessarily better quality.

As DevSecOps aims to improve the security mechanisms around high-velocity software development, the emergence of DevQualOps encompasses activities that assure an appropriate level of quality across the Agile, DevOps, and DevSecOps lifecycle.

