Data Issues Take 2 Days to Identify and Fix
July 11, 2023

Companies experience a median of five to ten data incidents per quarter, according to the 2023 State of Data Quality report from Bigeye.

Respondents reported at least two "severe" data incidents in the last six months, incidents that damaged the business or bottom line and were visible at the C-level. And 70% reported at least two data incidents that diminished their teams' productivity.


(Chart source: Bigeye)

They also said incidents take an average of 48 hours to troubleshoot. So while data issues are most commonly identified and fixed within one to two days, the problems they cause can persist for weeks or even months.

Organizations with more than five data incidents per month lurch from incident to incident, with little ability to trust their data or to invest in larger data infrastructure projects. Their data quality work is reactive rather than proactive.

The report also found that data engineers are the first line of defense in managing data issues, followed closely by software engineers. The data engineer role is now on par with software engineering: like software engineers, data engineers are in charge of a product, the data product, which increasingly demands software-like levels of process, maintenance, and code review.
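For illustration, a check like the one below is the sort of artifact that gets written, versioned, and code-reviewed in that workflow. This is a minimal sketch, not Bigeye's implementation: the table, column, and 30% null-rate threshold are hypothetical, and an in-memory SQLite database stands in for a real warehouse connection.

```python
# Minimal sketch of a reviewable data quality check (hypothetical
# table/column/threshold; SQLite stands in for a real warehouse).
import sqlite3

def null_rate(conn: sqlite3.Connection, table: str, column: str) -> float:
    """Return the fraction of rows in `table` where `column` is NULL."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls / total if total else 0.0

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [(1, 101), (2, None), (3, 103), (4, 104)],
    )
    rate = null_rate(conn, "orders", "customer_id")
    # Fail the pipeline, like a failing unit test, if the check regresses.
    assert rate <= 0.30, f"orders.customer_id null rate {rate:.0%} exceeds 30%"
    print(f"orders.customer_id null rate: {rate:.0%} (within threshold)")
```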

Survey respondents estimate it takes roughly 37,500 engineering hours to build in-house data quality monitoring, equating to about one year of work for 20 engineers (37,500 ÷ 20 is 1,875 hours per engineer, close to a full working year).

Those who used third-party data monitoring solutions saw roughly 2x to 3x the ROI of in-house solutions. They also noted that, at full utilization, third-party data monitoring solved for two issues: fractured infrastructure and anomalous data. They further reported that third-party solutions had better test libraries and a broader perspective on data problems.
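The "anomalous data" half of that is often statistical rather than rule-based. As a hedged sketch of the idea, a monitor can flag a table whose daily volume suddenly falls outside its recent range; the row counts and three-sigma cutoff below are invented for illustration, not drawn from the report.

```python
# Illustrative anomaly check: flag a daily row count that sits far
# outside the trailing history (made-up numbers, three-sigma cutoff).
from statistics import mean, stdev

def is_anomalous(history: list[int], today: int, max_z: float = 3.0) -> bool:
    """True if `today` is more than `max_z` standard deviations
    from the mean of the trailing `history`."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > max_z

daily_row_counts = [10_120, 9_980, 10_240, 10_050, 9_910, 10_180, 10_075]
print(is_anomalous(daily_row_counts, 10_130))  # False: a normal day
print(is_anomalous(daily_row_counts, 3_400))   # True: likely pipeline break
```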

"Data quality issues are the biggest blockers preventing data teams from being successful," said Kyle Kirwan, Bigeye CEO and co-founder. "We've heard that around 250-500 hours are lost every quarter, just dealing with data pipeline issues."

Methodology: The report is based on responses from 100 survey participants. At least 63 came from mid-to-large cloud data warehouse customers (spending more than $500K per year) who have some form of data monitoring in place, whether third-party or built in-house.
