Data Issues Take 2 Days to Identify and Fix
July 11, 2023

Companies experience a median of five to ten data incidents over a period of three months, according to the 2023 State of Data Quality report from Bigeye.

Respondents reported at least two "severe" data incidents in the last six months — incidents that damaged the business or its bottom line and were visible at the C-level. And 70% reported at least two data incidents that diminished their teams' productivity.


Source: Bigeye

They also said incidents take an average of 48 hours to troubleshoot. So while data issues most commonly take about 1-2 days to identify and fix, the problems they cause can last for weeks or even months.

Organizations with more than five data incidents per month lurch from incident to incident, with little ability to trust their data or invest in larger data infrastructure projects. Their data quality work is reactive rather than proactive.

The report also found that data engineers are the first line of defense in managing data issues, followed closely by software engineers. The data engineer role is now on par with software engineering. Like software engineers, data engineers are in charge of a product — the data product — that increasingly demands software-like levels of process, maintenance, and code review.

Survey respondents say it takes an estimated 37,500 person-hours to build in-house data quality monitoring, equating to about one year of work for 20 engineers.

Those who used third-party data monitoring solutions saw roughly 2x to 3x ROI over in-house solutions. They also noted that, at full utilization, third-party data monitoring addressed two issues: fractured infrastructure and anomalous data. They further reported that third-party solutions had better test libraries and a broader perspective on data problems.

"Data quality issues are the biggest blockers preventing data teams from being successful," said Kyle Kirwan, Bigeye CEO and co-founder. "We've heard that around 250-500 hours are lost every quarter, just dealing with data pipeline issues."

Methodology: The report consisted of answers from 100 survey respondents. At least 63 came from mid-to-large cloud data warehouse customers (with a spend of more than $500k per annum) who have some form of data monitoring in place, whether third-party or built in-house.
