
2024 DataOps Predictions - Part 1

Industry experts offer predictions on how DataOps will evolve and impact IT and business in 2024.

Data observability becomes mandatory

Data observability will become mandatory as organizations seek to drive smarter automation and faster decision-making in 2024. As the volume of data has continued to double every two years, organizations are urgently seeking to ingest and analyze it faster and at a greater scale. However, the cost and risk of poor-quality data are greater than ever. In a recent survey, 57% of DevOps practitioners said the absence of data observability makes it difficult to drive automation in a compliant way. As a result, there will be an increased demand for solutions that provide data observability to enable organizations to rapidly and securely ingest high-quality and reliable data that is ready for analytics on demand. Increased data observability will enable users to understand not only the availability of data, but also the structure, distribution, relationships, and lineage of that data across all sources. This is essential to generating insights that users can trust, by ensuring data freshness, identifying anomalies, and eliminating duplicates that could lead to errors.
Bernd Greifeneder
CTO and Founder, Dynatrace
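
The checks named above (freshness, duplicate elimination, anomaly detection) can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the record layout, thresholds, and timestamps are invented for the example.

```python
from datetime import datetime, timedelta, timezone
from statistics import mean, stdev

# Hypothetical rows from an ingestion pipeline: (business key, value, ingested-at).
rows = [
    ("evt-1", 10.0, datetime(2024, 1, 15, 8, 0, tzinfo=timezone.utc)),
    ("evt-2", 12.5, datetime(2024, 1, 15, 8, 5, tzinfo=timezone.utc)),
    ("evt-2", 12.5, datetime(2024, 1, 15, 8, 5, tzinfo=timezone.utc)),  # duplicate
    ("evt-3", 9.8, datetime(2024, 1, 15, 8, 30, tzinfo=timezone.utc)),
]

def is_fresh(rows, max_age: timedelta, now=None) -> bool:
    """Freshness: the newest record must be no older than max_age."""
    now = now or datetime.now(timezone.utc)
    return now - max(ts for _, _, ts in rows) <= max_age

def duplicate_keys(rows) -> set:
    """Duplicates: business keys that appear more than once."""
    seen, dupes = set(), set()
    for key, _, _ in rows:
        (dupes if key in seen else seen).add(key)
    return dupes

def outliers(rows, z_threshold: float = 3.0) -> list:
    """Distribution: flag values more than z_threshold std devs from the mean."""
    values = [v for _, v, _ in rows]
    mu, sigma = mean(values), stdev(values)
    return [r for r in rows if abs((r[1] - mu) / sigma) > z_threshold]

print(duplicate_keys(rows))  # {'evt-2'}
print(is_fresh(rows, timedelta(hours=1),
               now=datetime(2024, 1, 15, 9, 0, tzinfo=timezone.utc)))  # True
```

Production observability tools run checks like these continuously across all sources and alert on violations; the point here is only that each dimension of data health reduces to a testable predicate.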

AI DRIVES DATA OBSERVABILITY

In 2024, large enterprises will rapidly increase investment in AI and LLM technologies. This will create a greater need for data observability to validate that data feeding AI initiatives is accurate and complete. As a result, data observability vendors will be expected to expand support from predominantly cloud-native data environments to larger, more traditional enterprise data stacks and provide native solutions for LLM data pipeline monitoring and validation.
Kyle Kirwan
Co-Founder and CEO, Bigeye

AI DRIVES DATAOPS

2024 will see the impact of rapid AI/ML advances on DataOps — empowering developers and DevOps teams by making data analytics that much more accessible, accurate, and definitive. AI/ML and new data science techniques will give rise to new operating models, analytics tools, and technologies that enable data-driven decisions. Specific benefits will manifest in everything from customer recommendation engines to insight-optimized business operations, including streamlined development practices and DevOps processes. In this way, DataOps will provide enterprises with decisive advantages over competitors who lack that data command and clarity.
Anil Inamdar
VP & Head of Data, Instaclustr, part of Spot by NetApp

AI DRIVES DATA INFRASTRUCTURE MODERNIZATION

The continuous and rapid adoption of AI will force organizations to modernize their data infrastructure in 2024. As enterprises examine their data, they are being pushed to get a better handle on it so that technologies like AI can be used effectively. Organizations will double down on data management and data integrity to ensure third-party applications are seamlessly integrated. Data practitioners will look for solutions that continuously keep data clean so they can act on workflows quickly. Better data means better-trained models on less data, as well as a better ability to leverage that data in AI applications that incorporate retrieval.
Matt Wallace
Technical Advisor, Faction

AI WILL NOT REPLACE DATA ENGINEERS

Data engineering will evolve — and be highly valued — in an AI world. There's been a lot of chatter that the AI revolution will replace the role of data engineers. That's not the case; in fact, their data expertise will be more critical than ever — just in new and different ways. To keep up with the evolving landscape, data engineers will need to understand how generative AI adds value. The data pipelines they build and manage will perhaps be the first place organizations connect large language models to unlock value. Data engineers will be the ones who understand how to consume a model and plug it into a data pipeline to automate the extraction of value. They will also be expected to oversee and understand the AI work.
Jeff Hollan
Director of Product Management, Snowflake
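
As a sketch of what "plugging a model into a pipeline" can look like, the step below wraps a model call as an enrichment stage. The model is stubbed with a keyword rule so the example runs standalone; a real deployment would call an actual LLM endpoint, and the `Record` shape, prompt, and categories are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Record:
    text: str
    category: Optional[str] = None

def call_llm(prompt: str) -> str:
    """Stand-in for a real model call behind a vendor SDK; stubbed here
    with a trivial keyword rule so the sketch is self-contained."""
    return "billing" if "invoice" in prompt.lower() else "other"

def enrich(records: List[Record], model: Callable[[str], str]) -> List[Record]:
    """Pipeline step: the data engineer plugs the model in here, turning
    raw text into a structured field before the data flows downstream."""
    for rec in records:
        rec.category = model(f"Classify this support ticket: {rec.text}")
    return records

batch = [Record("Invoice #42 was charged twice"), Record("Login page is down")]
print([r.category for r in enrich(batch, call_llm)])  # ['billing', 'other']
```

The design choice worth noting is that the model is passed in as a plain callable: the pipeline stays testable with a stub, and swapping in a hosted LLM changes one argument rather than the pipeline itself.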

BREAKING DOWN DATA SILOS

The traditional silos between IT, legal, and business departments will crumble in 2024, giving way to a more collaborative approach. This cross-functional synergy will ensure that data, including unstructured data, is not just collected but strategically utilized to drive business value.
Rohit Choudhary
CEO, Acceldata

Data as Business Asset

In many senses, data is the new oil. It's a finite resource that needs to be mined and managed strategically, and its value is highly dependent on your ability to refine and manipulate it for specific applications. For this reason, we see 2024 as being a critical year in the transition of data from being 1s and 0s on a screen to an actual asset to be managed, tracked, and optimized within an enterprise.
Jackie McGuire
Senior Security Strategist, Cribl

Data management will evolve beyond mere data storage in 2024. Organizations will recognize the strategic value of weaving data, including unstructured data, into their business strategies. This shift will unlock a wealth of insights and redefine business decision-making.
Rohit Choudhary
CEO, Acceldata

In 2024, the technology landscape will witness a transformative shift as data evolves from being a valuable asset to the lifeblood of thriving enterprises. Organizations that overlook data quality, integrity, and lineage will be challenged to not only make informed decisions but also realize the full potential of generative AI, LLM and ML applications and use cases. As the year unfolds, I predict that organizations neglecting to craft robust data foundations and strategies will find it increasingly challenging to stay afloat in the swiftly evolving tech industry. Those who fail to adapt and prioritize data fundamentals will struggle to outpace their competitors and may even risk survival in this highly competitive environment.
Armon Petrossian
CEO and Co-Founder, Coalesce

Data as innovation asset

The rapid expansion of data will continue to be a dominant trend in 2024. However, the ability to efficiently gather, process, and utilize this data will become the critical factor limiting or accelerating innovation within organizations. The challenge will lie in developing methods to quickly and securely assimilate this growing data influx, converting it into actionable insights. Companies that can effectively manage this data deluge, turning it into a strategic asset for innovation, will gain a competitive edge in the increasingly data-driven business landscape.
David Boskovic
Founder and CEO, Flatfile

Data as a product

Until recently, only large companies had the expertise and resources needed to create reusable data assets that can be easily repurposed across different teams and applications. Thanks to advancements in the governance products required to build these assets, in 2024 more companies will be able to create reusable data products, greatly accelerating efficiency and data innovation. Multiple teams can benefit from having access to the same data to build a service or application. However, this data must be presented in a way that is secure, well-contextualized, and understandable for users who weren't involved in its production.

As data moves farther away from its initial source, you have to do more checks, which becomes increasingly expensive. Starting the data governance process at the source is not only less expensive but also a better way to understand where the data comes from and how it's schematized. New data governance capabilities, pre-built into products such as cloud data warehouses, databases, and other data infrastructure services, can meet these needs.

Developers no longer need to manually build the infrastructure to create and share reusable data products. As a result, reusable data products will no longer be restricted to companies with large, sophisticated data engineering teams. With more companies building reusable data products in 2024, developers will increase the value of their data and spend more time building innovative data applications and services.
Andrew Sellers
Head of Technology Strategy, Confluent
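
The idea of governing at the source rather than checking downstream can be sketched as a gate in front of a shared data product. The schema, field names, and in-memory "sink" below are invented for illustration; real systems enforce this with contracts built into the warehouse or streaming platform.

```python
from typing import Dict, List

# Hypothetical schema for an "orders" data product, enforced at the source
# so every downstream consumer inherits already-conformant records.
SCHEMA = {
    "order_id": str,
    "amount_cents": int,
    "currency": str,
}

def validate(record: Dict) -> List[str]:
    """Return a list of violations; an empty list means the record conforms."""
    errors = []
    for field, expected in SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

def publish(record: Dict, sink: List[Dict]) -> bool:
    """Only schema-conformant records enter the shared data product."""
    if validate(record):
        return False
    sink.append(record)
    return True

product: List[Dict] = []
publish({"order_id": "A1", "amount_cents": 1299, "currency": "USD"}, product)
publish({"order_id": "A2", "amount_cents": "12.99"}, product)  # rejected
print(len(product))  # 1
```

Because every record is checked once, at the point of production, the increasingly expensive downstream re-checks the text describes are no longer needed: consumers can trust the shape of what they read.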

Go to: 2024 DataOps Predictions - Part 2
