2024 DataOps Predictions - Part 1

Industry experts offer predictions on how DataOps will evolve and impact IT and business in 2024.

Data observability becomes mandatory

Data observability will become mandatory as organizations seek to drive smarter automation and faster decision-making in 2024. As the volume of data continues to double every two years, organizations are urgently seeking to ingest and analyze it faster and at greater scale. However, the cost and risk of poor-quality data are greater than ever. In a recent survey, 57% of DevOps practitioners said the absence of data observability makes it difficult to drive automation in a compliant way. As a result, there will be increased demand for solutions that provide data observability, enabling organizations to rapidly and securely ingest high-quality, reliable data that is ready for analytics on demand. Increased data observability will enable users to understand not only the availability of data, but also its structure, distribution, relationships, and lineage across all sources. This is essential to generating insights that users can trust: ensuring the data's freshness, identifying anomalies, and eliminating duplicates that could lead to errors.
Bernd Greifeneder
CTO and Founder, Dynatrace
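
The freshness and duplicate checks described above can be illustrated with a minimal sketch. This is not Dynatrace's implementation — the field names `updated_at` and `id` are hypothetical, and real observability platforms apply far richer checks across lineage and distribution:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(records, max_age=timedelta(hours=24), ts_field="updated_at"):
    """Flag records whose timestamp is older than the allowed window."""
    now = datetime.now(timezone.utc)
    return [r for r in records if now - r[ts_field] > max_age]

def find_duplicates(records, key_field="id"):
    """Return keys that appear more than once in the batch."""
    seen, dupes = set(), set()
    for r in records:
        k = r[key_field]
        if k in seen:
            dupes.add(k)
        seen.add(k)
    return dupes

# Example batch with one stale record and one duplicate key
batch = [
    {"id": 1, "updated_at": datetime.now(timezone.utc)},
    {"id": 2, "updated_at": datetime.now(timezone.utc) - timedelta(days=2)},
    {"id": 1, "updated_at": datetime.now(timezone.utc)},
]
stale = check_freshness(batch)
dupes = find_duplicates(batch)
```

Checks like these would typically run automatically on every ingested batch, with failures surfaced as alerts rather than silently propagating into analytics.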

AI DRIVES DATA OBSERVABILITY

In 2024, large enterprises will rapidly increase investment in AI and LLM technologies. This will create a greater need for data observability to validate that the data feeding AI initiatives is accurate and complete. As a result, data observability vendors will be expected to expand support from predominantly cloud-native data environments to larger, more traditional enterprise data stacks, and to provide native solutions for LLM data pipeline monitoring and validation.
Kyle Kirwan
Co-Founder and CEO, Bigeye

AI DRIVES DATAOPS

2024 will see the impact of rapid AI/ML advances on DataOps — empowering developers and DevOps teams by making data analytics far more accessible, accurate, and definitive. AI/ML and new data science techniques will enable new operating models, data analytics tools, and technologies that support data-driven decisions. Specific benefits will manifest in everything from customer recommendation engines to insight-optimized business operations, including streamlined development practices and DevOps processes. In this way, DataOps will provide enterprises with decisive advantages over competitors who lack that data command and clarity.
Anil Inamdar
VP & Head of Data, Instaclustr, part of Spot by NetApp

AI DRIVES DATA INFRASTRUCTURE MODERNIZATION

The continuous and rapid adoption of AI will force organizations to modernize their data infrastructure in 2024. As enterprises examine their data, they are being pushed to get a better handle on it so that technologies like AI can be used properly. Organizations will double down on data management and data integrity to ensure third-party applications are seamlessly integrated. Data practitioners will look for solutions that continuously keep data clean so they can act on workflows quickly. Better data means better-trained models on less data, as well as a better ability to leverage that data in AI applications that incorporate retrieval.
Matt Wallace
Technical Advisor, Faction

AI WILL NOT REPLACE DATA ENGINEERS

Data engineering will evolve — and be highly valued — in an AI world. There's been a lot of chatter that the AI revolution will replace the role of data engineers. That's not the case; in fact, their data expertise will be more critical than ever — just in new and different ways. To keep up with the evolving landscape, data engineers will need to understand how generative AI adds value. The data pipelines built and managed by data engineers will perhaps be the first place organizations connect with large language models to unlock value. Data engineers will be the ones who understand how to consume a model and plug it into a data pipeline to automate the extraction of value. They will also be expected to oversee and understand the AI work.
Jeff Hollan
Director of Product Management, Snowflake
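
Plugging a model into a pipeline, as described above, can be sketched as a single enrichment step. This is a generic illustration, not Snowflake's API: `summarize_with_llm` is a hypothetical stub standing in for a real provider SDK or in-warehouse model call, and the field names are made up:

```python
def summarize_with_llm(text: str) -> str:
    """Placeholder for a hosted-LLM call; a real pipeline would invoke a
    provider SDK or an in-warehouse model function here."""
    return text[:80]  # stand-in behavior: truncate instead of summarizing

def enrich_records(records, source_field="notes", target_field="summary"):
    """Pipeline step: derive a new column from unstructured text via the model."""
    for r in records:
        r[target_field] = summarize_with_llm(r[source_field])
    return records

rows = [{"notes": "Customer reported intermittent timeouts on checkout."}]
enriched = enrich_records(rows)
```

The point is structural: the model call sits inside an ordinary pipeline stage that a data engineer owns, schedules, and monitors like any other transformation.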

BREAKING DOWN DATA SILOS

The traditional silos between IT, legal, and business departments will crumble in 2024, giving way to a more collaborative approach. This cross-functional synergy will ensure that data, including unstructured data, is not just collected but strategically utilized to drive business value.
Rohit Choudhary
CEO, Acceldata

Data as Business Asset

In many senses, data is the new oil. It's a finite resource that needs to be mined and managed strategically, and its value is highly dependent on your ability to refine and manipulate it for specific applications. For this reason, we see 2024 as being a critical year in the transition of data from being 1s and 0s on a screen to an actual asset to be managed, tracked, and optimized within an enterprise.
Jackie McGuire
Senior Security Strategist, Cribl

Data management will evolve beyond mere data storage in 2024. Organizations will recognize the strategic value of weaving data, including unstructured data, into their business strategies. This shift will unlock a wealth of insights and redefine business decision-making.
Rohit Choudhary
CEO, Acceldata

In 2024, the technology landscape will witness a transformative shift as data evolves from being a valuable asset to the lifeblood of thriving enterprises. Organizations that overlook data quality, integrity, and lineage will be challenged to not only make informed decisions but also realize the full potential of generative AI, LLM, and ML applications and use cases. As the year unfolds, I predict that organizations neglecting to craft robust data foundations and strategies will find it increasingly challenging to stay afloat in the swiftly evolving tech industry. Those who fail to adapt and prioritize data fundamentals will struggle to outpace their competitors and may even risk survival in this highly competitive environment.
Armon Petrossian
CEO and Co-Founder, Coalesce

Data as innovation asset

The rapid expansion of data will continue to be a dominant trend in 2024. However, the ability to efficiently gather, process, and utilize this data will become the critical factor limiting or accelerating innovation within organizations. The challenge will lie in developing methods to quickly and securely assimilate this growing data influx, converting it into actionable insights. Companies that can effectively manage this data deluge, turning it into a strategic asset for innovation, will gain a competitive edge in the increasingly data-driven business landscape.
David Boskovic
Founder and CEO, Flatfile

Data as a product

Until recently, only large companies had the expertise and resources needed to create reusable data assets that can be easily repurposed across different teams and applications. Thanks to advancements in the governance products required to build these assets, in 2024 more companies will be able to create reusable data products, greatly accelerating efficiency and data innovation. Multiple teams can benefit from having access to the same data to build a service or application. However, this data must be presented in a way that is secure, well-contextualized, and understandable for users who weren't involved in its production. As data moves farther from its initial source, more checks are required, which becomes increasingly expensive. Starting the data governance process at the source is not only less expensive but also a better way to understand where the data comes from and how it's schematized. New data governance capabilities, pre-built into products such as cloud data warehouses, databases, and other data infrastructure services, can meet these needs. Developers no longer need to manually build the infrastructure to create and share reusable data products. As a result, reusable data products will no longer be restricted to companies with large, sophisticated data engineering teams. With more companies building reusable data products in 2024, developers will increase the value of their data and spend more time building innovative data applications and services.
Andrew Sellers
Head of Technology Strategy, Confluent
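
"Governance at the source" can be made concrete with a minimal sketch: validating records against a data product's contract before they are published, rather than downstream of many consumers. The contract below (`order_id`, `amount`, `currency`) is a hypothetical example, not a real product's schema:

```python
# Hypothetical contract for a reusable "orders" data product
REQUIRED_SCHEMA = {"order_id": int, "amount": float, "currency": str}

def validate_at_source(record: dict) -> list[str]:
    """Check an incoming record against the contract at the point of
    production, so every downstream consumer can trust its shape."""
    errors = []
    for field, ftype in REQUIRED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"{field}: expected {ftype.__name__}")
    return errors

good = {"order_id": 42, "amount": 19.99, "currency": "USD"}
bad = {"order_id": "42", "amount": 19.99}  # wrong type, missing currency
```

In practice this role is played by schema registries and contract checks built into streaming platforms and warehouses; the value is that the check runs once, at production time, instead of being re-implemented by every consumer.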

Go to: 2024 DataOps Predictions - Part 2
