DataOps
If AI is the engine of a modern organization, then data engineering is the road system beneath it. You can build the most powerful engine in the world, but without paved roads, traffic signals, and bridges that can support its weight, it will stall. In many enterprises, the engine is ready. The roads are not ...
Nearly every conversation about AI eventually circles back to compute. GPUs dominate the headlines while cloud platforms compete for workloads and model benchmarks drive investment decisions. But underneath that noise, a quieter infrastructure challenge is taking shape. The real bottleneck in enterprise AI is not processing power, it is the ability to store, manage and retrieve the relentless volumes of data that AI systems generate, consume and multiply ...
A 2026 survey conducted by Liquibase found that 96.5% of organizations reported at least one AI or LLM interaction with their production databases, often through analytics and reporting, training pipelines, internal copilots, and AI-generated SQL. Only a small fraction reported no interaction at all. That means the database is no longer a downstream system that AI "might" reach later. AI is already there ...
AI can't fix broken data. CIOs who modernize revenue data governance unlock predictable growth; those who don't risk millions in failed AI investments. For decades, CIOs kept the lights on. Revenue was someone else's problem, owned by sales, led by the CRO, measured by finance. Those days are behind us ...
As pilots move into production, organizations are discovering that AI performance reflects the health of their core systems ... Most stall in the early stages, not because of model limitations, but because their operational foundation isn't ready to support the next level. Lucid's AI Readiness Report found that only 26% of organizations that have implemented AI agents say those efforts have been "completely successful," a clear sign that something beneath the surface is holding teams back ...
AI agents are starting to do something that used to be slow by design. They are creating databases, spinning up branches, and iterating on the data layer as part of the build loop. You can argue about the exact percentages in any one report, but the direction is unmistakable. The database is moving from foundational infrastructure to active surface area for modern applications, and that shift is going to collide with how most enterprises still control change ...
Artificial intelligence (AI) has become the dominant force shaping enterprise data strategies. Boards expect progress. Executives expect returns. And data leaders are under pressure to prove that their organizations are "AI-ready" ...
Data has never been more central to a greater portion of enterprise operations than it is today. From software development to marketing strategy, data has become an essential component for success. But as data use cases multiply, so too does the diversity of the data itself. This shift is pushing organizations toward increasingly complex data infrastructure ...
If you work with AI, you know this story. A model performs well during testing, looks great in early reviews, works perfectly in production and then slowly loses relevance. Everything on the surface looks fine: pipelines are running, predictions and recommendations are error-free, data quality checks show green; yet outcomes don't match the ground reality. This pattern repeats across enterprise AI programs. Take, for example, a mid-sized retail banking and wealth-management firm with heavy investments in AI-powered risk analytics, fraud detection and personalized credit-decisioning systems. The models worked well for a while, but as transaction volumes grew, false positives rose by 18% ...
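The failure mode described above, where dashboards stay green while false positives quietly climb, can be caught by monitoring outcome metrics against a deployment-time baseline rather than just pipeline health. The sketch below is a minimal, illustrative example (all names and thresholds are hypothetical, not from the firm in the story): it tracks a rolling false-positive rate for a fraud model and flags drift when that rate exceeds the baseline by a relative tolerance.

```python
# Hypothetical sketch: flag model drift when the rolling false-positive
# rate climbs well above the rate observed at deployment time.
from collections import deque


class FalsePositiveDriftMonitor:
    """Tracks a rolling false-positive rate over a fixed window and
    flags drift when it exceeds the baseline by a relative tolerance."""

    def __init__(self, baseline_rate: float, window: int = 1000,
                 tolerance: float = 0.15):
        self.baseline_rate = baseline_rate    # FP rate at deployment
        self.tolerance = tolerance            # 0.15 = 15% relative headroom
        self.outcomes = deque(maxlen=window)  # 1 = false positive, 0 = not

    def record(self, flagged: bool, actually_fraud: bool) -> None:
        # Only transactions the model flagged can be false positives.
        if flagged:
            self.outcomes.append(0 if actually_fraud else 1)

    @property
    def current_rate(self) -> float:
        if not self.outcomes:
            return 0.0
        return sum(self.outcomes) / len(self.outcomes)

    def drifted(self) -> bool:
        # Wait for a full window before judging, to avoid noisy alerts.
        if len(self.outcomes) < self.outcomes.maxlen:
            return False
        return self.current_rate > self.baseline_rate * (1 + self.tolerance)
```

In practice the `record` calls would be fed by a labeling or case-review pipeline, since ground truth for fraud arrives with a delay; the point is that outcome-level checks like this live alongside, not inside, the green data-quality dashboards.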
SolarWinds data shows that one in three DBAs is contemplating leaving the role — a striking indicator of workforce pressure. This is likely due to the technical and interpersonal frustrations plaguing today's DBAs. Hybrid IT environments provide widespread organizational benefits but also present growing complexity. Simultaneously, AI presents a paradox of benefits and pain points ...
My latest title for O'Reilly, The Rise of Logical Data Management, was an eye-opener for me. I'd never heard of "logical data management," even though it's been around for several years, but it makes some extraordinary promises, like the ability to manage data without having to first move it into a consolidated repository, which changes everything. Now, with the demands of AI and other modern use cases, logical data management is on the rise, so it's "new" to many. Here, I'd like to introduce you to it and explain how it works ...
APMdigest's Predictions Series continues with 2026 DataOps Predictions — industry experts offer predictions on how DataOps and related technologies will evolve and impact business in 2026. Part 2 covers data and data platforms ...
APMdigest's Predictions Series continues with 2026 DataOps Predictions — industry experts offer predictions on how DataOps and related technologies will evolve and impact business in 2026 ...
In APMdigest's 2026 Observability Predictions Series, industry experts offer predictions on how Observability and related technologies will evolve and impact business in 2026. Part 7 covers Observability data ...
IT organizations are preparing for 2026 with increased expectations around modernization, cloud maturity, and data readiness. At the same time, many teams continue to operate with limited staffing and are trying to maintain complex environments with small internal groups. These conditions are creating a distinct set of priorities for the year ahead. The DataStrike 2026 Data Infrastructure Survey Report, based on responses from nearly 280 IT leaders across industries, points to five trends that are shaping data infrastructure planning for 2026 ...
Developers building AI applications are not just looking for fault patterns after deployment; they need to detect issues quickly during development and prevent them after going live. Unfortunately, traditional observability tools can no longer meet the needs of AI-driven enterprise application development. AI-powered detection and auto-remediation tools designed to keep pace with rapid development are now emerging to proactively manage performance and prevent downtime ...
PostgreSQL promises greater flexibility, performance, and cost savings compared to proprietary alternatives. But successfully deploying it isn't always straightforward, and there are some hidden traps along the way that even seasoned IT leaders can stumble into. In this blog, I'll highlight five of the most common pitfalls with PostgreSQL deployment and offer guidance on how to avoid them, along with the best path forward ...
Governments and social platforms face an escalating challenge: hyperrealistic synthetic media now spreads faster than legacy moderation systems can react. From pandemic-related conspiracies to manipulated election content, disinformation has moved beyond "false text" into the realm of convincing audiovisual deception ...
Traditional monitoring often stops at uptime and server health without any integrated insights. Cross-platform observability covers not just infrastructure telemetry but also client-side behavior, distributed service interactions, and the contextual data that connects them. Emerging technologies like OpenTelemetry, eBPF, and AI-driven anomaly detection have made this vision more achievable, but only if organizations ground their observability strategy in well-defined pillars. Here are the five foundational pillars of cross-platform observability that modern engineering teams should focus on for seamless platform performance ...
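To make the anomaly-detection pillar above concrete, here is a deliberately minimal sketch, assuming nothing beyond the standard library: a z-score detector that flags latency samples deviating more than k standard deviations above a baseline. Real AI-driven detectors (the kind the article alludes to) use far richer models, so treat this only as the statistical idea underneath them; the function name and threshold are illustrative.

```python
# Minimal sketch of statistical anomaly detection over latency telemetry.
# Production AI-driven detectors are far more sophisticated; this only
# illustrates the baseline-and-deviation idea.
import statistics


def detect_anomalies(latencies_ms, k: float = 3.0):
    """Return indices of samples more than k standard deviations
    above the mean of the series."""
    mean = statistics.fmean(latencies_ms)
    stdev = statistics.pstdev(latencies_ms)
    if stdev == 0:
        return []  # flat series: nothing deviates
    return [i for i, x in enumerate(latencies_ms)
            if (x - mean) / stdev > k]
```

The same shape applies to any of the pillars' signals — error rates, client-side render times, span durations from distributed traces — which is why a unified telemetry substrate such as OpenTelemetry matters: one detection approach can run across all of them.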
A recent MIT study reveals that 95% of enterprise AI solutions fail, with 85% of AI project failures attributed to data readiness issues ... The reality is stark: AI effectiveness depends primarily on data quality, and organizations consistently struggle with data discovery, access, quality, structure, readiness, security, and governance. These challenges demand expert solutions, yet they often receive less attention than the flashy "AI will change everything" narratives that dominate industry discourse ...
Your company's knowledge base is the engine that will power your AI agents. It's the structured data repository that an AI system uses to understand, reason, and make decisions ... Thus, transferring accurate knowledge is crucial. Incomplete or poor-quality data can reduce agent performance and accuracy, and even increase agent bias. Here are five steps to ensure that your company's knowledge base transfer is optimal ...
Most teams collect observability data for the obvious reasons: uptime, latency, troubleshooting. It's the stuff we have to do to keep the lights on. But that mindset limits what this data is really capable of. When we treat logs like a transient utility instead of a long-term resource, we end up throwing away insight we can't get back. Losing that data isn't just a technical issue; it limits your ability to make smarter business decisions ...
Enterprises are racing to leverage AI in their database environments — but most are skipping the guardrails. According to Quest research, 67% of organizations say AI is already critical to their database operations. Yet fewer than half report having a formal governance framework in place to manage it. That mismatch puts businesses at risk — operationally, financially, and reputationally ...
If you've worked in infrastructure or observability engineering for more than a few years, chances are you've been told more than once that practices like sampling, data aggregation, and short retention windows are just "best practices." The rationale is familiar: save money, reduce system strain, stay agile. But I want to challenge that framing. These aren't best practices. They're coping mechanisms. And they're costing us more than we realize ...
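The cost of those coping mechanisms is easy to demonstrate. The toy example below (purely illustrative, not from the article) applies head-based 1-in-100 sampling to a log stream in which a rare error appears five times in 10,000 events; depending on where the errors fall, the sample can miss every one of them, and with a short retention window the unsampled originals are gone for good.

```python
# Illustrative only: head-based 1-in-N sampling can silently drop
# rare-but-important events from a log stream.
def sample_every_nth(events, n=100):
    """Keep every n-th event, discarding the rest (head-based sampling)."""
    return events[::n]


events = ["ok"] * 10_000
for pos in (37, 1_250, 4_999, 7_777, 9_301):  # arbitrary error positions
    events[pos] = "error"

sampled = sample_every_nth(events, n=100)
full_errors = events.count("error")      # 5 in the full stream
sampled_errors = sampled.count("error")  # none landed on a kept index
```

Probabilistic and tail-based sampling schemes reduce this risk but cannot eliminate it; only full retention guarantees the rare signal survives, which is exactly the trade-off the "best practice" framing glosses over.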
We're inching ever closer toward a long-held goal: technology infrastructure that is so automated that it can protect itself. But as IT leaders aggressively employ automation across our enterprises, we need to continuously reassess what AI is ready to manage autonomously and what cannot yet be trusted to algorithms ...