How Legacy Friction Strangles AI-Driven DevOps

David Torgerson
Lucid Software

As pilots move into production, organizations are discovering that AI performance reflects the health of their core systems. Whether organizations realize it or not, they are already somewhere on the AI maturity curve — progressing from fragmented AI use to aggregated consumption, contextual processing, logic execution, and ultimately strategic transformation. Most stall in the early stages, not because of model limitations, but because their operational foundation isn't ready to support the next level. Lucid's AI Readiness Report found that only 26% of organizations that have implemented AI agents say those efforts have been "completely successful," a clear sign that something beneath the surface is holding teams back.

In many cases, the constraint is what I call the "Legacy Layer": the accumulation of old systems and the undocumented, fragmented workflows that quietly power day-to-day operations. Over time, this layer becomes both the operational backbone and the primary source of friction.

This infrastructure of undocumented workarounds and isolated data silos drains momentum long before a project reaches production. When you pull back the covers on AI success stories, success almost always boils down to the maturity of an organization's documentation and processes. If your AI efforts have hit a wall, the problem is likely the hidden blockers in your workflow. When organizations try to layer AI on top of this legacy foundation, they often assume automation will make up for these complex issues; in reality, it only exposes them.

Spotting Associated Pain Points and Frictions in Legacy Systems

AI thrives on clean data and clearly defined processes, yet legacy systems offer the opposite — siloed tools, point-to-point integrations, and human workarounds. This structural disconnect creates associated pain — subtle frictions that rarely trigger alarms but steadily drain momentum. When 61% of workers say their AI strategy is misaligned with operational capabilities, they are feeling the weight of this friction.

Because modern AI depends on an open architecture where data moves freely, these isolated silos make it nearly impossible for an agent to create a single source of truth or act across a broader ecosystem. Without that shared context, AI is able to analyze data in one corner of the organization but is unable to execute meaningful action across the entire workflow.

This lack of connectivity is compounded by the tacit knowledge gap. Many DevOps environments function because only a handful of people know how things really work. They understand the edge cases and the undocumented steps that keep systems running. AI can't learn from tacit knowledge. It needs that expertise extracted and structured, which is why 49% of organizations say undocumented or ad-hoc processes impact efficiency. In practice, much of this knowledge already surfaces in diagrams, scratch pads, and collaborative workspaces created as part of day-to-day activities.

Recognizing AI Readiness Gaps

Until hidden expertise is codified, AI remains blocked by a map it cannot read. If a workflow is inherently inefficient or relies on human intuition to bridge technical gaps, deploying AI will only serve to make those inefficiencies move at machine speed.

Time compounds the risk. As experienced employees leave, organizations lose the institutional memory of how their legacy systems actually behave. Once that tacit knowledge is gone, it becomes nearly impossible to train an AI to replicate those nuances accurately. This explains why 46% of organizations have integrated AI into only "some" or "almost no" workflows. They lack the basic visibility needed to support day-to-day operations, let alone a sophisticated automation layer.

Before scaling AI, you must assess your level of associated pain. If a system requires constant manual intervention or custom workarounds, it is a high-drag environment. High associated pain acts as a firewall that prevents AI from delivering measurable ROI.
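The article does not define a formal metric, but the idea can be sketched: tally the sources of friction the author names (manual interventions, undocumented steps, brittle point-to-point integrations) per workflow and weight them into a rough drag score. The field names and weights below are illustrative assumptions, not anything from the report.

```python
from dataclasses import dataclass


@dataclass
class Workflow:
    """Friction counts for one workflow (hypothetical fields)."""
    name: str
    manual_interventions_per_week: int
    undocumented_steps: int
    point_to_point_integrations: int


def pain_score(wf: Workflow) -> int:
    """Rough 'associated pain' score: weight each friction source.

    Weights are arbitrary illustrations; tune them to your environment.
    """
    return (3 * wf.manual_interventions_per_week
            + 2 * wf.undocumented_steps
            + wf.point_to_point_integrations)


deploy = Workflow("deploy", manual_interventions_per_week=4,
                  undocumented_steps=5, point_to_point_integrations=6)
print(pain_score(deploy))  # 3*4 + 2*5 + 6 = 28
```

Even a crude score like this lets teams rank workflows and target the highest-drag ones first, rather than scaling AI uniformly.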

Practical Interventions to Reduce Friction

The good news is that stalled AI initiatives don't require a full IT overhaul to get moving again. Small, targeted interventions can unlock immediate progress. For DevOps teams looking to reduce friction, I recommend these four steps:

  • Make the current state visible. Intelligent diagramming tools can help teams map workflows as they actually exist, not only as they were designed on paper. This extracts low-level documentation without making it an extra step, because you are tying into the place where people actually work day-to-day.
  • Streamline and standardize where possible. You don't need perfection, but consistency matters. Standard inputs and outputs give AI something reliable to work with.
  • Focus on quick wins. Automating a single high-friction handoff or reducing manual reporting can show immediate productivity gains and build internal confidence in AI-driven improvements.
  • Align systems with business objectives. AI should support real operational goals, not abstract innovation metrics. When workflows are clearer and less fragmented, AI becomes more actionable by default.
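To make the "standard inputs and outputs" step concrete, one lightweight approach is to validate each handoff against a shared schema so an agent can rely on the shape of the data it receives. The `TICKET_SCHEMA` fields below are hypothetical examples, not a real system's contract.

```python
def validate(record: dict, schema: dict) -> list[str]:
    """Return a list of problems; an empty list means the record
    matches the expected shape."""
    problems = []
    for field, expected_type in schema.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems


# Illustrative contract for a ticket handoff between two teams.
TICKET_SCHEMA = {"id": str, "service": str, "severity": int}

ok = validate({"id": "T-1", "service": "api", "severity": 2}, TICKET_SCHEMA)
bad = validate({"id": "T-2", "service": "api"}, TICKET_SCHEMA)
```

In practice a schema library (or JSON Schema) would replace this sketch, but the principle is the same: a consistent, checkable contract at every handoff gives both humans and AI something reliable to work with.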

Moving Past Stalled AI Projects

AI can't deliver results in disconnected systems or broken workflows. The organizations seeing real productivity gains today aren't necessarily the ones with the most advanced models or the largest budgets; they're the ones identifying hidden pain points, clarifying their Legacy Layer, and aligning stakeholders around how work actually gets done.

For DevOps leaders, the takeaway is simple: before deploying more AI, look for the hidden blockers underneath. If humans don't understand the workflow, AI never will. 

David Torgerson is VP of Infrastructure and IT at Lucid Software

