A Fresh Look at Advanced IT Analytics - Why the Industry Continues to Get it Wrong

Dennis Drogseth

Buzzwords in tech (as in politics) do a lot to call attention to themselves, but they don't always do a very good job of calling attention to the truth. Reality, after all, is often mystifyingly multi-dimensional, while "what's hot" tends to become linear and often cartoonish.

Over the last few years I've tried to represent a clear and growing trend that I've come to call "Advanced IT Analytics" or AIA, in contrast with other industry terms such as "IT Operations Analytics" and "Big Data". My issue with the former is that AIA isn't restricted to operations, but can reach out across all of IT, including executives, service desk and ITSM teams, development and even non-IT business stakeholders. It is multi-use case and multi-stakeholder in value, as the same data mosaic may serve performance, security, change management, and DevOps requirements, while also supporting business stakeholders in areas such as customer experience and market planning.

My issue with "big data" is that, when it comes to AIA, taking big data by itself misses the point. While AIA often thrives on significant volumes of data across multiple domains, what's key to the more progressive AIA solutions is their power to interrelate and analyze data with a clear eye to meaningful outcomes. Genetically (taking the term metaphorically), I would argue that AIA is not primarily an outgrowth of business intelligence and big data pots, including Hadoop and NoSQL options like Cassandra. Rather, AIA grew out of advanced self-learning tools targeting far more finite data sources, such as time-series data directed at service performance outcomes, or even advanced event correlation.
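To make the "finite data sources" point concrete, here is a minimal, purely illustrative sketch (not any vendor's algorithm) of self-learning on time-series performance data: a trailing-baseline z-score that flags values deviating sharply from recent behavior, with no big data platform involved.

```python
from statistics import mean, stdev

def zscore_anomalies(series, window=20, threshold=3.0):
    """Flag points that deviate sharply from a trailing baseline.

    The baseline (mean and standard deviation) is relearned at every
    step from the preceding `window` samples, so the detector adapts
    as normal behavior drifts.
    """
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Steady, slightly jittery service latency with one genuine spike
latencies = [100.0, 102.0] * 15 + [400.0, 101.0]
print(zscore_anomalies(latencies))  # → [30]
```

The same trailing-window idea underlies far more sophisticated baselining, but even this toy version shows why a narrow, well-chosen data source can yield outcomes a raw data lake cannot.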

What made AIA distinctive early on was its ability to assimilate data from many different toolsets and create a common fabric of intelligence that crossed domain silos. These tools often had surprising options for predicting future outcomes and discovering patterns no one had thought to look for. They also faced political and social challenges from siloed IT communities reluctant to give up their own toolsets' preeminence, or even to share their data with others in IT. These benefits (and these political issues) persist even as AIA evolves to include many new options, including big data pots in some cases.

What we're witnessing now is, I believe, a great deal of industry confusion about how to bring advanced analytics to the IT community, aggravated inevitably by marketing hype and, sadly, by boxed-in categories from an analyst community wedded far too much to technology and far too little to use case. AIA is, in fact, especially challenging to categorize because it tends to support a diversity of use cases, making it less like a traditional market and more like an architectural revolution (or evolution) in next-generation business service management. Or, given current buzzword preeminence, let's make that digital service management. At least here the buzzword really does have some genuine meaning and value.

So I'd like to go back to what I believe are AIA's roots. These include tiered or blended capabilities to assimilate data from many different sources: whether from many different toolset investments (in recent research, respondents indicated 10-20 toolsets, integrated either directly or via an aggregated data store), or from a wide variety of data types, ranging from transactional data (including user and customer behaviors) to log files, packets and wire data, events, Excel spreadsheets, and unstructured data such as text and social media.
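As a purely hypothetical sketch of what "assimilating" such varied sources can mean in practice, the adapter pattern below maps one source type into a common record; the `Observation` fields and the `from_syslog` helper are illustrative names invented here, not from any product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical unified record; every source adapter emits this shape,
# so downstream analytics can cross domain silos.
@dataclass
class Observation:
    timestamp: datetime
    source: str    # e.g. "apm_tool", "syslog", "wire_capture"
    entity: str    # host, service, or user the record describes
    kind: str      # "metric", "event", "log", "transaction"
    payload: dict  # source-specific detail, kept for drill-down

def from_syslog(line: str) -> Observation:
    """Adapter for one source type; each toolset gets its own adapter."""
    ts, host, msg = line.split(" ", 2)
    return Observation(
        timestamp=datetime.fromisoformat(ts).replace(tzinfo=timezone.utc),
        source="syslog", entity=host, kind="log",
        payload={"message": msg},
    )

rec = from_syslog("2016-11-01T09:15:00 web-01 disk latency rising")
print(rec.entity, rec.kind)  # → web-01 log
```

The interesting engineering work in real AIA products lies in the adapters and in entity reconciliation (deciding that "web-01" in a log and "web01.corp" in an APM tool are the same thing), not in the storage layer.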

What also distinguishes AIA is a unique ability to link critical IT business service interdependencies, for both change and performance, in context with event, time-series, transaction and other data. While many of our research respondents sought interdependency mapping within the analytics solution itself, probably the most frequent linkage in real adoptions comes from the application discovery and dependency mapping (ADDM) arena, as well as from newer, more dynamic CMDBs and federated configuration management systems (CMSs).
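A hedged sketch of the kind of linkage described here: given a dependency map such as ADDM tooling might discover (the `deps` data below is invented for illustration), a few lines of code can answer which services are transitively impacted by a failing component, which is what lets an analytics layer put an event in business-service context.

```python
# Hypothetical service dependency map: each service lists what it calls.
deps = {
    "checkout-app": ["payment-svc", "inventory-svc"],
    "payment-svc": ["payment-db"],
    "inventory-svc": ["inventory-db"],
}

def impacted_by(component, deps):
    """Walk the map upward: which services depend, transitively,
    on the given component?"""
    upstream = set()
    changed = True
    while changed:  # iterate to a fixpoint over indirect dependents
        changed = False
        for svc, targets in deps.items():
            if svc not in upstream and (component in targets
                                        or upstream & set(targets)):
                upstream.add(svc)
                changed = True
    return upstream

print(sorted(impacted_by("payment-db", deps)))
# → ['checkout-app', 'payment-svc']
```

With that linkage in place, a database alert is no longer just an event; it is an event scoped to the business services it can actually disrupt.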

The net values of good AIA solutions include much faster time to value and far less administrative overhead than massive data lakes created virtually as an end in themselves. The ability to assimilate multiple "trusted sources" and discover new and unexpected values needn't require an investment in an army of white coats. It can be, in some cases at least, surprisingly dynamic and self-administering.

This AIA tidal wave is still new. Still a relatively small and distant rise in the information technology ocean. Yet there are already a growing number of AIA innovators with different directions and focus — from cloud, to integrated DevOps and change management, to user and customer and digital experience optimization.

I will be presenting a webinar on November 10, where I'll have a better chance to explain the values of tiered or blended AIA. And I'll be following up with new research to be completed in Q1 of next year: "Advanced IT Analytics Part II: Deployment Priorities and Lessons Learned." Hopefully the data will reinforce what I believe should be AIA's progress toward more effective advanced analytics for IT, and not a sudden dip into white-coated chaos. But then you never know; that's part of the appeal of doing research. If it's any good, it will always teach you something new.

