APMdigest asked experts across the industry — including analysts, consultants and vendors — for their opinions on the next steps for ITOA. These next steps include where the experts believe ITOA is headed, as well as where they think it should be headed. Part 2 covers visibility and data.
Start with Next Steps for ITOA - Part 1
REAL-TIME DATA
We see IT operational analytics evolving into a real-time process. Today, the vast majority of ITOA platforms are "post-facto" solutions, analyzing events and problems that occurred in the past. They need to evolve into a true machine-learning-based, real-time, to-the-millisecond approach. Such an approach starts with and requires wire data as a source of information, which log-analysis products do not possess. Modern data center infrastructure managers can't afford to merely react to problems. They need to predict and proactively take action in real time, which means giving the ITOA platform access to true real-time data.
Len Rosenthal
CMO, Virtual Instruments
ITOA's next major evolution is the harnessing of real-time big data for performance management. Infrastructure, networks, and apps throw off massive volumes of relevant performance data, but ITOA has had no way to process and make use of it at high resolution. Meanwhile, big data technologies focused first on offline business intelligence problems, but are now increasingly applied to real-time, operational use cases. Big data ITOA platforms will unify key performance data sets in real time and give operators a comprehensive, high-resolution view of performance across their enterprise, with the data instantly at hand to solve even the toughest performance problems.
Mark Sarbiewski
CMO, Kentik
IT and Security teams are drowning in dashboards and alerts in an attempt to derive answers from a sea of data. Machine learning done right will take IT Operations Analytics to the next level by proactively detecting and surfacing issues that might affect availability or security. IT teams will start relying on machine learning as a form of intelligence augmentation, or IA. To make this transition successful, high-fidelity, real-time telemetry will become a must-have.
Jesse Rothstein
Co-Founder and CTO, ExtraHop
BIG DATA
The adoption of big data principles by ITOA will follow a path similar to that of previous big data technologies, resulting in an IT operational data lake with a large analytics platform serving up intelligence, visibility tools, reporting and predictive analytics.
Trace3 Research 360 View Trend Report: IT Operations Monitoring & Analytics (ITOMA)
SMART DATA
Data is the driving force behind analytics, serving a vital role in providing much-needed application assurance and insight into service delivery. One of the biggest problems IT faces is the large volume of unstructured data delivered at high velocity from a variety of disparate sources. This continuous tsunami of data does not translate into actionable insight, even when used with advanced analytics. Analytics needs to be powered by smart data that is well-structured, contextual, available in real time, and based on pervasive visibility across the entire enterprise. Since every action and transaction traverses the enterprise through traffic flows, wire data is the best source from which to glean actionable insight in complex IT environments and to detect and investigate hidden threats faster and more accurately. When it comes to service assurance and cybersecurity, better analytics starts with smart data.
Ron Lifton
Senior Solutions Marketing Manager, NetScout
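The "smart data" idea above can be illustrated with a minimal sketch: raw, unstructured wire-data lines are parsed into structured records and enriched with service context before analytics ever sees them. The log format, the `SERVICE_MAP` lookup, and all field names here are hypothetical, chosen only to show the structuring-plus-context step.

```python
# A minimal sketch of turning raw wire-data lines into "smart data":
# structured, contextual, analytics-ready records.
# The input format and context map are illustrative assumptions.
from datetime import datetime, timezone

# Hypothetical context: maps an IP to the business service it serves.
SERVICE_MAP = {
    "10.0.1.5": {"service": "checkout", "tier": "web"},
    "10.0.2.9": {"service": "payments", "tier": "api"},
}

def to_smart_record(raw_line: str) -> dict:
    """Parse a 'timestamp src dst latency_ms' line and enrich with context."""
    ts, src, dst, latency = raw_line.split()
    record = {
        "time": datetime.fromtimestamp(float(ts), tz=timezone.utc).isoformat(),
        "src": src,
        "dst": dst,
        "latency_ms": float(latency),
    }
    # Attach service context so downstream analytics can group and
    # correlate by business service, not just by IP address.
    record.update(SERVICE_MAP.get(dst, {"service": "unknown", "tier": "unknown"}))
    return record

rec = to_smart_record("1700000000.5 10.0.1.5 10.0.2.9 42.7")
print(rec["service"], rec["latency_ms"])
```

The point of the enrichment step is that a downstream query like "p95 latency per service" becomes trivial, whereas against raw lines it would require joining context at query time.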
EASY ACCESS
Reviewers of APM solutions on the IT Central Station platform highlight the features that enable users to take the next step in IT Operations Analytics, namely the ability to access instrumented data analytics at any point in time. In one review of an IT operational analytics solution, a user writes that what's important is drilling down into data from all the different systems, in a minimal amount of time, without impacting performance on the servers. According to IT Central Station reviewers, these systems should be easily readable, so that scenarios requiring problem solving are simple to identify and subsequently fix.
Russell Rothstein
Founder and CEO, IT Central Station
Last year we saw significant progress in the ITOA space in blending and correlating multiple data sources. However, most ITOA solutions still require customers to slice and dice the outcomes of blended analysis to interpret them, or present these outcomes in a complex, specialized manner. I expect that this year ITOA technologies will expand their use of recent advances in machine learning to automate data interpretation. The result will be a generation of specific, easy-to-understand insights that Operations teams can utilize without significant training and investigation overhead. The complexity of the analytics will be hidden from users, who will simply read and act on automatically generated findings, guidelines and instructions presented in human language.
Sasha Gilenson
CEO, Evolven
UNIVERSAL DASHBOARD
Since ITOA tools combine data from multiple data sources into a single system, we will finally reach the goal of "one single dashboard" rather than siloed, disjointed reporting tools.
Kimberley Parsons Trommler
Product Evangelist, Paessler AG
COMPLETE OBSERVABILITY
We're seeing ITOA move from merely visualizing data to providing complete observability into the application infrastructure. Visualizing data using charts and graphs on a dashboard is no longer sufficient for today's hyper-scale applications. So ITOA is moving toward using machine learning and artificial intelligence to understand the "normal" behavior of all data – potentially millions of metrics – and then immediately surface anomalies when they occur.
JF Huard, Ph.D.
Founder and CTO, Perspica
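The "learn normal, then surface anomalies" approach described above can be sketched in a few lines: maintain a rolling baseline per metric and flag values that deviate sharply from it. Production ITOA platforms use far richer models per metric; the window size, z-score threshold, and class name here are illustrative assumptions only.

```python
# A minimal sketch of baseline learning plus anomaly surfacing:
# a rolling window defines "normal"; a z-score threshold flags outliers.
from collections import deque
from statistics import mean, stdev

class AnomalyDetector:
    def __init__(self, window: int = 30, threshold: float = 3.0):
        self.window = window        # number of samples that define "normal"
        self.threshold = threshold  # z-score beyond which we surface an alert
        self.history = deque(maxlen=window)

    def observe(self, value: float) -> bool:
        """Return True if value is anomalous versus the learned baseline."""
        anomalous = False
        if len(self.history) >= self.window:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.history.append(value)  # the baseline keeps adapting
        return anomalous

detector = AnomalyDetector(window=10)
# A steady latency baseline around 100 ms, then a sudden spike.
samples = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100, 500]
flags = [detector.observe(v) for v in samples]
print(flags[-1])  # only the spike is surfaced
```

At the scale the quote describes (millions of metrics), the key design constraint is that each detector carries only a small fixed-size state, so baselines can be maintained independently and in parallel.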
Read Next Steps for ITOA - Part 3, covering monitoring and user experience.