APMdigest asked experts across the industry — including analysts, consultants and vendors — for their opinions on the next steps for ITOA. These next steps include where the experts believe ITOA is headed, as well as where they think it should be headed. Part 2 covers visibility and data.
Start with Next Steps for ITOA - Part 1
We see IT operational analytics evolving into a real-time process. Today, the vast majority of ITOA platforms are post-facto solutions that analyze events and problems after they occur. They need to evolve into a true machine-learning-based, real-time, to-the-millisecond approach. Such an approach starts with and requires wire data as a source of information, which log analysis products do not possess. Modern data center infrastructure managers can't afford to merely react to problems. They need to predict and proactively take action in real time, which means the ITOA platform must have access to true real-time data.
CMO, Virtual Instruments
ITOA's next major evolution is the harnessing of real-time big data for performance management. Infrastructure, networks, and apps throw off massive volumes of relevant performance data, but ITOA has had no way to process and make use of it at high resolution. Meanwhile, big data technologies focused first on offline business intelligence problems, but are now increasingly applied to real-time, operational use cases. Big data ITOA platforms will unify key performance data sets in real time and give operators a comprehensive, high-resolution view of performance across their enterprise, with the data instantly at hand to solve even the toughest performance problems.
IT and Security teams are drowning in dashboards and alerts in an attempt to derive answers from a sea of data. Machine learning done right will take IT Operations Analytics to the next level by proactively detecting and surfacing issues that might affect availability or security. IT teams will start relying on machine learning as a form of intelligence augmentation, or IA. To make this transition successful, high-fidelity, real-time telemetry will become a must-have.
Co-Founder and CTO, ExtraHop
The adoption of big data principles by ITOA will follow a path similar to that of previous big data technologies, resulting in an IT operational data lake with a large analytics platform serving up intelligence, visibility tools, reporting and predictive analytics.
Trace3 Research 360 View Trend Report: IT Operations Monitoring & Analytics (ITOMA)
Data is the driving force behind analytics, serving a vital role in providing much-needed application assurance and insight into service delivery. One of the biggest problems IT faces is the large volume of unstructured data delivered at high velocity from a variety of disparate sources. This continuous tsunami of data does not translate into actionable insight, even when used with advanced analytics. Analytics needs to be powered by smart data that is well-structured, contextual, available in real time, and based on pervasive visibility across the entire enterprise. Since every action and transaction traverses the enterprise through traffic flows, or wire data, it is the best source of information from which to glean actionable insight in complex IT environments and to detect and investigate hidden threats faster and more accurately. When it comes to service assurance and cybersecurity, better analytics starts with smart data.
Senior Solutions Marketing Manager, NetScout
Reviewers of APM solutions on the IT Central Station platform highlight the features that enable users to take the next step in IT Operations Analytics, namely the ability to access instrumented data analytics at any point in time. In one review of an IT operational analytics solution, a user writes that what matters is drilling down into data from all the different systems in a minimal amount of time, without impacting server performance. According to IT Central Station reviewers, these systems should be easily readable, so that scenarios requiring problem solving are simple to identify and subsequently fix.
Founder and CEO, IT Central Station
Last year we saw significant progress in the ITOA space in blending and correlating multiple data sources. However, most ITOA solutions still require customers to slice and dice the outcomes of blended analysis to interpret them, or present these outcomes in a complex, specialized manner. I expect that this year ITOA technologies will expand their use of recent advances in machine learning to automate data interpretation. The result will be a generation of specific, easy-to-understand insights that can be utilized by Operations teams without significant training and investigation overhead. The complexity of the analytics will be hidden from users, who will simply read and act on automatically generated findings, guidelines and instructions presented in plain language.
Since ITOA tools combine data from multiple data sources into a single system, we will finally reach the goal of "one single dashboard" rather than siloed, disjointed reporting tools.
Kimberley Parsons Trommler
Product Evangelist, Paessler AG
We're seeing ITOA move from simply visualizing data to achieving complete observability of the application infrastructure. Visualizing data using charts and graphs on a dashboard is no longer sufficient for today's hyper-scale applications. So ITOA is moving toward using machine learning and artificial intelligence to understand the "normal" behavior of all data – potentially millions of metrics – and then immediately surface anomalies when they occur.
JF Huard, Ph.D.
Founder and CTO, Perspica
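The baseline-then-detect idea described above can be illustrated with a minimal sketch: learn a rolling baseline for one metric and flag values that deviate sharply from it. This is a deliberately simple rolling z-score, not any vendor's actual model; the class name, window size, and threshold are illustrative assumptions, and production ITOA platforms apply far more sophisticated learning across millions of metrics.

```python
from collections import deque
import math

class RollingAnomalyDetector:
    """Flags metric values that deviate sharply from a rolling baseline."""

    def __init__(self, window: int = 60, threshold: float = 3.0):
        self.window = deque(maxlen=window)  # recent history of the metric
        self.threshold = threshold          # how many std devs count as anomalous

    def observe(self, value: float) -> bool:
        """Return True if value looks anomalous vs. the learned baseline."""
        anomaly = False
        if len(self.window) >= 10:  # wait for a minimum of history
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomaly = True
        self.window.append(value)  # keep learning the baseline
        return anomaly

# Hypothetical latency readings in ms: steady around 50, then a spike.
detector = RollingAnomalyDetector(window=60, threshold=3.0)
readings = [50.0] * 30 + [51.0, 300.0]
flags = [detector.observe(r) for r in readings]
print(flags[-1])  # the 300 ms spike is flagged as anomalous
```

Real systems would replace the fixed threshold with learned seasonality and per-metric models, but the core loop – continuously learn "normal," then surface deviations – is the same.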
Read Next Steps for ITOA - Part 3, covering monitoring and user experience.