Industry experts — from analysts and consultants to users and the top vendors — offer thoughtful, insightful, and often controversial predictions on how APM and related technologies will evolve and impact business in 2016. Part 3 focuses on IT Operations Analytics (ITOA).
ANALYTICS: MAKING BIG DATA ACTIONABLE
In 2016, how well you compete through software will depend on your ability to analyze that software to drive innovation. Effective software analysis will require a full spectrum of data covering business outcomes and customer experience, as well as applications and infrastructure. Companies will be shocked by the sheer volume of analytics data thrown off by their applications, and will ask hard questions about the most efficient way to harness it all, specifically whether that should happen on-premises or in the cloud.
Sr. Director, Product Marketing, New Relic
2015 saw the continued growth, maturity, and adoption of infrastructure services from AWS, Azure, and others, along with technologies that enable us to store, analyze, and act on large volumes of data. In 2016, APM will see great strides in enabling users to tie all these sources of data together in more intelligent and actionable ways across different application inputs, even providing non-application owners greater access to that data with fewer technological constraints.
Analytics in APM will get more attention in 2016. Traditional monitoring elements such as alerts, dashboards, and transaction traces will still be the staple of APM. However, a goldmine of insight becomes available when those billions of metrics are crunched by a big data engine. While APM vendors are introducing their own analytics solutions, the rise of other analytics tools makes this movement very interesting and competitive.
Application Support Expert, www.karunsubramanian.com
Companies are increasingly basing their business decisions on insights gleaned from data. More companies are looking to launch big data projects that harvest data from their applications and contribute to real business insights. Anomaly detection capabilities powered by big data can help organizations get to the root cause of issues faster and improve operational efficiency. As more big data projects are launched, organizations will need detailed insights into how their big data applications and infrastructure are performing in real time for easy troubleshooting and optimization.
Senior Marketing Analyst, ManageEngine
We will continue to become data rich and information poor as we try to adapt to the rate of growth enabled by recent innovations and improvements. The volume of information is increasing rapidly along with our technology deployments. The advent of microservices architectures, hybrid clouds, containers, multi-cloud application architectures, and more has been a massive gain in bringing us the next generation of applications. It also means a massively growing scale of performance and application data coming into the IT organization. The availability of telemetry in these rapidly growing platforms will drive the need to move beyond visualizing the state of the environment to assuring its performance and health using software. The good news is that apps are easier to deploy. The bad news is that they are more difficult to manage at human scale.
Principal Solutions Engineer and Technology Evangelist, VMTurbo
In the always-on, real-time world that IT now operates in, ITOA products need to be able to accept large amounts of streaming data and analyze it using advanced mathematical techniques, all in real time. The analytics will need to include both pattern and anomaly detection and causal analysis, to help people spot issues quickly and guide them to the most likely cause. We believe this combination of data and powerful analytics in a real-time context is key to realizing the future now.
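To make the streaming-analytics idea concrete, here is a minimal sketch of one common approach: an exponentially weighted moving average and variance that scores each new sample as it arrives, using constant memory per metric. The class name, parameters, and thresholds are illustrative assumptions, not any vendor's actual algorithm.

```python
# Sketch: real-time anomaly detection on a metric stream using an
# exponentially weighted moving average (EWMA) and variance.
# All names and tuning values here are illustrative assumptions.

class StreamingDetector:
    def __init__(self, alpha=0.1, threshold=4.0, warmup=30):
        self.alpha = alpha          # smoothing factor: higher adapts faster
        self.threshold = threshold  # flag points this many deviations away
        self.warmup = warmup        # don't flag until estimates stabilize
        self.mean = None
        self.var = 0.0
        self.n = 0

    def update(self, value):
        """Consume one sample; return True if it looks anomalous."""
        self.n += 1
        if self.mean is None:       # first sample seeds the baseline
            self.mean = value
            return False
        deviation = value - self.mean
        std = self.var ** 0.5
        anomalous = (self.n > self.warmup and std > 0
                     and abs(deviation) > self.threshold * std)
        # Update the running estimates (standard EWMA recurrences).
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        return anomalous

detector = StreamingDetector()
steady = [100 + (i % 5) for i in range(200)]      # ordinary jitter
flags = [detector.update(v) for v in steady]      # no false alarms expected
spike_flag = detector.update(500)                 # sudden spike is flagged
```

Because the baseline adapts continuously, the detector follows gradual drift in a metric while still reacting to abrupt changes, which is the essence of the real-time analysis described above.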
ANALYTICS: THE RISE OF MACHINE LEARNING
Continuing the trends we saw in APM in 2015, leading vendors will gain additional traction by integrating and relating data back to the user and application. This will leave siloed tools less relevant due to their lack of user and application context. We'll continue to see many new ITOA vendors and capabilities trying to solve the context problem; some will fade, but more will be created. In 2016 we will start seeing early advanced machine learning technologies, versus the rule-driven and time-based algorithms that exist today.
VP of Market Development and Insights, AppDynamics
Today's APM tools collect data from various applications and infrastructure components and help correlate the results. However, the last mile of decision making is still left to human intuition. As the role of IT changes in 2016, IT operations and DevOps experts will be expected not just to ensure application quality but also to perform data-driven decision making. APM will be increasingly supplemented by analytics and machine learning – whether in product or via integrations – to support contextual decision making.
Product Marketing Manager, AlertSite, SmartBear
Humans will no longer be able to cope with the scale of, and rate of change in, their environments. The "eyes on glass" approach to traditional APM, where humans manually analyze data in charts and dashboards, will be replaced with insight derived by software analytics and machine learning algorithms that can automate the detection and analysis of anomalies. Any APM solution that relies on rules, thresholds, or configuration to detect anomalies will end up being ignored by IT Operations due to the amount of noise and false positives it will produce.
Chairman, VP of Product Marketing, Moogsoft
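The noise problem with static thresholds is easy to demonstrate. The sketch below, on synthetic data with an assumed daily traffic cycle, compares a fixed limit against a baseline learned per hour of day; every name and number is illustrative, not taken from any product.

```python
# Sketch: why static thresholds are noisy on seasonal metrics, and how a
# learned per-hour baseline suppresses false positives. Synthetic data;
# the threshold and deviation values are illustrative assumptions.

import math

def traffic(hour):
    """Synthetic requests/min with a daily cycle peaking mid-day."""
    return 1000 + 800 * math.sin(math.pi * (hour % 24) / 24)

week = [traffic(h) for h in range(24 * 7)]   # a normal week, no incidents

# Static rule: alert whenever traffic exceeds a fixed limit.
STATIC_LIMIT = 1500
static_alerts = sum(1 for v in week if v > STATIC_LIMIT)

# Learned baseline: expected value for each hour of day, from history;
# alert only on large deviations from that expectation.
baseline = {h: sum(week[d * 24 + h] for d in range(7)) / 7
            for h in range(24)}
adaptive_alerts = sum(
    1 for i, v in enumerate(week) if abs(v - baseline[i % 24]) > 200
)
```

On this incident-free week the fixed limit fires every afternoon simply because traffic peaks, while the per-hour baseline raises no alerts at all, which is exactly the noise-versus-signal gap the quote describes.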
We're seeing the addition of machine learning and better data analysis capabilities to tools across APM, NPM, and infrastructure monitoring. I see a big opportunity to aggregate and analyze data across tools and hosts via self-service metrics to surface an even greater degree of insight, regardless of the complexity or size of a given environment.
VP of Engineering, PagerDuty
In 2016, analytics will no longer be a nice-to-have, but an essential component for any IT operations team looking to identify the root cause of performance issues quickly and easily. Leveraging automated unsupervised machine learning, organizations can analyze millions of data points each minute to determine what normal activity is and what's anomalous. This means that only the right alerts are raised, rather than flooding IT with low priority activity or false positives. Further, behavioral analytics have the ability to grow with organizations by continuously adapting to new and existing patterns within huge volumes of data. As IT environments grow more complex over the coming year, organizations won't need a team of data scientists to keep applications and websites running smoothly, they'll need behavioral analytics.
VP of Products, Prelert
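Learning "what normal activity is" from unlabeled data can be sketched very simply with robust statistics: fit a median and median absolute deviation to history, then score new observations against them. This is a hedged, stdlib-only illustration of the unsupervised idea; real behavioral-analytics products use far more sophisticated models.

```python
# Sketch: unsupervised "learn normal, flag abnormal" using a robust
# baseline (median + median absolute deviation). Illustrative only;
# the data, helper names, and k=6.0 cutoff are assumptions.

import statistics

def fit_baseline(history):
    """Learn what 'normal' looks like from unlabeled historical values."""
    med = statistics.median(history)
    mad = statistics.median(abs(v - med) for v in history) or 1.0
    return med, mad

def is_anomalous(value, baseline, k=6.0):
    """Flag values more than k robust deviations from the median."""
    med, mad = baseline
    return abs(value - med) / mad > k

history = [120, 118, 125, 130, 122, 119, 127, 121, 124, 126]
baseline = fit_baseline(history)
normal = is_anomalous(123, baseline)    # typical value, not flagged
spike = is_anomalous(900, baseline)     # clear outlier, flagged
```

No labels or configured thresholds are needed: the notion of "normal" comes entirely from the data, and the baseline can be refit continuously as new data arrives, mirroring the continuous adaptation described above.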
There were two major shifts in 2015 for APM and IT Operations Analytics: (1) we needed to adjust to new technologies like Docker and microservice architectures; and (2) relatedly, we saw methodologies emerge for collecting and visualizing data from such systems. My take is that 2016 will have a much stronger focus on analytical capabilities. We know it's no longer enough to collect and visualize data, and the need for more sophisticated and advanced analytics will drive innovation. For example, we more frequently see DevOps teams using log data for analysis beyond simple searching of events – more advanced analytical tasks such as outlier identification and anomaly detection. The challenge, as always, is for vendors to provide these capabilities in a way that is consumable for the end user without a very steep learning curve, and this tends to be where most products fall down. Striking the right balance will be critical.
Senior Director, Log Management & Search, Rapid7
ANALYTICS: A NEW TYPE OF ITOA
Composable IT analytics will emerge as the most effective means of understanding application performance. This means application performance analysis and infrastructure performance analysis will be combined into a coherent time series to produce a new type of ITOA platform. The resulting composite view will lead to (a) faster resolution of performance problems; (b) more accurate use of resources for initial placement and provisioning; and (c) safer configuration changes that avoid the creation of performance conflicts.
Co-Founder and VP of Strategy, CloudPhysics
ANALYTICS: CONVERGENCE OF APM AND BUSINESS INTELLIGENCE
In 2016, APM will become the pivotal solution that supports the entire value chain. From development, to operations, to application owners, to CXOs – the convergence of application management, infrastructure monitoring, data analytics, and web service APIs will spawn a new breed of APM solutions. These solutions will correlate not only IT data, but business data, market data, and social sentiment – bringing a very human perspective to data-driven decision making. There will be seamless movement between business-level dashboards, application topology and impact management, and infrastructure health and performance, all the way down to the log level, allowing all types of users to detect, anticipate, and prevent business-critical problems at every level.
President of the Performance and Availability Business, BMC Software
ANALYTICS: APM GROWTH VIA ANALYTICS
APM tools will no longer evolve by adding features to their monitoring stacks. The available features will mature and will support new technologies. I expect big steps to be taken by vendors in the analytics space. The core APM product will reach a saturation point, and growth can only be achieved in analytics.
Online Performance Consultant and Founder of Blue Factory Internet
I expect APM products to improve in terms of how well they understand the applications and platforms that are being monitored. While the last couple of years have brought analytics capabilities into many APM products, customers are realizing that the amount of performance and business data is growing faster than their ability to manually interpret and analyze it. The best APM products will adapt, and start offering automated insight that explains the most important factors which impact applications' health.
Co-founder and CEO, Plumbr