Industry experts — from analysts and consultants to users and the top vendors — offer thoughtful, insightful, and often controversial predictions on how APM and related technologies will evolve and impact business in 2016. Part 3 focuses on IT Operations Analytics (ITOA).
ANALYTICS: MAKING BIG DATA ACTIONABLE
In 2016, how well you compete through software will depend on your ability to analyze that software to drive innovation. Effective software analysis will require a full spectrum of data covering business outcomes and customer experience, as well as applications and infrastructure. Companies will be shocked to learn the sheer volume of analytics data thrown off by their applications, and will ask hard questions about the most efficient way to harness all that data, specifically whether it should happen on-premises or in the cloud.
Sr. Director, Product Marketing, New Relic
2015 saw the continued growth, maturity, and adoption of infrastructure services from AWS, Azure, and others, along with technologies that enable us to store, analyze, and act on large volumes of data. In 2016, APM will see great strides in enabling users to tie all these sources of data together in more intelligent and actionable ways across different application inputs, even providing non-application owners greater access to that data with fewer technological constraints.
Analytics in APM will get more attention in 2016. The traditional monitoring elements such as alerts, dashboards, and transaction traces will still be the staple of APM. However, a goldmine of insight is available when those billions of metrics are crunched by a big-data engine. While APM vendors are introducing their own analytics solutions, the rise of other analytics tools makes this space very interesting and competitive.
Application Support Expert, www.karunsubramanian.com
Companies are increasingly basing their business decisions on insights gleaned from data. More companies are looking to launch big data projects that harvest data from their applications and contribute to real business insights. Anomaly detection capabilities powered by big data can help organizations get to the root cause of issues faster and improve operational efficiency. As more big data projects are launched, organizations will need detailed insights into how their big data applications and infrastructure are performing in real time for easy troubleshooting and optimization.
Senior Marketing Analyst, ManageEngine
We will continue to become data rich and information poor while we try to adapt to the rate of growth enabled by recent innovations and improvements. The volume of information is growing rapidly along with our technology deployments. The advent of microservices architectures, hybrid clouds, containers, multi-cloud application architectures, and more has been a massive gain in bringing us the next generation of applications. It also means a massively growing scale of performance and application data coming into the IT organization. The availability of telemetry in these rapidly growing platforms will drive the need to move beyond visualizing the state of the environment to assuring the performance and health of the environment using software. The good news is that apps are easier to deploy. The bad news is that they are more difficult to manage at human scale.
Principal Solutions Engineer and Technology Evangelist, VMTurbo
In the always-on, real-time world that IT now operates in, ITOA products need to be able to accept large amounts of streaming data and analyze it using advanced mathematical techniques, all in real time. The analytics will need to include both pattern and anomaly detection, as well as causal analysis, to help people spot issues quickly and guide them to the most likely cause. We believe this combination of data and powerful analytics in real-time context is key to realizing the future now.
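The kind of streaming anomaly detection described above can be illustrated with a minimal sketch: maintain running statistics over a metric stream (here via Welford's online algorithm, one common choice among many) and flag points that deviate sharply from the learned baseline. The class name, thresholds, and sample latencies are illustrative, not any vendor's actual implementation.

```python
from math import sqrt

class StreamingDetector:
    """Online anomaly detector: keeps a running mean/variance using
    Welford's algorithm and flags samples far from the baseline."""

    def __init__(self, threshold=3.0, warmup=30):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0               # running sum of squared deviations
        self.threshold = threshold  # z-score cutoff for "anomalous"
        self.warmup = warmup        # samples to observe before flagging

    def update(self, x):
        """Feed one sample; return True if it looks anomalous."""
        anomalous = False
        if self.n >= self.warmup:
            std = sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.threshold:
                anomalous = True
        # fold the new sample into the running statistics
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

# hypothetical latency stream: steady around 100-104 ms, then a spike
detector = StreamingDetector()
latencies = [100 + (i % 5) for i in range(50)] + [400]
flags = [detector.update(x) for x in latencies]
```

The point of the single-pass design is exactly what the quote calls for: each sample is scored against the baseline as it arrives, in constant memory, rather than batch-analyzed after the fact.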
ANALYTICS: THE RISE OF MACHINE LEARNING
Continuing the trends we saw in APM in 2015, leading vendors will gain additional traction by integrating and relating data back to the user and application. This will make siloed tools less relevant, since they lack user and application context. We'll continue to see many new ITOA vendors and capabilities trying to solve the context problem; some will fade, but more will be created. In 2016 we will start to see early advanced machine learning technologies, as opposed to the rule-driven and time-based algorithms that exist today.
VP of Market Development and Insights, AppDynamics
Today's APM tools collect data from various applications and infrastructure components and help correlate results. However, the last-mile decision making is still left to human intuition. As the role of IT changes in 2016, IT operations and DevOps experts will be expected to not just ensure application quality but also perform data-driven decision making. APM will be increasingly supplemented by analytics and machine learning – whether in product or via integrations – to support contextual decision making.
Product Marketing Manager, AlertSite, SmartBear
Humans will no longer be able to cope with the scale of and rate of change in their environments. The "eyes on glass" approach to traditional APM, where humans manually analyze data in charts and dashboards, will be replaced with insight derived by software analytics and machine learning algorithms that can automate the detection and analysis of anomalies. Any APM solution that relies on rules, thresholds, or configuration to detect anomalies will end up being ignored by IT Operations due to the amount of noise and false positives it will produce.
Chairman, VP of Product Marketing, Moogsoft
We're seeing the addition of machine learning and better data analysis capabilities to tools across the areas of APM, NPM, and infrastructure monitoring. I see a big opportunity to aggregate and analyze data across tools and hosts via self service metrics to surface an even greater degree of insights regardless of complexity or size of a given environment.
VP of Engineering, PagerDuty
In 2016, analytics will no longer be a nice-to-have, but an essential component for any IT operations team looking to identify the root cause of performance issues quickly and easily. Leveraging automated unsupervised machine learning, organizations can analyze millions of data points each minute to determine what normal activity is and what's anomalous. This means that only the right alerts are raised, rather than flooding IT with low priority activity or false positives. Further, behavioral analytics have the ability to grow with organizations by continuously adapting to new and existing patterns within huge volumes of data. As IT environments grow more complex over the coming year, organizations won't need a team of data scientists to keep applications and websites running smoothly, they'll need behavioral analytics.
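The noise-reduction idea above — learn what "normal" looks like per metric, then alert only on large deviations — can be sketched in a few lines. This is a toy illustration using a robust median/MAD baseline (one simple stand-in for the behavioral analytics the quote describes); the metric names, history windows, and cutoff are all invented for the example.

```python
import statistics

def learn_baseline(history):
    """Learn a per-metric notion of 'normal' as (median, MAD);
    robust statistics keep outliers in the window from skewing it."""
    med = statistics.median(history)
    mad = statistics.median(abs(x - med) for x in history) or 1e-9
    return med, mad

def score(value, baseline):
    """Robust z-score: how many MADs the value sits from normal."""
    med, mad = baseline
    return abs(value - med) / mad

# hypothetical recent history for three unrelated metrics
history = {
    "api.latency_ms": [120, 118, 125, 119, 121, 123, 120, 122],
    "db.connections": [40, 42, 41, 39, 40, 43, 41, 40],
    "cache.hit_rate": [0.95, 0.96, 0.94, 0.95, 0.97, 0.95, 0.96, 0.95],
}
baselines = {m: learn_baseline(v) for m, v in history.items()}

# current minute of data: only the genuinely abnormal metric alerts
current = {"api.latency_ms": 480, "db.connections": 41, "cache.hit_rate": 0.95}
alerts = {m: round(score(v, baselines[m]), 1)
          for m, v in current.items() if score(v, baselines[m]) > 6}
```

Only the latency spike clears the cutoff; the two in-range metrics stay silent, which is the "right alerts only" behavior the prediction argues for.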
VP of Products, Prelert
There were two major shifts in 2015 for APM and IT Operations Analytics: (1) we needed to adjust to new technologies like Docker and microservice architectures; and (2) relatedly, we saw methodologies emerge for how to collect and visualize data for such systems. My take is that 2016 will have a much stronger focus on analytical capabilities. We know it's no longer enough to collect and visualize data, and the need for more sophisticated and advanced analytics will drive innovation. For example, we more frequently see DevOps using log data for analysis beyond simple searching of events – more advanced analytical tasks, such as outlier identification and anomaly detection. The challenge, as always, is for vendors to provide these more advanced capabilities in a way that is consumable for the end user without a very steep learning curve, and this tends to be where most products fall down. Striking the right balance will be critical.
Senior Director, Log Management & Search, Rapid7
ANALYTICS: A NEW TYPE OF ITOA
Composable IT analytics will emerge as the most effective means of understanding application performance. This means application performance analysis and infrastructure performance analysis will be combined into a coherent time series to produce a new type of ITOA platform. The resulting composite view will lead to (a) faster resolution of performance problems; (b) more accurate use of resources for initial placement and provisioning; and (c) safer configuration changes that avoid creating performance conflicts.
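The "coherent time series" idea above amounts to aligning samples from separate monitors onto a shared time axis so application and infrastructure readings can be read side by side. A minimal sketch, with invented metric names and timestamps, might bucket samples by minute:

```python
from collections import defaultdict

def align(series_by_source, bucket_s=60):
    """Merge timestamped samples from multiple monitoring sources into
    one composite time series keyed by time bucket."""
    merged = defaultdict(dict)
    for source, samples in series_by_source.items():
        for ts, value in samples:
            # snap each sample to the start of its minute bucket
            merged[ts // bucket_s * bucket_s][source] = value
    return dict(sorted(merged.items()))

# hypothetical (unix_ts, value) samples from an app and an infra monitor
composite = align({
    "app.response_ms": [(1000, 210), (1065, 950)],
    "host.cpu_pct":    [(1010, 35), (1070, 97)],
})
```

Each bucket now holds both the application and infrastructure readings for that minute, so an analyst can see at a glance that the latency spike coincides with the CPU spike — the composite view the prediction describes.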
Co-Founder and VP of Strategy, CloudPhysics
ANALYTICS: CONVERGENCE OF APM AND BUSINESS INTELLIGENCE
In 2016, APM will become the pivotal solution that supports the entire value chain. From development, to operations, to application owners, to CXOs – the convergence of application management, infrastructure monitoring, data analytics, and web service APIs will spawn a new breed of APM solutions. These solutions will correlate not only IT data, but business data, market data, and social sentiment – bringing a very human perspective to data-driven decision making. There will be seamless movement between business-level dashboards, application topology and impact management, and infrastructure health and performance, all the way down to the log level, allowing all types of users to detect, anticipate, and prevent business-critical problems at every level.
President of the Performance and Availability Business, BMC Software
ANALYTICS: APM GROWTH VIA ANALYTICS
APM tools will no longer evolve by adding features to their monitoring stack. The available features will mature and will support new technologies. I expect big steps to be taken by vendors in the analytics space. The core APM product will reach a saturation point, and growth can only be achieved in analytics.
Online Performance Consultant and Founder of Blue Factory Internet
I expect APM products to improve in terms of how well they understand the applications and platforms being monitored. While the last couple of years have brought analytics capabilities into many APM products, customers are realizing that the amount of performance and business data is growing faster than their ability to manually interpret and analyze it. The best APM products will adapt and start offering automated insight that explains the most important factors impacting application health.
Co-founder and CEO, Plumbr