Q&A Part Two: IBM Talks About Predictive Analytics
January 31, 2012

In Part Two of APMdigest's exclusive interview, Matthew Ellis, IBM Vice President of Service Availability and Performance, discusses predictive analytics.

Click here to start with Part One of the Q&A with IBM VP Matthew Ellis.

APM: Why is predictive analytics gaining so much momentum recently, especially with respect to APM?

ME: Analytics is important to all phases of operations. In all areas of business it is axiomatic that more data enables better decisions, and operations and application management are no exceptions.

Just as important, however, is sorting that data to identify the critical context for decision makers to act on, and this is where analytics comes in.

IBM is investing in analytics very seriously, and from an operations management perspective, we apply analytics in three categories: Simplify Operations Management, Avoid Business Disruption, and Enable Optimization.

Simplify Operations Management is a class of analytics technology that enables our customers to do the work that they do today more easily. This includes historical analysis of data to recommend and establish dynamic thresholds, and trending of performance and capacity data to identify areas that may become bottlenecks based on historical behavior.
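To make the dynamic-threshold idea concrete, here is a minimal sketch. It is not IBM's implementation, and the metric, sampling scheme and parameters are illustrative assumptions: it learns an hour-of-day baseline from historical samples and flags values that sit well outside the expected range for that hour.

```python
# Hypothetical illustration of a dynamic threshold (not IBM's implementation):
# learn an hour-of-day baseline from historical samples, then flag values that
# stray too far above the expected range for that hour.
from collections import defaultdict
from statistics import mean, stdev

def learn_baseline(samples):
    """samples: list of (hour_of_day, value) tuples from history."""
    by_hour = defaultdict(list)
    for hour, value in samples:
        by_hour[hour].append(value)
    # Keep the mean and standard deviation for each hour with enough data.
    return {h: (mean(v), stdev(v)) for h, v in by_hour.items() if len(v) > 1}

def exceeds_dynamic_threshold(baseline, hour, value, k=3.0):
    """True if the value is more than k standard deviations above the hourly norm."""
    mu, sigma = baseline[hour]
    return value > mu + k * sigma

# Example: assumed CPU-utilization samples collected over previous weeks.
history = [(h, 40 + h + i * 0.1) for i in range(50) for h in range(24)]
baseline = learn_baseline(history)
print(exceeds_dynamic_threshold(baseline, hour=14, value=95.0))  # True
```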

Avoid Business Disruption is the key driver for the predictive analytics component. The goal is to identify environmental changes that indicate a significant shift in the behavior of an application or service as early as possible, and to bring this information to the attention of the operations management team so that problems can be identified and addressed before they ever impact a customer. We have identified emerging problems days before traditional management tools saw signs of trouble and, in some situations, discovered problems in unmonitored resources that were affecting the behavior of critical applications.

Enable Optimization is the ability to mine collected data across multiple dimensions, providing rich insight into services and applications and enabling their optimization. It is also known as business analytics.
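As a rough illustration of what mining data across multiple dimensions can look like in practice, the following hypothetical sketch (the field names and figures are made up) aggregates response-time records by application and datacenter to show where optimization effort would pay off.

```python
# Hypothetical sketch of multi-dimensional mining: aggregate response-time
# records by application and datacenter. Illustrative data only.
from collections import defaultdict

records = [
    {"app": "billing",  "dc": "us-east", "resp_ms": 180},
    {"app": "billing",  "dc": "eu-west", "resp_ms": 420},
    {"app": "checkout", "dc": "us-east", "resp_ms": 95},
    {"app": "checkout", "dc": "eu-west", "resp_ms": 110},
]

totals = defaultdict(lambda: [0, 0])   # (app, dc) -> [sum of ms, count]
for r in records:
    key = (r["app"], r["dc"])
    totals[key][0] += r["resp_ms"]
    totals[key][1] += 1

for (app, dc), (total, count) in sorted(totals.items()):
    print(f"{app:<9} {dc:<8} avg={total / count:.0f} ms")
```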

APM: What specific functionality should an organization look for in predictive analytics technology?

ME: At IBM, we believe there are three key capabilities that any analytics solution must have to provide maximum predictive capability:

1. Algorithms: Multivariate analytic techniques are critical to identifying emerging problems early, while every metric is still well within its normal range.

The key to this statistical approach is to monitor the relationships among important related metrics and raise an exception when those relationships change in significant ways. Any single metric displays a wide range of variability during a normal day, increasing and decreasing with changing workloads and with daily, weekly and seasonal patterns.

In general, however, related metrics will consistently follow the same pattern in a healthy system. Successfully identifying these relationships, and accurately determining when they diverge in an important way, is the key to accurate early identification of problems. A minimal sketch of this idea appears after this list.

Our algorithms are developed and refined by one of the largest private math departments in the world, the same organization that developed Watson to win at Jeopardy.

2. Scalability: Analytics solutions work better when they have more data upon which to base their conclusions. The IBM analytics solutions directly leverage proven data collection technologies that have been in use for most of a decade and have seen continual refinement. This technology has been proven to collect millions of data points per second and deliver that data to the analytics engine with very low latency, enabling real-time evaluation of very large data streams. We believe the data collection technology we are using is the most scalable and highest-performing in the industry.

3. Breadth of Monitored Resources: One of our design requirements was to deliver an easily extensible mediation capability allowing customers (or our services teams) to connect any data source to our data collection solution in a matter of hours or days.

During our pilots, we have worked with many products from non-IBM vendors, and our team has found that almost all data integration work can be done in a very short time without ever requiring a visit to the customer site, saving time and money while maximizing data availability for analysis.
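As a concrete illustration of the relationship-monitoring approach described under Algorithms above, here is a minimal sketch. It is not IBM's algorithm, and the metrics, training data and threshold are assumptions: it fits a linear relationship between two related metrics over a known-healthy period, then raises a flag when a new observation diverges from that relationship, even though each metric alone still looks normal.

```python
# Minimal sketch of relationship monitoring (not IBM's algorithm): learn a
# linear relationship between two related metrics during a known-healthy
# period, then raise an exception when new observations diverge from it,
# even if each metric is still inside its own normal range.
from statistics import mean, stdev

def fit_relationship(xs, ys):
    """Least-squares fit ys ~ a*xs + b over a healthy training window."""
    mx, my = mean(xs), mean(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    residuals = [y - (a * x + b) for x, y in zip(xs, ys)]
    return a, b, stdev(residuals)

def relationship_broken(model, x, y, k=4.0):
    """True if (x, y) sits more than k residual std-devs off the learned line."""
    a, b, sigma = model
    return abs(y - (a * x + b)) > k * sigma

# Healthy period: request rate and CPU utilization move together.
req_rate = [100, 200, 300, 400, 500, 600]
cpu_pct  = [11,  19,  31,  42,  49,  61]
model = fit_relationship(req_rate, cpu_pct)

# Later observation: both values look "normal" individually, but CPU is far
# higher than the workload predicts, an early sign of an emerging problem.
print(relationship_broken(model, x=250, y=55))   # True
```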

APM: How do you see Predictive Analytics evolving over the next few years?

ME: IBM expects that analytics tools, and the organizations that use them, will evolve rapidly over the next few years. IBM is investing heavily in providing highly scalable, flexible, and robust systems for identifying emerging problems as early as possible.

We expect analytics to evolve along multiple dimensions:

1. Improvements in analytics learning and in data exchange with existing application and service discovery, topology, and CMDB data, combining the strengths of traditional IT tools with analytics learning solutions. This will accelerate the statistical learning process and allow the learned relationships to be built back into the visible topology of the environment.

2. Application of analytics solutions to additional IT management domains, including Smarter Infrastructures, improved detection of security problems, asset management and maintenance scheduling, and other problem areas.

3. Further improvements in the feedback and integration of learning technologies, process optimization, and analytics in general with operations processes.

About Matthew Ellis

Matthew Ellis is the Vice President of Development for IBM Tivoli's Service Availability & Performance Management product portfolio. This product suite enables monitoring and modeling of the utilization, performance, capacity and energy use of distributed, mainframe and virtualized platforms and their associated application software. Ellis joined IBM in 2006 through the Micromuse acquisition, where he was the Vice President of Software Development.

Click here to read Part One of the Q&A with IBM VP Matthew Ellis.
