IT Operations Analytics Can Provide Eye-Opening Insights
July 26, 2013

Diego Lomanto

There is no denying the role of application monitoring in today's application-centric marketplace. Applications are everywhere, and most businesses rely on them to operate, compete, generate revenue, and manage customers. It's no surprise, then, that monitoring those applications is rapidly becoming a top priority in IT management circles, for enterprises and small-to-midsize businesses (SMBs) alike.

Modern vendors and sophisticated technology have paved the way for application performance monitoring (APM) software that is easy to use, fast to deploy, and simple to understand. Increasingly, though, progressive businesses are looking to do more than simply monitor their applications. They want to optimize those applications and drive business value using the abundance of data that APM platforms generate. This is where the worlds of APM and Big Data collide.

Complexity is the New Normal

The spiraling complexity of application environments and IT architectures is making it harder than ever to collect and analyze reams of technical (i.e., application) data and turn them into operational improvements and business momentum. Consider a few of the questions IT teams face:

How can I reduce my costs by learning which of my servers is under-utilized?

What browsers and devices are customers using to access my application and what are the implications?

What are the most common transactions (e.g., checkout, product overview) in my application, and how can I leverage that knowledge to my advantage?
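To make the first question concrete, here is a minimal sketch of flagging under-utilized servers from raw CPU samples, the kind of analysis an ITOA platform runs continuously. The server names, sample values, and the 20% threshold are all illustrative assumptions, not from the article:

```python
from statistics import mean

# Hypothetical CPU utilization samples (percent) per server.
cpu_samples = {
    "web-01": [72, 65, 80, 77],
    "web-02": [12, 9, 15, 11],
    "db-01":  [55, 61, 58, 60],
}

UNDERUTILIZED_THRESHOLD = 20  # percent average CPU (illustrative)

def underutilized(samples, threshold=UNDERUTILIZED_THRESHOLD):
    """Return servers whose average utilization sits below the threshold."""
    return [name for name, vals in samples.items()
            if mean(vals) < threshold]

print(underutilized(cpu_samples))  # ['web-02'] -> a consolidation candidate
```

A real deployment would pull these samples from monitoring agents and weigh memory, I/O, and time-of-day patterns as well; the point is that the cost question reduces to a simple aggregation once the data is collected in one place.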

These are the types of questions that application owners and IT operations managers don't have time to answer because they're too busy monitoring the performance and availability of their applications (a full-time job in its own right).

More than that, though, they lack the tools and technology to analyze such a mass of data and generate meaningful business insights. These insights traditionally take weeks to uncover and are usually only extractable by expensive, seasoned data scientists.

Enter ITOA

The advent of IT Operations Analytics (ITOA), however, is finally allowing IT managers, operations personnel and business executives to benefit from sophisticated analytics capabilities.

In a recent session by Gartner analyst Will Cappelli, titled "IT Operations Analytics: Big Data for the Data Center," Cappelli suggested that Global 2000 enterprises will likely leverage ITOA platforms as the centerpiece of their next-generation performance and availability monitoring architectures.

Gartner even predicts that 60% of Global 2000 enterprises will consider ITOA a higher priority in 2015, up from 20% in 2012.

Coupling ITOA with modern APM can produce compelling insights about application usage that can be used to fuel a slew of operational efficiencies. Let’s look at an example.

What if your ITOA platform could analyze your application's transaction data on the fly and rank the most common transactions? ITOA platforms not only tell you which transactions are most common, but how often they're executing, how long they're taking, and how many errors are occurring. That ability to seamlessly couple analytics and application data is incredibly alluring to IT teams who need quick, simple insights to act on.
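The ranking described above boils down to grouping transaction records by name and aggregating count, duration, and errors. A minimal sketch, assuming a flat list of records (the transaction names, field names, and values are hypothetical; a real ITOA platform does this continuously at far larger scale):

```python
from collections import defaultdict

# Hypothetical transaction records, as an APM agent might emit them.
transactions = [
    {"name": "checkout", "duration_ms": 420, "error": False},
    {"name": "checkout", "duration_ms": 510, "error": True},
    {"name": "product-overview", "duration_ms": 95, "error": False},
    {"name": "checkout", "duration_ms": 380, "error": False},
    {"name": "login", "duration_ms": 150, "error": False},
]

# Aggregate execution count, total duration, and error count per transaction.
stats = defaultdict(lambda: {"count": 0, "total_ms": 0, "errors": 0})
for t in transactions:
    s = stats[t["name"]]
    s["count"] += 1
    s["total_ms"] += t["duration_ms"]
    s["errors"] += t["error"]

# Rank by execution count, reporting average duration and total errors.
for name, s in sorted(stats.items(), key=lambda kv: -kv[1]["count"]):
    avg = s["total_ms"] / s["count"]
    print(f"{name}: {s['count']} runs, avg {avg:.0f} ms, {s['errors']} errors")
```

Running this ranks "checkout" first with three executions, one of them an error, which is exactly the kind of at-a-glance view the paragraph describes.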

How else can ITOA simplify the lives of IT managers looking to drive quick and impactful business improvements? How about using browser and device analytics to better understand, and improve, customer interactions with your application?

Suppose 37.4% of your application's users are on Chrome, and users accessing it via IE 8 experienced a total of 16 errors. Maybe it's time to re-examine the application's compatibility with IE 8. Before the advent of ITOA, this data would have taken days to synthesize and understand; now it's surfaced automatically by ITOA platforms. It's easy to see why IT operations managers seeking to better understand how their applications affect their business, end users, and infrastructure are bullish on ITOA's potential.
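A minimal sketch of that browser breakdown, computed from raw page-view records. All of the data here is illustrative, contrived only so the totals line up with the figures in the example above:

```python
from collections import Counter

# Hypothetical page-view records (1,000 total), each tagged with the
# browser that served it and whether the view hit an error.
page_views = (
    [{"browser": "Chrome", "error": False}] * 374
    + [{"browser": "IE 8", "error": False}] * 184
    + [{"browser": "IE 8", "error": True}] * 16
    + [{"browser": "Firefox", "error": False}] * 426
)

share = Counter(v["browser"] for v in page_views)           # views per browser
errors = Counter(v["browser"] for v in page_views if v["error"])
total = sum(share.values())

for browser, n in share.most_common():
    print(f"{browser}: {100 * n / total:.1f}% of traffic, "
          f"{errors[browser]} errors")
```

Two `Counter` passes are enough to recover both the traffic share (37.4% Chrome) and the error hot spot (16 errors on IE 8) that the paragraph cites.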

According to Gartner, spend on ITOA amounted to approximately $350 million in 2011, and the firm anticipates that Global 2000 enterprises will have spent $720 million on ITOA licenses and first-year maintenance in 2012, plus an additional $30 million on managed services or software as a service (SaaS).

More interestingly, Gartner expects that amount to grow by an average of 15% a year for the next five years (although the mix of on-premises to service spend will likely shift dramatically in favor of services over the same period).
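To put that projection in perspective, compounding the roughly $750 million 2012 baseline ($720M in licenses and maintenance plus $30M in services) at 15% per year implies the market would double in about five years. A quick worked calculation (the year-by-year figures are arithmetic extrapolation, not Gartner's own forecasts):

```python
# 2012 ITOA spend baseline, in $M, per the figures cited above.
spend = 720 + 30

# Compound 15% annual growth over the following five years.
for year in range(2013, 2018):
    spend *= 1.15
    print(f"{year}: ~${spend:,.0f}M")
```

At 15% a year, $750M grows to roughly $1.5 billion by 2017, which is what makes the category attractive to vendors despite its youth.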

Businesses that have already implemented some form of ITOA technology report significant cuts in mean time to repair (MTTR), fewer incidents and less downtime, and smoother, error-free releases.

As a key component of the next generation of APM, ITOA has the potential to fundamentally change how IT departments operate. There is no denying that the combination of APM and ITOA, approached correctly, could prove enormously beneficial, allowing businesses to extract value from techniques such as statistical pattern-based analysis, event correlation, heuristics-based analytics, and log analysis.

Is ITOA the be-all and end-all? Not even close. The consensus among distinguished APM experts seems to be that ITOA will serve as a complement to, not a replacement for, conventional APM solutions. Gartner asserts that "analytics will play a larger part in future APM solutions as data volumes and complexity increases, but enterprises will realize that analytics engines are far more effective when set to work on data gathered and organized by more traditional APM technologies."

The stage is set, and the future of ITOA is bright. How will it all play out? No one knows for sure, but we'll all certainly be watching.

Diego Lomanto is Vice President of Marketing for OpTier.