Information is Power, But Only If ...
July 28, 2016

Robin Lyon
AppEnsure


IT has access to an amazing amount of data. We often collect hundreds of data points on a single server: individual processor load, thread state, disk throughput both in and out, and so on. We then store this data and use it to create a metric called something like "server performance." When it comes time to provide reports (weekly, monthly and so on), IT assigns some poor person the job of collating all this information. That usually means running a report, importing it into a spreadsheet, combining various servers and metrics into some grouping, and calling it an application. Then some numbers are calculated and saved in the spreadsheet to create a performance-over-time graph. The same is done with database numbers, application performance, network statistics and so on. The process is then repeated up through levels of management, combining more numbers into a single figure that represents service performance for reporting to more senior levels of management.

Given that IT is all about automating processes, this has struck me as somewhat backwards.

Data Management and IT – Operational Intelligence

IT, by and large, is staffed by realists – the kind who don't respond well to marketing, want solutions, and have little time for repetition.

A second reality is that IT is a fledgling science. While it has a century under its belt, it has not yet developed niceties like the common taxonomy of biology; every company creates its own rankings and groupings of IT functions, and quite often a great deal of resources go into creating that custom taxonomy.

To add to the frustration of IT managers everywhere, different off-the-shelf applications present data in taxonomies coded specifically for those applications. It becomes more and more difficult to extract and combine data in a meaningful way.

An IT-friendly application should allow its user base to create rules for grouping data in reports. By grouping atomic bits of data, such as unused server capacity for a select set of servers, it can then report on the unused server capacity of an application. Using that application-level figure as a new data point, a well-designed application will allow another ad hoc grouping to provide information on an overall service.

This process of using groups to create other groups goes on as needed until the application is configured to match the taxonomy the company has designed. Instead of complex calculations each month, a one-time setup is created and automation is achieved.
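The group-of-groups idea above can be sketched in a few lines of code. This is a minimal illustration, not any particular product's implementation; the server names, metric values, and group definitions are all hypothetical.

```python
# Atomic data points: unused capacity (in percent) per server.
unused_capacity = {
    "web01": 40, "web02": 55, "db01": 20, "dns01": 70,
}

# Groups may contain servers or other groups, matching whatever
# taxonomy the company has designed.
groups = {
    "billing-app": ["web01", "web02", "db01"],
    "infra": ["dns01"],
    "customer-service": ["billing-app", "infra"],
}

def members(name):
    """Recursively expand a group name into its atomic servers."""
    if name not in groups:          # an atomic data point (a server)
        return [name]
    expanded = []
    for child in groups[name]:
        expanded.extend(members(child))
    return expanded

def avg_unused(name):
    """Average unused capacity across all servers in a group."""
    servers = members(name)
    return sum(unused_capacity[s] for s in servers) / len(servers)

print(avg_unused("billing-app"))       # rolls up the application's servers
print(avg_unused("customer-service"))  # rolls up both groups into a service
```

Once the group definitions are in place, the monthly "report" is just a function call – the one-time setup replaces the recurring spreadsheet exercise.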

By allowing individual data elements to be members of more than one group, we avoid a second common pitfall: how to account for shared resources, such as the time spent on DNS queries or a database server that backs multiple applications.
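The shared-resource case amounts to letting one element appear in several groups, so each report simply includes it. A small sketch, again with hypothetical names and values:

```python
# CPU load (in percent) per server; db01 hosts two applications.
cpu_load = {"web01": 30, "web02": 45, "db01": 80, "rpt01": 25}

# db01 is a member of both application groups rather than being
# forced into only one of them.
groups = {
    "billing-app": ["web01", "db01"],    # db01 appears here ...
    "reporting-app": ["rpt01", "db01"],  # ... and here
}

def avg_load(group):
    """Average CPU load across a group's members."""
    servers = groups[group]
    return sum(cpu_load[s] for s in servers) / len(servers)

print(avg_load("billing-app"))    # includes the shared db01
print(avg_load("reporting-app"))  # also includes db01
```

Each application's report reflects the shared server's contribution, so neither view has to pretend the database belongs exclusively to it.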

IT needs to save time, and its internal applications need to accept the reality of reporting against an ever-changing data set that is custom to each company that uses it.

Robin Lyon is Director of Analytics at AppEnsure.

