Today’s market environment demands that businesses change and adapt rapidly to market dynamics while still remaining in control. For businesses, these dynamics can mean sifting through petabytes of data in order to act both tactically and strategically.
Business Intelligence (BI) analytics tools help companies catch what would otherwise be missed opportunities, using robust infrastructure to sift through mountains of data and applying intelligent analytics. In this way, businesses can identify hidden trends, customer relationships, buying behavior, operational and financial patterns, business opportunities and other vital information, allowing them to participate in the market proactively.
Through BSM initiatives, IT is charged with supporting the changing demands of the business, maintaining availability and ensuring that performance remains high. As on the business side, the IT landscape has grown in complexity, supporting a wider and still-growing range of technologies and platforms (virtualization, cloud, open source, etc.) and accelerated application release schedules. IT now faces near-overwhelming quantities of information.
So while the business progresses via BI, adopting analytics for management decisions, the organization supporting this infrastructure, IT Operations, has ironically adhered to an older, static, process-driven paradigm. By not applying an analytics-based approach, as the business has, to its own operations, IT jeopardizes system stability, ultimately exposing the business to the risk of devastating consequences.
Mountains of Data
Mountains of dynamic information confront IT. One of the most prominent examples is the cloud. Self-service provisioning has multiplied the number of activities occurring outside of static processes. These new provisioning capabilities lie beyond IT management's reach, leaving IT with limited visibility into what happens there. For example, an organization sets up a private cloud with a dynamic management system, allowing self-service provisioning of servers for the testing team. Traditionally, testing professionals would have come to IT to request an environment, and IT would oversee and manage the entire process. Now the process is independent: when the testing team needs an environment, it simply creates one.
Today’s Approach: Static Processes Drive IT
IT Operations has been running on static processes and strict workflows. For instance, ITIL defines a Change Management process that works through a prescribed series of steps. There is also a set of metrics for measuring performance, such as the number of changes that succeeded or failed.
IT Ops can plan as much as possible, but planning alone won't ensure that everything occurs as planned.
For example, IT implements an application upgrade and makes changes to the environment. IT administration can follow the entire established process, and still the application doesn't function as planned. IT managers verify the steps the upgrade went through, yet performance still lags. Then they need to go into the fine, granular details and examine every step: identifying the makeup of even minor changes, seeing how each was deployed to all the servers, checking the consistency between servers, and determining whether there has been additional interference with them. They need to take this enormous amount of data, configuration and granular changes alike, and pinpoint the root cause.
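The cross-server comparison described above can be sketched as a simple configuration diff against a known-good baseline. This is only an illustration, not any specific product's method; the server names, configuration keys, and values are hypothetical.

```python
# Minimal sketch of configuration-drift detection across servers.
# All server names, config keys, and values below are hypothetical.

def find_drift(baseline: dict, servers: dict) -> dict:
    """Return, per server, the settings that differ from the baseline."""
    drift = {}
    for name, config in servers.items():
        diffs = {}
        for key in set(baseline) | set(config):
            expected = baseline.get(key, "<missing>")
            actual = config.get(key, "<missing>")
            if expected != actual:
                diffs[key] = {"expected": expected, "actual": actual}
        if diffs:
            drift[name] = diffs
    return drift

baseline = {"app_version": "2.4.1", "heap_mb": 2048, "log_level": "WARN"}
servers = {
    "app01": {"app_version": "2.4.1", "heap_mb": 2048, "log_level": "WARN"},
    "app02": {"app_version": "2.4.0", "heap_mb": 2048, "log_level": "WARN"},   # missed the upgrade
    "app03": {"app_version": "2.4.1", "heap_mb": 1024, "log_level": "DEBUG"},  # manual interference
}

for server, diffs in find_drift(baseline, servers).items():
    for key, d in diffs.items():
        print(f"{server}: {key} expected {d['expected']!r}, found {d['actual']!r}")
```

In a real environment the "configurations" would span OS settings, application parameters, installed packages and more, but the principle is the same: make the granular state of every server comparable, so inconsistencies surface instead of hiding in the noise.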
Workflow-driven Management Processes
Static processes operate through workflows. The workflow supports only part of the process; a great deal happens around and outside it. Business demands can force shortcuts: steps in the workflow can be skipped in order to get immediate approval, even omitting the test stage.
Workflows Create False Security
Even when processes are enforced, such as requiring registrations as part of workflow management, this creates the belief that everything has been solved. Yet no organization can claim it operates completely within the bounds of established processes and approvals.
This situation creates a false sense of security that IT is on top of all the changes. IT Ops can believe everything works perfectly because the organization religiously adheres to its processes, relying on CMDB systems and workflows, when in fact that reliance ultimately undermines operations.
A Paradigm Shift to Analytics-Driven Management
Neurologists will explain that the brain has two distinct hemispheres. The right side of the brain collects information, while the left side is cognitive and analyzes this information, translating all of the sensory input into usable data.
This is essentially the model for today's IT organization, where operations needs to know what is happening now. IT Ops can find itself stuck trying to adjust static processes while tracking and handling dynamic events, then getting caught off-guard when issues arise. The solution is to approach the situation with dynamic analytics: deal with all of the changing data and see what is really happening. This goes beyond the few designated indicators that were traditionally watched. IT Ops needs analytics-driven management, just as the business adopted BI, extracting actionable information out of mountains of data to help decision makers respond efficiently.
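As a toy illustration of looking beyond a few fixed indicators, consider flagging time windows whose event volume deviates sharply from the historical norm. The z-score approach and the per-hour change-event counts below are illustrative assumptions, not a description of any particular tool.

```python
import statistics

def anomalous_windows(counts, threshold=3.0):
    """Return indices of windows whose event count deviates more than
    `threshold` standard deviations from the mean of the series."""
    mean = statistics.fmean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:  # perfectly flat series: nothing is anomalous
        return []
    return [i for i, c in enumerate(counts) if abs(c - mean) / stdev > threshold]

# Hypothetical change events per hour; hour 5 shows an unexplained spike.
counts = [12, 11, 13, 12, 10, 95, 12, 11, 13, 12, 11, 12]
print(anomalous_windows(counts))  # → [5]
```

A static process would only notice hour 5 if someone had thought to set a threshold on that exact metric in advance; an analytics-driven approach learns the norm from the data itself and surfaces the deviation automatically.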
About Sasha Gilenson
Sasha Gilenson is Founder and CEO of Evolven. Prior to founding Evolven in 2007, he spent 13 years with Mercury Interactive (acquired by HP), managing the QA organization and participating in establishing Mercury Interactive's Software as a Service (SaaS). Sasha played a key role in the development of Mercury Interactive's worldwide Business Technology Optimization (BTO) strategy and drove field operations of the Wireless Business Unit, all while taking on the duties of Mercury Interactive's top "guru" in the quality processes and IT practices domain.