While organizations generally agree that ITIL-based process improvements are a "good" thing, executives often struggle to measure quantifiable benefits of investment in the framework. We have found that an effective approach to ITIL is characterized by a manageable yet scalable implementation, a focus on people and skills issues, and ongoing measurement that takes a comprehensive view of the enterprise-wide impact of process maturity.
ITIL initiatives are often initially confined to individual business units or infrastructure towers. The idea is to conduct a "pilot study" that can then be rolled out on a broader basis. In practice, however, such initiatives often lack a well-defined long-term plan for extending process improvement, as well as specific goals or milestones to track.
Another problem is that ITIL, by definition, requires an integrated, enterprise-wide approach – the underlying philosophy is to understand how problems in one specific area impact other areas and permeate the entire organization, and then to use that analysis to take corrective action. A narrow pilot study is therefore a poor choice for a proof of concept.
At the other extreme, organizations get overly ambitious and endeavor to roll out all ITIL processes concurrently across the entire enterprise without prior expectation setting and buy-in. When reality sets in and the complexity of implementing changes and gauging cause/effect impacts across a global organization becomes apparent, the initiative quickly loses momentum. This all-at-once approach not only fails to deliver benefits, it can actually result in a regression and decline in overall process maturity. As a result, the viability of the ITIL framework itself is often called into question.
Start Small, Think Big
An effective ITIL implementation aims to create a seamless organization, unrestricted by pockets, units, or departments, each with its own non-standard way of doing things.
As such, an ITIL initiative typically requires addressing four types of organizational silos:
- Geographic silos – still a reality for most IT organizations – that can hinder process consistency and communications.
- Group-based silos, comprised of multiple development and/or support teams, that provide redundant services that lead to higher costs, conflicts, and project delays.
- Technology silos that add complexity and inhibit change, as applications running on multiple distributed platforms require managing dependencies between the platforms and synchronizing work on each platform.
- Functional silos, put in place to address complexities related to project management, architecture, database administration, testing, and other areas, that can impede coordination and communication.
Top-performing businesses address the silo challenge by coupling focus and attention to detail with a holistic perspective and a long-term plan for extending ITIL across the enterprise. Specifically, an effective approach is characterized by a phased, process-by-process implementation that begins with one ITIL process – Change Management, for example – and extends from the IT organization across the business.
Once the initial process is rolled out, integrated, and assessed, a second ITIL process (Incident Management, perhaps) can be similarly rolled out, followed by a third (Release Management) and so forth.
Through this approach, repeatable leading practice processes and a supporting knowledge base can be established to facilitate communication between group members and across groups.
From this perspective, a pilot-based approach can be effective if implemented within the context of a long-term process improvement plan that includes ongoing measurement and communication of benefits. This makes it possible to combine a phased, step-by-step implementation with an ongoing assessment of the enterprise-wide impact of the changes.
A benchmark analysis of the operational environment can identify the downstream changes that result from specific process enhancements as ITIL maturity increases. Ideally, efficiency, productivity, and service availability are baselined before the ITIL initiative begins. This baseline can then be used to more accurately quantify improvements resulting from process changes.
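The baseline-then-measure idea can be sketched in a few lines of code. The metric names and values below are illustrative assumptions, not data from any real benchmark; the point is simply that each metric needs a direction (higher or lower is better) so that improvement against the baseline can be reported consistently.

```python
# Hypothetical sketch: comparing a pre-ITIL baseline against post-rollout
# measurements. All metric names and values here are invented for illustration.

def improvement(baseline: float, current: float, higher_is_better: bool = True) -> float:
    """Percentage change relative to the baseline, signed so that a
    positive result always means improvement."""
    change = (current - baseline) / baseline * 100
    return change if higher_is_better else -change

# Illustrative baseline captured before the first process (e.g. Change
# Management) is rolled out.
baseline = {"availability_pct": 99.1, "mttr_hours": 6.0, "failed_changes_pct": 12.0}
# Illustrative measurements taken after the phased rollout.
current = {"availability_pct": 99.5, "mttr_hours": 4.5, "failed_changes_pct": 8.0}

for metric, better_high in [("availability_pct", True),
                            ("mttr_hours", False),
                            ("failed_changes_pct", False)]:
    delta = improvement(baseline[metric], current[metric], better_high)
    print(f"{metric}: {delta:+.1f}% vs. baseline")
```

Tracking a small, stable set of directional metrics like this, captured before the initiative begins, is what allows later phases to attribute changes to process maturity rather than to noise.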
That said, no data exists to show a clear correlation between ITIL compliance and cost savings, and there’s no "ITIL maturity equals x% savings" formula. The ITIL framework is designed to improve quality and efficiency by enhancing an organization's ability to manage activity within the IT function and at the IT function's interface with the business. So, while total costs might not change, or while savings might not be measurable in concrete dollar terms, ITIL process improvement can allow IT to spend less time fighting fires and more time delivering value to the business by developing new applications and deploying new technologies.
In this context, a longer, big picture view of ITIL is most effective – one that recognizes that, ultimately, implementing rigor and discipline will deliver benefits to the business.
About Chris Pfauser and Cindy LaChapelle
Chris Pfauser is an ISG Principal Consultant with more than 20 years of experience in management consulting and operational improvement. He specializes in service management and process optimization and works with global organizations in a variety of industry sectors.
ISG Principal Consultant Cindy LaChapelle has over 25 years of industry experience. Her areas of expertise include sourcing strategy development, data and storage assessment and lifecycle management, and backup, recovery, and data protection strategies. Both Pfauser and LaChapelle hold ITIL v3 Foundation certifications.