Will Cloud Eventually Make ITIL Obsolete? (Or will Cloud help to Redefine ITIL to make it even more relevant?)
The IT industry would appear to be divided in 2013 about the future of ITIL — the IT Infrastructure Library — with its best practices guidelines for service management, especially vis-à-vis cloud computing.
One of the reasons for this is the very dynamic nature of cloud, in which infrastructure can be provisioned dynamically through automation rather than through more static change reviews. Some in the industry have even argued that cloud requires an entirely new way of approaching the provisioning and management of services — leaving ITIL and its derived concepts, like the Configuration Management Database (CMDB), in the dust.
However, the reality is that many in the industry strongly support ITIL and its concepts and even see them as areas for growth.
So what is the “truth”?
This Q&A with EMA VP Dennis Drogseth and EMA Director Torsten Volk leverages recent EMA research such as Service Desk in the Age of Cloud and Agile and Demystifying Cloud as well as prior EMA research on cloud adoptions as they impact strategic management requirements.
APM: If “ITIL” is about process, and cloud is about new technology options, then why is this even a question?
DD: One way of rephrasing the question is, “Does cloud computing impact IT processes?” And the answer almost certainly is “yes.” The questions then become “how?” and “to what degree can ITIL embrace and even help to accelerate these needed changes?”
TV: Cloud radically changes the way IT services are delivered. While this does not decrease the importance of ITIL in general, it certainly requires organizations to rethink how they have to adjust their individual ITIL implementation.
APM: To what degree are the people who care about ITIL and the people more focused on cloud separate groups? To what degree do they overlap?
DD: ITIL has evolved with roots in the service desk, and cloud with its roots in the data center. And our current research — spanning these two distinct groups — clearly shows differences in opinion.
For instance, in service desk-related answers, 47% viewed ITIL as critical or very important, and only 5% viewed ITIL as unimportant. Moreover, 40% viewed ITIL as becoming more important, while only 6% viewed ITIL as becoming less important. However, it should be pointed out that more than 75% of these same respondents were already aware of meaningful impacts from cloud. So they are in no way immune to cloud or its impacts.
Moreover, prior research from 2011 (Operationalizing Cloud) specifically targeting both cloud and service management in Operations indicated that ITIL would become more important by a two-to-one margin.
TV: Our recent Demystifying Cloud study polled cloud decision makers for their opinion on the importance of ITIL. We found that almost half of these decision makers — business staff, IT operations staff and application developers — believe that the importance of ITIL is declining in the face of cloud. Also striking was the fact that the further along enterprises were with their cloud deployments, the more the perceived importance of ITIL decreased.
These numbers stand in stark contrast to Dennis’ service-centric research. I think this is due to cloud decision makers believing that cloud has all the necessary best practices for service delivery already “built in” and that ITIL is therefore less important.
APM: What, more specifically, do you see as the impact of cloud on ITIL? Where does ITIL become more relevant? Where less relevant?
DD: Here I think common sense applies even more than raw data. In its larger vision, ITIL is about establishing more effective cross-domain capabilities and processes so that, for instance, stakeholders can manage and optimize change more cohesively. In my view, nothing could be more relevant to successful cloud adoption.
Quite tellingly, the same population in Operationalizing Cloud that voted two-to-one in favor of ITIL also voted, in the same question, in favor of the move to cross-domain management and increased dialog with business constituencies. Conversely, however, ITIL has evolved to become rather elaborate in terms of process specifics and stakeholder roles. If followed to the letter of the law, these can engender more stasis than fluidity, which is completely counter to the broader goal. As such, ITIL remains — as it has always been — far more effective as a departure point than as a Bible.
TV: I agree with Dennis’ proposed common sense approach, where ITIL is not seen as a Bible, but as a collection of best practices.
During their cloud deployment, organizations have to think about how these practices apply to their individual situation. Cloud is a delivery mechanism that simplifies and accelerates how IT services are created and managed. However, even simplified service delivery and management processes need to be governed by a cross-domain best practices approach.
In addition, cloud makes IT services more easily available to business units, which is leading to a much larger number of services deployed. This results in an additional strain on the IT department that must be alleviated through the use of the ITIL best practice approach.
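The governance idea Torsten describes — automated, fast-moving service delivery that still feeds ITIL-style change and configuration records — can be illustrated with a minimal sketch. This is a hypothetical example, not an actual ITIL tool or vendor API: the `ChangeLog` class stands in for a CMDB/change-management backend, and `provision_service` for an automated cloud provisioning step.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical sketch: even fully automated provisioning can emit
# ITIL-style change records, keeping dynamic cloud workflows auditable.

@dataclass
class ChangeRecord:
    service: str
    action: str
    requested_by: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class ChangeLog:
    """A minimal stand-in for a CMDB / change-management backend."""
    def __init__(self) -> None:
        self.records: List[ChangeRecord] = []

    def record(self, rec: ChangeRecord) -> None:
        self.records.append(rec)

def provision_service(name: str, requester: str, log: ChangeLog) -> str:
    """Automated provisioning step that also logs a standard change."""
    log.record(ChangeRecord(service=name, action="provision",
                            requested_by=requester))
    return f"{name}: provisioned"

log = ChangeLog()
status = provision_service("web-frontend", "business-unit-a", log)
print(status)           # web-frontend: provisioned
print(len(log.records)) # 1
```

The point of the sketch is that the change record is created as a side effect of automation itself, rather than through a manual review gate — the best-practice discipline survives even when the delivery mechanism becomes dynamic.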
APM: Given that cloud is made up of different types of capabilities — internal, public, SaaS, IaaS, etc. — is ITIL more appropriate for certain types of cloud adoption and less for others? What do you recommend in applying ITIL to cloud?
DD: I believe that the move to cloud needs to be viewed as an enabler, not an endgame. And from that perspective, ITIL will always be relevant, whether for governing service providers delivering public cloud offerings, or for supporting continuous improvement in internal service delivery. I think the main thing is to always keep your eye on the “goal,” which is to provide more valuable and cost-effective IT services to internal and external customers.
TV: While cloud — private or public; IaaS, PaaS or SaaS — enhances the agility of service delivery, these services still have to be governed by the ITIL best practices approach. The fact that they can now be delivered faster, cheaper, and more flexibly by combining private and public cloud resources should not lead us to negate the importance of optimizing their composition, delivery and management.
ABOUT Dennis Drogseth
Dennis Drogseth is VP of Research at Enterprise Management Associates (EMA). He manages the New Hampshire office and has been a driving force in establishing EMA’s New England presence. Drogseth brings more than 30 years of experience in various aspects of marketing and business planning for service management solutions. He supports EMA through leadership in Business Service Management (BSM), CMDB Systems, automation systems and service-centric financial optimization. He also works across practice areas to promote dialog across critical areas of technology and market interdependencies.
Prior to joining EMA in 1998, Drogseth worked for Cabletron on its SPECTRUM management software and spent 14 years with IBM in marketing and communications.
ABOUT Torsten Volk
Torsten joined EMA following more than 10 years of conceptualizing and managing highly complex IT projects within the virtualization, cloud, and custom software application development realm. In his past positions as Principal and Director of Professional Services of two major consulting firms, Volk helped a long list of national and international organizations like The World Bank, Prometric and Cricket Communications evaluate and identify the business value of emerging enterprise technologies.
Volk has conceptualized application-aware private, public and hybrid cloud solutions for multiple hosting providers, as well as for a number of large end-customers. The common denominator of all of these solutions was offering customers access to IT infrastructure and software tools that they could not have afforded outside of a cloud setup.