Tracing the Transaction into 2011
November 27, 2010
Byrne Chapman

Believe it or not, 2010 is almost over, and IT organizations are starting to allocate their 2011 budgets and decide how they will cope with continued pressure to reduce spending while improving the availability and performance of business transactions. High on the priority list is improving the quality of service of their applications, infrastructure and web presence. However, managing the costs of ensuring high availability and reliability is challenging IT managers to think in new directions.

In 2010, enterprises increased their adoption of transaction management solutions to improve quality of service while maintaining, or even reducing, costs. Transaction management is the tracking and reporting of reliability and performance data for business transactions. By tracing these transactions and stitching the information together into a cohesive reporting structure, IT can determine where and when issues arise before they become service outages.
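
To make "stitching" concrete, here is a minimal sketch, not a description of any particular product, of how events recorded at different tiers might be correlated by a shared transaction ID into one end-to-end record. The event format, field names and timings are assumptions made for illustration.

```python
from collections import defaultdict

# Each tier (web server, application server, database) emits an event tagged
# with the same transaction ID. Field names and timings are illustrative
# assumptions, not data from any real system.
events = [
    {"txn_id": "T1001", "tier": "web",      "start_ms": 0,  "duration_ms": 12},
    {"txn_id": "T1001", "tier": "app",      "start_ms": 12, "duration_ms": 95},
    {"txn_id": "T1001", "tier": "database", "start_ms": 40, "duration_ms": 60},
    {"txn_id": "T1002", "tier": "web",      "start_ms": 5,  "duration_ms": 8},
    {"txn_id": "T1002", "tier": "app",      "start_ms": 13, "duration_ms": 30},
]

def stitch(events):
    """Group tier-level events by transaction ID into end-to-end records."""
    by_txn = defaultdict(list)
    for event in events:
        by_txn[event["txn_id"]].append(event)

    records = {}
    for txn_id, parts in by_txn.items():
        finish = max(p["start_ms"] + p["duration_ms"] for p in parts)
        start = min(p["start_ms"] for p in parts)
        records[txn_id] = {
            "end_to_end_ms": finish - start,
            "tiers": {p["tier"]: p["duration_ms"] for p in parts},
        }
    return records

for txn_id, record in stitch(events).items():
    print(txn_id, record)
```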

Adopting transaction management ties the performance of IT systems to the performance of business transactions, making transparent both the impact IT systems have on the success of the business and the key part they play in improving business service quality.

Looking ahead to 2011, increased use of transaction management in enterprises will be driven by three trends: the need to link business and IT goals, effective capacity planning and management, and virtualization.

The first trend will be strong adoption of transaction management practices to improve service and reduce costs. Successful enterprises will build closer communication between business and IT functions. IT will be able to see clearly how IT systems affect business users, and business and IT will share a common understanding of where services can be improved, based on in-depth data on the performance of business transactions.

As an example, think about your interaction with an ATM. When you use an ATM, you want to complete a business transaction as quickly as possible, without any service flaws or interruptions. You aren’t thinking about the technical aspects of the transaction, such as network utilization, processor performance and storage configuration, that concern IT managers. Transaction management processes and technology create a strong link between business transactions and technical transactions. That link gives IT the means to monitor the health of business transactions, and it gives IT and business personnel a basis for meaningful discussions on service levels, availability management and transaction costs. Transaction management is a new opportunity to strengthen relationships with business personnel.
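
As a rough illustration of that relationship, a monitor could compare each business transaction's end-to-end time against a target and report which technical tier contributed the largest share of the delay. The transaction names, per-tier timings and the 3-second target below are assumptions, not figures from the article.

```python
# Hypothetical response-time target for an ATM withdrawal; the 3-second figure
# and the per-tier timings below are assumptions for illustration only.
TARGET_MS = 3000

# End-to-end records for two business transactions, broken down into the
# technical transactions (tiers) that served them.
records = {
    "withdrawal-7841": {"end_to_end_ms": 2100,
                        "tiers": {"network": 150, "app_server": 650, "database": 1300}},
    "withdrawal-7842": {"end_to_end_ms": 4800,
                        "tiers": {"network": 200, "app_server": 3900, "database": 700}},
}

def health_report(records, target_ms):
    """Flag business transactions that miss the target and name the slowest tier."""
    for txn_id, record in records.items():
        status = "OK" if record["end_to_end_ms"] <= target_ms else "SLOW"
        slowest = max(record["tiers"], key=record["tiers"].get)
        print(f"{txn_id}: {status} ({record['end_to_end_ms']} ms end to end; "
              f"largest share: {slowest}, {record['tiers'][slowest]} ms)")

health_report(records, TARGET_MS)
```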

Formal capacity planning for processors, storage and networks is the second 2011 trend. IT is tasked with carefully balancing service delivery against operational costs. Accurately understanding business trends and modeling resource demands allows IT to ensure that the processor, storage and network resources needed are available. By tracking business transaction trends and understanding the resources consumed, you can build a knowledge base for estimating how changes will affect the IT infrastructure. Transaction management is a key element in this equation, as it offers a window into the technical transactions that make up a business transaction.

Clearly understanding this relationship is important in the capacity planning process. Once changes in business transaction volumes have been projected, transaction management data can be used to identify the corresponding changes in IT transaction volumes and resource consumption patterns. Capacity planning is also an important element in the move to virtualized platforms.
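
For example, a simple sketch of that projection step might look like the following. The transaction types, baseline volumes, growth factors and per-transaction resource costs are assumed numbers used only to show the arithmetic.

```python
# Baseline figures per business transaction type: daily volume, CPU time per
# transaction and storage I/O per transaction. All numbers are illustrative
# assumptions, not measurements.
baseline = {
    "balance_inquiry": {"volume": 400_000, "cpu_s": 0.02, "io_kb": 4},
    "withdrawal":      {"volume": 150_000, "cpu_s": 0.05, "io_kb": 12},
}

# Projected business growth for 2011 by transaction type (assumed).
growth = {"balance_inquiry": 1.10, "withdrawal": 1.25}

def project(baseline, growth):
    """Translate projected business transaction volumes into resource demand."""
    total_cpu_s = 0.0
    total_io_kb = 0.0
    for name, base in baseline.items():
        projected_volume = base["volume"] * growth[name]
        total_cpu_s += projected_volume * base["cpu_s"]
        total_io_kb += projected_volume * base["io_kb"]
    return total_cpu_s, total_io_kb

cpu_s, io_kb = project(baseline, growth)
print(f"Projected daily demand: {cpu_s:,.0f} CPU-seconds, {io_kb / 1024:,.0f} MB of I/O")
```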

Finally, use of virtualized environments will continue to grow in 2011 as enterprises look for cost-efficient ways to use more of the processor and storage capacity in their data centers. Virtualization adds a new layer to the IT infrastructure, and enterprises will need to manage these new computing and storage platforms carefully to ensure that the economic benefits are realized and the business value is delivered. Transaction management gives IT a way, in virtualized processor and storage environments, to ensure that the service delivered to business transactions is accurately reported and meets the service level agreements (SLAs) established with business users.
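
A final sketch: on a virtualized platform, SLA reporting could be as simple as comparing a percentile of measured transaction response times, per virtual machine, against the agreed threshold. The VM names, samples, 2-second SLA and 95th-percentile target below are assumptions.

```python
# Measured response times (ms) for one business transaction, grouped by the
# virtual machine that served it. All values are illustrative assumptions.
samples = {
    "vm-app-01": [320, 410, 290, 1850, 500, 450],
    "vm-app-02": [300, 2600, 2900, 3100, 2750, 640],
}

SLA_MS = 2000        # agreed response-time threshold (assumed)
PERCENTILE = 0.95    # SLA measured at the 95th percentile (assumed)

def percentile(values, p):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(p * len(ordered)))
    return ordered[rank - 1]

for vm, times in samples.items():
    p95 = percentile(times, PERCENTILE)
    verdict = "meets SLA" if p95 <= SLA_MS else "breaches SLA"
    print(f"{vm}: p95 = {p95} ms -> {verdict}")
```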

In summary, transaction management is a key component in the success of IT organizations. Understanding the level of service being delivered for critical business transactions is invaluable in building customer relationships and goodwill with business partners. Capacity planning is critical to ensure that sufficient resources are available to meet workload demands and agreed-upon SLAs, and virtualization offers opportunities to reduce capital and operational costs while meeting agreed-upon service levels.

About Byrne Chapman

Byrne Chapman has spent most of his professional career as an IT executive in the insurance industry. He is a technology consultant for Correlsense, which offers SharePath, software for managing business transactions throughout the data center.

Related Links:

www.correlsense.com
