APM Predictions 2016: Choosing an APM for Maximum Advantage
December 28, 2015

Larry Haig
Intechnica


As foreseen in previous years, the Application Performance Management (APM) market continues to develop, both in the number of providers and in inherent functionality, particularly with regard to end user visibility. However, a lack of fundamental differentiation between many of the providers means that consolidation is to be expected, either through the collapse of over-geared vendors or via acquisition.

New adopters of APM would be well advised to consider whether their favored vendor is likely to be a purchaser (and therefore provide continuity) or a purchase (and run the risk of absorption into a larger system, or disappearance). This consideration tends to favor the larger, more established vendors. In part, however, it depends on the investment and time horizons for the chosen product. "An APM is for life, not just for Christmas" need not be a given, particularly for smaller companies with a straightforward delivery infrastructure and little legacy baggage.

It is increasingly necessary to trade off the ongoing, expensive skills requirements needed to gain the most from some APMs against the immediate (but perhaps ultimately more limited) value of competing products.

Such decisions are largely dependent upon the inherent complexity of a delivery infrastructure. Dimensions include:

■ Number and type of underlying technologies (legacy to bleeding edge)

■ Inherent application complexity

■ Extensibility – physical or virtual

■ 3rd party inclusions (client side affiliates to server-side web services links)

■ Nature and distribution of end usage – devices, native applications, geography, etc.

It is worth weighing the overall benefits of (for example) gaining, say, 75% of the theoretical maximum visibility and value from an easily configurable, accessible "light touch" APM against driving for maximum value and precision. The latter approach necessarily carries higher overheads in a range of areas, from more intricate definition of output dashboards/reports to greater ongoing management and interpretative skills. Whatever the salesman says, none of the higher-end APM tools is truly "plug and play" – all require tuning, both to provide good understanding and to minimize performance overhead.

As always, the ability of your chosen APM to work with your current (and anticipated) technology stack – application framework, legacy components (mainframe, VAX?), modern extensions (microservices, containers) – and to cope with your anticipated throughput volumes and demand patterns remains a primary requirement.

Larry Haig is Senior Consultant at Intechnica.
