New EMA Report: OpenTelemetry's Emerging Role in IT Performance and Availability

Pete Goldin
Editor and Publisher
APMdigest

OpenTelemetry is quickly becoming a foundational element of observability, according to a new report I wrote in partnership with Dan Twing, President and COO of Enterprise Management Associates (EMA), titled Taking Observability to the Next Level: OpenTelemetry's Emerging Role in IT Performance and Reliability. The report was sponsored by Elastic, an APMdigest sponsor, as well as Apica, Beta Systems, Dynatrace, Embrace and SolarWinds.

WEBINAR APRIL 15: Unlocking the Future of Observability: OpenTelemetry’s Role in IT Performance and Innovation

OpenTelemetry (OTel) is an open source CNCF project offering a framework and suite of tools, including APIs and SDKs, that facilitate the generation, collection, and export of telemetry data for observability platforms and related tools. OTel collects logs, metrics, and traces, and is expanding its supported data types to include profiling and many other possibilities.
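To make the pipeline described above concrete, here is a minimal sketch of an OpenTelemetry Collector configuration (not taken from the report) that receives logs, metrics, and traces over the OTLP protocol and writes them to the console for inspection. The receiver, exporter, and pipeline names follow the Collector's standard component conventions; a production setup would swap the `debug` exporter for an exporter targeting an observability backend.

```yaml
# Minimal OpenTelemetry Collector configuration sketch.
# Receives all three core signal types over OTLP and
# prints them to the console via the debug exporter.

receivers:
  otlp:
    protocols:
      grpc:   # default endpoint 0.0.0.0:4317
      http:   # default endpoint 0.0.0.0:4318

exporters:
  debug:      # console output, useful for local verification

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [debug]
    metrics:
      receivers: [otlp]
      exporters: [debug]
    logs:
      receivers: [otlp]
      exporters: [debug]
```

Because instrumented applications speak OTLP to the Collector rather than to a vendor endpoint, switching backends is a configuration change here rather than a code change in every service.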

This report comes at just the right time, with OpenTelemetry emerging as an essential component of modern observability. Our first objective for the research was to assess awareness and perception of OpenTelemetry in the IT industry. We assumed the research would show good momentum for the project, and the results exceeded even those expectations, with a majority (68.3%) of respondents saying they are moderately or very familiar with OTel.

OpenTelemetry also enjoys a positive perception: half of respondents consider OpenTelemetry mature enough for implementation today, and another 31% consider it moderately mature and useful. In short, more than 80% feel that OpenTelemetry can be used now. And almost everyone surveyed (98.7%) expresses support for where OpenTelemetry is heading, a very strong vote of confidence. Notably, those last two groups include respondents who are only marginally familiar with OpenTelemetry, which suggests that OTel has a rock-solid reputation.

The majority also say OpenTelemetry's role in observability is important — 61% believe OpenTelemetry is a very important or critical enabler of observability, and 57% place a similar value on the importance of OpenTelemetry to their own observability strategy.

The usage numbers are also encouraging. The report states, "Almost half (48.5%) of respondents currently use OpenTelemetry. Another 25.3% are not using OpenTelemetry yet, but are planning to implement. This means that just under 75% are either using or planning to use OpenTelemetry, a statistic that bodes well for the future of the standard. The remaining 24.8% are still evaluating, while only 1.5% of respondents had no plans to implement."

The survey findings further reflect the momentum of OpenTelemetry by showing how observability maturity correlates directly with the awareness, perception and even adoption of OpenTelemetry. A majority (64%) of survey respondents assess their own observability practices as mature or very mature, and 45% of that group are very familiar with OpenTelemetry; 67% see OpenTelemetry as very important or critical to their own observability strategy; and 61% already use OpenTelemetry.


The EMA report contains many more interesting statistics about OpenTelemetry that can be valuable to both observability practitioners and IT product vendors, answering questions such as:

  • Where are users deploying OpenTelemetry?
  • What are the concerns and challenges?
  • What are the benefits of OpenTelemetry?
  • What level of ROI are users gaining?
  • What are the expectations for OpenTelemetry's future?

One of the final points we made in the report: OpenTelemetry will become a competitive advantage for organizations across most industries. "One of the most consequential points to consider: the survey findings suggest that your competitors have already started using OpenTelemetry to improve digital performance, availability, and the user experience. With this in mind, if you have not already adopted OpenTelemetry, the time to start is now."

Pete Goldin is Editor and Publisher of APMdigest

The Latest

A new study by the IBM Institute for Business Value reveals that enterprises expect to significantly scale AI-enabled workflows, many driven by agentic AI, relying on them for improved decision-making and automation. The AI Projects to Profits study found that respondents expect AI-enabled workflows to grow from 3% today to 25% by the end of 2025. With 70% of surveyed executives indicating that agentic AI is important to their organization's future, the research suggests that many organizations are actively encouraging experimentation ...

Respondents predict that agentic AI will play an increasingly prominent role in their interactions with technology vendors over the coming years and are positive about the benefits it will bring, according to The Race to an Agentic Future: How Agentic AI Will Transform Customer Experience, a report from Cisco ...

A new wave of tariffs, some exceeding 100%, is sending shockwaves across the technology industry. Enterprises are grappling with sudden, dramatic cost increases that threaten to disrupt carefully planned budgets, sourcing strategies, and deployment plans. For CIOs and CTOs, this isn't just an economic setback; it's a wake-up call. The era of predictable cloud pricing and stable global supply chains is over ...

As artificial intelligence (AI) adoption gains momentum, network readiness is emerging as a critical success factor. AI workloads generate unpredictable bursts of traffic, demanding high-speed, low-latency, lossless connectivity. AI adoption will require upgrades and optimizations in data center networks and wide-area networks (WANs), prompting enterprise IT teams to rethink, re-architect, and upgrade both to support AI-driven operations ...

Artificial intelligence (AI) is core to observability practices, with some 41% of respondents reporting AI adoption as a core driver of observability, according to the State of Observability for Financial Services and Insurance report from New Relic ...

Application performance monitoring (APM) is a game of catching up — building dashboards, setting thresholds, tuning alerts, and manually correlating metrics to root causes. In the early days, this straightforward model worked as applications were simpler, stacks more predictable, and telemetry was manageable. Today, the landscape has shifted, and more assertive tools are needed ...

Cloud adoption has accelerated, but backup strategies haven't always kept pace. Many organizations continue to rely on backup strategies that were either lifted directly from on-prem environments or use cloud-native tools in limited, DR-focused ways ... Eon uncovered a handful of critical gaps regarding how organizations approach cloud backup. To capture these prevailing winds, we gathered insights from 150+ IT and cloud leaders at the recent Google Cloud Next conference, which we've compiled into the 2025 State of Cloud Data Backup ...

Private clouds are no longer playing catch-up, and public clouds are no longer the default as organizations recalibrate their cloud strategies, according to the Private Cloud Outlook 2025 report from Broadcom. More than half (53%) of survey respondents say private cloud is their top priority for deploying new workloads over the next three years, while 69% are considering workload repatriation from public to private cloud, with one-third having already done so ...

As organizations chase productivity gains from generative AI, teams are overwhelmingly focused on improving delivery speed (45%) over enhancing software quality (13%), according to the Quality Transformation Report from Tricentis ...

Back in March of this year ... MongoDB's stock price took a serious tumble ... In my opinion, it reflects a deeper structural issue in enterprise software economics altogether — vendor lock-in ...