Following up on our KubeCon London 2025 talk, OTel Sucks (But Also Rocks!), we wanted to dive deeper into the candid conversations we had with practitioners from companies like Atlassian, Delivery Hero, Liatrio, and Pismo. While the talk shared snippets of these experiences, much more was left on the cutting room floor. This two-part piece brings those richer details to light, offering fellow observability professionals an unvarnished look at the real-world challenges and triumphs of adopting OpenTelemetry.
Start with Part 1: Balancing OTel's Strengths and Struggles.
This second part covers the powerful advantages and breakthroughs: the "OTel Rocks" moments.
OTel Rocks - The Power, Flexibility, and Future-Proofing
Despite the frustrations, every engineer we spoke with ultimately affirmed the value and power of OpenTelemetry. The "sucks" moments are often the flip side of its greatest strengths.
1. Vendor Neutrality: Freedom and Flexibility
This is arguably OTel's foundational promise and a major win cited by all interviewees. Before OTel, choosing an observability vendor often meant committing to their proprietary agents and data formats. Switching vendors was a painful, resource-intensive process involving re-instrumenting applications.
OTel breaks this lock-in. By instrumenting applications with OTel SDKs and using the OTel Collector to process and route data, organizations gain the freedom to choose best-of-breed backend platforms for different signals or to switch vendors with minimal disruption to the application teams. Alexandre Magno emphasized the strategic importance of this, allowing Pismo to control their data destiny and optimize costs. Adriel Perkins also valued the ability to send telemetry to multiple destinations simultaneously, enabling gradual migrations or specialized analysis in different tools. This decoupling is a massive strategic advantage in a market with rapidly evolving vendor capabilities and pricing models.
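To make that fan-out concrete, here is a minimal Collector config sketch that sends a single trace pipeline to two OTLP backends at once. The vendor endpoints are placeholders, not configurations from Pismo or Liatrio; application teams keep pointing at the same Collector while the backends change underneath them:

```yaml
# Hypothetical dual-export setup: applications send OTLP once,
# and the Collector fans the data out to both backends.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

processors:
  batch:

exporters:
  otlphttp/current:
    endpoint: https://otlp.current-vendor.example:4318
  otlphttp/trial:
    endpoint: https://otlp.trial-vendor.example:4318

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp/current, otlphttp/trial]  # fan-out happens here
```

Dropping the old exporter from that list is the whole migration, as far as application teams are concerned.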
2. The Collector: A Swiss Army Knife for Telemetry
While its configuration can be complex, the OTel Collector's power and flexibility were universally praised. Elena Kovalenko, despite noting the update challenges, called it the "best option" for Delivery Hero's complex needs. The Collector acts as a central hub for receiving, processing, and exporting telemetry data.
Its processor pipeline allows teams to enrich data (e.g., adding Kubernetes metadata), filter noise (e.g., dropping health checks), ensure compliance (e.g., masking sensitive data), and manage costs (e.g., sampling). James Moessis highlighted this modularity: "When OTel does suck, the good thing is that it's designed in a way that doesn't suck so that you can replace little modular bits here and there." Need custom processing? Write a custom processor. Need to export to a new backend? Add an exporter. This extensibility allows teams to tailor their observability pipeline precisely to their needs without being constrained by a specific vendor's agent capabilities. It's the key enabler for managing telemetry quality and cost at scale.
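Here is a sketch of what such a pipeline can look like in practice. The attribute names, health-check route, and sampling rate are illustrative, not taken from the interviews:

```yaml
receivers:
  otlp:
    protocols:
      grpc:

processors:
  # Enrich: attach pod, namespace, and deployment metadata
  # (requires RBAC to query the Kubernetes API).
  k8sattributes:
  # Filter noise: drop spans for health-check routes via an OTTL condition.
  filter/drop-healthchecks:
    error_mode: ignore
    traces:
      span:
        - 'attributes["url.path"] == "/healthz"'
  # Compliance: hash a sensitive attribute rather than exporting it raw.
  attributes/mask-pii:
    actions:
      - key: user.email
        action: hash
  # Cost: keep a fixed fraction of traces.
  probabilistic_sampler:
    sampling_percentage: 25
  batch:

exporters:
  otlphttp:
    endpoint: https://backend.example:4318

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [k8sattributes, filter/drop-healthchecks, attributes/mask-pii, probabilistic_sampler, batch]
      exporters: [otlphttp]
```

Each concern lives in its own processor, which is exactly the modularity James describes: you can swap, reorder, or replace pieces without touching application code.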
3. Unification and Standardization
Before OTel, teams often wrestled with disparate agents and libraries for traces, metrics, and logs, leading to inconsistent data and correlation challenges. OTel provides a unified approach: standardized SDKs, APIs, and data protocols (OTLP) across signals. This simplifies instrumentation efforts and, crucially, enables better correlation between different telemetry types. Seeing a spike in a latency metric? OTel makes it easier to jump to the corresponding traces to understand the cause. This unified view is essential for truly understanding the behavior of complex, distributed systems.
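That standardization shows up directly in configuration: one OTLP receiver and one OTLP exporter can serve all three signals, instead of three agents speaking three proprietary formats. A minimal sketch (the backend endpoint is a placeholder):

```yaml
receivers:
  otlp:               # one receiver for traces, metrics, and logs
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

processors:
  batch:

exporters:
  otlphttp:           # one exporter, same story
    endpoint: https://backend.example:4318

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]
```

Because all three pipelines carry the same resource attributes and trace-context conventions, jumping from a metric spike to the related traces becomes a correlation query rather than guesswork.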
4. Enabling Cost Optimization and Deeper Insights
Alexandre Magno shared compelling examples of how Pismo leveraged OTel (specifically, sampling via the Collector) to achieve significant cost savings on their observability spend — potentially millions of dollars. By gaining fine-grained control over what data is sent where, teams can optimize for both cost and performance.
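Pismo's actual sampling policies weren't part of our conversation, but tail-based sampling in the Collector is the kind of mechanism involved. A sketch: keep every trace that contains an error, and only a fraction of the rest (the 5% rate and endpoint are illustrative):

```yaml
receivers:
  otlp:
    protocols:
      grpc:

processors:
  tail_sampling:
    decision_wait: 10s  # buffer spans until the whole trace can be judged
    policies:
      # Always keep traces that contain an error.
      - name: keep-errors
        type: status_code
        status_code:
          status_codes: [ERROR]
      # Keep a small fraction of everything else.
      - name: sample-the-rest
        type: probabilistic
        probabilistic:
          sampling_percentage: 5

exporters:
  otlphttp:
    endpoint: https://backend.example:4318

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [tail_sampling]
      exporters: [otlphttp]
```

A trace is kept if any policy matches it, so errors survive in full while routine traffic is trimmed to a budget-friendly slice.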
Furthermore, the rich, standardized data OTel provides enables deeper insights that might be harder to achieve with proprietary formats. Consistent attribute propagation across services allows for more accurate distributed tracing and analysis of end-to-end user journeys.
5. A Vibrant, Collaborative Community
OpenTelemetry isn't just code; it's a massive community effort. Adriel Perkins spoke positively about the welcoming nature of the community and the opportunities to learn and contribute. James Moessis echoed this, noting the responsiveness of maintainers and the rigorous code review process, which ultimately improves the quality of the project.
While navigating the community and contributing might have its own learning curve, the fact that OTel is developed in the open means users aren't reliant on a single vendor's roadmap. If a feature is missing or a bug is impacting you, there's a pathway (though sometimes challenging) to influence the direction or contribute a fix. This collaborative aspect fosters innovation and ensures OTel evolves based on the real-world needs of its users. The existence of initiatives like the contributor experience survey shows a commitment to making the community accessible and effective.
The Verdict: Worth the Climb?
The experiences of Adriel, Alexandre, Elena, and James paint a clear picture: OpenTelemetry is immensely powerful, but it's not a plug-and-play panacea. It demands investment — in learning, in configuration, in keeping pace with its evolution, and in carefully managing the quality and volume of telemetry data generated, especially when relying heavily on auto-instrumentation.
The "sucks" moments — the breaking changes, the configuration complexity, the occasional documentation gaps, the challenge of taming auto-instrumentation noise — are real and require dedicated engineering effort to overcome. However, the "rocks" moments — unparalleled flexibility, vendor freedom, a unified data model, powerful processing capabilities via the Collector, and a vibrant community — represent a fundamental shift in how we approach observability.
For observability engineers navigating today's complex cloud-native environments, OTel offers a path towards a more standardized, flexible, and future-proof observability strategy. It requires embracing the complexities and contributing back to the ecosystem, but the rewards — deeper insights, greater control, and freedom from lock-in — appear to be well worth the climb. The journey might have its frustrations, but OpenTelemetry is undeniably shaping the future of the field.
A special thank you to Adriel Perkins, Alexandre Magno Prado Machado, Elena Kovalenko, and James Moessis for generously sharing their time and candid experiences for this ongoing conversation about OpenTelemetry in the real world.