
Mezmo announced a simple, more predictable pricing structure for its intelligent telemetry orchestration platform.
The new structure has two components: $0.20 per gigabyte ingested for processing and analyzing data, and $0.20 per gigabyte retained per month for data retention. That's down from $1.80 per gigabyte retained, a savings of nearly 90%. This simple, transparent approach to pricing gives companies the freedom to scale without high, unpredictable costs.
The move is a response to a compounding industry problem: cloud infrastructure, microservices and AI have created an explosion of telemetry data, causing observability costs to skyrocket. Most vendors charge premium prices, calculated with complex formulas, to process and analyze that data, even as global data volumes are projected to more than double by 2028. As a result, companies that keep everything face bloated, unpredictable observability bills.
Mezmo took a different approach. It restructured and modernized its backend to compress infrastructure costs while continuing to enhance its pipeline processing and data orchestration capabilities, and it passes those savings directly to customers. The company reports cutting its infrastructure and resource footprint by more than 90% and its operational applications by 70%. The result is a transformed pricing model that significantly reduces customer costs and simplifies the structure so that cost correlates with value.
“We tackled the rising costs of observability head-on by taking a holistic look at how organizations process, analyze and store telemetry data,” said Tucker Callaway, CEO of Mezmo. “The result is a more sustainable, scalable and valuable model for observability data.”
Key benefits of Mezmo’s 2025 pricing:
- Simple, transparent pricing for contract customers. $0.20 per gigabyte ingested and $0.20 per gigabyte retained per month. No confusing formulas based on CPU, data type, user count or queries.
- Built-in cost control. Teams can decide what data to ingest, preprocess it locally with Mezmo Edge, or send it directly to Mezmo to intelligently filter and route, retaining only what matters, wherever they need it.
- Cold storage with rehydration. Rehydration lets users archive data to reduce spend and restore it to Mezmo when needed for analysis or debugging, balancing cost with access.
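To make the flat per-gigabyte structure above concrete, here is a minimal sketch of how a monthly bill works out under the new rates. The rates come from the announcement; the data volumes are hypothetical examples chosen for illustration, not Mezmo figures.

```python
# Mezmo's 2025 flat per-GB rates (from the announcement).
INGEST_RATE = 0.20      # $ per GB ingested
RETAIN_RATE = 0.20      # $ per GB retained, per month
OLD_RETAIN_RATE = 1.80  # previous $ per GB retained, per month

def monthly_cost(gb_ingested: float, gb_retained: float) -> float:
    """Monthly cost under the new structure: ingest charge plus retention charge."""
    return gb_ingested * INGEST_RATE + gb_retained * RETAIN_RATE

# Hypothetical example: a team ingesting 1,000 GB and retaining 500 GB per month.
new_cost = monthly_cost(1000, 500)          # 1000 * 0.20 + 500 * 0.20 = 300.0
old_retention_only = 500 * OLD_RETAIN_RATE  # 500 * 1.80 = 900.0

# Retention alone drops from $1.80 to $0.20 per GB, roughly an 89% reduction.
savings_pct = (OLD_RETAIN_RATE - RETAIN_RATE) / OLD_RETAIN_RATE * 100

print(f"New monthly cost: ${new_cost:.2f}")
print(f"Old retention cost alone: ${old_retention_only:.2f}")
print(f"Retention savings vs. old rate: {savings_pct:.0f}%")
```

The retention-rate drop from $1.80 to $0.20 works out to about 88.9%, which matches the "nearly 90% in savings" claimed in the announcement.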