10 Things to Consider Before Multicasting Your Observability Data
November 14, 2023

Will Krause
Circonus

Multicasting in this context refers to the process of directing data streams to two or more destinations. This might look like sending the same telemetry data to both an on-premises storage system and a cloud-based observability platform concurrently. The two principal benefits of this strategy are cost savings and service redundancy.

Cost Savings: Depending on the use-case, storing or processing data in one location might be cheaper than another. By multicasting the data, businesses can choose the most cost-effective solution for each specific need, without being locked into one destination.

Service Redundancy: No system is foolproof. By sending data to multiple locations, you create a built-in backup. If one service goes down, data isn't lost and can still be accessed and analyzed from another source.
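
As a minimal sketch of the fan-out idea, assuming two hypothetical HTTP ingest endpoints and a simple JSON payload (not any particular vendor's API), the same datapoint is simply written to every configured destination:

```python
import json
import urllib.request

# Hypothetical ingest endpoints; in practice these would be your
# on-prem collector and your SaaS vendor's ingest URL.
DESTINATIONS = [
    "http://onprem-collector.internal:8080/ingest",
    "https://saas-vendor.example.com/v1/ingest",
]

def multicast(metric: dict) -> None:
    """Send the same metric payload to every configured destination."""
    body = json.dumps(metric).encode("utf-8")
    for url in DESTINATIONS:
        req = urllib.request.Request(
            url, data=body, headers={"Content-Type": "application/json"}
        )
        try:
            urllib.request.urlopen(req, timeout=5)
        except OSError as exc:
            # A failure at one destination must not block the others --
            # that independence is what buys you the redundancy.
            print(f"delivery to {url} failed: {exc}")

multicast({"name": "cpu.idle", "value": 87.2, "ts": 1700000000})
```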

The following are 10 things to consider before multicasting your observability data:

1. Consistency of User Expectations

It's crucial that both destinations receive data reliably and consistently. If it's unclear to users which data resides in which platform, adoption will suffer and the strategy will be less effective. A common heuristic is to send all of your data to the cheaper observability platform and only the most essential data to the more feature-rich, more expensive one. Likewise, if one platform develops data integrity issues because no one uses it outside of break-glass scenarios, the effectiveness of this strategy is reduced.
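
One way to make that heuristic explicit, sketched here with a made-up prefix allowlist rather than any real routing API, is to send everything to the cheaper platform and only an essential subset to the expensive one:

```python
# Hypothetical routing rule for the heuristic above: the cheap platform
# gets everything; the expensive platform gets only essential metrics.
ESSENTIAL_PREFIXES = ("slo.", "payments.", "checkout.")

def destinations_for(metric_name: str) -> list[str]:
    dests = ["cheap-platform"]
    if metric_name.startswith(ESSENTIAL_PREFIXES):
        dests.append("expensive-platform")
    return dests

assert destinations_for("payments.latency.p99") == ["cheap-platform", "expensive-platform"]
assert destinations_for("host.cpu.idle") == ["cheap-platform"]
```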

2. Data Consistency

While it's good to have a process for evaluating the correctness of your data, when you write data to two systems, not everything will always line up. This could be due to ingestion latency, differences in how each platform rolls up long-term data, or even just the graphing libraries each platform uses. Set the right expectation with teams: small differences are normal when both platforms are in active use.
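
A spot check like the following can tell you whether the two platforms disagree by more than the expected drift. The 2% tolerance here is illustrative; pick one that matches your rollup windows and ingestion lag:

```python
import math

def within_tolerance(a: float, b: float, rel_tol: float = 0.02) -> bool:
    """True if two platforms' readings agree within the expected drift."""
    return math.isclose(a, b, rel_tol=rel_tol)

# Same series, same time window, queried from each platform.
platform_a_p95 = 412.0  # ms
platform_b_p95 = 405.5  # ms

if not within_tolerance(platform_a_p95, platform_b_p95):
    print("divergence exceeds expected rollup/ingestion drift -- investigate")
```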

3. Bandwidth and Network Load

Transmitting the same piece of data multiple times puts additional load on your network. This is a bigger concern when you're sending data out of a cloud environment, where you pay egress costs.

Additionally, some telemetry components are aggregation points that already push the limits of vertical scaling (for example, carbon relay servers). Multicasting directly at that point in the architecture may not be possible because of limits on how much data can traverse the NIC. It's essential to understand the impact on bandwidth and to provision appropriately.
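
Back-of-the-envelope arithmetic helps here. With illustrative numbers (substitute your own measurements), doubling the output of a busy relay tier can consume most of a 10 GbE NIC's headroom:

```python
# Illustrative numbers only -- substitute your own measurements.
datapoints_per_sec = 2_000_000     # aggregated at a single relay tier
bytes_per_datapoint = 250          # wire size incl. protocol overhead
destinations = 2                   # multicasting doubles the egress

egress_bits_per_sec = datapoints_per_sec * bytes_per_datapoint * 8 * destinations
print(f"outbound: {egress_bits_per_sec / 1e9:.1f} Gb/s")   # -> 8.0 Gb/s

# Against a 10 GbE NIC, that leaves little headroom for spikes, so the
# fan-out may need to happen earlier or later in the pipeline.
nic_capacity = 10e9
print(f"NIC utilization: {egress_bits_per_sec / nic_capacity:.0%}")  # -> 80%
```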

4. Cost Analysis

While multicasting can lead to savings, it's crucial to do a detailed cost analysis. Transmitting and storing data in multiple places might increase costs in certain scenarios.
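
A rough monthly model, with placeholder prices rather than real quotes, makes the trade-off concrete: the duplicate egress and storage must stay below what you save by keeping the bulk of the data on the cheaper platform:

```python
# Placeholder prices -- plug in your actual contract and cloud rates.
GB_PER_MONTH = 50_000                 # total telemetry volume
egress_per_gb = 0.09                  # cloud egress, USD
cheap_per_gb = 0.03                   # cheaper platform, all data
expensive_per_gb = 0.30               # premium platform
essential_fraction = 0.10             # share sent to the premium platform

single = GB_PER_MONTH * expensive_per_gb + GB_PER_MONTH * egress_per_gb
multicast = (
    GB_PER_MONTH * cheap_per_gb
    + GB_PER_MONTH * essential_fraction * expensive_per_gb
    + GB_PER_MONTH * (1 + essential_fraction) * egress_per_gb
)
print(f"everything premium: ${single:,.0f}/mo")     # -> $19,500/mo
print(f"multicast split:    ${multicast:,.0f}/mo")  # -> $7,950/mo
```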

5. Security and Compliance

Different storage destinations might have different security features and compliance certifications. Ensure that all destinations align with your company's security and regulatory needs.

6. Tool Integration

Not all observability tools natively support multicasting data. Some observability vendors' agents can only send data to their own product. In those cases, you may need to explore a multi-agent strategy.
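
Where your stack is OpenTelemetry-based, a single agent can often fan out on its own. A minimal sketch with the OpenTelemetry Python SDK (the endpoints are placeholders) attaches one exporting reader per destination, so both receive every measurement:

```python
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter

# One periodic reader per destination; each reader exports all metrics.
readers = [
    PeriodicExportingMetricReader(OTLPMetricExporter(endpoint="http://onprem-collector:4317")),
    PeriodicExportingMetricReader(OTLPMetricExporter(endpoint="http://vendor.example.com:4317")),
]
provider = MeterProvider(metric_readers=readers)

meter = provider.get_meter("app")
requests_counter = meter.create_counter("http.requests")
requests_counter.add(1, {"route": "/checkout"})
```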

7. Data Retrieval and Analysis

With data residing in multiple locations, the way your teams engage with the data may differ. If you're using a popular open source dashboarding tool, there will be at least some consistency in how users work with the data, even if each platform supports a different query syntax. This becomes more challenging if your teams rely primarily on the UI of the higher-cost observability platform.

8. Data Lifecycle Management

Consider how long you need the data stored in each location. You might choose to have short-term data in one location and long-term archival in another.
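
That split can be encoded as a simple routing rule, sketched here with hypothetical store names and a made-up 30-day hot retention window:

```python
import datetime as dt

SHORT_TERM_DAYS = 30  # hypothetical hot-store retention

def store_for(query_start: dt.datetime) -> str:
    """Route a query to the store that still holds that time range."""
    age = dt.datetime.now(dt.timezone.utc) - query_start
    return "hot-platform" if age.days <= SHORT_TERM_DAYS else "cold-archive"

ninety_days_ago = dt.datetime.now(dt.timezone.utc) - dt.timedelta(days=90)
print(store_for(ninety_days_ago))  # -> cold-archive
```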

9. Maintenance and Monitoring

With more destinations come more points of potential failure. Implement robust monitoring to ensure all destinations are consistently available and performing as expected. This is a good opportunity to introduce cross monitoring, where each observability stack monitors the other.
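
Cross monitoring can be as simple as each stack probing the other's health endpoint and alerting through its own pipeline. A sketch with a hypothetical URL:

```python
import urllib.request

# Hypothetical health endpoint -- each stack checks the *other* one,
# so an outage in either platform is still noticed somewhere.
PEER_HEALTH_URL = "https://other-observability-stack.example.com/health"

def check_peer() -> bool:
    try:
        with urllib.request.urlopen(PEER_HEALTH_URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

if not check_peer():
    print("peer observability stack unhealthy -- raise an alert here")
```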

10. Migration and Scalability

As your business grows, you might need to migrate or scale your lower cost observability platform. Ensure the chosen destinations support such migrations without significant overhead.

Conclusion

Multicasting data that is collected by your observability tools offers an innovative approach to maximize both cost efficiency and system resilience. However, like all strategies, it comes with its set of considerations. By understanding and preparing for these considerations, businesses can harness the power of this approach to create observability solutions that are both robust and cost-effective.

Will Krause is VP of Engineering at Circonus