
cPacket Networks announced new features for its cClear platform that enable customers to improve monitoring and fault identification on 40Gbps and 100Gbps networks.
The new capabilities offer a wide range of customer benefits including:
- 40Gbps Encapsulated Remote Switch Port Analyzer (ERSPAN) Type III termination functionality
- Consistent timestamp extraction across network streams generated by third-party devices
- Dynamic truncation combined with new and advanced cSearch features, providing users with highly customizable visibility into the network
“As 40Gbps and 100Gbps networks become the new norm for large data centers, customers need to have the proper tools in place to prevent network downtime and increased operational costs,” said Brendan O’Flaherty, CEO at cPacket. “With our new set of features, we’ve extended our lead over packet broker and NPM providers that are still scaling their products for the basic needs of high-speed networks. Features such as ERSPAN termination, timestamping, and dynamic truncation have become essentials in the toolsets of network personnel challenged with harnessing network traffic.”
cPacket’s new features and benefits include:
- 40Gbps ERSPAN Type III Termination: cPacket devices can extract timestamps from ERSPAN packets with one-nanosecond accuracy. As traffic rates climb across the network, it is crucial to deliver more traffic from more points in the network. This feature allows users to terminate 40Gbps and 10Gbps ERSPAN connections, ultimately improving monitoring capabilities from edge to core to data center (a header-parsing sketch follows this list).
- Timestamp Extraction from Arista 7150 Series Switches: This feature enables customers with Arista 7150 Series switches to standardize timing across all monitoring tools and devices. Because of its placement in the network, cPacket’s cVu generates accurate timestamps and is widely regarded as a de facto industry standard for doing so (a trailer-extraction sketch follows this list).
- Special Action Filters and Dynamic Truncation: This feature overcomes the challenge of selectively filtering traffic for additional processing on a case-by-case basis. cPacket’s unique Smart Filter technology inspects every byte of every packet, header and payload alike, at wire speed, ensuring fully granular traffic pruning for performance monitoring and network troubleshooting. In addition, cPacket’s intelligent Dynamic Truncation feature removes TCP or UDP payloads while leaving the headers intact. This is highly beneficial to cPacket’s customers in finance and healthcare, who must remain compliant by removing Personally Identifiable Information (PII) while retaining the header information required for troubleshooting the network (a payload-truncation sketch follows this list).
- cSearch – New and Improved: This highly sought-after feature lets customers search the entire monitored network for a specific pattern, identifying wherever traffic matching a specified profile appears, which reduces troubleshooting time and improves security. In addition, with the 18.1.1 release, cSearch can be accessed and used from an external device via a RESTful API, enabling easier integration and automation (an API-call sketch follows this list).
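To make the ERSPAN termination item concrete, here is a minimal sketch of parsing the 12-byte ERSPAN Type III header that follows the GRE encapsulation. The field offsets follow the public draft layout (draft-foschiano-erspan), not cPacket’s implementation, which performs this termination at line rate in hardware:

```python
import struct

# Timestamp granularity values per the ERSPAN Type III draft header:
# 0b00 = 100 us, 0b01 = 100 ns, 0b10 = IEEE 1588, 0b11 = user-defined.
GRANULARITY = {0: "100us", 1: "100ns", 2: "ieee1588", 3: "user-defined"}

def parse_erspan3_header(payload: bytes):
    """Parse the 12-byte ERSPAN Type III header following GRE.

    Returns the session ID, the raw 32-bit timestamp field, and the
    granularity that defines the timestamp's units.
    """
    if len(payload) < 12:
        raise ValueError("truncated ERSPAN Type III header")
    w0, w1, w2 = struct.unpack("!III", payload[:12])
    version = w0 >> 28
    if version != 2:  # version field 2 on the wire denotes Type III
        raise ValueError(f"not ERSPAN Type III (version field {version})")
    session_id = w0 & 0x3FF          # low 10 bits of the first word
    timestamp = w1                   # 32-bit timestamp, units set by 'gra'
    gra = (w2 >> 1) & 0x3            # 2-bit granularity field
    return session_id, timestamp, GRANULARITY[gra]
```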
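For the Arista 7150 item, the sketch below shows the general idea of recovering a switch-appended trailer timestamp. It assumes the 7150’s before-FCS mode, where a 32-bit ASIC counter value is appended to each frame, plus an assumed 350 MHz tick rate reconciled against periodic keyframes; verify both assumptions against Arista’s timestamping documentation before relying on them:

```python
import struct

TICK_HZ = 350_000_000  # assumed ASIC counter rate; verify for your platform

def trailer_counter(frame: bytes, fcs_present: bool = False) -> int:
    """Extract the 32-bit counter assumed to be appended before the FCS.

    Assumes the last 4 bytes of the frame body (before the FCS, if the
    capture retained it) hold the counter in network byte order.
    """
    end = -4 if fcs_present else len(frame)
    return struct.unpack("!I", frame[end - 4:end])[0]

def to_utc_ns(counter: int, keyframe_counter: int, keyframe_utc_ns: int) -> int:
    """Convert a counter sample to UTC nanoseconds using the most recent
    keyframe (a periodic record pairing a counter value with wall time)."""
    delta_ticks = (counter - keyframe_counter) & 0xFFFFFFFF  # 32-bit wrap
    return keyframe_utc_ns + delta_ticks * 1_000_000_000 // TICK_HZ
```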
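The Dynamic Truncation item can be illustrated offline with scapy. The sketch below strips TCP and UDP payloads from a capture file while keeping every header intact, the same effect cPacket applies in hardware at wire speed; it is a conceptual illustration, not cPacket’s implementation:

```python
from scapy.all import IP, TCP, UDP, rdpcap, wrpcap

def strip_l4_payload(pkt):
    """Remove the TCP/UDP payload, keep all headers (IPv4 shown for brevity)."""
    for l4 in (TCP, UDP):
        if pkt.haslayer(l4):
            pkt[l4].remove_payload()
            # Invalidate cached length/checksum fields so scapy
            # recomputes them for the shortened packet on write.
            if pkt.haslayer(IP):
                del pkt[IP].len
                del pkt[IP].chksum
            del pkt[l4].chksum
            if l4 is UDP:
                del pkt[UDP].len
    return pkt

# Truncate an existing capture: PII-bearing payloads are dropped,
# while the headers needed for troubleshooting are preserved.
packets = [strip_l4_payload(p) for p in rdpcap("capture.pcap")]
wrpcap("truncated.pcap", packets)
```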
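Finally, a sketch of driving cSearch from an external device over its RESTful API. The endpoint path, JSON fields, and authentication scheme below are illustrative assumptions, not cPacket’s documented contract; consult the 18.1.1 API documentation for the real interface:

```python
import requests

# Hypothetical cClear address and credentials -- replace with real values.
BASE_URL = "https://cclear.example.com"
TOKEN = "REPLACE_WITH_API_TOKEN"

def csearch(pattern_hex: str, timeout: int = 60) -> dict:
    """Submit a network-wide pattern search and return matches per
    monitoring point (endpoint and fields are assumptions)."""
    resp = requests.post(
        f"{BASE_URL}/api/csearch",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"pattern": pattern_hex, "scope": "all"},
        timeout=timeout,
        verify=False,  # appliances often ship self-signed certificates
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Locate every monitored link carrying a given hex byte pattern.
    for hit in csearch("deadbeef").get("matches", []):
        print(hit)
```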
These new features are available now as part of cPacket’s 18.1.1 release.