Endace announced the launch of the EndaceAccess 100, a new 100 Gigabit Ethernet (100G) network visibility headend.
With this system, organizations can get the network access they need to monitor, analyze, protect and troubleshoot 100G network segments. Until now, 100G networks have been effectively invisible to IT operations teams.
“Until today, there has been no way for organizations to get visibility into 100G network segments, which has slowed 100G adoption,” said Spencer Greene, senior vice president, product management and marketing at Endace. “The EndaceAccess 100 enables organizations to leverage their existing 10 Gbps capable tools in a 100G environment. Previously, when the world moved from 1 Gbps to 10 Gbps networking, the ability to leverage existing tools in this way proved to be a major advantage for organizations, and we expect this to be the case as organizations make the natural transition from 10 Gbps to 40G and 100G networking.”
The EndaceAccess 100 is a 100G and 40G capable headend system that leverages Endace’s world-famous 100 percent accurate DAG® technology. The system can be configured to support LAN or WAN protocols, making it practical for deployment in both data center and WAN environments. EndaceAccess is powerful because it enables organizations to continue to use their existing 10 Gbps monitoring and security tools, which helps ease the transition from 10 Gbps to 100G.
“100G is growing in adoption, but remains the domain of large enterprises and carrier core networks; it will ultimately become approachable for smaller companies within data center deployments. As of today, there is no practical way for organizations seeking to deploy 100G to gain access to the network traffic for the purposes of network and/or network security monitoring,” said Jonah Kowall, research director at Gartner. “Additionally, the monitoring tools themselves cannot handle this level of traffic, which is a critical shortcoming that will prevent the deployment of 100G technology within many organizations.”
Like traditional monitoring switches, the EndaceAccess 100 headend receives high-speed network traffic from passive optical taps and distributes the traffic to multiple lower-speed ports, which can then be connected to scale-out clusters of monitoring tools. Unlike existing monitoring switches that receive 10 Gbps and distribute to multiple 1 Gbps or 10 Gbps capable destinations, the EndaceAccess 100 scales everything up by a factor of ten, receiving 100G or 40G inputs and distributing to multiple 10 Gbps destinations where the traffic can be analyzed.
The system supports two 100G or 40G monitoring ports, configured to capture traffic from both sides of a bidirectional link, in a two-rack-unit form factor. Traffic from each monitoring port is distributed in a flow-safe way across 12 x 10 Gbps ports, enabling the system to scale to full line-rate 100G. The load-balancing algorithm in the EndaceAccess 100 directs captured traffic to specific egress ports by flow and guarantees 100 percent accuracy at 100G.
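As a rough illustration of what "flow-safe" distribution means, the sketch below shows one generic way a headend could map packets to egress ports: hash each packet's flow identifiers and use the result to select one of the 12 x 10 Gbps outputs, so every packet of a given flow lands on the same downstream tool. This is a hypothetical Python sketch for illustration only, not Endace's actual load-balancing algorithm; the port count simply mirrors the figures above.

```python
# Hypothetical illustration of flow-safe load balancing -- NOT Endace's
# proprietary algorithm. Port count mirrors the article: one 100G input
# fanned out across 12 x 10 Gbps egress ports.
import hashlib

NUM_EGRESS_PORTS = 12  # 12 x 10 Gbps = 120 Gbps of egress, headroom above 100G line rate

def egress_port_for_packet(src_ip: str, dst_ip: str,
                           src_port: int, dst_port: int,
                           protocol: int) -> int:
    """Pick an egress port by hashing the packet's flow identifiers.

    The endpoints are sorted first (a symmetric hash), so both directions
    of a flow produce the same key and every packet of that flow is sent
    to the same 10 Gbps port -- the property described as "flow-safe".
    """
    a, b = sorted([(src_ip, src_port), (dst_ip, dst_port)])
    key = f"{a}|{b}|{protocol}".encode()
    digest = hashlib.sha256(key).digest()
    return int.from_bytes(digest[:4], "big") % NUM_EGRESS_PORTS

# Both directions of the same TCP connection map to the same egress port:
print(egress_port_for_packet("10.0.0.1", "192.0.2.7", 51512, 443, 6))
print(egress_port_for_packet("192.0.2.7", "10.0.0.1", 443, 51512, 6))
```

Distributing by flow rather than round-robin is what lets a cluster of existing 10 Gbps tools analyze a 100G link without any single tool seeing only fragments of a conversation.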