To achieve continuous visibility and control of today's complex networks, organizations rely on specialized monitoring and security tools connected to live links. In my last blog, I discussed how TAPs facilitate that failsafe connection. In this blog, I'd like to expand beyond the TAP and look at the role Packet Brokers play in an organization's visibility architecture, including some of the common mistakes engineers should avoid when using the more sophisticated Packet Broker features. Being aware of these issues can help network managers implement an efficient visibility architecture and avoid errors that could adversely affect network monitoring, performance, and ultimately business operations.
Here are 5 common mistakes that are made when deploying Packet Brokers, and how to avoid them:
1. Don't Mistake a Packet Broker for a TAP
TAPs are relatively simple devices that are often confused with Packet Brokers. Both TAPs and Packet Brokers provide tool connectivity and have similar feature sets. However, TAPs provide failsafe network ports. These ports have copper relays or optical splitters that keep network traffic flowing even if power to the TAP is lost. Packet Brokers generally do not have failsafe network ports. Therefore, it's important to make the initial network connections using TAPs and send the traffic through to the Packet Broker for management.
There are some combination TAP/Packet Brokers on the market that provide failsafe network connections and Packet Broker features. These combo (or Hybrid) units can save space and money depending on network size, complexity, and ports needed.
2. Buying New Monitoring Tools When New Links are Too Fast for Older Equipment
With ever-increasing bandwidth demands on networks, new links are often moving from copper connections (10Mbps to 100Mbps) to optical fiber (1Gbps), or from lower-speed fiber (1Gbps) to high-speed fiber (10Gbps – 100Gbps). Changing link media does not necessarily require replacing all legacy monitoring tools. Packet Brokers provide load balancing features that allow high-speed network links to evenly distribute the traffic among a number of lower-speed tools.
For example, an incoming network connection at 40Gbps can be connected to a Packet Broker and distributed through output/tool ports to five monitoring devices with a maximum processing capacity of 8Gbps each. This feature allows network managers to save CAPEX on monitoring tools while keeping pace with faster networking speeds.
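To see why the arithmetic works out, here is a minimal sketch of the session-aware load balancing a Packet Broker might perform. The 5-tuple hash function, port count, and rate figures are illustrative assumptions matching the 40Gbps/five-tool example above, not any vendor's actual implementation:

```python
import hashlib

TOOL_PORTS = 5          # number of lower-speed monitoring tools (assumed)
LINK_RATE_GBPS = 40     # incoming high-speed link
TOOL_CAPACITY_GBPS = 8  # max processing rate of each tool

def tool_for_flow(src_ip, dst_ip, src_port, dst_port, proto):
    """Pick a tool port by hashing the flow 5-tuple, so every packet of a
    given session is delivered to the same tool (session-aware balancing)."""
    key = f"{src_ip}:{src_port}-{dst_ip}:{dst_port}/{proto}".encode()
    return int(hashlib.sha256(key).hexdigest(), 16) % TOOL_PORTS

# With traffic spread evenly, each tool only sees its share of the link.
per_tool_gbps = LINK_RATE_GBPS / TOOL_PORTS
assert per_tool_gbps <= TOOL_CAPACITY_GBPS  # 8 Gbps fits each tool's capacity
```

Hashing on the 5-tuple (rather than round-robin per packet) matters for tools that reassemble sessions, since it keeps both directions of a conversation on one device.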
3. Not Using a Packet Broker for In-Line Security Tools
Many security tools require in-line access to links, meaning that live traffic passes through the tool and back into the network. There are many TAPs that provide in-line access so the tool can have real-time control over live traffic. These TAPs protect live links through an active bypass function that keeps network traffic flowing even if the security tool is taken offline.
In complex networks, managers may be tempted to use multiple independent TAPs for in-line security tools, and Packet Brokers only to manage passive monitoring tools. Packet Brokers, however, can pass real-time traffic delivered through in-line TAPs. This allows the Packet Broker to manage both in-line security and passive monitoring tools through one central device, simplifying deployment of all connected tools.
4. Packet Slicing is Not Packet Manipulation
Packet Slicing is a Packet Broker feature that removes the payload from a packet before it arrives at the monitoring tool. This is done when the monitoring tool requires only packet header information. Packet slicing can be an efficiency feature that allows the monitoring tool to work faster. It's also an important feature for privacy and legal compliance when monitoring equipment shouldn't have access to actual payload data. Accurate traffic monitoring, however, often requires visibility into the entire packet in order to capture and report on packet size and timing through the network.
There are Packet Brokers that provide packet manipulation, which is similar to slicing, but more complex and more accurate for traffic monitoring and planning. This is done by replacing payload information with random 1's and 0's rather than simply removing the payload. Packet manipulation provides privacy compliance, accurate traffic management and a wider range of user-defined options for traffic analysis.
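The difference between the two approaches can be sketched in a few lines. This is a simplified illustration, assuming a fixed 54-byte Ethernet/IPv4/TCP header length; real Packet Brokers parse each packet's actual headers:

```python
import os

HEADER_LEN = 54  # typical Ethernet + IPv4 + TCP header size (assumed fixed here)

def slice_packet(pkt: bytes) -> bytes:
    """Packet slicing: truncate to the headers and discard the payload.
    The packet shrinks, so original size and timing data is lost downstream."""
    return pkt[:HEADER_LEN]

def manipulate_packet(pkt: bytes) -> bytes:
    """Packet manipulation: overwrite the payload with random bytes.
    Payload content is hidden, but the packet keeps its original length,
    so size-based traffic analysis remains accurate."""
    payload_len = max(0, len(pkt) - HEADER_LEN)
    return pkt[:HEADER_LEN] + os.urandom(payload_len)
```

Because the manipulated packet is the same length as the original, tools measuring throughput or packet-size distributions still see accurate numbers, which is the advantage the paragraph above describes.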
5. Not Planning for Scale
When designing a visibility architecture, it's critical to account for future needs. Plan A is to purchase more equipment and ports than initially required, so that capacity for future links and tools is built in from the start. Plan B is to purchase only what is needed today and worry about future needs and budget when the time comes.
However the best plan, Plan C, is to carefully evaluate all Packet Broker equipment options to build extensibility into the plan without breaking the budget. Some Packet Brokers offer scale-out options that allow the purchase of smaller initial units for immediate needs and provide extension units for future growth. This plan allows immediate budgetary savings and provides for growth by simple add-on rather than replacement of older equipment.
Monitoring tools were once used primarily for ad hoc diagnostics, but as networks advance and evolve, these solutions are now permanent additions that deliver vital information for today's modern digital businesses. Trends around BYOD, IoT, social media, and more, are increasing network traffic and malicious activity, making it harder to ensure performance and secure users. Understanding the role of a TAP and Packet Broker — and what mistakes to avoid when deploying them — will allow you to create a flexible visibility architecture that meets the needs of IT, while saving time and money.