Network Forensics at 40G and 100G Speeds
February 23, 2016

Mandana Javaheri
Savvius

The 40G and 100G market will generate tens of billions of dollars in revenue in the next few years according to a recent Infonetics market forecast. Growth in traffic, which some analysts estimate will reach 50 to 60 percent annually, enables new opportunities but also puts enormous pressure on networks and creates new challenges.

Network forensics is one of these new challenges. Although network forensics is most commonly associated with investigating security incidents and breaches, it is also very valuable for providing visibility into network activities, troubleshooting issues quickly, and diagnosing common network problems such as connectivity failures, unexpected changes in utilization, or poor VoIP call quality.

Here are some of the ways you can prepare for successful network forensics as network speeds increase.

Know Your Network

To identify anomalies, you first need to define or benchmark what is "normal" for your network. Your network performance solution is your best friend here. Baselining key business applications and measuring important network metrics such as packet size distribution, protocol usage, and node usage builds an accurate model of normal behavior, so you have something to compare against when problems occur.
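As a rough illustration of what baselining might look like, the sketch below summarizes packet size distribution, protocol usage, and top talkers from a set of packet records. The PacketRecord fields and the 256-byte size bins are assumptions made for the example; in practice your network performance solution collects and reports these metrics for you.

```python
from collections import Counter
from dataclasses import dataclass
from statistics import mean

# Hypothetical packet record; in a real deployment these fields would come
# from your capture or monitoring tool, not from this script.
@dataclass
class PacketRecord:
    timestamp: float   # epoch seconds
    length: int        # bytes on the wire
    protocol: str      # e.g. "TCP", "UDP", "ICMP"
    src: str           # source node (IP address)

def baseline(records):
    """Summarize packet size distribution, protocol usage, and node usage."""
    sizes = [r.length for r in records]
    size_bins = Counter((r.length // 256) * 256 for r in records)  # 256-byte bins
    protocols = Counter(r.protocol for r in records)
    nodes = Counter(r.src for r in records)
    return {
        "avg_packet_size": mean(sizes) if sizes else 0,
        "size_distribution": dict(sorted(size_bins.items())),
        "protocol_usage": dict(protocols.most_common()),
        "top_talkers": dict(nodes.most_common(10)),
    }

# Example: snapshot a quiet period and save the result as your "normal" model,
# then compare later snapshots against it when something looks wrong.
records = [
    PacketRecord(0.0, 1500, "TCP", "10.0.0.5"),
    PacketRecord(0.1, 64, "UDP", "10.0.0.7"),
    PacketRecord(0.2, 1500, "TCP", "10.0.0.5"),
]
print(baseline(records))
```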

Prepare for Everything

It is not just about having the right network forensics solution; you also need the right infrastructure for your new, faster network. From your switches to your routers to your network packet brokers to your filtering criteria to your monitoring and forensics tools, everything has to be compatible with the higher speeds.

And most importantly, you need to know your network and ask yourself the right questions:

What is your strategy?

Does it make sense to load-balance your traffic across multiple network forensics devices to get full visibility?

Does it make sense to filter out the traffic you don't need?

What is your use case?

How do you usually find out there is an issue?

Is it by constantly monitoring the network or by receiving trouble tickets about performance?

Every network has its own specific needs, so make sure you know what those needs are and pick a network forensics partner that will help you meet them.

Smart Storage

One of the most important parts of making sure network-level data is available when you need it is defining your storage requirements. The faster the network, the more storage is required to retain what you need.

A fully utilized 1G network will generate 11TB of data per day. To control storage costs, you will need to get smarter about what is stored. This is only possible by knowing the network and your specific use cases. Techniques like filtering, packet slicing and load-balancing will help you use your storage more efficiently, while extended storage, SAN, and cloud-based technologies are also available if needed.
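A back-of-the-envelope calculation makes the scaling concrete. The sketch below estimates raw capture storage per day at 1G, 40G, and 100G, and shows how a filtering keep fraction and a packet-slicing fraction reduce it; both reduction factors are hypothetical values chosen only for illustration.

```python
# Rough estimate of daily capture storage on a fully utilized link, plus the
# effect of filtering (keep_fraction) and packet slicing (slice_fraction).
# The reduction factors below are illustrative, not recommendations.

SECONDS_PER_DAY = 86_400

def daily_storage_tb(link_gbps: float, utilization: float = 1.0,
                     keep_fraction: float = 1.0, slice_fraction: float = 1.0) -> float:
    """Terabytes (decimal) captured per day on one link."""
    bytes_per_day = link_gbps * 1e9 / 8 * SECONDS_PER_DAY * utilization
    return bytes_per_day * keep_fraction * slice_fraction / 1e12

for speed in (1, 40, 100):
    raw = daily_storage_tb(speed)
    trimmed = daily_storage_tb(speed, keep_fraction=0.4, slice_fraction=0.2)
    print(f"{speed:>3}G: {raw:8.1f} TB/day raw, "
          f"{trimmed:7.1f} TB/day after filtering and slicing")
```

Running this shows roughly 11TB per day at 1G, but around 430TB at 40G and over 1PB at 100G, which is why storing everything indiscriminately stops being an option at higher speeds.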

Depending on your network traffic, forensics and storage requirements, you should pick the amount and type of storage you require today and make sure it can scale to meet your needs in the future.

Intelligent Forensics

Searching through large amounts of packet data to find that essential little trace can be a frustrating process, so choose your search criteria and the type of analytics you run on your traffic wisely. Use your knowledge of the network baseline to define the forensics criteria. Make your search as focused as possible using filters: define the time range, the application, and the server or client that is experiencing the issue, and drill down to as much detail as needed for troubleshooting. For example, if your problem is not VoIP or wireless related, don't spend hardware resources analyzing that traffic.
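As a generic illustration of a focused search, the sketch below narrows a set of packet records to a time range, a host under investigation, and an application port. The record format and the in_scope helper are hypothetical; a real forensics tool applies equivalent criteria for you, or you can express them as a capture filter such as "host 10.0.0.5 and tcp port 443".

```python
from datetime import datetime, timezone

# Narrow a forensics search by time range, host of interest, and application
# port. The record layout here is assumed purely for the example.
def in_scope(record, start, end, hosts, port):
    ts = datetime.fromtimestamp(record["timestamp"], tz=timezone.utc)
    return (
        start <= ts <= end
        and (record["src"] in hosts or record["dst"] in hosts)
        and port in (record["sport"], record["dport"])
    )

start = datetime(2016, 2, 23, 9, 0, tzinfo=timezone.utc)
end = datetime(2016, 2, 23, 9, 30, tzinfo=timezone.utc)

records = [
    {"timestamp": 1456218300.0, "src": "10.0.0.5", "dst": "192.0.2.10",
     "sport": 51000, "dport": 443},
    {"timestamp": 1456218400.0, "src": "10.0.0.9", "dst": "192.0.2.20",
     "sport": 51001, "dport": 5060},
]

# Keep only traffic for the client under investigation on the HTTPS port.
focused = [r for r in records if in_scope(r, start, end, {"10.0.0.5"}, 443)]
print(focused)
```

The narrower the scope you can define up front, the less data the forensics engine has to process, which matters most at 40G and 100G capture rates.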

By knowing your network, using the right techniques and planning ahead, you can turn 40G and 100G network challenges into new opportunities.

Mandana Javaheri is CTO of Savvius.
