Network Forensics at 40G and 100G Speeds
February 23, 2016

Mandana Javaheri
Savvius

The 40G and 100G market will generate tens of billions of dollars in revenue over the next few years, according to a recent Infonetics market forecast. Growth in traffic, which some analysts estimate will reach 50 to 60 percent annually, creates new opportunities but also puts enormous pressure on networks and introduces new challenges.

Network forensics is one of these new challenges. Although network forensics is most commonly associated with investigating security incidents and breaches, it is also valuable for providing visibility into network activity, troubleshooting issues quickly, and diagnosing common network problems such as connectivity failures, unexpected changes in utilization, or poor VoIP call quality.

Here are some of the ways you can prepare for successful network forensics as network speeds increase.

Know Your Network

To identify anomalies, you first need to define or benchmark what is "normal" for your network. Your network performance solution is your best friend here. Baselining key business applications and measuring important network-based metrics, such as packet size distribution, protocol usage, and node usage, builds an accurate model of normal behavior, so you have something to compare against when problems arise.
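
As a concrete illustration, here is a minimal sketch of computing a few of these baseline metrics from a capture file, using the open-source scapy library in Python. The file name, the 256-byte bucket size, and the choice of metrics are illustrative assumptions, not part of any particular product.

    from collections import Counter
    from scapy.all import PcapReader, IP, TCP, UDP

    size_buckets = Counter()   # packet size distribution, in 256-byte buckets
    proto_usage = Counter()    # rough protocol mix
    node_usage = Counter()     # bytes sent per source node

    with PcapReader("baseline.pcap") as packets:   # hypothetical capture file
        for pkt in packets:
            size_buckets[(len(pkt) // 256) * 256] += 1
            if IP in pkt:
                node_usage[pkt[IP].src] += len(pkt)
                if TCP in pkt:
                    proto_usage["TCP"] += 1
                elif UDP in pkt:
                    proto_usage["UDP"] += 1
                else:
                    proto_usage["other IP"] += 1

    print("Packet size distribution:", dict(size_buckets))
    print("Protocol mix:", dict(proto_usage))
    print("Top talkers:", node_usage.most_common(5))

Run periodically against representative capture windows, numbers like these give you the reference points to spot a shift in traffic makeup later.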

Prepare for Everything

It is not just about having the right network forensics solution; you need the right infrastructure for your new, fast network as well. From your switches to your routers to your network packet brokers to your filtering criteria to your monitoring and forensics tools, every component has to support the higher speeds.

Most importantly, you need to know your network and ask yourself the right questions:

What is your strategy?

Does it make sense to load-balance your traffic across multiple network forensics devices to get full visibility?

Does it make sense to filter out the traffic you don't need? (See the sketch after these questions.)

What is your use case?

How do you usually find out there is an issue?

Is it by constantly monitoring the network or by receiving trouble tickets about performance?
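
To make the filtering question concrete, here is a minimal sketch of keeping only the traffic you need with a standard BPF capture filter, again using scapy in Python. The subnet, the excluded port, and the packet count are purely illustrative assumptions.

    from scapy.all import sniff, wrpcap

    # Keep a hypothetical application subnet; drop backup traffic on a
    # hypothetical port 10000 so it never consumes forensics storage.
    bpf = "net 10.1.0.0/16 and not port 10000"

    # Live capture requires root privileges; count=1000 keeps the example short.
    packets = sniff(filter=bpf, count=1000)
    wrpcap("filtered.pcap", packets)   # store only what matters

The same BPF expression can usually be pushed down to a packet broker or capture appliance, so unwanted traffic is dropped before it ever reaches your forensics tools.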

Every network has its own specific needs, so make sure you know what those needs are and pick a network forensics partner that will help you meet them.

Smart Storage

Defining your storage requirements is an important part of making sure network-level data is available when you need it. The faster the network, the more storage it takes to retain what you need.

A fully utilized 1G network generates roughly 11TB of data per day; at 40G or 100G, daily volumes climb into the hundreds of terabytes and beyond. To control storage costs, you will need to get smarter about what is stored. This is only possible by knowing the network and your specific use cases. Techniques like filtering, packet slicing, and load-balancing will help you use your storage more efficiently, while extended storage, SAN, and cloud-based technologies are also available if needed.
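
The arithmetic behind that number is simple: bits per second divided by eight, times 86,400 seconds per day. A minimal sketch of how capture volume scales with link speed (the utilization parameter is an illustrative addition):

    def daily_capture_tb(link_gbps: float, utilization: float = 1.0) -> float:
        """TB per day = Gbps / 8 bits-per-byte * 86,400 s/day / 1,000 GB-per-TB."""
        return link_gbps / 8 * 86_400 * utilization / 1_000

    for speed in (1, 10, 40, 100):
        print(f"{speed:>3}G fully utilized: ~{daily_capture_tb(speed):,.0f} TB/day")
    # Prints roughly 11, 108, 432, and 1,080 TB/day respectively.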

Depending on your network traffic, forensics and storage requirements, you should pick the amount and type of storage you require today and make sure it can scale to meet your needs in the future.

Intelligent Forensics

Searching through large amounts of packet data to find that essential little trace can be a frustrating process, so choose your search criteria and the type of analytics you run on your traffic wisely. Use your knowledge of the network baseline to define the forensics criteria. Make your search as focused as possible using filters. Define the time range, the application, and the server or client that is experiencing the issue, and drill down into as much detail as needed for troubleshooting. For example, if your problem is not VoIP or wireless related, don't spend hardware resources analyzing that traffic.
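
As an illustration of such a focused search, the sketch below filters a stored capture by time window, server, and application port using scapy in Python; every criterion shown is a hypothetical placeholder.

    from scapy.all import PcapReader, IP, TCP

    WINDOW_START, WINDOW_END = 1456200000, 1456203600  # epoch seconds, a one-hour window
    SERVER, APP_PORT = "10.1.2.3", 80                  # hypothetical server and application port

    matches = []
    with PcapReader("capture.pcap") as packets:        # hypothetical stored capture
        for pkt in packets:
            if not (WINDOW_START <= float(pkt.time) <= WINDOW_END):
                continue  # outside the time range of the reported issue
            if IP in pkt and TCP in pkt \
                    and SERVER in (pkt[IP].src, pkt[IP].dst) \
                    and APP_PORT in (pkt[TCP].sport, pkt[TCP].dport):
                matches.append(pkt)

    print(f"{len(matches)} packets match the focused criteria")

Narrowing by time first is the cheapest win: it lets the reader skip most of the capture before any per-packet analysis happens.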

By knowing your network, using the right techniques and planning ahead, you can turn 40G and 100G network challenges into new opportunities.

Mandana Javaheri is CTO of Savvius.
