Network Forensics in a World of Faster Networks
April 18, 2014

Jay Botelho
LiveAction

Enterprises are relying on their networks more than ever before, but the volume of traffic on faster, higher-bandwidth networks is outstripping the data collection and analysis capabilities of traditional network analysis tools. Yesterday's network analyzers, originally designed for 1G or slower networks, can't handle the increased traffic, resulting in dropped packets and erroneous reports.
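To put those speeds in perspective, here is a back-of-the-envelope sketch of what sustained full-packet capture demands at common line rates. The figures assume a fully utilized link and are purely illustrative; they are not drawn from the survey discussed below.

```python
# Back-of-the-envelope arithmetic: disk-throughput and storage demands
# of sustained full-packet capture at common line rates, assuming a
# fully utilized link. Illustrative only; real volumes depend on
# actual link utilization.

LINE_RATES_GBPS = [1, 10, 40, 100]

for gbps in LINE_RATES_GBPS:
    bytes_per_sec = gbps * 1e9 / 8             # line rate in bytes/second
    gb_per_minute = bytes_per_sec * 60 / 1e9   # gigabytes per minute
    tb_per_day = bytes_per_sec * 86400 / 1e12  # terabytes per day
    print(f"{gbps:>3} Gbps: {bytes_per_sec / 1e9:5.2f} GB/s sustained, "
          f"{gb_per_minute:7.1f} GB/min, {tb_per_day:8.1f} TB/day")
```

At 10G, a saturated link produces 1.25 GB of packets every second, roughly 108 TB per day, so an analyzer engineered for 1G has little choice but to drop what it cannot write to disk.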

Earlier this year, WildPackets surveyed more than 250 network engineers and IT professionals to better understand how network forensics solutions were being used within the enterprise. Respondents hailed from organizations of all sizes and industries, with the plurality (30%) coming from the technology industry. Furthermore, 50% of all respondents identified themselves as network engineers, and 28% were at the director level or above.

According to the survey, 72% of organizations have increased their network utilization over the past year, resulting in slower problem identification and resolution (38%), less real-time visibility (25%) and more dropped packets leading to inaccurate results (15%).

What we found most interesting was that even though 66% of the survey respondents supported 10G or faster network speeds, only 40% of respondents answered affirmatively to the question "Does your organization currently have a network forensics solution in place?"

So what's the big deal? Faster network speeds not only make networks harder to secure and troubleshoot; they also generate massive volumes of data that traditional network analysis solutions simply cannot keep up with.

Organizations need better visibility into the data traversing their networks, and deploying a network forensics solution is the only way to gain 24/7 visibility into business operations while also analyzing network performance and IT risks with 100% reliability. Most current solutions rely on sampled traffic and high-level statistics, which lack the details and hard evidence that IT engineers need to quickly troubleshoot problems and characterize security attacks.
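As a rough illustration of why sampling loses that evidence, consider 1-in-1000 packet sampling, a typical rate for flow exporters on fast links, applied to a mix of long and short flows. The simulation below is hypothetical, with made-up flow sizes; it is not part of the survey:

```python
import random

# Hypothetical simulation: 1-in-N packet sampling makes short flows
# (often the ones that matter in an investigation) effectively
# invisible, while full packet capture retains every packet.

random.seed(42)
SAMPLE_RATE = 1000  # sample 1 packet in 1000

flows = {            # flow name -> packets in the flow (made-up numbers)
    "bulk transfer": 500_000,
    "video stream": 120_000,
    "suspicious login": 12,
    "dns lookup": 4,
}

for name, packets in flows.items():
    # Count how many of the flow's packets the sampler happens to keep
    sampled = sum(1 for _ in range(packets)
                  if random.randrange(SAMPLE_RATE) == 0)
    # Probability that the flow leaves no trace at all in the samples
    p_missed = (1 - 1 / SAMPLE_RATE) ** packets
    print(f"{name:16s}: {packets:7,d} packets -> {sampled:4d} sampled "
          f"(chance of seeing none: {p_missed:.1%})")
```

In this toy example the 12-packet login flow, exactly the kind of short transaction that matters in a security investigation, has a roughly 99% chance of leaving no trace in the sampled record; full packet capture preserves it in its entirety.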

With faster networks driving a significant increase in the volume of data being transported (74% of survey respondents have seen an increase in the volume of data traversing their networks over the last year), network forensics has become an essential IT capability at every network location. The recent wave of security breaches is a case in point: deployed within the security operations center, network forensics can pinpoint breaches and infiltrations.

In the past, network forensics was often considered synonymous with security incident investigations. But the results of our survey show that organizations are using these solutions for a variety of reasons. While 25% of respondents said they deploy network forensics for troubleshooting security breaches, almost an equal number (24%) cited verifying and troubleshooting transactions as the key function. Another 17% said analyzing network performance on 10G and faster networks was their main use for forensics, 17% reported using the solution for verifying VoIP or video traffic problems, and 14% cited validating compliance.

In addition, organizations said the biggest benefits of network forensics are improved overall network performance (40%), reduced time to resolution (30%), and reduced operating costs (21%).

Enterprises recognize that network forensics provides the necessary visibility into their business operations, and with increased 40G and 100G network deployments forecast for the next year, network forensics will be a critical tool for gaining visibility into these high-performance networks and for troubleshooting issues when they arise. Given the many uses of network forensics, the gap between those deploying high-speed networks and those deploying network forensics should shrink over the coming years.

Jay Botelho is Senior Director of Product Management at LiveAction.
