Visibility is Security
May 16, 2018

Keith Bromley
Ixia


While security experts may disagree on exactly how to secure a network, one thing they all agree on is that you cannot defend against what you cannot see. In other words, network visibility IS network security.

Visibility needs to be the starting point. After that, you can implement whatever appliances, processes, and configurations you need to complete the security architecture. By adopting this strategy, IT gains better insight into, and understanding of, network and application performance to maximize security defenses and speed breach remediation.

One easy way to gain this insight is to implement a visibility architecture that utilizes application intelligence. This type of architecture delivers the critical intelligence needed to boost network security protection and create more efficiencies.

For instance, early detection of breaches using application data reduces the loss of personally identifiable information (PII) and reduces breach costs. Specifically, application level information can be used to expose indicators of compromise, provide geolocation of attack vectors, and combat secure sockets layer (SSL) encrypted threats.

You might be asking, what is a visibility architecture?

A visibility architecture is nothing more than an end-to-end infrastructure which enables physical and virtual network, application, and security visibility. This includes taps, bypass switches, packet brokers, security and monitoring tools, and application-level solutions.

Let's look at a couple of use cases to see the real benefits.

Use Case #1 – Application filtering for security and monitoring tools

A core benefit of application intelligence is the ability to use application data filtering to improve security and monitoring tool efficiencies. Delivering the right information is critical because as we all know, garbage in results in garbage out.

For instance, by screening application data before it is sent to an intrusion detection system (IDS), information that typically does not require screening (e.g. voice and video) can be routed downstream and bypass IDS inspection. Eliminating inspection of this low-risk data can make your IDS solution up to 35% more efficient.
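To make the idea concrete, here is a minimal sketch of application-aware filtering in front of an IDS, assuming a DPI or classification engine has already tagged each flow with an application label. The flow fields, the LOW_RISK_APPS set, and the routing function are hypothetical placeholders for illustration, not any particular packet broker's API.

```python
# Hypothetical sketch: steer low-risk media flows around the IDS,
# send everything else to the IDS inspection queue.

LOW_RISK_APPS = {"rtp-voice", "rtp-video"}   # assumed application labels

def route_flow(flow, ids_queue, bypass_queue):
    """Bypass IDS inspection for low-risk media; inspect the rest."""
    app = flow.get("application")            # label assumed to come from a DPI engine
    if app in LOW_RISK_APPS:
        bypass_queue.append(flow)            # routed downstream, skipping the IDS
    else:
        ids_queue.append(flow)               # forwarded to the IDS for screening

# Example usage with made-up flow records
flows = [
    {"src": "10.1.1.5", "dst": "10.2.2.9",    "application": "rtp-voice"},
    {"src": "10.1.1.7", "dst": "203.0.113.4", "application": "http"},
]
ids, bypass = [], []
for f in flows:
    route_flow(f, ids, bypass)
print(len(ids), "flows to IDS,", len(bypass), "flows bypassed")
```

The design point is simply that the filtering decision happens before traffic reaches the tool, so the IDS spends its cycles only on traffic that actually needs inspection.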

Use Case #2 – Exposing Indicators of Compromise (IOC)

The main purpose of investigating indicators of compromise is to discover and remediate breaches faster. Security breaches almost always leave behind some indication of the intrusion, whether it is malware, suspicious activity, evidence of some other exploit, or the IP addresses of the malware controller.

Despite this, according to the 2016 Verizon Data Breach Investigations Report, most victimized companies don't discover security breaches themselves. Approximately 75% have to be informed by law enforcement or third parties (customers, suppliers, business partners, etc.) that they have been breached. In other words, the company had no idea the breach had happened.

To make matters worse, the average time for the breach detection was 168 days, according to the 2016 Trustwave Global Security Report.

To thwart these security attacks, you need the ability to detect application signatures and monitor your network so that you know what is, and what is not, happening on it. This allows you to see rogue applications running on your network, along with the visible footprints that hackers leave as they travel through your systems and networks. The key is to take a macroscopic, application-level view of the network when looking for IOCs.

For instance, suppose a foreign actor in Eastern Europe (or another part of the world) has gained access to your network. Using application data and geolocation information, you could easily see that someone in Eastern Europe is transferring files off the network from an FTP server in Dallas, Texas back to an address in Eastern Europe. Is this an issue? That depends on whether you have authorized users in that location. If not, it's probably a problem.
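As a rough illustration, that check can be as simple as correlating outbound file-transfer flows with destination geolocation. The sketch below is a minimal Python example; the flow records, the country_of() lookup, and the AUTHORIZED_COUNTRIES list are all assumptions made for this example, not a specific product's API.

```python
# Illustrative sketch only: flag outbound FTP-style transfers whose
# destination geolocation is not on an approved list.

AUTHORIZED_COUNTRIES = {"US"}        # countries where authorized users are expected

GEO_DB = {                           # stand-in for a real GeoIP lookup
    "198.51.100.20": "UA",
    "192.0.2.10": "US",
}

def country_of(ip):
    """Return the (assumed) country code for an IP address."""
    return GEO_DB.get(ip, "unknown")

def suspicious_transfers(flows):
    """Return outbound file transfers headed to unexpected countries."""
    hits = []
    for f in flows:
        if f["application"] == "ftp" and f["direction"] == "outbound":
            dest_country = country_of(f["dst"])
            if dest_country not in AUTHORIZED_COUNTRIES:
                hits.append((f, dest_country))
    return hits

# Example usage with a made-up flow record
flows = [
    {"application": "ftp", "direction": "outbound",
     "src": "10.0.5.12", "dst": "198.51.100.20", "bytes": 750_000_000},
]
for flow, country in suspicious_transfers(flows):
    print(f"Possible exfiltration: {flow['src']} -> {flow['dst']} ({country})")
```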

Thanks to application intelligence, you now know the activity is happening. The rest is up to you: decide whether or not this is an indicator of compromise for your network.

Keith Bromley is Senior Manager, Solutions Marketing at Ixia Solutions Group, a Keysight Technologies business.
