GDPR and the Need for a Smart Approach to Service Assurance
June 28, 2018

Michael Segal
NetScout


Following the introduction of the EU General Data Protection Regulation (GDPR) on May 25 this year, organizations across the globe with customers and suppliers in the European Union have been working to ensure they are compliant, bringing the subject of data protection to the front of everyone's mind.

It's little surprise that network security and information assurance are key to complying with the GDPR; the regulation requires measures that mitigate risk to the availability and integrity of an organization's information in the event of an attack or outage, for example.

Article 32 is concerned with the confidentiality, integrity, availability and resilience of processing systems and data, and with the speed at which availability of, and access to, personal data can be restored in the event of downtime resulting from a breach or network outage. Of course, as the information protected by the GDPR and other similar regulations constantly traverses the network, it's important to assure its availability, reliability and responsiveness. Indeed, not only is this important for regulatory compliance, it should be high on the list of priorities for any business.

Given the size and complexity of today's IT networks, however, it can be almost impossible to detect just when and where a security breach or network failure might occur. It's critical, therefore, that businesses have complete visibility over their IT networks, and any applications and services that run on those networks, in order to protect their customers' information, assure uninterrupted service delivery and, of course, comply with the GDPR.

Insight and Intelligence

The volume of data being produced has exploded in recent years and this is only set to continue, with analysts predicting a tenfold increase within the next decade, with around 60 percent of that data generated by enterprises.

Much of this will comprise what the GDPR, and other regulations such as PCI-DSS and HIPAA, define as personal data: the personal email addresses, phone numbers, IP addresses and credit card information that may be collected and recorded by a business. For compliance purposes, it's important that networking teams are able to understand how this data traverses their organization's networks, the paths it will take and where it will be stored.
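As a simplified illustration of how a networking team might flag the personal-data categories named above inside text records crossing the network, here is a toy Python sketch. The regular expressions are deliberately crude, hypothetical stand-ins; production data-classification tooling for GDPR, PCI-DSS or HIPAA scope is far more sophisticated.

```python
import re

# Hypothetical, illustrative patterns for the personal-data types
# mentioned above (email, IP address, card number). Real classifiers
# use validated parsers, checksums and context, not bare regexes.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(record: str) -> set:
    """Return the personal-data categories detected in a text record."""
    return {name for name, rx in PATTERNS.items() if rx.search(record)}

print(classify("contact: alice@example.com from 10.0.0.5"))
# detects the email address and the IP address, but no card number
```

Knowing which records carry which categories is what lets a team map where regulated data travels and where it comes to rest.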

Keeping track of this information requires full visibility across the entire network, including data centers, applications and the cloud. To comply with regulatory requirements around the processing of data, as well as for service and security assurance, businesses should consider a smart approach to the way they handle data. Such an approach involves monitoring all "wire data", that is, every action and transaction that traverses an organization's service delivery infrastructure, and continuously analyzing it and compressing it into metadata at its source. This "smart data" is normalized, organized and structured in a service and security context in real time. The inherent intelligence of the metadata enables analytics tools to clearly understand application performance, infrastructure complexities, service dependencies and, importantly for GDPR compliance, any threats or anomalies.
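The "compress wire data into metadata at its source" idea can be sketched, under heavy simplification, as a reduction of per-transaction events into per-service summary records. The event fields and metrics below are hypothetical stand-ins, not a description of any specific product's pipeline.

```python
from collections import defaultdict
from statistics import mean

def summarize(events):
    """Reduce raw per-transaction events into compact per-service metadata."""
    buckets = defaultdict(list)
    for e in events:
        buckets[e["service"]].append(e)
    meta = {}
    for service, evs in buckets.items():
        latencies = [e["latency_ms"] for e in evs]
        errors = sum(1 for e in evs if e["status"] >= 500)
        meta[service] = {
            "count": len(evs),                          # transaction volume
            "avg_latency_ms": round(mean(latencies), 1),  # responsiveness
            "error_rate": errors / len(evs),            # anomaly signal
        }
    return meta

# Illustrative raw "wire data" events
events = [
    {"service": "auth", "latency_ms": 12, "status": 200},
    {"service": "auth", "latency_ms": 48, "status": 503},
    {"service": "payments", "latency_ms": 30, "status": 200},
]
print(summarize(events))
```

The point of the reduction is that downstream analytics tools work on small, structured, service-contextual records rather than the full firehose of raw traffic.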

Essentially, continuous monitoring of this wire data gives businesses access to contextualized data that provides the real-time, actionable insights they need to assure an effective, resilient and secure infrastructure, crucial for complying with the GDPR, not to mention for much of modern business activity.

More at Stake than Ever

The recent implementation of the GDPR means that any organization that processes the personal data of individuals in the EU, regardless of where in the world that organization is located, is now within the scope of the law. Much has been written over the past year on the eye-watering financial penalties that could be imposed on any company found to be neglectful in fulfilling its duty to protect the privacy of that data. The privacy and protection of personal data have always been considerations for a business, but with the prospect of fines of up to €20 million or four percent of annual global turnover, whichever is greater, there is more at stake for businesses than ever before.

With robust protection in place, and with visibility, insight and intelligence delivering assurance of complete network availability, businesses across the world can breathe a little easier, knowing that the reliability of their networks, and of the applications that run on them, meets the requirements of the GDPR.

Michael Segal is VP of Strategy at NetScout