Achieving Intelligent Data Governance
Why It's Better to Be Intelligent than High When It Comes to Data Governance
August 21, 2018

Don Boxley
DH2i


High availability's (HA) primary objective has historically been to ensure continuous operations and performance. HA was built on a foundation of redundancy and failover technologies and methodologies to preserve business continuity in the event of workload spikes, planned maintenance, and unplanned downtime.

Today, HA methodologies have been superseded by intelligent workload routing automation (i.e., intelligent availability), whereby data and their processing are consistently directed to the proper place at the right time. Intelligent availability partially stems from the distributed realities of the modern data landscape, in which information assets are dispersed on premises, in the cloud, and at the cloud's edge.

Consequently, regulatory compliance has emerged as much a driver for intelligent availability as performance. With increasing regulations and penalties (such as those of the European Union's General Data Protection Regulation, i.e., GDPR), missteps in where workloads are routed could have dire legal and financial consequences, especially for data in the cloud.

Many countries and industries have stringent regulations about data's location that directly affect cloud deployments. Organizations must know how and where such data is permitted in the cloud before shifting it there for availability or performance reasons.

Crafting policies in accordance with these regulations is crucial to leveraging intelligent availability to ensure compliance, and effectively transforms data governance into intelligent data governance.

Cloud Concerns

Cloud deployments have a number of opaque areas in relation to routing workloads for availability. These pertain to the type of cloud involved (public, private, or hybrid), the method of redundancy used, and the nature of the data.

The GDPR, for example, has a number of regulations for personal data, a broad term for "any information related to an identified or identifiable natural person." As such, organizations must be extremely cautious about transporting this type of data, despite the performance gains of doing so. For example, cloud bursting is advantageous for optimizing performance during sudden peaks in network activity that are common for online transaction processing (OLTP) in finance or manufacturing. Migrating these workloads from local settings to public ones may balance network activity, but can violate regulations in the process.
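To make that trade-off concrete, the sketch below expresses a data-residency policy as data and consults it before a workload is burst to a public cloud region. The classifications, region names, and the may_burst helper are illustrative assumptions, not any particular product's interface.

# Hypothetical sketch: a residency policy table consulted before a workload
# is burst from an on-premises host to a public cloud region.
RESIDENCY_POLICY = {
    # data classification -> regions where that data may be processed
    "personal_data_eu": {"eu-west-1", "eu-central-1"},  # GDPR-scoped data stays in the EU
    "cardholder_data":  {"on_premises"},                 # PCI DSS data is not burst at all
    "public_content":   {"eu-west-1", "us-east-1", "ap-southeast-2"},
}

def may_burst(classification: str, target_region: str) -> bool:
    """Return True only if policy permits moving this class of data to the region."""
    allowed = RESIDENCY_POLICY.get(classification, set())
    return target_region in allowed

# During a traffic spike, the routing layer checks policy before migrating:
if may_burst("personal_data_eu", "us-east-1"):
    print("burst workload to us-east-1")
else:
    print("keep workload on-premises or burst to a compliant EU region")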

Organizations must take similar precautions when strategizing for disaster recovery (DR), one of the chief benefits of intelligent availability. Downtime may be minimized by implementing automatic failovers into the cloud, but doing so can also compromise regulatory compliance.

Cloud compliance issues not only involve where data are stored, but also where (and how) they're processed. GDPR, for example, distinguishes data processors from data controllers. Controllers are the organizations that determine how and why data are used, while processors can include any assortment of SaaS or SOA options handling data on a controller's behalf; both must adhere to GDPR's personal data regulations. Organizations must assess these roles when cloud brokering among various providers, particularly when pursuing transient pricing specials.
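A minimal sketch of that assessment, under the assumption of a hypothetical set of brokered offers, might filter candidate providers by whether a data processing agreement is in place and whether the offer sits in a permitted region before price is even compared; the ProviderOffer fields below are invented for illustration, not any broker's real schema.

# Hypothetical sketch: when brokering among providers for a transient pricing
# offer, keep only those that qualify as compliant data processors.
from dataclasses import dataclass

@dataclass
class ProviderOffer:
    name: str
    region: str
    hourly_price: float
    gdpr_dpa_signed: bool      # data processing agreement in place
    encrypts_in_transit: bool

def eligible_offers(offers, allowed_regions):
    """Filter brokered offers down to processors a controller may lawfully use."""
    return [
        o for o in offers
        if o.gdpr_dpa_signed and o.encrypts_in_transit and o.region in allowed_regions
    ]

offers = [
    ProviderOffer("provider-a", "eu-west-1", 0.12, True, True),
    ProviderOffer("provider-b", "us-east-1", 0.07, True, True),     # cheaper, but wrong region
    ProviderOffer("provider-c", "eu-central-1", 0.10, False, True), # no DPA signed
]

cheapest = min(eligible_offers(offers, {"eu-west-1", "eu-central-1"}),
               key=lambda o: o.hourly_price)
print(cheapest.name)  # provider-a: the only offer that satisfies all constraints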

Other regulations, such as the Payment Card Industry Data Security Standard (PCI DSS), have rigid stipulations about encrypting data (especially data in transit) that may apply to workloads spontaneously moved to the cloud. Those in the e-commerce or retail spaces must consider the intricacies of server-side or client-side encryption, especially when replicating data between clouds.

The Intelligent Way

Intelligent availability provides the best means of effecting regulatory compliance while dynamically shifting workloads between environments for all of the preceding scenarios. The core of this method is the governance policies devised to meet compliance standards.

Although intelligent availability doesn't determine which information is sensitive or dictate where it can be routed, it offers freedom of portability across settings (including operating systems and physical and virtual infrastructure) that all but forces organizations to identify these factors. This real-time, on-demand shifting of resources is the catalyst to evaluate workloads through a governance lens, update policies as needed, and leverage them to predetermine the optimal routing of data and their processing for availability. Intelligent availability is the means of implementing intelligent data governance; it's a conduit between performance and regulatory compliance that increases competitive advantage.

Implementing Intelligent Governance

Once those policies are in place, the intelligent availability approach maximizes cloud deployments while maintaining regulatory adherence. Its intelligent algorithms continuously monitor server performance to automatically detect surges, either issuing alerts to organizations or initiating the transfer of workloads to alternative hosts. By already having agreed-upon policies conforming to governance practices, prudent organizations can confidently move data to the cloud without violating regulations. Thus, cloud bursting measures can regularly be deployed to minimize network strain during OLTP spikes (or for any other reason) without costly penalties.
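A stripped-down sketch of that monitor-and-route loop appears below. The window size, threshold, and host names are assumptions for illustration; this is not DH2i's algorithm, only the general pattern of detecting a sustained surge and then either failing over to a policy-approved target or alerting operators.

# Hypothetical sketch: watch a host's load and, on a sustained surge, either
# hand the workload to a compliant target or raise an alert.
from collections import deque

WINDOW = 5            # samples in the rolling window
CPU_SURGE = 85.0      # percent utilization treated as a surge

def surging(samples: deque) -> bool:
    """A surge is a full window of samples above the threshold."""
    return len(samples) == WINDOW and min(samples) > CPU_SURGE

def route(workload: str, samples: deque, compliant_targets: list) -> str:
    if not surging(samples):
        return f"{workload}: stay on current host"
    if compliant_targets:
        return f"{workload}: fail over to {compliant_targets[0]}"
    return f"{workload}: alert operators (no compliant target available)"

readings = deque([88.2, 91.5, 90.1, 93.7, 89.9], maxlen=WINDOW)
print(route("oltp-finance", readings, ["cloud-eu-west-1"]))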

Companies also have the benefit of automatic failovers to the cloud to maintain business continuity in the event of natural disasters or other failures. This option virtually eliminates downtime, enabling IT to perform maintenance on even the most mission-critical infrastructure once data is properly re-routed offsite.

One of the most useful advantages of intelligent availability is the capability to span clouds, both across providers and across the variations of clouds available. Although well-sourced governance policies are essential to receiving the pricing boons of cloud brokering, intelligent availability's ability to start and stop workloads at the instance level while transporting data between settings is just as valuable.

The data processing issue is a little more complicated, but is assisted by intelligent availability's flexibility. Once organizations have researched the various service level agreements of cloud vendors, as well as the policies of other data processors, including software companies, they can utilize these platforms in accordance with regulations, transferring their resources where they're permitted. Most encryption concerns are solved with client-side encryption, whereby organizations encrypt data before replicating them to the cloud and retain sole possession of the keys. Intelligent availability measures transport this data to the cloud and back as needed.
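For the client-side encryption pattern, a minimal sketch might look like the following, using the third-party Python cryptography package; the replicate_to_cloud function is a stand-in for whatever replication mechanism an organization actually uses, and the payload is invented.

# Hypothetical sketch: encrypt data locally, keep the key on-premises, and
# replicate only ciphertext to the cloud (pip install cryptography).
from cryptography.fernet import Fernet

# Key is generated and retained locally; it never leaves the organization.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"cardholder: 4111 1111 1111 1111"   # illustrative sensitive payload
ciphertext = cipher.encrypt(record)

def replicate_to_cloud(blob: bytes) -> None:
    """Stand-in for the replication mechanism that moves data offsite."""
    print(f"replicating {len(blob)} encrypted bytes")

replicate_to_cloud(ciphertext)                 # only ciphertext crosses the wire

# When the workload shifts back, the locally held key decrypts the data.
assert cipher.decrypt(ciphertext) == record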

Adherence

The escalating presence of regulatory mandates isn't likely to subside anytime soon. Compliance standards are just as critical as performance when making workloads available across heterogeneous settings. Intelligent availability's support of versatile storage and processing environments, in conjunction with its low-latency portability, makes it a natural extension of intelligent data governance implementations. These methods ensure data is moved correctly, the first time, to maintain regulatory adherence during an age when it's most difficult to do so.

Don Boxley is CEO and Co-Founder of DH2i
