3 Reasons Most Enterprises Aren't Ready For Advanced Analytics Strategies
May 20, 2016

Dan Ortega
Blazent


"Data, data everywhere, and not a drop to drink." All businesses are fully aware of how much data they're swimming through on a daily basis. And because its buzzy and trendy, most of these businesses are looking to do more with their data, striving to implement cool sounding technologies like machine learning and predictive analytics.

How many, exactly? In a recent 451 Research survey on advanced analytics, 41% of executives said they are looking to begin implementing applications such as machine learning or predictive modeling in the next 12 months, and an additional 14% plan to do so within the next 24.

And why shouldn't they? These sophisticated programs are highly efficient and represent the future of many different verticals supported by the technology industry.

Yet as enterprises and their leadership see these initiatives on the horizon, a startling number are overlooking a crucial factor that could make or break the success of these investments: the quality of their own data. With some enterprises curating up to 200 disparate data sources, ensuring data quality is no easy task. But getting it right can make the difference between a very public crash 'n' burn and being the standard that everyone else tries to emulate.

Here are three reasons why the average enterprise isn't properly prepared for an advanced analytics strategy.

Reason 1: Medieval Methods for Managing Data Quality

According to the survey, 37% of enterprises employ a manual data cleansing process. Given current data volumes, manual cleansing isn't so much a 1990s approach as a 1500s one. Many of these enterprises are starting to look toward algorithmic automation – but how can they successfully automate advanced processes when their back-end data quality checks remain manual?
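To make the contrast concrete, here is a minimal sketch of what rule-based, automated quality checks can look like in practice. It assumes a small, hypothetical pandas DataFrame of asset records; the column names and rules are illustrative only, not taken from the survey or from any specific product.

# A rough sketch of automated, rule-based quality checks, assuming a
# hypothetical pandas DataFrame of asset records. Column names and rules
# are illustrative only.
import pandas as pd

records = pd.DataFrame({
    "asset_id":  ["A-100", "A-101", "A-101", None],
    "owner":     ["finance", "it", "it", "hr"],
    "last_seen": ["2016-05-01", "2016-04-30", "2016-04-30", "not a date"],
})

issues = pd.DataFrame({
    # missing primary identifiers
    "missing_asset_id": records["asset_id"].isna(),
    # exact duplicate rows, a common symptom of merging overlapping sources
    "duplicate_row": records.duplicated(keep="first"),
    # timestamps that fail to parse
    "bad_last_seen": pd.to_datetime(records["last_seen"], errors="coerce").isna(),
})

print(issues.sum())                  # count of violations per rule
print(records[issues.any(axis=1)])   # rows routed to remediation, not hand-checked

The point isn't these particular rules; it's that checks like these run on every refresh, at machine speed, instead of waiting for a human to eyeball a spreadsheet.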

44.5% of respondents are in a reactive mode, meaning they only deal with data quality when it becomes a problem … that they notice (and by the way, their customers likely noticed it long before they did).

The majority of respondents (65%) acknowledge that up to 50% of business value can be lost to poor data quality – do you think that number is going to decrease as the number of initiatives that rely on clean data increases?

Reason 2: Businesses Don't Know The Exact Quality of Their Data

Because of these current Data Quality Management "strategies," IT departments and C-suite executives have little faith in the actual quality of their data.

Over half (57%) of respondents in this survey were "somewhat confident", "unaware", or "less than confident" in the state of their data. Not exactly a resounding endorsement.

This feeling is compounded by the dependence on manual effort to drive remediation in many enterprises' data quality processes. Manual entry was cited as the leading cause of poor data quality, also coming in at 57%.

To be fair, you can't blame employees for making mistakes in data entry or processing, but you can blame their management for not providing them with the right tools to handle the volume of data they face every day.
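Part of "the right tools" is simply measurement: if data quality is scored continuously, confidence stops being a feeling and becomes a number. Below is a minimal, hypothetical sketch of per-column quality scoring in pandas; the sample data and validity rules are placeholders, not anything prescribed by the survey.

# A minimal sketch of per-column data-quality scoring. The sample data and
# the validity rules are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "hostname":   ["web01", "web02", None, "db01"],
    "ip_address": ["10.0.0.5", "10.0.0", "10.0.0.7", None],
})

ip_pattern = r"^(\d{1,3}\.){3}\d{1,3}$"

completeness = df.notna().mean()   # share of non-null values per column
validity = pd.Series({
    "hostname":   df["hostname"].str.len().gt(0).mean(),
    "ip_address": df["ip_address"].str.match(ip_pattern, na=False).mean(),
})

scores = pd.DataFrame({"completeness": completeness, "validity": validity})
print(scores.round(2))   # a dashboard metric instead of a gut feeling

Scoring like this doesn't fix anything on its own, but it turns "somewhat confident" into a trend line that executives can actually act on.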

Reason 3: The Stream of Data Today is About to Become a Tsunami

If proper preparations aren't made right now, while the amount of data is still relatively manageable, getting a handle on it won't just be harder – it will be impossible at the rate that data sources and volumes will continue to expand over the next 3-5 years.

95% of survey respondents expect their data to increase (the other 5% presumably work at businesses that won't be around in five years).

70% expect data volumes to grow by 70%, while nearly all of the remaining 30% expect them to grow by more than 75%. Chances are, all of them are underestimating what's headed in their direction.

The problems faced by the enterprise today are significant, but they can be managed if IT executives deal with the data quality issue now. Tools and technologies are available to ensure viable data quality, which becomes the foundation for growth and added value. The choice is staring us all in the face: act now, or quickly get buried.

Dan Ortega is VP of Marketing at Blazent.
