Big Data Analytics Taking Too Long, Study Says
October 22, 2012

Amid all the hype, few companies are able to apply analytics to Big Data for competitive advantage, because of the time and costs associated with traditional approaches to analytics, according to a Big Data Analytics study by OpTier.

Key Findings:

- Despite all the hype about social media, the majority of Big Data resides within the four walls of a company’s data center.

- Companies are not gaining a business advantage from their Big Data because it is distributed across many silos and lacks the context and uniformity necessary to allow analysts to quickly leverage it.

- A major bottleneck is the data preparation phase, which accounts for 30-60% of the time spent on analytics because data is saved without context.

- Companies surveyed that have a Big Data Analytics solution in place spent 2-3 years setting up their data warehouses, and spend a minimum of 2-3 months each time a new data set is incorporated.

- Companies today use one of three alternatives to perform analytics: 1) Traditional statistical modeling of data relationships; 2) Re-writing applications and re-building from the ground up; 3) Capturing transactions in context at the time of execution. The first two are time-consuming and cost-prohibitive; only the largest enterprises can afford them.

- Companies surveyed agree that context would dramatically accelerate analytics, reducing the time and cost spent by 50-90%.

- Companies reported that the ability to understand the value and/or cost of servicing each individual customer would be extremely valuable.

“Analytics have emerged as the fastest growing segment of IT budgets, but what’s missing today from Big Data analytics? In a word, it’s context!” said Andy Wild, president of OpTier. “Today’s leading companies are struggling to take advantage of the volume and variety of data available within the four walls of the enterprise. By rethinking the way they analyze Big Data and capturing transactions that are already in context, CIOs can fundamentally change the economics and process of analyzing Big Data – saving 50% of time spent and cost."

"Companies worldwide are anticipating the value in analyzing their Big Data, but many do not have an efficient process in place to effectively take advantage of the data," said Professor Russell Walker at the Kellogg School of Management. "Based on the research conducted, a common theme that emerged was the need for a faster way to get meaningful business value - such as the interrelationships between various data sets - out of their Big Data. Contextual data is key to drive business growth."

About the Study

Jacques Takou Tuh, an MBA student at the Kellogg School of Management and Adam Kanouse, CIO of OpTier, conducted primary research with business executives at Global 250 companies. Based on one-on-one interviews and focus groups spanning industries including Financial Services, Healthcare, Media & Entertainment, Retail and Telecommunications, these findings are summarized in a research paper titled Making Big Data Analytics Fast and Easy.
