Gartner Identifies 3 Key Analytics Trends for 2013
February 11, 2013

Pete Goldin
APMdigest

Business intelligence (BI) and analytics continue to be a top CIO investment priority, yet user surveys by Gartner, Inc. show that only 30 percent of potential users in an organization adopt CIO-sponsored analytics tools. This appears to be changing as organizations invest in making analytics "invisible," more consumable and more accessible to the nontraditional analytics user.

"A large enterprise makes millions of decisions every day," said Rita Sallam, research VP analyst at Gartner. "The challenge is that companies have far more data than people have time, and the amount of data that is generated every minute keeps increasing. In the face of accelerating business processes and a myriad of distractions, real-time operational intelligence systems are moving from 'nice to have' to 'must have for survival.' The more pervasively analytics can be deployed to business users, customers and consumers, the greater the impact will be in real time on business activities, competitiveness, innovation and productivity."

Gartner has identified three key trends for analytics and BI professionals to consider in 2013, along with recommendations on how to tackle them:

1. Making Analytics Invisible to Users

To make analytics more actionable and pervasively deployed, BI and analytics professionals must make analytics more invisible and transparent to their users — through easy natural language interfaces for exploring data and through embedded analytic applications at the point of decision or action.

As analytics moves closer to the point of action in real time, a shift is occurring from systems that primarily aggregate and compute structured data, toward analytic systems that correlate and relate structured and unstructured data, and reason, learn and deliver prescriptive advice. These man-machine partnerships are emerging and becoming increasingly sophisticated in ways that position the machine or application to take more natural inputs, such as written or spoken questions, extending analytics to nontraditional users. The friendlier, more transparent and therefore more invisible the analytics are to users, the more broadly they will be adopted — particularly by users who have never used BI tools — and the greater the impact analytics can have on business activities.
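
To make the idea concrete, here is a minimal sketch of such a natural-language layer, assuming a toy in-memory SALES table and a single regex-matched question pattern. All names and data are hypothetical; real natural-language BI front ends use full semantic parsing rather than pattern matching.

```python
import re

# Hypothetical in-memory "sales" records standing in for a BI data source.
SALES = [
    {"region": "East", "quarter": "Q1", "revenue": 120_000},
    {"region": "East", "quarter": "Q2", "revenue": 135_000},
    {"region": "West", "quarter": "Q1", "revenue": 98_000},
    {"region": "West", "quarter": "Q2", "revenue": 101_000},
]

def answer(question: str) -> str:
    """Translate a narrow class of plain-English questions into an
    aggregation over SALES. A toy only: the single pattern below
    stands in for a full language-understanding pipeline."""
    q = question.lower()
    # e.g. "What was revenue in the East region?"
    match = re.search(r"revenue in the (\w+) region", q)
    if match:
        region = match.group(1).capitalize()
        total = sum(r["revenue"] for r in SALES if r["region"] == region)
        return f"Total revenue for {region}: {total:,}"
    return "Sorry, I can't interpret that question yet."

print(answer("What was revenue in the East region?"))
# -> Total revenue for East: 255,000
```

From the user's perspective the analytics are invisible: a typed question comes back as an answer, with the query construction and aggregation hidden behind the interface.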

Moving toward something that looks simple and invisible from the user's perspective will require a great deal of computing power, extended capabilities and skills, and potential complexity in information management systems. Business intelligence and analytics professionals should begin by identifying targeted data exploration and high-value decision-making opportunities where making analytics invisible, transparent, context-aware and accessible in real time to specific constituencies can add demonstrable value.

2. Deploying Real-Time Intelligence

The growing volume of real-time data and the reduced time for decision making are driving companies to implement real-time operational intelligence systems that make supervisors and operations staff more effective.

The volume of relevant, real-time data is growing, but the time available to make decisions and respond is shrinking. At the same time, virtually all the event data available to human recipients — even news feeds, email, tweets and other unstructured data (content) — is now in digital form so software tools can process it. Effective operational intelligence systems offload as much work as possible from people.

Organizations should offload event data capture, filtering, mathematical calculations and pattern detection to real-time operational intelligence software, to provide better situation awareness to business people. Where the cause and sequence of events are understood, leading indicators can be used to predict situations of threat or opportunity before they occur — so that the response can be proactive. Where this is not possible, the system can be used to improve the outcome by reducing the lag time between events and responses.
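
As a rough sketch of this offloading pattern, the snippet below filters a hypothetical event stream, maintains a rolling average of request latency, and raises a proactive alert when that leading indicator crosses a threshold, before users experience an outage. The event shape, the latency_ms field and the thresholds are all invented for illustration.

```python
from collections import deque

WINDOW = 5             # rolling-window size (hypothetical)
WARN_LATENCY_MS = 400  # leading-indicator threshold (hypothetical)

def monitor(events):
    """Offload filtering, calculation and pattern detection from people:
    track a rolling average of request latency and alert proactively
    when the trend predicts trouble, rather than after a failure."""
    window = deque(maxlen=WINDOW)
    for event in events:
        if event.get("type") != "request":   # filter out irrelevant events
            continue
        window.append(event["latency_ms"])
        avg = sum(window) / len(window)      # simple rolling calculation
        if len(window) == WINDOW and avg > WARN_LATENCY_MS:
            yield f"ALERT: avg latency {avg:.0f} ms over last {WINDOW} requests"

stream = [{"type": "request", "latency_ms": v}
          for v in (200, 250, 380, 520, 610, 700)]
for alert in monitor(stream):
    print(alert)
```

The system does the watching and the arithmetic continuously; the human only sees the alert, which shortens the lag between events and responses.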

3. Automating Decision Making

Increasing competition, cost and regulatory pressures will motivate business leaders to adopt more prescriptive analytics, making business decisions smarter and more repeatable and reducing personnel costs.

Companies are under pressure to improve the quality of their decisions, while reducing their staffing and complying with ever-increasing regulation to make decisions transparent, auditable and repeatable. These forces are motivating managers to use decision management software technologies in more places, and also to use more sophisticated forms of these technologies.

Decision management software runs on demand when a person or an application program needs computational support for making a decision. In some cases, the system can make the decision itself (intelligent decision automation). In other cases, the system prepares recommendations or performs part of the analysis and presents information to a human decision maker (decision support systems).
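
A toy illustration of those two modes, with entirely hypothetical credit-approval rules and score thresholds: clear-cut cases are decided automatically, while borderline cases are routed to a human with a recommendation and an auditable rationale.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    automated: bool   # True: intelligent decision automation
    outcome: str      # the decision, or a recommendation for a human
    reason: str       # auditable rationale, for regulatory transparency

def decide_credit(score: int, amount: float) -> Decision:
    """Hypothetical decision-management rule set: structured, repeatable
    cases are decided by the system; borderline ones become
    recommendations for a human decision maker (decision support)."""
    if score >= 750 and amount <= 10_000:
        return Decision(True, "approve", "high score, low exposure")
    if score < 550:
        return Decision(True, "decline", "score below policy floor")
    return Decision(False, "recommend review", "borderline score or amount")

for applicant in [(780, 5_000), (500, 2_000), (640, 20_000)]:
    print(applicant, decide_credit(*applicant))
```

Recording the rationale alongside each outcome is what makes the automated decisions transparent, auditable and repeatable.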

Solutions architects should work with business analysts, subject matter experts and business managers to develop an understanding of the kinds of business decisions that will be made. Computers should then handle the decisions that are structured and repeatable, conserving people's time and attention for the thinking and actions that computers cannot do.
