Assuring User Experience is Big Data Job Number One
May 30, 2013

Gabriel Lowy


Assuring user experience should be the top priority among Big Data projects for enterprises and cloud service providers. Megatrends such as mobile, cloud and social drive the need for application awareness via better visibility and control. With survey after survey showing availability as the number one priority, spending on user experience assurance, also known as application performance management (APM), is expected to remain strong. However, only solutions that cover the entire application delivery chain from the end-user experience perspective will suffice.

This means visibility that extends from behind the corporate firewall out to the cloud, implying an end-to-end view from user devices back through the tiers of data center infrastructure. The “point of delivery” — which is where the user accesses a composite application — is the only perspective from which user experience should be addressed.

Cloud architectures, whether public, private or hybrid, beget complexity. Projects such as cloud computing, server and desktop virtualization and data center consolidation are undertaken for the returns on investment (ROI) they can deliver. However, while virtualization was supposed to break down silos within IT, it has instead created another management silo.

The majority of virtualization management tools focus on capacity planning, utilization and availability metrics. Most do not provide insights into how the user experience will be impacted if something changes in a virtualized environment. Without assuring user experience, lower costs and productivity gains become unattainable.

Another reason why user experience assurance must be a priority is the link between application performance and revenue generation. Studies have shown that slower end-user experience results in fewer page views, which in turn reduces the probability of completing the sales cycle.

The adoption of agile practices means code changes on a much more frequent basis, which in turn demands greater visibility into the web browser given how applications are now developed. The typical web application today pulls in a wealth of content and third-party services, components beyond the organization's control.

For example, consider an online retail application comprising numerous functions derived from within the data center as well as external third-party services, such as a shopping cart, preference engine and ad networks. The average website connects to as many as 10 hosts before the page is ultimately served to the end user.
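That fan-out is easy to quantify: given the resource URLs a page loads, counting the distinct hosts involved shows how many external dependencies sit in the delivery chain. A minimal sketch in Python, using hypothetical URLs for the retail example above:

```python
from urllib.parse import urlparse

def distinct_hosts(resource_urls):
    """Return the set of distinct hosts a page pulls resources from."""
    return {urlparse(url).netloc for url in resource_urls}

# Hypothetical resource list for a retail page: first-party content
# plus a third-party shopping cart, preference engine and ad network.
resources = [
    "https://shop.example.com/index.html",
    "https://shop.example.com/css/main.css",
    "https://cart.vendor-a.example/checkout.js",
    "https://recs.vendor-b.example/engine.js",
    "https://ads.network-c.example/tag.js",
]

hosts = distinct_hosts(resources)
print(len(hosts))  # how many hosts must respond before the page is whole
```

Every host in that set is a point of failure the retailer does not operate, which is exactly the performance risk the next paragraph describes.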

While extensive third-party functions can enrich the online experience, they can also create performance risks. If any one component fails, it can degrade the performance of an application or an entire website. In addition, many third-party cloud services are opaque, providing little visibility into the overall health of the compute infrastructure.

As more processing moves closer to the end user, onto the device or into the browser itself, visibility inside the browser becomes essential. Monitoring network traffic, databases and servers does not reveal how the browser affects user experience. Poor performance anywhere along the application delivery chain, including cloud service providers, regional and local ISPs, content delivery networks, browsers and devices, will degrade the end-user experience.
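Browser-side visibility typically means collecting timing marks from real user sessions and breaking the load into stages. The sketch below uses field names patterned on the W3C Navigation Timing API with hypothetical values; in practice these would be gathered from browsers by a JavaScript beacon:

```python
# Hypothetical timestamps (milliseconds since navigation start),
# patterned on W3C Navigation Timing fields.
timing = {
    "navigationStart": 0,
    "domainLookupStart": 5, "domainLookupEnd": 40,   # DNS resolution
    "connectStart": 40, "connectEnd": 110,           # TCP connect
    "requestStart": 110, "responseStart": 380,       # server think time
    "responseEnd": 520,                              # content download
    "loadEventEnd": 1450,                            # page fully loaded
}

def stage_breakdown(t):
    """Split total load time into stages to show where the time went."""
    return {
        "dns": t["domainLookupEnd"] - t["domainLookupStart"],
        "connect": t["connectEnd"] - t["connectStart"],
        "ttfb": t["responseStart"] - t["requestStart"],
        "download": t["responseEnd"] - t["responseStart"],
        "total": t["loadEventEnd"] - t["navigationStart"],
    }

print(stage_breakdown(timing))
```

A breakdown like this distinguishes network problems (DNS, connect) from server problems (time to first byte) from front-end problems (everything after download), which infrastructure-only monitoring cannot do.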

The Answer is Analytics

Transaction tracing and predictive analytics are the most important trends driving the market, and will soon be considered table stakes for any serious APM vendor.

Transaction tracing goes beyond real-time monitoring to provide a more unified view into different components of the application delivery chain.

Meanwhile, analytics is improving with new tools that can correlate thousands of metrics and identify patterns that provide early warning signs of impending trouble.

Analytics can also reduce the time spent correlating and normalizing data from different sources, including information collected by tools that monitor users, servers, mainframes and synthetic transactions, as well as tools deployed independently of IT. Deep-dive diagnostics also allow IT organizations to be more proactive by pinpointing the source of problems before calls reach the help desk or a visitor abandons the website.
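At its simplest, the early-warning idea amounts to flagging metric values that deviate sharply from their recent baseline. The sketch below is a deliberately crude stand-in for the pattern correlation described above, using a trailing-window z-score on hypothetical response-time samples:

```python
import statistics

def early_warnings(values, window=5, threshold=3.0):
    """Flag indices whose value deviates sharply from the trailing baseline.

    A simple z-score against the previous `window` samples: a minimal
    illustration of surfacing trouble before users start calling.
    """
    alerts = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid divide-by-zero
        if abs(values[i] - mean) / stdev > threshold:
            alerts.append(i)
    return alerts

# Hypothetical response-time samples (ms): steady, then a sudden spike.
latencies = [102, 99, 101, 103, 100, 98, 102, 240, 101, 100]
print(early_warnings(latencies))  # index of the anomalous sample
```

Production systems correlate thousands of such metrics rather than one, but the principle is the same: a statistical baseline turns raw monitoring data into an actionable signal.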

As such, the most relevant metric for any IT organization is not infrastructure utilization itself, but the point of utilization at which the user experience begins to degrade. Being able to centrally store, manage and analyze this data provides a more accurate picture of user experience.
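That degradation point can be estimated directly from paired observations of utilization and response time. A minimal sketch, with hypothetical measurements and a hypothetical 500 ms response-time objective:

```python
def degradation_point(samples, slo_ms=500):
    """Return the lowest utilization at which response time breaches the
    objective, or None if it never does.

    `samples` is a list of (utilization_pct, response_time_ms) pairs.
    """
    breaches = [u for u, rt in samples if rt > slo_ms]
    return min(breaches) if breaches else None

# Hypothetical measurements: latency stays flat, then climbs sharply
# once utilization passes roughly 80%.
samples = [
    (40, 180), (55, 190), (65, 210), (75, 260),
    (82, 540), (88, 900), (95, 2100),
]

print(degradation_point(samples))  # utilization where UX starts to degrade
```

Knowing this knee in the curve, rather than raw utilization, tells capacity planners how much headroom actually remains before users feel it.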

Amid a do-more-with-less budget environment and mounting pressure on IT to justify resource allocations, CIOs can strengthen their role in strategic planning by having intelligence about revenue-generating transactions, customer interactions and consumption patterns that drive improved business outcomes. Analytics should now be at the top of any CIO’s list. All the talk about realizing ROI on Big Data investments will go for naught if the user experience is inferior.

Over the next few years, expect user experience assurance to become a feeder to, and a subset of, BI/analytics. In fact, it should be Big Data project number one. To ease the technology and vendor selection process, IT operations teams should define the use cases, application types, pain points and underlying technology to perform ROI analyses. For vendors, making the deployment process easier — from the adds, drops, and changes perspective — can open up new opportunities by solving the ROI equation.

Gabriel Lowy is the founder of TechTonics Advisors, a research-first investor relations consultancy that helps technology companies maximize value for all stakeholders by bridging vision, strategy, product portfolio and markets with analysts and investors.
