APMdigest asked experts across the industry — including analysts, consultants and vendors — for their opinions on the next steps for ITOA. These next steps include where the experts believe ITOA is headed, as well as where they think it should be headed. Part 3 covers monitoring and user experience.
Start with Next Steps for ITOA - Part 1
Start with Next Steps for ITOA - Part 2
MONITORING INTEGRATES WITH ITOA
Advanced analytics and machine learning will become table stakes in monitoring tools. Initially this will create a flurry of unsubstantiated rebranding efforts by vendors eager to catch up, but those vendors will eventually either acquire their way into ITOA or exit the market.
Trace3 Research 360 View Trend Report: IT Operations Monitoring & Analytics (ITOMA)
As powerful as APM tools are, they have always been application- or infrastructure-centric and have therefore missed a very important piece of the puzzle: the actual users. I predict that IT departments, with the encouragement of corporate management, will not only begin to recognize the value of understanding user experience and behavior, but will take the lead in leveraging these analytics to improve the quality of service they deliver. They will begin integrating user analytics as a core capability within their toolset to see exactly what happens when users enter information and navigate through screens. These unique insights will help them improve problem resolution, system performance, process optimization, employee efficiency and more.
CEO, Knoa Software
DIGITAL EXPERIENCE MONITORING
In a customer-centric age, empowered users are accustomed to getting extremely high levels of service, a reality that is forcing companies to evolve traditional performance monitoring into what Gartner now calls digital experience monitoring (DEM). DEM treats the user experience as the ultimate metric, and identifies how the myriad of underlying services, systems and components influence it. DEM is far more multi-dimensional than past end user experience monitoring approaches. IT Operations Analytics will evolve concurrently with DEM, handling more complexity (ingesting and analyzing more data from more sources), and increasing diagnostic accuracy and speed.
Director of Industry Innovation, Catchpoint
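One way to make the DEM idea above concrete is to treat a user-experience metric as the target and rank backend signals by how strongly they track it. The sketch below is a minimal illustration of that multi-dimensional view, not any vendor's actual method; the metric names and sample data are hypothetical.

```python
import numpy as np

def rank_influencers(user_experience, components):
    """Rank backend signals by absolute Pearson correlation with
    a user-experience metric (e.g., page load time in seconds)."""
    ux = np.asarray(user_experience, dtype=float)
    scores = {}
    for name, series in components.items():
        s = np.asarray(series, dtype=float)
        # Correlation between this component's latency and the UX metric
        scores[name] = abs(np.corrcoef(ux, s)[0, 1])
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical per-minute samples: UX spikes track database latency
ux = [1.2, 1.4, 2.8, 1.3, 3.1, 1.2]
components = {
    "db_latency":  [30, 32, 90, 31, 95, 29],   # spikes with UX
    "cdn_latency": [12, 11, 12, 13, 12, 11],   # flat, uncorrelated
}
print(rank_influencers(ux, components))
```

Real DEM tooling ingests far more dimensions and uses more robust methods than a single correlation, but the ranking step is the same in spirit: quantify each component's influence on the experience metric.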
MONITOR WHAT MATTERS
We will see a significant shift away from "monitor everything", and a return to "monitor what matters." But this time, "what matters" will be determined algorithmically, not by policy, and consequently the performance data will be more adaptive and relevant.
Chief Evangelist, Moogsoft
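"What matters, determined algorithmically" can be as simple as letting each metric's own history decide whether it is worth surfacing. The sketch below uses a z-score rule as a stand-in for that idea; the threshold and metric names are illustrative assumptions, not a description of any product's algorithm.

```python
import statistics

def metrics_that_matter(history, latest, z_threshold=3.0):
    """Select metrics whose latest value deviates strongly from their
    own history, so 'what matters' is decided by the data, not policy."""
    selected = []
    for name, series in history.items():
        mean = statistics.fmean(series)
        stdev = statistics.pstdev(series) or 1e-9  # guard against zero spread
        z = abs(latest[name] - mean) / stdev
        if z >= z_threshold:
            selected.append(name)
    return selected

# Hypothetical: CPU jumps far outside its history; disk stays flat
history = {"cpu": [50, 51, 49, 50], "disk": [70, 70, 71, 70]}
latest = {"cpu": 90, "disk": 70}
print(metrics_that_matter(history, latest))  # → ['cpu']
```

The adaptive part is that the selection changes as the history changes: a metric that is noisy today drops out of view tomorrow once its baseline shifts.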
IT Operations Analytics (ITOA), in relation to performance management, has yet to deliver on the first promise of analytics: predictability. Although a number of interesting solutions have reached that level of sophistication in specific areas, such as network management (in combination with vertical/domain problems), the market has yet to see an easy-to-use, intelligent solution that can look into the crystal ball and predict outages, failures and problems.
VP of Engineering, Comtrade Software
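Even without a crystal ball, the simplest form of the predictability described above is forecasting when a trending metric will breach a limit. The sketch below fits a linear trend to a resource metric and extrapolates; it is a minimal illustration under the assumption of evenly spaced samples and a roughly linear trend, nothing like a production prediction engine.

```python
import numpy as np

def minutes_until_breach(samples, limit, interval_min=1.0):
    """Fit a linear trend to a metric (e.g., % disk used, sampled every
    `interval_min` minutes) and forecast when it will cross `limit`.
    Returns None if the trend is flat or falling."""
    t = np.arange(len(samples)) * interval_min
    slope, intercept = np.polyfit(t, samples, 1)
    if slope <= 0:
        return None  # not trending toward the limit
    t_breach = (limit - intercept) / slope
    return max(t_breach - t[-1], 0.0)

# Hypothetical: disk fills 2% per minute, currently at 88%
print(minutes_until_breach([80, 82, 84, 86, 88], limit=100))  # ≈ 6.0
```

Real predictive ITOA must cope with seasonality, noise, and correlated failures across domains, which is exactly why the easy-to-use solution the quote asks for has been elusive.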
Read Next Steps for ITOA - Part 4, covering automation and dynamic IT environments.