If your business depends on mission-critical web or legacy applications, then monitoring how end users interact with those applications is essential. The experience an end user has after pressing ENTER or clicking SUBMIT can decide the bottom line of your enterprise.
Most monitoring solutions try to infer the end-user experience from resource utilization. However, resource utilization cannot meaningfully capture how the end user experiences an interaction with an application. The true measures of end-user experience are the availability and response time of the application, end-to-end and hop-by-hop.
The responsiveness of the application determines the end user's experience. Understanding that experience requires contextual intelligence about how the application responds by time of day, day of week, week of month, and month of year. Baselining captures these metrics across a time dimension: measuring the application's response time at regular intervals makes it possible to verify that the application is working as designed. A baseline is more than a single report detailing the health of the application at a certain point in time.
"Dynamic baselining" is a technique that compares real response times against historical averages. It provides meaningful insight into service anomalies without requiring the impossible task of setting absolute thresholds for every transaction.
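The idea can be sketched as a simple rolling comparison: flag a response time as anomalous when it deviates from the historical average for the same context by more than a few standard deviations. This is a minimal illustration, not any vendor's implementation; the function name, the three-sigma default, and the list-based history are assumptions made for the sketch.

```python
from statistics import mean, stdev

def is_anomalous(history, current_ms, z_threshold=3.0):
    """Flag a response time as anomalous when it deviates from the
    historical average by more than z_threshold standard deviations.

    history: past response times (ms) for the same context, e.g. the
             same transaction type at the same hour of day.
    """
    if len(history) < 2:
        return False  # not enough data to establish a baseline yet
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # all historical samples identical: any deviation is anomalous
        return current_ms != mu
    return abs(current_ms - mu) / sigma > z_threshold
```

For example, against a history of roughly 100 ms responses, a 400 ms response is flagged while a 105 ms response is not. Because the threshold is expressed in standard deviations rather than milliseconds, the same rule adapts to fast and slow transactions alike.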
A robust user experience solution will also capture the application and system errors that significantly impact a user's ability to complete a task. Since the user experience is often affected by the performance of the user's device, desktop and laptop performance metrics are also required for adequate root-cause analysis.
For example, when you collect response times within an Exchange environment over a period that includes low, average, and peak usage, you can make a subjective determination of what constitutes acceptable performance for your system. That determination is your baseline, which you can then use to detect bottlenecks and to watch for long-term changes in usage patterns that require operations to balance infrastructure capacity against demand to maintain the intended performance.
When you need to troubleshoot system problems, the response time baseline gives you information about the behavior of system resources at the time the problem occurred, which is useful in discovering its cause. When determining your baseline, it is important to know the types of work being done and the days and times when that work is done. Associating the work performed with the resource usage lets you determine whether performance during those intervals is acceptable.
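Capturing the time dimension described above can be as simple as bucketing response time samples by day of week and hour of day, so that Monday 9 a.m. traffic is compared against Monday 9 a.m. history rather than a global average. A minimal sketch, where the tuple format and function name are assumptions for illustration:

```python
from collections import defaultdict
from statistics import mean

def build_baseline(samples):
    """Build a per-(weekday, hour) response time baseline.

    samples: iterable of (weekday, hour, response_ms) tuples,
             where weekday is 0-6 and hour is 0-23.
    Returns a dict mapping (weekday, hour) -> average response_ms.
    """
    buckets = defaultdict(list)
    for weekday, hour, ms in samples:
        buckets[(weekday, hour)].append(ms)
    return {key: mean(values) for key, values in buckets.items()}
```

In practice a production system would keep more than the mean per bucket (percentiles, sample counts, variance), but the bucketing step is the part that encodes the time-of-day and day-of-week context.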
Response time baselining helps you understand not only resource utilization issues but also the availability and responsiveness of the services on which the application flow depends. For example, if Active Directory is not responding optimally, the end user experiences unintended latency in the application's performance.
By following the baseline process, you can obtain the following information:
■ What is the real experience of the user when using any application?
■ What is "normal" behavior?
■ Is "normal" meeting service levels that drive productivity?
■ Is "normal" optimal?
■ Are deterministic answers available (time to close a ticket, root cause of an outage, predictive warnings, etc.)?
■ Who is using what, when and how much?
■ What is the experience of each individual user and a group of users?
■ Dependencies on infrastructure
■ Real-time interaction with infrastructure
■ Gain valuable information on the health of the hardware and software that is part of the application service delivery chain
■ Determine resource utilization
■ Make accurate decisions about alarm thresholds
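The last point, deriving alarm thresholds from the baseline rather than hand-picking absolute values, can be sketched as taking a high percentile of observed response times plus some headroom. The percentile logic, parameter names, and headroom factor here are illustrative assumptions, not a vendor's algorithm:

```python
def alarm_threshold(history, percentile=95, headroom=1.2):
    """Derive an alarm threshold from baseline response times (ms):
    take the given percentile of observed values, then multiply by a
    headroom factor so normal peaks do not trip the alarm.
    """
    if not history:
        raise ValueError("cannot derive a threshold from an empty baseline")
    ordered = sorted(history)
    # simple index-based percentile (no interpolation)
    idx = min(len(ordered) - 1, int(len(ordered) * percentile / 100))
    return ordered[idx] * headroom
```

Recomputing this per transaction and per time bucket is what makes the threshold "dynamic": it follows the observed behavior of each transaction instead of forcing one absolute number onto all of them.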
Response time baselining empowers you to provide guaranteed service levels to your end users for every business-critical application, which in turn helps the bottom line of the business.
Sri Chaganty is COO and CTO/Founder at AppEnsure.