The Key to a Well-Running Web Site
October 25, 2016

Sven Hammar
Apica


Website development and maintenance is not a simple proposition. You need to get your message out to your audience in a fast, attractive and secure way. Yet making a website attractive and keeping it secure can cost you speed. Buy back speed by cutting security and you risk driving users away from your site. Skimp on the aesthetics and your speed and security may not mean a thing, because nobody's coming. What's a developer to do?

The answer, of course, is "test, test, test!" Test in development, test in the real world and, as you fine-tune and fix, test each content exchange until the website is humming and your users are busy navigating your site, not complaining about it.

Using the correct monitoring tools is one key to bringing your website to the public quickly and keeping it working flawlessly with the least amount of pain to your users. So let's introduce some three-letter abbreviations. The tools available are Web Performance Monitoring (WPM, also known as Synthetic Monitoring), Application Performance Management (APM) and Real User Monitoring (RUM). Each has its own use cases, and used together they keep your website responsive and your users satisfied.

WPM

WPM relies on synthetic monitoring, also known as active monitoring: web browser emulation or scripted recordings of web transactions. You control tests of the website's performance as a whole, including how pages render, response times to content requests and the other aspects of website operation that are directly responsible for how well or how poorly the website runs.
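At its simplest, an active check is just a scripted request fired on a schedule. The sketch below, in Python with the requests library, measures the response time to a single content request; the example.com URL is a placeholder, and real WPM tools layer browser emulation, alerting and geographic distribution on top of this basic idea.

```python
# A minimal sketch of an "active" synthetic check: request a page on a
# schedule and record how long the response takes. The URL is a placeholder.
import time

import requests


def check_response_time(url: str, timeout: float = 10.0) -> float:
    """Request a page and return how long the full response took, in seconds."""
    start = time.monotonic()
    response = requests.get(url, timeout=timeout)
    elapsed = time.monotonic() - start
    response.raise_for_status()  # count HTTP errors as a failed check
    return elapsed


if __name__ == "__main__":
    # Run the same check on a schedule (cron, CI job, etc.) to build a baseline.
    print(f"Response time: {check_response_time('https://www.example.com'):.3f} s")
```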

Use synthetic monitoring to test specific pages or transaction types that may not get regular traffic on your website, monitoring them from a user's perspective. Behavioral scripts simulate the actions, or exercise the paths, that your users will take. An example would be to have a script log in to the website, go through a transaction, reach the purchase page and then abandon the purchase. This gives you a clear indication of how a user will experience your checkout page and whether it takes too long to complete a purchase.
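As a rough illustration of such a behavioral script, here is a sketch using Playwright, one of several browser-automation options (the article does not prescribe a specific tool). The URL, selectors and credentials are hypothetical placeholders for your own site's login and checkout flow.

```python
# A sketch of a synthetic "abandoned checkout" transaction; all selectors,
# credentials and the example.com URL are hypothetical placeholders.
import time

from playwright.sync_api import sync_playwright


def checkout_transaction_check(base_url: str) -> float:
    """Log in, walk a purchase flow to the checkout page, then abandon it.
    Returns the elapsed wall-clock time for the whole transaction."""
    start = time.monotonic()
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        # Step 1: log in (selectors stand in for your own site's form)
        page.goto(f"{base_url}/login")
        page.fill("#username", "synthetic-test-user")
        page.fill("#password", "synthetic-test-pass")
        page.click("button[type=submit]")

        # Step 2: add an item and proceed to the checkout page
        page.goto(f"{base_url}/products/sample-item")
        page.click("#add-to-cart")
        page.goto(f"{base_url}/checkout")
        page.wait_for_selector("#place-order")  # checkout page has rendered

        # Step 3: abandon the purchase -- we never click "place order"
        browser.close()
    return time.monotonic() - start


if __name__ == "__main__":
    elapsed = checkout_transaction_check("https://www.example.com")
    print(f"Checkout reached in {elapsed:.2f}s (purchase abandoned)")
```

Running this on a schedule gives you a repeatable measurement of the checkout experience even on days when no real customer exercises that path.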

Use WPM, as well, to check JavaScript timing and see how long your pages take to render. Use synthetic monitoring on live websites from clients scattered throughout the world to test the network paths your users connect over.
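As a hedged sketch of pulling render timing out of the browser itself, the snippet below again uses Playwright together with the standard Navigation Timing API (the URL is a placeholder; a commercial WPM service would run this kind of measurement from agents in many locations).

```python
# A sketch of collecting browser-side render timing via the Navigation
# Timing API; the example.com URL is a placeholder.
from playwright.sync_api import sync_playwright


def page_timing(url: str) -> dict:
    """Return time-to-first-byte, DOMContentLoaded and load times in ms."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="load")
        # Pull the browser's own Navigation Timing entry for this page load
        timing = page.evaluate(
            """() => {
                const t = performance.getEntriesByType('navigation')[0];
                return {
                    ttfb: t.responseStart - t.requestStart,
                    domContentLoaded: t.domContentLoadedEventEnd - t.startTime,
                    load: t.loadEventEnd - t.startTime,
                };
            }"""
        )
        browser.close()
    return timing


if __name__ == "__main__":
    print(page_timing("https://www.example.com"))
```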

APM

Use APM to let your developers dive deeper into website problems, so root causes can be uncovered and fixes put in where they will do the most good. APM lets you follow critical transactions from start to finish, determine exactly what is going wrong on your website and search recorded values to find where bugs, bottlenecks or less-than-optimal code can be fixed to create a faster, more efficient website.
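The core idea behind that kind of transaction tracing can be sketched in a few lines: time every step (span) of a critical transaction so the slowest one stands out. Real APM agents instrument frameworks and database drivers automatically; the function names and sleep calls below are purely illustrative stand-ins, not any vendor's API.

```python
# A minimal sketch of APM-style transaction tracing: record a named span
# for each step of a transaction, then rank spans by duration.
import time
from contextlib import contextmanager

SPANS = []


@contextmanager
def span(name: str):
    """Record how long the enclosed block takes under the given name."""
    start = time.monotonic()
    try:
        yield
    finally:
        SPANS.append((name, time.monotonic() - start))


def checkout_transaction():
    with span("load_cart"):
        time.sleep(0.05)   # stand-in for a database read
    with span("price_items"):
        time.sleep(0.02)   # stand-in for pricing logic
    with span("render_page"):
        time.sleep(0.30)   # stand-in for slow template rendering


if __name__ == "__main__":
    checkout_transaction()
    for name, seconds in sorted(SPANS, key=lambda s: -s[1]):
        print(f"{name:<12} {seconds * 1000:6.1f} ms")
```

In this toy trace the render step dominates, which is exactly the kind of signal APM surfaces so your team knows where a fix will do the most good.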

RUM

Since RUM is a passive monitoring process, use it to learn how real-world users are experiencing your website. Find out whether slowdowns are tied to time of day, specific content requests or any of the variety of issues that can plague a normally smooth-running site. RUM won't tell you exactly what's wrong, but it will alert you when things go wrong and show you how your users are affected.

Unfortunately, RUM cannot be directed at specific pages or processes, cannot give you on-demand testing and cannot be used to create an artificial load on your website to see how it reacts to stress. What it can do is alert you when your website starts to experience sub-optimal performance, so you can get your team working on the issues.
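As a small illustration of the time-of-day analysis mentioned above, the sketch below aggregates hypothetical RUM beacon data into a median page-load time per hour. The CSV layout (timestamp, page, load_ms columns in a file named rum_beacons.csv) is an assumed example format, not any particular vendor's schema.

```python
# A minimal sketch of slicing RUM beacon data by hour of day to spot
# time-of-day slowdowns; the CSV columns and filename are assumptions.
import csv
from collections import defaultdict
from datetime import datetime
from statistics import median


def load_time_by_hour(beacon_csv: str) -> dict:
    """Return the median page-load time (ms) per hour of day."""
    samples = defaultdict(list)
    with open(beacon_csv, newline="") as f:
        for row in csv.DictReader(f):  # columns: timestamp, page, load_ms
            hour = datetime.fromisoformat(row["timestamp"]).hour
            samples[hour].append(float(row["load_ms"]))
    return {hour: median(vals) for hour, vals in sorted(samples.items())}


if __name__ == "__main__":
    for hour, med in load_time_by_hour("rum_beacons.csv").items():
        print(f"{hour:02d}:00  median page load {med:.0f} ms")
```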

Combine WPM, APM and RUM

The key to developing and maintaining a well-running website, then, is combining all three monitoring tools: RUM to get a good sense of how your users are experiencing the website, WPM to exercise your code, establish a real baseline and test lesser-visited pages, and APM to troubleshoot and find the source of the problems RUM and WPM uncover. Only by using each of these tools can you ensure that your site's performance is optimized.

Sven Hammar is Chief Strategy Officer and Founder of Apica