Want a Slow Website? Make it Bigger and More Complex
September 28, 2015

Kent Alstad
Radware


Is your website slow to load?

Page size and complexity are two of the main factors you need to consider.

Looking back at the trends over the last five years, the average site has ballooned from just over 700KB to 2,135KB. That’s over a 200% increase in five years!

The number of requests has grown as well, from around 70 to about 100.

Consider the data from WebPagetest.org (numbers overlap due to the number of samples):


What’s going on?

Sites Are Opting For Complexity Over Speed

It’s clear from the data that sites are built with a preference for rich, complex pages, unfortunately relegating load times to a lower tier of importance. While broadband penetration continues to climb, the payload delivered to browsers is increasing as well.

This is a similar dynamic to what’s going on in smartphones with their battery life: the amount of a phone’s “on” time is staying static, even though processors have become smaller and more efficient. Why? Because the processors have to work harder on larger, more complex applications, and there’s pressure to deliver thin, svelte devices at the expense of the physical size of the batteries. Any gains in efficiency are offset by the work the processors have to do.

So yes, people generally have more bandwidth available to them, but all the data being thrown at their browsers has to be sorted and rendered, hence the slowdown in page speed.

Take images as a perfect example of this trend. The rise of ecommerce has brought with it all the visual appeal of a high-end catalogue combined with a commercial. The result: larger images – and more of them.


While the number of image requests hasn’t risen dramatically, the size of those images has. Looking at the above chart, the total size of a typical page’s images has grown from 418KB in 2010 to 1,348KB for today’s typical page, an increase of 222 percent.

You could go on about the impact of custom fonts, CSS transfer size and request counts, and the same for JavaScript, but the trends are the same. Aside from the number of sites using Flash declining thanks to the switch to HTML5, the story always boils down to “bigger” and “more”, leading to a user experience that equates to more waiting.

What Can You Do About It?

Thankfully, there are steps you can take to get things moving. For example:

Consolidate JavaScript and CSS: Consolidating JavaScript code and CSS styles into common files that can be shared across multiple pages should be a common practice. This technique simplifies code maintenance and improves the efficiency of client-side caching. In JavaScript files, be sure that the same script isn’t downloaded multiple times for one page. Redundant script downloads are especially likely when large teams or multiple teams collaborate on page development.
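As a minimal build-time sketch of the idea, here is a TypeScript (Node) script that concatenates several scripts into one shared bundle; the file names are hypothetical placeholders, and real projects would typically hand this job to a bundler:

```typescript
// concat-assets.ts -- combine several JavaScript files into one shared bundle
// so each page makes a single request and the result is cached across pages.
import { readFileSync, writeFileSync } from "fs";
import { join } from "path";

// Hypothetical source files; in practice this list comes from your build config.
const scripts = ["analytics.js", "carousel.js", "forms.js"];

const bundle = scripts
  .map((name) => readFileSync(join("src/js", name), "utf8"))
  .join("\n;\n"); // the extra semicolon guards against files missing a trailing one

writeFileSync(join("dist", "site.bundle.js"), bundle);
console.log(`Wrote dist/site.bundle.js from ${scripts.length} files`);
```

The same approach applies to CSS: one shared file per site, fetched once and cached, rather than the same rules downloaded again on every page.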

Sprite Images: Spriting is a CSS technique for consolidating images. Sprites are simply multiple images combined into a rectilinear grid in one large image. The page fetches the large image all at once as a single CSS background image and then uses CSS background positioning to display the individual component images as needed on the page. This reduces multiple requests to only one, significantly improving performance.
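To illustrate the mechanics, here is a small TypeScript sketch that shows one icon out of a single sprite sheet by shifting the background; the sprite file name, element ID, and icon coordinates are all hypothetical:

```typescript
// Display one icon from a single sprite sheet by offsetting the background image.
interface SpriteIcon {
  x: number;      // left offset of the icon inside the sprite, in px
  y: number;      // top offset of the icon inside the sprite, in px
  width: number;  // icon width, in px
  height: number; // icon height, in px
}

// Hypothetical coordinates inside "icons-sprite.png".
const icons: Record<string, SpriteIcon> = {
  cart:   { x: 0,  y: 0, width: 32, height: 32 },
  search: { x: 32, y: 0, width: 32, height: 32 },
};

function applySpriteIcon(el: HTMLElement, name: keyof typeof icons): void {
  const icon = icons[name];
  el.style.width = `${icon.width}px`;
  el.style.height = `${icon.height}px`;
  el.style.backgroundImage = "url(/images/icons-sprite.png)"; // one request for all icons
  // Negative offsets slide the sprite so only the wanted icon shows through.
  el.style.backgroundPosition = `-${icon.x}px -${icon.y}px`;
}

applySpriteIcon(document.getElementById("cart-button")!, "cart");
```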

Compress Images: Image compression is a performance technique that minimizes the size (in bytes) of a graphics file without degrading the quality of the image to an unacceptable level. Reducing an image’s file size has two benefits: reducing the amount of time required for images to be sent over the internet or downloaded, and increasing the number of images that can be stored in the browser cache, thereby improving page render time on repeat visits to the same page.
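One way to see the size/quality trade-off in the browser is to re-encode an image at a lower JPEG quality with the canvas API; this is only a sketch (the 0.7 quality value is an arbitrary example), and production pipelines usually compress at build or upload time instead:

```typescript
// Re-encode an already-loaded image as a smaller JPEG using a canvas.
// quality runs from 0 to 1; 0.7 is an arbitrary example value.
function compressImage(img: HTMLImageElement, quality = 0.7): Promise<Blob> {
  const canvas = document.createElement("canvas");
  canvas.width = img.naturalWidth;
  canvas.height = img.naturalHeight;
  canvas.getContext("2d")!.drawImage(img, 0, 0);

  return new Promise((resolve, reject) => {
    canvas.toBlob(
      (blob) => (blob ? resolve(blob) : reject(new Error("encoding failed"))),
      "image/jpeg",
      quality
    );
  });
}

// Usage: compare the compressed size with the original download size.
const photo = document.querySelector<HTMLImageElement>("#hero-photo")!;
compressImage(photo).then((blob) => console.log(`compressed to ${blob.size} bytes`));
```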

Defer Rendering “Below the Fold” Content: Ensure that the user sees the page quicker by delaying the loading and rendering of any content that is below the initially visible area, sometimes called “below the fold.” To eliminate the need to reflow content after the remainder of the page is loaded, replace images initially with placeholder tags that specify the correct height and width.
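Here is a minimal lazy-loading sketch in TypeScript using IntersectionObserver, assuming each deferred image is marked up with a hypothetical data-src attribute plus explicit width and height so the layout does not reflow when the real image arrives:

```typescript
// Markup assumption (hypothetical):
//   <img data-src="/images/product-42.jpg" width="400" height="300">
// The real src is only set once the image scrolls near the viewport.
const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");

const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src!; // start the real download now
      obs.unobserve(img);         // each image only needs to load once
    }
  },
  { rootMargin: "200px" } // begin loading a little before the image is visible
);

lazyImages.forEach((img) => observer.observe(img));
```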

Preload Page Resources in the Browser: Auto-preloading is a powerful performance technique in which all user paths through a website are observed and recorded. Based on this massive amount of aggregated data, the auto-preloading engine can predict where a user is likely to go based on the page they are currently on and the previous pages in their path. The engine loads the resources for those “next” pages in the user’s browser cache, enabling the page to render up to 70 percent faster. Note that this is a data-intensive, highly dynamic technique that can only be performed by an automated solution.
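The prediction piece requires an automated engine fed by real traffic data, as noted above, but the browser-side step itself is simple. The sketch below assumes you already have a (hypothetical) list of resources the likely next page needs and asks the browser to fetch them at idle priority:

```typescript
// Hint the browser to prefetch resources the visitor will probably need next.
// The URL list is a hypothetical stand-in for the predictions an automated
// preloading engine would supply.
const likelyNextResources = [
  "/checkout/checkout.css",
  "/checkout/checkout.js",
  "/images/checkout-sprite.png",
];

function prefetch(urls: string[]): void {
  for (const url of urls) {
    const link = document.createElement("link");
    link.rel = "prefetch";   // low-priority fetch into the HTTP cache
    link.href = url;
    document.head.appendChild(link);
  }
}

prefetch(likelyNextResources);
```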

Implement an Automated Web Performance Optimization Solution: While many of the performance techniques outlined in this section can be performed manually by developers, hand-coding pages for performance is specialized, time-consuming work. It is a never-ending task, particularly on highly dynamic sites that contain hundreds of objects per page, as both browser requirements and page requirements continue to develop. Automated front-end performance optimization solutions apply a range of performance techniques that deliver faster pages consistently and reliably across the entire site.

The Bottom Line

While pages continue to grow in size and complexity, the toolset available to combat slow load times has grown as well. HTTP/2 promises protocol-level optimization, and having a powerful content optimization solution in place will help you take care of the rest.

Still – if you can – keep it simple. That’s always a great rule to follow.

Kent Alstad is VP of Acceleration at Radware.
