Want a Slow Website? Make it Bigger and More Complex
September 28, 2015

Kent Alstad
Radware

Is your website slow to load?

Page size and complexity are two of the main factors you need to consider.

Looking back at the trends, the average site has ballooned from just over 700KB to 2,135KB over the last five years. That’s an increase of more than 200%!

The number of requests has grown as well, from around 70 to about 100.

Consider the data from WebPagetest.org (the plotted numbers overlap due to the number of samples):

[Chart: average page transfer size and request count, 2010–2015]
What’s going on?

Sites Are Opting For Complexity Over Speed

It’s clear from the data that sites are built with a preference for rich, complex pages, unfortunately relegating load times to a lower tier of importance. While broadband penetration continues to climb, the payload delivered to browsers is increasing as well.

This is similar to the dynamic playing out in smartphone battery life: a phone’s “on” time has stayed static even though processors have become smaller and more efficient. Why? Because the processors have to work harder on larger, more complex applications, and there’s pressure to deliver thin, svelte devices at the expense of the physical size of the batteries. Any gains in efficiency are offset by the extra work the processors have to do.

So yes, people generally have more bandwidth available to them, but all the data being thrown at their browsers has to be sorted and rendered, hence the slowdown in page speed.

Take images, a perfect example of this trend. The rise in ecommerce has brought with it all the visual appeal of a high-end catalogue combined with a commercial. The result: larger images, and more of them.

[Chart: image requests and total image transfer size, 2010–2015]

While the number of image requests hasn’t risen dramatically, the size of those images has. Looking at the above chart, the total size of a typical page’s images has grown from 418KB in 2010 to 1,348KB today, an increase of 222 percent.

You could go on and on about the impact of custom fonts, CSS transfer size and request counts, and the same metrics for JavaScript, but the trends are the same. Aside from the number of sites using Flash, which is declining thanks to the switch to HTML5, the story always boils down to “bigger” and “more”, leading to a user experience that equates to more waiting.

What Can You Do About It?

Thankfully, there are steps you can take to get things moving. For example:

Consolidate JavaScript and CSS: Consolidating JavaScript code and CSS styles into common files that can be shared across multiple pages should be a common practice. This technique simplifies code maintenance and improves the efficiency of client-side caching. In JavaScript files, be sure that the same script isn’t downloaded multiple times for one page. Redundant script downloads are especially likely when large teams or multiple teams collaborate on page development.
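
As a minimal sketch of the idea (the file names here are hypothetical), the goal is to collapse many small, per-page includes into a couple of shared, cacheable bundles:

    <!-- Before: five requests, repeated on every page -->
    <script src="/js/menu.js"></script>
    <script src="/js/carousel.js"></script>
    <script src="/js/analytics.js"></script>
    <link rel="stylesheet" href="/css/header.css">
    <link rel="stylesheet" href="/css/product.css">

    <!-- After: two shared bundles the browser caches once for the whole site -->
    <script src="/js/site-bundle.js"></script>
    <link rel="stylesheet" href="/css/site-bundle.css">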

Sprite Images: Spriting is a CSS technique for consolidating images. Sprites are simply multiple images combined into a rectilinear grid in one large image. The page fetches the large image all at once as a single CSS background image and then uses CSS background positioning to display the individual component images as needed on the page. This reduces multiple requests to only one, significantly improving performance.
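
A minimal sketch of the CSS side, assuming a hypothetical icons.png that packs two 32x32 icons side by side in a 64x32 grid:

    /* One background image serves every icon on the page: a single request */
    .icon {
        width: 32px;
        height: 32px;
        background-image: url("/img/icons.png");
    }
    /* Shift the shared image to reveal the tile each class needs */
    .icon-search { background-position: 0 0; }      /* left tile */
    .icon-cart   { background-position: -32px 0; }  /* right tile */

In the markup, an element such as <span class="icon icon-cart"></span> then displays only the cart tile.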

Compress Images: Image compression is a performance technique that minimizes the size (in bytes) of a graphics file without degrading the quality of the image to an unacceptable level. Reducing an image’s file size has two benefits: reducing the amount of time required for images to be sent over the internet or downloaded, and increasing the number of images that can be stored in the browser cache, thereby improving page render time on repeat visits to the same page.
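
Compression is also easy to automate as a build step. As one illustrative sketch (not the only approach), this Node.js snippet uses the open-source sharp library; the file names and quality setting are hypothetical:

    // npm install sharp
    const sharp = require("sharp");

    // Re-encode a hero image as a progressive JPEG at quality 80,
    // trading a little fidelity for a much smaller payload.
    sharp("hero-original.jpg")
        .jpeg({ quality: 80, progressive: true })
        .toFile("hero-compressed.jpg")
        .then((info) => console.log("wrote", info.size, "bytes"))
        .catch((err) => console.error(err));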

Defer Rendering “Below the Fold” Content: Ensure that the user sees the page sooner by delaying the loading and rendering of any content that sits below the initially visible area, sometimes called “below the fold.” To eliminate the need to reflow content after the remainder of the page loads, replace images initially with placeholder tags that specify the correct height and width, as in the sketch below.
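
A minimal sketch of one way to defer offscreen images, assuming they are marked up with a data-src placeholder plus explicit dimensions, and that the browser supports IntersectionObserver:

    <!-- width/height reserve the space, so nothing reflows later -->
    <img data-src="/img/product.jpg" width="600" height="400" alt="Product photo">

    <script>
      // Load each deferred image shortly before it scrolls into view.
      const lazyImages = document.querySelectorAll("img[data-src]");
      const observer = new IntersectionObserver((entries) => {
        entries.forEach((entry) => {
          if (entry.isIntersecting) {
            entry.target.src = entry.target.dataset.src;
            observer.unobserve(entry.target); // load once, then stop watching
          }
        });
      }, { rootMargin: "200px" }); // a 200px head start before visibility
      lazyImages.forEach((img) => observer.observe(img));
    </script>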

Preload Page Resources in the Browser: Auto-preloading is a powerful performance technique in which all user paths through a website are observed and recorded. Based on this massive amount of aggregated data, the auto-preloading engine can predict where a user is likely to go based on the page they are currently on and the previous pages in their path. The engine loads the resources for those “next” pages in the user’s browser cache, enabling the page to render up to 70 percent faster. Note that this is a data-intensive, highly dynamic technique that can only be performed by an automated solution.
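
A full auto-preloading engine is beyond a hand-written snippet, but the browser mechanism it builds on can be sketched with resource hints. Assuming the aggregated data suggested the checkout page as the likely next stop (the URLs here are hypothetical):

    <!-- Ask the browser to fetch likely "next page" resources at idle priority -->
    <link rel="prefetch" href="/checkout/checkout.css">
    <link rel="prefetch" href="/checkout/checkout.js">
    <link rel="prefetch" href="/img/checkout-hero.jpg">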

Implement an Automated Web Performance Optimization Solution: While many of the performance techniques outlined in this section can be performed manually by developers, hand-coding pages for performance is specialized, time-consuming work. It is a never-ending task, particularly on highly dynamic sites that contain hundreds of objects per page, as both browser requirements and page requirements continue to develop. Automated front-end performance optimization solutions apply a range of performance techniques that deliver faster pages consistently and reliably across the entire site.

The Bottom Line

While page size and complexity are still trending upward, the toolsets available to combat slow load times have grown as well. HTTP/2 promises protocol-level optimization, and having a powerful content optimization solution in place will help you take care of the rest.

Still – if you can – keep it simple. That’s always a great rule to follow.

Kent Alstad is VP of Acceleration at Radware.
