Web Performance and the Impact of SPDY, HTTP/2 & QUIC - Part 4
May 25, 2016

Jean Tunis
RootPerformance

This blog is the fourth in a 5-part series on APMdigest where I discuss web application performance and how new protocols like SPDY, HTTP/2, and QUIC will hopefully improve it so we can have happy website users.

Start with Web Performance 101: The Bandwidth Myth

Start with Web Performance 101: 4 Recommendations to Improve Web Performance

Start with Web Performance and the Impact of SPDY, HTTP/2 & QUIC - Part 1

Start with Web Performance and the Impact of SPDY, HTTP/2 & QUIC - Part 2

Start with Web Performance and the Impact of SPDY, HTTP/2 & QUIC - Part 3

The new HTTP/2 protocol includes a number of capabilities that did not exist in HTTP before:

Uses only one TCP connection

In HTTP/1.1, we needed many connections, but not too many due to resource constraints and latency considerations. In HTTP/2, the standard calls for only one TCP connection to be used. This will reduce the overhead of opening and closing TCP connections and reduce the round-trip time (RTT) of going to the server and back for numerous requests.
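
To see which protocol was actually negotiated, a quick check can be done from code. Below is a minimal Go sketch (the URL is just a placeholder) that prints the protocol version the standard library client ended up using; Go's client negotiates HTTP/2 automatically over TLS when the server supports it.

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    func main() {
        // The standard client negotiates HTTP/2 over TLS via ALPN
        // when the server supports it; no special configuration needed.
        resp, err := http.Get("https://www.example.com/")
        if err != nil {
            log.Fatal(err)
        }
        defer resp.Body.Close()

        // Proto reports the negotiated version, e.g. "HTTP/2.0".
        fmt.Println("Negotiated protocol:", resp.Proto)
    }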

Requests are multiplexed

What allows the one-connection approach to work without hurting performance is the ability to multiplex requests. Each HTTP request and response is assigned to its own stream, and the streams are interleaved over the single connection. This is what HTTP/1.1 pipelining hoped to achieve, but never did.
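
As a rough illustration of what this means for a client, the Go sketch below (again with a placeholder host) fires several requests concurrently through one shared client; over HTTP/2, those requests travel as separate streams on the same TCP connection rather than over separate connections.

    package main

    import (
        "fmt"
        "log"
        "net/http"
        "sync"
    )

    func main() {
        // One shared client: over HTTP/2, concurrent requests to the same
        // host are multiplexed as separate streams on one TCP connection.
        client := &http.Client{}
        paths := []string{"/", "/style.css", "/app.js"}

        var wg sync.WaitGroup
        for _, p := range paths {
            wg.Add(1)
            go func(path string) {
                defer wg.Done()
                resp, err := client.Get("https://www.example.com" + path)
                if err != nil {
                    log.Println(path, err)
                    return
                }
                resp.Body.Close()
                fmt.Println(path, resp.Status, resp.Proto)
            }(p)
        }
        wg.Wait()
    }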

It's binary, not text-based to allow for multiplexing

The ability to multiplex HTTP requests is enabled by the fact that the protocol is now binary. HTTP/1.1 is a text-based protocol, which makes it difficult to break up HTTP data for the multiplexing capability needed.

Compresses headers

One of the recommendations to help improve performance is to enable caching on the server. Since web browsers generally support caching, return visits do not have to re-download the same data that was previously downloaded. This saves a round trip, and users get their content almost instantaneously, depending on the performance of their PC.

The drawback of all this caching is the data in the HTTP header, typically a cookie, that is used to identify whether content is cached. The size of cookies has grown over the years; most browsers allow a cookie to be about 4KB. At that size, an HTTP request can sometimes consist mostly of cookie data in the header.

Header compression uses a new format called HPACK, defined in RFC 7541. It replaces the GZIP compression previously used for headers in SPDY, which was dropped because of a security risk, the CRIME attack, discovered against it in 2012.

Compressing the headers helps to offset this growth in HTTP header size.
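
For a feel of what HPACK does, here is a small Go sketch using the golang.org/x/net/http2/hpack package (the header values are made up). A large cookie sent on every request is exactly the kind of repeated field that ends up encoded as a short index into HPACK's dynamic table on subsequent requests.

    package main

    import (
        "bytes"
        "fmt"

        "golang.org/x/net/http2/hpack"
    )

    func main() {
        // Encode a set of request headers with HPACK.
        var buf bytes.Buffer
        enc := hpack.NewEncoder(&buf)
        enc.WriteField(hpack.HeaderField{Name: ":method", Value: "GET"})
        enc.WriteField(hpack.HeaderField{Name: ":path", Value: "/index.html"})
        enc.WriteField(hpack.HeaderField{Name: "cookie", Value: "session=abc123"})
        fmt.Printf("encoded header block: %d bytes\n", buf.Len())

        // Decode the same block back into name/value pairs.
        dec := hpack.NewDecoder(4096, func(f hpack.HeaderField) {
            fmt.Printf("%s: %s\n", f.Name, f.Value)
        })
        dec.Write(buf.Bytes())
    }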

Has different frame types: headers and data

At the core of the performance gains expected of HTTP/2 is the new binary framing format. Each HTTP message is encoded in binary format. With this format, HTTP/2 introduces different types of frames that make up a message. Instead of having an HTTP message with the headers and the payload in one frame, there are frames only for data and frames only for header information. In total there are ten frame types in HTTP/2, which enable the new capabilities.
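
At the wire level these frames can be observed directly. The sketch below uses the Framer from the golang.org/x/net/http2 package to read frames off a connection that is assumed to already be established (TLS, ALPN and the connection preface are out of scope here) and reports whether each frame carries headers or data.

    import (
        "fmt"
        "io"

        "golang.org/x/net/http2"
    )

    // dumpFrames reads HTTP/2 frames from an already-established connection
    // and prints what kind of frame arrived on which stream.
    func dumpFrames(rw io.ReadWriter) error {
        fr := http2.NewFramer(rw, rw) // writer, then reader
        for {
            frame, err := fr.ReadFrame()
            if err != nil {
                return err
            }
            switch f := frame.(type) {
            case *http2.HeadersFrame:
                fmt.Println("HEADERS frame on stream", f.StreamID)
            case *http2.DataFrame:
                fmt.Printf("DATA frame on stream %d (%d bytes)\n", f.StreamID, len(f.Data()))
            default:
                fmt.Printf("%T on stream %d\n", f, f.Header().StreamID)
            }
        }
    }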

Prioritizes requests sent

HTTP/2 allows the browser to prioritize the requests it sends. Higher-priority requests can go ahead of other requests via the multiplexing mechanism. This is done with the PRIORITY frame type.
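
Using the same Framer as in the sketch above, sending a PRIORITY frame might look like the following; the stream ID and weight are purely illustrative.

    // prioritizeStream marks an existing stream as important by sending a
    // PRIORITY frame for it. Weights on the wire are 0-255, meaning 1-256.
    func prioritizeStream(fr *http2.Framer, streamID uint32) error {
        return fr.WritePriority(streamID, http2.PriorityParam{
            StreamDep: 0,     // not dependent on any other stream
            Exclusive: false,
            Weight:    255,   // highest weight
        })
    }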

Can reset HTTP/2 stream instead of TCP connection

In HTTP/1.1, when a request is complete, the connection can be reset and closed by either end. The problem is that if you want to use that connection again, you have to reopen it, which means another round trip to the server.

With HTTP/2, we can now reset an HTTP stream inside of a TCP connection. A stream can be closed and another one used without tearing down the TCP connection and without another round trip to the server when we need to send more data down that connection. This is done with the RST_STREAM frame type.
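
Continuing with the same Framer, cancelling a single stream is a one-frame operation that leaves the TCP connection, and every other stream on it, untouched. The stream ID here is illustrative.

    // cancelStream abandons one stream without closing the connection.
    func cancelStream(fr *http2.Framer, streamID uint32) error {
        return fr.WriteRSTStream(streamID, http2.ErrCodeCancel)
    }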

Servers can push data to browser

Web servers now have the ability to push content directly to client browsers even when it has not been explicitly requested. It means that when a client makes a request for a particular page, for example, the server will automatically push any additional data, such as JavaScript or CSS files, required to properly render the page. This removes the need for the browser to make more requests for those files, which would create additional round trips.

The server must specify to the client that it will be pushing content to it before it does so. This is done via the PUSH_PROMISE frame type.
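
In Go's standard library, for example, server push is exposed through the http.Pusher interface. The sketch below (the certificate file names and asset path are placeholders) pushes a stylesheet before the browser has a chance to ask for it; each successful Push results in a PUSH_PROMISE frame followed by the pushed response.

    package main

    import (
        "log"
        "net/http"
    )

    func handler(w http.ResponseWriter, r *http.Request) {
        // On an HTTP/2 connection, the ResponseWriter also implements
        // http.Pusher, so assets the page needs can be pushed proactively.
        if pusher, ok := w.(http.Pusher); ok {
            if err := pusher.Push("/static/app.css", nil); err != nil {
                log.Println("push failed:", err)
            }
        }
        w.Write([]byte(`<html><head><link rel="stylesheet" href="/static/app.css"></head><body>Hello</body></html>`))
    }

    func main() {
        http.Handle("/static/", http.StripPrefix("/static/", http.FileServer(http.Dir("static"))))
        http.HandleFunc("/", handler)
        // Go's standard server only speaks HTTP/2 over TLS, so push
        // requires ListenAndServeTLS with a certificate and key.
        log.Fatal(http.ListenAndServeTLS(":8443", "cert.pem", "key.pem", nil))
    }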

Controls the flow of data

The TCP protocol has the ability to control the flow of data by opening and closing the TCP receive window. When the receiver needs to slow down the other side, it does so by reducing that window.

With HTTP/2, we have only one connection, so if TCP throttles it, everything on that connection slows down.

But with the capability of having multiplexed streams, HTTP/2 was given the ability to provide its own flow control at the stream and connection level. This way, if one stream of data needs to be slowed down, other streams are not impacted, and the TCP connection continues to operate appropriately.

This is done via the WINDOW_UPDATE frame type.
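
With the same Framer as before, granting more flow-control credit is again a single frame; a stream ID of 0 applies the update to the whole connection rather than to one stream. The numbers here are illustrative.

    // widenStreamWindow gives one stream another 64 KB of flow-control
    // credit without affecting the windows of the other streams.
    func widenStreamWindow(fr *http2.Framer, streamID uint32) error {
        return fr.WriteWindowUpdate(streamID, 65536)
    }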

Read Web Performance and the Impact of SPDY, HTTP/2 & QUIC - Part 5, the last installment in this blog series, taking a final look at HTTP/2.

Jean Tunis is Senior Consultant and Founder of RootPerformance.
