The New Internet: What You Need to Know About HTTP/2

Kent Alstad

Since HTTP 1.1 was introduced 17 years ago, the Internet has evolved. Among the many changes that evolution brought was the development and delivery of rich content to users. These improvements enhanced the online experience, but they came at a cost, and the currency was performance – performance challenges that HTTP 1.1 was never designed to handle.

In February 2015, the Internet Engineering Task Force (IETF), which develops and promotes voluntary Internet standards, approved HTTP/2, a new version of the protocol designed to cope with those challenges and to adapt to the evolution that Internet content has undergone.

Here's what you need to know about the challenges HTTP 1.1 faced and the improvements that HTTP/2 has introduced:

HTTP 1.1 Challenge: HTTP 1.1 allowed the client to send only one object request per TCP connection at a time. The request for the next object could only be sent after the complete reply to the first request had been received from the server.

HTTP/2 Improvement: HTTP/2 enables transaction multiplexing, so the browser can send any number of requests and receive the responses interleaved and out of order. As a result, the TCP connection between the browser and the server is used much more efficiently, the wait between subsequent requests and replies is eliminated, and pages load faster.
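
To see what multiplexing looks like from the client side, here is a minimal sketch in Go; the example.com host and asset paths are illustrative only. Go's standard net/http client negotiates HTTP/2 automatically over TLS, so the three requests below can share a single connection and complete in whatever order the server answers them:

```go
// Minimal sketch: several requests issued concurrently over one HTTP/2
// connection. The URLs are placeholders for a page's assets.
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	urls := []string{
		"https://example.com/styles.css",
		"https://example.com/app.js",
		"https://example.com/logo.png",
	}

	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := http.Get(u)
			if err != nil {
				fmt.Println(u, "error:", err)
				return
			}
			defer resp.Body.Close()
			// resp.Proto reports "HTTP/2.0" when the request was multiplexed
			// over an HTTP/2 connection.
			fmt.Println(u, resp.Proto, resp.Status)
		}(u)
	}
	wg.Wait()
}
```

With HTTP 1.1, the same three fetches would either queue behind one another on a single connection or force the browser to open several connections to the host.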

HTTP 1.1 Challenge: When visiting most webpages, the browser must provide a lot of information about the session, such as transaction details (e.g., the encoding used and cache control), user/server identification, cookies, etc. This information has to be carried in the HTTP headers of every HTTP transaction. It can add up to a lot of data, adding further delay to the page download time.

HTTP/2 Improvement: HTTP/2 introduces a new symmetrical header compression capability, where both the client and the server use an advanced header compression algorithm (HPACK) to reduce the header payload. In addition, with HTTP/2 the full header fields need to be sent only once per connection rather than with every transaction; subsequent transactions can reference them, further reducing the uplink payload from the client to the server. The result, again, is faster webpage download time.
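
The effect is easy to demonstrate with the HPACK encoder from Go's golang.org/x/net/http2/hpack package; the header names and values below are invented for illustration. The sketch encodes the same headers twice on one encoder, mimicking two transactions on one connection, and prints how many bytes each pass puts on the wire. The second pass is far smaller because repeated fields become short index references into the compression table:

```go
// Minimal HPACK sketch: the same header set costs far fewer bytes the
// second time it is sent on the same connection.
package main

import (
	"bytes"
	"fmt"

	"golang.org/x/net/http2/hpack"
)

func main() {
	headers := []hpack.HeaderField{
		{Name: ":method", Value: "GET"},
		{Name: ":authority", Value: "example.com"},
		{Name: "user-agent", Value: "demo-browser/1.0"},
		{Name: "cookie", Value: "session=abc123; theme=dark"},
	}

	var buf bytes.Buffer
	enc := hpack.NewEncoder(&buf)

	encode := func() int {
		buf.Reset()
		for _, hf := range headers {
			enc.WriteField(hf) // repeated fields are emitted as table indexes
		}
		return buf.Len()
	}

	fmt.Println("first request, header bytes on the wire: ", encode())
	fmt.Println("second request, header bytes on the wire:", encode())
}
```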

HTTP 1.1 Challenge: With HTTP 1.1, communication could only be initiated by the client, which meant the server could only deliver resources to the client after the client had asked for them.

HTTP/2 Improvement: With HTTP/2, the server can also initiate a resource push to the client, even before the client knows it will need those resources. This bi-directional communication can reduce the number of GET transactions and use the available bandwidth between the server and the client much more efficiently, again leading to faster web application response times.
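
As a concrete sketch (not a recommendation for any particular platform), Go's net/http exposes this through the http.Pusher interface; the paths and certificate file names below are placeholders:

```go
// Minimal HTTP/2 server push sketch: the stylesheet is pushed alongside the
// page, before the browser has parsed the HTML and asked for it.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/app.css", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/css")
		fmt.Fprint(w, "body { font-family: sans-serif; }")
	})

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// On an HTTP/2 connection the ResponseWriter also implements
		// http.Pusher, so the asset can be sent proactively.
		if pusher, ok := w.(http.Pusher); ok {
			if err := pusher.Push("/app.css", nil); err != nil {
				log.Println("push failed:", err)
			}
		}
		fmt.Fprint(w, `<html><head><link rel="stylesheet" href="/app.css"></head><body>Hello</body></html>`)
	})

	// Browsers only speak HTTP/2 over TLS; cert.pem and key.pem are placeholders.
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil))
}
```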

Why Upgrade to HTTP/2?

HTTP/2 is an important upgrade that can deliver performance improvements for your web applications. By enabling better header compression and fewer requests, it can reduce the bandwidth required to support the same number of users on your site. The result is web applications with faster response times that serve your users better.

Leading web browsers (including Chrome, Firefox, Internet Explorer, Safari and Opera) already support HTTP/2 natively, and together they account for nearly 60% of browsers in use – so your audience is ready for it!

Is HTTP/2 Right for Every Web Application?

While the IETF doesn't mandate encrypted communication for HTTP/2 (clear-text, HTTP-based communication is allowed as well), all browser implementations of HTTP/2 require a secure (HTTPS) connection.

This means that if a site doesn't support HTTPS URLs, or can't be upgraded to HTTPS, it can't use the new protocol. In many cases, even if a site can use encrypted HTTPS communication, encrypting all communication to and from the server may carry a significant performance penalty. In practice, only sites whose infrastructure can efficiently handle HTTPS communication will benefit from the performance boost HTTP/2 has to offer.
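
For a site that can serve HTTPS, putting HTTP/2 in front of browsers can be as simple as terminating TLS, since servers such as Go's net/http negotiate HTTP/2 automatically on a TLS listener. The sketch below (placeholder certificate files and hostname) also redirects clear-text traffic, on which browsers will only ever speak HTTP 1.1, to the HTTPS site:

```go
// Minimal sketch of the "HTTPS first" prerequisite for HTTP/2.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// r.Proto reads "HTTP/2.0" for requests from an HTTP/2-capable
		// browser over TLS, and "HTTP/1.1" otherwise.
		fmt.Fprintf(w, "served over %s\n", r.Proto)
	})

	// Clear-text port: browsers will never use HTTP/2 here, so redirect.
	go http.ListenAndServe(":80", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		http.Redirect(w, r, "https://www.example.com"+r.RequestURI, http.StatusMovedPermanently)
	}))

	// TLS port: HTTP/2 is negotiated automatically when the client supports it.
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", mux))
}
```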

Another challenge is that, unlike the majority of browsers, which already have mature HTTP/2 support, many web server platforms don't yet offer stable and mature support for the new protocol. Existing HTTP/2 implementations also often suffer from unexpected behavior, partly due to the lack of testing tools that support HTTP/2 protocol analysis.

Moreover, one of the capabilities that HTTP/2 offers to improve performance is server push, where the server pushes resources to the client before the client asks for them. To leverage this capability, the server needs to determine which resources to push in order to make the web transaction faster. Understanding which objects to push before the user asks for them, and ensuring those objects aren't already in the browser's cache (otherwise pushing makes the transaction slower, not faster), is a capability no web server natively has today.
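
As an illustration of the kind of logic that is missing, here is one possible heuristic, sketched in Go rather than taken from any shipping web server: a marker cookie (the name h2pushed is invented) distinguishes first-time visitors, who get the static assets pushed, from repeat visitors, who most likely already have them cached:

```go
// Hypothetical push heuristic: push static assets only when no marker cookie
// is present, on the assumption that a repeat visitor's cache already holds them.
package main

import (
	"fmt"
	"log"
	"net/http"
)

// Illustrative asset list; a real site would derive this from its pages.
var pushTargets = []string{"/app.css", "/app.js"}

func pageHandler(w http.ResponseWriter, r *http.Request) {
	pusher, canPush := w.(http.Pusher)
	_, cookieErr := r.Cookie("h2pushed")
	firstVisit := cookieErr != nil // no marker cookie yet: assume an empty cache

	if canPush && firstVisit {
		for _, target := range pushTargets {
			if err := pusher.Push(target, nil); err != nil {
				log.Println("push failed:", err)
				break
			}
		}
		// Mark the browser so repeat visits aren't pushed the same bytes again.
		http.SetCookie(w, &http.Cookie{Name: "h2pushed", Value: "1", Path: "/"})
	}

	fmt.Fprint(w, `<html><head><link rel="stylesheet" href="/app.css"><script src="/app.js" defer></script></head><body>Hello</body></html>`)
}

func main() {
	http.HandleFunc("/app.css", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/css")
		fmt.Fprint(w, "body { margin: 0; }")
	})
	http.HandleFunc("/app.js", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/javascript")
		fmt.Fprint(w, "console.log('hello');")
	})
	http.HandleFunc("/", pageHandler)

	// Certificate file names are placeholders; browsers require TLS for HTTP/2.
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil))
}
```

Even a simple rule like this has obvious failure modes (cleared caches, shared devices), which is exactly why push decisions are hard to get right without application knowledge.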

These limitations may cause some site owners to delay their adoption of HTTP/2, but there are solutions that can help.

How Do I Accelerate HTTP/2 Adoption For My Web Applications?

One way to accelerate adoption is to make new use of your application delivery controller (ADC). Some ADCs provide embedded HTTP/2 gateway functionality, translating between HTTP/2 on the client side and HTTP 1.1 on the server side, and vice versa. With such a solution, the ADC vendor ensures that its HTTP/2 gateway is fully debugged, operational and ready for production environments, eliminating the risk of deploying immature code on the web server platform.
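
The translation idea itself can be sketched with a simple reverse proxy: HTTP/2 terminated toward browsers on a TLS listener, plain HTTP 1.1 toward an origin that hasn't been upgraded. The origin address and certificate files below are placeholders, and a production ADC does far more than this:

```go
// Minimal HTTP/2-to-HTTP/1.1 gateway sketch using a reverse proxy.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	origin, err := url.Parse("http://127.0.0.1:8080") // legacy HTTP 1.1 backend
	if err != nil {
		log.Fatal(err)
	}

	// Outbound requests to the origin use HTTP/1.1; the inbound side
	// negotiates HTTP/2 automatically because it is served over TLS.
	proxy := httputil.NewSingleHostReverseProxy(origin)

	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", proxy))
}
```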

While this type of deployment can leverage some of the new HTTP/2 capabilities, the server push functionality still requires application-specific logic to determine which objects the server can push to clients to accelerate page load time (e.g., not pushing objects that are already in the client's cache).

The best way to maximize the acceleration potential that HTTP/2 can provide is to incorporate web performance optimization (WPO) technology. Some WPO engines already have the ability to anticipate which resources the user will need that aren't yet in their cache or which pages the user is likely to visit next. These types of WPO engines leverage the HTTP/2 server push function to push those resources to the user, even before the user asks for them.

The combination of HTTP/2 with WPO engines can deliver significant performance acceleration to web applications and this can make the upgrade investment worthwhile.

Kent Alstad is VP of Acceleration at Radware.
