The New Internet: What You Need to Know About HTTP/2
August 13, 2015

Kent Alstad
Radware


Since HTTP 1.1 was introduced 17 years ago, the Internet has evolved. This evolution brought many changes, among them the development and delivery of rich content to users. These improvements enhanced the online experience, but they came at a cost, and the currency was performance: performance challenges that HTTP 1.1 was never designed to handle.

In February 2015, the Internet Engineering Task Force (IETF), which develops and promotes voluntary Internet standards, finalized HTTP/2, a new version of the protocol designed to cope with those challenges and to adapt to the evolution that Internet content has undergone.

Here's what you need to know about the challenges HTTP 1.1 faced and the improvements that HTTP/2 has introduced:

HTTP 1.1 Challenge: HTTP 1.1 allowed the client to send only one object request per TCP connection at a time. A request for the next object could be sent only after the server had delivered the complete reply to the previous request.

HTTP/2 Improvement: HTTP/2 enables transaction multiplexing, so the browser can send any number of requests and receive the responses interleaved and out of order. As a result, the TCP connection between the browser and the server is used much more efficiently, the wait between subsequent requests and replies is eliminated, and pages load faster.
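To make the multiplexing concrete, here is a minimal sketch in Go (the URLs are placeholders, and the target server is assumed to support HTTP/2 over TLS). Go's standard HTTP client negotiates HTTP/2 automatically, so the concurrent requests below share a single TCP connection instead of opening one connection each:

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	// Placeholder URLs for illustration; with an HTTP/2-capable server,
	// all of these requests are multiplexed over one TCP connection.
	urls := []string{
		"https://example.com/style.css",
		"https://example.com/app.js",
		"https://example.com/logo.png",
	}

	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := http.Get(u)
			if err != nil {
				fmt.Println(u, err)
				return
			}
			defer resp.Body.Close()
			// resp.Proto reports "HTTP/2.0" when the transfer used HTTP/2.
			fmt.Println(u, resp.Proto, resp.StatusCode)
		}(u)
	}
	wg.Wait()
}
```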

HTTP 1.1 Challenge: When visiting most webpages, the browser must provide a lot of information about the session, such as transaction info (e.g., the encoding used, cache control), user/server identification, cookies, etc. This information has to be carried in the HTTP header with each HTTP transaction. It can add up to a lot of data, further delaying the page download.

HTTP/2 Improvement: HTTP/2 introduces a new symmetrical header compression capability, where both the client and the server use an advanced header compression algorithm (HPACK) to reduce the header payload. In addition, with HTTP/2 it is enough to send the full headers only once per page rather than with every transaction, further reducing the uplink payload from the client to the server. The result, again, is faster webpage download times.
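HPACK keeps a per-connection dynamic table, so headers that repeat across transactions are replaced by short index references. A small sketch using Go's golang.org/x/net/http2/hpack package (the header values here are made up for illustration) shows the effect of encoding the same header set twice:

```go
package main

import (
	"bytes"
	"fmt"

	"golang.org/x/net/http2/hpack"
)

func main() {
	headers := []hpack.HeaderField{ // example request headers
		{Name: ":method", Value: "GET"},
		{Name: ":path", Value: "/index.html"},
		{Name: "user-agent", Value: "Mozilla/5.0 (example browser string)"},
		{Name: "cookie", Value: "session=abc123; prefs=dark-mode"},
	}

	var buf bytes.Buffer
	enc := hpack.NewEncoder(&buf)

	// First transaction: headers are encoded in full and stored in the
	// connection's dynamic table.
	for _, h := range headers {
		enc.WriteField(h)
	}
	first := buf.Len()

	// Repeat transaction with the same headers: HPACK emits short index
	// references instead of repeating the full strings.
	buf.Reset()
	for _, h := range headers {
		enc.WriteField(h)
	}
	second := buf.Len()

	fmt.Printf("first request headers: %d bytes, repeat request: %d bytes\n", first, second)
}
```

On a typical connection the second encoding is a small fraction of the first, which is exactly the saving HTTP/2 makes on every repeat transaction from the same client.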

HTTP 1.1 Challenge: With HTTP 1.1, communication could only be initiated by the client, which meant the server could send resources to the client only after the client had asked for them.

HTTP/2 Improvement: With HTTP/2 the server can also initiate a resource push to the client, even before the client knows it will need those resources. This bi-directional communication can reduce the number of GET transactions and use the available bandwidth between the server and the client much more efficiently, again leading to faster web application response times.
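As an illustration, Go's standard library exposes server push through the http.Pusher interface when a handler is serving an HTTP/2 connection. The sketch below (the asset path and certificate file names are placeholders) pushes a stylesheet before returning the page that references it:

```go
package main

import (
	"fmt"
	"net/http"
)

func handler(w http.ResponseWriter, r *http.Request) {
	// Over HTTP/2 the ResponseWriter also implements http.Pusher, so the
	// server can push resources it knows the page will need before the
	// browser requests them.
	if pusher, ok := w.(http.Pusher); ok {
		if err := pusher.Push("/static/app.css", nil); err != nil {
			fmt.Println("push failed:", err)
		}
	}
	fmt.Fprintln(w, `<html><head><link rel="stylesheet" href="/static/app.css"></head><body>Hello</body></html>`)
}

func main() {
	http.HandleFunc("/", handler)
	http.Handle("/static/", http.FileServer(http.Dir(".")))

	// ListenAndServeTLS enables HTTP/2 automatically when the client
	// supports it; cert.pem and key.pem are placeholder file names.
	http.ListenAndServeTLS(":8443", "cert.pem", "key.pem", nil)
}
```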

Why Upgrade to HTTP/2?

HTTP/2 is an important upgrade that can provide performance improvements for your web applications. By enabling better header compression and fewer requests, it can reduce the amount of bandwidth required to support the same number of users on your site. This means web applications can respond faster and serve your users better.

Leading web browsers (including Chrome, Firefox, Internet Explorer, Safari, Opera and others), together accounting for nearly 60% of browser usage, already natively support HTTP/2 – so your audience is ready for it!

Is HTTP/2 Right for Every Web Application?

While the IETF doesn't mandate encrypted communication for HTTP/2 and allows cleartext (HTTP-based) communication as well, all browser implementations of HTTP/2 require a secure (HTTPS) connection.

This means that if a site doesn't support HTTPS URLs, or can't be upgraded to HTTPS, it can't use the new protocol. Even if a site can use encrypted HTTPS communication, it may suffer a severe performance penalty from having to encrypt all communication to and from the server. So only sites whose infrastructure can efficiently handle HTTPS communication will, in practice, be able to benefit from the performance boost HTTP/2 has to offer.

Another challenge is that, unlike the majority of browsers, which already have mature support for HTTP/2, many web server platforms don't yet offer stable and mature support for the new protocol. HTTP/2 implementations also often still suffer from unexpected behavior, partly due to the lack of testing tools that support HTTP/2 protocol analysis.
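Full-featured HTTP/2 analyzers are still maturing, but a quick sanity check of whether a given HTTPS endpoint negotiates HTTP/2 at all takes only a few lines, for example in Go (the default URL is a placeholder):

```go
package main

import (
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Pass the site to probe on the command line, e.g. https://example.com
	target := "https://example.com"
	if len(os.Args) > 1 {
		target = os.Args[1]
	}

	resp, err := http.Get(target)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	// resp.Proto is "HTTP/2.0" if the server negotiated HTTP/2 via ALPN,
	// otherwise "HTTP/1.1".
	fmt.Println(target, "negotiated", resp.Proto)
	if resp.TLS != nil {
		fmt.Println("ALPN protocol:", resp.TLS.NegotiatedProtocol)
	}
}
```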

Moreover, one of the capabilities HTTP/2 offers to improve performance is server push, where the server pushes resources to the client before the client asks for them. To leverage this capability, the server needs to determine which resources to push in order to make the web transaction faster. Understanding which objects to push before the user asks for them, and ensuring those objects aren't already in the browser's cache (otherwise the transaction becomes slower, not faster), is a capability no web server natively has today.

These limitations may cause some site owners to delay their adoption of HTTP/2, but there are solutions that can help.

How Do I Accelerate HTTP/2 Adoption For My Web Applications?

One way to accelerate adoption is by making new use of your application delivery controller (ADC). Some ADCs provide embedded HTTP/2 gateway functionality, enabling protocol translation from HTTP/2 on the client side to HTTP 1.1 on the server side, and vice versa. With such a solution, the ADC vendor ensures that its HTTP/2 gateway is fully debugged, operational and ready for production environments, eliminating the risk of deploying immature code on the web server platform.

While this type of deployment can leverage some of the new HTTP/2 capabilities, the server push functionality still requires application-specific logic to determine which objects the server can push to clients to accelerate page load time (i.e., not pushing objects that are already in the client's cache).
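For teams that want to implement this logic themselves, one common simplification is a cookie-based heuristic: push the critical assets on the first visit and skip the push when a cookie indicates the client has probably cached them already. The sketch below is hypothetical and assumes a Go backend; the asset paths, cookie name and certificate files are placeholders, and a cookie only approximates the real state of the browser cache:

```go
package main

import (
	"fmt"
	"net/http"
)

// pushOnce pushes the page's critical assets only when the client has not
// been marked as having received them before. This cookie heuristic is a
// simplification: it cannot see whether the browser cache still holds them.
func pushOnce(w http.ResponseWriter, r *http.Request, assets []string) {
	pusher, ok := w.(http.Pusher)
	if !ok {
		return // HTTP/1.1 client, nothing to push
	}
	if _, err := r.Cookie("assets-pushed"); err == nil {
		return // pushed on an earlier visit; assume the assets are cached
	}
	for _, a := range assets {
		if err := pusher.Push(a, nil); err != nil {
			fmt.Println("push failed:", err)
		}
	}
	http.SetCookie(w, &http.Cookie{Name: "assets-pushed", Value: "1", Path: "/"})
}

func handler(w http.ResponseWriter, r *http.Request) {
	pushOnce(w, r, []string{"/static/app.css", "/static/app.js"})
	fmt.Fprintln(w, "<html>...</html>")
}

func main() {
	http.HandleFunc("/", handler)
	// Placeholder certificate files; TLS is required for browsers to use HTTP/2.
	http.ListenAndServeTLS(":8443", "cert.pem", "key.pem", nil)
}
```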

The best way to maximize the acceleration potential that HTTP/2 can provide is to incorporate web performance optimization (WPO) technology. Some WPO engines already have the ability to anticipate which resources the user will need that aren't yet in their cache or which pages the user is likely to visit next. These types of WPO engines leverage the HTTP/2 server push function to push those resources to the user, even before the user asks for them.

The combination of HTTP/2 with WPO engines can deliver significant performance acceleration for web applications, making the upgrade investment worthwhile.

Kent Alstad is VP of Acceleration at Radware.

