
Latency and Bandwidth? Of Course I Know What They Mean!

Sven Hammar

Okay, let's all be honest with ourselves here - as citizens of the 21st century we are all pretty tech-savvy. Let's give ourselves that little pat on the back and get it out of the way. Because we also need to be honest about the fact that very, very few of us actually have any idea what words like "latency", "bandwidth", and "internet speed" actually mean. And by very few, I mean mostly just programmers and IT people understand these distinctions.

If you happen to be one of the select few who already know the meaning of these mysterious words, I applaud you. If you don't, I sympathize completely. The Internet remains a rather enigmatic thing to people primarily concerned with the download speed of their torrented movies. But once the welfare of your business begins to depend more and more on your download speeds, knowing these distinctions becomes increasingly important. Responsible and informed businesspersons with websites and with pulses owe it to themselves to get this little bit of Internet education under their belts.

Latency: The Wait

The easiest way to understand latency is to think of a long line at some government office. Getting from the door to the counter requires walking a physical distance, the line itself is a bottleneck created by too many requests arriving at the same time, and even reaching the counter isn't enough - there's a final waiting period during which the worker behind the desk has to process your request and respond to it. This leg of the journey is what the tech industry calls "latency".

Latency is the period of time that directly precedes the actual download. Every form of internet connection is subject to it, because latency is determined largely by the distance to the server and the server's own processing speed rather than by anything on the user's side. No matter what internet connection you have, a slow server on the website you're trying to access will still be a limiting factor in your download time.

Bandwidth: The Line

Bandwidth is usually illustrated as a pipe. Although a wider pipe clearly allows for faster download times, latency remains unchanged, because it has nothing to do with the pipe to begin with.

But what, exactly, is this pipe? Doesn't an internet connection take place at the speed of electricity? Does having a bigger, thicker wire actually matter? Yes, it does. If you think of data as packets of electrical (or optical) signals - because that's essentially what data in transit is - then it's easy to see that, although the speed of each signal only changes when the medium of the pipe changes, widening the pipe makes room for more data to flow through at once.
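To make the pipe arithmetic concrete, here is a minimal back-of-the-envelope model (my own illustration, not from the article): total transfer time is the fixed latency plus the payload size divided by bandwidth, so a small request barely benefits from a wider pipe.

```python
def transfer_time(size_bytes, bandwidth_bytes_per_s, latency_s):
    """Toy model: total time = fixed latency + time to push the bytes through."""
    return latency_s + size_bytes / bandwidth_bytes_per_s

MBIT = 125_000  # one megabit per second, expressed in bytes per second

# A 10 KB request over a 100 Mbps link with 100 ms of latency:
t_100 = transfer_time(10_000, 100 * MBIT, 0.100)
# The same request after doubling bandwidth to 200 Mbps:
t_200 = transfer_time(10_000, 200 * MBIT, 0.100)

print(round(t_100, 4), round(t_200, 4))  # 0.1008 0.1004 - latency dominates
```

Doubling the bandwidth shaved less than half a millisecond off this request; only reducing the 100 ms of latency would make a noticeable difference.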

An easy way to envision this is to think of the same government office, but now instead of one line there are five. Getting to the counter doesn't take as long anymore, but each worker is still processing requests at the same speed.
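The five-lines analogy can be sketched in the same back-of-the-envelope spirit (a hypothetical model of my own): opening more lines divides the queue, but each worker still spends the same time per request.

```python
def time_to_served(people_ahead, service_time_s, open_lines):
    # The crowd splits evenly across the open lines; each worker still
    # needs service_time_s per request, including yours.
    return (people_ahead / open_lines) * service_time_s + service_time_s

one_line   = time_to_served(20, 2.0, 1)  # 20 people ahead, a single counter
five_lines = time_to_served(20, 2.0, 5)  # same crowd, five counters

print(one_line, five_lines)  # 42.0 10.0 - the wait shrinks, the service time doesn't
```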

Speed: The Experience

Ultimately, the interplay between latency, bandwidth, and your actual connection medium (wired, wireless, fiber-optic, etc.) determines the actual "speed" experienced by the user. This is an important distinction, because the actual speed of data packet transfer isn't changing at all.
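One way to see this distinction (again a rough model of my own, not the author's numbers) is to compute the "speed" a user actually experiences: bytes delivered divided by total elapsed time, latency included. Small transfers never come close to the link's rated speed.

```python
def effective_throughput(size_bytes, bandwidth_bytes_per_s, latency_s):
    # Perceived speed: bytes delivered over total elapsed time, latency included
    total_time = latency_s + size_bytes / bandwidth_bytes_per_s
    return size_bytes / total_time

LINK = 12_500_000  # a 100 Mbps link, in bytes per second

small = effective_throughput(10_000, LINK, 0.100)       # ~99 KB/s: latency-bound
large = effective_throughput(100_000_000, LINK, 0.100)  # ~12.3 MB/s: near link speed
```

The wire speed never changed between the two calls; only the share of total time spent waiting did.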

Companies should understand these distinctions so they can focus their efforts on the things they can control rather than the things outside their control. In other words, the questions developers (and CEOs) should be asking themselves are: How can we reduce latency? How can we improve the user experience by speeding up the server side? What front-end and back-end tweaks can we make to increase download speed and reduce latency?

None of this is rocket science, and developers already know it, but sometimes it takes a real nudge from the top to get everyone behind the idea of a faster, better-branded experience.

