Latency and Bandwidth? Of Course I Know What They Mean!
February 14, 2014

Sven Hammar
Apica


Okay, let's all be honest with ourselves here - as citizens of the 21st century, we are all pretty tech-savvy. Let's give ourselves that little pat on the back and get it out of the way. Because we also need to be honest about the fact that very, very few of us have any real idea what words like "latency", "bandwidth", and "internet speed" actually mean. And by very few I mean that only programmers and IT people understand these distinctions.

If you happen to be one of the select few who already know the meaning of these mysterious words, I applaud you. If you don't, I sympathize completely. The Internet remains a rather enigmatic thing to people primarily concerned with the download speed of their torrented movies. But once the welfare of your business begins to depend more and more on your download speeds, knowing these distinctions becomes increasingly important. Responsible and informed businesspersons with websites and with pulses owe it to themselves to get this little bit of Internet education under their belts.

Latency: The Wait

The easiest way to understand latency is to think of a long line at some government office. Getting from the door to the counter means walking a physical distance; the line itself is the bottleneck that forms when too many requests hit the server at the same time; and even reaching the counter isn't enough - there's a final waiting period while the worker behind the desk processes your request and responds to it. This whole leg of the journey, before anything actually gets handed over, is what the tech industry calls "latency".

Latency is the waiting period that directly precedes the actual download: the time between sending a request and receiving the first byte of the response. Every form of internet connection is subject to it, because latency is determined largely on the server side rather than the user side. No matter how fast your own connection is, the limiting factor in your download time will still be how quickly the server of the website you're trying to reach can respond.
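
If you want to see that wait for yourself, here is a minimal sketch in Python (with example.com standing in as a purely hypothetical URL) that separates the time to the first byte of a response - roughly the latency - from the total download time.

```python
import time
import urllib.request

URL = "https://example.com/"  # hypothetical URL, purely for illustration

start = time.perf_counter()
response = urllib.request.urlopen(URL)   # send the request
response.read(1)                         # block until the first byte of the body arrives
ttfb = time.perf_counter() - start       # roughly the latency: time to first byte

response.read()                          # pull down the rest of the payload
total = time.perf_counter() - start      # latency plus transfer time

print(f"Time to first byte: {ttfb * 1000:.0f} ms")
print(f"Total download:     {total * 1000:.0f} ms")
```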

Bandwidth: The Line

Bandwidth is best pictured as a pipe. A wider pipe clearly allows for faster download times, but latency remains unchanged because it has nothing to do with the pipe to begin with.

But what, exactly, is this pipe? Doesn't an internet connection move at the speed of electricity? Does having a bigger, thicker wire actually matter? Yes, it does. If you think of data as "packets" flowing through the pipe (because that's essentially what data is on the wire), it's easy to see that while the speed of each packet only changes when the medium of the pipe changes, widening the pipe makes room for more data to flow through at once.

An easy way to envision this is to think of the same government office, but now instead of one line there are five. Getting to the counter doesn't take as long anymore, but each worker is still processing requests at the same speed.
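
A quick back-of-the-envelope sketch makes the same point with numbers - the 5 MB page, 100 ms of latency, and bandwidth figures below are assumptions chosen only for illustration. Widening the pipe shrinks the transfer time, but the wait at the counter stays exactly the same.

```python
def download_time(size_mb: float, bandwidth_mbps: float, latency_ms: float) -> float:
    """Rough total time in seconds: a fixed wait (latency) plus the transfer itself."""
    transfer_s = (size_mb * 8) / bandwidth_mbps   # megabytes -> megabits, divided by megabits per second
    return latency_ms / 1000 + transfer_s

# The same 5 MB page, the same 100 ms of latency, progressively wider "pipes".
for mbps in (10, 50, 100):
    print(f"{mbps:>3} Mbps -> {download_time(5, mbps, 100):.2f} s")
# 10 Mbps -> 4.10 s, 50 Mbps -> 0.90 s, 100 Mbps -> 0.50 s; the 0.1 s of latency never moves.
```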

Speed: The Experience

Ultimately, the interplay between latency, bandwidth, and your actual connection medium (wired, wireless, fiber-optic, etc.) determines the "speed" the user actually experiences. This is an important distinction, because the raw speed at which individual data packets travel isn't changing at all; what changes is how long the user ends up waiting.
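
To see how that interplay plays out, consider the effective throughput of a single small asset. The figures below (a 100 KB file, 100 ms of latency) are assumptions for illustration, but they show why throwing more bandwidth at a latency problem barely moves the needle.

```python
def effective_mbps(size_mb: float, bandwidth_mbps: float, latency_ms: float) -> float:
    """Throughput the user actually experiences: payload divided by total wait."""
    total_s = latency_ms / 1000 + (size_mb * 8) / bandwidth_mbps
    return (size_mb * 8) / total_s

# A 100 KB asset with 100 ms of latency: ten times the bandwidth, almost no difference.
print(f"{effective_mbps(0.1, 100, 100):.1f} Mbps effective")    # ~7.4 Mbps on a 100 Mbps link
print(f"{effective_mbps(0.1, 1000, 100):.1f} Mbps effective")   # ~7.9 Mbps on a 1000 Mbps link
```

In other words, for small requests the "speed" a user perceives is dominated by latency, which is exactly the part the server side controls.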

Companies should understand these distinctions so they can focus their efforts on the things they can control rather than the things outside their control. In other words, the questions developers (and CEOs) should be asking themselves are: How can I reduce latency? How can we improve the user experience by increasing speed on the server side? What front-end and back-end tweaks can we make to increase download speed and reduce latency?

None of this is rocket science, and all developers already know it, but sometimes it takes a real nudge from up top to get everyone behind the idea of a faster, better-branded experience.

Sven Hammar is Chief Strategy Officer and Founder of Apica.
