
Okay, let's all be honest with ourselves here - as citizens of the 21st century we are all pretty tech-savvy. Let's give ourselves that little pat on the back and get it out of the way. Because we also need to be honest about the fact that very, very few of us have any real idea what words like "latency", "bandwidth", and "internet speed" actually mean. And by very few I mean mostly programmers and IT people.
If you happen to be one of the select few who already know what these mysterious words mean, I applaud you. If you don't, I sympathize completely. The Internet remains a rather enigmatic thing to people primarily concerned with the download speed of their torrented movies. But once the welfare of your business begins to depend on how quickly your pages and files reach people, knowing these distinctions becomes increasingly important. Responsible, informed businesspeople with websites (and pulses) owe it to themselves to get this little bit of Internet education under their belts.
Latency: The Wait
The easiest way to understand latency is to think of a long line at some government office. Getting from the door to the counter requires walking a physical distance, the line itself is a bottleneck created by too many requests arriving at the same time, and even reaching the counter isn't enough - there's a final wait while the worker behind the desk processes your request and responds to it. This leg of the journey, everything that happens before your data actually starts arriving, is what the tech industry calls "latency".
Latency is the waiting period that directly precedes the actual download. Every form of internet connection is subject to it, because latency is determined by things outside your own connection: the physical distance your request travels, the number of network hops along the way, and how quickly the server on the other end responds. No matter how fast your connection is, a slow or distant server will still keep you waiting before the first byte arrives.
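If you want to see latency in isolation, here's a minimal sketch in Python (the host "example.com" is just a placeholder; real numbers will vary with routing, distance, and server load). It times how long it takes merely to open a connection - no data is downloaded, so bandwidth plays almost no role:

    import socket
    import time

    def estimate_latency(host: str, port: int = 443, samples: int = 5) -> float:
        """Average time (in seconds) to open a TCP connection to host:port."""
        timings = []
        for _ in range(samples):
            start = time.perf_counter()
            # Connect and immediately close; we only care about the wait,
            # not about transferring any payload.
            with socket.create_connection((host, port), timeout=5):
                pass
            timings.append(time.perf_counter() - start)
        return sum(timings) / len(timings)

    print(f"~{estimate_latency('example.com') * 1000:.0f} ms to reach the counter")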
Bandwidth: The Line
Picture bandwidth as a pipe. A wider pipe clearly allows for faster download times, but latency remains unchanged, because latency has nothing to do with the pipe to begin with.
But what, exactly, is this pipe? Doesn't data already travel at nearly the speed of light? Does having a bigger, thicker wire actually matter? Yes, it does - just not by making the signal itself faster. Data moves as signals through a medium, and the speed of those signals only changes when the medium changes (copper versus fiber, for instance). Widening the pipe doesn't speed the signals up; it makes room for more data to flow through at once.
An easy way to envision this is to think of the same government office, but now instead of one line there are five. Getting to the counter doesn't take as long anymore, but each worker is still processing requests at the same speed.
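Bandwidth, by contrast, only shows itself once real data is flowing. A rough way to estimate it - again just a sketch, with the URL as a stand-in for any large file you control - is to time a download and divide bits by seconds. Note that the measurement still includes the latency up front, which is exactly why small files understate your true bandwidth:

    import time
    import urllib.request

    def estimate_throughput(url: str) -> float:
        """Download url and return rough throughput in megabits per second."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            data = response.read()
        elapsed = time.perf_counter() - start
        # Bits transferred divided by elapsed time. Latency is baked into
        # 'elapsed', so larger files give a more honest estimate.
        return (len(data) * 8) / (elapsed * 1_000_000)

    print(f"~{estimate_throughput('https://example.com/big-file.bin'):.1f} Mbps")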
Speed: The Experience
Ultimately, the interplay between latency, bandwidth, and your actual connection medium (wired, wireless, fiber-optic, etc.) determines the "speed" the user actually experiences. This is an important distinction, because the speed at which any individual packet travels barely changes at all - what changes is how long you wait before the packets start flowing, and how many can flow at once.
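A back-of-the-envelope model makes the interplay concrete: total transfer time is roughly the wait (latency) plus the payload divided by the pipe (bandwidth). The numbers below are purely illustrative - but notice that for a small web page, the low-latency connection wins even with a tenth of the bandwidth:

    def transfer_time(size_bytes: int, bandwidth_mbps: float, latency_ms: float) -> float:
        """Seconds to fetch a payload, under the simple latency-plus-transfer model."""
        return latency_ms / 1000 + (size_bytes * 8) / (bandwidth_mbps * 1_000_000)

    # A 100 KB web page over two hypothetical connections:
    print(transfer_time(100_000, bandwidth_mbps=100, latency_ms=100))  # ~0.108 s
    print(transfer_time(100_000, bandwidth_mbps=10, latency_ms=10))    # ~0.090 s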
Companies should understand these distinctions so they can focus their efforts on the things they can control rather than the things outside their control. In other words, the questions developers (and CEOs) should be asking themselves are: How can we reduce latency? How can we improve the user experience by speeding up the server side? What front-end and back-end tweaks can we make to increase download speed and reduce latency? One such tweak is sketched below.
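To make one of those questions concrete: a classic back-end tweak is telling browsers to cache what they've already downloaded, so repeat visitors skip the round trip - and its latency - entirely. Here's a minimal sketch using only Python's standard library (the one-year max-age is an arbitrary value for illustration, not a universal recommendation):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class CachingHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            # Tell browsers they may reuse this response for up to a year,
            # skipping the server round trip on repeat visits.
            self.send_header("Cache-Control", "public, max-age=31536000")
            super().end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), CachingHandler).serve_forever()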
None of this is rocket science, and developers already know all of it, but sometimes it takes a real nudge from up top to get everyone behind the idea of a faster, better-branded experience.