
Raw Speed vs Latency

If you talk with most non-computer people about cellular data speeds, you’ll inevitably end up hearing about raw speed (megabits per second). Thanks in part to the marketing used by most cellular providers, raw speed is the only facet of data performance that the average person ever hears about. However, for the vast majority of things users do with their phones, raw speed is actually a fairly unimportant facet of the overall performance.

What matters more than raw speed is something known as latency, which often goes by its street name of PING. In simple terms, latency is the time it takes a small message to travel from your device to a server and back, usually measured in milliseconds. When the word PING is mentioned, most people will recall seeing this figure displayed during speed tests on both their phones and their computers, but will have little or no idea what it means. Many won’t even realize that it’s better to have a LOWER value for PING than a higher one (the opposite of raw speed, where higher is better).
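If you’re curious what latency actually looks like, you can estimate it yourself with a few lines of code. Here’s a minimal sketch in Python (example.com is just a placeholder host; a real PING uses ICMP packets, while this times TCP handshakes, each of which also takes roughly one round trip):

    import socket
    import time

    def tcp_rtt_ms(host, port=443, samples=5):
        """Rough latency estimate: time a few TCP handshakes.

        A TCP handshake completes in about one round trip, so this
        approximates PING without needing raw-socket privileges.
        """
        results = []
        for _ in range(samples):
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=3):
                pass
            results.append((time.perf_counter() - start) * 1000)
        return min(results)  # the minimum filters out random scheduling noise

    print("approx. round-trip latency: %.1f ms" % tcp_rtt_ms("example.com"))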

It’s possible, though I’m only theorizing here, that providers don’t advertise latency simply because it isn’t a BIGGER IS BETTER number. They can rely on the unwashed public appreciating transfer rates, even without any reference by which to compare them, because the higher the number, the better the performance of this aspect of data transfer. It’s intuitively obvious, no matter how little understanding you have.

However, in day-to-day use of the internet on a phone (and even on a computer), latency is far more important in determining how fast the internet feels to the user. This is because few circumstances actually call for an uninterrupted flow of large quantities of data. One such scenario is streaming, but if your connection is SUFFICIENTLY fast to sustain the stream rate being used (a 4K video stream, for example, typically needs no more than about 25 Mbps), extra speed provides no benefit whatsoever. One of the few circumstances in which a sustained high-speed connection matters at this point in time is downloading large files.

But what about the bulk of things you do online, such as reading email, surfing web pages, and using social media? In these situations very little TOTAL data is actually transferred, but lots of very small data transfers take place, and each and every one of them takes a delay hit defined by the connection’s latency.

At first glance this doesn’t seem relevant to the real world, because wireless latency is often less than 100 ms (1/10th of a second). However, since this delay applies to each and every small transaction of data, a large number of such transactions quickly adds up to seconds of wasted time.

A typical web page might have hundreds of small images that need to be transferred in separate transactions, and just 100 of those transactions add up to 10 full seconds of delay over a connection with 100 ms latency.

So, let’s say we were to compare two internet connections: one with a raw transfer rate of 50 Mbps and a latency of 150 ms, and another with a raw transfer rate of just 25 Mbps and a latency of 50 ms. Now suppose we use these connections to surf to a web page with 75 small images and a total page size of 12.5 megabytes (which is 100 megabits, since there are 8 bits in a byte). It would take 2 seconds to transfer 100 megabits at 50 Mbps, and 4 seconds at 25 Mbps. So far, that’s a 2-second advantage for the faster connection, but if we then look at the effect of latency we find something surprising. 75 separate transactions with a hit of 150 ms each total 11.25 seconds, while at 50 ms they total just 3.75 seconds. Now add the delays due to latency to the time taken to transfer the data and we get 13.25 seconds with the 50 Mbps connection and just 7.75 seconds with the 25 Mbps connection.
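For readers who want to check the arithmetic, here’s the same comparison as a tiny Python model. It assumes the 75 transfers happen strictly one after another with a full latency hit each, which is the simplified scenario described above, not a simulation of a real browser (browsers open parallel connections, so actual numbers would differ):

    def page_load_seconds(size_megabytes, rate_mbps, requests, latency_ms):
        """Toy model: raw transfer time plus one latency hit per request."""
        transfer = size_megabytes * 8 / rate_mbps   # megabytes -> megabits
        waiting = requests * latency_ms / 1000.0    # milliseconds -> seconds
        return transfer + waiting

    print(page_load_seconds(12.5, 50, 75, 150))  # 13.25 s on the "fast" link
    print(page_load_seconds(12.5, 25, 75, 50))   #  7.75 s on the "slow" link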

Astute readers will no doubt argue that examples such as the above could be invented that would make the 50 Mbps connection (even with high latency) come out ahead, and this is undeniably true. However, for the connection with the faster raw data transfer rate to win, either there would have to be a fairly large amount of data to be transferred (which is rare in most day-to-day interactions) or there would need to be a very small number of individual items to transfer (which is also rare).
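We can even put a number on where that crossover happens. Using the same simplified one-request-at-a-time model, a little algebra gives the page size at which the faster-but-laggier connection finally pulls even (the function below is just my illustration of that algebra, not anything a provider publishes):

    def break_even_megabytes(rate_fast, rate_slow, lat_fast_ms, lat_slow_ms, requests):
        """Page size where size/rate_fast + n*lat_fast equals size/rate_slow + n*lat_slow."""
        extra_wait = requests * (lat_fast_ms - lat_slow_ms) / 1000.0  # seconds
        rate_gap = 1.0 / rate_slow - 1.0 / rate_fast  # seconds saved per megabit
        return extra_wait / rate_gap / 8  # megabits -> megabytes

    # With our example connections and 75 requests, the page would need to be
    # roughly 47 MB before the 50 Mbps / 150 ms link comes out ahead.
    print(break_even_megabytes(50, 25, 150, 50, 75))  # ~46.9

Everyday pages are nowhere near that size, which is exactly the point.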

A better way to look at this is that when it comes to which of the two facets of speed has the larger impact on the user’s EXPERIENCE, it will almost always be latency. This means that it’s preferable to have a connection with slow raw data speeds and excellent latency than to have a blazingly fast connection with poor latency.

So next time you see an ad for a service provider that touts their raw speed as being better than their competitor’s, ask yourself which one has the better ping times. They probably aren’t telling you this because A) they don’t think you’re smart enough to understand, and B) they suspect they wouldn’t win if ping times were compared, and would much prefer to leave the more salient information out of the ad.
