2004/11/30 latency

Basically, delay. Different from bandwidth, and the source of unending confusion and argument because of that.

Latency affects everything from hard drives to the internet. It's often neglected when comparing relative performance - sometimes out of ignorance, and other times to deliberately confuse.

If there's an overall, everybody-gets-it way to explain latency, I sure don't know it - what makes sense to me might not make sense to you. For some examples of people trying to explain it, see:

It's the Latency, Stupid
Satellite Internet Explained In Plain English: Latency
Does RAM Latency Matter?

A long comment has been merged in here:

From the first article:

"The latency is exactly the same thing. It's the minimum time
between asking for a piece of data and getting it, just like the
seek time of a disk, and it's just as important."

Now I'm sure the author thinks that the above is a suitable definition
for latency, but I completely disagree, especially since he brought
seek time of a disk into the picture (seek time is typically used
to measure how fast the read/write head assembly can be positioned).

Now, when I was in school studying electronics (this was about 40
years ago), the passage of time between data request and data
receipt was broken down into three more-or-less independent elements:

1) Latency: the time required for a device to react to a command
to do something. Most often, latency measured the mechanical
performance of the device, not its electrical performance, since
the latter had little effect in those days. That's changed, of
course, with today's stuff. <Smile>

2) Propagation delay: the time required for the signal representing
the data to traverse the interconnecting medium. It was assumed
for the purpose of discussion that transmission speed through copper
occurred at about 70 percent of the speed of light in a vacuum
(186,282 miles per second), which meant an electrical impulse
traveled at about 130,000 miles per second.
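The comment's 70-percent-of-c figure makes propagation delay easy to estimate. Here's a minimal Python sketch using those numbers (the 100-mile run is just an illustration, not from the comment):

```python
# Figures from the comment above: light in a vacuum is 186,282 miles/s,
# and signals in copper travel at roughly 70 percent of that.
C_MILES_PER_SEC = 186_282
COPPER_FACTOR = 0.70

def propagation_delay_ms(distance_miles, velocity_factor=COPPER_FACTOR):
    """Time for a signal to traverse the medium, in milliseconds."""
    return distance_miles / (C_MILES_PER_SEC * velocity_factor) * 1000

# A hypothetical 100-mile copper run:
print(round(propagation_delay_ms(100), 3))  # roughly 0.767 ms
```

Note that this is pure distance-over-speed: no amount of device performance or bus width changes it.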

3) Bandwidth: the theoretical capacity of the medium as a function
of time. Then, as now, bandwidth is measured in some multiple of
bits per second (which *IS NOT* the same as baud rate).

An example in one of the articles referred to the performance of a
satellite link in terms of latency. In fact, propagation delay,
not latency, accounts for the bulk of the time required to get data
from the satellite to the receiver. Compared to the performance of
the satellite and receiver electronics, the medium (electromagnetic
transmission) is terribly slow due to the linear distance between
endpoints (about 23,000 miles one way for a geo-stationary satellite).
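Using the comment's 23,000-mile one-way figure, the arithmetic for a satellite request/response looks like this (a back-of-the-envelope sketch; real paths add ground-station distance and processing time):

```python
C_MILES_PER_SEC = 186_282   # speed of light in a vacuum
ONE_WAY_MILES = 23_000      # ground to geostationary orbit, per the comment

# A request/response crosses the link four times:
# up to the satellite, down to the far end, then back up and down again.
round_trip_sec = 4 * ONE_WAY_MILES / C_MILES_PER_SEC
print(round(round_trip_sec * 1000))  # about 494 ms before any data moves
```

That half second is there no matter how fast the electronics at either end are - which is exactly the commenter's point about propagation delay versus latency.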

Also, some references to the performance of analog modems used
latency to describe what are actually internal bandwidth limitations.
A modem, like any other clocked device, can only process so many
bits of data per second, especially when on-the-fly data compression
gets into the picture. This is not latency, as the speed at which
the modem's circuits react is the same as in any other electrical
device. The limitation is in how rapidly circuits toggle from one
state to the other, which is determined by the system clock.
Incidentally, inexpensive TTL devices -- the type of silicon found
in a typical modem -- can switch between states at better than 20
million times per second.

A good example of the interrelationship between latency, propagation
delay and bandwidth is in mass storage. A disk, being a mechanical
device, experiences latency due to the time required for the
mechanism to react to commands from the host machine -- including
the time required for a given sector to appear under a read/write
head. Propagation time is expended in moving the data signals from
the drive to the host, and the bandwidth of the interconnecting
bus (e.g., IDE, SCSI, SATA, etc.) determines how often bits of data
can be passed. Bandwidth, in turn, is determined by how many bits
can be simultaneously moved on the bus and how rapidly data
line transitions can occur. The limitation on data line transition
speed gets lower as the bus gets longer...
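The disk breakdown above can be sketched as a simple sum. The numbers here are purely illustrative (not from the comment), and propagation delay over a short drive cable is small enough to ignore:

```python
def transfer_time_ms(seek_ms, rotational_ms, size_bytes, bus_bytes_per_sec):
    """Mechanical latency (seek + rotation) plus the time the bus
    needs to move the bytes; cable propagation delay is ignored."""
    return seek_ms + rotational_ms + size_bytes / bus_bytes_per_sec * 1000

# Hypothetical figures: 8.5 ms seek, 4.2 ms average rotational delay,
# a 64 KB read over a bus sustaining 100 MB/s:
print(round(transfer_time_ms(8.5, 4.2, 64 * 1024, 100 * 1_000_000), 2))
```

For a small read like this, the mechanical latency dwarfs the bandwidth term - doubling the bus speed would barely change the total, which is why conflating latency and bandwidth leads people astray.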

So my opinion is that the term "latency" is, like many other words
in today's version of English, being improperly used to describe
the elapsed time between request and receipt. This, of course, is
coming from the same crop of people who tell you to buy the latest
computer so you can "grow your business."

By the way, how does one grow a business? Do you sprinkle fertilizer
on the office roof and pray for rain? Is it better to use chemical
fertilizer or male bovine excrement?

--BigDumbDinosaur 
 

Which, I think, illustrates my point: people think of latency differently (if they think of it at all), and disagree about it, have different ways to explain it, etc.



Got something to add? Send me email.





© Tony Lawrence


