Latency

Latency can generally be thought of as the time required to get something done. According to Microsoft SQL Server TechNet, latency is “the delay that occurs while data is processed or delivered.”

In networking, latency is typically measured as the time it takes for a data packet to travel from one designated point to another. In many cases, it is measured by sending a packet that is echoed back to the sender; the total round-trip time is taken as the latency.

Causes of latency

There are a number of causes for latency. One cause is the signal propagation time. Even at nearly the speed of light, there is a measurable delay for packets traveling long distances.

The transmission itself also contributes to latency. The medium through which the packet travels (optical fiber, wireless, etc.) inevitably introduces some delay. Furthermore, packet size affects latency, since a larger packet takes longer to send and receive than a short one.
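The size effect is captured by transmission delay: the time needed to clock all of a packet's bits onto the link, which is packet size divided by link bandwidth. A small sketch (the packet sizes and link speed below are illustrative values):

```python
def transmission_delay_ms(packet_bytes: int, bandwidth_bps: float) -> float:
    """Time in milliseconds to put a packet's bits onto the link."""
    return packet_bytes * 8 / bandwidth_bps * 1000

# A full 1500-byte Ethernet frame on a 10 Mbit/s link takes 1.2 ms to
# transmit; a 64-byte packet on the same link takes only ~0.05 ms.
```

This is why small packets can feel snappier than large ones on slow links, even though the propagation delay is identical for both.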

Latency is also introduced by the time routers and gateway nodes need to examine and forward transmissions; each hop along the path adds its own processing delay.

Finally, latency is produced by storage and hard disk access delays, both at intermediate devices such as switches and bridges and at the destination after the packet arrives (this type of delay is not always included when calculating latency). Disk latency arises, for example, from the time it takes to position the correct sector under the disk's read/write head.
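For a rotating disk, that positioning cost has a well-known average: on average the desired sector is half a revolution away, so average rotational latency is half the time of one full revolution. A quick sketch:

```python
def avg_rotational_latency_ms(rpm: int) -> float:
    """Average rotational latency of a spinning disk in milliseconds.

    On average the target sector is half a revolution away from the
    read/write head, so the expected wait is half the rotation period.
    """
    seconds_per_revolution = 60.0 / rpm
    return seconds_per_revolution / 2 * 1000

# A common 7,200 RPM drive averages about 4.17 ms of rotational latency,
# on top of any seek time needed to move the head to the right track.
```

Solid-state drives avoid this mechanical delay entirely, which is one reason they deliver much lower access latency than spinning disks.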

Idera minimizes latency

Idera database management products have always been focused on minimizing latency, but the company’s latest tool — SQL Traffic Accelerator — is specifically designed to reduce perceived latency and lower bandwidth utilization to overcome bottlenecks in cloud application ecosystems.