Latency

Latency is an engineering term for the time between a stimulus and a response. In computer networking it describes the various delays that can occur between the moment a request for network services is sent and the moment a response is delivered. Many factors can introduce latency into a network: for example, network name to IP address lookup via DNS, slow routing of traffic at network borders, and bottlenecks at deep inspection systems that are part of the wider network security infrastructure.

Latency should not be confused with bandwidth. Bandwidth is the theoretical maximum throughput between network nodes (see the Bandwidth entry for more details), while latency refers to the delays that prevent that theoretical bandwidth from being achieved.
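Because latency is simply the elapsed time between request and response, it can be measured by timing a round trip. Below is a minimal Python sketch: the `measure_latency` helper and the artificial 50 ms delay are illustrative assumptions (in practice the timed operation would be a real DNS lookup or network request).

```python
import time

def measure_latency(operation):
    """Time one request/response cycle and return the delay in seconds."""
    start = time.perf_counter()
    operation()                       # e.g. a DNS lookup or HTTP request
    return time.perf_counter() - start

# Simulate a network round trip with an artificial 50 ms delay.
latency = measure_latency(lambda: time.sleep(0.05))
print(f"round-trip latency: {latency * 1000:.1f} ms")
```

Averaging many such measurements, rather than relying on a single sample, gives a more realistic picture, since routing and inspection delays vary from packet to packet.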
