Latency

Latency is the delay before a transfer of data begins following an instruction for its transfer.

expanded

In computing, latency refers to the time interval between a user's action and the system's response, or more specifically, the time from when data is sent by a source until it is received at the destination. Low latency is crucial in real-time applications such as video conferencing and online gaming, where data must be exchanged with minimal delay.
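As a rough illustration, the sketch below (using only Python's standard library) times the interval between issuing an HTTP request and receiving the first byte of the response, one common way to observe latency in practice. The URL is a placeholder, not a reference to any particular service, and the measurement includes connection setup as well as network transit.

```python
import time
import urllib.request

def measure_latency_ms(url: str, attempts: int = 3) -> float:
    """Average time in milliseconds between sending a request and
    receiving the first byte of the response."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read(1)  # block until the first byte arrives
        samples.append((time.perf_counter() - start) * 1000)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    # example.com is a placeholder; substitute any reachable endpoint
    print(f"average latency: {measure_latency_ms('https://example.com'):.1f} ms")
```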

examples

In high-frequency trading platforms, latency must be minimized to microseconds to execute trades effectively.

These platforms operate in financial markets where decisions and transactions happen in fractions of a second, so their systems are continually optimized for the lowest possible latency.
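A minimal sketch of measuring latency at microsecond granularity is shown below; `submit_order` is a hypothetical stand-in for an order-handling routine, not a real trading API, and a production system would measure end-to-end latency across the network rather than a single local call.

```python
import time

def mean_call_latency_us(fn, *args, repeats: int = 1000) -> float:
    """Average latency of one call to fn, in microseconds."""
    start = time.perf_counter_ns()
    for _ in range(repeats):
        fn(*args)
    elapsed_ns = time.perf_counter_ns() - start
    return elapsed_ns / repeats / 1_000

# Hypothetical stand-in for an order-handling routine.
def submit_order(order_id: int) -> int:
    return order_id * 2

print(f"{mean_call_latency_us(submit_order, 42):.2f} microseconds per call")
```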

In cloud computing services, latency can range from 20 milliseconds to over 100 milliseconds depending on network topology.

Latency impacts how efficiently cloud services can process, store, and retrieve data in data centers, influencing the performance of hosted applications.

related terms