What Is Latency in Networking?

For most people, a fast and reliable Internet connection is a priority. It's essential for everything from daily communication to business operations. But how responsive a connection feels depends heavily on network latency. In this article, learn what latency is, why it matters, how it affects your online experience, and how to improve it.

What is latency?

In computer networking, latency, or lag, is the time it takes for a data packet to travel between two points, such as two computers or servers, and back again. For this reason, latency is often expressed as round-trip time (RTT): the time it takes a data packet to reach its destination and return.

This is the essence of latency: the time between sending a message and receiving a reply. In speed tests and latency tests, it's reported as the ping rate and measured in milliseconds (ms).
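As a rough illustration, RTT can be measured in code by timing a complete request/reply exchange. The sketch below times a TCP round trip against a loopback echo server so it runs self-contained; real Internet RTTs would be far higher, and the names here are illustrative rather than part of any standard tool.

```python
import socket
import threading
import time

def measure_rtt_ms(host: str, port: int) -> float:
    """Time one send/receive round trip over TCP, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(b"ping")  # outbound leg
        conn.recv(4)           # wait for the echoed reply (return leg)
    return (time.perf_counter() - start) * 1000

# Loopback echo server, used only to make the example self-contained.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    conn, _ = server.accept()
    conn.sendall(conn.recv(4))  # echo the 4 bytes back
    conn.close()

t = threading.Thread(target=echo_once)
t.start()
rtt = measure_rtt_ms("127.0.0.1", port)
t.join()
server.close()
print(f"RTT: {rtt:.3f} ms")
```

Because both endpoints live on the same machine, the printed value will be a fraction of a millisecond; the point is only to show where the clock starts and stops.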

Ideally, a good network would have low network latency. This results in a better user experience; users have faster speeds and quick response times. This is especially important for gamers, video streamers, or those who do a lot of video conferencing, as low lag maintains the real-time responsiveness that is essential for those tasks. For modern applications and tasks that rely on quick responses, low RTT is necessary.

High latency, on the other hand, creates bottlenecks in communication because of the long delays. It results in a worse experience for the user, as they will experience lagging and poor quality of service.

On a connection with high RTT, webpages and systems load slowly, and many real-time applications become unusable. Organizations that rely on real-time applications, online services, or time-sensitive transactions require low latency.

Minimizing latency is one way of improving performance. Because a relatively small increase in response times can ruin a user's overall experience, managing latency is key.


What causes latency?

Latency is an important factor to consider when implementing computer networking applications. By understanding the causes, you can improve the performance of your applications. This section covers several common causes of lag and why they affect your Internet connection.

Distance

The geographical distance between systems and users is the primary cause of lag. The longer the distance between two computers or servers, the longer it takes for data to arrive at its destination, which increases lag.

Visiting a website hosted by a server in an adjacent city will provide much faster speeds and response times than visiting a website hosted in another country. Distance has a direct impact; the shorter the distance, the lower the latency.

Distance matters because of how fast light travels. Though the speed of light is incredibly fast, it isn't instantaneous, so it takes time for signals to traverse the physical infrastructure that carries data.

Transmission media

The type of medium used to transmit data, voice, and video also has a big impact on lag. For example, packets traveling over fiber optic cable experience lower latency than packets traveling over copper wire, because fiber is the faster transmission medium.

Satellite Internet typically has the highest latency, with round-trip times of roughly 594 to 612 ms. Fiber optic Internet is the fastest at about 10-15 ms, while cable Internet and DSL fall around 15-27 ms and 24-42 ms, respectively.

Routers

Routers receive and forward incoming data on a network, and how quickly they do so has a direct impact on latency. Each router hop adds delay; a path that traverses many router-to-router connections will have noticeably higher latency.

Whether you connect over an Ethernet cable or a wireless network, your router can slow the link between your devices and your Internet service provider's modem. Upgrading your router can improve latency, though it won't solve every connection issue. Outdated routers and switches can also cause problems, as these devices may lack sufficient memory to handle network traffic properly.

Network congestion

Network congestion also causes delays. When there's a significant amount of traffic on a network, it slows down the delivery of data packets, leading to increased lag. This often happens during peak usage hours.

In the evenings, for example, more people are home from work and may be streaming movies or playing online games. The increase in traffic creates bottlenecks and results in data transferring at a slower rate.

Large file downloads also generate heavy traffic, causing congestion. In general, all users experience higher lag during peak traffic times.

Other causes of network latency

Though the causes above have the greatest impact on lag, other factors affect speeds as well. A weak signal slows your network connection; you can boost a weak WiFi signal with a repeater, but the repeater itself can add latency. Overloaded servers are another culprit: a server struggling to handle many simultaneous requests responds more slowly.

Your Internet service provider may also impose bandwidth limitations, which can result in delays if you exceed your allotted bandwidth. And as with all technology, software issues can be a problem: errors in network applications or protocols can cause inefficiencies that result in high latency.

Five types of latency

Latency occurs in more than just networks. Here are five different kinds of latency and how they might affect users.

Fiber optic latency

Fiber optic latency is the time light takes to travel a specified distance through a fiber optic cable, measured in microseconds or nanoseconds. A number of factors affect it, including the length of the cable and the type of fiber.

The longer the cable, the longer it takes for light to travel. For example, a 100-kilometer cable will have a latency of about 500 microseconds.
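That 500-microsecond figure follows from the propagation speed of light in glass, roughly two thirds of its vacuum speed, or about 200,000 km/s. A quick sanity check of the arithmetic:

```python
SPEED_IN_FIBER_KM_PER_S = 200_000  # light in glass: ~2/3 the vacuum speed of light

def fiber_delay_us(distance_km: float) -> float:
    """One-way propagation delay through fiber, in microseconds."""
    return distance_km * 1_000_000 / SPEED_IN_FIBER_KM_PER_S

print(fiber_delay_us(100))  # 100 km of cable -> 500.0 microseconds
```

This is only the propagation component; equipment along the path adds further delay on top of it.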

Similarly, single-mode fiber has lower latency than multimode fiber: its smaller core limits the paths light can take, which reduces dispersion and scattering.

Disk latency

Disk latency is the time it takes for a computer to access data on a disk drive, measured in milliseconds. Several factors affect it, including the type of disk drive, data transfer rate, seek time, and the amount of data being accessed.

Disk latency directly affects the responsiveness of applications that rely on disk operations, such as data retrieval, file access, or database queries. Lower disk latency leads to faster access time, while higher disk latency results in delays that impact the overall user experience.

Audio latency

This is the delay between sound being created and sound being heard. The distance between the source and the listener, the type of audio device, and the device's processing power all determine the delay. Sound travels faster in denser mediums: it is slowest through air, faster through liquids, and fastest through solids.

When an audio latency is high, you'll notice a delay between the action that generates the sound and the moment the sound is actually heard.

Operational latency

Operational latency is the delay that occurs during a system's computing operations or task executions. It encompasses the time taken for various tasks, operations, or actions to produce the desired outcome.

The lag can arise from computational delays, data processing time, network communication delays, or any other time-consuming step in a system's execution. To optimize system performance, users need to reduce operational lag, as it causes slow response times, outages, and errors.

Network latency

Network latency is the delay that occurs during communication over a network. A more noticeable delay occurs when two devices from different continents are communicating over the Internet. Because of the number of connections required, as well as the distance between devices, lag may be higher. Network infrastructure also impacts latency.

How do you measure lag?

Latency varies by application, but it is most often measured in one of three ways: through ping, through traceroute, or through network monitoring tools.

  • Ping sends an ICMP Echo Request message to the destination device and measures the time it takes for a response to return. The RTT is used as an indicator of lag. Ping is the most frequently used method for measuring latency.
  • Traceroute is a diagnostic tool that traces a packet's path between two points. It sends a series of packets with increasing time-to-live (TTL) values; each router along the path responds with an ICMP Time Exceeded message. By analyzing the timing of these responses, you can estimate the latency at each hop.
  • Network monitoring tools are automated tools used to collect data on network performance metrics, including packet loss and latency.

The way you choose to measure latency will vary depending on your network's needs.
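Whichever tool you use, the raw output is a series of RTT samples; the min/avg/max summary that ping prints at the end is derived from them. A small sketch, using made-up sample values:

```python
# RTT samples in milliseconds; these values are made up for illustration.
samples_ms = [21.4, 19.8, 25.1, 20.3, 22.7]

rtt_min = min(samples_ms)
rtt_avg = sum(samples_ms) / len(samples_ms)
rtt_max = max(samples_ms)
# Jitter (one common definition): mean absolute deviation from the average RTT.
jitter = sum(abs(s - rtt_avg) for s in samples_ms) / len(samples_ms)

print(f"min/avg/max = {rtt_min}/{rtt_avg:.2f}/{rtt_max} ms, jitter = {jitter:.2f} ms")
```

A low average with high jitter can feel worse in practice than a slightly higher but steady RTT, which is why monitoring tools report both.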

How to improve latency issues

If you find that you do have high latency, how can you fix it? Managing latency is crucial to creating a good user experience. Fortunately, there are several ways to solve the problem.

  • Use content delivery networks (CDNs). CDNs help reduce lag by bringing web content closer to users regardless of their geographical location. They cache content on servers near users, reducing response time.
  • Upgrade computer hardware. Upgrading network hardware, like routers, cables, and switches, increases network speed and reduces lag.
  • Change your DNS server. Your DNS server is responsible for translating hostnames into IP addresses. If your DNS server is overloaded or slow, it causes lag.
    Consider changing your DNS server to the Cloudflare DNS server, which is 1.1.1.1, or the Google DNS server, which is 8.8.8.8.
  • Reduce devices on your network. Too many devices connected to your network causes an increase in traffic.
  • Optimize and uninstall applications. Optimize your applications and databases for peak performance to lower data transfer and processing times, and uninstall unused applications that consume network resources in the background.
  • Restart your router and modem. Sometimes restarting your equipment can fix simple lag issues.

One of these solutions should fix any lag issues you experience. If the issue persists, it could be a problem with your Internet service provider or another external source.

What's the difference between latency and bandwidth?

Though interrelated, latency and bandwidth are different concepts. Both describe a network, but they measure different things.

Bandwidth is the capacity at which a network can transmit data: the potential amount of data that can transfer in a given amount of time. Latency is the time it takes a data packet to travel from source to destination and back. The two interact, but more bandwidth does not by itself lower latency; a high-bandwidth connection can still feel slow if each packet takes a long time to arrive.
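One way to see the difference is a back-of-the-envelope transfer-time model: total time is roughly one round trip to set things up plus the payload's transmission time, size divided by bandwidth. The numbers below are illustrative, and the model ignores real-world effects such as TCP slow start.

```python
def transfer_time_s(size_mb: float, bandwidth_mbps: float, rtt_ms: float) -> float:
    """Rough transfer time: one setup round trip plus payload transmission."""
    return rtt_ms / 1000 + size_mb * 8 / bandwidth_mbps  # megabytes -> megabits

# Small web request: latency dominates, so 10x the bandwidth barely helps.
print(transfer_time_s(0.05, 100, 100))   # ~0.104 s
print(transfer_time_s(0.05, 1000, 100))  # ~0.100 s

# Large download: bandwidth dominates, and the round trip is negligible.
print(transfer_time_s(500, 100, 100))    # ~40.1 s
```

This is why upgrading to a faster plan often does little for web browsing or gaming, where transfers are small and latency is the bottleneck.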

Frequently asked questions

Why is low latency important?

Low latency is important in a variety of fields, including gaming, cloud computing, and high-frequency trading. In gaming, low latency ensures that a player's actions register on screen without a noticeable delay.

In high-frequency trading, low latency is essential for making profitable trades. When there is high latency, traders miss out on opportunities to trade stocks at desired prices. Any task that requires quick, real-time response needs low lag.

What does latency mean?

Latency by definition is the time it takes for a computer or server to send data to another computer or server and receive a response back.

What is a good latency?

Good latency is typically considered to be around 40 to 60 milliseconds or lower.

Is 20 ms latency good?

Yes, 20 ms is excellent latency. It works well for gaming, video calls, and smooth video streaming.

Is high or low latency better?

Low latency is ideal because it results in lower response times and faster Internet speeds. It gives users a better online experience.