You can assess network performance using jitter and latency metrics. The main distinction between the two is that latency is the delay a packet experiences as it travels through the network, whereas jitter is the variation in that delay over time. High latency and high jitter both degrade network performance, so it is important to monitor them regularly. A speed mismatch between two devices, for example, can cause congestion: buffers overflow, traffic arrives in bursts, and both latency and jitter rise.
What is Jitter?
Jitter is a phenomenon where data packets experience varying delays in transmission because of network congestion or, in some cases, route changes. It is a common source of buffering in video streaming, and extreme jitter can even drop VoIP calls. If every packet arrived with the same delay, audio and video would be unaffected; but when the delay varies, packets can arrive out of order. High jitter denotes significant variation in delay, while low jitter denotes minor variation.
A certain level of jitter is acceptable because it does not interfere with the browsing or viewing experience. As a rule of thumb, a jitter of 30 milliseconds or less is acceptable, as it is barely noticeable. Anything above that, however, will affect your calls and streaming. When downloading files, even higher levels of jitter go largely unnoticed.
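To make the idea concrete, here is a minimal Python sketch that estimates jitter as the average variation between consecutive latency samples and checks it against the 30 ms rule of thumb. The sample values are hypothetical ping results, purely for illustration.

```python
def average_jitter(latencies_ms):
    """Mean absolute difference between consecutive latency samples."""
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    return sum(diffs) / len(diffs)

samples = [42.0, 48.5, 41.2, 75.0, 44.3]  # hypothetical RTTs in ms
jitter = average_jitter(samples)
print(f"Average jitter: {jitter:.1f} ms")
if jitter <= 30:
    print("Within the commonly cited 30 ms threshold")
else:
    print("High enough to disrupt calls and streaming")
```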
What is Latency?
Latency is the delay between an action and the response to that action. In networking terms, it is typically measured as round-trip time: the time from when a packet leaves the source until the source receives the destination's acknowledgement. The most familiar example of latency is the time a website takes to start loading after you click a URL.
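One simple way to sample round-trip latency yourself is to time a TCP handshake, as in the minimal Python sketch below. The host and port are placeholders; dedicated tools such as ping use ICMP instead, which requires raw-socket privileges.

```python
import socket
import time

def tcp_latency_ms(host, port=443, timeout=3.0):
    """Time how long the TCP handshake to host:port takes, in ms."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; the handshake round trip is complete
    return (time.perf_counter() - start) * 1000

print(f"Latency to example.com: {tcp_latency_ms('example.com'):.1f} ms")
```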
Because internet data travels at close to the speed of light, delays may seem negligible in theory. In practice, infrastructure, equipment, and the distance between source and destination all add to them. The good news is that there are several ways to lower latency and improve responsiveness.
Bandwidth
The maximum amount of data that can move across a network in a given time is called bandwidth. The actual amount delivered, the throughput, is usually lower. In short, in the absence of latency, throughput would equal bandwidth; but zero latency is practically impossible to achieve, so throughput always falls short of bandwidth.
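A rough sketch of why latency keeps throughput below raw bandwidth: a TCP sender can have at most one window of data in flight per round trip, so the window size divided by the round-trip time caps the achievable rate. The link speed and window size below are assumed figures for illustration only.

```python
def max_tcp_throughput_mbps(window_bytes, rtt_ms):
    """Upper bound on TCP throughput: window size divided by round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

bandwidth_mbps = 1000          # a 1 Gbps link (assumed)
window = 64 * 1024             # classic 64 KiB TCP window (assumed)
for rtt in (1, 10, 50, 100):   # round-trip times in milliseconds
    cap = min(bandwidth_mbps, max_tcp_throughput_mbps(window, rtt))
    print(f"RTT {rtt:>3} ms -> at most {cap:,.1f} Mbps of the {bandwidth_mbps} Mbps link")
```

Even on this fast link, a 100 ms round trip limits a single 64 KiB window to roughly 5 Mbps, which is why latency matters even when bandwidth is plentiful.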
Can jitter be greater than latency?
Yes, jitter can be greater than latency, because the two measure different things: jitter measures the variation in delay between packets, while latency measures the time a single packet takes to travel from source to destination. A link with a low average latency can still show large swings between packets, as the toy example below shows.
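Here is a small illustration with hypothetical numbers: packets alternate between 5 ms and 60 ms delays, so the average latency stays modest while the packet-to-packet variation exceeds it.

```python
latencies = [5, 60, 5, 60, 5]   # hypothetical delay samples in ms
avg_latency = sum(latencies) / len(latencies)
diffs = [abs(b - a) for a, b in zip(latencies, latencies[1:])]
jitter = sum(diffs) / len(diffs)
print(f"Average latency: {avg_latency:.0f} ms, jitter: {jitter:.0f} ms")
# -> Average latency: 27 ms, jitter: 55 ms (jitter > latency)
```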
Regardless, both jitter and latency affect the network adversely, so it is important to keep an eye on both and apply fixes if either rises above acceptable levels.