Tuesday, November 19, 2019

Is Connection Latency Important to Your Business?

By: John Shepler

We may be inclined to think that connection speed is the most important consideration for private lines and Internet access. If web pages are loading slowly and files take forever to move, then clearly the network is starved for bandwidth. Just order more Mbps or even Gbps and everything will straighten out, right? Sometimes, yes. Sometimes no. There is another network consideration that can affect your business big time. That is latency.

What Does Latency Mean?
Latency is a time lag. Nothing happens instantaneously, but if the time lag is short enough it will seem that way. You know the expression, “in the blink of an eye”? That’s low latency.

In computing, you experience latency as applications that just don’t keep up with what you are doing. If you press a key and it doesn’t appear on the screen for a split second, you’ve got latency. If you type a command and nothing happens for a second, that’s latency.

When operating on local area networks and in-house data centers, latency may not be all that noticeable. Programs are responsive. Video is nice and smooth. Files move quickly. If the system can keep up with you, latency just isn’t an issue.

When Latency is Noticeable
You often get your first taste of latency when you connect to the Internet or the cloud. Suddenly things seem to be a tad sluggish. It gets destructive when the system is so slow to respond that it interrupts your workflow. You almost feel like you’ve gone back to the days of batch processing where you submit a program and wait for the results to print out.

Worst-case latency shows up in real-time processes. VoIP telephony gets a bad name when latency exceeds a hundred or two hundred milliseconds. On a phone call, you expect to carry on a normal conversation. That sometimes includes both sides talking at once. If you ask a question and don’t get a response immediately, you might start talking again. Right then, you hear the other person’s response just as you say something else. It quickly becomes intolerable. If you are stuck with the situation, you can work around it by consciously taking turns, like you would with a two-way radio.

Higher Bandwidth, Lower Latency
One cause of latency is network traffic jams, otherwise known as congestion. A WAN pipe of any size, whether its bandwidth is expressed in Mbps or Gbps, can only carry so many packets per second. If you try to send more, they pile up in a transmission buffer or, worse, get dropped. The fix for this type of latency problem is to simply add more capacity. If your T1 line is full, a 10 Mbps Ethernet line may be way more than enough. Likewise, you may really need 100 Mbps or a full Gigabit per second for the connection to appear transparent.
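
To put rough numbers on “only so many packets per second,” here is a minimal Python sketch that estimates packet throughput and serialization delay for the line sizes mentioned above. It assumes full-size 1500-byte packets and nominal line rates; the figures are illustrative, not from the article.

# Rough illustration: packets per second and serialization delay per line rate.
PACKET_BITS = 1500 * 8  # one full-size Ethernet packet, in bits

line_rates_bps = {
    "T1 (1.544 Mbps)": 1.544e6,
    "Ethernet 10 Mbps": 10e6,
    "Ethernet 100 Mbps": 100e6,
    "Gigabit Ethernet": 1e9,
}

for name, rate in line_rates_bps.items():
    packets_per_second = rate / PACKET_BITS       # how many full packets fit per second
    serialization_ms = PACKET_BITS / rate * 1000  # time to clock one packet onto the wire
    print(f"{name:18}  ~{packets_per_second:>9,.0f} packets/s, "
          f"{serialization_ms:.3f} ms per packet")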

Another way to relieve latency-induced madness is to prioritize traffic. Real-time processes like VoIP telephony and teleconferencing take highest priority and can work great on even limited capacity lines. As long as there is still some bandwidth left, you should prioritize business applications in the cloud next, with file transfers and backups last. If you run out of bandwidth so that the lowest level processes never finish or take forever, you need to add more bandwidth, pure and simple.
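
As a conceptual illustration of that prioritization, here is a small Python sketch of strict-priority queuing with made-up traffic classes. In real networks this happens inside routers and switches through QoS markings rather than in application code, so treat it as a picture of the ordering, not an implementation.

# Strict-priority scheduling sketch: voice always leaves the queue first,
# then cloud application traffic, then bulk transfers. Classes are illustrative.
import heapq

VOICE, CLOUD_APP, BULK = 0, 1, 2  # lower number = higher priority

arrivals = [
    (BULK, "nightly backup chunk"),
    (VOICE, "VoIP frame"),
    (CLOUD_APP, "business app request"),
    (VOICE, "VoIP frame"),
]

queue = []
for seq, (pclass, label) in enumerate(arrivals):
    heapq.heappush(queue, (pclass, seq, label))  # seq keeps arrival order within a class

while queue:
    pclass, seq, label = heapq.heappop(queue)
    print(f"transmit next (class {pclass}): {label}")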

Higher Bandwidth, Same Latency
What happens if you increase your bandwidth by 10x or 100x and nothing improves? “Hello, is this line working?”

With congestion relieved, something else must be slowing things down. Remember that latency is simply a time delay between transmission and reception and that nothing happens instantly. Signals can move only as fast as the speed of light, which even at 186,000 miles per second turns out to be 186 miles per millisecond. If both ends of the connection are 1,860 miles apart, you’ve got a built-in transmission time of 10 ms each way or 20 ms round trip. If you need lower latency than this, you’ll just have to move closer.
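
That back-of-the-envelope math is easy to reproduce. Here is a tiny Python sketch built on the same 186-miles-per-millisecond figure:

# Distance alone sets a latency floor: light covers about 186 miles per millisecond.
MILES_PER_MS = 186.0

def propagation_delay_ms(distance_miles):
    """One-way delay for a signal covering the given distance at light speed."""
    return distance_miles / MILES_PER_MS

one_way = propagation_delay_ms(1860)
print(f"1,860 miles: {one_way:.0f} ms one way, {2 * one_way:.0f} ms round trip")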

Know that light moves through fiber optic cable at roughly two-thirds of its free-space speed, and transmission equipment adds delays of its own, so real-world latency can run a third or more above that ideal figure. Still not a big problem, as latencies in the tens of milliseconds aren’t bothersome for nearly all processes. But what if that connection goes to a geosynchronous satellite? Now you are talking maybe 500 ms round trip. That’s most definitely noticeable and probably a show-stopper for most phone calls and some cloud services. This is why the new Low Earth Orbit satellite constellations are so eagerly anticipated. At altitudes of a few hundred miles instead of more than twenty thousand, latency can get back to nearly fiber optic line performance.
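
For a rough sense of scale, here is a Python sketch comparing fiber, geosynchronous and Low Earth Orbit round trips. The orbital altitudes and the two-thirds fiber speed are assumptions of mine rather than figures from the article, but they land close to the numbers quoted above.

# Assumed altitudes: ~22,236 miles for geosynchronous orbit, ~340 miles for a
# typical LEO shell. Paths ignore slant angles and the terrestrial legs.
MILES_PER_MS = 186.0                     # light in free space
FIBER_MILES_PER_MS = MILES_PER_MS * 2/3  # light in fiber runs about a third slower

def round_trip_ms(one_way_miles, miles_per_ms=MILES_PER_MS):
    """Round-trip delay for a path of the given one-way length."""
    return 2 * one_way_miles / miles_per_ms

print(f"1,860 miles of fiber:         ~{round_trip_ms(1860, FIBER_MILES_PER_MS):.0f} ms")
print(f"GEO hop (up + down each way): ~{round_trip_ms(2 * 22236):.0f} ms")
print(f"LEO hop (up + down each way): ~{round_trip_ms(2 * 340):.0f} ms")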

Other latency issues can be traced to network equipment that isn’t working correctly or to the inherent nature of the good old Internet. Remember that the Internet was designed by the Defense Department to be robust, not particularly efficient. Packet routing can take long and convoluted paths and suffer various levels of congestion along the way. If you are using a shared bandwidth service, such as cable broadband, DSL, satellite or cellular broadband, other users can clog the link and up goes your latency. Even more maddening, performance can vary from minute to minute, so you have no consistency. Dedicated direct connections to your cloud provider can dramatically improve performance if this is your problem.

Are you having network performance issues, especially after a recent move from an in-house data center to the cloud? Your cloud service can be working perfectly well even though everything seems to drag. You might be surprised to ping test your line and discover that it is the weak link in the system. Find out now what low latency bandwidth options are available and what it costs to upgrade and relieve your performance issues.
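
If you want a quick do-it-yourself check before calling anyone, a few lines of Python can approximate what a ping test would show. This sketch times a TCP handshake instead of sending a true ICMP ping (which usually needs elevated privileges), and the host name is only a placeholder for your own cloud endpoint.

# Approximate round-trip latency by timing TCP connection setup to a host.
import socket
import time

def tcp_rtt_ms(host, port=443, timeout=3.0):
    """Rough round-trip time: how long a TCP handshake to host:port takes."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

samples = [tcp_rtt_ms("example.com") for _ in range(5)]  # placeholder host
print(f"min {min(samples):.1f} ms, median {sorted(samples)[2]:.1f} ms, "
      f"max {max(samples):.1f} ms")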

Click to check pricing and features or get support from a Telarus product specialist.


