
The Science of Network

July 22, 2016
| Articles, Cloud Servers, News

Bandwidth is always the metric people reach for when they talk about networking. The problem is much like being asked about your commute to work, and the conversation going something like this:

THEM: “How long does it take to get to work for you?”
YOU: “There’s a 6 lane highway.”
THEM: “But, how long does it take to get there?”
YOU: “It’s 6 lanes and 10 km long, so you can do the math…”

A road may be wide enough to travel comfortably, but what about the traffic itself? And those traffic lights? That is what really matters!

In fact, would you rather commute to your workplace on a 6-lane highway with 5 traffic lights along the way, or would you choose a 2-lane road with no traffic lights at all? I bet the latter seems more appealing!

Roads are a good real-life analogy for networking, latency, and bandwidth, but enough about roads. Let's move on to the topic!

What is latency?

The answer is simple: latency is delay. More specifically, it is the time required for data to reach the receiver. Every computer network has some amount of latency; you will barely notice it on a home or local area network. Such inherent latency is considered low.

Latency can increase suddenly because of your hardware, the traffic load, or the distance between you and your target, among other factors. Such delays can be noticeable and annoying for network users and are called high latency.
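Distance in particular sets a hard floor on latency: no signal travels faster than light. A minimal sketch of that lower bound, assuming light in optical fiber covers roughly 200 km per millisecond (an illustrative approximation, not a measured value):

```python
# Rough lower bound on round-trip latency imposed by distance alone.
# Light in fiber moves at about 2/3 the vacuum speed of light,
# i.e. roughly 200,000 km/s, or 200 km per millisecond.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # assumed approximation

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time over fiber, ignoring routing and queuing."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# New York to London is roughly 5,600 km as the crow flies,
# so no route can beat about 56 ms RTT, however wide the pipe is.
print(round(min_rtt_ms(5600), 1))  # -> 56.0
```

Real routes are longer than the straight-line distance and add router hops, so measured RTTs are always higher than this floor.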

Latency is measured in milliseconds. One of the most popular ways to measure it is a ping test: a small amount of data (around 32 bytes) is sent to a host, and when the reply comes back, the RTT (Round-Trip Time) is reported.
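ICMP ping usually requires raw-socket privileges, so a common unprivileged stand-in is to time a TCP handshake instead. A rough sketch in Python (the handshake adds some overhead, so this slightly overstates the pure network RTT; the host in the commented example is a placeholder):

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 80, timeout: float = 3.0) -> float:
    """Approximate RTT by timing a TCP connection setup."""
    start = time.perf_counter()
    # create_connection completes the TCP three-way handshake,
    # which costs about one network round trip.
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

# rtt = tcp_rtt_ms("example.com")  # typically tens to hundreds of ms
```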

Measured latency on the Internet can vary with the routing used by your ISP. The Measuring Broadband America report of December 2015 presents typical connection latencies for the top U.S. broadband services:

[Figure: typical Internet connection latencies of the top U.S. broadband services]

As you can see, cable connections typically have the lowest latency, whereas satellite has the highest of all Internet connection types.

What counts as high latency depends on the user's expectations. A satellite Internet user who is used to long delays will not notice an additional 50 or 100 ms, whereas online gamers prefer latency below 50 ms and will definitely feel higher lag or random spikes.

In general, ordinary users get the best experience with latency below 100 ms (0.1 second), as any additional delay will be noticed.

Wondering why 100 ms is acceptable? Here's a fun fact for you:

“Average time for information to travel to
the brain is 80–100 ms”

Many people think latency and bandwidth are the same thing, but delay and capacity are quite different. As mentioned above, latency is a delay, measured in milliseconds, whereas bandwidth is the amount of data that can be transferred per second, expressed in bits per second.

Together, latency and bandwidth determine how quickly a web page loads or a file transfers. So do not take broadband slogans like “get high-speed access” at face value: what they actually mean is “get high-capacity access”.
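That interplay can be sketched with a simplified model: total transfer time is one round trip plus the time to push the bits through the pipe. Real protocols add handshakes and TCP slow start, so treat the numbers as illustrative:

```python
def transfer_time_ms(size_bits: float, bandwidth_bps: float, rtt_ms: float) -> float:
    """Simplified single-request transfer time:
    one round trip plus serialization delay."""
    return rtt_ms + size_bits / bandwidth_bps * 1000

# A 100 KB page (800,000 bits) fetched two ways:
fat_slow = transfer_time_ms(8e5, 100e6, rtt_ms=100)  # wide pipe, high latency
slim_fast = transfer_time_ms(8e5, 20e6, rtt_ms=10)   # narrow pipe, low latency
print(round(fat_slow), round(slim_fast))  # -> 108 50
```

For a small page the low-latency link wins despite one-fifth the bandwidth, which is exactly why "high-speed" slogans about capacity can mislead.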

Bandwidth issues are easier to solve than latency problems: for more bandwidth, you add more pipes. Some latency issues cannot be solved at all.

The causes of latency

Sometimes the source of latency is your home or office network:

Overloaded router or modem. If your family members (or coworkers) all use one network router at the same time, it will eventually bog down. Network contention among multiple users means their requests interrupt each other's processing, causing lag. The solution is simple: choose a more robust router model, or add another one to the network.

Overloaded client device. Client devices such as PCs, laptops and smartphones that cannot process network data quickly enough may cause higher latency. Today's computers are quite powerful, but running too many applications simultaneously slows them down significantly. Solution: close or uninstall unnecessary applications on your device.

Malware. Some viruses can slow your computer's performance much like an overloaded device. Run antivirus software on the affected network device to solve this issue.

You have probably experienced noticeable lag when loading a website or running an online application. Sources of high latency on the Internet include the following:

Online application load. Shared Internet servers packed with online games, websites and other client-server applications can become overloaded, and users may face lag. The solution would be to use Host1Plus Cloud Servers.

Weather. Sounds unbelievable? Yet wireless Internet connections such as satellite can suffer interference from rain. This interference corrupts network data transmission and therefore raises latency. The fix is to switch from a wireless to a cable or fiber connection.

Lag switches. Some online gamers run a device called a lag switch on their local network. It intercepts network signals and introduces significant delays into the data flowing back to other players in a live session. It is not popular among gamers, but if it happens, the only solution is to stop playing with them.

Internet traffic load. Traffic load usually peaks at certain times of day, especially in the evening, when everybody comes home and surfs the Web. The length of the delay depends on the service provider and the user's geographic location, so the only fixes are changing ISP or moving elsewhere.

Cloud Servers locations & reach revealed

As you might already know (and if you don't, you might want to take a look at this), we are launching Cloud Servers in Frankfurt and Chicago to ensure the best connectivity for customers in Europe and North America. We also plan to add São Paulo and Hong Kong locations later this year.

Conclusions

All in all, for a better web browsing experience we should pay attention to latency rather than bandwidth. Solve the issues of high latency and surf the web smoothly and pleasantly.

By Dovainis Kalėda
