OK, the title of this post is a bit of a mouthful, but I found the analogy to the Observable Universe and what lies beyond it interesting. This post is about an interesting learning experience I went through a number of times in recent weeks: staying at places for several days with marginal Internet connectivity and trying to get my everyday projects done. The question: how much should I do locally, and how much should I push into the cloud and run from there?
What Is Marginal?
OK, let’s define marginal Internet: in my case, that’s hanging off a radio site around 2 km away, which gives me a single, rather loaded band 20 carrier with a bandwidth of 10 MHz at a signal level of -112 dBm. During most of the day, that translates into a maximum downlink speed of around 6-8 Mbit/s and an uplink speed of around 2 Mbit/s. Extrapolate that a bit into the future, and band 20 without any additional spectrum becomes the new EDGE.
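To put those numbers into perspective, here is a quick back-of-the-envelope calculation of how long typical transfers take on such a link. This is a minimal sketch assuming sustained throughput at the speeds measured above; real-world numbers will be worse due to congestion and protocol overhead:

```python
# Rough transfer-time estimates for the link speeds measured above.
# Assumes sustained throughput, i.e. no congestion or protocol overhead.

DOWNLINK_MBIT_S = 7   # midpoint of the observed 6-8 Mbit/s
UPLINK_MBIT_S = 2     # observed uplink speed

def transfer_minutes(size_mbytes: float, rate_mbit_s: float) -> float:
    """Minutes needed to move size_mbytes at rate_mbit_s (1 byte = 8 bits)."""
    return (size_mbytes * 8) / rate_mbit_s / 60

for size in (100, 500, 2000):  # megabytes
    print(f"{size:5d} MB: down {transfer_minutes(size, DOWNLINK_MBIT_S):6.1f} min, "
          f"up {transfer_minutes(size, UPLINK_MBIT_S):6.1f} min")
```

A 2 GB transfer works out to roughly 40 minutes in the downlink and over two hours in the uplink, which explains the strategy in the next section.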
Keep Large Data In The Cloud
So how much of my daily work can I do with that kind of Internet connectivity? Uploading and downloading large amounts of data in the range of several gigabytes is pretty much out of the question. If I don’t need the data locally, I’d rather do that on my workstation at home, which I can access remotely over SSH.
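For example, instead of pulling a large file through the slow link, the download can be kicked off on the home workstation over SSH. A minimal sketch; the hostname, user, and URL are placeholders, not my actual setup:

```python
import subprocess

# Hypothetical hostname and URL; substitute your own.
REMOTE = "me@workstation.example.org"
URL = "https://example.org/large-dataset.tar.gz"

# Run the download on the remote workstation so the data never
# traverses the slow local link. 'wget -c' resumes if interrupted.
subprocess.run(["ssh", REMOTE, f"wget -c '{URL}' -P ~/downloads"], check=True)
```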
My Development System In The Cloud
For coding and deploying to production, I basically have two options: at home, my development environment runs in a virtual machine on a powerful workstation in the LAN, and I typically use X forwarding over SSH to bring it to the screen of my notebook. That does not work over the Internet, however, no matter how fast the connection is. Instead, I use VNC. Over fast Internet links, that works pretty well. Over such a slow link, screen updates are visibly slower but still good enough, so I prefer this way of working to running a copy of the virtual machine on the notebook I have with me. There are two reasons for this: first, because I can; second, because some of the things I do require up- and downloads of hundreds of megabytes of data, and that is where the slow fringe connectivity becomes a showstopper.
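One common way to set this up, sketched below, is to tunnel the VNC session through SSH so that it is encrypted and only a single TCP connection crosses the slow link. The hostname and display number are assumptions; they depend on how the VNC server at home is configured:

```python
import subprocess

# Hypothetical hostname; the VNC port depends on the display number
# the server uses (5900 + N for display :N).
REMOTE = "me@workstation.example.org"
VNC_PORT = 5901  # display :1

# Forward the VNC port through SSH ('-N' means no remote command,
# '-L' sets up the local port forward).
tunnel = subprocess.Popen(
    ["ssh", "-N", "-L", f"{VNC_PORT}:localhost:{VNC_PORT}", REMOTE]
)
try:
    # Connect the local viewer through the tunnel; on slow links,
    # lower quality/compression settings (viewer-specific) also help.
    subprocess.run(["vncviewer", f"localhost:{VNC_PORT}"], check=True)
finally:
    tunnel.terminate()
```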
Uplink Congestion Remedies
And one more problem needs to be tackled in a multi-user, multi-device environment: when a slow wireless connection is shared, someone or some device will send large amounts of data in the uplink every now and then, which significantly increases round-trip delay times due to bufferbloat. If I’m alone on my slow backhaul, that’s usually not a problem, as I can control when the uplink is saturated. However, if others use the same link, this quickly spirals out of control. In such a case, Wondershaper on a Raspberry Pi, which also acts as a Wifi access point for everybody and sits in front of whatever router or device provides the backhaul, will help.
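The trick is to cap the shaped rates a little below the measured link maximum so the modem’s buffers never fill up. A minimal sketch, assuming the classic three-argument Wondershaper syntax (rates in kbit/s; newer Wondershaper releases use -a/-d/-u flags instead) and an interface name of eth0:

```python
import subprocess

IFACE = "eth0"     # interface towards the backhaul device; adjust as needed
DOWN_KBIT = 6000   # a bit below the ~6-8 Mbit/s measured downlink
UP_KBIT = 1800     # ~90% of the 2 Mbit/s uplink, to keep buffers drained

# Classic Wondershaper invocation: wondershaper <iface> <down> <up>,
# rates in kbit/s. Needs root; syntax differs in newer releases.
subprocess.run(
    ["wondershaper", IFACE, str(DOWN_KBIT), str(UP_KBIT)],
    check=True,
)
```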
So with this setup, living at the edge of the observable Internet for a couple of days is quite possible.