How To Configure OpenVPN So I Can Return to LTE

On my commute to work I make good use of the excellent network coverage along the railway track. LTE coverage is almost perfect, but only almost, as there are a few locations where my data session is handed over to the 3G network. Once on 3G, the only way back to LTE, at least for now, is for the network to set the device into Cell-PCH or Idle state so it can search for LTE and return autonomously. That unfortunately doesn't happen in my case, as my OpenVPN server sends a UDP keep-alive packet every 10 seconds, preventing the smartphone I use for tethering from returning to LTE. It's not that big of a deal, as 3G is quite fast as well, so I hardly notice the difference. But I'm a perfectionist… So I had a closer look at the OpenVPN server configuration (in /etc/openvpn/server.conf) and noticed an option for keepalive timers:

keepalive 10 120

The "10" suspiciously looked like the 10 seconds interval that keeps my 3G connection in Cell-DCH state. After changing the line to

keepalive 30 120

the UDP keepalive packets are now spaced 30 seconds apart. That's more than enough time for the network to set my device to Cell-PCH or Idle state, which in my case happens after around 12 seconds of inactivity. Shortly afterwards, my tethering smartphone changes back to LTE.

Perfect! And on top of all this I might even save some battery power as fewer packets are sent and received now.
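For reference, here's how the relevant part of my /etc/openvpn/server.conf looks now. This is a minimal sketch, the rest of my configuration is just the usual defaults:

# Send a keep-alive ping every 30 seconds if no other traffic
# is flowing and restart the connection if nothing is received
# for 120 seconds (in server mode OpenVPN doubles this timeout
# on its own side so that clients detect a dead tunnel first).
keepalive 30 120

So the first parameter only controls the ping interval while the second controls how quickly a dead connection is detected, which is why the interval could be tripled without making failure detection any slower.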


What Has Changed In Mobile Computing Since 2009?

In a previous post I wrote about what has changed in desktop computing over the last 6 years. In summary: not very much. I still use my notebook from back then, with an up-to-date operating system, for multimedia consumption. So what about mobile computing and mobile devices, how have things evolved in this domain in the same time frame?

Back in 2008 I wrote a review of how well an entry-level phone of the time, the Nokia 5000, could be used for email and web browsing. Back then, the point was to show that even with an entry-level device it had become possible to surf the web and use the device for email communication. It was a sensation. So let's have a look at how the 7-year-old Nokia 5000 compares to a similar device that can be bought today.

Price

For my comparison I chose the Android-based LG D160, released in 2014 and currently available for around 56 euros, contract free, VAT included. That is only around 60% of the 90 euros I paid for the Nokia 5000 at the time. I could have compared it to a device that also costs 90 euros today, but I wanted to compare two entry-level devices, and the cost of such a device has come down significantly over the years.

Connectivity

At the time, being able to browse the web with an entry-level device was spectacular; today it's a given, nobody would think otherwise anymore. Back then I used Opera Mini with a compression server in the cloud to reduce the size and complexity of web pages. This was necessary on the one hand because the Nokia 5000 only came with a 2G EDGE network interface that could at best transport around 250 kbit/s. 3G networks did exist at the time and already covered bigger cities, but entry-level devices were still limited to 2G networks. Compression was also necessary because the processing power and memory of the Nokia 5000 were quite low compared to today's devices. The LG D160 of 2014, on the other hand, comes equipped with a 3G HSPA network interface with data transfer speeds of up to 14.4 Mbit/s. LTE networks are available nationwide today, but it's the same story as with 3G for the Nokia 5000 then: LTE hasn't moved down into the entry-level category yet. What is included today, but was considered a high-end feature at the time, is Wi-Fi, so the device can be used at home without a cellular network. Also, the device supports tethering, so it can be used as a gateway to the Internet for a notebook or tablet on the move.

Screen and Web Browsing

The image on the left shows the Nokia 5000 and the LG D160 side by side, next to a Samsung Galaxy S4, a flagship device back in 2013. While the Nokia 5000 back in 2008 came with a 320×240 pixel screen capable of 65k colors, the LG D160 now has a 320×480 pixel screen with 16 million colors. By today's standards that is a very low resolution, but compared to 2008 it is still twice the number of pixels. Opera is still my browser of choice, but I have moved on from Opera Mini to Opera, a full web browser that no longer requires a compression server on the backend, as the device has enough RAM and processing power to show mobile-optimized and even full web pages without any magic applied in between. At the time it took around 12 seconds to launch the browser and there was no multitasking. Still acceptable then, but today the browser launches in 4 seconds and even stays in memory if no other big apps are running, despite the 512 MB of RAM, which is a massive amount compared to 2009 but rather little today. GSMArena doesn't even specify how much RAM was built into the Nokia 5000, but the 12 MB of flash memory for file storage, compared to the 4 GB in the D160 today, is a pretty good indication of what it must have been. Another aspect I focused on at the time was how fast and smooth scrolling was, and I noted that it was slower and not as smooth as on the flagship Nokia N95 of the time. Still usable was the verdict. Today, scrolling normal web pages via the touchscreen is quite smooth on the D160 and light-years away from what was possible on entry-level devices in 2008/9.

Email

At the time, the email client in the Nokia 5000 was quite rudimentary, with important options such as partial downloads missing. Also, there were few if any email apps for non-smartphone devices at the time to improve the situation. Today, even the 40% cheaper D160 easily runs sophisticated email clients such as K-9 Mail that, apart from a proper PGP implementation, leave little to be desired.

Camera, Navigation and Apps

When it comes to built-in cameras, the Nokia 5000 from back in 2009 has a 1 MP camera at the back while today's D160 has a 3 MP camera built in. Both take pictures, but both would be rated pretty much worthless by the standards of their respective periods. Still, the camera is significantly better at a much reduced price compared to 2009. One big advantage of today's entry-level smartphones compared to 2009 is the built-in GPS chip, useful for anything from finding the closest Italian restaurant to car navigation. I didn't install OsmAnd on the D160, but Google Maps pinpointed my location in seconds and presented me with a route to a destination almost instantly. An incredible improvement over the state of the art in 2009 in this price category. I mention the price tag on purpose, as Nokia Maps with car navigation existed in 2008/9 (see here and here) but could only be used on much more expensive Symbian OS based devices. And a final point to make in this review is the availability of apps now and then. Few apps and games existed for entry-level devices back then. Today, even the very low cost D160 can handle most Android apps and many if not most games (I'm no expert when it comes to gaming). Also, SMS messaging is quickly disappearing, with most people not caring about privacy and using Internet-based multimedia replacement solutions such as WhatsApp instead.

Summary

So while I still use the notebook I bought back in 2009 with the latest operating system version on the market today, the entry-level phone from back then is outdated by today's entry-level state of the art to a degree I find quite shocking. Incredible how much things have advanced in mobile in this short amount of time.

What Has Changed In Desktop Computing Since 2009?

When I recently checked out a "very low end" smartphone of 2015, I couldn't help noticing how vastly different and improved things are compared to smartphones sold a couple of years ago. I'll write a follow-up article about this, but I think the scene should be set first with a comparison: what has happened in desktop/laptop computing since 2009?

I chose 2009 for this post as this was the year I bought a 17" laptop, mainly for stationary use, to replace an aging tower PC. Since my usage has become more mobile since then, I have in the meantime replaced this laptop with a smaller device for everyday use. Nevertheless, I still use that laptop today, 6 years later (!), for streaming Netflix, YouTube and other things. So while I still use this 6-year-old computer, any phone from that era has long gone to digital oblivion.

So is that 6-year-old laptop old and outdated? I guess that depends on how you look at it. At the time I bought the laptop for 555 euros with an Intel Core 2 Duo processor, 4 GB of RAM, a 256 GB hard disk, USB 2, a 17" display and Windows Vista. Even if I hadn't upgraded the machine, Windows Vista pretty much looks like Windows 7, which is still widely used today. I could even upgrade the machine to Windows 8, or to Windows 10, which will ship in a few weeks from now, and it would still run well on a 4 GB machine. As a matter of fact, many low-end laptops sold today still come equipped with 4 GB of RAM. Hard disk sizes have increased a bit since then, USB 3 ports are now standard, CPUs are perhaps twice as powerful now (see here and here) and the graphics capabilities required for gaming are more advanced. But for my (non-gaming) purposes I don't feel a lot of difference.

As I switched to Linux in the meantime, my software evolution path was different. Windows was banished from the machine at some point and replaced by Ubuntu Linux. Ubuntu's graphical user interface looked different in 2009; a lot of eye candy has been added since then. Today I run Ubuntu 15.04 on the machine, and I upgraded to a 256 GB SSD, which in effect makes it feel no different from my almost up-to-date notebook. It also still behaves pretty much the same when it comes to program startup and reaction times. The major difference is that the fan is louder than on my current laptop, due to the higher power requirements of laptops of the 2009 time frame compared to today's machines.

So what has changed since 2009 in the laptop world? Prices have certainly come down a bit since then, and many people these days buy laptops in the €300 to €400 range (taxes included). Technical specs have improved a bit, but the look and feel is pretty much the same. Companies have started experimenting with touch screens and removable displays to create a more "tablet-like" experience, trying to import some of the fascinating advances that have happened elsewhere since. But that's still a niche at best. In other words, hardware and software evolution on the desktop has very much slowed down compared to the 1990s, which were the second half of the home computer era and the decade of the rise of the PC and Windows. Things already slowed down in the 2000s, but that decade still saw the rise of easier-to-use Windows versions and laptop prices coming down significantly.

Now try to remember what kind of mobile phone or smartphone you had in 2009, compare it to what you have today, and you'll see a remarkable contrast to the story above. More about that in a follow-up post.

LTE-only when 3G gets crowded…

While 3G networks are still doing pretty well in most parts of Europe thanks to HSPA+, 64QAM, dual-carrier, etc., I was recently at an airport where the 3G cell covering my location seemed to have severe uplink congestion problems. Ping times were normal while only little data was transmitted in the uplink direction, but immediately skyrocketed to several seconds whenever the uplink was somewhat more loaded with screen sharing and a VoIP call. A bit of a letdown.

But then I remembered that the phone I used for tethering had been on LTE just a couple of minutes before and must have been redirected to 3G due to a low signal level. So I decided to lock the phone to LTE-only with an app I discovered recently. Who needs circuit switched mobile telephony anyway…!? Despite the signal level being really, really low (a single signal bar was just barely shown every now and then), both uplink and downlink were much faster than what I could get over the 3G cell that was very strong at my location. Signal strength isn't everything.

Generally, I think the thresholds network operators set for moving between network technologies are a good thing to rely on. In some cases such as this one, however, I'm glad I can make the choice myself.

VoLTE Roaming – From RAVEL to REVOLVER

Many network operators these days are trying to get their VoLTE systems off the ground in their home countries, so perhaps by 2016 we'll finally see a significant number of networks using the system beyond the few that are silently up and running today. While VoLTE at the beginning will only work in the subscriber's home country, many network operators are now thinking about implementing the next step, which is to also offer VoLTE when the user roams abroad instead of using CS fallback to 2G or 3G networks for voice calls. Problem is, it adds quite some complexity to an already very complex system.

The solution favored by many so far is to have VoLTE work abroad in pretty much the same way as circuit switched calls. The concept is referred to as RAVEL (Roaming Architecture for Voice over IMS with Local Breakout) or LBO (Local Breakout), and its core idea is to use part of the IMS infrastructure in the visited network (i.e. the P-CSCF), which then communicates with the S-CSCF in the home network. Further, calls can be routed directly to another subscriber instead of going back to the home network first. Docomo wrote a good article with further details that can be found here. One of the advantages of the approach is that the P-CSCF has interfaces to the visited core and radio network and can thus establish a dedicated bearer for the speech path and hand over the call into a circuit switched channel when the subscriber loses LTE coverage. The downside of the approach is that the interaction of the P-CSCF with the IMS in the home network is not a trivial matter.

As a result, network operators have started thinking about a simpler solution in the GSMA REVOLVER group, which has resulted in a 3GPP study item referred to as S8HR (S8 Home Routing). S8 is the packet switched interface for LTE between a home network and a visited network. The 'Home Routing' part of the abbreviation already indicates that this solution is based on routing all IMS related traffic back to the home network without any involvement whatsoever of IMS network components in the visited network, thereby drastically reducing VoLTE roaming complexity. In fact, apart from the MME having to set a parameter in the Attach Accept message, the visited network is not aware of the UE's VoLTE capabilities and actions at all; everything is sent transparently to the P-CSCF in the home network via the home network's PGW. In other words, IMS signaling and voice traffic take the same path as other LTE data from roaming subscribers today. Another interesting thought: VoLTE roaming via S8HR would be like an OTT (over-the-top) service…

Needless to say, the reduced complexity results in a number of disadvantages compared to local breakout. Another Docomo paper, an article by Telecom Italia and a recent post over at the 3G4G blog give a good introduction. One major issue is how to handle emergency calls by roaming subscribers. The challenge with emergency calls is for the network to direct the call to a local emergency responder (e.g. the local police station). As S8HR does all things related to IMS in the home network, there is no way to do that. As emergency calling is a regulatory requirement and unarguably an important feature, it needs to be dealt with. The simplest solution is to instruct the mobile to make a CS-fallback call in case of an emergency. A more complex solution is to use the IMS in the visited network for emergency calling. But I wonder if the additional complexity is worth the more elegant solution. After all, 2G or 3G network overlays will be present in most parts of the world for a very long time to come, so why bother? Or if one bothers, perhaps bother later?

The second, equally problematic drawback is that calls in the visited network can't be handed over to a circuit switched channel (SR-VCC) when the subscriber runs out of LTE coverage. Again, the IMS in the home network has no way to communicate with network components in the visited network. 3GPP is investigating solutions, but if they come up with something, it's not likely to be simple. Perhaps S8HR with SR-VCC support would be no less complicated than RAVEL? It remains to be seen.

The big question is whether not supporting SR-VCC is a showstopper for S8HR. After all, the OTT competition (Skype, etc.) can't do it either. But I suspect it's going to be a showstopper for many network operators, as this is a clear disadvantage compared to traditional circuit switched voice roaming. On the other hand, mobile devices could offer an option for the user to disable VoLTE roaming if they are really bothered by it. I suspect most people won't be, as SR-VCC mainly plays a role in high mobility scenarios, e.g. in moving cars and trains. One could even think about putting logic into mobile devices to detect roaming and high mobility scenarios and then prefer CS calls over VoLTE if S8HR is used. That would push the issue from the network to the mobile side, but still, perhaps it is worth a relatively minor effort on the mobile side instead of going to great lengths to implement it on the network side. And again, after all, the competition can't do SR-VCC in the first place…

Tomi Ahonen: What if Microsoft sold Nokia back to Nokia?

Until September 2010 I must have been one of the most outspoken and enthusiastic Nokia fans around. The future was great, the future was bright, Nokia was embracing open source and had promised to migrate from the somewhat aging Symbian OS to the open source MeeGo operating system for its upcoming devices. The day Nokia announced that an 'ex'-Microsoft manager was to become the new CEO of Nokia was more than just a shock for me. How could an 'ex'-Microsoft manager possibly continue with open source!?

The day of the 'burning platform' memo marked the not quite unexpected but still abrupt end of my Nokia fandom. MeeGo and open source were to be abandoned and replaced by the closed source Windows Phone from Microsoft; you can't imagine a more drastic 180-degree turn in a company's strategy, or a bigger departure from what I would have liked Nokia to do. Five years later, Nokia's phone business is no more, bought and destroyed by Microsoft and now completely written off from their books due to a complete failure to make Windows Phone a success.

While I don't mind that Windows is not getting a foothold in mobile, it's a shame Nokia and its great ideas withered away. While most of the tech press has already written off Nokia the smartphone company, my favorite mobile analyst Tomi Ahonen has three great scenarios for Nokia the smartphone company coming back, not as a Microsoft subsidiary but as part of Nokia the network infrastructure company. Microsoft wants to get rid of what's left of Nokia mobile while the original and still existing Nokia wants to make a comeback in mobile. Based on lots of insight and historical knowledge, it's a brilliant piece of analysis of what could now happen. I just wish those in charge would listen and show some good sense, at least this time…

“Commodore: The Amiga Years” book on Kickstarter – Only 3 Days Left

A couple of weeks ago I finished reading "Commodore – A Company On the Edge", a book by Brian Bagnall about the early years of Commodore in the computer business up to 1984 and the success of the C64. I could hardly resist, as the C64 was the first computer I owned. At the end of the book I wished the sequel about the later years of Commodore and the Amiga, the second computer I owned, had already been published. Now it seems that my wish could be fulfilled sooner rather than later, as Brian has a campaign over at Kickstarter for his upcoming project.

Originally set up with a funding goal of 15,000 Canadian dollars, the project has attracted quite some attention, and the pledged sum, as of this writing and including my own backing, stands at 73,800 Canadian dollars! A respectable sum, and I'm really happy to see that even books for such an arguably niche audience can still get great funding this way, making it worthwhile for the author to work on such a monumental project.

There are less than 3 days left to participate in the Kickstarter campaign as I write this post, so if you are a computing history buff you'd better hurry!

via The Digital Antiquarian

Skype is 20x Cheaper During Intercontinental Roaming For Me Compared to Traditional Voice Calls

Every now and then I travel the world and stay in places where I'm charged 2.50 euros a minute for the privilege of making traditional mobile voice calls to someone back home. Needless to say, long calls at €150 per hour have to be avoided if at all possible. These days, however, Skype or other VoIP clients that run on smartphones come to my rescue, in combination with a roaming data package of 150 MB for 15 euros.

So what's the price difference compared to those 2.50 euros a minute circuit switched mobile calls? When not using video, Skype uses around 20 kbyte of data per second, which is around 72 MB an hour. In other words, that 150 MB data roaming package for 15 euros buys me around 2 hours' worth of Skype calls, i.e. a Skype call to another Skype user costs me around €7.50 per hour when roaming. When used to call a fixed line phone, add a euro or two. In other words, despite using an expensive data roaming package, that Skype call costs 20x less, and I'm sure my home network operator still has a good margin on the roaming data. Hour-long mobile calls 10,000 miles away from home have just become sweet again.
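For those who want to check the math, here's the back-of-the-envelope calculation as a small Python sketch. The 20 kbyte/s Skype data rate and the prices are the assumptions from above, not measured values:

# Rough cost comparison: Skype over a roaming data package
# versus a circuit switched roaming call (figures as given above).
skype_rate_kbyte_per_s = 20   # approximate Skype voice data rate
package_mb = 150              # roaming data package size
package_price_eur = 15.0      # roaming data package price
cs_price_eur_per_min = 2.50   # circuit switched roaming tariff

mb_per_hour = skype_rate_kbyte_per_s * 3600 / 1000  # ~72 MB per hour
hours_of_calls = package_mb / mb_per_hour           # ~2 hours
skype_eur_per_hour = package_price_eur / hours_of_calls
cs_eur_per_hour = cs_price_eur_per_min * 60         # 150 euros per hour

print(f"Skype: {skype_eur_per_hour:.2f} euros per hour")          # ~7.20
print(f"Circuit switched: {cs_eur_per_hour:.0f} euros per hour")  # 150
print(f"Factor: {cs_eur_per_hour / skype_eur_per_hour:.0f}x")     # ~21x

With the exact numbers the factor even comes out slightly above 20, so the 20x in the title is, if anything, a bit conservative.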

And a nice bonus: when calling other Skype users, voice quality is way beyond what's possible with circuit switched mobile calls between networks and continents. No WB-AMR anywhere… The only downside: should your connection drop down to 2G during the call, that usually means a rather abrupt end of the conversation. So in mobility scenarios, circuit switched mobile calls still have an advantage, at least until that 98% coverage requirement is reached.

Amiga 30th Birthday Celebrations!

This year, both the Atari ST and the Commodore Amiga celebrate their 30th birthday. To me they represent the height of the home computer evolution that paved the way into computing for many adolescents at the time. Many, like me, had a C64 or a similar computer before an Atari or an Amiga, and most migrated to PCs at the beginning of the 1990s once Microsoft got their act together and shipped Windows 3.1 in 1992. That, however, in no way diminishes their importance at this point in time.

While reading a book about the Amiga recently and doing some background research, I stumbled over the web page for the Amiga 30th anniversary event in Neuss, Germany, which will take place on the 10th of October. That's not too far away from where I live so I will most likely attend and relive history for an afternoon. Should be great fun!

For those of you living in the US, there's a 30th anniversary event on the 25th and 26th of July at the Computer History Museum in Mountain View, California. The UK will celebrate on the 2nd of August in Peterborough and the event in Amsterdam has already taken place at the end of June.

Cruise Ship and Remote Island Internet Access

Some people would probably still say today that they don't need or want Internet access when going on vacation on a cruise ship or to a remote island. But I suppose their number is declining steeply, and cruise ship operators are investing in Wi-Fi Internet access on their ships, not only in special areas but in cabins as well. According to this article (in German), the pleasure costs between 25 euros per week for "social media" access and 99 euros for 3 GB of data per week on a ship of one of the major cruise lines.

The article doesn't mention what kind of backhaul is used, but it's likely to be satellite. There are different kinds of technologies, and (one of?) the latest and greatest seems to be from O3b, a company about which I wrote a post in 2008. It looks like in the meantime their medium earth orbit satellites (at an altitude of 8,062 km) are up and running, and their public list of customers includes a cruise ship operator (though not the one mentioned in the article linked to above) and remote islands. The specs advertised on their web page are a top speed of 1.6 Gbit/s per transponder and round trip times of around 150 ms. Each satellite has many independent transponders that can direct their beams to a specific area, which hints at the capacity and user experience that can be achieved even if several hundred people on a ship need access simultaneously. Here's a video that demonstrates how the system works with two antennas that track the satellites.
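Out of curiosity, here's a quick sanity check of that round trip time in Python. This is a sketch that only accounts for speed-of-light propagation straight up and down (terminal to satellite to ground station and back), ignoring the actual slant range and all processing delays:

# Propagation delay sanity check: O3b's medium earth orbit
# compared to a classic geostationary satellite.
speed_of_light_km_s = 299_792

def round_trip_ms(altitude_km):
    # Four traversals of the orbit altitude: up and down for
    # the request, up and down again for the reply.
    return 4 * altitude_km / speed_of_light_km_s * 1000

print(f"O3b MEO (8,062 km): {round_trip_ms(8_062):.0f} ms")   # ~108 ms
print(f"GEO (35,786 km):    {round_trip_ms(35_786):.0f} ms")  # ~477 ms

Add some ground network routing and processing on top of the ~108 ms and the advertised 150 ms looks entirely plausible. The comparison also shows why a medium earth orbit makes interactive use so much more pleasant than a geostationary link.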

And a final thought: I wonder if the uplink/downlink ratio on a cruise ship, with lots of people posting their pictures and videos to social media websites, is significantly different from the "land" average!? So apart from pleasing customers, a cruise line probably can't get any better advertising than people posting their pictures to Facebook in real time…