I Can’t Wait For GSM To Be Switched Off… Optus The Latest To Announce Such A Move

The last time I can remember being glad I had EDGE coverage was back in 2010 in Thailand. But these days are definitely gone now: most 2G networks in non-3G or 4G areas are hopelessly overloaded with the data traffic of all the smartphones around, and web content has become so big that even uncongested 2G networks could hardly cope. After AT&T and Telenor, Optus is the latest big network operator to announce a GSM switch-off, in 2017 in their case. Macau Telecom is perhaps the first network operator to have switched off GSM already, but that's a fringe case.

I have to say, I find that more and more I can't wait for GSM to be switched off, as it is of little use anymore these days. And a nice benefit: those who switch it off, at least in theory, make sure 3G and 4G coverage is available in areas where previously GSM was the only available network technology. Really, I can't wait, switch it off! And in the meantime, I'm happy I can switch it off on my side.

P.S.: O.k., yes, I know, some network operators have a lot of 2G M2M modules still out there…

Old ‘Byte’ Magazine At Archive.Org To Experience The Past

Every now and then I read books about different aspects of computer history. Good books on the topic obviously rely heavily on original sources and interviews and distill a good narrative and summary out of them. Digging deeper into specific aspects requires getting hold of the original sources. Fortunately, quite a few of them are available online now, such as scanned originals of 'Byte' magazine, which covered 'microcomputer' topics in great depth and technical detail from 1975 to 1998.

One issue I recently took a closer look at was from August 1985, i.e. from exactly 30 years ago, as it contains a preview of the upcoming Amiga 1000. What I find very interesting about reading original sources is how new developments were perceived at the time and how they compared with existing technology. I had to smile, for example, when comparing the graphics and multitasking capabilities the Amiga introduced with Jerry Pournelle's ravings in the same issue of the magazine about a program that 'can have WordStar, DOS, and Lotus 1-2-3 running all at the same time' on his text-based Zenith ZP-150 IBM-PC clone, which would otherwise only have single-tasked.

Obviously that's just one of millions of stories to be discovered. For your own enjoyment, head over to Archive.org and begin your own journey through computer (and other) history 🙂

Evolutionary Steps of Mobile Phones And The Future

In the past 30 years, mobile phones, which are no longer mere 'phones' these days, have gone through an incredible evolution. Perhaps now is a good time to reflect on the changes that have happened over time and to ask whether commoditization will be the inevitable next step.

The Era of Voice and Text Messaging

From the mid-1980s, when mobile phones were expensive and hence a luxury of the few, right to the middle of the 2000s, when mobile phones became affordable and a mass market phenomenon, their functionality was mostly limited to voice calling and text messaging. At the beginning of this period it was actually only voice telephony; text messaging was added as a consumer service only much later, in 1992.

While voice and text messaging were the prevalent uses of mobile devices right up to the middle of the 2000s for most people, the major trend was to make devices smaller and smaller. This continued up to the point where the size was no longer limited by the electronics that had to fit into the device but by how small the keyboard and screen could be made.

Early smartphones, proprietary operating system

While the race for miniaturization was in full swing, companies started to innovate around how to put more than just voice telephony and text messaging into a mobile device. A first device to include many more functions was the Nokia 9000 Communicator, released in 1996. Advertised as a mobile office with calendar and address book, dial-up modem functionality for Internet access and web browsing, it was too expensive and bulky for most people. It took another 10 years before such devices were small and affordable enough and mobile networks capable of transporting data faster than just a few kilobits per second. The Nokia 6680 from back in 2005 was one of the first mass market smartphones that one could actually see in the hands of people. Most probably did not realize the potential of the device in their hands beyond the nice and, for the time, large and colorful screen and continued to use it like their previous device, mainly for voice telephony and texting.

But those who realized its potential and were willing to pay more for mobile Internet access used it for browsing the web, sending and receiving emails and creating content for websites and social media on the fly. One of my blog entries from back in 2006 shows the 6680 in action together with a foldable Bluetooth keyboard, used for blogging from what was then called the 3GSMWorldCongress.

Apart from the devices having become versatile tools to access the Internet while on the go, third parties could develop apps that could then be used on them. One of my favorite 3rd party applications of the time was OperaMini, a web browser that far surpassed the built-in web browser in functionality and ease of use.

While downloading and using apps on a mobile device has become commonplace today, the introduction of that capability was a major step in the development of mobile devices. Up to this point, mobile phone manufacturers saw their devices mostly as closed systems to which only they could add or modify software. That is quite a difference from the evolution of personal computing, where the computer was always seen as a means to an end, i.e. to run third party software, rather than as a closed system.

Smartphones become a PC in the pocket – Or Almost

While 3rd party apps on mobile devices, by that time no longer called mobile phones by many, became more popular, the operating systems of these devices remained proprietary and closely tailored around the limitations of the mobile hardware in terms of processing power and memory capacity. Around 2007, however, enough processing power and memory was available even in small devices for adapted desktop kernels and operating systems to make the jump to mobile devices. The first generation iPhone from 2007 and the HTC Dream from 2008 were the first mass market devices to combine adapted BSD and Linux kernels from the desktop world, and a lot of software initially developed for desktop operating systems, with newly developed user interfaces adapted to mobile use, smaller screens and touch input.

The re-use of desktop operating system software and the influx of the idea that mobile devices were no longer closed-world systems closely guarded by manufacturers, but first and foremost platforms to run third party applications, just like PCs, gave a tremendous push to the transformation of the mobile industry towards ever more versatile and open mobile computing platforms. Android especially has helped in this regard, as most parts of the system are open source and available for anyone to use and modify.

Commoditization, Openness and the Future

Let's come back to the question of if, when and how commoditization will happen in the mobile world, and have a look at the desktop world first. Today, desktop PCs and notebooks are pretty much commoditized, i.e. products of different manufacturers look and feel pretty much the same and run the same software. Apart from Apple, the single but major exception, all other desktop and notebook manufacturers such as HP, Lenovo, Dell, Asus, Samsung, Acer and many small brands and no-name manufacturers design and produce very similar hardware that is compatible with Microsoft's Windows operating system. The user interface looks identical across manufacturers, except perhaps for a company logo during the boot process, utility programs of often doubtful value and usability, and drivers for new hardware components to make them usable under the standard Windows operating system. As products are pretty similar, it is difficult for hardware manufacturers to differentiate themselves from the competition and hence profits are small. Only Apple has managed, despite also using Intel based PC hardware, to differentiate themselves from others with their own operating system, programs and services and to keep profits high.

A tremendous benefit of hardware commoditization is that people can install alternative operating systems based on Linux and BSD kernels on those systems. While the popularity of doing so is on the rise, it is far from a mass market phenomenon. Despite advantages such as the preservation of privacy and being much less prone to the security issues of mainstream desktop operating systems, most people do not want to spend the effort or feel incapable of installing a different operating system themselves. But the possibility is there for those who want it, and things have become much easier for those who dare.

On mobile devices, however, we are still quite far away from this state. While Apple applies the same business model to their mobile devices as to their desktops and notebooks (i.e. integrated hardware and software), most other mobile device manufacturers use Google's Android operating system as the basis for their devices. Most major manufacturers such as Samsung, Sony, LG and HTC, however, still heavily customize the user interface of the operating system and innovate on the hardware side. Still, it's possible to install different Android based operating systems on many of those devices, CyanogenMod being one of the well known alternatives to an Android OS customized by a manufacturer.

Things are unfortunately not as far along as in the PC world, where Linux distributions such as Debian, Ubuntu, OpenSuse, Redhat and many others recognize all hardware components during their startup procedure and configure themselves accordingly. In the mobile world, customized versions of CyanogenMod or other alternative Android distributions need to be created for each device. Also, many hardware manufacturers lock their boot loaders to prevent alternative operating systems from being installed. This either makes it impractical to install alternative OSes on such devices or requires special software tools to remove such locks. And to add insult to injury, it's difficult to impossible to provide timely updates for security issues if the complete OS needs to be rebuilt for thousands of different device types, as the Stagefright vulnerability has painfully shown to a larger audience this summer.

In other words, despite Android's openness and use of open source, we are still lightyears away from mobile devices being just commoditized hardware that runs the same operating system and the same user interface on top, with a few additional drivers to access special hardware, and that naturally allows users to install a different operating system if they choose to do so. But despite the still heavy customization, it seems margins are getting thinner and thinner. This gives hope to those who don't want their lives to be analyzed by their devices, and consequently by the manufacturers of those devices, and their personal data sucked out, stored and analyzed on servers at the other end of the world. So for me, commoditization that hopefully leads to more standardized devices, to easier installation of alternative operating systems and to 'one patch fits all devices on the same OS version' can't come soon enough.

14 Mbit/s Over UMTS – So Much For Network Congestion

I've been back in Austria this year for a bit and once again I made good use of my unlimited data over Drei Austria's UMTS network for 18 euros a month on prepaid. They do have an LTE radio access network these days as well, but unfortunately my prepaid subscription is limited to their UMTS network. As a consequence, I was a bit afraid that my experience would suffer from potential congestion due to the ever rising data traffic and investment shifting to LTE.

To my pleasant surprise, quite the contrary was the case. In all places with a high signal level that I tried, I could easily get sustained data rates of well over 12 Mbit/s over an OpenVPN tunnel. And that was not somewhere outside big cities with few people using the network; no, that was in Vienna, one of the most densely populated parts of the country.

Quite incredible for a network that offers its customers unlimited and non-throttled data for less money than most other network operators worldwide charge for a couple of hundred MB or a few GB at best. So much for the theory that unlimited data would bring a network to its knees…

20 MHz LTE Band 20 Network in Austria

In most countries in Europe I have visited recently, network operators are now making good use of LTE band 20 in the 800 MHz part of the spectrum to cover rural areas. Usually, three network operators have each been successful in getting a 10 MHz chunk of the 30 MHz channel. In Austria, it's different however, as I recently found out while having a closer look at the networks around me.

Instead of three network operators, only two have acquired spectrum in this band, T-Mobile Austria and A1 (Telekom Austria). The latter has quite a big chunk: 20 MHz! That's the first country I've seen in which a 20 MHz carrier is deployed in the 800 MHz band. A huge advantage for rural high speed Internet connectivity! For details see here.

While I traveled by train, however, all my devices mostly stayed on 3G, probably due to the heavily insulating windows. Also, none of my devices was able to acquire a GPS fix, even when held directly at a window. As 3G got in just fine, there might have been repeaters on the train, but not for 800 MHz. At least there was free Wi-Fi on board, which worked quite well on the cross country trip. But that's little consolation, as Wi-Fi tends to saturate over time. Also, everybody being able to snoop on my data traffic is not quite my cup of tea either. I hope there are plans to upgrade the repeaters and let the professionals outside handle the traffic.

First Network Operator Disables MMS Delivery Due to Stagefright

Several news portals report today (see here, here and here) that the first network operator has temporarily disabled MMS delivery because most Android mobiles in the wild today are vulnerable to the Stagefright vulnerability, which can be exploited by sending videos via MMS and other means. Instead, an SMS is sent to customers with a link to view the MMS contents in a browser. Quite a responsible step, and I wonder if or how many network operators will follow the example.

This announcement is quite something! I have never heard of a network operator disabling one of its own services because of a mobile device vulnerability. As the articles linked to above point out, it's no secret in the mobile industry that Whatsapp and other services are now much preferred over expensive MMS messaging, so perhaps little revenue is lost by this step.

In the announcement the carrier said that this is only a temporary solution until a fix has been found. That makes me wonder what that fix could look like and how long this temporary solution will remain in place!? It's unlikely that the threat will go away anytime soon, as it will take quite some time for devices to get patched since an Android OS patch is required. That means the update can't be delivered via Google's app store but needs to be incorporated by device manufacturers into their builds for each device. Good luck with that. Also, I guess there will be many devices that will never get updated, as device manufacturers have already lost interest in providing OS updates for devices that are somewhat older.

Another solution I can imagine would be to put a "virus scanner" in place on the MMS server in the network to filter out malicious videos. But that will cost time and money, not only initially but also to keep the signatures up to date. That makes me wonder if the service still makes enough money to justify such a measure!? On this account I wouldn't be surprised if Facebook, Google and others are already scrambling to put scanners in place to make sure videos that users put on their services do not contain malicious content.

No matter how I look at it, I can't help but feel that we've just reached a tipping point when it comes to mobile security. Google and device manufacturers need to do something radical and drastic NOW to make sure that future Android devices can be patched in a timely manner (i.e. in a matter of hours just like what is possible for desktop operating systems) rather than having to wait for device manufacturers to come up with new builds or, even worse, not being able to patch Android devices at all anymore due to lack of manufacturer support.

Why Don’t We Still Have A SIMPLE Smartphone Remote Help Function?

One thing I've been hoping for, for years now, is for someone to come up with an easy-to-install remote smartphone help function to see and control the screen remotely. On the desktop, VNC has existed in various flavors almost since the beginning of desktop GUIs. On mobile operating systems, even on the open source Android, there is no good and simple solution. Yes, there are some solutions that have been started and abandoned again, like the droid VNC server project, but nothing that just works out of the box and across different devices and Android versions. I have to admit I'm a bit frustrated, because I could use a remote help function at least once a week.

A couple of years ago I was extremely thankful to Google for bringing Wi-Fi tethering to the masses when it was still perceived as something "terrible" by most network operators. But for a remote screen implementation I personally can't count on Google, because I'm sure that should they come up with such a thing they would put one of their servers between the smartphone and the helping hand. No thanks, I need something more direct, private and secure. But Android is open, and I would even accept something that requires a rooted phone, so I can't help but wonder why nobody has come up with a good, simple and interoperable solution so far!?

P.S. Cyanogen, hello!?

How To Simulate an IP Address Change On A NAT WAN Interface

The vast majority of Internet traffic still runs over IPv4, and when I look at all the different kinds of connectivity I use for connecting my PC and smartphone to the Internet, there isn't a single scenario in which I have a public IPv4 address for those devices. Instead, I am behind at least one Network Address Translation (NAT) gateway that translates a private IPv4 address into another IPv4 address, either a public one or another private one in case several NATs are cascaded. While that usually works well, every now and then one of those NATs changes the IP address on its WAN interface, which creates trouble in some use cases.

Applications that use the TCP transport protocol quickly notice this, as their TCP link gets broken by the process. Higher layers are notified that the TCP link is broken and apps can reestablish communication with their counterpart. Apps using UDP as a transport protocol, however, have a somewhat harder time. UDP keep-alive packets sent in one direction to keep NAT translations in place are not enough, as they just end up in nirvana. Bi-directional UDP keep-alive packets also end up in nirvana, without the application on top ever being notified about it. The only chance such apps have is to implement a periodic keep-alive 'ping' and a timeout after which the connection is to be considered broken.
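The timeout logic such an app needs can be sketched in a few lines of shell. This is only an illustration: the probe command passed as the first argument is a hypothetical placeholder for whatever bidirectional keep-alive exchange the application implements, and must succeed only if the peer actually answered.

```shell
# Sketch of an application-level keep-alive with timeout. "$1" stands in
# for the application's bidirectional probe (e.g. a small UDP echo
# request/response tool -- a placeholder, not a real utility).
keepalive() {
  probe_cmd="$1"   # command that sends a probe and waits for the answer
  retries="$2"     # number of missed probes before giving up
  i=0
  while [ "$i" -lt "$retries" ]; do
    if $probe_cmd; then
      return 0     # peer answered: connection still alive
    fi
    i=$((i + 1))
  done
  return 1         # no answer at all: consider the connection broken
}
```

Called periodically, something like `keepalive my_udp_probe 3` would declare the path dead after three missed probes, at which point the app can re-register with its server, re-creating the NAT bindings along the way.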

Recently I had to simulate such behavior and wondered how to best do that. Again a Raspberry Pi acting as a Wi-Fi access point, NAT and Ethernet backhaul served as a great tool. Three shell commands are enough to simulate an IP-address change on the WAN-interface:

Do a 'sudo nano /etc/dhcp/dhclient.conf' and insert the following line:

    send dhcp-requested-address 10.10.6.92;

The IP address needs to be replaced with an address in the subnet of the WAN interface, but different from the one currently used. Obviously, that IP address must not be used by any other device on the WAN subnet, which can be checked by pinging the IP address before changing the configuration file.

Next, release the current lease and stop the dhcp client on the WAN interface with

    dhclient -r -v

The interface itself remains up, so Wireshark does not stop an ongoing trace, but IP connectivity is removed. If the scenario to be simulated requires a certain time before a new IP address is available, just pause before entering the next command.

A new IP address can then be requested with

    dhclient -v eth0

The result returned by the command shows whether the requested IP address has been granted, or whether the DHCP server has assigned the previous IP address again or perhaps even an entirely different one. If the DHCP server has assigned the old IP address, changing the MAC address after disabling the dhcp client will probably help.
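For convenience, the steps above can be wrapped into a small script. This is only a sketch: the interface name, the candidate address and the 30-second pause are assumptions to adapt, and the `run` wrapper only prints the privileged commands unless `DRY_RUN=0` is set and the script is executed as root.

```shell
# Sketch: simulate a WAN IP address change on a Linux NAT box.
IFACE="eth0"         # WAN interface (assumption -- adapt to your setup)
NEW_IP="10.10.6.92"  # unused address in the WAN subnet (assumption)

# Safety wrapper: print privileged commands instead of executing them
# unless DRY_RUN=0 is set.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

# Line to append to /etc/dhcp/dhclient.conf
request_line() {
  printf 'send dhcp-requested-address %s;\n' "$1"
}

# 1. Check that the candidate address is unused (a reply means it's taken)
run ping -c 3 -W 2 "$NEW_IP"

# 2. Append the request line to the dhclient configuration
request_line "$NEW_IP" | run tee -a /etc/dhcp/dhclient.conf

# 3. Release the lease, pause to simulate the outage, request a new lease
run dhclient -r -v "$IFACE"
run sleep 30
run dhclient -v "$IFACE"
```

With the default `DRY_RUN=1` the script just prints what it would do, which makes it safe to inspect before running the real thing as root.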

And that's all there is to it, except for one 'little' thing: if the Raspberry Pi or other Linux device you perform these actions on is yet again behind another NAT, the server on the Internet will not see an IP address change but just a different TCP or UDP port for an incoming packet. So while in some scenarios the connection will still break, there are potentially scenarios in which a connection can survive. One example is an ongoing conversation over UDP: if the NATs assign the same UDP port translations on all interfaces again, which can but does not necessarily have to happen, and the first UDP packet is sent from the NATed network, the connection might just survive. I can imagine some TCP survival scenarios as well, so don't assume, but check carefully that the exercise produces the expected changes in IP address and port numbers for your network setup.

Have fun!

Who Is Interested In ‘Mobile’ and ‘Desktop’ Convergence Like I Want It?

For the last couple of years a number of companies have been trying to find a way to converge the operating systems and user interfaces of mobile and desktop devices. Perhaps time is getting a bit scarce now, as smartphone processors come pretty close to the computational power, memory size and graphics capabilities of full grown desktop PCs. Sure, their screens and batteries are smaller, but at some point it will be trivial to interconnect them with a bigger screen, keyboard and mouse and ask them to be the 'desktop'. Perhaps we reach this point with tablets first? But what kind of operating system will run on it?

With almost the screen size of a small notebook, the only things that are missing in a tablet product we use today are a hardware keyboard and a mouse. Apple is getting closer and closer to this point with the latest Macbook 2015. It is pretty much a tablet with a keyboard attached, due to its thinness and use of only a single USB 3.1 connector. Unlike a tablet, however, it runs a full OS X. But the keyboard is attached to the screen and the graphical user interface is still geared towards keyboard and touchpad.

Microsoft is also on its way with the Surface line of notebook / tablet hybrids, even though commercial success is nowhere to be seen yet. Their Surface notebooks / tablets are now also running a full Windows operating system on a tablet sized device with a removable keyboard and an x86 processor, so that is perhaps even closer to a converged device than the Macbook described above. I don't like the Windows 8 style graphical user interface, and closed source is not my cup of tea either, but they are definitely innovating in this space.

The third player in the desktop/mobile space is Google with Android and Chromebooks. While I like the fact that Chrome OS runs on Linux, the idea that everything is in the Google cloud makes their vision of a combined mobile/desktop future not very appealing to me. I can imagine my data to be stored on my own cloud server at home but I'm not yet willing to give up the huge advantages of on-device data and application storage when it comes to speed, security and being able to get work done in places where Internet connectivity is not present or too slow.

So perhaps it's time now to get hold of a 'Surface' and install Ubuntu on it to see how usable the Unity graphical user interface, or perhaps KDE or something else, is on a tablet once keyboard and mouse are removed!?

Windows 10 And Wi-Fi Sense – Consider Your Network Compromised?

<sarcasm on> I'm sure lots of people are looking forward these days to upgrading to Windows 10 <sarcasm off>. Personally, the only reason why I'm even writing about it is the new Wi-Fi Sense feature that has jumped over from Windows Phone to the desktop to freely share Wi-Fi passwords with your contacts in Outlook, Hotmail, Skype and Facebook. Great for convenience, bad for security, bad for me.

Bad for me because in the past I let friends access my Wi-Fi network at home. I'm sure I haven't changed my Wi-Fi password since, because it's a pain to change it on all of my devices that use the network, especially printers and other devices that don't offer an easy way to change the password. But I guess I have to do it now, because friends with whom I've shared my password in the past and who upgrade their Windows PCs can now freely share it with their friends, i.e. people I don't even know, with a simple tick of a check box. Obviously, Microsoft gets to see and store it, too.

Fortunately it's not as bad as it sounds at first as users still have to tick a check box per Wi-Fi network they want to share credentials for. Ars Technica has a detailed description of how the feature works if you are interested in the details. Still, I think it's time for a password change at home.

Yes, I know, giving out my Wi-Fi password was never secure to begin with. This is why I have had a guest Wi-Fi SSID and password for quite some time now that I can quickly switch on and off as required. Another benefit of this solution is that I can change the SSID and/or the password every now and then so things don't get out of hand. This way, even if friends decide to or accidentally share my guest Wi-Fi credentials with Microsoft and the world, it's of little use to anyone, as the guest Wi-Fi SSID is automatically shut down after a pre-configured time once the last user has left.

And that, by the way, also limits the damage done by those automatic 'backups' of private data to Google servers that Android based devices perform every now and then.