Updating Owncloud Has Become Really Scary

For the third time in a row I ran into major difficulties when upgrading Owncloud. I really love this project and I can't overstate its importance to me, but in its current state it won't be attractive to a lot of non-technical people, for whom a robust upgrade process is essential because they can't fix things when they break through no fault of their own. So instead of just helping to fix the individual issues as I've done in the past, I've decided to describe my latest upgrade experience to an open audience, in the hope that some people working on Owncloud realize the state the project is in and finally take some real countermeasures.

I run a pretty small and standard Owncloud instance on a stock Ubuntu 14.04 LTS server installation on an Intel platform. Nothing fancy, just a few users and only a minimal set of apps installed, such as calendar and contacts. Also, I have some of the built-in apps enabled, such as the PDF viewer, the gallery and the 'inside document' search. That's it. So I would expect that when my Owncloud instance informs me with a nice yellow banner at the top that an update is available and that I should just press the update button, everything works smoothly. Boy, was I in for a surprise.

The upgrade from Owncloud 8.0.4 to 8.1.1, as suggested by the updater, failed with an error message. A look at the issue list on Github revealed a mile-long thread of other people who had run into the same message. Eventually I figured out that I just needed to reload the update page so the update would run through again, and this time it succeeded…

Next, all external apps such as calendar and contacts were disabled and had to be re-enabled manually. A 'nice' feature that Owncloud picked up along the way after version 6.0 and that nobody understands from a user point of view. From what I've seen, quite a number of people have voiced their concerns over this and it's at the top of the wish list of things to change in an upcoming version.

Unfortunately the calendar and contacts apps were no longer in the list of apps that can be activated. What!? After a lot of research I found out that this is related to an error message in the admin screen claiming that Owncloud doesn't have access to the Internet. How so, I wondered, Internet access was working fine. Again, I had a look at the Owncloud core issue list on Github and found a mile-long thread of other people who had the same issue. Somewhere in the middle of the thread I read that putting an SSL certificate bundle, which one can download from the web, into the Owncloud config directory fixes the issue. Unbelievable but true: it fixed this issue, no more complaints about missing Internet connectivity, and the calendar and contacts apps were showing up again in the list of apps that can be activated.
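In case it helps others, here is a rough sketch of what such a workaround can look like on the server. The installation path, the file name and the download URL below are my assumptions for illustration only and may well differ from what the Github thread actually suggests for a particular Owncloud version:

    # hypothetical sketch - adjust the Owncloud path and file name to your setup
    cd /var/www/owncloud/config
    sudo wget https://curl.haxx.se/ca/cacert.pem     # a publicly available CA bundle
    sudo chown www-data:www-data cacert.pem          # make it readable for the web server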

The next challenge was that when activating the calendar in the app menu I got an update error message. Now what, again no further advice!? Clicking somewhere in the web interface popped up the system updater page again, which informed me that the calendar was now being updated. Unfortunately that failed again with a strange error message. Again, the issue list on Github told me I'm not the only one and that rinse, wash, repeat would fix the issue. I saw the updater page again when activating a number of other apps, this time fortunately without further issues after the initial error message.

I also had trouble getting the Lucene search app updated. When doing that via the "updater button" in the web interface, Owncloud completely fell over and wouldn't even show anything in the web browser anymore. Again, the issue list on Github told me I'm not the only victim and that deleting the Lucene app directory on the server and re-installing the app fixes the issue. And indeed it did.
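For those who run into the same problem, the fix roughly amounts to something like this on the server; the installation path and the app directory name are assumptions on my part and depend on how Owncloud was installed:

    # hypothetical sketch - path and app directory name are assumptions
    sudo rm -rf /var/www/owncloud/apps/search_lucene
    # afterwards, re-install the app from the apps page in the web interface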

After that, Owncloud was finally working again for me. But I can't believe that a simple update results in such chaos!? Really, dear Owncloud community, of which I feel I'm a (frustrated) part, you have to get your act together and fix the update process. IT MUST NOT FAIL UNDER ANY CIRCUMSTANCES if you want this project to thrive in the future. Forget new features, forget slick UI changes, FIX THE UPDATE PROCESS…

And fix it to a point where an update via apt-get from the command line does everything, including re-activation and updates of any Owncloud apps. Until this works I won't even bother recommending Owncloud to my less technical friends, let alone installing instances for them.

Book Review: Diary Of An 80s Computer Geek

There are quite a number of books on the market these days about the emergence of home computing and the PC in the 1970s and ’80s and the people that had a major influence on this development and shaped the industry. Little has been written so far from the point of view of the kids on the other side of the business model, who dreamed about getting their first personal computer, what they did with it apart from playing games and how it shaped their future lives. A wonderful exception is ‘Diary Of An 80s Computer Geek: A Decade of Micro Computers, Video Games and Cassette Tape’ by Steven Howlett (not with a ph!).

Written in 2014, the book tells of Steven’s adventures and misadventures as a young teenager in the 1980s: his first computer and the ones that followed, programming, impressing his friends, trying to sell the results, clueless teachers and lots of other things, all over 90 pages in a very humorous and easy to read style. While I would have been a (C64) ‘Commie’, he was firmly entrenched on the Sinclair ZX ‘Speccy’ side of things, a school ground battle that was probably fought much harder in Britain than in my home country (where we only had proper school ground battles over computers in the Amiga vs. Atari age. But I digress…). I couldn’t put the (e)book down as the story had so many elements I could immediately identify with, reminding me of my own computing adventures in the 1980s.

A great book, fully recommended and a wonderful story from the ‘other side of the fence’!

I Can’t Wait For GSM To Be Switched Off… Optus The Latest To Announce Such A Move

The last time I remember being glad to have EDGE coverage was back in 2010 in Thailand. But those days are definitely gone: most 2G networks are now hopelessly overloaded in non-3G or 4G areas with the data traffic of all the smartphones around, and web content has become so big that even non-congested 2G networks would hardly be able to cope. After AT&T and Telenor, Optus is another big network operator to announce a GSM switch-off, in 2017 in their case. Macau Telecom is perhaps the first network operator to have switched off GSM already, but that's a fringe case.

I have to say, I increasingly can't wait for GSM to be switched off, it is of little use anymore these days. And a nice benefit: those who switch it off, at least in theory, make sure 3G and 4G coverage is available in areas where previously GSM was the only available network technology. Really, I can't wait, switch it off! And in the meantime, I'm happy I can switch it off on my side.

P.S.: O.k., yes, I know, some network operators have a lot of 2G M2M modules still out there…

Old ‘Byte’ Magazine At Archive.Org To Experience The Past

Every now and then I read books about different aspects of computer history. Good books on the topic obviously rely heavily on original sources and interviews and make a good narrative and summary out of them. To dig deeper into specific aspects requires getting hold of the original sources. Fortunately, quite a few of them are available online now, such as scanned originals of 'Byte' magazine, which covered 'microcomputer' topics in great depth and technical detail from 1975 to 1998.

One issue I recently took a closer look at was from August 1985, i.e. from exactly 30 years ago, as it contains a preview of the upcoming Amiga 1000. What I find very interesting about reading original sources is how new developments were perceived at the time and how they compared with existing technology. I had to smile, for example, when comparing the graphics and multitasking capabilities the Amiga introduced to Jerry Pournelle's ravings in the same issue of the magazine about a program that 'can have WordStar, DOS, and Lotus 1-2-3 running all at the same time', programs that would otherwise have only single-tasked on his text-based Zenith ZP-150 IBM-PC clone.

Obviously that's just one of millions of stories to be discovered. For your own enjoyment, head over to Archive.org and begin your own journey through computer (and other) history 🙂

Evolutionary Steps of Mobile Phones And The Future

In the past 30 years, mobile phones, which are no longer mere 'phones' these days, have gone through an incredible evolution. Perhaps it's a good time now to reflect on the changes that have happened over time and to ask whether commoditization is the inevitable next step.

The Era of Voice and Text Messaging

From the mid-1980s, when mobile phones were expensive and hence a luxury of the few, right up to the middle of the 2000s, when mobile phones became affordable and a mass market phenomenon, their functionality was mostly limited to voice calling and text messaging. At the beginning of this period it was actually only voice telephony; text messaging was added as a consumer service only much later, in 1992.

While voice and text messaging were the prevalent use of mobile devices for most people right up to the middle of the 2000s, the major trend was to make devices smaller and smaller. This continued up to the point where the size was no longer limited by the electronics that had to fit into the device but by how small the keyboard and screen could be made.

Early smartphones, proprietary operating systems

While the race for miniaturization was in full swing, companies started to innovate around how to put more than just voice telephony and text messaging into a mobile device. A first device to include many more functions was the Nokia 9000 Communicator, released in 1996. Advertised as a mobile office with calendar and address book, dial-up modem functionality for Internet access and web browsing, it was too expensive and bulky for most people. It took another 10 years before such devices were small and affordable enough and mobile networks capable of transporting data faster than just a few kilobits per second. The Nokia 6680 from back in 2005 was one of the first mass market smartphones that one could actually see in the hands of people. Most probably did not realize the potential of the device in their hands beyond the nice and, for the time, large and colorful screen and continued to use it like their previous device, mainly for voice telephony and texting.

But those who realized its potential and were willing to pay more for mobile Internet access used it for browsing the web, sending and receiving emails and creating content for websites and social media on the fly. One of my blog entries from back in 2006 shows the 6680 in action together with a foldable Bluetooth keyboard for blogging from what was then called the 3GSM World Congress.

Apart from turning into versatile tools to access the Internet while on the go, these devices allowed third parties to develop apps that could then be used on them. One of my favorite 3rd party applications of the time was Opera Mini, a web browser that far surpassed the built-in browser in functionality and ease of use.

While downloading and using apps on a mobile device has become commonplace today, the introduction of that capability was a major step in the development of mobile devices. Up to this point, mobile phone manufacturers saw their devices mostly as closed systems to which only they could add or modify software. That is quite a difference to the evolution of personal computing, where the computer was always seen as a means to an end, i.e. to run third party software, rather than as a closed system.

Smartphones become a PC in the pocket – Or Almost

While 3rd party apps on mobile devices, by that time no longer called mobile phones by many, became more popular, the operating systems of these devices remained proprietary and closely tailored around the limitations of the mobile hardware in terms of processing power and memory capacity. Around 2007, however, enough processing power and memory was available even in small devices for adapted desktop kernels and operating systems to make the jump to mobile devices. The first generation iPhone from 2007 and the HTC Dream from 2008 were the first mass market devices to use adapted BSD and Linux kernels from the desktop world, together with a lot of software initially developed for desktop operating systems, merged with newly developed user interfaces adapted to mobile use, smaller screens and touch input.

The re-use of desktop operating system software and the influx of the idea that mobile devices were no longer closed-world systems closely guarded by manufacturers, but first and foremost platforms to run third party applications, just like PCs, gave a tremendous push to the transformation of the mobile industry towards ever more versatile and open mobile computing platforms. Android especially has helped in this regard, as most parts of the system are open source and available for anyone to use and modify.

Commoditization, Openness and the Future

Let's come back to the question of if, when and how commoditization will happen in the mobile world, and have a look at the desktop world first. Today, desktop PCs and notebooks are pretty much commoditized, i.e. products of different manufacturers look and feel pretty much the same and run the same software. Apart from Apple, the single but major exception, all other desktop and notebook manufacturers such as HP, Lenovo, Dell, Asus, Samsung, Acer and many small brands and no-name manufacturers design and produce very similar hardware that is compatible with Microsoft's Windows operating system. The user interface looks identical across manufacturers, except perhaps for a company logo during the boot process, additional utility programs of often doubtful value and usability, and drivers for new hardware components to make them usable with the standard Windows operating system. As products are pretty similar, it is difficult for hardware manufacturers to differentiate themselves from the competition and hence profits are small. Only Apple has managed, despite also using Intel based PC hardware, to differentiate themselves from others with their own operating system, programs and services and to keep profits high.

A tremendous benefit of hardware commoditization is that people can install alternative operating systems based on Linux and BSD kernels on those systems. While the popularity of doing so is on the rise, it is far from a mass market phenomenon. Despite advantages such as the preservation of privacy and being much less prone to the security issues of mainstream desktop operating systems, most people do not want to spend the effort or feel incapable of installing a different operating system themselves. But the possibility is there for those who want it, and things have become much easier for those who dare.

On mobile devices, however, we are still quite far away from this state. While Apple applies the same business model to their mobile devices as to their desktops and notebooks (i.e. integrated hardware and software), most other mobile device manufacturers use Google's Android operating system as the basis for their devices. Most major manufacturers such as Samsung, Sony, LG and HTC, however, still heavily customize the user interface of the operating system and innovate on the hardware side. Still, it's possible to install different Android based operating systems on many of those devices, CyanogenMod being one of the well-known alternatives to a manufacturer-customized Android OS.

Things are unfortunately not as far along as in the PC world, where Linux distributions such as Debian, Ubuntu, OpenSuse, Redhat and many others recognize all hardware components during their startup procedure and configure themselves accordingly. In the mobile world, customized versions of CyanogenMod or other alternative Android distributions need to be created for each device. Also, many hardware manufacturers lock their boot loaders to prevent alternative operating systems from being installed. This either makes it impractical to install alternative OSes on such devices or requires special software tools to remove such locks. And to add insult to injury, it's difficult or even impossible to provide timely fixes for security issues if the complete OS needs to be rebuilt for thousands of different device types, as the Stagefright vulnerability has painfully shown to a larger audience this summer.

In other words, despite Android's openness and use of open source, we are still lightyears away from mobile devices being just commoditized hardware that runs the same operating system and the same user interface on top, with a few additional drivers to access special hardware, and that naturally allows users to install a different operating system if they choose to do so. But despite the still heavy customization, it seems margins are getting thinner and thinner. This gives hope to those who don't want their lives to be analyzed by their devices and consequently by the manufacturers of those devices, with their personal data sucked out, stored and analyzed on servers at the other end of the world. So for me, commoditization that hopefully leads to more standardized devices, easier installation of alternative operating systems and a 'one patch fits all devices on the same OS version' approach can't come soon enough.

14 Mbit/s Over UMTS – So Much For Network Congestion

I've been back in Austria this year for a bit and once again I made good use of my unlimited data over Drei Austria's UMTS network for 18 euros a month on prepaid. They do have an LTE radio access network these days as well but unfortunately my prepaid subscription is limited to their UMTS network. As a consequence, I was a bit afraid that my experience would suffer from potential congestion due to ever-rising data traffic and investment shifting to LTE.

To my pleasant surprise, quite the contrary was the case. In all places with a high signal level that I tried, I could easily get sustained data rates of well over 12 Mbit/s over an OpenVPN tunnel. And that was not somewhere outside the big cities with few people using the network, no, that was in Vienna, one of the most densely populated parts of the country.

Quite incredible for a network that offers its customers unlimited and non-throttled data for less money than most other network operators worldwide charge for a couple of hundred MB or a few GB at best. So much for the theory that unlimited data would bring a network to its knees…

20 MHz LTE Band 20 Network in Austria

In most countries in Europe I have visited recently, network operators are now making good use of LTE band 20 in the 800 MHz part of the spectrum to cover rural areas. Usually, three network operators have each been successful in getting a 10 MHz chunk of the 30 MHz available. In Austria, it's different, however, as I recently found out while having a closer look at the networks around me.

Instead of three network operators, only two have acquired spectrum in this band, T-Mobile Austria and A1 (Telekom Austria). The latter has quite a big chunk, 20 MHz! That's the first country in which I've seen a 20 MHz carrier deployed in the 800 MHz band. A huge advantage for rural high speed Internet connectivity! For details see here.

While I traveled by train, however, all my devices mostly stayed on 3G, probably due to the heavily insulating windows. Also, none of my devices were able to acquire a GPS fix even when held directly at a window. As 3G got in just fine, there might have been repeaters on the train, but not for 800 MHz. At least there was free Wi-Fi on board which worked quite well on the cross-country trip. But that's little consolation as Wi-Fi tends to saturate over time. Also, everybody being able to snoop on my data traffic is not quite my cup of tea either. I hope there are plans to upgrade the repeaters and let the professionals outside handle the traffic.

First Network Operator Disables MMS Delivery Due to Stagefright

Several news portals report today (see here, here and here) that the first network operator has temporarily disabled MMS delivery because most Android mobiles in the wild today are vulnerable to the Stagefright flaw, which can be exploited by sending videos via MMS and other means. Instead, an SMS is sent to customers with a link to view the MMS contents in a browser. Quite a responsible step and I wonder if or how many network operators will follow the example.

This announcement is quite something! I have never heard that a network operator has disabled one of its own services because of a mobile device vulnerability. As the articles linked-to above point out, it's no secret in the mobile industry that Whatsapp and other services are now much preferred over expensive MMS messaging, so perhaps little revenue is lost by this step.

In the announcement the carrier said that this is only a temporary solution until a fix has been found. That makes me wonder what that fix could look like and how long this temporary solution will remain in place!? It's unlikely that the threat will go away anytime soon, as it will take quite some time for devices to get patched because an Android OS patch is required. That means the update can't be delivered via Google's app store but needs to be incorporated by device manufacturers in their builds for each device. Good luck with that. Also, I guess there will be many devices that will never get updated as device manufacturers have already lost interest in providing OS updates for devices that are somewhat older.

Another solution I can imagine would be to put a "virus scanner" in place on the MMS server in the network to filter out malicious videos. But that will cost time and money, not only initially but also to keep the signatures up to date. That makes me wonder if the service still makes enough money to justify such a measure!? On this account I wouldn't be surprised if Facebook, Google and others are already scrambling to put scanners in place to make sure videos that are put on their services by users do not contain malicious content.

No matter how I look at it, I can't help but feel that we've just reached a tipping point when it comes to mobile security. Google and device manufacturers need to do something radical and drastic NOW to make sure that future Android devices can be patched in a timely manner (i.e. in a matter of hours, just as is possible for desktop operating systems) rather than having to wait for device manufacturers to come up with new builds or, even worse, not being able to patch Android devices at all anymore due to lack of manufacturer support.

Why Don’t We Still Have A SIMPLE Smartphone Remote Help Function?

One thing I've been hoping for for years now is for someone to come up with an easy-to-install remote smartphone help function to see and control the screen remotely. On the desktop, VNC has existed in various flavors almost since the beginning of desktop GUIs. On mobile operating systems, even on open source Android, there is no good and simple solution. Yes, there are some solutions that have been started and abandoned again, like the droid VNC server project, but nothing that just works out of the box and across different devices and Android versions. I have to admit I'm a bit frustrated because I could use a remote help function at least once a week.

A couple of years ago I was extremely thankful to Google for bringing Wi-Fi tethering to the masses when it was still perceived as something "terrible" by most network operators. But for a remote screen implementation I personally can't count on Google because I'm sure that should they come up with such a thing, they would put one of their servers between the smartphone and the helping hand. No thanks, I need something more direct, private and secure. But Android is open and I would even accept something that requires a rooted phone, so I can't help but wonder why nobody has come up with a good, simple and interoperable solution so far!?

P.S. Cyanogen, hello!?

How To Simulate an IP Address Change On A NAT WAN Interface

The vast majority of Internet traffic still runs over IPv4, and when I look at all the different kinds of connectivity I use for connecting my PC and smartphone to the Internet, there isn't a single scenario in which I have a public IPv4 address for those devices. Instead, I am behind at least one Network Address Translation (NAT) gateway that translates a private IPv4 address into another IPv4 address, either a public one or another private one when several NATs are cascaded. While that usually works well, every now and then one of those NATs changes the IP address on its WAN interface, which creates trouble in some use cases.

Applications that use the TCP transport protocol quickly notice this, as their TCP connection gets broken by the process. Higher layers are notified that the TCP link is broken and apps can re-establish communication with their counterpart. Apps using UDP as a transport protocol, however, have a somewhat harder time. UDP keep-alive packets sent in one direction to keep NAT translations in place are not enough, as they just end up in nirvana. Bi-directional UDP keep-alive packets also end up in nirvana without the application on top ever being notified about it. The only chance such apps have is to implement a periodic keep-alive 'ping' and a timeout after which the connection is considered broken.
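Just to illustrate the pattern, and not how any particular app implements it, a keep-alive loop with a timeout could look like the following shell sketch. It assumes a peer that echoes a reply to every keep-alive datagram; the server name, the port and the reply text are made up, and netcat behavior varies a bit between versions:

    # hypothetical sketch: send a keep-alive datagram every 20 seconds and treat
    # the path as broken if no reply arrives within 5 seconds (requires netcat)
    while true; do
        if echo "ping" | nc -u -w 5 keepalive.example.net 4000 | grep -q "pong"; then
            echo "$(date): keep-alive answered, NAT binding still in place"
        else
            echo "$(date): no reply within timeout, connection considered broken"
        fi
        sleep 20
    done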

Recently I had to simulate such behavior and wondered how best to do that. Again, a Raspberry Pi acting as a Wi-Fi access point, NAT and Ethernet backhaul served as a great tool. Three shell commands are enough to simulate an IP address change on the WAN interface:

Do a 'sudo nano /etc/dhcp/dhclient.conf' and insert the following line:

    send dhcp-requested-address 10.10.6.92;

The IP address needs to be replaced with an address from the subnet of the WAN interface that is different from the one currently used. Obviously that IP address must not be in use by any other device on the WAN subnet, which can be checked by pinging the address before changing the configuration file, as shown below.
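A quick check with the address used in the example above could look like this; if the address answers, pick a different one:

    # the address to be requested must not be in use on the WAN subnet
    ping -c 3 10.10.6.92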

Next, release the current lease and stop the DHCP client on the WAN interface with

   dhclient -r -v

The interface itself remains up, so Wireshark does not stop an ongoing trace, but IP connectivity is removed. If the scenario to be simulated requires a certain time before a new IP address is available, just pause before issuing the next command.

A new IP address can then be requested with

   dhclient -v eth0

The result returned by the command shows whether the requested IP address has been granted, or whether the DHCP server has assigned the previous address again or perhaps even an entirely different one. If the DHCP server keeps assigning the old IP address, changing the MAC address after releasing the lease will probably help, for example as sketched below.
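One way to do that with the standard iproute2 tools is the following; the locally administered MAC address is just an example, any unused address will do:

    # bring the interface down, assign a new (locally administered) MAC address
    # and bring it up again before requesting a new lease
    ip link set dev eth0 down
    ip link set dev eth0 address 02:12:34:56:78:9a
    ip link set dev eth0 up
    dhclient -v eth0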

And that's all there is to it, except for one 'little' thing: if the Raspberry Pi or other Linux device you perform these actions on is yet again behind another NAT, the server on the Internet will not see an IP address change but just a different TCP or UDP port for incoming packets. So while in some scenarios the connection will still break, there are scenarios in which a connection can survive. One example is an ongoing conversation over UDP: if the NATs assign the same UDP port translations on all interfaces again, which can but does not necessarily have to happen, and the first UDP packet is sent from the NATed network, the connection might just survive. I can imagine some TCP survival scenarios as well, so don't assume but check carefully whether the exercise produces the expected changes in IP address and port numbers for your network setup.
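A simple way to see which public IP address the outside world actually observes before and after the change is to ask an external service; ifconfig.me is just one example of such a service, and the port mappings themselves still need a packet trace on the server side:

    # run once before and once after the simulated change and compare the output
    curl -s https://ifconfig.me ; echo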

Have fun!