No Google Play Store In China

When I was recently in China, a number of my fellow travelers asked me whether I could access the Google Play Store. Over a Wi-Fi connection without a VPN, I couldn't. I wasn't really all that surprised: most Google services, Facebook and most VPN services with servers outside of China are blocked as well, so why should the Play Store be accessible?

Well, for one thing, I thought at first, because there are said to be 700 million Android-based smartphones and tablets in China. Since we are all taught how important it is these days to download software only from a carefully controlled app store, how are those 700 million devices getting software and updates? So I asked one of my local friends with a Chinese Android device whether such devices can access the Google Play Store. As expected, the answer was that the Play Store does not work in China and that people simply search for apps on Baidu (the local equivalent of Google search) and install them straight from a web page. Baidu offers an app store for Android as well, but direct installation from web pages seems to be quite popular too. So much for security-screened apps and automatic updates.

Perhaps 700 million Android devices without access to the official store is one of the reasons why Android still makes it easy to install apps from what the user interface calls "unknown sources" and still allows alternative app stores to be used (of which there seem to be quite a few in China). If I were a cynic, I would probably be thankful for the censorship, since it leaves me with more freedom.

Which makes me wonder what kind of concessions Apple had to make as their app store can be accessed in China…

3G Mobile Video Calls Are Dead – Long Live Mobile Video Calls

Incredible: I made my first video call only a decade ago, or already a decade ago, depending on how you look at it. At the time I was convinced it would become a mass-market phenomenon once more people had 3G phones. It didn't really work out like that, however, and I have to admit that the service never really became popular, perhaps because most network operators massively overpriced it and failed to continuously innovate and evolve it.

Today, 3G video calling is still in the same state it was in 10 years ago. The resolution and frame rate of the video are far too low for today's devices, and the picture quality on their large screens is far from what people expect. In the meantime, some network operators have given up on the service entirely and have begun blocking it for new subscriptions.

But I'm glad that others haven't given up and have continued to innovate. FaceTime on mobile has reached some popularity; see, for example, my post from New York back in 2011. Personally, I use Skype for smartphone and tablet video telephony. Over LTE and even 3G, the video resolution and frame rates are fantastic. These days, I'm seeing more and more people engaged in video calling, especially at airports. Still a niche compared to the billions of voice minutes generated every day, I agree, but nevertheless quite mature and useful today.

Old DVDs And New Drives Don’t Make A Good Pair – Hello Old PC

Optical DVD drives are going out of fashion in notebooks these days. In theory, that's not a bad thing, as it saves space and weight, and one can always buy an external USB DVD drive for a few euros should one really need one. The problem is that those I tried in recent weeks are of such bad quality that they fail to read many of the DVDs and CDs I wanted to read.

Read issues often do not appear right away when inserting a CD or DVD but only later, when I'm already halfway or two-thirds through the content. Sometimes a DVD that can't be fully read in one drive works fine in another, and vice versa. Sometimes a DVD fails in both drives, but at different locations. Quite a mess.

But then I remembered that I still have a 15-year-old PC standing in the corner with two DVD drives from back then, solidly built and quite expensive at the time. Despite their age, they've so far been able to read each and every DVD and CD that was partially unreadable on those crappy few-euro USB DVD drives.

Perhaps it's time to convert my CDs and DVDs while that computer still works…
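
If I do get around to it, a small script is probably the way to go, so that read errors don't slip through silently. Here's a minimal sketch in Python of what I have in mind (the source and target paths are placeholders, not my actual setup): it simply copies everything off a mounted disc and reports which files could not be read, so I know which discs need another try in the old drives.

```python
# Minimal sketch: copy everything off a mounted disc and record which
# files could not be read. SOURCE and TARGET below are placeholders.
import shutil
from pathlib import Path

SOURCE = Path("/media/cdrom")            # where the disc is mounted
TARGET = Path("/home/user/disc-backup")  # where the copy should go

failed = []
for src in SOURCE.rglob("*"):
    if not src.is_file():
        continue
    dst = TARGET / src.relative_to(SOURCE)
    dst.parent.mkdir(parents=True, exist_ok=True)
    try:
        shutil.copy2(src, dst)
    except OSError as err:  # read errors on damaged discs show up as I/O errors
        failed.append((src, err))

for src, err in failed:
    print(f"Could not read {src}: {err}")
print(f"{len(failed)} file(s) could not be read")
```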

HTTPS Public Key Pinning (HPKP) Is Great – But Mobile Support Is Only Half Baked So Far

A couple of months ago, Chrome, Firefox and perhaps other browsers began to 'pin' the HTTPS certificates used by Google, Twitter and others for their web pages. This significantly improves security for these pages, as their certificates can no longer be signed by any of the hundreds of Certificate Authorities (CAs) that web browsers trust but only by one or a few selected ones. Until now, this functionality has been hard-coded into the browsers. Recently, however, most desktop and mobile browsers have added support for the generic HTTPS Public Key Pinning (HPKP) mechanism standardized in RFC 7469, which enables any HTTPS-protected web site to do the same. Time for me to add it to my Owncloud and Selfoss servers as well, to protect myself from man-in-the-middle attacks.

HPKP works by adding a public key pin header to the HTTP response headers that are returned to the web browser each time a page is loaded. On the first request, the web browser stores the pin hashes and, whenever a page from the same domain is loaded again later, compares the hashes of the HTTPS certificates it receives with those previously stored. If they don't match, the page load is aborted and an error message is shown to the user that cannot be overridden. For the details of how to generate the hashes and how to configure your web server, have a look here and here.
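
To give an idea of what the hash generation boils down to: according to RFC 7469, a pin is the base64-encoded SHA-256 hash of the certificate's DER-encoded SubjectPublicKeyInfo. Here is a minimal Python sketch of the computation, using the third-party 'cryptography' package; the certificate file name is just a placeholder.

```python
# Minimal sketch: compute an HPKP pin-sha256 value from a PEM certificate.
# Requires the third-party 'cryptography' package; the file name below is
# a placeholder, not my actual certificate.
import base64
import hashlib

from cryptography import x509
from cryptography.hazmat.primitives import serialization

with open("server-certificate.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# Per RFC 7469, the pin is the hash of the DER-encoded
# SubjectPublicKeyInfo, not of the whole certificate.
spki = cert.public_key().public_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
pin = base64.b64encode(hashlib.sha256(spki).digest()).decode("ascii")

# The value then goes into the response header, together with a pin for a
# backup key and a max-age value, for example:
#   Public-Key-Pins: pin-sha256="<pin>"; pin-sha256="<backup>"; max-age=86400
print(f'pin-sha256="{pin}"')
```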

The first screenshot on the left (taken from Firefox's web developer network console) shows what the public key pin looks like in the HTTPS response header of my web server. In my case, I set the validity of the pinning to 86400 seconds, i.e. one day. This is long enough for me, as I access my Owncloud and Selfoss servers several times a day. And since I don't change my certificate very often, I decided not to pin one of the CA certificates in the chain of trust but to be even more restrictive and pin my own certificate at the end of the chain.

On the PC, I successfully verified that Firefox stores the pin hashes and blocks access to my servers: I first supplied a valid certificate and a corresponding public pin hash, then removed the pin header and supplied a different valid certificate. Even after closing and reopening the browser, access was still blocked, and I could only reach my Owncloud instance again after I reinstated the original certificate. Beautiful.
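
For a quick out-of-band check that a server really presents the certificate I pinned, a few lines of Python are handy as well. This is only a sketch; the hostname and the expected pin below are placeholders rather than my real values.

```python
# Minimal sketch: fetch a server's TLS certificate and compare its
# SPKI hash with a previously recorded pin. HOST and EXPECTED_PIN
# are placeholders.
import base64
import hashlib
import ssl

from cryptography import x509
from cryptography.hazmat.primitives import serialization

HOST = "cloud.example.org"
EXPECTED_PIN = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="

pem = ssl.get_server_certificate((HOST, 443))
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))
spki = cert.public_key().public_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
pin = base64.b64encode(hashlib.sha256(spki).digest()).decode("ascii")

if pin == EXPECTED_PIN:
    print("Certificate matches the stored pin.")
else:
    print(f"Pin mismatch, the server presented {pin}")
```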

On Android, I tried the same with Firefox Mobile and Opera Mobile. At first I was elated, as both browsers blocked access when I used a valid certificate that was different from the one pinned before; the second screenshot on the left shows how Opera Mobile blocks access. Unfortunately, however, both browsers only seem to store the pin hashes in memory: after restarting them, both allowed access to the server again. That's a real pity, as Android frequently terminates my browser when I switch to other large apps. That's more than an unfortunate oversight, that's a real security issue!

I've opened bug reports for both Firefox Mobile and Opera Mobile, so let's see how long it takes them to implement the functionality properly.

Stagefright 2 – And Nobody Cares?

News is inflationary… Back in August there was a big wave in the press when it was discovered that Android, across all versions, had a couple of pretty serious remote code execution and privilege escalation vulnerabilities in the libstagefright libraries, which are called every time a video is shown or previewed. The wave was as big as it was because the vulnerabilities are easily exploitable from the outside by embedding videos in web pages or messages. Device makers promised to patch their devices in a timely fashion and to change the way security patching would be done in the future. For some devices this has even happened, but for many older devices (read: 2+ years old) nothing was done. Since the news broke, things have calmed down again. Then, in early October, another batch of serious Stagefright issues was discovered that are just as exploitable as the first ones. This time, however, the echo was quite faint.

It really makes me wonder why. Perhaps it is because the vulnerabilities have not been exploited on a large scale so far? Which makes me wonder why not; black hats are usually quite quick to exploit things like that. Does nobody know what to do with smartphones under their control? Or perhaps the bad guys are not yet familiar with ARM assembly and the Google Android API? If so, the latest episode was perhaps one of the final warning shots before things get real. Let's hope the good guys use the time well to fortify the castle.

On the positive side, Google has patched the vulnerable code in the meantime, and so has CyanogenMod, so my devices are patched.

The Politics Behind LTE-Unlicensed

For some time now, interested companies in 3GPP have been pushing for an extension to the LTE specifications to make the technology usable as an air interface for the 5 GHz unlicensed band, currently the domain of Wi-Fi and other radio technologies for which no license is required (i.e. it's free for everyone to use). I wrote about the technology aspects of this earlier this year, so have a look there for the details. Apart from the technical side, however, another interesting topic is the politics behind LTE-Unlicensed, as not everybody seems to be thrilled by LTE marching into unlicensed territory.

Some parties in 3GPP are totally against LTE becoming usable in an unlicensed band, fearing competition from companies that haven't paid hundreds of millions for beachfront spectrum property. Some cautiously support it in its current incarnation, referred to as LTE-LAA (License Assisted Access), as it requires an LTE carrier in a licensed band to control transmission of an LTE carrier in an unlicensed band. In effect, that keeps the would-be upstart competition at bay. And then there are those who want to release the brakes completely and extend LTE to make it usable in a standalone way in unlicensed bands. Perfectly irreconcilable. I'm writing all of this because I recently came across an article that sheds some light on what's going on, which I found quite interesting.

My Uploads Are Three Times Faster On LTE Than With VDSL At Home

It's a bit ironic, but my uplink speed is three times higher when I'm sitting in a train commuting to work and connected over LTE than when I'm sitting at my desk at home, connected over a 25 Mbit/s downlink + 5 Mbit/s uplink VDSL line.

I just had that thought when I uploaded a 50 MB ZIP file to the cloud in about half a minute at around 15 Mbit/s, which is, mind you, not the maximum LTE can provide on a 20 MHz carrier, but my uplink speed is limited. It's really time for a fiber link at my home in Germany like the one I already have in Paris. Unfortunately, German politics creates no incentives for network providers to catch up to more developed parts of the world… Quite the contrary, DSL vectoring is the future as far as the government and the local incumbent are concerned 🙁
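
Just to put numbers on that comparison, here is the back-of-the-envelope arithmetic as a few lines of Python, using the rates mentioned above:

```python
# Back-of-the-envelope check: time to upload a 50 MB file at my 5 Mbit/s
# VDSL uplink versus the roughly 15 Mbit/s I saw over LTE.
FILE_SIZE_MBIT = 50 * 8  # 50 megabytes expressed in megabits

for label, rate_mbit_s in (("VDSL uplink", 5), ("LTE uplink", 15)):
    seconds = FILE_SIZE_MBIT / rate_mbit_s
    print(f"{label}: about {seconds:.0f} seconds")

# Output:
#   VDSL uplink: about 80 seconds
#   LTE uplink: about 27 seconds
```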

15+ Devices At Home With A WiFi Interface Today – But It All Started With An Orinoco

At the end of the 1990s, coax-based 10 Mbit/s Ethernet was the network technology most companies used, including the one I worked for at the time. It was also there that I held my first 802.11 wireless network card in my hand. The brand was Orinoco, the company that produced it was Lucent, and it could transfer data at a whopping 1 to 2 Mbit/s, just a tiny fraction of what is possible now. Today, 'Wi-Fi card' would be the term most people use, but Wi-Fi cards that plug into PCs are largely a thing of the past, as most devices now have a Wi-Fi chip and antenna built in. Gone, too, are the days when Wi-Fi connectivity was expensive: for less than 10 euros one can buy an 802.11n Wi-Fi USB dongle these days for the few devices that are not yet Wi-Fi equipped.

So much for the history part of this blog entry. I'm writing all of this because I recently realized that I now have over 15 Wi-Fi enabled devices at home that are in frequent use. There's my notebook, of course, that I work with every day; a test notebook for trying out new things; a notebook mostly used for video streaming; at least 3 smartphones; my spouse's notebook and her 2 smartphones; a Raspberry Pi for audio streaming to the 20-year-old hi-fi set in the corner; the access point itself; a second access point that also acts as an Ethernet switch; and 2 Wi-Fi enabled printers. In addition to these devices that are in use all the time, there are at least half a dozen Wi-Fi USB dongles that are occasionally put to good use with about as many Raspberry Pis for various purposes.

Quite an extraordinary development when I think back to that first, hyper-expensive Orinoco wireless LAN card I once held in my hands and marveled at how it was possible to transfer data so quickly over the air with such a 'little' card.

We Can’t Afford To Let Any Part Of The Internet Rot In Place

Over the last decade, Wi-Fi devices have become tremendously popular. Unfortunately, it seems the Federal Communications Commission (FCC) and its counterpart in the EU are becoming concerned that third-party software controlling the radio hardware may negatively impact interoperability with other applications using the same frequency bands, e.g. by increasing transmission power beyond the regulatory limits. As a result, the FCC and the EU are proposing, or have already implemented, rules that require the hardware manufacturer of a device to ensure that only their radio software can be used in the device. The problem is that instead of 'only' locking down the radio software, manufacturers of Wi-Fi access points and other Wi-Fi devices such as smartphones might be tempted to use this as an excuse to lock down the whole device, potentially making it impossible in the future to use Wi-Fi routers with alternative software such as OpenWrt or smartphones with alternative Android derivatives such as CyanogenMod.

While the EU has already published a directive to that end, which comes into effect in June 2016 but first needs to be transposed into the national laws of the individual member states, the FCC is still in the comment phase of its process. One response, signed by pretty much the who's who of the open source community, including Linus Torvalds, and by Internet luminaries such as Vint Cerf, is truly outstanding:

In their response, the authors explain the dire state of today's Wi-Fi router market, which is driven only by price and not by quality and responsibility. This has led to hundreds of millions of devices in the field that are insecure and pose a significant risk to their owners and to the Internet as a whole.

To fix both the radio issue addressed by the FCC and the wider problem of software with grave security flaws being abandoned by device manufacturers, the authors propose an alternative approach to the FCC's lock-down proposal:

1. Any vendor of software-defined radio (SDR), wireless, or Wi-Fi radio must make public the full and maintained source code for the device driver and radio firmware in order to maintain FCC compliance. The source code should be in a buildable, change-controlled source code repository on the Internet, available for review and improvement by all.

2. The vendor must assure that secure update of firmware be working at time of shipment, and that update streams be under ultimate control of the owner of the equipment. Problems with compliance can then be fixed going forward by the person legally responsible for the router being in compliance.

3. The vendor must supply a continuous stream of source and binary updates that must respond to regulatory transgressions and Common Vulnerability and Exposure reports (CVEs) within 45 days of disclosure, for the warranted lifetime of the product, or until five years after the last customer shipment, whichever is longer.

4. Failure to comply with these regulations should result in FCC decertification of the existing product and, in severe cases, bar new products from that vendor from being considered for certification.

5. Additionally, we ask the FCC to review and rescind any rules for anything that conflicts with open source best practices, produce unmaintainable hardware, or cause vendors to believe they must only ship undocumented “binary blobs” of compiled code or use lock down mechanisms that forbid user patching. This is an ongoing problem for the Internet community committed to best practice change control and error correction on safety-critical systems.

These are powerful proposals, and I am delighted that the letter was signed by a huge number of well-known and respected people in the industry. But not everyone will like them, and I can already see the marching orders being drafted for the hardware manufacturers' lobbyists to fight this. While many manufacturers have an open source driver for their Wi-Fi hardware today, the software that runs on the Wi-Fi chip itself is usually closed source and only available as a binary blob. Having the source of this part available as well would be truly revolutionary. Requiring that the owner of the device has ultimate control over the software update process (if they wish) is another strong requirement. This wouldn't prevent automatic updates for those who don't care, but it would preserve the ability to stay in control of what you own if you wish to do so.

The paper from which I have quoted the five proposals above is well worth a read. It is well written and explains in detail why the FCC should adopt them instead of what it initially suggested. So let's see how visionary the FCC can be.

P.S.: The headline of this post is an abbreviation of a quote from Vint Cerf in a recent article on the topic in Businesswire:

"We can't afford to let any part of the Internet's infrastructure rot in place. We made this proposal because the wireless spectrum must not only be allocated responsibly, but also used responsibly. By requiring a bare minimum of openness in the technology at the edge of the Internet, we'll ensure that any mistakes or cheating are caught early and fixed fast"

P.P.S.: And for further background on EU directive 2014/53/EU, which has something similar in mind to the FCC's proposal, have a look at Julia Reda's recent blog entry on the topic.

The Interesting Prepaid Shift To Data Only Buckets For Smartphones

Over the last few years, smartphone apps such as Facebook, WhatsApp, etc. have become hugely successful, especially among teens and people in their twenties. While I was one of the few who used the mobile Internet on my daily train trips just 5 years ago, there are now only a few people without a smartphone in their hand or a notebook on their lap. In other words, the use of mobile devices has changed significantly during that time, away from voice and SMS towards Internet-based applications. Prepaid tariffs, however, have not changed accordingly, at least not until recently.

No matter which network operator I turn to in Germany, most prepaid tariffs are a bundle of included voice minutes, SMS messages and a few hundred MB or a GB or two for Internet access. Many people today, however, do not need endless amounts of voice minutes and SMS messages; they communicate mostly via the Internet. In effect, they have to pay for something they don't use. But now things are changing. In the last couple of weeks I have seen two offers that let prepaid subscribers choose the number of voice minutes, the number of SMS messages and the amount of data traffic per month independently of each other. The number of voice minutes and SMS messages can even be set to zero, with calls and messages then paid for per minute or per message instead.

An interesting offer for many customers, and I would not be surprised to see the number of voice minutes declining not only in fixed-line networks but now also in mobile networks, as we are way beyond peak telephony by now.