When A Book Still Trumps Online Search

Search engines have revolutionized the way many people, including myself, learn new things. Before the days of the Internet, working on a specific programming problem sometimes meant reading manuals and trying out a lot of different things before anything worked. These days a quick search on the Internet usually reveals several solutions to choose from. While this works well, especially for 'small scale' problems, I still feel much more comfortable reading a book when things get more complex.

Take my recent adventures with Apache, PHP and MySQL for example. I am sure I could have learnt a lot by just using online resources and 'googling' my way through it, but there are quite a number of things that have to fall into place at the beginning: setting up the Eclipse development environment, installing XAMPP, getting the PHP debugger up and running, learning how PHP applies the concepts of functional and object-oriented programming, learning how to work with SQL databases in this environment, and so on. As the combination of these things goes far beyond what a single online search could return, I decided to pick up a book that shows me how to do these things step by step instead of trying to piece things together on my own.

For this particular adventure I decided to go for the 'PHP and MySQL 24-Hour Trainer' book by Andrea Tarr. While it was already written back in 2011, it covers Apache, PHP and MySQL basics, and these haven't changed very much since then. The book is available in both print and ebook editions and while I went for the printed edition I think the ebook version would have worked equally well for me. In other words, book vs. online search is not about offline vs. online, it's more about having all the required information in a single place.

It's interesting to observe that at some point a cutover occurs in the learning process: once the basic things are in place and the problem space becomes narrow enough, it becomes easier to use an online search engine to find answers to specific questions rather than to explore the topic in a book. A perfect symbiosis, I would say.

My Raspberry Pi Servers and Heartbleed

Unless you've been living under a rock for the past 24 hours you've probably heard about 'Heartbleed', the latest and greatest SSL/TLS vulnerability, which Bruce Schneier rates an 11 on a scale from 1 to 10. Indeed, it's as bad as it can get.

As I have a number of (Debian based) Raspberry Pi servers on which I host my Owncloud, Selfoss and a couple of other things, I was also affected and scrambled to get my shields back up. Fortunately the Raspberry Pi folks reacted quickly and made the fixed SSL packages available in the Raspbian repository. Once that was done I got a new SSL certificate for my domain name, distributed it to my servers and then updated all the passwords used on those systems. Two hours later… and I was done.
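One detail that is easy to forget: after the library update, all services that use OpenSSL (the web server, for example) need to be restarted, or the box simply rebooted, as a patched library on disk doesn't help a process that still has the old version loaded in memory.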

And here are two quotes from Bruce's blog that make quite clear how bad the situation really is:

"At this point, the odds are close to one that every target has had its private keys extracted by multiple intelligence agencies."

and

"The real question is whether or not someone deliberately inserted this bug into OpenSSL"

I'm looking forward to the investigation into who's responsible for the bug. As 'libssl' is open source it should be possible to find out who modified that piece of code in 2011.

The Joy Of Open Source: You Can Fix It Yourself

Over the past months I've learnt a lot about Apache, PHP and MySQL in my spare time as I wanted to implement a database application with a web front end for my own purposes. While the effort would have probably been too high just for this, I was additionally motivated by the fact that learning about these tools also gives me a deeper understanding of how web based services work under the hood.

Databases have always been a weak point of mine, as I have had little use for them so far. After this project I have gained a much better understanding of MySQL and SQLite and feel comfortable working with them. What a nice side effect.

And in addition, the knowledge gained helps me to better understand, debug and fix issues in the open source web based applications I use on a regular basis. A practical example is Selfoss, my RSS aggregator of choice that I've been using ever since Google decided to shut down their RSS reader product last year. While I am more than happy with it, the feed update procedure stops working for a while every couple of months. When it happened again recently I dug a bit deeper with the knowledge I have gained and found out that the root cause was links to non-existent web pages that the update process tried to load. Failing to load these pages caused the whole update process to abort. A few lines of code and the issue was fixed; a sketch of the idea follows below.
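To illustrate the concept, here's a minimal sketch of what such a fix looks like. This is not the actual Selfoss code and the feed URLs are just placeholders; the point is simply that a feed whose page can't be loaded is skipped instead of taking down the whole update run:

<?php
// Minimal sketch, not the actual Selfoss code: fetch each feed URL and skip
// the ones that cannot be loaded instead of aborting the whole update run.
$feeds = array(
    'http://example.com/working-feed.xml',
    'http://does-not-exist.example.com/feed.xml',
);

foreach ($feeds as $url) {
    // The @ suppresses the PHP warning; false means the page could not be loaded
    $content = @file_get_contents($url);
    if ($content === false) {
        error_log("Skipping unreachable feed: $url");
        continue; // carry on with the next feed instead of stopping here
    }
    // ... parse $content and update the database here ...
    echo "Updated: $url\n";
}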

I guess it's time to learn about 'git' now so I can not only fix the issue locally and report it to the developer but also contribute the fix upstream, and potentially further features I'm developing, myself. Open source at its best!

Firefox Shows HTTPS Encryption Algorithm

By chance I recently noticed that at some point Firefox started to show detailed information on the encryption and key negotiation method used for an HTTPS connection. As I summarized in this post, there are many methods in practice, offering different levels of security, and I very much like that Wireshark tracing is no longer required to find out what level of security a connection offers.
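For example, the page info dialog in a current Firefox shows something like 'TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, 128 bit keys, TLS 1.2' for a connection with perfect forward secrecy; the exact string obviously depends on what browser and server negotiate.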

T-Mobile USA Joins Others In the 700 MHz Band

A couple of days ago I read in this news report that T-Mobile USA has acquired some lower A-Block 700 MHz spectrum from Verizon that it intends to use for LTE in the near future. As an outsider, US spectrum trading just leaves me baffled, as network operators keep exchanging spectrum for spectrum and/or money. To me it looks a bit like the 'shell game' (minus the fraud part of course…).

Anyway, last year I did an analysis of how the 700 MHz band is split up in the US to get a better picture, and you can find the details here. According to my hand-drawn image and the more professionally done drawings in one of the articles linked in that post, there are only 2×5 MHz left in the lower A-Block, the famously unused lower part of band 12. Again, the articles linked in my post last year give some more insight into why band 12 overlaps with band 17. Also, this Wikipedia article lists the company for band 12/17.

So these days it looks like the technical issues that kept the lower 5 MHz of band 12 unused have disappeared. How interesting, and it will be even more interesting to see whether AT&T will at some point support band 12 in their devices or stick to the band 17 subset. Supporting band 12 would of course mean that a device could be used in both T-Mobile US's and AT&T's LTE networks, which is how it has been done in the rest of the world on other frequency bands for ages. But then, the US is different and I wonder if AT&T has the power to continue to push band 17.

One more thought: 2×5 MHz is quite a narrow channel (if that is what they bought, I'm not quite sure…). Typically 2×10 MHz (in the US, Europe and Asia) or 2×20 MHz (mainly in Europe) channels are used for LTE today, so I wouldn't expect speed miracles on such a channel.
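As a rough back-of-the-envelope figure: a category 4 device peaks at a theoretical 150 Mbit/s on a 2×20 MHz carrier, and LTE throughput scales more or less linearly with the channel bandwidth, so a 2×5 MHz channel tops out somewhere around 37 Mbit/s under ideal conditions, with real-world speeds well below that.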

Fake GSM Base Stations For SMS Spamming

If you think fake base stations are 'only' used as IMSI-catchers by law enforcement agencies and spies, this one will wake you up: this week, The Register reported that 1500 people have been arrested in China for sending spam SMS messages over fake base stations. As there is ample proof of how easy it is to build a GSM base station with the required software behind it to make it look like a real network, I tend to believe that the story is not an early April fool's joke. Fascinating and frightening at the same time. One more reason to keep my phone switched to 3G-only mode, as in UMTS the mutual authentication of network and device prevents this from working.

USB 3.0 In Mobile Chipsets

Today I read for the first time, in this post over at AnandTech, that chipsets for mobile devices such as smartphones and tablets now support USB 3.0. I haven't seen devices on sale yet that make use of it, but once they do one can easily spot them, as the USB 3.0 Micro-B plug is backwards compatible but looks different from the current USB 2.0 connector on mobile devices.

While smartphones and tablets are still limited to USB 2.0 in practice today, most current notebooks and desktops support USB 3.0 for ultra fast data exchange with a theoretical maximum data rate of 5 Gbit/s, roughly 10 times faster than the 480 Mbit/s offered by USB 2.0. In practice USB 3.0 is most useful when connecting fast external hard drives or SSDs to another device, as these can be much faster than the sustainable data transfer rate of USB 2.0, which is around 25 MByte/s in practice. As ultra mobile devices such as tablets are replacing notebooks to some extent today, it's easy to see the need for USB 3.0 in such devices as well. And even smartphones might require USB 3.0 soon, as the hardware has become almost powerful enough for them to be used as full-fledged computing devices like notebooks in the not too distant future. For details see my recent post 'Smartphones Are Real Computers Now – And The Next Revolution Is Almost Upon Us'.

Making use of the full potential of USB 3.0 in practice today is difficult even with notebooks that have more computing power than smartphones and tablets. This is especially the case when data has to be decrypted on the source device and re-encrypted on the target device, as this requires the CPU to get involved. It is much slower than when the data is unencrypted and can simply be copied from one device to another via fast direct memory access that does not require the CPU. In practice my notebook with an i5 processor can decrypt + encrypt data from the internal SSD to an external backup hard drive at around 40 MByte/s. That's faster than the maximum transfer speed supported by USB 2.0 but way below what is possible with USB 3.0.
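To put those rates into perspective with some rough numbers: copying a 100 GByte backup takes a bit over an hour at USB 2.0's practical 25 MByte/s, around 40 minutes at the 40 MByte/s my encrypted setup manages, and well under 15 minutes at the 150+ MByte/s an external SSD can sustain over USB 3.0 when no encryption slows things down.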

German Email Providers Are Pushing Encryption

Recently I reported that my German email providers have added perfect forward secrecy to their encryption for email exchange via POP and SMTP. Very commendable! Now they are going one step further and are warning customers by email when they use one of these protocols without activating encryption (which is the default in most email programs). In addition they give helpful advice on how to activate encryption in email programs and announce that they will soon stop supporting unencrypted email exchange altogether. This of course doesn't prevent spying in general, as emails are still stored unencrypted on their servers, so those with legal and less legal means can still obtain information about targeted customers. But what this measure prevents is mass surveillance of everyone, and that's worth something as well!
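For those who need to change their settings: the encrypted variants typically use port 995 for POP3 over SSL/TLS, 993 for IMAP over SSL/TLS and port 587 with STARTTLS (or 465 with SSL) for sending via SMTP, though the exact ports and server names depend on the provider.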

How To Securely Administer A Remote Linux Box with SSH and Vncserver

In a previous post I told the story of how I use a Raspberry Pi in my network at home to get to the web interfaces of my DSL and cellular routers in case of an emergency. In light of recent breaches of web interfaces that are reachable on the WAN side of the router, I guess it can be seen as a reasonable precaution. Only two things are required for it: a VNC server running on the Linux box and SSH. If I had known it was so easy to set up and use, I would have done similar things years ago.

And here's how to set it up: On the Raspberry Pi I installed the 'vncserver' package. Unlike packages such as 'tightvncserver', this version of VNC starts its own X server graphical user interface (GUI) that is invisible to the local user, rather than exporting the screen the user can see. This is just what I need in my case because there is no local user on the Raspi and no X server is started anyway. When I'm in my home network I can access the GUI over TCP port 5901 with a VNC client, and I use Remmina for that purpose. Once there I can open a web browser and access the web interfaces of my routers. Obviously that does not make a lot of sense when I'm at home, as I can access the web interfaces directly rather than going through the Pi.
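Starting the server on the Pi is then a one-liner along the lines of the following (the display number ':1' maps to TCP port 5901; geometry and color depth are a matter of taste, and the exact options may differ slightly between VNC packages):

vncserver :1 -geometry 1280x720 -depth 24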

When I am not at home things are a bit more difficult, as just opening port 5901 to the outside world for unencrypted VNC traffic is out of the question. This is where SSH (secure shell) comes in. I use SSH a lot to get a secure command line interface to my Linux boxes, but SSH can do a lot more than that. It can also tunnel any remote TCP port to any local port. In this scenario I use it to tunnel TCP port 5901 on the Raspberry Pi to port 5901 on the 'localhost' interface of my notebook with the following command:

ssh -L 5901:localhost:5901 -p 43728 pi@MyPublicDynDnsDomain.com

The command may look a bit complicated at first but it is actually straightforward. The '-L 5901:localhost:5901' part tells SSH to connect the remote TCP port 5901 to the same port number on my notebook. The '-p 43728' tells SSH not to use the standard port but another one, to avoid automated scanners knocking on my door all the time. And finally, 'pi@MyPublic…' is the username on the Pi and the dynamic DNS name to reach the Raspi over the DSL or cellular router via port forwarding.

Once SSH connects and I have typed in the password, the VNC viewer can simply be directed to 'localhost:1' and the connection to the remote port 5901 is automatically established through the SSH tunnel. It's easy to set up, ultra secure and a joy to use.

If You Think Any Company Can Offer End-to-End Encrypted Voice Calls, Dream On…

I recently came across a press announcement that a German mobile network operator and a security company providing hardware and software for encrypted mobile voice calling have teamed up to launch a mass market secure and encrypted voice call service. That sounds cool at first, but it won't work in practice, and they even admitted it right in the press announcement when they said in a somewhat nebulous sentence that the government would still be able to tap into 'relevant information'. I guess that disclaimer was required, as the laws of most countries are quite clear that telecommunication providers must enable lawful interception of call content and metadata in real time. In the US, for example, this is required by the CALEA wiretapping act. In other words, network operators are legally unable to offer secure end-to-end encrypted voice telephony services. Period.

Wiretapping to get the bad guys once sounded like a good idea and still does. But from a trust perspective, the ongoing spying scandal shows more than clearly that anything less than end-to-end encryption is prone to interception at some point in the transmission chain by legal and illegal entities. Also, when you think about it from an Internet perspective, CALEA and similar laws elsewhere are nothing less than requiring web companies to provide a backdoor to their SSL keys so that HTTPS-secured traffic can be intercepted. I guess lawmakers can be glad they came up with CALEA and similar laws before the age of the Internet. I wonder if this would still fly today?

Anyway, this means that to be really secure against wiretapping, a call needs to be encrypted end-to-end. As telecom companies are not allowed to offer such a service, it needs to come from elsewhere. It also means that there can't be a centralized hub for such a service: it needs to be peer-to-peer without centralized infrastructure, and the code must be open source so a code audit can ensure there are no backdoors. And no company can run the service either, as it would be pressured and probably required by law to put in a backdoor (see e.g. Lavabit and Silent Circle).

This is quite a challenge and requires a complete rethinking of how to communicate over the Internet in the future at least for those who want privacy for their calls. And without companies being able to provide such a service it's going to be an order of magnitude more difficult. For private individuals this probably means that they have to put a server at home for call establishment and to tunnel the voice stream between fixed and mobile devices behind firewalls and NATs. For companies it means they have to put a server on their premises and equip their employees with secure voice call apps that contact their own server rather than that of a service provider.

While I already do this for instant messaging between members of my household (see my article on Prosody on a Raspberry Pi), I still haven't found something that could rival Skype in terms of ease of use, stability and voice + video quality.