Just a few weeks after I was able to use LTE for the first time while roaming in France, I recently found myself in Belgium's capital for the weekend and could again benefit from LTE speeds while roaming. But how fast is it actually, and is there a bottleneck on the link to the home network? The latter is quite important, as all data is tunneled to the PDN-Gateway in the home network and from there to the Internet. As you can see in the image on the left, Mobistar in Belgium and my home network operator have provisioned the link with ample capacity and I could reach speeds of 20 Mbit/s in the downlink direction and 14 Mbit/s in the uplink direction on an LTE Band 20 (800 MHz) cell with a 10 MHz carrier in average signal conditions. Not too bad, I would say. And the ping delay of 62 ms for a roaming scenario is great as well.
Getting the RAT Indicator Back When Roaming…
So there we go, somewhere along the way Android lost the radio access technology (RAT) indicator over the signal bars when roaming. I wonder if that has something to do with Americans rarely leaving their country? Anyway, the important thing is that Android is open and flexible enough for someone to come up with an app to fix the issue. After looking around a bit I chose the "Network Type Indicator" app, and as it didn't want any suspicious rights I didn't hesitate to install it. It works as it should and my smartphone again feels as it should when I'm out of the country. It is even better than the original, as I can now see the network type much more easily when using the device for navigation in the car. Yes, I like to know what kind of network is around me when driving through the countryside…
Ubuntu Brain Transplant – How Far Back Can I Go?
I like having a good backup strategy and thus have a couple of Clonezilla images of my notebook's SSD. In case my notebook gets lost or stolen I can restore the image to a backup drive, overwrite the data partition with the latest weekly backup, put the result into another notebook and I'm up and running again in no time. The question I had, however, was how closely the hardware of the replacement notebook must resemble that of the original for Ubuntu to still be usable.
To find out just that, I recently restored a Clonezilla image of my PC's SSD to a backup drive and installed it in a 4 year old notebook with completely different CPU and graphics hardware, and in a 6 year old Atom-based notebook, again with a very different processor, GPU, screen, touchpad and Wi-Fi hardware. That can't possibly work now, can it!? Wrong! My Ubuntu 12.04 installation booted and ran perfectly on both systems. Graphics worked, the touchpad worked, the Wi-Fi worked, suspend/resume worked, everything just worked. I could hardly believe it.
Now try that with a Windows installation…
Switching to CyanogenMod – But There’s A Price To Pay For Freedom
While I've been using LTE since the very early days, it has mostly been for Internet connectivity so far. When it came to the smartphone in my pocket I was downright conservative and only recently switched to a Samsung Galaxy S4 that comes with LTE. When making the switch I decided it was also just the right opportunity to do something about bloatware, crapware, vendor specific launchers and spyware by installing a vendor independent Android flavor.
There are many 'mods' available these days and CyanogenMod is probably the best known. So I decided to give it a go, and the pretty much automated installer CyanogenMod offers for a handful of devices made it a quick and hassle-free adventure. Download an app to the S4, download the installer to a Windows PC and let both run. After a few interactions and about half an hour, my S4 booted with a vanilla CyanogenMod Android 4.4.2 image.
The automatic CyanogenMod installer also downloaded and installed the Google Play store, and while the device doesn't call 'home' to Google and others as much as vendor specific Android versions do, there are still frequent interactions with mtalk and other Google services. But since CyanogenMod offers a built-in root mode, that's easy to take care of by modifying the hosts file as I described here.
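To give an idea of what such a modification looks like, here's a minimal sketch of the kind of entry that could be added to the hosts file on the rooted device (which hostnames to redirect is of course a matter of personal preference, mtalk is simply the service mentioned above):

# /etc/hosts – redirect unwanted 'calling home' destinations to localhost
127.0.0.1    mtalk.google.com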
So here we go, my first smartphone with a custom firmware that's neither from the manufacturer nor directly from Google. A moment to savor, it's almost like in the PC world. But there's a price to pay as some features are missing or don't quite work as I would like them to. For example: When roaming, the status bar only displays an 'R' next to the reception quality bars and omits the network technology indicator. Also, I'm no longer able to disable GSM, something I like to do because I don't want to drop down to 2G for various reasons, even if that means being out of coverage every now and then. That's a small price to pay, however, as even many vendor supplied Android versions of devices with LTE don't allow locking to UMTS and LTE. Another thing that has disappeared is the Wideband-AMR capability the original Android version activated in the baseband on startup. Together with not showing the radio technology while roaming, that's what I miss the most.
Let's see, perhaps there's a way to get some of these things back. I'll keep you posted.
My Truecrypt Exodus Has Begun
A few weeks ago the Truecrypt project spectacularly imploded and Steve Gibson over at GRC has a good summary of the event. Many people, including me, always liked the idea of Truecrypt being open source but were quite skeptical of the authors being anonymous. Now that they've abandoned their project and also don't want it to be continued by anyone else it's time to think about alternatives.
Initially I used Truecrypt because there was nothing else available on Windows XP. Then I used its cross-platform compatibility to share a file container so I could use my Thunderbird email client on both Windows and Linux, depending on which machine I was using at the time. But I've moved on and I no longer have Windows machines at home, so cross-platform compatibility is no longer necessary.
As a consequence, I've been on the lookout for other options, and on Linux, dm-crypt looks like a good alternative. In Ubuntu, and likely also in other Linux distributions, dm-crypt is straightforward to use. When formatting a new hard drive or USB memory stick, Ubuntu's "Disk Utility" offers a simple way to encrypt the new volume with dm-crypt, as shown in the image on the left. When the USB stick is later plugged in again, the file manager automatically asks for the password. Perfect! Encrypting hard drives for backups works in a similar way.
So when I recently ran out of space on my backup hard drives and had to buy new ones, I went for dm-crypt instead of Truecrypt. Call it a natural migration away from Truecrypt without much pain, as my backup software doesn't care if and how a drive is encrypted. Also, it seems that not too far in the future, my 500 GB notebook SSD will need to be replaced with a 1 TB variant as those virtual machine images just keep growing. A good opportunity to ditch the Truecrypt container I use for especially sensitive data and replace it with a dm-crypt container file or partition.
Speaking of container files: It's also pretty straightforward to create and use dm-crypt encrypted container files with just a couple of commands. Here's a good overview in English and here's one in German.
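To give an impression of what's involved, here's a minimal sketch of how such a container file could be created and used with cryptsetup (the file name, size and mount point are just examples, the articles linked above go into far more detail):

# create a 1 GB file and turn it into a LUKS encrypted container
dd if=/dev/urandom of=container.img bs=1M count=1024
sudo cryptsetup luksFormat container.img
# open the container, create a filesystem inside and mount it
sudo cryptsetup luksOpen container.img mycontainer
sudo mkfs.ext4 /dev/mapper/mycontainer
sudo mount /dev/mapper/mycontainer /mnt
# when done, unmount and close the container again
sudo umount /mnt
sudo cryptsetup luksClose mycontainer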
Tcpdump, Netcat and Wireshark for Remote Logging
There are various ways to use Wireshark to trace anything from local traffic to Wi-Fi packets and even Bluetooth. Recently, I've added yet another variant to my bag of tricks: tracing on a Wi-Fi access point with forwarding of all captured data to my PC for online debugging. When everything is put together I can start tracing with a single command. Here's how it works:
The access point is a Raspberry Pi with VPN backhaul that I've recently put together. The VPN backhaul isn't needed for the tracing part at all, but the solution presented below lets me trace on the local Wi-Fi interface, on the backhaul interface (Wi-Fi or Ethernet) and on the VPN tunnel interface itself. On the local PC, netcat (nc) and Wireshark have to be installed. To start remote tracing, the data sink on the local PC and the forwarding of all data it receives to Wireshark have to be started as follows:
sudo nc -l 9999 | wireshark -k -S -i -
This command starts netcat (nc), tells it to wait for an incoming connection on port 9999 and pipes everything it receives to Wireshark, which reads the packets from standard input ('-i -'), starts capturing immediately ('-k') and updates the packet list in real time ('-S'). On the access point, tracing is started with the following command via an SSH shell:
sudo tcpdump -n -i wlan0 -s 65535 -w - not port 9999 | nc 192.168.55.9 9999
wlan0 is the interface to trace (use ifconfig to check which other interfaces are available) and 'not port 9999' excludes the traffic forwarded to the local PC from the capture, so the trace doesn't record its own forwarding traffic. All output of tcpdump is piped to netcat which in turn forwards it to the local PC with IP address 192.168.55.9 on port 9999. And that's it!
And here’s how to combine these two commands into a single command that is issued on the local PC:
ssh pi@192.168.55.1 'sudo tcpdump -n -i tun0 -s 65535 -w - not port 22' | wireshark -k -S -i -
This chain of commands establishes an ssh connection from the local PC to the access point (192.168.55.1) and launches tcpdump on the access point. Instead of netcat, the ssh connection is used to tunnel the data back to the local PC. This is why tcpdump must now exclude port 22 rather than port 9999 as in the example above. When the data arrives on the local PC it is piped to Wireshark in the same way as shown above. Two things may need to be in place for this to work: As I'm tired of typing in passwords, I'm using client certificates for the ssh connection as described here. And the second thing to put in place is that on the Wi-Fi access point Pi, sudo doesn't ask for a password either. On Raspbian that's the default but on other devices that might not be the case. Should that be a problem, the password prompt can be disabled in the 'sudoers' file for specific commands.
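For illustration, a minimal sketch of what such an entry in the 'sudoers' file could look like (the user name and the path to tcpdump are assumptions for a standard Raspbian setup, and the file should be edited with 'visudo'):

# let the user 'pi' run tcpdump via sudo without a password prompt
pi ALL=(ALL) NOPASSWD: /usr/sbin/tcpdump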
First LTE Contact Abroad
It has taken 4 years since LTE was launched in Germany for the necessary bits and pieces to fall into place for LTE roaming. But it finally happened a couple of weeks ago. When I arrived in southern France I was greeted by an "LTE" logo on the display when I switched on my phone after the plane landed in Marseille.
Sure, nobody was in a hurry to get LTE roaming in place as 3G networks still provide more than adequate speeds in Europe, and roaming prices were, and to a large degree still are, prohibitive when it comes to 'really' using LTE, i.e. downloading tons of data at amazing speeds. But I won't complain, it's here now, roaming prices are falling and I'm looking forward to seeing the "LTE" logo much more often when traveling.
A Gigabyte Of Data Through My VPN Per Day While I’m Traveling
I still remember the days 7 years ago when I was all excited about Vodafone's new WebSession program that offered mobile Internet access while roaming. 15 euros bought 50 MB in those days. At the time it was enough to keep me connected when I was traveling on business, with some volume to spare. Prices have fortunately come down over time, and when the Internet access over the hotel Wi-Fi and in meeting rooms is working, I am quite amazed at how much data flows over the wire when I don't have to think about cost: On a recent business trip, my data consumption at the meeting venue and in the hotel exceeded 1 gigabyte each day according to my VPN log.
Granted, I didn't hold back and simply did the same things I would do in the office or at home:
- Webradio in the morning
- Up- and download of large files throughout the day
- Emails with large file attachments
- Streaming videos from Youtube in the evening
- Skype video calls with excellent quality (2+ Mbit/s bi-directionally)
- Uploading pictures I have taken during the day to my Owncloud
- Remote screen sessions for IT support
- … and of course web browsing for research, fun and information
50 MB per day 7 years ago, 1000 MB today. Quite a steep change!
Running On Owncloud 7 Now – With Cloud to Cloud Sharing!
Last week the Owncloud community released version 7 of their fabulous cloud at home software and I could of course not wait to update. There are lots of changes that made it into 7 and I'd say the major ones are focused on making management of larger installations easier. But there are also a number of new things that are interesting for my much more limited usage scenario.
The first improvement, even if it seems minor, is the improved user interface for creating a new calendar entry via the web interface. While I usually create new entries either on a mobile device or in Thunderbird/Lightning, every now and then I also use Owncloud's own web interface, which has been a bit clumsy so far. It's much better now.
Another thing I noticed is that files can now be sorted by date, name and size. Also, it's now possible to get a list of all files I have shared with others and of all files that have been shared with me. Very helpful, too!
Also on the list of must-haves from now on is the Activity Stream view that shows which files and directories I have created, deleted, copied, shared, etc. and when. So far, I've never bothered to configure email notifications for various purposes, such as automatically sending a notification to someone I shared files with, or getting an automatic email when a family member also using my Owncloud has shared something with me. Configuring the email notification settings is much improved in OC7 and can be done right from the web interface in the admin section. Reason enough to configure it, and it only took me two minutes. Great, never without it from now on!
And one feature that I find particularly interesting, also from a conceptual point of view, is the new inter-cloud sharing. I've noticed that when I share a file or directory via a link, there's an "Add to your Owncloud" button now. In other words, if someone shares files on their Owncloud, I can put them directly into my own Owncloud. I haven't yet tried out how this works under the hood, but from what I've been able to read up on, it is based on the WebDAV protocol. Sharing between clouds, now that's a novel and interesting concept!
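As a side note for the curious: an Owncloud instance's files can in general also be reached directly over WebDAV, which gives an idea of the mechanics behind such a feature. A minimal sketch, assuming davfs2 is installed and with the server URL and mount point as placeholders:

# mount the Owncloud WebDAV endpoint on a Linux machine
sudo mount -t davfs https://example.com/owncloud/remote.php/webdav /mnt/owncloud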
There are tons of other new features I haven't tried yet, have a look here for the details. Upgrading from Owncloud 6 to 7 was quick and flawless and all my calendar and address book clients still synchronize as they should.
Congratulations to the Owncloud community, it's an awesome new version!
Using Man In the Middle Proxy (mitmproxy) on a Raspberry Pi
It's good to see that more and more programs use secure http (https) to encrypt the data they send and receive over the network. Especially over public hotspots and with the prying eyes of security agencies around the globe, there's no alternative to it. The downside, however, is that the data is also concealed from one's own debugging and analysis. But there's a solution: mitmproxy!
Mitm stands for 'Man In The Middle', as the software splits an SSL connection into two parts and decrypts the data in the middle. To do this, the device or program under test has to be configured for http proxying. On Android and other mobile operating systems this is part of the Wi-Fi setup. On PCs it's part of the web browser configuration. Once done, all http and https requests are sent to mitmproxy, which terminates the secure link and opens another secure connection towards the destination. As mitmproxy doesn't have a valid SSL certificate for the destination, it creates a certificate of its own on the fly and sends that to the client device. As mitmproxy can only sign this certificate with its own private key, an error message pops up in the web browser every time an https protected site is visited and the user has to manually confirm to proceed. That's how it should be, because security is broken when an SSL connection is not terminated at the destination.
To stop these error messages, mitmproxy offers an easy way for devices to import its certificate authority credentials. On Android and other platforms this is as simple as surfing to a given URL and pressing OK on the dialog that pops up asking whether the certificate should be added to the certificate store. Almost too easy from a security point of view.
That leaves the question of where to run mitmproxy. As the software is written in Python it can easily be installed on Linux PCs and also on Mac OS; the IP address of that device is then used in the proxy configuration of the device or software under test. But as I'm always a bit reluctant to install software on my PC that I don't need on a daily basis, I decided to install it on a Raspberry Pi instead. As the software is installed and compiled from source, it takes about half an hour to install on the Pi with its limited processing power. However, it's well worth the wait.
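For reference, here's a minimal sketch of how the installation and startup could look on the Pi (the package list is an assumption based on a typical Raspbian setup and the listening port shown is just the default):

# install the build dependencies and then mitmproxy itself via pip
sudo apt-get install python-pip python-dev libffi-dev libssl-dev
sudo pip install mitmproxy
# start the interactive proxy, listening on port 8080
mitmproxy -p 8080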