Raspi, Ubuntu and Co and How They Compare to Cooking vs. Instant Food

The title of this post is perhaps a bit long-winded, but it contains an interesting analogy I recently came up with while explaining to a friend the concept of hosting my own cloud services at home rather than using similar services from large Internet-based companies.

Having my own cloud services at home and maintaining them versus using the services of Internet-based companies is, to me, like doing some real cooking versus just consuming instant food. Internet-based companies offer services that are relatively easy to set up and use, i.e. quite similar to instant food that heats up in the microwave in a few minutes. There's a price to pay, however: a less tasty meal in the one case, perhaps, and sharing your private data for analysis and use by an Internet company in the other.

Hosting your own services at home is obviously more work, just as preparing a meal from fresh ingredients is more work. However, my data remains private and secure, which is similar to a self-prepared meal being tastier, provided the cook knows what he is doing.

I like the analogy because it also fits when people say "great that you do it, but I lack the skills and time to do it myself". I hear the same argument when people talk about instant food vs. cooking. Sure, if you've never stood in front of a stove before, learning how to cook is likely to be a challenge, and someone to help get you started is surely a good thing. The same is true for hosting your own services at home or switching to an open source operating system such as Ubuntu Linux.

To summarize: Yes, when it comes to using cloud based services and operating systems I definitely prefer cooking to instant food.

Leaking Intercepted Phone Calls Becomes ‘En Vogue’

I am amazed at the current steady flow of intercepted phone calls that are leaked in some form or shape. Take the phone calls of the Turkish prime minister, the leaked phone call of the US diplomat who found quite strong words for the foreign policy of the EU, or the phone call between the EU foreign representative and the Estonian foreign secretary discussing the situation in Ukraine as prime examples. What few reports ask is who intercepted those calls, who might have leaked them and what their motive was in doing so. Leaving the political questions aside in this post, I think it is safe to assume that the majority of those phone calls were not lucky intercepts by a teenager but the work of professionals in the employment of one state or another. Also, I think it's a safe assumption that these leaks are just the tip of the iceberg. A nice thing about leaked phone calls is that they literally speak for themselves, compared to documents whose authenticity is much harder to prove to the public.

By now it must have dawned on most politicians that they simply have to assume that the majority of their calls are intercepted and recorded if no end-to-end encryption is used. It's not that they wouldn't want to use devices that offer end-to-end encrypted calls, but from what I can tell such devices are still cumbersome to use, and interoperability between devices of different makers is virtually non-existent. But perhaps this constant flow of leaked phone calls will trigger people to rethink their position and create bigger demand for interoperable and easy-to-use devices and applications for end-to-end encrypted calls that are affordable not only for them but for the general public as well. I for one would welcome it, as I think it's not only politicians who by now have no privacy anymore when making a phone call.

NAT Is The Main Inhibitor For Self Hosted Cloud Services

Lots of people I talk to like the idea of having a box at home that can be accessed remotely from notebooks, smartphones and tablets to synchronize private data such as calendar and address book information. They'd like it because they'd rather have their private data at home than give it up to companies that store it in a country far, far away and make money by analyzing it and selling advertising in some form or shape. Sooner or later, however, there's always a sentence like 'yes, you [Martin] can do it, but I have no idea how to go about it'. At that point I'd really like to say: 'gee, no problem, just buy box X, connect it to your DSL or cable router at home and you are done'. Unfortunately, that's just not where we are today.

To make self-hosted services for the masses a reality, however, it's exactly such a plug-and-play setup that is required. Anything less and it won't work. I have no problem imagining how most setup steps could be automated. A company could take open source software such as Owncloud, package it on inexpensive hardware such as a Raspberry Pi or even a NAS disk station, and write intelligent setup software that automatically handles tasks such as registering a DynDNS domain, obtaining an SSL certificate for the domain, and offering really simple-to-configure mobile OS connectors for calendar and contact synchronization. Money can be made with all of these steps, and if reasonably priced I think there's a market for this.

But there are also some technical hurdles that are a bit more tricky. The major one is Network Address Translation (NAT) in the DSL and cable routers found in homes today. For tech-savvy users it's obviously easy to configure a port forwarding rule, but for the average user it is an insurmountable obstacle. The Universal Plug and Play (UPnP) protocol implemented in most residential gateways offers the means to do this automatically and is used by programs such as Skype. Unfortunately this functionality is a significant security risk, and many router vendors and DSL/cable network operators have decided to disable it by default. On top of that, many DSL and cable networks today no longer assign public IPv4 addresses to residential connections, preventing even tech-savvy people from running servers at home.
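To illustrate what UPnP automates when it is enabled: the `upnpc` command line client from the miniupnpc project can ask the gateway for a port forwarding without touching the router's web interface. This is just a sketch; the IP addresses and ports are placeholders, and it only works against a gateway that has UPnP/IGD switched on.

```shell
# Ask the residential gateway (via UPnP/IGD) to forward external
# TCP port 8443 to the home cloud box at 192.168.1.50, port 443
upnpc -a 192.168.1.50 443 8443 TCP

# List the port mappings the gateway currently has configured
upnpc -l

# Remove the mapping again when it is no longer needed
upnpc -d 8443 TCP
```

An automated setup routine on a home cloud box could do exactly this on first boot, which is why disabling UPnP, however sensible from a security point of view, makes plug-and-play self-hosting so much harder.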

Basically, I can see two solutions for this: One would be to have Owncloud and other services integrated into the DSL/cable routers. That's probably the easiest way but it would limit the opportunity to the few companies working on residential routers. The other solution could be for the home cloud box to establish a VPN tunnel to an external service from which it gets a public IPv4 address. Possible, but not ideal as it would introduce a single point of failure.
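A lightweight variant of the second solution, assuming you have a shell account on some external server with a public IP address, is an SSH reverse tunnel instead of a full VPN. Hostname, user and ports below are placeholders, not a recommendation of a particular service.

```shell
# Run on the home cloud box: expose its local HTTPS port 443
# as port 8443 on a rented VPS with a public IP address.
# Clients then connect to vps.example.com:8443 from anywhere.
ssh -N -R 8443:localhost:443 user@vps.example.com

# autossh (if installed) re-establishes the tunnel automatically
# after connection drops, which a home DSL line will have:
autossh -M 0 -N -R 8443:localhost:443 user@vps.example.com
```

The single-point-of-failure problem mentioned above remains, of course: if the external server goes away, so does the reachability of the box at home.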

So perhaps IPv6 will come to the rescue at some point!? Unfortunately, that help will not come tomorrow. In addition, I can't help but wonder whether DSL/cable routers will include IPv6 firewall functionality at some point that blocks incoming connections for security reasons. If so, we are back to square one and need a clever, secure and standardized way to automate the initial connectivity configuration.

Plan-B Tales About My Home Cloud

One tiny downside of running cloud-based services at home, such as Owncloud file, calendar and address book synchronization, VPN services, an instant messaging server, etc., is that one becomes dependent on the power company and Internet provider to keep the services reachable while away from home. And every now and then things go wrong. Back in December I had a two-hour power outage that I managed to detect with my GSM-enabled power socket, which sent me an SMS once power was restored, so that angle is covered. To survive DSL outages I have a fallback solution over wireless in place. And that's just what I needed recently when my DSL line failed for two days.

While it worked rather well, it also demonstrated just how many self-hosted services I use today, and for which of them the fallback solution ensured service continuity and for which it didn't. So here's the story:

In addition to the DSL router for normal operation I have a cellular router in place for backup Internet connectivity over a different default gateway IP address. This router also registers a backup dynamic DNS address, so I can still access the network remotely when the DSL line fails. One more thing I need to switch my services to the backup line is a way to remotely change the default gateway addresses of my servers away from the DSL router and towards the cellular router during an outage. For this purpose I use a secure shell (SSH) login on a box in the network that I can reach over the cellular connection: a separate Raspberry Pi to which I have enabled port forwarding from the cellular router over a non-standard TCP port, so I can securely reach it via SSH using the backup dynamic DNS address. Once I'm logged into this machine I can SSH into my other routers to change the default gateway and DNS server and then restart the network stack on them.
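On a typical Linux box the gateway and DNS switch described above boils down to a few commands. This is a sketch only; the router addresses and the DNS server are example values, and depending on the distribution a restart of the network service may be needed on top.

```shell
# After logging into a server over the backup path:
# point the default route at the cellular router (e.g. 192.168.1.2)
# instead of the failed DSL router (e.g. 192.168.1.1)
sudo ip route replace default via 192.168.1.2

# switch name resolution to a DNS server reachable over the backup line
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# verify that the default route now points at the cellular router
ip route show default
```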

The last thing that remains to be done during a DSL outage is to switch the dynamic DNS domain I use for my services away from the DSL router and towards the cellular router. Once that is done, I have my main services back in operation. In addition, I can use the Raspberry Pi's vncserver to remotely get a GUI on a machine inside my home network and use a browser to access the web interface of the routers for maintenance. Again, the SSH connection helps to securely access the VNC server, and I'll describe in a second post how that works.
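Both of these steps can also be done from the command line on the Raspberry Pi. The update URL below follows the widely used DynDNS v2 update protocol, but the exact URL depends on the dynamic DNS provider; hostnames, ports and credentials are placeholders.

```shell
# Point the dynamic DNS name at the cellular router's current
# public IP (DynDNS v2 style update; the provider URL may differ)
curl -s "https://user:password@members.dyndns.org/nic/update?hostname=myhome.example.org"

# Tunnel the Raspberry Pi's VNC display (port 5901) through the
# SSH connection on the non-standard port, then point a VNC
# viewer at localhost:5901 on the local machine
ssh -p 2222 -L 5901:localhost:5901 pi@backup.example.org
```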

So while this works very well there are a number of quirks:

The first is that most cellular network operators no longer assign public IP addresses, which is, however, a requirement for this to work. Fortunately my cellular operator offers a dedicated APN that still assigns them, but that seems to be a rarity these days.

The second thing that makes the use of the backup solution somewhat of a pain in practice is that the cellular router doesn't recognize that when I'm at home and use my domain name to access my cloud services, it should loop the packets back internally instead of sending them out to the network, where they are lost. That means that while I'm in the home network I can't reach my services over the default domain name. My solution for this is to use a VPN to connect to an external VPN service so the loopback is performed externally. Not ideal, but the amount of data that goes back and forth is not very large.

Another thing is that my own VPN service doesn't work while I'm using the backup solution, because the cellular router doesn't have an option to create static routing entries for the IP address range and subnet my VPN server uses for its clients. While I could live without the VPN server for a while, as I can also use an external VPN service, it limits my ability to give remote support when I am not at home: I use my home VPN service as part of that solution whenever I'm behind a NAT myself and thus not reachable for reverse VNC connections.
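For reference, the static route the cellular router is missing is a one-liner on any Linux-based router. The VPN client subnet and the server's LAN address below are assumptions for illustration, not my actual configuration.

```shell
# Route traffic for the VPN client subnet (assumed 10.8.0.0/24)
# back to the LAN address of the VPN server (assumed 192.168.1.50),
# so replies to VPN clients find their way into the tunnel
sudo ip route add 10.8.0.0/24 via 192.168.1.50
```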

So while by and large the backup solution works, there are some shortcomings that would take some more tinkering to overcome. But o.k., it's a backup solution, so I can live with that for a while. And yes, agreed, this is not something non-techies would set up at home, so it's by no means a solution for the masses.

New Buildings, Insulation and Coverage Issues

A little rant today on new buildings and coverage issues, as I keep hearing such reports with increasing frequency:

When GSM was launched in the 1990s, the windows of most buildings were made of plain glass, and while there was some signal loss through them, by and large things worked pretty well. Over the last couple of years, however, new buildings, especially offices, have been equipped with heat-insulating windows that keep not only the heat or cold out but also radio waves. Their effect is pretty dramatic: excellent coverage outside the building, no coverage whatsoever inside. Hotels, offices, shopping malls, you name it, it's getting more difficult to get coverage into those buildings from macro cells on the outside. While shopping malls are often equipped with indoor coverage via repeaters or small cells, hotels and office buildings usually are not. An exception I have noticed are 4+ star hotels in Asia, while their European counterparts usually don't bother. Sure, there are solutions for this that work great, such as repeaters, distributed antenna systems, small cells, femto cells, etc., but they all require active interaction between network operators and the owners of the buildings, i.e. extra work. Extra work many building owners are so far unwilling to do. I wonder how much critical mass it will take in terms of new buildings before network operators take a pro-active approach to this!?

How A Window Ends Up On The Screen

Back in my college days I had a course on computer graphics and how elements such as windows, buttons, input boxes, etc. end up on the screen (both on the desktop and on mobile, from today's perspective) and how they can overlap and disappear behind each other. But that was quite some time ago, so I was quite glad to have stumbled over a quick refresher on the difference between X and Wayland here. The first part of the post is easily understandable for those with a general background in how a desktop is rendered, while the second part is quite a deep dive. But even if you don't want to go down that deep, the post is still worth reading.

P.S.: And no I don't want to get into the debate of Wayland vs. Mir.

32 Bit On the PC, 64 Bit On Mobile

It's a bit of a paradox: I'm using the 32-bit version of Ubuntu on the PC, while on one of my mobile phones a 64-bit operating system is in operation.

There is one particular reason that made me stick with the 32-bit version of Linux on my current notebook. When I set it up around two years ago, I wasn't sure how backwards compatible the 64-bit version of the OS would be with the 32-bit Windows programs I still have to run every now and then via Wine for compatibility reasons. Perhaps it's no issue at all, but I saw no risk or disadvantage in staying on 32 bits, as Linux, unlike the 32-bit version of Windows 7 which is restricted to 3 GB, can still use the full 8 GB of RAM on my machine via PAE (Physical Address Extension). Each program is restricted to 4 GB of addressable memory, but on a notebook not running enterprise-scale applications that is not an issue at all. It's definitely a kludge to ensure backwards compatibility, though.
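The numbers behind this can be sketched quickly: a 32-bit virtual address reaches 2^32 bytes per process, while PAE widens physical addressing to 36 bits, which is why the full 8 GB of RAM remains usable.

```python
# 32-bit virtual addresses: 2^32 bytes per process
per_process_gb = 2**32 // 2**30
print(per_process_gb)        # 4 GB addressable per program

# PAE extends physical addressing to 36 bits
pae_physical_gb = 2**36 // 2**30
print(pae_physical_gb)       # 64 GB of physical RAM addressable

# 8 GB of RAM therefore fits comfortably under the PAE limit,
# even though no single program can map more than 4 GB of it
print(8 <= pae_physical_gb)  # True
```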

As there are fewer reasons on mobile platforms to be backwards compatible, it is quite a logical step to start the transition to 64 bits now. Some devices such as tablets already have 2 GB of RAM today, and going beyond the 4 GB threshold is likely not far away anymore. So it's a good thing companies are thinking about 64 bits now instead of coming up with things like PAE on mobile just for the sake of dragging along old stuff.

So there we go, I'll have to live with my 32 bit PC versus 64 bit mobile paradox for just a little while longer.

7 Macs vs. 2 Windows Boxes in the Valley

Observation of the day: I find it quite interesting how Apple continues to gain mind share and users. When I recently had breakfast in Silicon Valley, most people in the café were hunched over a MacBook. My count was 7 Macs vs. only 2 Windows notebooks. Sure, Silicon Valley is a special case, but I still think it is not much more than an amplification of a general trend that can be observed elsewhere as well.

3G Roaming to the US – Throw Away Local SIMs

It is interesting how getting connected via cellular as a visitor in the US has changed over the years from one extreme to the other. While prepaid SIM cards with Internet access have been available in many countries around the world for the past 5 to 6 years, I struggled for many years to get the same thing in the US. At some point AT&T offered mobile data on prepaid SIMs, but one had to find an AT&T store and then fiddle with a pretty rough web front-end to activate a data option. Not ideal.

But things have significantly changed recently. When preparing for a business trip to the US, I noticed that a number of MVNOs (virtual network operators) have sprung up that offer prepaid SIMs for mobile Internet access specifically for international visitors. Here's a PC Mag article that gives a good overview. I opted for a Ready SIM 500 MB data-only SIM for around $20 (including shipping) that remains active for 14 days after first activation. Other options that include voice and a bigger data bucket are also available. It's a use-and-throw-away SIM, as there's not even a top-up option. That probably means they have a pretty lean back-end system 🙂

The only catch was that I had to find a mobile device that supported Wi-Fi tethering and the US 3G frequency bands of the network used by that MVNO, i.e. the 1900 MHz and AWS (1700/2100 MHz) bands. That's not so easy anymore, as current mobile devices often sacrifice 3G bands for LTE, and the AWS band was never very popular for 3G in devices outside North America in the first place. But I managed to get hold of one, and I decided to have the 'Ready SIM' shipped to my hotel, as there wasn't enough time anymore for international shipping. Next time I'll order it a bit sooner so I can already use it at the airport.

In practice it took only a few minutes after inserting the SIM card for the first time before the data option was activated. In terms of speed I couldn't complain, but the network managed to kill two different kinds of VPN tunnels regularly. That's a bit of a nuisance, and I haven't experienced it in other countries before, but I could live with it for a couple of days.

So while there is room for improvement, I really enjoyed the freedom of having Internet access when out and about without roaming charges, and it also spared me the $12.99 per day the hotel wanted to charge for Wi-Fi. I like competition.

Owncloud Mobile App For File Synchronization

Like most multi-device users, I need to transfer files between my notebook and mobile devices every now and then. As I travel a lot, I have so far abstained from installing a file synchronization client on my PC or my mobile devices, as I wasn't quite sure how to ensure that the synchronization process doesn't eat into my roaming data bucket. Also, for privacy reasons I don't want my data traversing the data center of a commercial cloud storage provider. My solution was therefore to transfer files via Bluetooth between devices when they are in close range of each other. For a file or two of a few megabytes, such as a camera image, that approach works quite well. But for larger files such as PDF documents with a size of tens or hundreds of megabytes, or for larger file collections, Bluetooth is just too slow. And then there are mobile devices out there that can't send files via Bluetooth at all.

For these reasons I decided at some point to give Internet-based file synchronization mobile apps a closer look. Since I have an Owncloud instance running on a Raspberry Pi at home that I already use for exchanging files between PCs, privacy and confidentiality were not an issue. Also, there are Owncloud file synchronization apps available for the major mobile platforms. So I went ahead and installed the Owncloud client on two mobile operating systems to play around with it. Usage is straightforward, and the apps offer full control over which files are transferred (and when) and which files are kept in sync. File transfers work just as they should, and while I'm at home my DSL router recognizes that the public destination IP address is bound to its WAN interface and reflects the packets right back into the network without a hop on the outside. That makes file transfers as fast as my home Wi-Fi allows. When not at home, transfer speed is limited to what I can get over the cellular network and the 25 Mbit/s downlink and 5 Mbit/s uplink of my VDSL connection at home. That's still good enough for most file sizes.
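Under the hood, Owncloud clients talk WebDAV, so a one-off transfer can even be scripted without the app via the server's `remote.php/webdav` endpoint. Server name, path and credentials below are placeholders for illustration.

```shell
# Upload a file to an Owncloud instance via its WebDAV endpoint
curl -u user:password -T report.pdf \
  "https://myhome.example.org/owncloud/remote.php/webdav/report.pdf"

# Download it again on another machine
curl -u user:password \
  "https://myhome.example.org/owncloud/remote.php/webdav/report.pdf" \
  -o report.pdf
```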

Great stuff and one more reason to have my private Owncloud instance at home.