Bye Bye Google Reader – Selfoss On a Raspberry Pi

Google will terminate its RSS Reader service by the end of the month, and I guess I am not the only one looking for an alternative solution to get that daily dose of news from blogs and news web sites on the way to and from work. While it obviously means some work, it's also an opportunity to 'in-source' another service and further reduce my dependency on cloud service providers.

Selfoss – The Central RSS Server

There are a couple of self-hostable RSS solutions with a server component that fetches RSS feeds in one central place and then gives smartphones, tablets and PCs access to the content while keeping track of what has already been read. The one I've been taking a closer look at is Selfoss, as it's open source, lightweight and requires little on the server side. Coupled with a Raspberry Pi it could be an inexpensive solution, not only in terms of hardware cost but also in terms of operating cost: the power consumption of a Raspberry Pi (a few watts around the clock, i.e. roughly 25-30 kWh per year) amounts to around 6 euros a year in electricity.

Installation Process Overview

I have to admit I had some difficulties getting Selfoss installed. The instructions suggest that it's a straightforward process. In practice, however, I struggled to get it working with the Apache and PHP installation I already have on a Raspberry Pi for my ownCloud server. In the end I gave up and instead opted to install it on a clean Raspbian image. That worked pretty well and I'll have another post shortly describing the details. Combined with https-only access and HTTP-layer username and password authentication I can access the service over the Internet in a secure way.
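
For those who want to set up something similar, here's a rough sketch of how the https access and password protection can be done on a stock Raspbian/Apache installation. The user name and the Selfoss directory /var/www/selfoss are just examples, adapt them to your setup; htpasswd is part of the apache2-utils package:

sudo a2enmod ssl auth_basic

sudo a2ensite default-ssl

sudo a2dissite default

sudo htpasswd -c /etc/apache2/.htpasswd myuser

sudo tee /var/www/selfoss/.htaccess <<'EOF'
AuthType Basic
AuthName "Selfoss"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
EOF

sudo service apache2 restart

Disabling the plain-http site (its name may differ depending on the Apache version) is what makes access https-only. Also note that Apache only evaluates the .htaccess file if AllowOverride is enabled for the directory, and that the default-ssl site comes with a self-signed certificate, so the browser will complain unless a proper one is installed.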

Features

So far I am pretty happy with Selfoss. While I struggled to get my Google Reader OPML export file imported into Selfoss version 2.4, it worked flawlessly with the current v2.7. Most of the time I read my RSS feeds on the mobile, and the mobile browser optimized pages give me easy access to unread posts, to all posts ordered by date and to all posts of a particular site. In the overview, only the headline of each post is shown and clicking on it reveals the full text. Posts can be marked as read individually, but I prefer the button that lets me mark all unread posts as read in one go, as I usually don't want to read every new post anyway. Also, there's a link in each post that opens the post on the original website in a new browser page. Perfect, that's pretty much all I need.

Pictures

And here are some pictures to give you an impression before the second part of the post covers the installation instructions specifically for a Raspberry Pi running Raspbian.

[Screenshots of the Selfoss mobile web interface]

When Does 3G Get Patchy?

When the Wideband AMR speech codec was activated in my 3G wireless network of choice a year or two ago, I decided to lock my smartphone to 3G-only mode to prevent it from being handed over to 2G when on the move and thus being downgraded to a narrowband channel. In the Cologne / Bonn area where I live this works great, as 3G coverage in the cities and in between is excellent. In other words, even with the phone restricted to 3G I haven't experienced call drops during voice calls due to running out of 3G coverage. But I have to admit coverage is not like that everywhere.

Obviously the countryside is often still not covered by 3G, no big surprise there. But when I recently visited a relative living in an area where a few towns with populations of around 150k are very close together, I was surprised how often I did not have 3G coverage, especially between the cities. I quickly realized that my 3G-only setting is definitely not an option there. An interesting realization, especially if you continue that train of thought in the VoLTE and SR-VCC direction…

Securing OwnCloud With Truecrypt

I've been using ownCloud on a Raspberry Pi at home for my file exchange needs as well as for calendar and address book synchronization across my devices for a couple of months now. It works perfectly and leaves little to be desired except for one thing: encryption of all data on the server. Granted, the only purpose of encrypting data on a server at home is to prevent it from falling into the wrong hands in case the server is physically stolen. While this is very unlikely, it's not impossible, and security-conscious as I am I wanted a solution for this. On my PC and backup drives I trust Truecrypt to keep my data safe, so I was looking for a way to also use it on my ownCloud server.

Truecrypt can be downloaded as a binary installation file for a number of Intel x86 based Linux distributions, so installation and use there is quite straightforward. The Raspberry Pi, however, is based on an ARM processor for which no binary package is available. But Truecrypt is open source, so it can be compiled for ARM as well. The process is a bit tricky. Fortunately, there are a number of descriptions of how to do this, and this is the most accurate one I found. Unfortunately it's written in German, so it's of little use for many. So I decided to translate it into English and append it to the end of this post.

Compiling the Truecrypt source takes a while but once done, using it from the command line works in exactly the same way as on any other Linux distribution. It took me a couple of hours to get everything working but now all data resides on a Truecrypt volume on my Raspi OwnCloud server. Without entering the password after the boot process, e.g. remotely over a secure ssh session, nothing is accessible.

One additional thing to consider is that when uploading a file, ownCloud and the underlying PHP libraries use the default temp directory and only copy the file to the ownCloud directory on the Truecrypt volume once the upload is complete. This is of course not very secure, as the deleted temporary files could still be recovered later on. Fortunately, PHP can be configured to use a different temporary directory, e.g. one inside the encrypted Truecrypt volume. I'll also describe how to do this in the detailed description of compiling and using Truecrypt for ownCloud on the Raspi below.

And here are the details of how to compile and use Truecrypt on the Raspi:

Go to the home directory

cd ~

Get the current Truecrypt sources for Linux

wget http://www.truecrypt.org/downloads/transient/0d82764e6c/TrueCrypt%207.1a%20Source.tar.gz

In addition to the Truecrypt source, the wxWidgets library is required

wget http://prdownloads.sourceforge.net/wxwindows/wxWidgets-2.8.12.tar.gz

Next, libfuse has to be installed

sudo aptitude install libfuse-dev

And finally before starting the compile run, a number of header files are required

mkdir pkcs-header-dir

cd ~/pkcs-header-dir/ && wget ftp://ftp.rsasecurity.com/pub/pkcs/pkcs-11/v2-20/*.h

Go back to the home directory and unpack the Truecrypt and wxWidgets tar files

cd ~ && tar -xf "TrueCrypt 7.1a Source.tar.gz" && tar -xf wxWidgets-2.8.12.tar.gz

Set an environment variable so the build can find the directory with the header files

cd

export PKCS11_INC=/home/pi/pkcs-header-dir/

Now wxWidgets has to be compiled, which takes about 15 minutes

cd truecrypt-7.1a-source/

make NOGUI=1 WX_ROOT=/home/pi/wxWidgets-2.8.12 wxbuild

And finally, compile truecrypt. This takes between 30 minutes and one hour, so be patient

make NOGUI=1 WXSTATIC=1

sudo cp -v Main/truecrypt /usr/local/bin/

Now see if it works:

truecrypt --help

Create a new volume (Interactive)

truecrypt --create

Rename the ownCloud data folder and mount the Truecrypt volume in its place

truecrypt -t -k "" --protect-hidden=no /media/xxx/owncloud-crypt.tc /media/xxx/owncloud -v -m=nokernelcrypto

(Note: truecrypt -d unmounts the Truecrypt volume again)
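
As mentioned further up, the volume has to be mounted again after every reboot, which can also be done remotely over an ssh session. As a sketch, with the host name and paths being the examples used in this guide:

ssh pi@raspberrypi

truecrypt -t -k "" --protect-hidden=no /media/xxx/owncloud-crypt.tc /media/xxx/owncloud -v -m=nokernelcrypto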

Move the PHP temp directory to the Truecrypt volume to prevent data leakage

Create a tmp folder in the Truecrypt path and give it full access rights

sudo mkdir /media/pi-data/owncloud/tmp

sudo chmod 1777 /media/pi-data/owncloud/tmp

Web interface uploads are first put into /tmp and only then moved into the ownCloud directory. To put the temp directory onto the Truecrypt volume, uncomment and edit the following line in /etc/php5/apache2/php.ini as follows:

upload_tmp_dir = /media/pi-data/owncloud/tmp

To make sure the change takes effect, restart Apache

sudo service apache2 restart
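
To verify that Apache's PHP really picked up the new temp directory, a quick check is to temporarily put a phpinfo page into the web root (assuming the default /var/www here) and look up the upload_tmp_dir value in the browser:

echo '<?php phpinfo();' | sudo tee /var/www/phpinfo.php

Don't forget to delete the file again afterwards, as it reveals quite a bit about the server configuration:

sudo rm /var/www/phpinfo.php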

Note: WebDAV uploads go directly into the ownCloud directory, so the file does not have to be copied out of tmp first. For larger files this is much faster!

Does Broadcast Have a Future When Netflix Unicast Traffic Today Accounts for 30% Of Busy Hour Internet Traffic In the US?

According to this post over at BusinessWeek, Netflix accounts for 30% of the busy hour Internet traffic in the US these days. Combine this number with the 36 million subscribers they have and it's not hard to imagine that the Internet can evolve to support unicast streaming to everyone, thus making broadcasting (i.e. multicasting a single stream to many devices) in the traditional sense obsolete.

It was not too long ago that I wondered whether web radio streaming on a massive scale is feasible. Now 36 million households in the US stream video content over the Internet. That makes audio streaming on a super large scale seem like an almost trivial problem.

Also from a network transport price point of view, audio streaming should cost almost nothing if you compare it to Netflix. For $7.99 a month you get an all-you-can-stream video service. And that price not only covers the data volume transferred (on the Netflix side) but also the cost of running the servers and, not to be underestimated, the licensing fees for the video content itself.

The BusinessWeek article is quite long but contains a lot of interesting technical details. Like the 3 petabytes of storage in the Amazon cloud they require for their video content.

How Many Web Servers Do You Have At Home?

Quite amazing how many web servers have accumulated at my home over time. At the moment I count 9:

  • An ownCloud server on a Raspberry Pi
  • A web radio streaming server / client for my Hi-Fi set on a Raspberry Pi (Squeezeplug)
  • The web server in my Laser printer for status information
  • The web server in my Inkjet printer for status information 
  • A web server in my DSL gateway for administration
  • The wiki I use for personal notes on my PC runs on a web server
  • Another Raspberry Pi I use for experimenting with new stuff usually runs a web server
  • My XBMC based media center for streaming videos is controlled over a web server
  • My VPN server based on OpenVPN and DD-WRT is administered over a web server

When looking at this list I am really glad for the security provided by the IPv4 network address translation (NAT) of my DSL router and my OpenVPN external access because some of those web servers should never be exposed to the outside world.
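
If you'd like to take a similar inventory of your own network, a quick port scan finds most of them. As an example with nmap (the subnet is just an example, replace it with your own, and obviously only scan networks that belong to you):

sudo nmap -p 80,443,8080 --open 192.168.1.0/24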

So how many web servers do you have at home?

Toggle Mobile: A Single SIM Card With Mobile Numbers From Different Countries

In most countries in Europe, fixed and mobile voice communication is a highly competitive market and prices have come down to acceptable levels for national calls. But as soon as you want to make calls from a mobile phone abroad, especially to international mobile phone numbers, things get very expensive. Now I've found an interesting solution to this problem, both for me as a caller and for those people who want to call me from abroad:

In the UK, the Mobile Virtual Network Operator (MVNO) Lycamobile operates a brand called "Toggle Mobile". Their SIM cards are quite special, as up to 5 numbers from different countries can be assigned to them. This ensures cheap prices when roaming and also gives me national mobile phone numbers in up to 5 countries that people can call cheaply, or for free if it's part of their monthly bundle. Here are three scenarios from my daily life:

Scenario 1 – Making international phone calls with my mobile: Even calls to other EU countries cost more than a euro a minute with my "normal" SIM card. In other words, I try to avoid making international calls from my mobile phone and rather use my fixed line phone at home, where I can make the same call for a couple of cents a minute. With the Toggle Mobile SIM on the other hand, calls to fixed lines in many countries cost 3 cents a minute and mobiles can be called for 9 cents a minute. As I live in Germany and not in the UK, I've assigned a German mobile number to the SIM card in addition to the UK number to get these rates. The German mobile number is free if it's only needed for 30 days, or costs 5 pounds sterling for a year if I want it assigned permanently.

Scenario 2 – Getting local numbers in other countries so people can call me on a local number: Many people, including myself (see the scenario above), are quite reluctant to call international numbers from their mobile phones. The solution to this problem is to activate a French mobile number on the SIM in addition to my UK and German mobile numbers. This way I can give my friends in France a French mobile number on which they can call me as part of their monthly package. The twist is that I don't have to be in France to receive the call. I can be in any of the countries for which I have registered a local number and receive the call for free.

Here's a practical example: The person who wants to call me is in France while I am in Germany. He dials the French number, pays for a local mobile call, and Toggle routes the call to me in Germany. I receive the call for free because I am currently using the German number I have registered.

Too good to be true, I thought at first, but I've given this a try over the course of two weeks and found it to work flawlessly. Excellent! The only slight disadvantage: I have to carry a second phone again, albeit a very small one just for voice calls. A dual-SIM phone might be an alternative in the future.

Scenario 3 – Receiving calls for free in non-EU countries

In countries such as the US, India, Hong Kong and Australia, incoming calls are also free. Perfect, as I tend to be in some of those countries from time to time.

On the technical side, the Toggle SIM contains a SIM Application Toolkit (SAT) app that detects in which country the mobile is switched on. It then compares the country to the national identities (IMSIs) stored on the SIM, copies the corresponding ID to the active IMSI field and asks the mobile to perform a SIM reset so that the phone registers in one of the national networks with the local IMSI. That sounds easier than it probably is to pull off in practice, so kudos to Toggle for operating such a service.

WebRTC Does NOT Include A SIP Stack

A couple of days ago I published a post that gave an overview of what I have recently found out about WebRTC. In the post I mentioned that WebRTC includes a SIP stack. Thanks to a reader, I have found out in the meantime that this is actually not the case. As this is quite a significant difference I have updated the post accordingly. Thanks for the correction, I've learned a lot and sorry for not getting it right the first time.

Linux Jumps From One Processor Architecture To Another With Grace While Windows Stumbles

Lately, with all the different things I do with the Raspberry Pi, I have noticed how easily the Linux ecosystem jumps between different CPU architectures. I would say it is almost seamless. Here are a couple of examples:

Starting with the Linux operating system itself in a wider sense: system administration is the same on an x86 PC and on an embedded ARM machine such as a Raspberry Pi. The learning curve when jumping from an x86 Linux to an ARM Linux platform is zero.

On the application software side things look equally bright. LibreOffice works on both my x86 notebook and the ARM based Raspberry Pi. It looks the same and it feels the same, no difference; just the speed is not quite the same. VLC, my audio and video player of choice due to its universal capabilities, also works on both platforms. Children's games and learning programs such as Scratch and GCompris work just the same.

In the somewhat more advanced category, there are cloud services and remote support. The Apache web server and lots of additions such as PHP, Perl, Python, etc. work identically in the x86 and ARM world. Again, a learn once – use across platforms experience. Lots of web based applications such as ownCloud, wikis and even more exotic things such as the Logitech Media Server don't care about the CPU platform, as the interpreted languages they rely on work the same on x86 and ARM. There's also no difference when it comes to setting a device up for remote support. VNC exists cross-platform, so there's no difference in configuring an x86 or an ARM Linux machine for remote administration.

And finally, hardware support on ARM is excellent as well. USB keyboards, mice, sound cards, memory sticks, SD cards etc. just work fine on the ARM based Raspi without installing any drivers.

All of this would be much harder to accomplish if it weren't for a number of major innovations that have been made over the years and are bearing full fruit now:

The first one that comes to mind is the centralized application repository and update tool that is at the core of every Linux distribution and has existed long before app stores became popular in the mobile world.

It's also important to realize that Linux repositories go one step further than mobile app stores: As the software contained in them is usually open source, the Linux distribution itself can compile all software for the different CPU platforms it supports.
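
A small illustration of how transparent this is in practice: the same two commands install the same application on an x86 notebook and on the Raspberry Pi, only the reported architecture differs (e.g. i386 or amd64 on the PC, armhf on the Raspi):

dpkg --print-architecture

sudo apt-get install libreoffice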

Finally, third party USB hardware support is fabulous because the drivers are already part of the Linux kernel and, thanks to the abstraction via USB, don't care about the processor architecture either. Again, everything is compiled by a central instance, so third parties don't have to compile their drivers for different hardware platforms and hand them to the central repository. That dramatically simplifies software updates.

Microsoft must be looking enviously at the Linux camp in this regard. Their jump to the ARM world looks much more difficult. There's Windows RT running on ARM based tablets of course, but except for Office, which was specifically ported to ARM, there are few programs known from the x86 Windows world that run on it as well. After all, it's the software companies themselves that would have to support it by compiling their applications for another architecture that lacks support for many of the legacy libraries they use today. And on top of that they have to enter the "tile interface" world. It seems few are willing to do that, so it's unlikely Linux will lose its cross CPU platform advantage anytime soon.

A WebRTC Client as a Skype Alternative

Recently I mused in this post about a self-hosted alternative to Skype for communication between family members, using Asterisk on a Raspberry Pi and the Ekiga SIP client. With the recent discovery that Microsoft is actually listening in on Skype text conversations, that need has grown even stronger. Then I read that the upcoming Firefox 22 will have the full WebRTC API implemented and activated. So far my knowledge about WebRTC has been rather limited; I just knew it had something to do with web browser based peer-to-peer communication but not much else. Time to fill the gaps:

The Wikipedia entry on WebRTC is actually quite brief and pretty much reflected what I already knew: there's almost nothing about how it works. The FAQ over at webrtc.org is quite enlightening, however. Here are some key facts that will help you better understand WebRTC if you know how VoIP works with SIP (Session Initiation Protocol):

If you are familiar with SIP based VoIP, a rough analogy is this: the JavaScript application the browser loads from a web server takes the role of a VoIP client that does not have to be installed first, and while the communication session itself is peer-to-peer, a centralized server is still needed to initially connect the two endpoints. WebRTC itself, however, does not contain a SIP stack. According to the FAQ and this blog post, it can be thought of as a web browser based JavaScript API for two things:

  • To access camera and microphone
  • To connect to another peer (i.e. to the destination user)

What is not defined is how the other peer is discovered initially and how audio and video codecs are negotiated. Traditionally this is done with a number of different protocols, SIP being one of them. In other words, the SIP protocol and the communication with a SIP server are not part of WebRTC and have to be implemented by the web app on its own (for details see the blog post linked above). What is defined, however, is the use of the Session Description Protocol (SDP) to describe the audio and video codecs available on each end.

What I am wondering at this point is how two JavaScript applications running on different devices can communicate with each other directly, as I always thought JavaScript enforces the rule that the program is only allowed to establish connections back to the site from which the script was loaded. Obviously that can't be the case here anymore.

The FAQ also mentions a number of other interesting facts: WebRTC implements STUN (Session Traversal Utilities for NAT) to establish a peer-to-peer session through Network Address Translation gateways, a must in today's IPv4 environment. Echo cancellation techniques are mentioned as well, and the codecs used look pretty neat: bandwidth efficient, wideband audio and HD video capable. As all of this functionality is part of the web browser, there's hope that performance will not suffer as much as if all the code were written in JavaScript.

So one of the simplest use cases would be to replace native SIP clients with a browser based WebRTC client that implements its own SIP stack. WebRTC clients can even communicate with native SIP clients over a proxy server if both support a common audio and video codec subset. This seems to be the case, with WebRTC supporting the G.711 and G.722 audio codecs that are widely used in the SIP world today.

This obviously fits into Google's overall (Chrome) strategy of having everything run via a centralized server in the network and in the web browser on user devices. While this is not exactly what I have in mind due to my preference for hosting my own web services at home, the architecture is open, so nothing would prevent me from running my own SIP server at home with an open source WebRTC client and proxy. Having the client run in the browser also means that the client software can be deployed without any hassle. The use case implemented by Ericsson over here gives an insight into what's possible with "just in time" deployed communication clients. At this point, WebRTC breaks with today's technology and offers new and interesting possibilities to explore.

For further insight, have a look here for SIP servlets and here for an HTML5 SIP client (+ proxy) implementation with WebRTC.

Is Powerline A Magic Bullet When The Wi-Fi Link Is Too Slow?

When I am at home I'd sometimes really like to use the 25 Mbit/s of my VDSL line to its full extent. That's easier said than done, as my VDSL router is in the hallway while my PC is in another room. Laying a cable is not an option, so I am relying on Wi-Fi. Since my notebook no longer works reliably with the VDSL router's built-in 802.11n Wi-Fi after a router firmware upgrade, I use the 802.11g Wi-Fi built into my somewhat old WRT54-GS router, which I also use as a VPN gateway. That limits my transmission speeds to about 18-20 Mbit/s in practice. In other words, I am falling around 5-7 Mbit/s short of what's possible with my VDSL line. Not ideal.

So I decided to do something about it and bought two 500 Mbit/s Powerline adapters. I had high hopes for this solution, as the linear distance between the VDSL router and the PC is 10 meters at most. Also, most reviews on a number of web sites were very positive. Unfortunately, it seems that the power socket in the hallway and the power socket close to my desk might be on different phases, or there's something else in the way, as my throughput was a meager 2 Mbit/s. Not quite the hundreds of Mbit/s I was hoping for…

I then connected the Powerline adapter in the bedroom and in the kitchen, and again I only got 2 Mbit/s each time. In the bathroom I got 7 Mbit/s. Only when I connected the two Powerline adapters to sockets in the same room did I get the full 25 Mbit/s bandwidth of my VDSL line. The diagnosis program shipped on a CD with the adapters indicated a line speed of 300 Mbit/s in that setup, but I didn't pursue it further, as having the adapters in the same room is a nice exercise but worthless in practice.

I live in an apartment building in Cologne that was built in the 1980s, so I don't think my power cabling is out of the ordinary. Quite a disappointing result. Looks like I have to think again about a better Wi-Fi option; Powerline is definitely not the answer for me, despite apparently being a good solution for many others.