The Myth Of Rising Telecoms Investment

Pretty much every time a telecommunication operator presents its quarterly report these days, there is the usual note about the difficult situation it faces due to the investment required to keep its network up with rising data demand. But is this really the case? The 2012 report of the German telecom regulator (in English) has some interesting numbers on that.

In 2012, investment in fixed and wireless telecommunication networks in Germany was around 6 billion Euros. That's the combined investment of all market players. Has it risen in recent years? Not according to the report. In reality, investment has remained pretty much stable over several years, and compared to the overall revenue of around 58 billion Euros in 2012 that seems quite a reasonable number, at least to me (see page 81 of the report).

Let's have a look at some more related numbers: While investment has remained stable, the number of employees in the telecom sector in Germany went down from 184,200 to 176,000, and the pressure can be felt. End customer prices may also have shrunk, but I would argue that this was mostly compensated by higher use. This is reflected in a slight revenue decline from around 60 billion Euros in 2009 to 58 billion Euros in 2012. But looking at Vodafone Germany, whose EBITDA for 2012 was 3.359 billion Euros out of a revenue of 9.641 billion Euros, I don't really see big suffering.

30 Times More Data In Fixed vs. Wireless Networks And Slowing Data Growth In Wireless

Once a year, many telecom regulators in Europe publish their analysis of the state of competition in the telecommunication market. A while ago, the German regulator published its report for 2012 (in English), which contains, among many other interesting numbers, the amount of data transported through fixed and wireless networks in Germany.

As per the report, 4.3 billion gigabytes of data were transported through fixed line networks in Germany in 2012 (page 77) compared to 0.139 billion gigabytes (or 139.75 million GB to sound more impressive) in wireless networks (page 78). In other words, there's 30 times more data flowing to and from fixed line connections compared to wireless.

According to the report there are 28 million fixed line Internet connections in Germany today, so the average monthly amount of data per line works out to around 13 GB. Also interesting is the rise of fixed line data from 3.7 billion to 4.3 billion gigabytes from 2011 to 2012, a rise of 16%. In wireless networks the amount of data transferred rose from 93 million GB to 139 million GB. That's a rise of almost 50%, which is quite substantial but far from the doubling or tripling seen the year before and the year before that, respectively. In other words, the growth has been slowing down for a number of years now.
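The ratios and growth rates can be recomputed from the raw figures quoted above; here's a quick back-of-the-envelope check using awk for the floating point math:

```shell
# Recompute the ratios and growth rates from the report's raw figures
awk 'BEGIN {
    printf "fixed vs. wireless ratio: %.1fx\n", 4.3e9 / 139.75e6        # GB in 2012
    printf "GB per fixed line/month:  %.1f\n",  4.3e9 / 28e6 / 12       # 28m lines
    printf "fixed line growth:        %.0f%%\n", (4.3 - 3.7) / 3.7 * 100
    printf "wireless growth:          %.0f%%\n", (139.0 - 93.0) / 93.0 * 100
}'
```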

The report further says that there were 139 million mobile subscribers in Germany in 2012, out of which around 40 million were actively transferring data (page 79). This made me think a bit. I pay around 40 euros a month for my fixed line Internet and telephony connection today and about the same amount for wireless connectivity. And while the fixed line is shared, every family member has an individual mobile contract. So in effect, broken down per user, I pay less for my fixed line connection than for my wireless subscription and on top transfer over 30 times more data over it. Or put the other way round, I pay more for my mobile subscription than for my fixed line and use it far, far less.

All of this would make sense if wireless networks were more expensive to build and maintain than fixed line networks. But is it really cheaper to drag a fiber cable close to people's homes these days and then run a copper wire to each individual house or apartment than to set up a base station on a rooftop that serves a thousand users? I have my doubts.

The Fairphone – How Much Does What Cost?

Which device will be my next smartphone? I've made my choice and it will be the Fairphone. It's in the process of being built by a small company established in the Netherlands with the aim of producing it with people and the environment in mind: no child labor in African mines, fair wages for Chinese workers and safe working conditions. In addition, the company is open about the whole process of building the device and uses an open operating system, i.e. Android, with perhaps Firefox OS and Ubuntu in the future.

The device is in production now with shipment foreseen around Christmas time. One interesting piece of information I recently came across when I wanted to get an update on their status is the cost breakdown of the device's €325 retail price, based on a production run of 25,000 devices. Here are some noteworthy numbers:

  • €129 design, engineering, components, manufacturing
  • €4.75 prototyping
  • €4.25 reseller margin
  • €9 certifications (CE, GCF, RoHS, FCC, Reach) and testing
  • €63 taxes (VAT, etc.)
  • €11.75 personnel costs, office space, IT, travel
  • €11 legal, accounting
  • €6 events
  • €5.25 webshop hosting
  • €18.25 warranty costs
  • €11 interventions (sustainability, being fair to people and environment)

For the full details, see here. If you are interested in how a phone is built from scratch, the website is a treasure trove of information. Set aside some time…

cURL for Throughput Testing

I was recently faced with the dauntingly tedious task of throughput testing, which meant uploading and downloading files from HTTP and FTP servers and noting the average throughput in each direction, separately and simultaneously. This is fun for about 30 minutes if done by hand, but gets very tedious and even confusing after that, as constantly triggering up- and downloads makes you lose your thread at some point when your mind wanders off during the downloads. So I decided to automate the process.

There must be about a zillion ways to do this, and I chose cURL, a handy command line tool to upload and download files over just about any protocol used on the net, including HTTP, FTP, POP and so on. It's ultra configurable via the command line and has a great variety of output options that make later analysis, such as averaging download speeds of different files, very simple.
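To get a feel for those output options before pointing curl at a real server, the -w write-out string can be tried against a local file, since curl also speaks file:// and that works offline (the /tmp path below is just a throwaway example):

```shell
# Demo of curl's -w write-out variables against a local file (works offline)
echo "0123456789" > /tmp/curl-demo.txt
curl -s file:///tmp/curl-demo.txt -o /dev/null \
     -w 'size: %{size_download} bytes, speed: %{speed_download} B/s, total time: %{time_total}s\n'
```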

For repetitive downloads I came up with the following bash script (works well under Ubuntu and MacOS):

#!/bin/bash
# Download the same file three times, logging size and average speed per run
URL="http://ftp.xyz.com/name-of-file"
OUTFILE="test-down.csv"
rm -f "$OUTFILE"
for i in 1 2 3; do
    curl "$URL" -o /dev/null -w '%{size_download}, %{speed_download}\n' >> "$OUTFILE"
done

cat "$OUTFILE"

The URL variable holds the URL of the file to be downloaded. Obviously, if you test high speed links, the server should have enough bandwidth available on its side for the purpose. The OUTFILE variable holds the name of the local file to which the file size and download speed are written. Then the same curl command is run three times and each time the result is appended to OUTFILE. While the script runs, each curl invocation displays its usual progress output with the current speed, percentage of the download completed, etc.
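Once a few runs have been collected, averaging the speed column for the final analysis is a one-liner; this assumes the CSV layout produced by the download script (file size, then speed, one line per run):

```shell
# Average the download speed (second CSV column) over all runs in the file
awk -F',' '{ sum += $2; n++ } END { printf "%d runs, average %.0f bytes/s\n", n, sum/n }' test-down.csv
```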

And here's my script for automated uploading:

#!/bin/bash
# Upload the same file twice, logging size and average speed per run
UPURL="http://xyz.com/test/upload.html"
LOCALFILE="10MB.zip"
OUTFILE="test-upload.csv"
rm -f "$OUTFILE"
for i in 1 2; do
    # --data-binary uploads the file as-is; plain -d would strip newlines
    curl --data-binary @"$LOCALFILE" "$UPURL" -o /dev/null -w '%{size_upload}, %{speed_upload}\n' >> "$OUTFILE"
done
cat "$OUTFILE"

The trick with this one is to find or build a web server as a sink for file uploads. The LOCALFILE variable holds the path and filename to be uploaded and OUTFILE contains the filename of the text file for the results.

Note the '.csv' file extension of the OUTFILEs, which makes it convenient to import the results into a spreadsheet for further analysis.