My First Full Video Streaming Session Over 3G

Perhaps the following should be nothing special for me, having used double-digit megabit-per-second connections over 3G many times before. But still, today I used a 3G connection for the first time to stream a full-length movie to my Intel Atom based netbook running Ubuntu Linux, with Firefox and Flash for video playback. I was abroad and wanted to relax a bit in the evening, so I decided to watch one of my favorite TV shows in the hotel room. This required a VPN tunnel to a server back home, as the streaming service was limited to national IP addresses.

The reason I haven't used 3G networks for streaming in the past is probably that at home I have no need for 3G, since I have DSL and Wi-Fi there. And when traveling, the amount of data I can transfer over a 3G network for a reasonable price doesn't allow streaming full movies, even though it is possible from a technical point of view. But if you get 7 gigabytes for 20 pounds, there's little left to hold you back.

UMTS 900 Configuration in London

I only had a little time during a recent trip to London to check O2's UMTS 900 coverage, but I was able to catch a few things nevertheless. When taking the train into Paddington station from outside the city, I noticed that UMTS 900 coverage already started in the outskirts of London, well away from the city center. An interesting data point.

While roaming on foot in the city center I was on both the 900 and 2100 MHz bands outside in the streets. When moving further indoors I sooner or later always fell onto the 900 MHz layer, as expected, but not before the last little bit of 2100 MHz coverage was gone. In the other direction, my mobile jumped back to the 2100 MHz layer as soon as it was found again. In other words, the 2100 MHz layer seems to be preferred at all costs, while the 900 MHz carrier is only used in places where 2100 MHz coverage is lost. I am not sure whether the same applies while the UE is in DCH state, however, as all tests were performed in idle mode. Something to investigate next time.

SMS Past, Present, Future. Or: Counting SMS Messages – Will it Look Odd in 10 Years?

Recently I was wondering about the difference in user behavior around SMS depending on whether it is used as part of an "all you can eat" contract or paid for on a message-by-message basis. Thinking about it, I started reflecting on how SMS evolved and how it could be perceived 10 years from now. Here's a bit of past, present and future as it looks to me:

The SMS service has undergone tremendous change since its first use in the early 1990s. Initially, it was unclear to most people in the industry whether users would even have an interest in the service. So even though SMS was created at the same time as the GSM voice service, it was only introduced a few years after the first GSM networks came into operation. For more details on that part of the history of SMS, I can recommend this book.

Other indications that SMS was an uncut gem in the early years are that the first GSM phones did not support SMS at all, and some of the first that did, such as the Ericsson GH-172 and GH-174 from 1992/93, could only receive SMS messages but not send them. An anecdote I was recently told by someone who worked on SMS at the time has it that in those years, network operators even had difficulty convincing the incumbent companies developing other GSM network equipment to build them an SMS service center, as those companies saw no market opportunity in such equipment. This matches the Wikipedia entry on SMS, which says in the "early implementation" section that the first SMS message was sent from a PC to a phone via an SMS service center built by Aldiscon, a company only founded in 1988.

Over the years, SMS has become phenomenally successful, with billions of messages now sent every single day, and its perception has changed significantly. Today, many people still pay per use, i.e. they pay for each message separately. In the US, one even pays for incoming SMS messages unless the user is on an unlimited plan. Such plans have also become quite popular in other parts of the world, and they have completely changed usage behavior. Instead of one-shot messages, unlimited plans invite users to treat the service as a conversational tool, shooting messages back and forth and using SMS in much the same way people have used instant messengers for many years. Obviously, this fuels the growth of the service and also its popularity.

So how will we perceive SMS messaging 10 years from now? First, who is "we"? Even today, SMS is used differently in different parts of the world, so I'll concentrate on Europe for now. Perhaps at some point it will look odd that there was ever something like "pay per SMS message" in the first place. Perhaps, with smartphones and Internet based services becoming the norm rather than the exception, SMS will play a diminished role in interactive communication as people resort to the instant messengers they already know from their PC and Wi-Fi tablet!?

Another interesting line of thought is that unlike voice calls, SMS is easy to port to or replace in IP-only wireless networks such as LTE, as instant messengers have existed in fixed line networks for many, many years and the service is non-real-time and non-streaming. In other words, handovers between different radio technologies are no issue at all; compare that to the LTE voice call handover issue, which is still unsolved in practice.

And a final thought for this post: Perhaps SMS will become or remain an alternative for people who value privacy and prefer services that don't store and analyze messages for targeted advertising, building social graphs, etc. After all, unlike with web based services, for which users don't pay anything and are in fact NOT the customer but only a source of information to be monetized, giving up some of their privacy on top, with SMS they might still be exactly that: the customer, who pays for the service of sending and receiving messages, free from other forms of monetization such as selling the information gained to third parties.

Delicious – When the Cloud Loses Your Data – As Per Design…

In April this year, Yahoo sold their Delicious online (some would say "cloud") bookmarking service to another company, which required moving the service and the user data from one company to the other. The companies did the sensible thing and only moved the data once the user had opted in. That is a good thing, and since I use Delicious regularly, I was made aware of the move and was able to act accordingly.

Not all users were that lucky, though. Recently I got a call for help from a user who had quite important links on Delicious which he suddenly could not access anymore. The problem, as we soon found out, was that he had not used Delicious for quite some time, perhaps half a year. That did not change the importance of the links stored there, though. Unfortunately, the data migration period ended this September, so the "new" Delicious no longer had his account or his data, nor did it help in any way to retrieve the data from the old service. Quite a nasty surprise.

So I was called to the rescue. After some research on the web, I found out that there is still a way to get the bookmarks from the old Delicious on Yahoo. It involves changing the hosts file on the local computer to redirect requests for delicious.com and a couple of other domains to the Yahoo servers, which still seem to be running. For the details, have a look here. With this kludge I was able to export the bookmarks into a local HTML file. Mind you, changing the hosts file is something a normal user would never do.
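
For illustration, the redirect boils down to a few extra lines in the hosts file. The IP address below is a documentation placeholder, not the real one; the actual entries and the complete list of domains were given in the guide mentioned above:

    # /etc/hosts (on Windows: C:\Windows\System32\drivers\etc\hosts)
    # Hypothetical example: point the Delicious domains at the old Yahoo
    # servers. 203.0.113.10 is a placeholder IP, not the real address.
    203.0.113.10    delicious.com
    203.0.113.10    www.delicious.com

After exporting the bookmarks, these entries should of course be removed again so that delicious.com resolves normally.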

Yet another example of why you should:

    a) think twice before trusting an online service with your important data

    b) keep a copy of your data locally as well

Fortunately, some alternatives to Delicious have become available in the meantime. Firefox Sync, for example, might be a replacement that is even better than Delicious, and it is now part of a standard Firefox install. It syncs bookmarks between devices, which is what I used Delicious for, and furthermore it encrypts the data before sending it to a server in the network (read: "to the cloud"), so your data is protected there as well.

Detailed Ofcom report on UMTS 900 from October 2010

Today I found an interesting report on the Ofcom website, from back in October 2010, on the potential impact that opening up the 900 MHz band would have on consumers and on competition in the UK. It's very detailed, and while I have only scratched the surface so far, I have already found some interesting figures and graphs in it. To reach a conclusion, Ofcom looked into coverage differences between 2100 and 900 MHz, the number of base stations used and much more. In other words, their technical analysis contains details that can answer more questions than the ones the report was aimed at, if you take the time to read it carefully. Note that the text on the web page itself is just the executive summary; the details are contained in the PDF files linked at the bottom of the page.

The World of Mobile 5 Years Ago – October 2006

Only 5 years ago, the mobile domain was a radically different one. You don't believe it's been just 5 years? Let me remind you: no iPhone, no Android (both only talked about in 2007). So what was going on then? Let's have a look at my blog entries from back then:

Mobile Virtual Network Operators: One can look at cheap, no-frills mobile virtual network operators (MVNOs) from many different angles and come to both good and bad conclusions. Undisputed, however, is that they had a big impact on the German market, with prices tumbling significantly in only 18 months. What we take for granted today, going into a supermarket and buying a SIM card and perhaps a mobile phone or 3G dongle with it for a couple of Euros, started back in 2005 and showed strong results in 2006. Other countries introduced MVNOs as well, but under different circumstances and with less competition. In France, for example, I am still waiting for the market to get a similar kick.

VoIP over Wi-Fi: Yes, no kidding, VoIP was already working back then, well, more or less…

EDGE: 5 years ago, I was musing on how long EDGE would be good enough for notebook connectivity. With tens of megabits per second now offered by UMTS networks, I can give the answer: it is not good enough anymore 🙂 In that same post I was also wondering whether we would see UMTS 900 in Europe within 5 years. And indeed, we do today: UMTS 900 is deployed in Finland, in some places in the French countryside, and in big UK cities such as London. Yes, the idea of UMTS 900 was already there 5 years ago.

The Nokia N80: The predecessor of the iconic Nokia N95 and one of the very first UMTS phones with Wi-Fi inside was unveiled at Nokia World in 2006. At the time, this was a risky move, as network operators were probably not very happy about it. 5 years ago we were talking about 40 MB of RAM, 200 MHz ARM processors and 3 megapixel cameras as the very top end in mobile. Today we are talking about mobile phones with 1 GB of RAM, dual-core 1.2 GHz ARM processors and dedicated video hardware encoding and decoding HD video and 12 megapixel images in real time. If you extrapolate this 5 years into the future, i.e. to 2016, we are talking about the performance we have on desktops today. How that will be accomplished without draining the battery in half an hour or less remains to be seen. But if you had told someone back in 2006 about phones with such specs, it would have been hard to imagine.

My killer app from back then is still not here: Back in 2006 I was hoping that with the N95 I would get a feature to send my location to someone instantly, instead of spending 5 minutes of a call explaining where I am. From a technical point of view it has been possible for many years now, and I have had plenty of opportunities to use it. But either the person on the other end uses a different maps application, has no smartphone, etc. etc. The lack of a standardized solution and of adoption by a critical mass of users still stands in the way. Perhaps in another 5 years?…

WiMAX was a hot topic: Well, pretty much forgotten by now.

Video Calling: I've been using it occasionally over the past years, but it hasn't taken off, for a number of reasons. Now Apple is giving it another try.

Wi-Fi 802.11n: What's currently in all high-end Wi-Fi devices was just emerging as a (draft) standard back in 2006. Despite the time that has passed, I still see interoperability issues in the wild, with devices using a chipset from one manufacturer not working well together with devices using a chipset from another manufacturer. Not ideal.

I guess you agree, lots has happened in the past 5 years…

Ubuntu 3G Connection Sharing – Think Reverse

Here's something I stumbled over recently: I had always assumed that Ubuntu does not have Internet Connection Sharing (ICS) functionality, because there was no such option in the Network Manager settings in the "Mobile Broadband" section. But this was an error of thinking, because I was assuming ICS would work the same way on Ubuntu as it does on Windows. As it turns out, however, configuring ICS works exactly the other way round.

In Windows (XP, Vista, 7), the ICS option has to be activated on the interface that offers Internet connectivity (e.g. the 3G link), and there you select the interface on which the sharing computers are connected (e.g. the Ethernet port).

In Ubuntu, however, you have to set the "sharing" option on the interface where the sharing computers are connected (e.g. the Ethernet port) and NOT on the interface with the Internet connectivity. This is why there is no such option in the "Mobile Broadband" section. The Ubuntu way of doing things has two significant advantages:

  • There is no need to define which interface shares to which other interface.
  • Defining the Ethernet port as the port where the shared computers are connected allows Ubuntu (or rather iptables) to select any interface that has Internet connectivity to act as the Internet port. Even switching from 3G to Wi-Fi (both having Internet connectivity) is seamless to all computers connected to the Ethernet port (except for TCP and UDP connections being reset during the switchover). The sketch below illustrates why.
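
Under the hood, Network Manager essentially enables IP forwarding, runs a small DHCP/DNS server on the shared interface and sets up NAT masquerading with iptables. Here is a minimal sketch of what an equivalent manual setup could look like; the interface name and the subnet are assumptions, not necessarily what Network Manager actually uses:

    # Assumption: the shared computers are on eth0, and any other interface
    # (3G, Wi-Fi, ...) provides the Internet connectivity.

    # Enable IP forwarding in the kernel
    sudo sysctl -w net.ipv4.ip_forward=1

    # Give the shared Ethernet port a static address (subnet is an assumption)
    sudo ip addr add 10.42.0.1/24 dev eth0

    # Masquerade all traffic coming from the shared subnet. Note that no
    # outgoing interface (-o) is specified, so whichever interface currently
    # holds the default route is used automatically.
    sudo iptables -t nat -A POSTROUTING -s 10.42.0.0/24 ! -d 10.42.0.0/24 -j MASQUERADE

The last rule is the key point: because the MASQUERADE target is not tied to a specific outgoing interface, the Internet-facing side can change from 3G to Wi-Fi without touching the configuration of the shared side.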

Pretty neat!

Operator Patchiness vs. Monoliths

Recently I wondered what might be better for the mobile industry: the operator patchiness you find in Europe, Asia and Africa, or the network operator monoliths found, for example, in the US? From a software development point of view, things are probably easier for device manufacturers in the US. You sell your device to one network operator, you do your software once, you do your hardware once, you do everything once and you are done. Sounds good, but this also creates a great dependency. If you suddenly fall out of grace with a network operator you've been building devices for, you lose market share on a whole continent instantly. Also, despite covering a whole continent, US operators use frequency bands incompatible with most networks on the rest of the planet. That leaves manufacturers vulnerable when it comes to volume.

In Europe and the rest of the world, things are very different. A manufacturer can sell a mobile device, identical from a hardware point of view, to many different network operators. If one doesn't like you, the world does not come to an end; you can still sell to the national competitor, and each country is a whole new game. Obviously, that gives more power to the manufacturer. On the other hand, it's also more work due to all the different languages and apps each country and network operator requires on the devices it sells. Many network operators now have networks in different countries, which might make things a bit simpler for both sides and gives those operators a bit more power than ones that operate in only one country.

But despite there being many network operators, they all have one thing in common: they all use the same technology and the same frequency bands. From a hardware point of view, that's a huge advantage for device manufacturers, as they can concentrate on one variant of a device for all network operators. Very different in the US, with its mixture of GSM, CDMA and LTE, the latter coupled to both existing GSM and CDMA networks.

Perhaps the huge number of countries and network operators throughout the European continent has had one good effect: unlike in the US, where operators were and mostly still are of the opinion that they can go it alone, there is no such thing in Europe. Everyone knows that compromise is necessary to arrive at a common technology, and those not sticking to the consensus will have a difficult time going it alone. To me it looks like this has helped tremendously to mold the industry together. What do you think?

The Empty Phone Booth

Even out in the Californian desert, the good old telephone booth is on its way to extinction. I recently took the picture on the left at the same place from which I reported in a previous post on remote-area 3G coverage. So 3G is killing the phone booth, if you will. Not that I am nostalgic about it, but this is one fairly recent innovation that has come, had its prime and gone again.

An interesting side note: Have a look at the logo at the top of the booth. According to Wikipedia, it is the old Pacific Telephone logo, used until 1983: "In 1969, AT&T revamped its corporate identity, resulting in a simplified Bell logo, absent of 'Bell System'. This logo remained with Pacific Telephone until 1983". That gives an interesting indication as to when that booth was put there.

The UK Delays LTE Auctions Again – To End of 2012

Many countries auctioned their LTE spectrum years ago, and countries such as Sweden, Norway, Finland and Germany already have LTE networks on air, the earliest since 2009. But for reasons which are hard to understand when looking at things from this point of view, the UK keeps delaying its spectrum auction, now to the end of 2012, according to this news piece on MocoNews and the Ofcom announcement on their page here. So even if the auction does take place at the end of 2012, it will be well into 2013 before the first networks open up, even in only a few places.

And while the UK network operators are testing LTE with a couple of sites in a few places, there are well over 2000 base stations already deployed and operating in Germany, serving private customers and, perhaps even more importantly, companies that have so far been left behind. Large cities already have live LTE coverage as well, off-loading traffic from UMTS networks.

Waiting until the end of 2012 with the auction means four years after the first networks launched in Europe! Back in the 1980s, the UK was at the forefront of telecommunication development and had one of the most flourishing telecoms landscapes in the world. Whatever remained of that lead is now completely lost with this move, and I am baffled. Whatever the issues are, one must wonder why they were overcome in other European countries, which have now passed the UK by almost half a decade. Is Ofcom trying to make everyone happy? I would argue that that's an impossible mission. Whatever they decide, someone will be unhappy and go to court. Another year is unlikely to change that.

Yes, I know, the current spectrum allocation in the UK is different from that in other countries. But is it the same in any two countries? In the UK, only two network operators have been assigned 900 MHz spectrum, a lease that was only recently confirmed, extended and opened up for UMTS. In Germany, the setup is different: there, the 900 MHz band is assigned to all four network operators, although the shares are of different sizes, giving some more flexibility than others. But don't think this made the auctioning process any easier, with quite a number of companies going to court before the auction to stop it for various reasons. In the end, there is and will be no single auction setup that makes everyone happy and is perceived by everyone as fair.