Vodafone Websessions Keep Timing-Out

Sometimes you have to look at the good side of things going wrong. Every now and then I use Vodafone's Websession offer in Germany, as it is a convenient way to go online occasionally. If it works… Over the summer, unfortunately, I have observed continuous glitches. Everyone has glitches now and then, and I wouldn't have mentioned it at all if it were only a singular occurrence. But this one keeps dragging on.

So here's what's going on: In most failure cases, the automatic initial redirect to the payment web portal works and the money is deducted from the account, but anything beyond that results either in nothing or in a technical error message on the portal. In most cases, disconnecting from and re-connecting to the network a couple of times helps the service come back to its senses. Not so, however, when I recently needed it. While I was charged and the service kept insisting that it would forward me to the requested page after connecting to the network, it never did, no matter how often I tried to re-connect.

So what's the good side of this, you might wonder? Well, the good side is that Wireshark revealed a bit of how the service works and I learnt a couple of things:

First, the service uses a transparent web proxy: until the session is paid for, and also directly after re-establishing a network connection (to display the remaining time of the session), it answers any HTTP GET request with a “moved temporarily” redirect to the landing page.
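As a rough illustration, here's a minimal sketch of how such a captive-portal style redirect could be observed from the client side. The host name is a placeholder and the script is not tied to Vodafone's actual setup:

# Minimal sketch: check whether an arbitrary HTTP request is answered with a
# redirect to a captive portal. Host name and expectations are assumptions only.
import http.client

def check_for_redirect(host="example.com", path="/"):
    conn = http.client.HTTPConnection(host, 80, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    server = resp.getheader("Server")
    if resp.status in (301, 302, 307) and location:
        print(f"Redirected ({resp.status}) to {location}, Server: {server}")
    else:
        print(f"Direct answer: {resp.status} {resp.reason}")
    conn.close()

if __name__ == "__main__":
    check_for_redirect()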

Second, the web proxy also responds to web page requests itself if there is a problem with the service. In the most recent case, it answered web page requests after a while with a “504 Gateway Timeout”. The picture on the left shows how this looks in practice. The Server header of the response is set to “WebProxy/5.0”. From that I assume that each and every requested web page flows through the transparent proxy. This, by the way, is also supported by the network-based picture compression that can be deactivated via the Vodafone dashboard software or by including a special HTTP header parameter in every request.
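The post doesn't say which header parameter Vodafone's proxy looks for; the standard HTTP directive for telling intermediaries not to modify a response is “Cache-Control: no-transform”, so as an assumption, a request that tries to bypass the image compression could look like this sketch:

# Sketch: ask intermediaries (such as transcoding proxies) not to modify the
# response. Whether Vodafone's proxy honours exactly this header is an
# assumption; "Cache-Control: no-transform" is simply the standard directive.
import urllib.request

req = urllib.request.Request(
    "http://example.com/image.jpg",
    headers={"Cache-Control": "no-transform"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    data = resp.read()
    print(f"Received {len(data)} bytes, Server: {resp.headers.get('Server')}")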

O.k., I've learnt enough this way now, so please Vodafone, fix this service!

The Hollow Operator – From Operator to Owner

LightReading Insider has recently published a paper on the trend of mobile network operators outsourcing operational tasks to external service companies. This includes maintenance of existing core and radio networks, network monitoring and operation, performance monitoring, capacity management, new network rollouts, introducing and running services and many other things. Very interesting to me, and since LightReading was kind enough to send me a copy, I took a closer look.

LightReading chose the term “Hollow Operator” for their paper. Interesting wording, and it makes me wonder how many operational tasks can be outsourced before the term “operator” no longer applies!? What is a network operator that doesn't operate its network anymore? A network owner?

One of the main reasons for companies to outsource work that is not at the core of their business is to reduce cost, which, if it works out, makes them more competitive. Global service companies taking over operational tasks, such as Alcatel-Lucent, Ericsson, Nokia Siemens Networks, Huawei and others, add to this that their global structure allows them to grow the business by improving net subscriber revenue, making the network more operationally efficient and developing the right plan to evolve a network for future consumer demands. And a global structure they truly have. Ericsson states, for example […], that they have 37,000 people working in their service division, 28,000 close to the customer and an extra 9,000 globally. Many of those people are likely to come from network management deals where network operator staff joined the service company when the contract was signed. From a technical point of view, a big advantage is that information and knowledge on how a network can be operated efficiently can be shared, for example by continuous streamlining of company-wide processes as a result of what is learned by operating many networks.

Skeptics argue that outsourcing does not improve anything on its own. In the end, funding has a lot to do with how things develop: if not funded appropriately, outsourcing can quickly lead to degrading standards. Also, outsourcing is difficult to reverse in case the network owner one day wants to take back operation. Once employees are gone and operational procedures are integrated into a different company, it's difficult to get employees and procedures back in-house, unless of course the outsourcing contract contains clauses for such a circumstance. Also, outsourcing increases complexity. Any extras that would previously have been handed down the internal hierarchy are now an external business matter between two companies. That means such requests are answered with a quotation, as they weren't included in the original calculation of the contract. That doesn't help to speed things up. On the technical side, no network is like any other. Each network operator has different components in different configurations and uses different software versions. In short, while processes for network management inside a service company are probably similar, each network requires dedicated experts to deal with its configuration, which in turn limits how well lessons learned in one network can be ported to another.

Many more things can be added to support and counter network operation outsourcing and I can highly recommend LightReading's paper for further reading. I'll leave it at that for the moment but please feel free to add your thoughts by leaving a comment below.

The Cloud and Denial of Services

I am not a huge fan of services in the cloud, as discussed here towards the end of the presentation. Instead, I much prefer to have my services and data running and stored securely behind a firewall in my own home network, readily accessible from my wireless devices over a secure connection across the Internet.

In the past weeks, lots of denial of service attacks have been launched on Twitter, and some mashups and front-ends were blocked for days because they could not be distinguished from the DoS traffic. In effect, no Twitter service for me on a number of occasions when I wanted to use it.

These incidents made me think about my views on connected home services again. Dabr, the Twitter front-end I use on my mobile device, can also be installed on a different server, e.g. at home, and would thus probably not have been an indirect victim of the DoS attack. Another thought was what would happen if, in the future, a very popular web-based service were the target of a DoS attack and you couldn't use your critical applications for days. Not from home and not from the mobile either. Not a nice thought.

Sure, it's possible to launch a DoS attack on connected home services, too, but it's unlikely that anyone would go through all the pain and expense just to make life difficult for one person.

I guess that, too, speaks for the distribution of private services rather than concentration in the cloud.

The Nokia N900 – Escape from the Cloud and Jail?

Next week, Nokia World 2009 will be held in Stuttgart. While I wait for the press to give me the details of the event as I roam the Scottish highlands, I've been thinking a bit about what Nokia's recent announcements around the new Nokia N900 could mean for the future of mobile devices.

To me, the current smartphone market by and large looks as follows:

S60

For the moment I am still stuck with my S60-driven Nokia N95. The OS so far is closed source, but anyone can develop programs for it without depending on Nokia to allow or certify anything, provided the developer thinks the user can handle a couple of warnings during program installation. To get rid of them, programs can be certified by Nokia / S60, which takes a bit of time, but unless the program does something really malicious, Nokia / S60 have no preconceptions about what should be allowed or not. While this all sounds great and a lot of applications are available, S60 has lost a lot of mindshare in the past 18 months. Many developers now prefer the iPhone OS or Android when it comes to new and cool stuff. On top, Nokia has decided to strip out a couple of cool features such as VoIP in its latest phones, a killer argument for me against buying another S60 phone in the future.

The Jail

And then, of course, there is the iPhone. Great marketing, great user interface, very easy to use. Unfortunately, it has no multitasking, and Apple is pretty opinionated about what should run on the iPhone and what not. The latest Apple / Google quarrel is a good example. No, thank you, not my cup of tea either. When I buy a device, I should be the one to decide what I want to run on it and what not.

The Cloud

Next, there's Android. Based on Linux and undoubtedly very innovative, it is most useful if the user shares his private data with Google servers in the cloud. From a usage perspective, it's great, as e-mail, address book, documents, etc. are available and synchronized across all of a user's devices. I like that a lot, but I don't like sharing my private data with Google or with anyone else for that matter. Private synchronization or connected home services are the way forward for me. For details, see here.

The Rest: And then there are OSes like Windows Mobile and the Palm Pre's WebOS, which either fall into one of the categories above or in between.

The N900 – To The Rescue?

So what I want from a mobile device is quite simply described:

  • An OS for which new and innovative programs are developed
  • Multitasking
  • My private data should be treated privately
  • I, and no one else, decide which programs I want to use on my device.

Or in short: The same experience as I have on my PC and my netbook: I decide!

With Nokia announcing the Linux / Maemo based N900 smartphone, I am getting my hopes up a bit again. Maemo has already been around for a number of years on Nokia's Internet tablets, so I have a fair idea of what it is and what it is not. While I've so far not been very impressed by it due to its lack of 2G / 3G support, the wrong form factor and slow speed, Nokia seems to have an answer to all of that with the N900. A 600 MHz processor should hopefully take care of the speed issues, 2G/3G network support has been added and the physical dimensions of the device are in the same ballpark as my current N95. Also, Nokia says that it will be VoIP capable.

On top, Maemo, at least up until now, has been a very open platform from various angles: First, it's based on Linux, so it's very well known in the developer community. Second, unlike Android, where application developers have to work with a Java framework and have no direct access to the OS, Maemo works just like a PC-based Linux distribution: (almost) everything is fully open to developers, existing programs can easily be ported to Maemo and there is no lengthy certification process. In other words, while Android is based on Linux but doesn't give applications access to it, Maemo fully does, unless Nokia decides to remove that openness in the new version of Maemo. Let's hope not.

So if Nokia plays it right, they will make developers happy, they will make users like me happy and they've sold their first smartphone in two years to me!

Diversity Rules

But don't get me wrong here, I don't argue for a full and open Linux phone to be the one and only answer. I think there's also a place for devices that do fewer things, that are not as configurable and expandable, and that are more tightly controlled. The reasons for that are plenty: ease of use, better support for users from manufacturers or network operators, etc. While many users might want that in the PC world and thus might prefer it in the mobile world too, there are many, like me, who thrive on openness!

Vodafone and Petabytes

Another interesting number popped up on the Internet recently. Here, Teltarif quotes Georg Benzer, Chief Network Officer for Vodafone/Arcor in Germany, saying that Vodafone/Arcor (I assume he means both their DSL and mobile networks in Germany) transport 1.5 petabytes a day. No more details were given and there is only one more source on the net reporting the same number. But nevertheless, let's play around with this number a bit.

The 3UK CTO recently reported that at the end of 2008 they transferred around 1,000 terabytes (= 1 petabyte) a month through their wireless network in the UK. Let's say most of that traffic was generated by 3G dongles. The exact 3G dongle subscriber number of 3UK is not known; I estimate it at around half a million at the end of 2008. That means every subscriber consumed about 1,000 terabytes / 500,000 subscribers = 2 GB of data a month.

Now let's say the 1.5 petabytes a day, or 45 petabytes a month, in the Arcor/Vodafone network were consumed by both fixed and mobile subscribers. Let's say Vodafone Germany has 2 million 3G dongle users (just an assumption, approximated from the 3UK number, no source for this); then out of the 45 petabytes, 4 petabytes would come from mobile subscribers. That means 41 petabytes are used by fixed DSL subscribers.

The number of fixed DSL subscribers of Vodafone/Arcor Germany is reported to be around 3 million. That makes 41 petabytes / 3 million subscribers = 13.6 gigabytes per subscriber per month on average. Note that both the 2 GB above and the 13.6 GB are average values and there's no telling from those numbers how many users are at either end of that figure (i.e. how many use much less and how many use much more).
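For transparency, here's the back-of-envelope arithmetic above as a short script. The 500,000 3UK dongle users, 2 million Vodafone dongle users and 3 million DSL subscribers are the same assumptions as in the text, not reported figures:

# Back-of-envelope arithmetic from the text; subscriber counts are assumptions.
GB_PER_PB = 1_000_000  # gigabytes per petabyte, decimal units as used in the post

# 3UK: ~1 petabyte per month over an estimated 500,000 3G dongle subscribers
uk_gb_per_dongle_user = 1 * GB_PER_PB / 500_000            # = 2 GB/month

# Vodafone/Arcor: 1.5 PB/day -> 45 PB/month, split between mobile and DSL use
total_pb_per_month = 1.5 * 30                              # = 45 PB
mobile_pb = 2_000_000 * uk_gb_per_dongle_user / GB_PER_PB  # = 4 PB (assumed 2M dongles)
dsl_pb = total_pb_per_month - mobile_pb                    # = 41 PB
dsl_gb_per_subscriber = dsl_pb * GB_PER_PB / 3_000_000     # ~13.6-13.7 GB/month

print(f"{uk_gb_per_dongle_user:.1f} GB per dongle user, "
      f"{dsl_gb_per_subscriber:.1f} GB per DSL subscriber per month")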

Many of Arcor/Vodafone's DSL subscribers also use their DSL line for VoIP (with a POTS to VoIP converter at home). That traffic should arguably not be counted, as it doesn't leave the network, at least not as IP packets. Let's say the average subscriber uses the fixed line phone for 5 hours a month. VoIP produces 2 * 80 kbit/s (uplink + downlink) = 160 kbit/s of data traffic = 72 MB per hour, or 360 MB for 5 hours. Not very much compared to the 13.6 GB per month, which is only reduced to about 13.3 GB per month.
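And the VoIP adjustment in the same style; the 80 kbit/s per direction and the 5 hours of calls per month are the assumptions from the text:

# VoIP adjustment; codec rate and monthly call time are the text's assumptions.
voip_kbit_per_s = 2 * 80                           # uplink + downlink = 160 kbit/s
mb_per_hour = voip_kbit_per_s * 3600 / 8 / 1000    # = 72 MB per hour of calls
mb_per_month = mb_per_hour * 5                     # 5 hours/month = 360 MB
non_voip_gb = 13.6 - mb_per_month / 1000           # ~13.3 GB of other traffic remain
print(f"{mb_per_hour:.0f} MB/hour, {mb_per_month:.0f} MB/month, "
      f"{non_voip_gb:.1f} GB of non-VoIP traffic per subscriber")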

A raw comparison of the two numbers would indicate that DSL subscribers are transferring 7 times more data through their connection than wireless subscribers. But I think that is a bit too simple a view. Most wireless subscribers are likely to also have a DSL line at home and fixed and mobile use might be different. Further, DSL lines at home are often shared with family members and several devices while a 3G dongle is mostly used by a single person with only one device at a time. Also, since most wireless offers have bandwidth caps, heavy users are much more likely to use a DSL line rather than a wireless modem, thus further distorting the direct comparison.

So despite the two numbers not being directly comparable, they nevertheless give an interesting indication that mobile use is not that far away from fixed line use.

Nokia Energy Profiler V1.2 now with 3G State Analysis Mode

A while back I reported on the Nokia Energy Profiler, a very useful utility from Nokia for S60 phones to measure power consumption. From the power consumption one can then deduce which radio state the mobile is in and how long it is kept there by the network during inactivity before a more power-conserving state is selected. For details, see here. Now, Nokia has released V1.2 of the utility, which has a dedicated screen for showing the 3G radio states.

It's a bit hidden, though. First, it has to be activated in the preferences. Second, during recording only 0, 1, 4 or 8 "CH" are shown. Not sure what "CH" means. Anyone? Anyway, 0 and 1 represent the Idle state, 4 the Cell-FACH state and 8 the Cell-DCH state (in HSDPA mode). The states are shown by stopping the recording and then using the 4-way navigation key to scroll back. In this mode, the application shows how long the mobile was in each state while you scroll back to the left.
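As an aside, if such a trace were exported, mapping those "CH" values to radio states and tallying the time spent in each could look like the sketch below. The sample trace and the sampling interval are illustrative assumptions, not something the Energy Profiler actually exports in this form:

# Sketch: map the profiler's "CH" values to 3G RRC states and tally the time
# per state. The sample trace and the 0.25 s sampling interval are assumptions.
from collections import defaultdict

CH_TO_STATE = {0: "Idle", 1: "Idle", 4: "Cell-FACH", 8: "Cell-DCH"}

def time_per_state(trace, sample_interval_s=0.25):
    """trace: a sequence of CH values, one per sample interval."""
    totals = defaultdict(float)
    for ch in trace:
        totals[CH_TO_STATE.get(ch, "Unknown")] += sample_interval_s
    return dict(totals)

# Example: ~20 s in Cell-DCH, ~90 s in Cell-FACH, then back to Idle
example_trace = [8] * 80 + [4] * 360 + [0] * 40
print(time_per_state(example_trace))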

I've tested the application in the T-Mobile network in Germany and the Orange network in France. T-Mobile keeps the connection in Cell-DCH state for around 20-25 seconds, including the (Opera Mini) page download time, which is around 2-3 seconds. The Cell-FACH state is kept quite long, somewhere between 1.5 and 2 minutes, before the connection is put into Idle state. In the Orange network, Cell-DCH state is kept for 10-15 seconds and Cell-FACH state for around 30 seconds. A bit better for battery consumption, one might argue, but T-Mobile's settings are better for the browsing experience if one remains on a page for more than 30 seconds.
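To get a feel for what these timer settings mean for the battery, here's a rough sketch. The current draw per state is a ballpark assumption on my part, not a value measured with the Energy Profiler:

# Rough charge estimate per page view under different RRC timer settings.
# Current draws per state are ballpark assumptions, not measured values.
CURRENT_MA = {"Cell-DCH": 250, "Cell-FACH": 100, "Idle": 5}

def charge_mah(dch_s, fach_s):
    """Charge consumed (mAh) while the timers keep the radio in active states."""
    return (CURRENT_MA["Cell-DCH"] * dch_s + CURRENT_MA["Cell-FACH"] * fach_s) / 3600

t_mobile = charge_mah(dch_s=22, fach_s=105)  # ~20-25 s DCH, ~1.5-2 min FACH
orange = charge_mah(dch_s=12, fach_s=30)     # ~10-15 s DCH, ~30 s FACH
print(f"T-Mobile: ~{t_mobile:.1f} mAh per page view, Orange: ~{orange:.1f} mAh")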

Push and Pull, Keep-Alive and Wastefulness

O.k., here are some follow-up thoughts after my previous post on background applications that generate keep-alive IP packets which have a negative effect on radio interface efficiency. The “efficiency” issue here is that the ratio between the amount of data transferred and air interface radio signaling is very unfavorable for small bursts of data, especially in cellular broadband networks such as UMTS. So Dan asks in a comment to the post why mobile e-mail (push or pull) could be counted in the “wasteful” category. All right, here we go, this is my take on it:

There are two kinds of mobile e-mail delivery:

The first is push e-mail to mobiles, such as on the Blackberry. Here, the server is likely to communicate with the mobile device only when there is an e-mail to deliver. I haven't tested it personally yet, so I won't make a definite statement here, but I assume that even if some keep-alive messaging is necessary, for example when no e-mails are delivered for some time, it should not amount to much. IMAP push, which is supported by more and more phones these days, should also not generate keep-alive messaging.

The second is pull e-mail, which I use, for example, as I don't like the IMAP (push) implementation of my e-mail program. My polling timer is set to 10 minutes, so my N95 checks for e-mail 6 times per hour. That is definitely more wasteful than push if there are fewer than 6 e-mails per hour. If you receive more e-mails per hour, however, it can even be more efficient than push.
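A simple way to think about that trade-off as a sketch, under the simplifying assumption that each poll and each pushed e-mail triggers one radio bearer establishment:

# Sketch: compare radio bearer establishments per hour for pull vs push e-mail.
# Assumes one bearer setup per poll and one per pushed message (a simplification).
def bearer_setups_per_hour(emails_per_hour, poll_interval_min=10):
    pull = 60 / poll_interval_min  # polls happen whether or not there is new mail
    push = emails_per_hour         # one delivery per incoming e-mail
    return pull, push

for emails in (2, 6, 12):
    pull, push = bearer_setups_per_hour(emails)
    if push < pull:
        verdict = "push is more efficient"
    elif push > pull:
        verdict = "pull is more efficient"
    else:
        verdict = "about the same"
    print(f"{emails:2d} e-mails/h: pull={pull:.0f} setups, push={push} -> {verdict}")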

So is mobile e-mail wasting air interface resources? I guess that depends on the definition. If the definition is that an application is wasteful when it only transfers little data per radio bearer setup, then I guess the answer is yes. But then, small-screen web browsing, for example with Opera Mini, would have to be categorized as quite wasteful, too. Many pages I view are compressed to less than 20 kB. Ouch, that hurts, as it's one of my favorite applications…

So my own definition of “wasteful” would be:

Exchange of IP packets for frequent keep-alive messaging that do not contain data.

That excludes e-mail push, Opera Mini use and, depending on configuration and number of emails per hour, e-mail pull.

That still leaves us with a lot of other applications, especially when connecting the PC to the cellular network, that keep babbling away and provoke lots of bearer reconfigurations. But as I said in the previous post on this topic: For battery driven devices, always-on applications are quite likely to be optimized over time to talk less to increase the battery lifetime.

A Netbook, eeeBuntu and Mobility – Part 3

I've had my new netbook for about a month now (see here and here), so it's time for an update on how things turned out. I was a bit skeptical at first whether I would keep Ubuntu Linux on the machine or revert to the original Windows XP. A month later, I am pretty much convinced that Ubuntu is the right thing for me on this machine.

One of the most important things for me is the startup time of the operating system and the applications. In both categories, Ubuntu does extremely well. Booting the system takes just around 60 seconds. Going to suspend mode and waking up again takes just 6 seconds. That's almost instantaneous and helps a lot if you just want to do something quickly, like looking something up on Wikipedia.

The applications I use most are Firefox, Thunderbird, Open Office, GIMP, Pidgin (for IM) and Skype. Even when compared to my full notebook with Windows XP, most of them launch much quicker. Sometimes I even catch myself thinking that the netbook is faster than the notebook…

Some especially noteworthy things I haven't mentioned so far:

  • No drivers are needed for 3G phones or USB sticks. Both my Nokia N95 phone and the Huawei E220 3G USB stick worked right away. Not quite perfect as reported in part two, but it's a huge plus not having to install third party software, which often does more harm than good.
  • HP went out of their way to produce Linux drivers for their multifunction printers. The package is called HPLIP and makes using my printer / scanner / fax over Wi-Fi very easy. I even dare say the software is much quicker than the Windows version, especially the scanning part. No waiting for the next dialog box, no long program startup times, the scanner just jumps into action when the scan button is pressed. Conversion to JPEG and PDF works out of the box, too, very nice!

But where there is light, there is shadow, too. Here are the things that required under-the-hood tweaking to get them working:

  • The fixed-line Ethernet chip was not detected automatically, so I had to install the driver manually. It's not very complicated, but for the average user compiling a driver is not a straightforward thing.
  • There seems to be a WPA2 problem with the Wi-Fi driver as I get lots of packet retransmissions. I've tried with several access points but the result is always the same. When going back to WPA encryption, everything is fine. I've searched the forums but haven't found anyone reporting this. Under Windows XP, WPA2 is working fine so it seems to be a driver issue.
  • The built-in video camera caused some problems. I got it working for a while but it stopped working once I experimented with the screen resolution of the external VGA port and a second monitor. It seems the graphics driver can't handle the advanced functions at a higher screen resolution. Also, desktop effects like windows zooming in and out when they are minimized only worked with the lower resolution. Getting the effects back requires manual intervention in the xorg.conf file as described here.
  • Automatic suspend when closing the lid did not work at first. Even worse, the processor utilization went to 100% and the netbook kept running. The root cause seems to have been a BIOS issue. After upgrading the BIOS of my Acer Aspire One D250 to V1.07, suspend on closing the lid now works.

So even though it required some tweaking I've got a fully functional Ubuntu netbook now and I am very happy with the performance.

No more ‘3 Like Home’ International Roaming with 3UK

Back in 2007, network operator '3' in the UK announced that they would no longer charge roaming fees for voice or data between their networks worldwide. A great offer, even though on the occasions I tried it, their interconnection was hopelessly overloaded. Looks like the offer is no more, as 3UK has reintroduced international roaming charges as of July. A great pity… Strangely though, '3 Like Home' is still offered by 3 Austria!?

The Battery is Part of the Mobile Experience

This might seem obvious to most, but I just realized these days how important the battery is for the mobile experience. I recently bought a netbook (see here and here) and while most experiences have been positive, a battery lifetime of only 2 hours just isn't enough for me in many cases, especially when I am traveling. Even when it is enough, connecting the netbook back to the mains all the time for recharging is a hassle. So I bought an extension battery pack, which gives me 6 hours of autonomy in addition to the 2 hours of the standard battery. An incomparable experience! Now, even when traveling for a whole day, sitting in the train, waiting at the airport and on the plane, I don't have to worry about the netbook running out of power. Very nice!