Android Wi-Fi Tethering – Great But Watch The Battery

In my ongoing exploration of the Android OS I've arrived at Wi-Fi tethering. It's quite simple to set up on Android 2.2, and while I wish the settings would also allow WPA as an encryption algorithm in addition to WPA2, I don't think one can really ask for that. Too bad one of my notebooks will only do WPA and thus can't use it. But apart from that, things work straight out of the box.

Performance is also great, with the full speed of the HSPA chip forwarded to the Wi-Fi interface, as shown by the 6 MBit/s throughput I got. Impressive! Stability is impeccable as well; I tried tethering over two days and the device didn't crash once. On-board apps also remain fully usable while tethering is active and share the same network connection. There's not much more you can ask for.

Two little quirks, though, in addition to the missing WPA encryption:

  • Power consumption: The battery of my Galaxy S lasts for about 3 hours when a notebook is connected over Wi-Fi, with modest web surfing being done in addition to a continuous ping running (see the next bullet point for the reason). One can of course connect the mobile via USB to supply power.
  • Fast dormancy (pre-Release 8) while tethering: The Galaxy S interrupts the RRC connection after 3 seconds of inactivity, resulting in long waiting times, especially during web browsing on the PC. While fast dormancy is great when only background applications are running on the device itself, it should really be switched off while tethering for performance's sake. O.k., you can keep a ping running on the tethered device to prevent the mechanism from kicking in, but that's a bit of a kludge…
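That kludge can be sketched in a few lines: any small periodic packet keeps the RRC connection busy. Here's a minimal Python version, using UDP instead of ICMP so it doesn't need root; the host, port and interval are example values only, and the interval just has to stay below the 3 second inactivity timer:

```python
import socket
import time

def keep_radio_active(host="8.8.8.8", port=53, interval=2.0, duration=60.0):
    """Send a tiny UDP datagram every `interval` seconds so the radio
    never sits idle long enough for fast dormancy to kick in."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    deadline = time.time() + duration
    sent = 0
    while time.time() < deadline:
        sock.sendto(b"\x00", (host, port))  # one byte is enough to reset the inactivity timer
        sent += 1
        time.sleep(interval)
    sock.close()
    return sent
```

Of course this keeps the radio permanently active, which is exactly what drains the battery as per the first bullet point, so it's a workaround, not a fix.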

Thanks Google, a very worthwhile functionality that will find its way into my everyday use for special scenarios!

That Packet Has More IP Addresses Than I Can Count – Almost

How many IP addresses does an IP packet have when it is sent between two devices? In a simple world only two, i.e. one for the destination and one for the source. The fun thing is how often that changes between source and destination due to all the tunneling and NATing applied. Let me give you an example of a typical setup of mine: a netbook connected via a Wi-Fi / 3G bridge with a VPN tunnel established.

Leg 1: Netbook – Wi-Fi / 3G bridge

As the VPN is established, the IP packet is tunneled over IP. Thus there are two IP addresses identifying me as the source and two IP addresses for the destination:

  • IP-1: IP address given by the Wi-Fi bridge to the netbook
  • IP-2: IP address of the VPN remote endpoint

    Inside the VPN tunnel:

  • IP-3: IP address given by the VPN service to the tunnel end point (i.e. netbook)
  • IP-4: IP address of the destination (e.g. the web server)

Leg 2: Wi-Fi / 3G bridge to Mobile Network Gateway (GGSN)

  • IP-5: IP address given to the Wi-Fi bridge by the GGSN (NAT)
  • IP-2: -unchanged-

    Inside the VPN tunnel:

  • IP-3: -unchanged-
  • IP-4: -unchanged-

Leg 3: GGSN to VPN remote endpoint

  • IP-6: IP address of the GGSN after NAT translation
  • IP-2: -unchanged-

    Inside the VPN tunnel:

  • IP-3: -unchanged-
  • IP-4: -unchanged-

Leg 4: VPN remote end point to web server

  • IP-7: IP address of the VPN remote endpoint after NAT translation
  • IP-2: -unchanged-

    Inside the VPN tunnel:

  • IP-3: -unchanged-
  • IP-4: -unchanged-

Legs 1 to 4 are not the full story; there's also tunneling performed in the wireless network:

Tunneling between SGSN and GGSN

User data packets are tunneled over IP in the wireless core network. Here the packet looks like this:

  • IP-8: SGSN network internal IP address
  • IP-9: GGSN network internal IP address

    User data packet:

  • IP-5: -unchanged-
  • IP-2: -unchanged-

    Inside the VPN tunnel:

  • IP-3: -unchanged-
  • IP-4: -unchanged-

Base Station to RNC

While in the past the 3G radio access network was based on ATM, it is more and more being replaced with Ethernet and IP routing. This gives the packet another two IP addresses in this part of the network:

  • IP-10: Base station network internal IP address
  • IP-11: RNC network internal IP address

    User data packet:

  • IP-5: -unchanged-
  • IP-2: -unchanged-

    Inside the VPN tunnel:

  • IP-3: -unchanged-
  • IP-4: -unchanged-

RNC to SGSN

No, we are not finished yet: the interface between the RNC and the SGSN is also based on IP these days, and again tunneling is applied:

  • IP-12: RNC network internal IP address on the core network facing interface
  • IP-13: SGSN network internal IP address on the RAN facing interface

    User data packet:

  • IP-5: -unchanged-
  • IP-2: -unchanged-

    Inside the VPN tunnel:

  • IP-3: -unchanged-
  • IP-4: -unchanged-

13 different IP addresses are used in the course of transferring a packet between source and destination! WOW!
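The first few legs can be sketched in a few lines of Python: model each IP header as a (source, destination) pair, a tunnel as a header pushed on top, and NAT as a rewrite of the outermost source address. The IP-n strings are just the placeholders from the lists above:

```python
def tunnel(packet, outer_src, outer_dst):
    """Push a tunnel header on top; the inner packet travels unchanged."""
    return [(outer_src, outer_dst)] + packet

def nat(packet, new_src):
    """Rewrite only the source address of the outermost header."""
    (_, dst), *inner = packet
    return [(new_src, dst)] + inner

# Innermost layer: the payload inside the VPN tunnel (netbook tunnel address -> web server)
payload = [("IP-3", "IP-4")]

# Leg 1: the netbook wraps the payload in the VPN tunnel
leg1 = tunnel(payload, "IP-1", "IP-2")

# Leg 2: the Wi-Fi / 3G bridge NATs the outer source to its GGSN-assigned address
leg2 = nat(leg1, "IP-5")

# Leg 3: the GGSN NATs once more towards the public Internet
leg3 = nat(leg2, "IP-6")

print(leg3)  # [('IP-6', 'IP-2'), ('IP-3', 'IP-4')]
```

The same `tunnel` call, applied again per network-internal hop, is what stacks up all the further addresses in the wireless network.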

Android and Open Source as A Door Opener for Deep Down Innovation

With the advent of Java on mobile phones many years ago, third party companies for the first time had the possibility to write software that runs on mobile phones. Many other programming environments have followed over the years, the most popular currently being the native programming environments for the iPhone, for Android and for Symbian. While it's incredible what can be done with them, they quite deliberately set one limitation: all 3rd party programs are shielded from the operating system. They can use the APIs offered to them but are restricted from directly accessing any hardware or from interacting directly with other parts of the operating system. With Android, things have become quite different though.

As Android publishes not only an API for programs running in the Dalvik virtual machine but also the complete source code of the operating system, companies interested in offering functionality that requires interaction with the hardware, or that directly extends OS functionality, can do so relatively easily. I have two interesting examples:

  • GAN for Android: Orange UK and T-Mobile US are now shipping Android based phones with Kineto's GAN stack that tunnels GSM voice calls over Wi-Fi (see here and here). Developed many years ago, GAN depended on Nokia, Samsung and others to integrate the GAN stack into mobile phones, as the programming environments of their phones did not allow 3rd parties to dig deep enough into the OS. With Android, however, external companies can do it themselves.
  • NFC functionality: NXP and G&D have worked on integrating NFC and SIM Secure Element functionality into the Android OS (see here). Again, something that would previously have been possible only for device manufacturers can now be done by 3rd parties to enable their services.

But there is one catch: anything running outside the Dalvik virtual machine can't be installed by a third party on non-rooted devices. In other words, such solutions have to be delivered as part of the firmware image, which brings Google and the handset manufacturers back in. 3rd parties can develop the code, but it has to be integrated by the manufacturers. Still, that's a significant advantage over closed source operating systems, where such functionality can only be implemented by the OS owner.

Verizon, LTE and Averages – But What About the Maximum Speed?

Here's an interesting report from PC World about this year's state of mobile networks in the US, with a bunch of measurements and comparisons to previous years. By and large the improvements are impressive, but there are a number of things I haven't quite figured out.

First, the round trip delay times they are quoting. For all networks, even including Verizon's new LTE network, latency values beyond 100 ms are given. When I compare that to HSPA+ networks here in Europe (the same technology AT&T and T-Mobile US use), where typical latency is around 60 ms, I don't quite know where the extra delay comes from!?

And second, the data speeds. Good to see that all carriers have now left the sub-1 MBit/s average speed range and moved significantly up the ladder. The 6+ MBit/s measured in the downlink direction for Verizon's LTE network (in a 10 MHz bandwidth) and around 3.5 MBit/s for T-Mobile US's network (in a 5 MHz bandwidth) compare very well to the 4.5 MBit/s measured in German networks in comparable measurement campaigns. But what about the maximum achievable speeds in the networks?

And by that I don't mean theoretical peaks; those can be taken from the standards documents. What I mean is the maximum speeds that can be reached in live networks under very good signal conditions. In Germany, networks go as far as 12 MBit/s today on one carrier, or 24 MBit/s if you take into account the second 5 MHz carrier deployed in bigger cities for additional capacity (sorry, no reference here, as the values are from the print version of a magazine; my personal record is 11 MBit/s). The article seems to dance around those values without ever quite mentioning them.

Agreed, average speeds also take the overall load in the networks into account, but the article does not say at which times of the day (or night?) the measurements were taken. So one has to be a bit careful here, but by and large the average is a good indicator, especially for smartphone use. Maximum achievable speeds are just as important though, as some people using the network to connect their notebooks to the Internet benefit from them if they are clever enough to look for a good spot when using their device at home or in public, i.e. be close to a window, have a look at the number of signal bars, etc.

So again, a good article, but some interesting details are unfortunately missing.


Exploring Android – Part 6 – Profiles

One of the absolute must-haves for me, which I use quite heavily on my Symbian phone today, is profiles, with which I can configure the phone's behavior at the touch of a button. Especially the ability to silence the phone in general and only allow it to ring for a select few phone numbers is a real must for me. On Android there is no such functionality out of the box: either the phone rings or it doesn't, regardless of who is calling. So I started to look for something in the Android Market and found "Auto Ring", which fits my needs quite nicely. With Auto Ring installed, one can select the numbers from the phone book that will make the phone ring even when the phone is in "silent" or "vibrate only" mode. Works great, and another important piece of my Android puzzle gets solved this way!

Chipset Digging With Android

As a "radio" man, I'm quite interested in how applications and mobile device operating systems communicate with the part of the device responsible for cellular data connectivity, not only in terms of transmitting and receiving user data packets but also in terms of how connectivity to the cellular network is controlled. Here's how it works in Android:

Application Layer

For application layer programs running in the Dalvik virtual machine, information about the current state of the cellular network can be obtained via the android.telephony package. The package contains a number of classes with methods to query the current type of network (e.g. GSM, UMTS), information on the current cell such as the cell ID for GSM or the primary scrambling code (PSC) for UMTS cells, signal strength and quality, and also detailed information about neighboring cells, again including signal strength and quality.

Another package of interest in this regard is android.net. This package contains classes and methods to discover the type of connectivity (e.g. Wi-Fi or cellular), the device's own IP address and DNS server information. In other words, an application can find out a lot of technical information, from the physical layer up to the IP stack.

Operating System

From an operating system point of view, the radio layer information has to be retrieved from the piece of hardware that is responsible for cellular connectivity. This is done via an abstraction library, the Radio Interface Layer (RIL), which offers a hardware independent interface for the operating system and higher layers to query such information and to receive unsolicited notifications from the radio chip about events such as cell changes.

On the other end, the RIL communicates with the radio chip via a non-standardized interface, for which Google provides a reference implementation based on the well known Hayes AT commands (more on that below). Here's a link with some more information on what vendors have to do if they want to customize the radio hardware facing side of the RIL. Over this interface pretty much anything about the radio can be controlled, including retrieving and modifying information stored in files on the SIM card. This link contains an interesting AT command log traced while a circuit switched call was established. Further details about the AT commands can be found in 3GPP TS 27.007.
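To make the AT command interface a bit more tangible, here is a toy sketch of the kind of TS 27.007 command sequence used to bring up a packet data connection. The command strings are the standard ones from the spec, but the helper functions and the APN value are made up for illustration:

```python
def pdp_activation_sequence(cid=1, apn="internet"):
    """AT commands (3GPP TS 27.007) to define and activate a PDP context,
    roughly what a reference RIL or a pppd chat script sends to the radio."""
    return [
        f'AT+CGDCONT={cid},"IP","{apn}"',  # define context: id, PDP type, APN
        f"AT+CGACT=1,{cid}",               # activate the context
        f"AT+CGPADDR={cid}",               # query the IP address assigned by the GGSN
    ]

def final_result(response_lines):
    """A modem response ends with a final result code, typically OK or ERROR."""
    for line in response_lines:
        if line.strip() in ("OK", "ERROR"):
            return line.strip()
    return None
```

For example, `final_result(["+CGPADDR: 1,10.0.0.5", "OK"])` returns `"OK"`, the cue for the higher layer that the command completed successfully.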

Hardware

That leaves the question of where the radio stack runs on a mobile device. In some mobiles a dedicated chip is used, while in others a single chip contains both the processor(s) for the radio stack and the processor(s) for the application layer, i.e. the processors that Android runs on. The radio processor(s) and the application processor(s) run independently of each other.

In the case of the first Android device, the HTC Dream, also known as the G1, a Qualcomm MSM7201A system on a chip is used, as per the Wikipedia entry for the device. That's of course an old and slow chipset platform compared to the dual-core application processors running at GHz speeds in 2011, but that doesn't really matter from a radio/application separation point of view. The Wikipedia page for the chip states that an ARM11 core is used as the application processor, while an ARM9 core is used for the radio stack. In other words, the RIL described above runs on the ARM11 core and communicates over an interface with the ARM9 core, which executes the radio stack in real time, independently of and asynchronously to the application processor.

That separation comes in very handy from an Android development point of view because, apart from the RIL, the 2G/3G radio code is completely independent and even runs on a different hardware unit or a different chip. Android and the radio OS can thus evolve completely separately from each other. Looked at from the radio stack's point of view, it couldn't care less which application OS is running on the other processor; it just delivers its services to Android, Symbian, WP7 or any other OS sitting next door.

This separation is very similar to a scenario in which a 3G USB dongle is used as a modem for a notebook or netbook that runs the operating system (Windows, Linux, Mac OS). Between the 3G dongle (the radio) and the notebook (the application processor), USB is used as the interface. And if you dig a bit deeper, the aforementioned AT commands are used to initiate and tear down IP connections (PDP contexts) and to control the radio in general (e.g. manual network selection, reporting of signal strength, etc.). The RIL in Android, in other words, does something very similar to the PPP daemon in Linux or the dial-up network support in Windows.

And finally, here are two links that show how this is implemented on the circuit board. The first one shows the solution described above, i.e. radio processor and application processor in the same chip in the case of the G1. In the case of the Nokia N8, radio processor and application processor are implemented in two separate chips (Texas Instruments baseband processor and Samsung Application processor) as shown and described here. The Nokia N95 has a similar separation with TI providing both the radio and the application processor chips as shown here. The hardware diagram shows this very nicely.

UMTS 900 in London – A Tough Decision

Recent press reports (e.g. here and here) revealed that O2 UK has expanded their 3G service to the 900 MHz band in London and other big cities in the UK. Quite a surprising move for me since the common perception is that there is not enough bandwidth in the 900 MHz range to allow operators to remove a 5 MHz chunk from GSM services in busy cities and use it for UMTS in this band. So how is this possible in the UK from a technical point of view?

After some digging I found two references (here and in particular here) suggesting that in the UK, the 2×35 MHz of bandwidth in the 900 MHz band is shared between only two network operators. That makes things very easy from a technical point of view, as each operator has much more spectrum compared to countries in which up to four network operators share the resource.

At first glance, this is very good for customers, since 3G indoor coverage should be considerably enhanced. Personally, I can't wait to come to London again to get some first-hand experience.

From a competitive point of view, the move by Ofcom to allow the two incumbent operators to keep all the spectrum and to use it for more than just GSM (refarming) must have been a very difficult one, as it puts the other operators at a disadvantage in the short and medium term. Until they can get similar spectrum, e.g. in the 800 MHz band, until they manage to deploy a network and until they can get devices, a significant amount of time will pass.

And here's why: the spectrum auction for the 800 MHz band in the UK is set for 2Q 2012 and there are still some question marks attached. And once that spectrum is allocated, it is most likely going to be used for LTE. That's good for high speed Internet access with 3G dongles and embedded 3G/LTE modules, but it still leaves the LTE voice problem to be solved. Here's a post of mine from back in 2008 describing the issue. It's 2011 now and I don't see that the industry has moved an inch closer to a solution (dual radio mobiles don't count).

It's not the additional 5 MHz that is the business advantage, it's the indoor coverage. And smartphone owners don't only care about 2G indoor coverage; they want fast Internet access as well. According to this report (again, but see (*) below), the 900 MHz operators have to pay a yearly fee to compensate for the fact that 2100 MHz 3G operators would have to build three times as many base stations to reach similar coverage. But that's easier said than done; just imagine how an operator could triple the number of base stations in London. I wonder if that yearly fee (how much is it, by the way?), which in theory should make the tariffs of the 900 MHz operators more expensive, will be enough to ensure ongoing competition between four network operators. I am a bit skeptical.

Would it have been better to give some of the 900 MHz spectrum to other network operators? Difficult question. Perhaps it would have been a bit more fair in that band, but it would also have meant that, according to current wisdom, there would have been no 900 MHz 3G in the UK, just like in other countries. That leaves the 800 MHz digital dividend band for fast Internet services with good indoor coverage in cities and economical rural coverage, like in other countries. Unfortunately, the 30 MHz of bandwidth there is not enough for four operators; three is the best you can reasonably do.

Tough choices!

(*) P.S.: The article states that EE uses 1800 MHz for 3G services. That is unlikely as there are no 1800 MHz 3G devices on the market today.

doesBenefitFromBatteryConsumptionOptimisation

A bit of a strange post title today, but fitting for the tech deep dive. While recently looking for something in the UMTS RRC specification (3GPP TS 25.331), I noticed a parameter introduced in Release 6 called "deviceType". The parameter is included, for example, in the ue-radioAccessCapabilities information element in the RRC Connection Setup Complete message. By default its value is "doesBenefitFromBatteryConsumptionOptimisation", but it can be set by the device to "doesNotBenefitFromBatteryConsumptionOptimisation". No further explanation is given in the spec. I have to say I'm intrigued, as to me it looks like something that could be used on top of the Fast Dormancy and Continuous Packet Connectivity (CPC) extensions. So far the world has mostly looked at those two, and I think the combination of them makes a good package indeed, as I described here.

A combination of FD and CPC makes a lot of sense for battery driven devices. But what about devices that do not need such optimization to conserve battery power? UMTS modules in netbooks, for example, do not have the same battery capacity restrictions, and here it might be better, both for the user and the network, to keep the device in Cell-DCH state (with CPC on top) for a much longer time if the user is actively using an application that requires frequent exchanges with a server on the Internet. For the network, fewer state changes would be required, while on the UE side, no state changes mean that data can be delivered more quickly after the user has, for example, clicked on a link.

One could even imagine that the parameter could be used to configure CPC in a different way depending on the type of device. For battery constrained devices, CPC could be configured with larger transmission/reception gaps to conserve energy, while for other devices, shorter gaps could be configured for faster reaction times.
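As a toy calculation of that trade-off (all numbers here are invented for illustration; the real DRX cycle lengths and gaps are configured by the network via RRC):

```python
def drx_tradeoff(cycle_ms, on_ms=2.0, p_rx_mw=200.0, p_sleep_mw=5.0):
    """Toy model of a discontinuous reception cycle: the receiver is on for
    on_ms out of every cycle_ms. Longer cycles cut average power but add
    latency, as downlink data has to wait for the next reception slot."""
    duty = on_ms / cycle_ms
    avg_power_mw = duty * p_rx_mw + (1.0 - duty) * p_sleep_mw
    avg_latency_ms = cycle_ms / 2.0  # data arrives uniformly within a cycle
    return avg_power_mw, avg_latency_ms

# A battery constrained smartphone vs. a netbook module on a big battery:
print(drx_tradeoff(320.0))  # long gaps: low average power, sluggish reactions
print(drx_tradeoff(40.0))   # short gaps: faster reactions, more power
```

With the invented numbers, stretching the cycle from 40 ms to 320 ms cuts the average power by more than half while multiplying the average waiting time by eight, which is exactly the kind of knob a "deviceType" hint could steer.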

But o.k., let's first let the industry figure out how to do FD (Release 8) and CPC well before going to the next level of optimization…


What Happened To 2D Bar Codes?

For many years I've had a 2D bar code on the side of my blog to help people get to it more easily on their mobile device when they discover it on their notebook or desktop PC. But these days I wonder if it is really still necessary!? While they seem to be very popular in Japan, 2D bar codes for getting further information haven't really caught on anywhere else. Every now and then I see a 2D bar code on a poster, but if I'm really interested I'd rather type in the name of the product and let Google or Bing guide me to the website. For me that's even more convenient, as making a search query with a single term, even on a virtual keyboard, is still easier than trying to remember where to find and how to use the 2D bar code reader application on my mobile device. What do you think: where will the 2D bar code story go?