Release 8 Fast Dormancy Now In 3 UMTS Networks In Germany

Two and a half years ago I wrote a lengthy post about the power consumption problems of smartphones and one remedy for them, referred to as 3GPP Release 8 Fast Dormancy. This feature enables the mobile device to inform the network that it expects no further data to be transferred for the moment and that it would like the radio link to be downgraded to a more energy efficient state. This way, the timeout period during which power consumption is high can be significantly reduced. This is very beneficial when only background traffic such as keep-alive pings and email push/pull services communicating with a server produces short bursts of traffic while the mobile is not actively used. Another benefit is that the connection is put into a semi-dormant state (Cell-PCH / URA-PCH, see the post linked above) from which it can be reactivated much more quickly than from a fully idle state. Shortly after that post, one German network operator actually switched on the feature.

So when I recently checked the state of the networks in Germany, I was positively surprised that three out of four networks have the feature implemented and activated by now. Two of them switch the connection to the Cell-PCH state while one uses URA-PCH. Only one laggard remains, incidentally the worst-performing network in recent network comparisons.

So what's the difference between Cell-PCH and URA-PCH? In Cell-PCH state, the mobile needs to send a cell update message whenever it moves from one cell to another so the network can send a paging message for incoming voice calls, SMS messages or IP packets via the right cell. When users are moving, or are located right between two cells, this results in increased cell update signaling. URA-PCH, on the other hand, groups several cells into a common registration area (URA = UTRAN Registration Area), thus reducing the cell update signaling. Whether this is better than Cell-PCH depends, of course, on how many cells are in a URA.
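
As a side note, the benefit of the feature can be illustrated with a small sketch. The state names below are the real UMTS RRC states, but the timer values and relative power figures are purely illustrative assumptions of mine, not values from any actual network:

    # (state, relative power draw, inactivity timeout in seconds before the
    #  network downgrades the connection to the next state) - illustrative only
    STATE_LADDER = [
        ("CELL_DCH", 100, 8),    # dedicated channel, highest power drain
        ("CELL_FACH", 40, 4),    # shared channel, reduced drain
        ("CELL_PCH", 2, None),   # semi-dormant, paging only, very low drain
    ]

    def energy_after_burst(fast_dormancy):
        """Rough relative energy cost (power units x seconds) spent in
        high-power states after a short traffic burst has ended."""
        if fast_dormancy:
            # The device signals "no more data expected", so the network can
            # move it to CELL_PCH right away instead of waiting for timeouts.
            return 0
        # Without fast dormancy, the device rides out the DCH and FACH timers.
        return sum(power * timeout for _, power, timeout in STATE_LADDER[:2])

    print("relative energy without fast dormancy:", energy_after_burst(False))  # 960
    print("relative energy with fast dormancy:   ", energy_after_burst(True))   # 0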

How Antennas Change Over Time

Antennas and base station sites can be seen everywhere these days, but it's pretty difficult to see how they change over time. When I recently came home and looked out the window, I had the impression that something had changed at the antenna site on the building on the opposite side of the street, but I couldn't quite figure out what was different. Then I remembered that I had taken a picture two years ago, so it was easy to compare.

And here's the result: the left part of the image (click to enlarge) shows how the antenna installation looks today and the right part shows how it looked two years ago. Before the configuration was changed, there were three antennas covering each sector: one antenna was installed on top and two antennas were mounted closely side by side below. Today, there's only a single antenna casing with at least two antennas inside, as can be deduced from the number of cables at the bottom of the casing. Furthermore, a second microwave antenna has been installed on the main mast, below the one already in use two years ago.

Quite a significant change, and I can only speculate as to why it was done. I am pretty sure the top antenna belonged to a different network operator than the lower antennas. So does its absence mean that this operator no longer uses the site? That seems likely, as I am not aware of any antenna sharing deals between network operators. And how could the lower antennas have been changed at the same time as the upper antenna of presumably a different company was removed? Coincidence? Cooperation?

Questions upon questions. But one thing is for sure: I don't remember my surroundings in as much detail as I always thought, as otherwise I would have immediately noticed the missing top antenna instead of having to compare today's state with that of two years ago. That is interesting as well.

P.S.: Note that the sky is grey on both pictures. I'll let you draw your own conclusions…

The QUAM Story

Whenever I look at how the mobile space has developed in the US and how it could be so different from Europe, I easily forget that over here, very strange and incomprehensible things have happened as well. A very good case in point was the UMTS auction a decade ago, which yielded a total of around 50 billion euros in spectrum license fees for the German government from the 6 winners of the auction.

While the 4 incumbents subsequently built their UMTS networks, the two new entrants failed spectacularly and not only lost all those billions spent on licenses but also the little money that was left after the auction for actually building the networks. And it's not as if the backers of those two newcomers should not have known better, as they were Telefonica (Spain), Sonera (Finland) and France Telecom. The story of Quam, backed by Telefonica and Sonera, can be found in this recent article. Google offers a handy translation to English here.

The article is very informative, but I still have the same question I had before: how could they have spent all those billions and then run out of money? Incomprehensible.

Wi-Fi Tethering Not Used On My Train A Lot – Yet

Wi-Fi tethering has been in Android and other mobile operating systems for quite some time now, and as of late I've become quite fond of it, having exchanged my 3G dongle for Wi-Fi connectivity to a phone or tablet. It seems, however, that for the moment I am pretty much alone with this approach on my daily commute, as I don't see any other Wi-Fi hotspots with a strong signal in the train, in which, by the way, the carriages are openly connected, i.e. my PC could see a Wi-Fi signal from at least 4 or 5 carriages. Just a note on the blog so I can come back to it should this change in the future.

The Additional Antenna in Korea

Every now and then there is a note in the press about the success of mobile TV in Asian countries such as Japan and Korea, combined with bewilderment about how it could have taken off there while it wasn't a success at all elsewhere.

Having been to Korea recently, I can confirm this. I noticed that the Samsung phones I know from Europe look very much the same there, BUT they have a retractable antenna for television reception at the top. One doesn't see it at first as it's nicely integrated into the design, but it is there. And I think the big difference to the approach tried in Europe, for example, is that TV reception on the mobile phone seems to be free. No extra monthly charge, who could say no?

I am not sure, however, whether they receive the normal TV program on their mobiles or whether it is special content. From a technical point of view I wonder if there's a separate chip for TV reception or if it is somehow integrated into one of the other chips. And I wonder what the motivation is for manufacturers to put this into the Korean versions of their devices. If it's not revenue generating, it must be quite popular, as otherwise I am sure the companies wouldn't go the extra mile and take on the extra expense to include it. As usual, one question answered, several new ones raised 🙂

Three Network Operators Are Not Enough

In most countries in Europe, there are four mobile networks for customers to choose from. From a consumer point of view, four seems to be the "lucky" number, as in countries with fewer network operators real competition often doesn't get a chance and network quality and prices are not where they are elsewhere.

A case in point is France, where until last year only three network operators served the market with pretty much the same prices. Not a lot of competition, as could clearly be seen in the prepaid market, which was one of the most unattractive in Europe, with validity periods of only 1-3 months for top-ups.

A lot has happened in the 12 months since Free started, however, and prices have come down quite considerably to a more "European" average level. Lots of complaints have been heard since then, up to and including parliamentary hearings. And it seems feelings are still running high, judging by the following quote from Stephane Roussel, SFR's CEO:

'The number of mobile operators in France is clearly excessive, unreasonable in my opinion.'

I find the words "excessive" and "unreasonable" quite interesting. So the situation in a lot of other countries, in which the four network operator system works quite well, is clearly excessive and unreasonable? Further, in his opinion, to quote the article linked above, three network operators would be ideal. Hm, ideal for the network operators or for the customers?

Measuring VM Performance with SunSpider

In my previous post I used the SunSpider JavaScript test in an attempt to get a feeling for the performance of smartphones compared to PCs. On my PC I use quite a number of virtual machines and guest operating systems for various purposes. The SunSpider test is also quite revealing of how virtualization impacts the performance of a guest OS.

My hypervisor of choice is VirtualBox, and the guest operating system it seems to handle best as far as CPU performance is concerned is Windows XP. On the host system running Ubuntu 12.04, the SunSpider benchmark results in a score of around 420 ms with Firefox 16.0.2. Running the same test with the same browser in the virtualized Windows XP results in a score of 455 ms, i.e. only slightly slower.

On a Windows 7 guest OS, however, the score rises to 604 ms, quite a bit slower. And an Ubuntu 12.04 guest only reaches a score of 607 ms. In other words, while the Windows XP guest incurs almost no slowdown, Windows 7 and Ubuntu are roughly 45% slower than the host. That is very strange and I haven't figured out the reason for this yet.
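
For reference, the relative slowdowns follow directly from the scores above; here is a quick sketch of the arithmetic (scores in ms, lower is faster):

    # SunSpider scores (ms, lower = faster) from the measurements above.
    host = 420  # Firefox 16.0.2 on the Ubuntu 12.04 host
    guests = {"Windows XP": 455, "Windows 7": 604, "Ubuntu 12.04": 607}

    for name, score in guests.items():
        slowdown = (score / host - 1) * 100
        print(f"{name} guest: {slowdown:.0f}% slower than the host")
    # Windows XP guest: 8% slower
    # Windows 7 guest: 44% slower
    # Ubuntu 12.04 guest: 45% slower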

Also, I can rule out that the SunSpider test gives different results in different operating systems running natively. On another computer I have Windows Vista and Ubuntu 12.04 installed natively, and the SunSpider results with Firefox 16.0.2 are almost exactly the same.

Comparing CPU Power Between Smartphones and Notebooks with SunSpider

ARM based smartphone CPUs are getting more and more powerful, but so far I have found it difficult to compare their performance to notebooks in a meaningful way. Now I have found SunSpider, a benchmark available on pretty much any platform, which helps in this quest. In essence, the SunSpider test benchmarks JavaScript execution speed and can be used for various things, such as comparing the JavaScript implementations of different web browsers running on the same machine, or comparing different devices running the same web browser.

When comparing smartphones with notebooks, both the device and the web browser implementation change, which is not ideal. However, when testing the JavaScript engines of Firefox and Chrome on a PC, and of Opera Mobile and the native Android browser on a smartphone, their results only differed on the order of 10%. That's an indication that the highly optimized JavaScript engines of different browsers do not have a significant impact on the general result when comparing smartphones with PCs this way. The following list shows the results of running SunSpider 0.9.1 on different devices:

  • 175 ms, Notebook, Intel i5-2520M CPU, 2.5 GHz, Firefox 16.0.2, Windows XP (2012)
  • 266 ms, Notebook, Intel Centrino 2, Firefox 16.0.2, Ubuntu 12.04 (2009)
  • 269 ms, Notebook, Intel Centrino 2, Firefox 16.0.2, Windows Vista (2009)
  • 301 ms, Notebook, Intel Core 2 Duo, Firefox 16.0.2, Windows 7 (2009)
  • 410 ms, Notebook, Intel i3-2367M, 1.4 GHz, Firefox 16.0.2, Ubuntu 12.04 (2012)
  • 690 ms, Netbook, ARM Exynos 5 Dual, 1.7 GHz, Chrome OS (result from here) (2012)
  • 1034 ms, Netbook, Intel Atom N570 (second generation), 1.66 GHz, Chrome OS (result from here) (2011)
  • 1194 ms, Android 4.0 based high end smartphone, ARM, native browser (2012)
  • 1266 ms, Netbook, Intel Atom N270 (first generation), 1.6 GHz, Firefox 16.0.2, Ubuntu 12.04 (2009)
  • 1279 ms, Lava X900, Android smartphone, Intel Atom based (result from here) (2012)
  • 1400 ms, Samsung Galaxy S-III International, ARM, native browser (result from here) (2012)
  • 2000 ms, Android 3.0 based tablet, native browser, ARM (2011)
  • 6100 ms, Legacy Android based high end smartphone, native browser, ARM (2010)
  • 11062 ms, Nokia N8, Opera Mobile browser (2010)

(approx. device release date in brackets)
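
To put these numbers into perspective, each score can be normalized against the fastest device in the list; here is a small sketch using a few of the values from above:

    # Normalize a few of the SunSpider 0.9.1 results above against the
    # fastest device in the list (lower ms = faster).
    results = {
        "Notebook, Intel i5-2520M (2012)": 175,
        "Netbook, Exynos 5 Dual, Chrome OS (2012)": 690,
        "Android 4.0 high end smartphone (2012)": 1194,
        "Netbook, Intel Atom N270 (2009)": 1266,
        "Nokia N8, Opera Mobile (2010)": 11062,
    }

    fastest = min(results.values())
    for device, ms in sorted(results.items(), key=lambda item: item[1]):
        print(f"{device}: {ms} ms, {ms / fastest:.1f}x the fastest score")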

When comparing the results there are a number of interesting conclusions that can be drawn:

First, my current Intel i3 based notebook is significantly faster than the Intel Atom based netbook I used for my daily work while traveling until only recently. No surprise here, one can feel the difference.

Second, the latest iPhone's single core CPU performance is better than that of my 3 year old netbook running the latest Ubuntu and the latest Firefox browser. They are still sort of in the same ballpark when it comes to performance, and I wouldn't want to trade my i3 based notebook for my netbook again. However, the point is that the latest smartphone processing power is in the area of first generation Atom based netbooks.

Third, the latest Chrome OS netbook (the XE 303) uses an ARM processor, and while its performance is not quite the same as that of my Intel i3 based notebook, it is still almost twice as fast as my 3 year old Atom based netbook. It comes close enough to my current notebook, however, that I'd really like to try it with an Ubuntu installation once there's an ARM version that supports that device.

Fourth, heat: notebooks used to get pretty hot, but even my Intel i3 based notebook remains surprisingly cool (with a fan running).

In summary, the list clearly shows how close ARM processors have come to Intel based devices in the SunSpider test and how the performance difference between low power desktop devices and smartphones is shrinking. In another two or three years, perhaps even sooner, slim tablets will have enough processing power, at acceptably low heat output, to become full desktop replacements with operating systems that offer desktop-like functionality when needed.

Cell Logger 2 Launched – Signal Strength And Cell Tracking Now With GPS Info

One year after the first version of my experimental Android program – Cell Logger – I have published version 2.0 with one major addition: it now uses the GPS receiver to also track the location of signal strength and cell changes. You might remember this and this blog post, where I ran some tests with it in Germany and Korea.

If you are interested, you can download the APK for a direct install on the device via this link. The source code is available via this link. In Google Play, you'll find the app by searching for "Cell Logger 2".

Whether you install directly or via Google Play, it's a good idea to also download the source zip file from this page, as there's an extra folder inside that contains some useful information on how to use the generated KML/XML file for offline analysis with Google Maps or OpenStreetMap.

And after the jump, some more detailed info. Agreed, as always with experimental projects, documentation etc. could be better 🙂 Nevertheless, enjoy.

License

Cell Logger is available as an open source Android app under the GPL version 2 or later. For details see the source code.

What does it do?

Cell Logger records the date and time of signal strength changes, the cell's Location Area ID and Cell-ID and, if available, the GPS coordinates at which a change to a cell that has never been observed before occurs. GPS coordinates are not recorded for changes between cells that have already been observed, to prevent ping-pong handovers in certain situations from being counted as real cell changes. On the display of the device, all this information is shown in real time, including counters for the number of cell changes and the number of unique cell changes (i.e. without ping-pong).
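
The "unique cell change" logic can be sketched roughly as follows. This is not the actual app code (which is available in the source package linked above), just an illustration of the idea:

    # Sketch of the unique-cell-change logic described above: GPS coordinates
    # are only logged for a change to a cell that has never been seen before,
    # so ping-pong handovers between two known cells are not counted as real
    # cell changes. Illustration only, not the app's actual code.
    seen_cells = set()
    cell_changes = 0
    unique_cell_changes = 0

    def on_cell_change(lac, cell_id, gps_fix):
        """Called whenever the serving cell changes; gps_fix may be None."""
        global cell_changes, unique_cell_changes
        cell_changes += 1
        cell = (lac, cell_id)
        if cell in seen_cells:
            return  # known cell, possibly a ping-pong handover: no GPS logging
        seen_cells.add(cell)
        unique_cell_changes += 1
        if gps_fix is not None:
            print(f"new cell {cell} first seen at {gps_fix}")  # stands in for the KML log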

Whenever a cell change occurs, data is written to a file in the main directory of the mass storage partition of the device. The file name used for this is “cell-log-data.txt” and data is appended until the file is removed. The output is buffered in memory until 2 kB of data have accumulated, to prevent frequent write accesses to flash memory. The buffer is also flushed to the file when the app is moved to the background.
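
The buffering works roughly like this, again just a sketch of the principle with the 2 kB threshold mentioned above, not the app's actual code:

    # Sketch of the buffered logging described above: log lines are collected
    # in memory and only flushed to "cell-log-data.txt" (always appended) once
    # about 2 kB have accumulated, or when the app goes to the background, to
    # limit write accesses to flash memory. Illustration only.
    BUFFER_LIMIT = 2048  # bytes
    _buffer = []

    def log_line(line):
        _buffer.append(line + "\n")
        if sum(len(entry) for entry in _buffer) >= BUFFER_LIMIT:
            flush()

    def flush():  # also called when the app is moved to the background
        if not _buffer:
            return
        with open("cell-log-data.txt", "a") as log_file:
            log_file.writelines(_buffer)
        _buffer.clear()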

This document is accompanied by an Excel file that gives a description of the fields of the .txt file described above. For analysis it often makes sense to copy and paste the data contained in the text file into the Excel spreadsheet. Auto filters for each column have also been added for easy data evaluation.

When a unique cell change occurs and GPS information is available, the event is logged into a separate file, “cell-change-log.kml”. The format is text/XML and can be used for import into Google Maps (http://maps.google.com). A Google account is needed to use the “import” functionality on the website. As the app has no control over when logging starts and stops, the XML header and footer need to be inserted after copying the file to a PC, before the data is imported into Google Maps. The file “header-footer.kml” that accompanies this document contains the required XML structures for easy copy/paste into the KML file used for the import.
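
If you prefer not to copy and paste the header and footer manually, a small script on the PC can do the wrapping. The sketch below assumes a plain, generic KML skeleton; the header-footer.kml file shipped with the source additionally contains the style definitions described further down:

    # Wrap the raw placemark data logged by the app into a complete KML
    # document before importing it into Google Maps. The skeleton below is a
    # generic KML frame; the header-footer.kml from the source package also
    # contains the style definitions described below.
    KML_HEADER = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Document>\n"
    )
    KML_FOOTER = "</Document>\n</kml>\n"

    with open("cell-change-log.kml") as raw_log:
        body = raw_log.read()

    with open("cell-change-log-complete.kml", "w") as complete:
        complete.write(KML_HEADER + body + KML_FOOTER)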

The Network Type that is recorded in each line is a number as given by Google's API. Here are some numbers and their meanings:

  • 2 = GSM/GPRS/EDGE
  • 3 = UMTS
  • 8 and 15 = HSPA (some networks switch between 3 (idle) and 8/15 (DCH))
  • 13 = LTE

The KML file uses three different styles to indicate the different network types. The numbers given by the Google API are mapped as follows:

  • 1, 2 = GSM is mapped to STYLE 2, which corresponds to blue balloons in the KML header file.
  • 13 = LTE is mapped to STYLE 3, which corresponds to yellow balloons in the KML header file.
  • Everything else is mapped to STYLE 1, which corresponds to red balloons in the KML header file.
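
Put together, the mapping looks roughly like this; again just a sketch of the logic rather than the app's actual code (LTE is assumed to be network type 13, as listed above):

    # Sketch of the mapping described above: Android network type number ->
    # KML style -> balloon colour defined in the KML header file.
    def style_for(network_type):
        if network_type in (1, 2):   # GSM family (GPRS/EDGE)
            return "STYLE 2", "blue"
        if network_type == 13:       # LTE
            return "STYLE 3", "yellow"
        return "STYLE 1", "red"      # everything else (UMTS, HSPA, ...)

    for t in (2, 3, 8, 13, 15):
        print(t, style_for(t))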

It is important to let the app run in the foreground at all times. It prevents the device from going to sleep, i.e. the display is always on, to ensure it keeps being supplied with location and cell information from the baseband processor. If put into the background (e.g. when going to the idle screen), GPS location tracking is disabled by the app and no cell information is recorded. Recording continues once the application is brought back to the foreground.

Wi-Fi Tethering Replaces My 3G Stick In The Train – Pros and Cons

After many years of using a 3G stick with my notebook on my daily commute, I've finally made the switch to Wi-Fi tethering, either to an Android smartphone or an Android tablet, whichever I carry on a particular day. For the moment it seems more convenient, as the 3G stick requires some extra space, which is at a premium in trains.

That makes me wonder whether 3G sticks have a future or whether they will be completely replaced by Wi-Fi tethering, even more so because a 3G stick needs a separate SIM card, which is usually a hassle and often costs extra money. The few megabytes I use during my daily commute can easily be absorbed by my smartphone plan, which does not exclude tethering.

Speed-wise I have not seen a difference between a 3G stick and smartphone tethering. Over both connections I can easily reach speeds well over 6 Mbit/s, depending of course on cellular signal quality. Sure, dual-carrier sticks can go far beyond this speed, but LTE smartphones with tethering capabilities are now on the market, taking away this advantage as well.

I've also noticed interesting Wi-Fi power saving mechanisms. While ping round trip times are quite normal when tethering to my smartphone, they are around 300 ms when tethering to the tablet. However, when I transfer more data over the tablet than just the ping packets, round trip times drop to the usual 60 ms during the transfer and return to 300 ms immediately afterwards. This rules out cellular power saving schemes like Cell-FACH and Cell-PCH, which take much longer to be activated. Not a disadvantage, but I was a bit surprised when I saw the 300 ms for the first time and initially thought something was wrong.
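
The effect is easy to reproduce: ping a responsive host once a second over the tethered link and watch the round trip time change while a parallel download is started and stopped. A quick sketch, assuming a Linux PC (the ping options below are Linux-specific) and 8.8.8.8 as an arbitrary target:

    # Probe the round trip time once per second over the tethered connection
    # to observe the Wi-Fi power save behaviour described above (~300 ms when
    # idle, ~60 ms while a parallel download is running). Linux ping syntax;
    # 8.8.8.8 is just an arbitrary responsive target.
    import re
    import subprocess
    import time

    TARGET = "8.8.8.8"

    while True:
        output = subprocess.run(
            ["ping", "-c", "1", "-W", "2", TARGET],
            capture_output=True, text=True,
        ).stdout
        match = re.search(r"time=([\d.]+) ms", output)
        print(f"{match.group(1)} ms" if match else "timeout")
        time.sleep(1)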

One thing that is as much an advantage as a disadvantage is that the smartphone or tablet runs on its own battery, while the 3G stick draws its power from the notebook. On the positive side, my PC runs a lot longer on a charge: my Lenovo ThinkPad requires about 12 watts without the stick and an additional 3-4 watts with the stick (depending on the type of 3G stick), which cuts the operating time by at least one and a half hours.
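
As a rough back-of-the-envelope check (the battery capacity below is an assumed example value, the power figures are the ones measured above):

    # Back-of-the-envelope estimate of how much runtime the 3G stick costs.
    battery_wh = 85.0      # assumed battery capacity in Wh (e.g. a large 9-cell pack)
    base_load_w = 12.0     # notebook power draw without the 3G stick
    stick_extra_w = 3.5    # additional draw with the stick plugged in

    runtime_without = battery_wh / base_load_w                   # ~7.1 h
    runtime_with = battery_wh / (base_load_w + stick_extra_w)    # ~5.5 h
    print(f"runtime lost to the stick: {runtime_without - runtime_with:.1f} h")  # ~1.6 h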

On the negative side, I have to remember to activate/deactivate the Wi-Fi hotspot option on my smartphone once I arrive at work or at home. I wonder when we'll see an option to switch off the Wi-Fi hotspot functionality after a configurable time once no clients are active anymore. I faintly remember having already seen such an option on a device, but it deactivated the hotspot immediately after the last client had left. Not good for my train scenario, in which I every now and then suspend the notebook for a minute by closing the lid to answer a call or to show my ticket.

But even when switching off the Wi-Fi hotspot functionality after use, the smartphone probably doesn't last a full day when it is also used for other things, as acting as a Wi-Fi/3G bridge for an hour a day with frequent data transfers requires quite some power. The tablet has an advantage here due to its much bigger battery.

Another downside of the tethering solution is that in other scenarios, such as in libraries or at Starbucks, two power sockets are required, one for the PC and one for the smartphone/tablet. Usually one is lucky if even one socket is available.

But tethering is often cheaper than having an extra SIM for a 3G dongle, which is why I think most people will not mind the drawbacks in some situations.