Wi-Fi After 802.11n: It’s 802.11ac

Things have been relatively quiet for a while on how Wi-Fi will develop now that the 802.11n standard has become widely adopted. But behind the scenes, companies in the IEEE are working on the next generation of the standard, with first and interesting results.

Perhaps they have run out of single letters, as the next version of the Wi-Fi specification will be called 802.11ac. Wikipedia contains an interesting entry on the enhancements and links to a current draft specification of the IEEE. The link is pretty interesting as it is the first time I have seen the IEEE publish drafts publicly. Previously, things were kept inside the IEEE community until they were finished. A new openness?

Anyway, here are the features currently under development:

Wider channel bandwidths

The initial 802.11b, a and g standards were defined for channel bandwidths of 20 MHz. 802.11n then introduced channel bundling to 40 MHz. While this in theory doubles the available data rate, the issue, especially in the 2.4 GHz band, is that particularly in cities many access points are on the air, and widening the channel to 40 MHz makes it even less likely that one can find an unused spot in the band.

As a consequence, several access points use the same channel, and the capacity of a network hence depends on how much data is transferred in other networks on the same channel. If the channel used by other networks fully overlaps, interference is limited, but data still cannot be transferred while a packet is being sent or received by another access point. The collision avoidance scheme makes sure that packets are seen by other access points and clients, so lower speeds are not a consequence of interference but of transmitters waiting for an opportunity to send their own packets to their own access points on the channel. There is also the scenario in which channels only partly overlap. In this case, only part of the channel used by an access point overlaps with the other network, and its packets are therefore not detected.

This results in a reduction of throughput due to interference: because of the only partial channel overlap, the packets of the other network cannot be correctly received, and collision avoidance therefore does not work.

With 802.11ac, bandwidth aggregations of up to 80 MHz and 160 MHz are defined, making the issues described above even worse. In the 2.4 GHz band it's not even possible to aggregate 160 MHz, as the overall band is smaller than that. There is more spectrum available in the 5 GHz band, but even there, 80 MHz or 160 MHz channel aggregations will be a stretch. Also, networks using the 5 GHz band have a more limited coverage area than those on 2.4 GHz, as signal absorption through walls and by obstacles is much higher.
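To put some rough numbers on this, here's a quick back-of-the-envelope calculation in Python of how many non-overlapping channels of each width fit into the two bands. The band sizes are the usual regulatory figures, rounded, and the usable 5 GHz spectrum varies by region, so treat them as illustrative:

    # How many non-overlapping channels of each width fit into each band?
    # Band sizes are rounded regulatory figures; 5 GHz varies by region.
    band_24ghz = 83.5   # MHz, 2400.0 to 2483.5 MHz
    band_5ghz = 455.0   # MHz, e.g. 5150-5350 plus 5470-5725 MHz in Europe

    for width in (20, 40, 80, 160):
        print(f"{width:3} MHz: {int(band_24ghz // width)} channel(s) in 2.4 GHz, "
              f"{int(band_5ghz // width)} in 5 GHz")

A 160 MHz channel simply does not fit into the 83.5 MHz of the 2.4 GHz band, and at 40 MHz there is room for only two non-overlapping channels (with 20 MHz channels, usually only three are used in practice, e.g. 1, 6 and 11), which is exactly the congestion problem described above.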

More MIMO Streams and Multi-User MIMO

802.11n introduced Multiple Input Multiple Output (MIMO) transmission, i.e. transmitting several independent spatial streams over the same channel simultaneously. 2, 3 and 4 antennas for the same number of spatial streams are defined so far, and 802.11ac increases the number to up to 8 spatial streams. Enjoy 8 antennas on your access point and mobile devices.

Mobile devices can have fewer antennas and will consequently only be able to use a number of spatial streams in line with their number of antennas. 802.11ac, however, specifies Multi-User MIMO, i.e. if the access point can handle more data streams than an individual mobile device, several devices can send their data simultaneously. In the other direction, the access point can send several MIMO streams to different devices simultaneously.

Even Higher Modulation

Current state-of-the-art Wi-Fi uses up to 64-QAM modulation if access point and mobile device are close to each other. 64-QAM encodes 6 data bits per transmission step. 802.11ac takes this one step further to 256-QAM, i.e. transmitting 8 bits per transmission step. The coding rate for such transmissions, i.e. the ratio of user data bits to error detection and correction bits, is 3/4 or 5/6.
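Putting the three enhancements together gives the peak data rate the new standard is aiming for. Here's a quick sanity check of the arithmetic in Python; the subcarrier count and symbol duration are the values I have seen quoted for a 160 MHz 802.11ac channel with a short guard interval, so treat them as assumptions:

    # Back-of-the-envelope peak PHY rate for 802.11ac:
    # 160 MHz channel, 256-QAM 5/6, 8 spatial streams, short guard interval.
    data_subcarriers = 468      # per 160 MHz channel (assumed from the draft)
    bits_per_symbol = 8         # 256-QAM
    coding_rate = 5 / 6
    spatial_streams = 8
    symbol_time = 3.6e-6        # seconds, incl. 0.4 us short guard interval

    rate = (data_subcarriers * bits_per_symbol * coding_rate
            * spatial_streams / symbol_time)
    print(f"{rate / 1e9:.2f} Gbit/s")   # roughly 6.93 Gbit/s

Needless to say, this is a best-case number that requires perfect radio conditions, 8 antennas on both ends and a clean 160 MHz channel all at once.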

Backwards compatibility

And, of course, a most important requirement is that 802.11ac devices and access points must be able to co-exist with older 802.11 devices.

An Interesting Future

A marvelous challenge to put all of this into a specification. And even more of a challenge, I am sure, to put it into real devices and make it work in a backwards compatible way. Things remain interesting!

5 Years Ago in Mobile

About 5 and a half years ago I started this blog, and it's been a tremendous project ever since. One of the benefits that now appears is that I can look back and see what was "moving" me 5 years ago, giving interesting insights into how the mobile landscape has changed since. So here's what I wrote about 5 years ago, in August 2006:

Smartphone Wi-Fi Sharing: Google introduced Wi-Fi sharing on their Android platform not too long ago. My first thoughts on the topic date back to August 2006. The Nokia N80 was one of the first phones, if not the first, with a Wi-Fi chip inside, without Wi-Fi sharing of course. At the time the discussion was more about whether Wi-Fi would survive in phones at all, with some network operators being less than happy about it in the first place.

3G Roaming Issues: According to my notes I bought my first 3G phone in December 2004. One and a half years later, many 3G networks had launched and I had my first roaming experiences. At the time, there were still quite a number of interoperability issues between my mobile device and the different networks I tried it in, as documented in this post on "roaming pleasures with pitfalls". Since then things have improved a lot, but there are still some quirks today, as documented here in 2011.

3G Connection Sharing: One of the most popular blog entries ever, according to my statistics, is this post on how to share a 3G connection with others via Windows XP. Now that Wi-Fi sharing is becoming more commonplace on smartphones, the necessity for this is likely to diminish. But for years this approach served me on many occasions.

3G Video Calls: Yes, video calling started to pick up in 2006, as reported here from an Italian supermarket. The trend didn't accelerate, though, for many reasons such as patchy 3G coverage and steep pricing, but it has found its uses. With Skype on the desktop today, FaceTime on the iPhone, better 3G networks and a "different" pricing structure for calls (you pay for connectivity, not for call duration), things might yet take another turn.

EDGE: Faster GPRS was on its way into networks, helping me in many situations where 3G coverage could not be found.

US Spectrum Auctions: The AWS band (1700/2100 MHz) was on the block and T-Mobile bought quite a bit of it to launch their 3G service in the US. At the time I asked what the US government would do with the money. Looks like they did what everyone else did: they used it for other means than fostering the telecommunication landscape in their country.

Phone Software Update: My first phone that allowed software updates from home was the Siemens S45, and I made good use of it to improve the stability of GPRS, especially while roaming. In 2006, Nokia also added this functionality to their smartphones and I reported on updating my, at the time, brand new N70. This has since become a common phenomenon; semi- or fully automatic updates of installed apps, like on the PC, are now the norm rather than the exception. It's another indication of how the PC and mobile worlds are moving closer to each other.

2 Billion Mobile Users: The middle of the last decade was the time when mobile accelerated in developing countries. In 2006, 2 billion subscriptions had been reached globally, up from one billion two and a half years earlier. Today, 5 years later, we are well beyond the 5 billion mark and the number of subscriptions is still rising at a similar rate as back in 2006, as per the "subscriber counter" at the bottom of the GSM Association website. Incredible!

Will Smartphones Drive 3G Voice Adoption?

One thing I am wondering about, when observing significant numbers of people in trains and restaurants now using smartphones to access Internet based services, is what kind of effect this has on the shift of voice calls from GSM over to UMTS. Before the smartphone boom, I knew many people who bought a new UMTS capable mobile device but locked it to 2G only to conserve battery power. I don't know too many people who do that anymore. When using a smartphone and Internet based services, locking the device to 2G for whatever reason is the last thing people want to do now. Consequently, voice calls that would previously have been made over the 2G network are now made over 3G, thereby reducing the load on the 2G network. The effect is likely countered to some degree by rising voice minutes per user per month in many countries, but from a 2G/3G voice call distribution point of view I can very well imagine smartphones making a difference today.

The Moving Offload Challenge

Cellular offload to Wi-Fi is a hot topic in the industry these days, but from an implementation point of view we are just at the beginning, especially on the mobile device side of things. Pretty much all smartphones today have a Wi-Fi interface in addition to their cellular connectivity, so from a hardware point of view they are ready for offloading. Unfortunately, a number of things happen today when switching from cellular to a public Wi-Fi hotspot:

  • The IP address changes and the 3G connection is usually cut. In other words, ongoing connections are interrupted. Bad if you are watching a YouTube video, for example.
  • The public Wi-Fi hotspot usually requires some form of authentication.
  • The coverage area of the Wi-Fi hotspot is rather small.
  • Data rates at the coverage edge are very low.
  • Sometimes the backhaul of the Wi-Fi hotspot has a lower capacity than the cellular network, resulting in lower hotspot speeds independent of the coverage situation.

As long as the user does not move, most of these issues do not really matter in practice. However, in real life most subscribers do move, and here's a personal example of where concurrent Wi-Fi / 3G connectivity becomes a problem:

When going to and coming from work I usually use the time on the train to get some things done online with my netbook and Internet connectivity. As my data subscription also includes Wi-Fi connectivity via my network operator's Wi-Fi network, I initially set my netbook to auto-connect to these hotspots. In places where I knew Wi-Fi connectivity existed I didn't bother to use my 3G stick but instead used Wi-Fi because it is more convenient. But I figured out quite soon that this was not ideal. When I am on the train, my netbook immediately recognizes Wi-Fi hotspots of my network operator in train stations, and connectivity over the Wi-Fi interface gets precedence by the OS over the 3G connectivity. As a manual authentication procedure is required after connecting to the Wi-Fi hotspot, this in effect disconnects my 3G connectivity, as all packets try to flow over the Wi-Fi link but can't because I haven't yet authenticated. Yes, one could automate that. However, it wouldn't help, because after a minute the train leaves the station again and connectivity is lost. In this case the 3G connectivity is still there, but all connections that have just switched over to the Wi-Fi link are broken again.

Of course this could be fixed by having a piece of software on the mobile device and in the network so that the same IP address is used over both interfaces. But such a solution is not here yet. Also, the same IP address alone would not fix the issue of slower connectivity at the Wi-Fi coverage edge or in case the backhaul of the hotspot is under-dimensioned. I have also seen in practice that at the Wi-Fi coverage edge, connectivity is still shown as present but packets are no longer received in either direction. Again, software could help to fix this, but we are a long way away from that, too.
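As a thought experiment, such software could at least probe whether the Wi-Fi path actually carries traffic before the OS prefers it over 3G. Here's a minimal sketch in Python, assuming a hypothetical probe URL with known content; this is the general idea behind captive portal detection, not how any particular operating system implements it:

    import urllib.request

    # Hypothetical probe target with a small, known plain-text response.
    PROBE_URL = "http://example.com/probe.txt"
    EXPECTED = b"ok"

    def wifi_path_is_usable(timeout=3):
        """True only if the probe returns the expected content, i.e. we are
        neither stuck behind a hotspot login page that intercepts requests
        nor at a coverage edge where packets silently disappear."""
        try:
            with urllib.request.urlopen(PROBE_URL, timeout=timeout) as resp:
                return resp.read(16).startswith(EXPECTED)
        except OSError:
            return False

If the probe fails, the connection manager would simply stay on 3G instead of blindly handing all traffic to the Wi-Fi link.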

Speaking of additional software on the mobile device, where should it be located? Today network operators deliver "connection manager" software with their data sticks that runs on the operating system. Not everybody likes these programs, as they might do more than some want, and not everyone can use them; I can't, for example, as I use Ubuntu. Another option is to have that software reside on the data stick, which in addition to cellular connectivity could also contain a Wi-Fi chip. To the netbook or other device using such a stick or mini PCI card, it would just appear as a single network interface and everything would be done internally. The downside from a user's point of view is that this would bind the data stick or mini PCI card to a specific network operator, due to the on-board software managing the switching between cellular and Wi-Fi. Not sure if that is a good idea either.

My conclusion from all of this is that I have removed the auto-connection to the Wi-Fi hotspot network on my devices and only connect manually when I know I want to stay in a place for a longer time. But I have to do this by hand, which is about as convenient as getting the 3G stick out of my pocket in the first place and connecting it…

When Products Fail With Long Passwords

I have two Wi-Fi enabled printers in my network and both have a web server for configuration. So far I hadn't set a password on either of them, but lately I thought it might be a good idea to do so, with interesting results:

As I like long passwords for security reasons, I chose a 20 character password, which at first seemed to work. No error messages when setting the password. But when accessing the printers again, neither would allow me to log on with my 20 character password!? After some trial and error I established that I could access my HP Photosmart C7280 when using only the first 16 characters of the initial 20. The same with my brand new Samsung ML-2525W, which only let me back into the menu when I used just 18 characters of the original password. Now there are four things that are very wrong with this:

  1. The maximum password length is too short.
  2. It seems the passwords themselves are stored and not a hash value, thus creating the problem. Storing the password rather than a hash value is very unsafe by the way (see the sketch after this list)…
  3. Why was there no error message that the password was too long?
  4. There is no delay between two login events, so a brute force attack is possible.
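For the record, here is what storing a salted hash instead of the password looks like in principle; a minimal Python sketch of the general technique, certainly not what the printers actually do:

    import hashlib, hmac, os

    def make_record(password: str) -> bytes:
        """Store a random salt plus a slow salted hash, never the password."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt + digest          # always 16 + 32 bytes

    def verify(password: str, record: bytes) -> bool:
        salt, stored = record[:16], record[16:]
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, stored)   # constant-time compare

The stored record has a fixed length of 48 bytes whether the password has 4 or 400 characters, so limited storage space would be no excuse for silently truncating what the user typed.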

If I were daring, I'd try special characters in the passwords now… But I'll spare myself the trouble.

Rise and Resurrection of the 2D Barcode?

2D barcodes for mobile use have been on the horizon for at least half a decade. The first blog entry on the topic I could find with a quick search seems to be from 2006, and just this year I had pretty much given up on the idea of seeing a breakthrough anytime soon. And just when I had put the idea out of my mind, they seem to be resurfacing quite massively. A case in point is the picture on the left, which I recently took in Cologne. They can't get any bigger than this, can they!? When looking around a bit in that neighborhood I noticed a few more 2D barcodes on billboards and also at restaurants (with links to their Facebook accounts or websites). Looks like the advertisement industry keeps pushing.

A Paper on Fast Dormancy From the GSMA

One of the things the original UMTS standard did not take into account was a mechanism for mobile devices to reduce their power consumption when they come to the conclusion that, for the moment, physical connectivity to the network is no longer required. This has led to a significant reduction of battery autonomy with the rise of bursty applications such as push email. As a result, device manufacturers started to become creative with the 3GPP specs and used a mechanism referred to as "Signaling Connection Release Indication" not quite as it was originally intended, to cut the physical connection to the network. 3GPP then caught up and specified an enhancement of this procedure in 3GPP Release 8, which brings improvements for both mobile devices and networks over the initial non-standardized solution.
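To illustrate the difference, here's a toy model of the two behaviors; the state names are from 3GPP, everything else is deliberately simplified:

    def release_connection(release_8: bool, network_supports_pch: bool = True) -> str:
        """Device sends a Signaling Connection Release Indication (SCRI).
        Pre-Release 8, this effectively tears the connection down to idle;
        with the Release 8 cause value the network can park the device in
        Cell_PCH instead, keeping the RRC connection cheap to resume."""
        if release_8 and network_supports_pch:
            return "Cell_PCH"   # low power AND no full re-establishment later
        return "idle"           # low power, but the next burst is expensive

    print(release_connection(release_8=False))  # idle
    print(release_connection(release_8=True))   # Cell_PCH

Parking the device in Cell_PCH avoids the signaling storm of re-establishing the connection from idle for every data burst, which is precisely the improvement for the network side.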

So what are those enhancements and what are the benefits? I've discussed this topic at length on this blog, such as here, and I think that in combination with the Continuous Packet Connectivity features described here, battery performance, network signaling load and network capacity can be improved significantly, today and in the future. As the topic is quite hotly discussed in the industry from various points of view, the GSM Association has set out to assemble a freely available technical white paper that contains a consolidated view of both network operators and manufacturers. In the 23 pages of the white paper, which can be found on the GSMA website in the technical documents section, the technical background is explained in depth, including the impact of other features and settings on mobile power consumption. An interesting read no matter whether you work on the mobile device side or the network side of things.

And a small disclaimer at the end: I was part of the team working on the paper, and I am happy to recommend it as, in my opinion, it contains a fair and balanced view in addition to all the technical details. Enjoy!

The USB Cable Ensures Connectivity – Again

It's amazing how many places I go where 3G coverage is barely available at the window of a room but not inside. The situation hasn't really changed all that much in many years. And every time, I am glad I have that USB extension cable with me to put the 3G USB stick at the window while working somewhere else. Or, in case I have several devices with me requiring Internet access, the same trick helps with a 3G/Wi-Fi bridge such as this one when the power plug and the window are not in the same place…

Resurrection of the Camera – Big Time?

One of the things I noted on a recent vacation was not only how many people these days use their camera phones to take pictures, but also how many people are now carrying dedicated cameras again. No, not the small point-and-shoot ones, the big SLR type cameras, heavy as they are. Interesting that now that camera optics and software on mobile phones have become good enough to replace an extra camera, people are willing to carry big and heavy cameras for the extra quality (or just for the zoom and night shot capabilities?). Agreed, SLR cameras are now cheaper than ever, but that doesn't make them any lighter to carry.

Thoughts on RRC Settings in Italian 3G Networks

I was in Rome recently for a week, and I noticed that on Vodafone's and TIM's 3G networks the experience on my mobile phone was quite bad. Quite often, when clicking on a link, the page would not load in any reasonable amount of time. When switching to their GPRS networks, page load times with Opera Mini were good, so my problem likely resulted from some air interface issue. On Tre's 3G network my device performed flawlessly, so they must be doing something differently from Vodafone and TIM. To see where my problems came from, I decided to take a closer look at how the radio network state changes were configured. Here's the result:

TIM:

DCH timer: < 5 s
FACH timer: 75 s
Final state: idle

Vodafone:

DCH timer: < 10 s
FACH timer: 45 s
Final state: cell-pch

Wind:

DCH timer: < 3 s
FACH timer: 75 s
Final state: idle

Tre (3IT):

DCH timer: < 5 s
FACH timer: 60 s
Final state: idle

Compared to network settings in other countries, such as Germany for example, I was quite surprised by the very long FACH timers. In Germany those timers are much shorter, in the range of 15-20 seconds, to conserve battery power in mobile devices. Beyond 30 seconds they are a huge energy drain, and really, Fast Dormancy is a mandatory self-defense mechanism against such settings…
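A rough estimate of what such timers cost after each data burst, with purely illustrative power figures, as the actual Cell-DCH and Cell-FACH power draw varies considerably between devices:

    # Tail energy spent after the last packet of a burst, in joules.
    # Power figures are coarse assumptions for illustration only.
    P_DCH, P_FACH = 0.6, 0.4      # watts in Cell-DCH / Cell-FACH

    def tail_energy(t_dch, t_fach):
        return t_dch * P_DCH + t_fach * P_FACH

    print(tail_energy(5, 75))   # Italian-style settings: ~33 J per burst
    print(tail_energy(5, 15))   # German-style settings:  ~9 J per burst

Even with these coarse assumptions, a 75 second FACH tail costs several times the energy of a 15 second one per data burst, and with chatty smartphone applications those bursts happen all the time.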

Concerning Fast Dormancy, I am at a loss when it comes to Vodafone Italy's settings. Why is there a 45 second Cell-FACH phase when the network then transitions to Cell-PCH instead of idle? Cell-PCH combines the advantage of low battery consumption with fast resumption of data transfers and less signaling in the network to re-establish the connection, so such a long Cell-FACH phase seems very unnecessary (for details see the Fast Dormancy link above).

On Wind's network I found the Cell-DCH timer of 3 seconds, or perhaps even a bit less, quite surprising. In practice this means that the connection frequently changes between DCH and FACH, resulting in an inferior web browsing experience, as each time the state changes, the transmission is interrupted and packets have to be queued. I noticed this when surfing on my pad, as pages loaded much more slowly than they usually do, especially if they contained content that took a bit of time to download. Wind has furthermore set the thresholds in such a way that DCH is not kept if only little data flows. So a default "ping" will not keep the connection in the DCH state; only a ping packet size of around 500 bytes had the desired effect. Again, I am wondering why they are doing this!? Are they having problems with the number of concurrent connections in DCH state? It surely can't be to conserve power on the UE side. Time to buy some more DCH licenses, guys, instead of crippling the performance of your network!
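If one really wanted to keep the connection in DCH on such a network, a crude keep-alive along the following lines would do it; a sketch assuming a Linux ping binary and a placeholder host, and obviously at the cost of battery life:

    import subprocess, time

    HOST = "example.com"   # placeholder; any reliably reachable host

    # Send a padded ping every 2 seconds so more than "a little" data flows;
    # "-s 500" sets a 500 byte ICMP payload (Linux ping).
    while True:
        subprocess.run(["ping", "-c", "1", "-s", "500", HOST],
                       stdout=subprocess.DEVNULL)
        time.sleep(2)

Of course this just trades the network's resource savings for battery drain on the device, which is exactly the wrong direction.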

While all of this is very interesting, it does not explain why web pages often do not load correctly after clicking a link on Vodafone and TIM. I therefore suspect that it has something to do with the UE and the networks having an interoperability issue when changing between the different states and/or perhaps carrier frequencies, since both have two 5 MHz carriers deployed. Difficult to tell without a deep drill-down. So during my stay, Tre.it became my favorite roaming network in Italy, and I am glad manual network selection exists.