Intel and Android, Microsoft and ARM

Interesting times are ahead, with major alliances forged long ago not really breaking up but becoming non-exclusive. Windows and Intel have been a team in the PC world for decades but have so far failed to establish themselves in mobile. Both desperately want to be in that domain, and it seems they have figured out they can't do it together as a dream team: each needs to partner with a player that is already established in mobile and complements its existing strengths.

Intel with Android

So we have Intel, who finally seems to have been able to produce a chipset that is lean enough for a mobile phone (see here, here and here). Their acquisition of Infineon for a 2G, 3G and 4G mobile baseband also helps tremendously. By adapting Google's Android to their chipset they have a great smartphone operating system from day one, and it seems that all apps that do not directly access the hardware (i.e. everything written in Java, which is pretty much all apps except games) will run on Intel-based smartphones. Not bad.

Microsoft with ARM

And then there is Microsoft on the other side. They've waited for years for Intel chips that would make their OS run on tablets and other gadgets, but so far it has never worked out. So I guess they have lost patience and have now ported Windows 8 to ARM to run on tablets. Interesting technical insights can be found here.

Intel with Windows on Mobile?

Perhaps Microsoft will consider Intel chips for their tablets again in the future, should the aforementioned Intel/Android project work out and Intel keep churning out good mobile hardware platforms. And this Intel project, unlike the previous attempts over the past few years, looks quite promising. The advantage for Microsoft in coming back to Intel is that running on an x86 architecture would remove the need to recompile Windows applications for ARM (unlike apps on Android, which are just-in-time compiled anyway).

8-carrier HSDPA – Who Wants It, Who Could Even Use It?

3GPP Release 11 contains an interesting work item: the bundling of up to 8 x 5 MHz HSDPA channels across two different bands. With 64QAM modulation and MIMO, this octa-carrier HSDPA reaches a top downlink data rate of 337.5 MBit/s (HSDPA category 36). Sure, the data rate is impressive, but I have to wonder if it will be practical in the real world: I can't think of any network operator who has 8 channels available. And even if there were some, why would you want to bundle that much spectrum for HSPA when the general trend is to move to LTE anyway? Am I missing something here?
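As a back-of-the-envelope check, the 337.5 MBit/s figure can be derived from the usual per-carrier physical layer parameters. The numbers below are my own approximation, not taken from the specification:

```python
# Approximate peak HSDPA rate per 5 MHz carrier with 64QAM and 2x2 MIMO
chip_rate = 3.84e6     # UMTS chip rate in chips/s
sf = 16                # spreading factor of the HS-PDSCH codes
codes = 15             # maximum number of parallel HS-PDSCH codes
bits_per_symbol = 6    # 64QAM carries 6 bits per symbol
mimo_streams = 2       # 2x2 MIMO doubles the number of parallel streams

raw_per_carrier = chip_rate / sf * codes * bits_per_symbol * mimo_streams
print(raw_per_carrier / 1e6)   # 43.2 MBit/s raw, ~42.2 MBit/s at the transport layer

# Eight bundled carriers at ~42.2 MBit/s each
print(8 * 42.2)                # ~337.5 MBit/s -> HSDPA category 36
```

So the category 36 number is simply eight times today's per-carrier peak rate, nothing more exotic than that.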

What the Mainstream Press Overlooks in the US LTE vs. Europe HSPA Discussion

This week a device was announced to the press that can supposedly use LTE networks in the US and HSPA networks in Europe and Asia. To the mainstream press things are clear: Internet access with the device will be better in the US than in Europe. Hm, they just overlooked a small detail: In the US, Verizon and AT&T currently use the 700 MHz band for LTE, and each carrier only has 10 MHz of spectrum in that band (see for example Verizon's band 13). In Europe, Dual-Carrier HSDPA combines 2 x 5 MHz into a 10 MHz channel. This nullifies pretty much all of the theoretical speed advantage. The only thing that is left that LTE has and HSPA hasn't is MIMO. Add to that the denser network structure, a more mature technology and very likely a lower power consumption due to optimized networks, and things suddenly look quite different. But I guess one can argue as much as one wants, 4G must by definition be better than 3G 🙂
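To put rough numbers on this, here are peak rates for the same 10 MHz of downlink spectrum under my own simplifying assumptions (these are approximations, not measurements):

```python
# Rough peak-rate comparison for 10 MHz of downlink spectrum

# DC-HSDPA (category 24): 2 x 5 MHz carriers, 64QAM, no MIMO
dc_hsdpa = 2 * 21.1                  # ~42.2 MBit/s

# LTE in 10 MHz (50 resource blocks), single stream, 64QAM, raw symbol rate
subcarriers = 50 * 12                # 12 subcarriers per resource block
symbols_per_second = 14 * 1000       # 14 OFDM symbols per 1 ms subframe
lte_raw_single_stream = subcarriers * symbols_per_second * 6 / 1e6

print(dc_hsdpa)                # 42.2 MBit/s
print(lte_raw_single_stream)   # 50.4 MBit/s raw, before coding and control overhead
```

After channel coding and control overhead, the single-stream LTE rate ends up in the same ballpark as DC-HSDPA, which is why MIMO is the main remaining differentiator in this comparison.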

Half The Phones Sold In Germany In 2012 Will Be Smartphones

says Fritz Joussen, CEO of Vodafone Germany and president of Bitkom, a German telecom trade association. Last year it was already one third of all phones, which shows an interesting trend. While a few years ago a smartphone sale did not automatically mean that a data subscription came with it, this might have changed in the meantime as well. After all, what's the point of having an iPhone or an Android-based phone without Internet connectivity? With Symbian phones this was still a possibility, and many people used such smartphones for offline purposes only. Fortunately, prices have changed as well: 10 Euros a month now connects you to the Internet with a flat rate (throttled after 300 MB) plus 50 voice minutes, with 9 cents per SMS or voice minute afterwards.

LTE-Advanced CoMP needs Fiber

So far I had always assumed that LTE-Advanced Coordinated Multi-Point (CoMP) transmission would be similar to what we have in UMTS for voice calls: there, several base stations transmit a signal in the downlink to the mobile at the same time, and the mobile device tries to decode all signals simultaneously to improve reception conditions. With the introduction of HSPA for packet-based data transmission this was no longer possible, as the central scheduler in the Radio Network Controller was replaced by individual packet schedulers in the base stations. As a consequence, fast coordination between the schedulers of different base stations was not possible due to the delay and limited transmission capacity of the backhaul link.

But I thought time had moved on, technology had improved, and some way had been found for schedulers to communicate over the backhaul link to synchronize transmissions to a mobile device. Actually, that is not the case, and the CoMP scenarios that have been studied in 3GPP TR 36.819 work quite differently. In fact, all but one of the scenarios are based on fiber links that transport the fully processed RF signal, which is only converted from an optical to an electromagnetic signal at a remote radio head. Here's a summary of the transmission modes and four deployment scenarios discussed in the study for 3GPP Release 11:

Transmission Modes

In the Joint Processing (JP) mode, the downlink data for a mobile device is transmitted from several locations simultaneously (Joint Transmission). A simpler alternative is Dynamic Point Selection (DPS), where the data is also available at several locations but is only sent from one location at any one time.

Another CoMP mode is Coordinated Scheduling / Beamforming (CS/CB). Here, the downlink data for a mobile device is only available at, and transmitted from, one point. The scheduling and, optionally, beamforming decisions are made jointly among all cells in the CoMP set. The location from which the transmission is performed can be changed semi-statically.
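As a toy illustration of the Dynamic Point Selection principle described above (the CQI values and point names are hypothetical, and a real scheduler is of course far more involved):

```python
# Dynamic Point Selection, conceptually: the data is buffered at all points in
# the CoMP set, but in each TTI only the point with the best reported channel
# quality actually transmits to the mobile device.
comp_set = ["macro_cell", "rrh_1", "rrh_2"]

# Channel quality indicator (CQI) reported by the mobile per point, per TTI
cqi_reports = [
    {"macro_cell": 7,  "rrh_1": 11, "rrh_2": 9},   # TTI 0
    {"macro_cell": 8,  "rrh_1": 6,  "rrh_2": 12},  # TTI 1
    {"macro_cell": 13, "rrh_1": 5,  "rrh_2": 4},   # TTI 2
]

for tti, cqi in enumerate(cqi_reports):
    point = max(comp_set, key=lambda p: cqi[p])
    print(f"TTI {tti}: transmit from {point}")
```

The transmission point can thus change every millisecond, which is exactly why the tight, low-latency coupling between the points discussed below is needed.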

Deployment Scenarios

Scenario 1, homogeneous network intra-site CoMP: A single eNodeB base station site usually comprises three or more cells, each responsible for a 120 degree sector. In this scenario the eNodeB controls each of its cells' schedulers. This makes it possible to schedule a joint transmission by several cells of the eNodeB, or to blank out resource blocks in one cell that are used in another cell for a subscriber located between the two cells, to reduce interference. This CoMP method is easy to implement, as no communication with external entities is required. At the same time, this is also its major downside, as there is no coordination with other eNodeBs: data rates for mobile devices located between two cells of two different eNodeBs cannot be improved this way.

Scenario 2, high power TX remote radio heads: Due to the inevitable delay on the backhaul link between different eNodeBs, it's not possible to define a CoMP scheme that synchronizes their schedulers. To improve on scenario 1, it was thus decided to study the use of many (9 or more) Remote Radio Heads (RRHs) distributed over an area that is today covered by several independent eNodeBs. The RRHs are connected to a single eNodeB over fiber optic links that transport a fully generated RF signal, which the RRH only converts from an optical into an electromagnetic signal that is then transmitted over the antenna. While this CoMP approach can coordinate transmission points over a much larger area than the first one, its practical implementation is difficult, as a fiber infrastructure must be put in place to connect the RRHs with the central eNodeB. A traditional copper-based infrastructure is insufficient for this purpose due to the very high data rates required for the RF signal and the length of the cabling.
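A quick calculation shows why copper falls short here. Transporting the digitized I/Q samples of even a single 20 MHz LTE carrier with two antennas to an RRH already requires multi-gigabit speeds. The parameter values below are typical CPRI-style assumptions on my part, used only for illustration:

```python
# Rough data rate of a digitized RF signal between eNodeB and remote radio head
sample_rate = 30.72e6        # I/Q sample rate for one 20 MHz LTE carrier
bits_per_sample = 2 * 15     # 15 bits each for the I and the Q component
antennas = 2                 # 2 antenna ports at the RRH

payload = sample_rate * bits_per_sample * antennas
line_rate = payload * 10 / 8  # 8b/10b line coding adds 25% on the link

print(payload / 1e9)    # ~1.84 Gbit/s of raw I/Q data
print(line_rate / 1e9)  # ~2.3 Gbit/s on the fiber, per sector and per carrier
```

And since this is per sector and per carrier, the required rates multiply quickly with more cells and antennas, far beyond what copper lines over any realistic cable length can deliver.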

Scenario 3 and 4, heterogeneous networks: Another CoMP approach is to have several low power transmitters in the area of a macro cell to cover hotspots such as parts of buildings or different locations in shopping malls. The idea of this approach is to provide general coverage via the macro cell and to offload localized traffic via local transmitters with a very limited range, reducing the interference elsewhere. This can be done in two ways. The localized transmissions could have their own cell IDs and thus act as independent cells from a mobile device's point of view; from a network point of view, however, those cells would be little more than RRHs with a low power output instead of the high power output of scenario 2. The other option is to use RRHs as defined above with a low power output but without a separate cell ID, which makes the local signal indistinguishable from the macro cell coverage for the mobile device. Again, fiber optic cabling is required to connect the low power transmitters to a central eNodeB.

Overall, the 3GPP CoMP study comes to the conclusion that data rates for mobiles at cell edges that suffer from neighbor cell interference could be improved by 25 to 50%, which is a significant enhancement. Except for scenario 1, however, fiber cable installations are required, which makes it unlikely that CoMP scenarios 2, 3 and 4 will be implemented on a broad scale in the next 5 years.

Plan B Is…

… for Plan A to work. At least that's what I recently read in an interview with a high-ranking manager of a previously important mobile phone manufacturer. But I digress, I wanted to say something else. When I recently went on a long weekend in the countryside, the D100 (a 3G dongle to Wi-Fi adapter box) that I had used for many years suddenly decided to no longer cooperate. Pretty bad when you are in places that use Swisscom Wi-Fi and you have more than a single device that wants to share your 3G Internet connection. But unlike the aforementioned manager I did have a plan B: Wi-Fi tethering to the Android smartphone I otherwise mostly use for eBook reading and experimenting. The Wi-Fi range is probably not as good as that of the D100, but it was good enough for the hotel room, and all devices I needed connectivity for worked just fine. I love it when a plan comes together, even if it's plan B. That attitude could work miracles for the above-mentioned manager as well. But perhaps he's not allowed to have a plan B… But I digress again. I'm glad that something I speculated about 6 years ago in 2006 now works so well.

How Many Base Stations Do You Need To Launch A Network (In France)?

Here comes an interesting number from the French telecoms regulator ARCEP: After Free, the fourth mobile network operator, recently started their service in France, there were complaints from the competition that Free would switch off some of their base stations so that the traffic would be handled by their national roaming partner instead. I could go into why those complaints were made, but the much more interesting thing is that ARCEP launched an investigation and concluded, in their result available here, that Free continues to fulfil their regulatory requirement of covering 27% of the population. Interestingly enough, they also mention with how many cell towers Free does that: 735.

That is an incredibly small number. In comparison, Vodafone Germany stated in 2009 (long ago in telecoms land) that they had 20,000 GSM base stations and 13,000 UMTS base stations deployed in Germany (a country that is smaller than France but, to be fair, has more inhabitants). One of the highest population densities in France is most likely Paris, with around 10 million people living there, or around 7% of the population. To cover 27% of the population thus means covering roughly four times the population of Paris. With 735 cell towers? Wow, cell sizes must be quite large, and I can imagine that mobile devices keep hopping between the Free network and the national roaming partner frequently, even while in the Free coverage area.
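The arithmetic behind my astonishment, with rough population figures as my own assumptions:

```python
# How many people does each of the 735 towers cover on average?
population_france = 65e6   # rough figure for the total population
covered_share = 0.27       # regulatory coverage requirement
towers = 735               # number reported by ARCEP

covered_people = population_france * covered_share
print(covered_people / 1e6)            # ~17.6 million people
print(round(covered_people / towers))  # ~24,000 people per tower on average
```

For comparison, a network with tens of thousands of base stations serves only a few thousand people per site, so Free's cells must on average carry an order of magnitude more subscribers each.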

It also shows that Free still has the major part of their network deployment in front of them to meet their next coverage targets of 75% of the population by January 2015 (3 years from now) and 90% by January 2018.

And on a closing note, I found it quite interesting, and I had to smile a bit, that ARCEP pointed out that it was actually the incumbents who in the past did not meet the regulatory coverage requirements they had agreed to.

Dual-Carrier HSPA+: 30 MBit/s and Counting

Back in June 2011 I ran some speed tests at home in Cologne to see where my 64QAM-capable HSDPA category 14 stick would take me in terms of downlink performance. The result was 16 MBit/s, which went far beyond the already breathtaking 11 MBit/s I had measured the year before, when the higher-order modulation was not yet switched on. In the meantime, dual-carrier HSPA has been activated, which can bundle two 5 MHz downlink carriers. And with an HSDPA category 24 device I've reached my next personal HSDPA downlink speed record of 3.7 MB/s, which translates to 30 MBit/s. And this time I didn't run the test overnight but in the morning at 7 a.m., so one can assume there was already a little bit of network load from all those people with smartphones going to work. When running the same test during the day I still get data rates well over 20 MBit/s. Have a look at the picture on the left for the details. Previously I usually used files for the download tests that were a couple of hundred megabytes in size. At speeds like this, however, they are downloaded much too quickly, so I finally switched to 4 GB DVD images. That says something all by itself, too.
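For reference, here is the conversion from the megabytes per second shown by the download tool to the MBit/s usually quoted for radio links, plus how long a DVD image takes at that speed:

```python
# Application-layer throughput vs. the line rate quoted for radio links
mb_per_s = 3.7               # megabytes/s reported by the download tool
mbit_per_s = mb_per_s * 8    # 8 bits per byte
print(mbit_per_s)            # 29.6 -> ~30 MBit/s

# Time to fetch a 4 GB DVD image at that speed
print(4 * 1024 / mb_per_s / 60)  # ~18.5 minutes
```

No wonder the few-hundred-megabyte test files are gone before the transfer rate has even settled.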

Firefox Sync – Finally a Cloud Service I Trust With My Personal Data

You might have noticed that I am usually quite critical of cloud services that interact with my private data. What bothers me with most services is that I lose control over my data, as I am no longer in charge of where it is stored outside my home, how well it is protected and for what other purposes it is used. Cloud services such as my blog are obviously excluded from this, as there is no private data there.

Recently I have started using Firefox Sync to keep my bookmarks backed up and synchronized between several devices. I very much like the approach taken by Firefox Sync, as all data that goes through the cloud or is stored there is encrypted BEFORE it leaves my devices. This is how I like it: the cloud is used for a service, but my private data is secure and only available to me.

Nothing is perfect, though: my movements could still be tracked via the IP address in the quite frequent synchronization requests. But then, the synchronization account is not linked to my name, email address or any other personal ID, so tracking me personally would require an attacker to figure out the ID of my synchronization account first. I can probably live with that potential threat, as it is likely to be very difficult to pull off.

Telephony – 10 Years From Now – A Little Bit Of Everything?

Let me make a bold assumption: Even 10 years from now, people will still use their mobile device to call each other. Many other forms of communication have sprung up over the years — SMS, email, instant messaging, Skype video calling, Facebook, and so on — and I am a heavy user of most of them. Before I call someone I usually prefer those other forms: less interrupting, less direct, less intrusive. But still, voice calls remain important to me and I don't see that changing anytime soon. But how will I make voice calls 10 years from now?

Shouldn't be too difficult to figure out, should it? Back in 2008 I wrote a full chapter in one of my books about mobile voice options for the future, but which way it would go was still unclear. Now, four years later, I've revisited that chapter and have come to the conclusion that there are even more options today and even less clarity about where it will go.

For the IMS supporters the way forward is clear: the VoLTE profile for IMS will be the ultimate solution. The road, however, is long and thorny. CS fallback will be used by many to bridge the time until VoLTE can be introduced; good luck with the longer call setup delays. And once VoLTE has launched, Single Radio Voice Call Continuity (SR-VCC) to a circuit-switched channel will have to be used by many network operators until LTE networks are truly ubiquitous. IMS, CS fallback and SR-VCC including IMS centralized services: each system is a daunting task to introduce. CS fallback might be the easiest one, but even here, the complexity should not be underestimated.

Then there is, or was (who can say today), the VOLGA approach, which reuses the Generic Access Network (GAN) concept to keep everything of the existing circuit-switched voice solution except the radio layer and send everything over IP, no matter whether it's Wi-Fi (as in GAN) or LTE (as in VOLGA). Its appeal is its incredible simplicity to implement: no new voice infrastructure except a gateway box, no new billing system as the current one continues to be used for all subscribers, and a straightforward implementation in mobile devices by reusing already existing GAN software, e.g. on Android devices. But it's not loved in the operator world, so the best this solution can hope for is a reincarnation.

But perhaps even VOLGA is too much to do in this day and age of ever-falling prices for voice minutes. So how about dual-radio phones? One baseband for data and one baseband for telephony. The HTC Thunderbolt LTE smartphone sold by Verizon, for example, has gone this way. It's the first of its kind, so it's bulky and power hungry. But look at early GSM or UMTS phones; they didn't exactly win prizes for slimness or power efficiency either. So there is reasonable hope that over time even dual-radio designs can become small and power efficient. It would mean, however, that the "legacy" infrastructure would have to be kept and maintained indefinitely. Perhaps it has to be kept anyway to support those people who just want a 10 Euro phone for voice and SMS. Yes, there will be fewer people buying such phones in the future, but I predict it will still be a sizable group. And then there's all the M2M equipment in embedded systems, and perhaps cars in the future with eCall in Europe. There is no way that the infrastructure required to communicate with those devices goes away in the next 15 years. So perhaps dual radio will reign?

Or should perhaps Apple with FaceTime or Microsoft with Skype become the standard voice and video solutions on PCs and mobile devices? Great, then I can't call my friends with an iPhone anymore, and vice versa. Perhaps Apple and Microsoft will strike a deal and build a gateway between their worlds, but I wouldn't count on it. Also, as long as networks are built the way they are built today, especially in the US, over-the-top voice services will continue to be unreliable at best over mobile infrastructure and will drop as soon as you run out of LTE or HSPA+ coverage. So I don't think that's going to be the ultimate answer either.

There's no single solution I look at today from which I would say, 'yes, I'm sure this will be the main telephony service in 10 years from now'. So perhaps a little bit of everything? Or will one of the solutions be able to overcome its weaknesses? From my point of view, the most difficult thing to predict in mobile today is how telephony will work in 10 years from now. Compared to that, everything else is a piece of cake.