5G Above 5 GHz – More Than Just A Few Meters?

Last month I had a post about 5G heavily relying on spectrum beyond 5 GHz and the catch that today's consumer devices using such spectrum can only cover a few meters. In other words, using spectrum in the 30 or even 70 GHz range (called millimeter waves, or mmW for short) won't work for cellular networks where base stations are several hundred meters apart even in densely populated areas. Fortunately, not everybody agrees.
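To put the "few meters" into perspective, here is a back-of-the-envelope sketch in Python; the 100 m link distance is my own example value. The Friis free-space formula alone shows how much harsher the 30 and 70 GHz bands are than a classic cellular band around 2 GHz:

```python
import math

C = 3e8  # speed of light in vacuum, m/s

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB (Friis formula)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

# The same 100 m link at a classic cellular band vs. two mmW candidates.
for f_ghz in (2, 30, 70):
    print(f"{f_ghz:>2} GHz: {fspl_db(f_ghz * 1e9, 100):.1f} dB")
# -> 2 GHz: ~78.5 dB, 30 GHz: ~102.0 dB, 70 GHz: ~109.3 dB,
#    i.e. 23 to 31 dB of extra loss before walls, rain or
#    foliage even enter the picture.
```

Those 23 to 31 dB are exactly the gap that the approach described below tries to close with beamforming.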

After I posted my article I was made aware of this very interesting IEEE article in which the authors describe their study of how such ultra-high frequency ranges could become usable in a cellular environment. Their conclusion, based on experiments in a real environment, is that by using a high number of tiny antennas for beamforming in mobile devices and base stations, it's possible to overcome the high attenuation of the air interface in the 30 and 70 GHz bands and thus significantly increase the transmission range. They predict that the combination of beamforming and large 1 GHz carriers can increase overall air interface capacity by an order of magnitude compared to the 20 MHz carriers used for LTE today.
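To illustrate the order-of-magnitude claim, here is a small sketch based on Shannon's capacity formula; the SNR values are purely hypothetical examples of mine, not numbers from the paper:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon capacity C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Hypothetical comparison: a 20 MHz LTE carrier at a healthy 20 dB SNR
# vs. a 1 GHz mmW carrier that, even with beamforming, only reaches 5 dB.
lte = shannon_capacity_bps(20e6, 20)
mmw = shannon_capacity_bps(1e9, 5)
print(f"LTE, 20 MHz @ 20 dB: {lte / 1e6:6.0f} Mbit/s")  # ~133 Mbit/s
print(f"mmW,  1 GHz @  5 dB: {mmw / 1e6:6.0f} Mbit/s")  # ~2057 Mbit/s, ~15x
```

Even with a much worse SNR, the sheer bandwidth of a 1 GHz carrier wins by more than an order of magnitude in this toy example.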

As the antennas are tiny, a space of 1.5 by 1.5 centimeters in a smartphone could hold 16+ of them, which would be enough to achieve the desired beamforming effect. The authors note, however, that with the current approach of treating each signal path separately it is not feasible to process so many inputs and outputs and that new methods have to be found, especially on the smartphone side, to master this new level of complexity while keeping the amount of energy necessary for the processing in check.
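A quick sanity check of that figure, assuming the common half-wavelength element spacing (the calculation is mine, not from the paper):

```python
C = 3e8  # speed of light, m/s

def elements_per_side(freq_hz: float, side_m: float) -> int:
    """Antenna elements that fit along one side at half-wavelength spacing."""
    half_wavelength_m = C / freq_hz / 2
    return int(side_m / half_wavelength_m + 1e-9) + 1  # epsilon guards float rounding

for f_ghz in (30, 70):
    n = elements_per_side(f_ghz * 1e9, 0.015)  # 1.5 cm side
    print(f"{f_ghz} GHz: {n} x {n} = {n * n} elements in 1.5 x 1.5 cm")
# -> 30 GHz: lambda/2 = 5 mm   -> 4 x 4 = 16 elements
# -> 70 GHz: lambda/2 ~ 2.1 mm -> 8 x 8 = 64 elements
```

So at 30 GHz the 16-element figure checks out, and at 70 GHz even more elements would fit into the same space.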

Another challenge pointed out by the authors is indoor coverage, because even with beamforming, millimeter waves still don't penetrate walls and other solid obstacles well. In other words, mmW base stations must also be deployed inside buildings to go beyond today's data rates there. In many cases it's unlikely that several operators can deploy their mmW equipment inside a single building, so the authors note that a new business model might be required in which a third party offers mmW access equipment for interconnection to traditional mobile backhaul networks.

While beamforming holds the key to extending the range of millimeter-wave systems to usable distances, it creates the problem of how synchronization and broadcast channels, which have to be transmitted omnidirectionally, can reach devices. In addition, the channel state from and to each device needs to be continuously tracked in order to keep the beams aligned to individual devices. This is likely going to impact a device's power-saving abilities, as the transmitter can't be off for long periods even if no data has to be transmitted.
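As a rough illustration of the broadcast problem (all geometry figures below are assumptions of mine, not from the paper): a message that an omnidirectional antenna sends once has to be repeated in every beam direction when beamforming is used.

```python
# Hypothetical sector and beam geometry, for illustration only.
sector_azimuth_deg = 120   # horizontal coverage of one sector (assumed)
sector_elevation_deg = 30  # vertical coverage (assumed)
beamwidth_deg = 10         # width of a single beam (assumed)

beams = (sector_azimuth_deg // beamwidth_deg) * (sector_elevation_deg // beamwidth_deg)
print(f"beam positions to sweep: {beams}")  # 12 * 3 = 36
# A sync/broadcast message now costs ~36 transmissions instead of one,
# and an idle device must stay awake long enough to catch 'its' beam.
```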

The authors list many further points that have to be considered due to the completely different nature of using mmW in combination with beamforming compared to today's LTE systems. It's by no means a quick read, so bring some time if you want to explore the paper; it's definitely worth it.

Thanks to Guy Daniels for pointing out the paper via his article on the topic here!


5G Super High Frequency Radio Technologies Are Great, BUT…

Last week, a 3GPP 5G RAN (Radio Access Network) workshop took place in Phoenix, Arizona, for interested companies to voice their opinions about requirements and potential implementations of Super High Frequency air interfaces between 6 GHz and 100 GHz. A summary can be found here as well as a number of very interesting presentations; it's well worth a look. There is one important thing, however, that is rarely mentioned, perhaps because it's implicitly understood: transmissions over 6+ GHz radios will only reach devices a few meters away at best, which means we have to say goodbye to our current idea of what a cellular radio network is. Or, in other words: how do you bring such a radio close enough to devices, whether they are carried by humans or are machine-type communication devices installed in fixed places or moving inside and outside of buildings?

Today it's already very difficult to drag fiber to macro base stations covering areas hundreds of meters in diameter. So how is that going to work in the future? Electricity lines are dragged into the last corners of civilization, but the business model for that is entirely different: it's not the electricity company doing it, it's the end user with an interest in having electricity for lighting and other things in all places, not necessarily only for himself but also for others. Do we require a similar approach for 5G data networks as well? Is there an alternative to such an approach? How can an end user provide connectivity for others without being responsible for the data flowing over this connection? And can an organization consisting of network operators and equipment manufacturers actually define something like that? For me the answers to those questions are even more interesting than the details of future 6 GHz+ radio interfaces.

Obviously, answers to this can't be given by 3GPP RAN, as they focus on air interface aspects. Instead, this is the task of 3GPP's System Architecture (SA) group, which will hold its first 5G-only meeting in December 2015. A joint workshop between SA and RAN is scheduled for the second half of 2016 according to 3GPP's tentative timeline for 5G. In other words, there's still quite some time to ponder these questions.

5G – Separating Use Cases From Technical Debates

For first and second generation mobile networks the use case was pretty simple: let's make voice telephony wireless and mobile. For third and fourth generation networks the use case was to mobilize the Internet. Perhaps it's only in retrospect, but these use cases are pretty easy to grasp. On the other hand I can still remember the 'search for a killer app' for 3G networks that went on for many years. I'm not sure if it was ever found, as that search was done in a mindset that the killer app should come from network operators, when in reality the 'killer app', as far as I'm concerned, was to mobilize the Internet as a whole. So what about 5G then?

Compared to the discussion that took place around 3G (UMTS) and 4G (LTE) at the time, the discussion on what 5G will be and why we need it is too hazy for me: lots of more or less realistic use cases are debated, while the discussion on how 5G will actually work happens more or less in the background. Stephen Temple over on his web site suggests splitting the 5G discussion into a use case debate and a technical debate. A good idea, in light of the fact that most of the network-operator-centric use cases discussed at the time for 3G and 4G were never realized the way they were envisaged (e.g. IMS as a universal service platform). He has a number of very interesting thoughts on the technical side, including the potential non-regulation of spectrum above 5 GHz and close-range wireless-fiber networks as technical cornerstones of 5G.

The 1 Millisecond 5G Myth

5G must be on the steeply rising part of the Gartner Hype Cycle, as I have heard a lot of non-technical people making a lot of technical statements out of context, beyond the usual Mbit/s peak data rate claims. A prime example is the 1 millisecond round trip time that 5G should/will have to enable the 'tactile' Internet, i.e. Internet connectivity that is used to remotely interact with the physical world.

That all sounds nice, but physics stands in the way and nobody seems to say so. The speed of light and electricity is limited, and in one millisecond light can only travel around 200 km through an optical cable. So even if network equipment does not add any latency whatsoever, the two endpoints of a 1 ms round trip can be at most 100 km apart. In other words, there's no way to remotely control a robot with a latency of 1 ms from a place halfway around the world. But then, why let physics stop you?
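The arithmetic fits in a few lines of Python, taking roughly 200,000 km/s as the signal speed in optical fiber:

```python
C_FIBER_M_S = 2e8  # ~speed of light in optical fiber (refractive index ~1.5)

rtt_s = 1e-3  # the advertised 1 ms round trip time
max_one_way_km = C_FIBER_M_S * rtt_s / 2 / 1000
print(f"max distance between the endpoints: {max_one_way_km:.0f} km")  # 100 km
# ...and that already assumes zero processing delay in every node on the path.
```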

So perhaps what was really meant is to further reduce the latency of the network components themselves? A big step was taken in LTE with an air interface that uses 1 ms scheduling intervals and by basing all network interfaces on the IP protocol to remove protocol conversions and the resulting overhead and latency. A scheduling interval of 1 ms means the round trip time over the eNodeB is at least on the order of twice that, without even forwarding the packet to another node in the network. Add potential HARQ (Hybrid ARQ) retransmissions and you already end up at several milliseconds. Sure, one could further reduce the length of the scheduling intervals at the expense of additional overhead. But would it really help, considering the many other routers between one device and another? Have a look at this great post by Don Brown and Stephen Wilkus which goes into the details.
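To make this concrete, here is a rough best-case budget for a small request/response exchange over LTE; the individual figures are illustrative ballpark values of mine, not measurements:

```python
# Illustrative best-case budget for a small uplink/downlink exchange (ms).
budget_ms = {
    "scheduling grant cycle": 4.0,           # UE requests and receives an UL grant
    "uplink transmission (1 ms TTI)": 1.0,
    "eNodeB + core processing": 2.0,
    "downlink transmission (1 ms TTI)": 1.0,
    "UE processing": 2.0,
}
total = sum(budget_ms.values())
print(f"best case RTT:            ~{total:.0f} ms")
# An LTE FDD HARQ retransmission arrives 8 subframes (8 ms) later:
print(f"with one HARQ retransmit: ~{total + 8:.0f} ms")
```

Even before a single packet leaves the radio network, the budget is an order of magnitude away from 1 ms.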

So What Exactly Is 5G?

Now that 3GPP has officially started working on 5G, the time has come to put lofty ideas and cheap talk into practical specifications. I'm looking forward to this because I still find most ideas that are currently floating around too abstract and unrealistic. But vendors and the NGMN have started publishing whitepapers that give a bit more insight into where we are going. After reading the whitepapers of Ericsson, Nokia and the NGMN on the topic, here's my summary and my own thoughts:

Radio Technology Mix: All whitepapers agree that 5G will not be about a single radio technology vying for dominance but rather a technology mix. Current radio technologies such as LTE(-Advanced) in the cellular domain and Wi-Fi in the home and office domain will be further evolved and are part of that mix. New technologies should be specified to exploit the potential of large chunks of so far unused spectrum above 6 GHz for communication over very short distances. Some of these technologies already exist today, take Wi-Fi 802.11ad as an example.

Virtually Latency Free: The "Tactile Internet" is a new buzzword that stands for tactile remote control of machines with instantaneous (virtually latency-free) feedback. Here's a good description of the concept. Marketing managers are promising round trip delay times of a millisecond in future networks but forget to mention the constraints. Or perhaps they have discovered Star Trek-like subspace communication? More on this in a future post.

Ultra-Dense Deployments With Ultra-Cheap Radios: As all whitepapers I've read correctly point out, the only way to increase data rates is to shrink cell sizes. This goes hand in hand with using higher frequency bands above 6 GHz. That means that the number of (what we still call) 'base stations' has to grow by orders of magnitude. That in turn means that they have to become ultra-cheap to install, they must configure themselves without human intervention and they must operate at almost zero cost. Sounds like a nice challenge, and perhaps turning light bulbs into yocto base stations (nano, femto and other small units are already taken…) could make this a reality? But who's going to pay the extra money to put a transmitter into light bulbs, who's going to 'operate' the light bulb, and should such connectivity be controlled or open to everyone? That's not only a technical question, it will also require a totally different business model compared to network operators running a cellular network and installing infrastructure without involvement of their customers. Again, the light bulb comes to mind: light bulbs and power cables are installed by their owners not only to illuminate a certain area for themselves but also for others. So in addition to fundamentally new technology and fundamentally new business models, it's also a fundamentally new psychological approach to providing connectivity. Perhaps it should be called "light-bulb connectivity"?

Lots of Devices Exchanging Little Data: Today's networks are optimized to handle a limited number of devices that transfer a significant amount of data. In the future there might well be many devices talking to each other or to servers on the network that exchange only very little data and do so only very infrequently. That means that a new approach is required to reduce the overhead needed for devices to signal to the network where they are and that they are still available, perhaps beyond what 3GPP has already specified as part of the 'Machine Type Communication' (MTC) work item.

Local Interaction: Great ideas are floating around on radio technologies that would allow local interaction between devices. An example is cars communicating with each other to exchange information about their location, speed, direction, etc. This sounds like an interesting way to enable autonomous driving or to prevent accidents, but it might break the business model of making money by backhauling data.

Spectrum Licensing Scheme Shake-Up: Some whitepapers also point out that for higher frequencies it might not make a lot of sense to sell spectrum for exclusive use to network operators. After all, the range is very limited and not everybody can be in the same place. So license-free or cooperative use might be more appropriate, especially if a chunk of spectrum is not used for backhauling but only for local connectivity.

3GPP's Role: All of this makes me wonder a bit how 3GPP fits into the equation. After all, it's an industry body in which manufacturers and network operators define standards. In 5G, however, network operators are probably no longer in control of the 'last centimeter' devices and thus have no business model for that part of 5G. So unlike in 2G, 3G and 4G, 3GPP might not have all the answers and specifications required for 5G.

Summary

So here's my take on the situation: for 5G, everything needs to change, and whenever the concept or a part of it is discussed, one central question should be asked: who is going to backhaul the massive amounts of data and how is that done? In 2G, 3G and 4G that question has been very simple to answer over the last decades: network operators set up base stations on rooftops and install equipment to backhaul the data over copper cables, fiber or radio. For 5G that simple answer will no longer work due to the massive increase in the number of radios and backhaul links required. Operators will no longer be able to do this on their own as we move from nodes that cover the last mile to nodes that only cover the last centimeters. That means we have to move to a 'light-bulb' model with all that this implies.

3GPP’s Odyssey to 5G Has Begun

A couple of days ago, 3GPP published a tentative timeline for its upcoming 5G standardization activities in the coming years. In other words, the first steps are now being taken from "what could 5G be" to "what will 5G be".

The 3GPP timeline set for 5G pretty much starts today with the SA1 SMARTER Study Item and extends well into 2020:

  • Today: SA1 SMARTER Study Item
  • September/December 2015: Radio channel modeling and kick-off of a RAN Study Item on scope and requirements.
  • February 2016: RAN Study Item to evaluate potential solutions.
  • 2018: A RAN Work Item to specify the solutions agreed on that will extend into at least 2020.
  • LTE will continue to evolve over this timeframe as well, as it is seen as an integral part of an overall 5G network architecture.

The first step to get from "what could it be" to "what will it be" might actually be the most difficult one, as the ideas about what "5G could be" currently imply a fundamental conceptual change, both in terms of technology and in terms of who does what in the value chain. I've taken a look at a couple of 5G whitepapers and will post a summary of my thoughts in an upcoming blog post.

No matter how this turns out in the next 5 years it's going to be an interesting Odyssey with lots of surprises along the way!