This week a device was announced to the press that can supposedly use LTE networks in the US and HSPA networks in Europe and Asia. To the mainstream press things are clear: Internet access with the device will be better in the US than in Europe. Hm, they just overlooked a small detail: In the US, Verizon and AT&T currently use the 700 MHz band for LTE, and each carrier only has around 10 MHz of spectrum per direction in that band (see for example Verizon's band 13). In Europe, Dual Carrier HSDPA combines two 5 MHz carriers into a 10 MHz channel. This nullifies pretty much all of the theoretical speed advantage. The only thing that is left that LTE has and HSPA hasn't is MIMO.

Add to that Europe's denser network structure, a more mature technology and very likely lower power consumption due to optimized networks, and things suddenly look quite different. But I guess one can argue as much as one wants, 4G must by definition be better than 3G 🙂
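To put some rough numbers on this, here is a back-of-envelope sketch (the overhead factor and the per-carrier HSDPA rate are approximations, not measured values):

    # Rough peak-rate comparison: 10 MHz LTE with 2x2 MIMO and 64QAM
    # versus DC-HSDPA (2 x 5 MHz, 64QAM, no MIMO). Illustrative only.
    lte_subcarriers = 600          # 50 resource blocks x 12 subcarriers in a 10 MHz carrier
    symbols_per_ms  = 14           # OFDM symbols per 1 ms subframe (normal cyclic prefix)
    bits_per_symbol = 6            # 64QAM
    mimo_layers     = 2            # 2x2 spatial multiplexing

    raw_mbps    = lte_subcarriers * symbols_per_ms * bits_per_symbol * mimo_layers / 1e3
    usable_mbps = raw_mbps * 0.72  # assumed ~28% overhead for reference signals, control, CRC

    dc_hsdpa_mbps = 2 * 21.1       # two 5 MHz HSDPA carriers at ~21 Mbit/s each (64QAM, no MIMO)

    print(f"LTE, 10 MHz, 2x2 MIMO, 64QAM : ~{usable_mbps:.0f} Mbit/s peak")
    print(f"DC-HSDPA, 2 x 5 MHz, 64QAM   : ~{dc_hsdpa_mbps:.0f} Mbit/s peak")

With these assumptions the gap comes down to roughly 73 vs. 42 Mbit/s at the peak, and the LTE edge is essentially the MIMO term mentioned above.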
10 thoughts on “What the Main Stream Press Overlooks in the US LTE vs. Europe HSPA Discussion”
MIMO makes all the difference. It helps speeds at the cell edge.
Hi Herrro,
Unfortunately you are mistaken. MIMO speeds up transmissions close to the center of the cell under good signal conditions; only then can several data streams be transmitted simultaneously. At the cell edge, with high noise and a weak signal, MIMO, 64QAM and all the other features that take advantage of good signal conditions can't be used.
Kind regards,
Martin
Martin is correct. MIMO only works when C/I is relatively high, i.e., close to the site. In order to realize a gain from multiple antennas at the cell edge, you have to transmit the *same* data from the antennas (transmit diversity).
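A toy Shannon-capacity comparison illustrates the trade-off; this is an idealized sketch (perfect, interference-free 2x2 channel and an assumed 3 dB combining gain for diversity), not a model of a real network:

    # One stream with transmit diversity (combining gain) versus two
    # spatially multiplexed streams sharing the same power. Idealized
    # Shannon capacities in bit/s/Hz; real channels behave worse.
    import math

    def diversity(snr):
        # all power on one stream, ~3 dB combining gain assumed
        return math.log2(1 + 2 * snr)

    def multiplexing(snr):
        # power split across two ideal, interference-free streams
        return 2 * math.log2(1 + snr / 2)

    for snr_db in (0, 10, 20):     # cell edge ... close to the site
        snr = 10 ** (snr_db / 10)
        print(f"{snr_db:>2} dB SNR: diversity {diversity(snr):.1f}, "
              f"multiplexing {multiplexing(snr):.1f} bit/s/Hz")

In this toy model the two-stream mode only pulls clearly ahead once the SNR is high, i.e. close to the site; at the cell edge, putting all the power behind a single stream is the better deal.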
A couple of things not directly related to LTE might make the LTE user experience in the US better than the DC-HSPA experience in Europe. First, if site densities are equivalent, a 700 MHz system should show higher C/I (and performance) than a 2100 MHz system. Second, the European HSPA systems have to split their capacity between voice and data whereas the US LTE systems can devote all their capacity to data traffic, increasing the average capacity available to each user.
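To illustrate the frequency point, a simple free-space calculation (the distance is chosen arbitrarily; real-world propagation adds clutter and penetration losses on top):

    # Free-space path loss at the same distance for 700 MHz vs. 2100 MHz.
    # Free space only: antenna gains, clutter and building penetration
    # shift the absolute numbers, but the relative gap stays similar.
    import math

    def fspl_db(freq_mhz, dist_km):
        return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.44

    d = 1.0  # km, just an example distance
    loss_700  = fspl_db(700, d)    # ~89.3 dB
    loss_2100 = fspl_db(2100, d)   # ~98.9 dB
    print(f"700 MHz advantage at {d} km: {loss_2100 - loss_700:.1f} dB")  # ~9.5 dB

So at equal site density the lower band starts out with roughly a 9-10 dB link-budget head start before any of the deployment-specific factors are counted.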
MIMO as such can be used with either 1 or 2 layers: when radio conditions are poor, it uses 1 layer in order to mitigate the poor channel conditions (the same information is transmitted in parallel); when radio conditions are good, it uses 2 layers, which allows the peak throughput to be doubled. In real life, with the same amount of bandwidth, I would also expect better performance from LTE, not because of MIMO but because it is OFDMA-based and thus, in some respects, inherits the interference behaviour of TDMA technology. For instance, in a fully co-located network I often get better voice performance in GSM than in 3G, due to 3G interference that does not occur in GSM since it uses different frequency channels.
Always a good blog, Martin. One note: Verizon holds 22 MHz of 700 MHz spectrum for LTE across the US and can have 10 MHz channels. AT&T has various licenses, but I believe these are 12 MHz blocks; some of their markets will have 10 MHz channels and some 5 MHz.
Hi David,
thanks for the comment! On the WCDMA side I wouldn't discount RX diversity and Type 3i receivers with active interference cancellation. The results are quite astounding. So I wouldn't bet all my money on the cell-edge performance superiority of LTE, especially not when early, non-optimized LTE networks compete with very finely tuned WCDMA networks 🙂
Kind regards,
Martin
Hi Bailey,
unfortunately you are mistaken about the US channel bandwidth. Verizon has 22 MHz in the 700 MHz band, but that covers both uplink and downlink. In other words, only 10 MHz in each direction.
Martin
Much of the LTE story is not about what's here right now, but about the groundwork being laid for the future. For example, the carrier aggregation coming with LTE-Advanced is similar to DC-HSDPA, but it defines aggregated bandwidths of up to 100 MHz rather than 10 MHz.
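Assuming, purely for illustration, that the peak rate scales roughly linearly with aggregated bandwidth at the same modulation and antenna configuration:

    # Illustrative scaling only; real gains depend on how many component
    # carriers a device and a market can actually use.
    per_10_mhz = 73  # Mbit/s, approximate peak of a 10 MHz / 2x2 / 64QAM LTE carrier
    for bw_mhz in (10, 20, 40, 100):
        print(f"{bw_mhz:>3} MHz aggregated: ~{per_10_mhz * bw_mhz // 10} Mbit/s")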
But this really isn’t a US vs. Europe story. Upcoming TD-LTE trials in APAC will include advanced MIMO and beamforming (using 8×2 and 8×4 configurations) to squeeze even more capacity out of the RF pipe. While it will be quite a few years before the majority of subscribers are accessing these technologies, it’s worth noting that the relevant carriers in China and India alone represent over 2 billion users… or almost a third of the world’s subscribers.
Good discussion points. The thing that always strikes me is how many people proclaim that LTE is better than WCDMA. At the moment, though, WCDMA offers you a higher bit rate than LTE does!
What I've always been trying to find out is what it is in LTE that truly makes it superior (it's not carrier aggregation or MIMO, as these are done in WCDMA as well).
So what can LTE do that WCDMA cannot, if we wanted to bring WCDMA on par? 100 MHz of bandwidth for LTE? Which country can allocate that? Why couldn't it be allocated for WCDMA? A simpler architecture (eNB -> S-GW -> PDN-GW)? I-HSPA is even simpler (iNB -> GGSN).
I'm truly curious about finding a real answer to this one, as I am starting to believe that it's all just a political game rather than technology.
Technology harmonisation is one reason, but that is already happening with WCDMA, and LTE seems to cause a bad case of frequency-band fragmentation, so that benefit is lost.