Two pointers today to performance simulations performed in 3GPP for LTE and a comparison to baseline HSPA:
- 3GPP R1-072580: Liaison statement with an overview of the results of LTE performance simulation in uplink and downlink.
- 3GPP R1-071956: Simulations performed by Ericsson on the downlink (referenced in the document above)
The result is that 2×2 MIMO and 4×4 MIMO bring a tremendous benefit for the average cell throughput, with average cell spectral efficiency for 2×2 MIMO at 1.58 bits/s/Hz and for 4×4 MIMO at 3.04 bits/s/Hz under the same radio conditions.
Even baseline HSPA, with a theoretical peak data rate of 14 MBit/s in a 5 MHz channel, has a peak spectral efficiency of 2.8 bits/s/Hz, which comes close to what the report says can be done only with 4×4 MIMO (whose peak spectral efficiency is even higher). So if the channel has an SNR high enough for 3 bits/s/Hz (about 8 dB), why doesn't baseline HSPA reach this speed as well?
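As a quick sanity check on the numbers above, here is a small sketch of the arithmetic: the HSPA peak spectral efficiency from the 14 MBit/s / 5 MHz figure, and the Shannon-limit SNR needed for 3 bits/s/Hz (the specific values are just the ones quoted in the text, not from the 3GPP documents themselves):

```python
import math

# Baseline HSPA peak: 14 Mbit/s in a 5 MHz channel
peak_rate_mbps = 14.0
bandwidth_mhz = 5.0
peak_se = peak_rate_mbps / bandwidth_mhz  # bits/s/Hz
print(f"HSPA peak spectral efficiency: {peak_se:.1f} bits/s/Hz")  # 2.8

# Shannon capacity per unit bandwidth: C/B = log2(1 + SNR)
# Minimum SNR (linear, then dB) needed for 3 bits/s/Hz:
target_se = 3.0
snr_linear = 2 ** target_se - 1
snr_db = 10 * math.log10(snr_linear)
print(f"SNR needed for {target_se:.0f} bits/s/Hz: {snr_db:.1f} dB")  # ~8.5 dB
```

The Shannon bound comes out at roughly 8.5 dB, which matches the "about 8 dB" figure mentioned above.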
Hm, what am I missing? One thing might be that users very close to the base station, or with an external antenna, enjoy an SNR higher than 8 dB and can push up the average cell throughput with transmission rates far above the average. But is that alone enough for such a difference?
3 thoughts on “LTE Performance Simulations”
The baseline UTRA does not use 64QAM but E-UTRA does (Table A.2.1.8). So the difference in spectral efficiency is not only due to more antennas but also to the higher modulation order, and then the numbers make more sense.
Yes, indeed, baseline HSPA doesn’t have 64QAM. So what I am wondering is:
Today, even super-optimized and operator-advertised tests ‘only’ speak of 5.76 MBit/s in the downlink with a Node-B upgraded to support HSDPA Cat 7/8 (http://tinyurl.com/7vo8jn). I am sure their SNR and backhaul capacity were as good as they can possibly be, and they are not downplaying the results 🙂 So given that, why was it not more, and how would 64QAM and MIMO change that? In theory, there is still room to make better use of the radio channel even without those features.
The comparison is with what is called baseline UTRA, which is a basic UTRA R6 system without any kind of advanced receiver. Today most HSPA devices have either interference rejection features like GRAKE and/or RX diversity implemented. These features reduce the own-cell interference and increase spectral efficiency typically by a factor of 2.5, which would raise the average spectral efficiency to about 1 bps/Hz under the same simulation conditions. By adding 64QAM modulation and MIMO (which are part of R8), the spectral efficiency will reach about the same figures as for LTE with MIMO. That the spectral efficiency figures are similar for the two systems is not surprising, as both systems are getting close to the Shannon limit. The advanced receivers that are now implemented in HSPA, whose purpose is to reduce own-cell interference, neutralise the only major benefit that OFDM has over CDMA, i.e. better orthogonality.