Two pointers today to performance simulations performed in 3GPP for LTE, and a comparison to baseline HSPA:
- 3GPP R1-072580: Liaison statement with an overview of the results of LTE performance simulation in uplink and downlink.
- 3GPP R1-071956: Simulations performed by Ericsson on the downlink (referenced in the document above).
The result is that 2×2 and 4×4 MIMO bring a tremendous benefit to average cell throughput: under the same radio conditions, the average cell spectral efficiency is 1.58 bits/s/Hz with 2×2 MIMO and 3.04 bits/s/Hz with 4×4 MIMO.
Even baseline HSPA, with a theoretical peak data rate of 14 Mbit/s in a 5 MHz channel, has a peak spectral efficiency of 2.8 bits/s/Hz, which comes close to what the report says can be done only with 4×4 MIMO (whose peak spectral efficiency is even higher). So if the channel has an SNR high enough for 3 bits/s/Hz (about 8 dB), why doesn't baseline HSPA reach this speed as well?
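As a quick sanity check on these numbers (a back-of-the-envelope Shannon calculation, not taken from the reports), the HSPA peak spectral efficiency and the SNR needed for 3 bits/s/Hz work out like this:

```python
import math

# Peak spectral efficiency of baseline HSPA: 14 Mbit/s in a 5 MHz channel
peak_rate_bps = 14e6
bandwidth_hz = 5e6
hspa_peak_se = peak_rate_bps / bandwidth_hz  # 2.8 bits/s/Hz

# Shannon capacity bound: C/B = log2(1 + SNR)
# SNR needed (ideally) for 3 bits/s/Hz:
target_se = 3.0
snr_linear = 2 ** target_se - 1           # = 7
snr_db = 10 * math.log10(snr_linear)      # ~ 8.45 dB

print(f"HSPA peak spectral efficiency: {hspa_peak_se:.1f} bits/s/Hz")
print(f"SNR needed for {target_se} bits/s/Hz: {snr_db:.2f} dB")
```

The ~8.45 dB result is where the "about 8 dB" figure in the text comes from; real receivers of course need a margin above this ideal bound.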
Hm, what am I missing? One thing might be that users very close to the base station, or with an external antenna, enjoy an SNR well above 8 dB and push the average data rate up by transmitting at rates far above the cell average. But is that alone enough for such a difference?
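To see how much a few high-SNR users can lift the average, here is a small illustrative sketch. The SNR values below are made up for illustration and are not taken from R1-072580 or R1-071956; the point is only that the cell-average spectral efficiency can land well above what a mid-cell user at 8 dB sees:

```python
import math

def shannon_se(snr_db: float) -> float:
    """Spectral efficiency upper bound (bits/s/Hz) at a given SNR (Shannon)."""
    return math.log2(1 + 10 ** (snr_db / 10))

# Hypothetical SNR spread across a cell: several edge/mid-cell users plus
# a few users right next to the base station (illustrative values only)
snrs_db = [0, 2, 4, 6, 8, 15, 20, 25]
ses = [shannon_se(s) for s in snrs_db]

avg_se = sum(ses) / len(ses)
print(f"Per-user SE (bits/s/Hz): {[round(x, 2) for x in ses]}")
print(f"Cell-average SE: {avg_se:.2f} bits/s/Hz")
```

In this toy distribution the average ends up above the ~2.9 bits/s/Hz that an 8 dB user can reach, purely because the three cell-center users transmit so much faster, which is one way the averages in the reports could exceed what most individual users experience.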