After my post on the stellar data rates that can theoretically be achieved by combining 60 MHz of 4G LTE bandwidth with 100 MHz of 5G NR in the 3.5 GHz range, I think it is also worth looking at real life and at how much capacity a live network cell actually offers when it is fully loaded.
So let’s say there’s an LTE cell out there that uses 20 MHz of spectrum. With 256QAM and 4×4 MIMO, the theoretical peak data rate is around 375 Mbit/s; minus roughly 15% of overhead for the control and reference channels, that leaves around 320 Mbit/s. But that figure assumes super ideal radio conditions and only a single user in the cell. A typical cell site has 3 sectors, so you can triple that value.
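The overhead arithmetic above can be sketched in a few lines; the figures are the post’s rounded numbers, not values from the 3GPP tables:

```python
# Back-of-the-envelope check of the figures above.
# 375 Mbit/s is the peak quoted for 20 MHz + 256QAM + 4x4 MIMO;
# ~15% is the stated overhead for control and reference channels.
peak_mbps = 375.0
overhead = 0.15

net_mbps = peak_mbps * (1 - overhead)
print(f"net peak per sector: {net_mbps:.0f} Mbit/s")        # ~319, i.e. "around 320"

sectors = 3
print(f"per site (3 sectors): {sectors * net_mbps:.0f} Mbit/s")
```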
But that’s not real life. In a real network, there are dozens of users in a cell transferring data simultaneously, and quite a few of them have marginal radio conditions and would only get 3-4 Mbit/s even if they had the whole 20 MHz channel to themselves. So the capacity of a fully loaded cell lies somewhere between 320 Mbit/s and 3-4 Mbit/s. But where exactly? Here’s an interesting slide deck by Huawei that answers this question. According to it, a fully loaded cell in an urban environment delivers a data rate of around 35 Mbit/s in the downlink, which is 11% of the theoretical maximum.
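The 11% utilization figure follows directly from the two numbers above:

```python
# Ratio of fully loaded throughput to the single-user ideal,
# using the post's numbers (35 Mbit/s loaded vs. ~320 Mbit/s ideal).
loaded_mbps = 35
ideal_mbps = 320
print(f"{loaded_mbps / ideal_mbps:.0%}")   # -> 11%
```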
That’s not very much compared to the theoretical maximum. At least one can aggregate several carriers, to say 60 MHz, which would triple that value to around 100 Mbit/s. And since a cell site typically has 3 sectors, the total throughput of the site when all sectors are fully loaded is around 300 Mbit/s. That sounds like a lot, but it is what around 1000 users have to share.
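Putting the whole chain together, again with the post’s figures (35 Mbit/s per fully loaded 20 MHz cell, 3 aggregated carriers, 3 sectors, and the rough estimate of 1000 users per site):

```python
# Aggregated site capacity and the per-user share, using the post's figures.
loaded_cell_mbps = 35   # fully loaded 20 MHz cell (Huawei figure)
carriers = 3            # 60 MHz via carrier aggregation
sectors = 3
users = 1000            # rough estimate of users sharing the site

site_mbps = loaded_cell_mbps * carriers * sectors
print(f"site throughput: {site_mbps} Mbit/s")    # 315, i.e. "around 300"

# Worst case: everyone transfers at once (in practice most users are idle).
per_user_kbps = site_mbps * 1000 // users
print(f"per user: {per_user_kbps} kbit/s")       # 315 kbit/s
```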