Recently I was on the road again for a couple of days and, as always, had my 3G/Wi-Fi router with me to connect several devices to the net over a single 3G connection. After a while I was a bit surprised that despite my mobile phone and the 3G/Wi-Fi box showing very good coverage (5 signal bars), transmission speeds were consistently below 500 kbit/s. Network overload? So I got out my little analyzer and saw that in the spot where I had placed the 3G router, the signal level was strong (-72 dBm) but there was heavy interference, as two cells were received at an equally strong level. The result was an Ec/No (signal-to-noise ratio) of -12 dB for both cells (for details see here). So despite the 5 signal bars, not an ideal place to put the 3G router.
So I walked around a bit to find a better spot, and in most other places I encountered a similar situation. But in a few places I could consistently receive one cell much more strongly than the other, with the serving cell having a much better Ec/No of -6 dB while the first neighboring cell remained at -12 dB. The signal level was a bit weaker (-78 dBm) but the result was quite stunning: instead of a throughput below 500 kbit/s I consistently got well over 2 Mbit/s.
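To make the effect a bit more tangible, here is a minimal sketch of why an equally strong second cell hurts so much even when the overall signal level looks fine. It assumes Ec/No can be approximated as the serving cell's pilot power divided by the total received power from all cells, with a pilot fraction of 10% picked purely for illustration; the numbers will not exactly match my measured values (real cells rarely transmit at full power and pilot fractions vary), but the mechanism is the point: an equally strong neighbor doubles the total received power and so costs about 3 dB of Ec/No.

```python
import math

def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

def approx_ec_no_db(serving_dbm, neighbour_dbm_list, pilot_fraction=0.1):
    """Very rough Ec/No estimate: serving-cell pilot power divided by the
    total received power from all cells. The pilot fraction is an assumption."""
    pilot_mw = dbm_to_mw(serving_dbm) * pilot_fraction
    total_mw = dbm_to_mw(serving_dbm) + sum(dbm_to_mw(n) for n in neighbour_dbm_list)
    return 10 * math.log10(pilot_mw / total_mw)

# One dominant cell: the pilot only competes with its own cell's power.
print(f"{approx_ec_no_db(-78, []):.1f} dB")      # -10.0 dB

# A second cell received at the same level: total power doubles, Ec/No drops ~3 dB.
print(f"{approx_ec_no_db(-72, [-72]):.1f} dB")   # -13.0 dB
```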
In other words, the signal bars shown on mobile devices today are clearly optimized for showing the user the quality of the network for voice calls, but not for finding good spots to place a 3G router. On the one hand, I don't think it would help to put RSSI and Ec/No values in front of the average user, as that might be too complicated. On the other hand, I think that for data-centric products, device manufacturers should think about a more meaningful indicator. It's not an easy task, though, as the signal-to-noise ratio of the serving cell can vary widely and quickly depending on the activity of other users. But I am sure some middle ground can be found.
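One possible middle ground, sketched below purely as an illustration: keep a simple bar display, but let it reflect the weaker of coverage (RSSI) and quality (Ec/No), since for data it is usually the worse of the two that limits throughput. The thresholds are my own guesses, not values from any standard or device.

```python
def data_signal_bars(rssi_dbm, ec_no_db):
    """Map RSSI and Ec/No to 0-5 'data bars'. Thresholds are illustrative only."""
    # Coverage component: is the raw signal level usable at all?
    if rssi_dbm < -100:
        coverage = 0
    elif rssi_dbm < -90:
        coverage = 3
    else:
        coverage = 5

    # Quality component: how clean is the serving cell against interference?
    if ec_no_db < -14:
        quality = 1
    elif ec_no_db < -10:
        quality = 2
    elif ec_no_db < -7:
        quality = 4
    else:
        quality = 5

    # For data, the weaker of the two components dominates the achievable throughput.
    return min(coverage, quality)

# The two spots from the text: strong but interfered vs. a bit weaker but clean.
print(data_signal_bars(-72, -12))  # 2 bars: bad place for the 3G router
print(data_signal_bars(-78, -6))   # 5 bars: the spot that gave well over 2 Mbit/s
```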