I’m a heavy user of the 5 GHz band at home for Wi-Fi, as I’m not inclined to drill half a dozen holes through several rooms to get ‘the Internet’ and high-speed connectivity from my local servers (i.e. ‘the private cloud’) to my workspace. My biggest enemy: the required radar detection and the subsequent downgrade of the Wi-Fi channel to the lowest 80 MHz of the band.
So what’s the problem with using the lower part of the 5 GHz band? In practice, whenever radar is supposedly detected, my link rate drops from almost 800 Mbit/s to 500 Mbit/s for a while, as transmit power is limited in that part of the band. Also, I’m limited to an 80 MHz channel.
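To illustrate why the fallback lands on the lowest 80 MHz, here is a small Python sketch of the ETSI 5 GHz channel plan. The channel numbers and DFS ranges are my own assumptions taken from the regulations, not something stated in this post:

```python
# Sketch of the ETSI 5 GHz band plan (my assumption, not from the post):
# channels 36-48 (5150-5250 MHz) need no radar detection, while
# channels 52-64 and 100-140 require DFS.

def channel_to_mhz(channel):
    """5 GHz Wi-Fi channel center frequency: 5000 MHz + 5 MHz per channel number."""
    return 5000 + 5 * channel

def requires_dfs(channel):
    """True if the channel falls in an ETSI DFS range (52-64 or 100-140)."""
    return 52 <= channel <= 64 or 100 <= channel <= 140

# The "lowest 80 MHz" a router falls back to after a radar event:
fallback = [ch for ch in (36, 40, 44, 48) if not requires_dfs(ch)]
print(fallback)            # all four 20 MHz channels of the lowest 80 MHz block
print(channel_to_mhz(36))  # 5180 MHz
```

In other words, only the four 20 MHz channels from 36 to 48 are usable without radar detection, which is exactly one 80 MHz block, so that is where the router ends up after a DFS event.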
Fortunately, radar detection algorithms have improved over time, and detections don’t happen as often as they used to. The screenshot on the left shows how often my gear detects radar in the center of Cologne. I have no idea whether that would be a weather or an airport radar, but quite frankly, there are no radar stations near the center of Cologne, so I think these are false positives. At least the downgrade only happens once every week or two at the moment.