Is it preferable to use the 5 GHz band in the house for things like the TV, Apple TV, HomePod and other stationary devices that are in close proximity to the router?
Is this a common use case, or are people just using 2.4 GHz for these devices?
In many (crowded) environments there’s less traffic from other routers on 5 GHz compared to 2.4 GHz.
So, if you can use 5 GHz, I would (unfortunately many IoT devices don’t support 5 GHz…).
My router offers 2.4 GHz and 5 GHz simultaneously under the same SSID. Devices that can use 5 GHz connect on that band, and devices that can't fall back to the 2.4 GHz band. It works like a charm without anything to configure on my end.
For day-to-day internet usage like streaming, 2.4 GHz networks are fine. But when it comes to heavy network traffic, a 5 GHz network is a must-have for me. My MacBook Pro can copy data to and from the NAS at 50 to 80 MB per second over the 5 GHz network under ideal circumstances. And that is really nice, almost like a wired Ethernet connection.
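For a rough sense of how close that is to wired speeds, here is a back-of-the-envelope conversion. This is just a sketch: the 50–80 MB/s figures come from the post above, and the ~940 Mbit/s figure for usable gigabit Ethernet throughput is a typical real-world value, not a measurement.

```python
def mbytes_to_mbits(mb_per_s: float) -> float:
    """Convert megabytes/s to megabits/s (1 byte = 8 bits)."""
    return mb_per_s * 8

# Figures from the post: 50-80 MB/s over 5 GHz WiFi.
for mb in (50, 80):
    print(f"{mb} MB/s = {mbytes_to_mbits(mb):.0f} Mbit/s")
# 50 MB/s = 400 Mbit/s, 80 MB/s = 640 Mbit/s -- a good chunk of
# gigabit Ethernet's roughly 940 Mbit/s of usable throughput.
```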
So, my opinion would be: for a TV, an Apple TV or a HomePod, a 2.4 GHz network is fine.
For devices like a Mac or even an iPad, maybe even an iPhone: 5 GHz networks are really nice. And I am looking forward to WiFi 6.
In a broad sense I don’t think it really matters… but the two primary variables are distance and bandwidth. If you have a device that is not very close to the access point, you probably want to go with 2.4 GHz since it has longer range. This is especially true for things that don’t require a lot of bandwidth. For example, I put my video doorbell on 2.4 GHz since it is a bit far away from the AP and doesn’t require huge amounts of bandwidth. On the other hand, I try to put my streaming devices on 5 GHz since I want to ensure that I get good HD TV. (2.4 GHz is probably good enough for that, but why not? And it keeps the 2.4 GHz channel clearer for all of the other stuff.) My iMac is hardwired… but my MBP is on 5 GHz. My Wemo light switches only support 2.4 GHz (I think), and that is the right place for them anyway.
My wireless network is served up by our Eero mesh. Eero has band steering to encourage devices to use 5 GHz where possible. The Apple TV is connected directly to the FiOS router by Ethernet. I don’t think it matters all that much which band is being used, since most of the devices are low-demand IoT bulbs, Echos, etc.
WiFi networks are built up of a number of channels through which communication takes place. The base channel bandwidth is 20 MHz for both 2.4 GHz and 5 GHz, and since bandwidth determines the maximum communication speed, in essence this is the same between the two bands. However, the 2.4 GHz band is shared with all sorts of non-WiFi devices, including your microwave (which is just noise, of course). The 5 GHz band has more channels and far less non-WiFi interference, so signals typically have a better SNR, i.e. higher possible effective throughput.
Higher frequencies are more easily attenuated by building structures etc. Additionally, there are differences in allowed radio power between the 2.4 GHz and 5 GHz bands, which affects total range.
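The frequency part of that range difference can be put into numbers with the free-space path-loss formula, FSPL(dB) = 20·log10(4πdf/c). This is only a sketch: free-space loss ignores walls (which attenuate 5 GHz even more), and the 10 m distance is a made-up example.

```python
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

d = 10  # metres (hypothetical distance to the access point)
loss_24 = fspl_db(d, 2.4e9)
loss_50 = fspl_db(d, 5.0e9)
print(f"2.4 GHz: {loss_24:.1f} dB, 5 GHz: {loss_50:.1f} dB, "
      f"difference: {loss_50 - loss_24:.1f} dB")
# 5 GHz loses about 6.4 dB more than 2.4 GHz at any given distance,
# before walls and furniture make it worse.
```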
So if you have a good-quality signal (throughput) on 2.4 GHz, I would stick with that; it is just less finicky to make work consistently. Then again, if you’ve got an Eero system you don’t care, it just works.
PS: Some routers can bond WiFi channels, which adds bandwidth and thereby throughput. But if there are broadband noise sources, the wider bandwidth might effectively work against you. Many routers can adjust radio output power, giving better range and SNR. All this affects how well 2.4 GHz vs 5 GHz works in your specific setup…
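One way to see the bonding trade-off is the Shannon limit, C = B·log2(1 + S/N): doubling the channel width doubles B, but a wider channel also collects more noise, so capacity grows less than linearly. A small sketch with made-up power numbers (a fixed 1 mW received signal and a hypothetical flat noise floor; real WiFi rates sit well below this theoretical ceiling):

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, signal_mw: float,
                          noise_mw_per_mhz: float) -> float:
    """Shannon limit C = B * log2(1 + S/N), with total noise
    proportional to bandwidth (a wider channel collects more noise)."""
    noise_mw = noise_mw_per_mhz * bandwidth_mhz
    return bandwidth_mhz * math.log2(1 + signal_mw / noise_mw)

# Hypothetical numbers: 1 mW signal, 0.001 mW of noise per MHz.
for bw in (20, 40, 80):
    cap = shannon_capacity_mbps(bw, 1.0, 0.001)
    print(f"{bw} MHz channel: {cap:.0f} Mbit/s theoretical ceiling")
# Each doubling of channel width adds capacity, but less than 2x,
# because the SNR per channel drops as the bandwidth widens.
```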