2.4GHz Wi-Fi is not suitable for two big reasons: interference and low bandwidth. In any suburban or city environment, and sometimes even rural ones, 2.4GHz Wi-Fi will be congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is just too large for all the equipment that uses it in today’s world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz Wi-Fi networks, while only 3 at most can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.
2.4GHz, at least here in the US, only has three channels (1, 6, and 11) that will not interfere with each other. If anyone puts their network between these channels, it will knock out both the one below and the one above; channel 3 would interfere with both channels 1 and 6, for example. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing for much higher throughput. 2.4GHz allows 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
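The "1, 6, 11" rule falls out of the channel spacing: 2.4GHz channels sit 5MHz apart, but each transmission is roughly 20MHz wide (~22MHz for old 802.11b). Here's a rough sketch in Python of that arithmetic, assuming 20MHz-wide channels:

```python
def center_mhz(channel: int) -> int:
    """Center frequency in MHz for 2.4GHz channels 1-13 (5MHz spacing)."""
    return 2407 + 5 * channel

def overlaps(ch_a: int, ch_b: int, width_mhz: int = 20) -> bool:
    """Two channels interfere when their centers are closer together
    than one channel width."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

# Channels 1, 6, and 11 are 25MHz apart, so they clear each other:
print(overlaps(1, 6), overlaps(6, 11))   # False False
# Channel 3 sits between 1 and 6 and stomps on both:
print(overlaps(3, 1), overlaps(3, 6))    # True True
```

With 802.11b's ~22MHz width the same check still leaves 1/6/11 as the only non-overlapping trio in the US's 11-channel allocation.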
Personally, I think OEMs should just stop including it, or have it disabled by default and only enable it in an “advanced settings” area.
Edit: I am actually really surprised at how unpopular this opinion appears to be.
Two words: Backward Compatibility
There comes a time when maintaining backward compatibility either introduces serious security flaws or becomes too great a burden to maintain. Take the cellular networks shutting down 2G and 3G in the United States, for example. Yes, keeping 2G running maintains backwards compatibility, but 2G is highly flawed and easy to exploit. They sure as hell aren’t shutting it down to free up the 700 kilohertz of bandwidth that it’s been stuck on for 20 years.
While 3G has been phased out, 2G has been kept for calls in rural areas. The reason is… you guessed it: coverage.
And in the US, 3G hasn’t been fully phased out. It’s no longer available for consumer use, but it still exists for commercial use like remote device management.
There are millions of remote devices performing remote monitoring via 3G today for things like gas pipelines, oil wells, etc (and probably much more that I’m not aware of).
Interesting… Where I’m from, the industrial stuff uses 2G (and 4G for the data-intensive cases).
AT&T has shut down its 2G network; Verizon has shut down its 2G or is in the process of doing so; and T-Mobile just delayed its shutdown but had planned it for next month. The frequencies travel the same distance on 2G and LTE because they’re the same wavelengths. What’s different is that, again, 2G is only 700 kilohertz wide, where LTE uses 20 megahertz channels. So the same power is spread across a wider range of frequencies: dBm per MHz.
The world isn’t just US, you know…
That’s true, but the fact that power gets reduced over wider bandwidth is a physics thing.
Neither of those is the case for 2.4GHz though.
For now, sure. But that’s likely because there aren’t very many bands. After all, with the inclusion of 6GHz now, there are three bands that must be supported, which is nothing like what the cellular networks have to deal with.
2.4 has many applications that just don’t make sense for 5 or 6. Sure, the latter can transmit at higher rates, but you don’t always care about the speed. Sometimes you’ve got the choice between blasting 2.4 through an annoying section of wall, or drilling a hole. And you don’t always have the right to drill a hole through a wall.
Consider my student dorm housing in college. The walls were literal concrete with wiring and piping running through them. 5GHz had issues with penetration, and in certain areas you just couldn’t get internet. 2.4GHz had similar issues, sure, but to a much lesser extent. Those dorms are still in use today, and while you might be able to finagle the perfect placement for coverage, hanging the router on the back of a wooden door in the middle of the unit just isn’t a great idea, for many reasons.
Eventually, I suspect that Wi-Fi frequencies will be high enough that you will need an access point in every single room, just like you need a light in every single room. I’m thinking of those Wi-Fi access points that look kind of like smoke detectors, except larger, hung from the ceiling.
This would be a step back from where we are now.
The concept of wireless is to remove the need to set things up over and over: no wires, no dealing with holes in walls, no need to get to the specific spot that has a plug, etc.
What you’re talking about is making every room require more cables than before: a run to every room you might want internet in, possibly even two.
I’d rather run a cell tower type setup in my backyard than deal with running dozens of cables through my wall just to get wifi.