
That's exactly why it should be used for wifi!

2.4GHz is completely unusable in urban environments, because you're getting interference from two dozen neighbours. And everyone has a poor connection, so their "handy" nephew will turn up the transmission power to the maximum - which of course makes it even worse.

6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.

On the other hand, cellular networks are well-regulated: if an airport's entire network is managed by a single party they can just install extra antennas and turn down the power.

And it's not like cellular operators will be able to use it often: outdoor use falls apart the moment there are a bunch of trees or buildings in the way, so it only makes sense in buildings like airports and stadiums. Why would the rest of society have to be banned from using 6GHz Wifi for that?

Besides, didn't 5G include support for 30GHz frequencies for exactly this application? What happened to that?



> 6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.

I agree with this and with the fact that 6GHz should still be available for wifi, but this whole bandwidth frenzy over wifi has always seemed like a meme for anyone except power users. A 4K Netflix stream caps out around 15 Mbps, so >95% of typical home users will be just fine using 2.4/5GHz inside their own homes.


You've got to take into account that those bandwidth figures exist on paper only - nobody is getting 5Gbps out of their wifi.

In practice it is all about degraded performance. If you're sitting in a different room than the AP, close to your neighbour, do you want to be left with 50Mbps out of the original 5000Mbps, or 2Mbps out of the original 200Mbps?


> A 4K Netflix stream caps out around 15 Mbps

Yeah, but that's just because Netflix streams are ridiculously over-compressed -- they use extremely low-quality encodes. It's technically a "4K" stream, sure, but at a bitrate only realistically capable of 1080p.

An actual 4K stream (one that delivers the detail expected at 4K) is around 30 to 40 Mbps.


Back in the day, it was better to upscale netflix's 720p to 1080p than to stream 1080p.


The problem is not bandwidth. The problem is inconsistent performance and latency spikes caused by interference.


This "~10 to 20 Mbps is enough" nonsense is like claiming that 24 fps is enough to play games.

I mean sure, it's usable, but it's not good. You can notice the difference in buffering/scrubbing speed well into the 100+ Mbps range.

Plus, being able to download and upload files quickly, particularly to and from something like a home NAS, is important. 15 Mbps is like using a shitty USB 2 stick for everything!
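
For a rough sense of what 15 Mbps means in practice, here's a back-of-the-envelope sketch in Python (the file sizes are just illustrative assumptions, not measurements):

    # Rough transfer-time comparison at different link speeds.
    # File sizes are illustrative guesses.
    files_gb = {
        "photo/video sync": 5,
        "home NAS copy": 50,
        "big game update": 100,
    }
    link_mbps = [15, 100, 500]

    for name, size_gb in files_gb.items():
        size_megabits = size_gb * 8 * 1000  # 1 GB ~ 8000 megabits
        times = ", ".join(
            f"{mbps} Mbps: {size_megabits / mbps / 60:.0f} min" for mbps in link_mbps
        )
        print(f"{name} ({size_gb} GB): {times}")

At 15 Mbps the 100 GB download takes the better part of a day; at 500 Mbps it's under half an hour.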


But your home NAS should be on ethernet? Who would buy a NAS and then not wire it in??

The point here is that only devices like a TV, mobile, tablet or laptop should be on WiFi, and it's pretty hard to notice the difference between, say, 50Mbps and 500Mbps on any of those, except maybe if you are moving files around on your laptop.


> But your home NAS should be on ethernet? Who would buy a NAS and then not wire it in??

Your smartphone is not talking to your NAS over Ethernet.


I think you'll find downloading files to your phone from a NAS is like 0.01% type behaviour.


iCloud backups are something normal people do each time they plug in their phone.


50 Mbps is more than fine for iPhone backups.


Family of 4 comes home after a long day out, they all plug in their phones at the same time to charge and drop down on the sofa to vegetate in front of Netflix. Why is it buffering so badly?!?

Traffic is bursty. Higher bandwidth connections make the whole internet more efficient - if you can race to idle, then servers have fewer concurrent connections to keep track of, routers can clean up their NAT tables more quickly, etc.
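
A toy version of that "race to idle" point, with assumed numbers (a 5 GB backup per phone, one 15 Mbps stream in the background):

    # Toy "race to idle" illustration: four phones each upload an assumed
    # 5 GB backup while a 15 Mbps stream runs in the background. How long
    # does the backup burst keep the link saturated?
    backup_gb_per_phone = 5
    phones = 4
    stream_mbps = 15
    total_megabits = backup_gb_per_phone * phones * 8 * 1000

    for link_mbps in (50, 500):
        spare_mbps = link_mbps - stream_mbps  # capacity left over for the backups
        busy_minutes = total_megabits / spare_mbps / 60
        print(f"{link_mbps} Mbps link: saturated for ~{busy_minutes:.0f} min")

On the 50 Mbps link the backups fight the stream for over an hour; on the 500 Mbps link everything is back to idle within a few minutes.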


The average normie now is downloading 100GB Call of Duty updates over wifi. 5GHz is OK, but most of it is unusable due to DFS.


> 6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.

6GHz barely makes it through a piece of paper. I live in a dense downtown area of Los Angeles and I see zero 6GHz networks except mine, and sometimes three 5GHz networks (usually just two). No issues using a 160MHz-wide channel on 5GHz, at least for me.

My balcony is separated from the AP by a two-panel window; other than that it's in line of sight. 6GHz is not visible at all, 5GHz has a poor signal but is better than 2.4GHz, and 2.4GHz is completely unusable in my area.


> 6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.

I'm no expert and only speak from personal experience. When the signal is weak, you don't have the whole bandwidth, you only get low throughput. Ideally you would want a strong, high-penetration signal (low frequency) and all users on separate channels. That's of course impossible in densely populated areas.

Whenever I have to deal with setting up WLAN in the office or at home, I hate the experience and I try to use wired connections wherever possible.


That’s not how RF works (generally). It’s about the signal-to-noise ratio.

It gets really bad when the signal is difficult to distinguish from noise because (for example!) everyone is talking at roughly the same power level. Think of a crowded bar with everyone yelling at each other.

When one source is significantly louder than the others, even if the others are not that quiet, it’s not a big deal, unless at your ear/antenna they end up at the same loudness. Think of a concert with big speakers for the main act.

6GHz is better for many isolated networks right next to each other precisely because the other networks’ ‘voices’ lose power so quickly. You don’t have the competition for attention. Think ‘every couple in the bar gets their own booth’.

Wired connections are even better, because the amount of noise required to make the signal indistinguishable is orders of magnitude higher - like ‘noisy welder right on top/EMP’ levels - because the wires can actually be shielded. It’s like having your own hotel room.
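
The usual way to make this quantitative is the Shannon capacity bound, C = B * log2(1 + SNR). A quick Python sketch with made-up but plausible numbers:

    import math

    def shannon_capacity_mbps(bandwidth_mhz, snr_db):
        """Shannon limit C = B * log2(1 + SNR), an upper bound on throughput."""
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz * bit/s/Hz -> Mbps

    # Illustrative numbers only: a crowded 2.4GHz channel with a noise floor
    # full of neighbours vs a quiet 6GHz channel with a weaker signal but
    # almost no competing noise.
    print(shannon_capacity_mbps(20, 5))    # ~41 Mbps
    print(shannon_capacity_mbps(160, 25))  # ~1330 Mbps

Signal strength only matters to the extent that it lifts you above the noise; a weakish signal with no competition can still carry a lot.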


You're misunderstanding what is being proposed.

It's not saying 6GHz shouldn't be used for WiFi. It's saying that 6-6.4GHz (approx) is reserved for WiFi and 6.4-7GHz should be used for cellular networks.

My point isn't that we shouldn't have any WiFi on 6GHz, but that an extra 1GHz for WiFi is of limited utility compared to cellular networks.

You can still fit an entire 320MHz-wide channel in the lower 6GHz, and if it doesn't overlap with your neighbours like you say, why bother with 3x that?


The question then is: do we really need the whole 1GHz of spectrum for wifi if it doesn't really propagate to your neighbour? It should be much easier to avoid interference than on 2.4GHz, so you need fewer channels.


Doesn’t 6GHz have essentially the same penetration as 5GHz, and thus won’t it have all the same problems in a few years as people shift to 6GHz?


I count approximately twice as many channels available on 6GHz as on 5GHz, so even if we ignore the penetration differences between 5 and 6GHz, the 6GHz band is still better. Plus, this isn't a "pick one" type of scenario (especially with MLO): 5GHz + 6GHz is 3x as many channels as 5GHz alone.

https://en.wikipedia.org/wiki/List_of_WLAN_channels
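
A rough sketch of where the "twice as many channels" comes from, using approximate usable band widths (assumed round numbers, ignoring guard bands and DFS holes - not regulatory-exact figures):

    # Approximate usable spectrum per band (assumed round numbers).
    usable_mhz = {"5GHz": 500, "6GHz": 1200}

    for band, mhz in usable_mhz.items():
        counts = ", ".join(f"~{mhz // w} x {w}MHz" for w in (20, 80, 160))
        print(f"{band}: {counts}")

Which works out to roughly twice the channel count at any given channel width.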


> I count approximately twice as many channels available on 6GHz as on 5GHz

Isn't this mostly arbitrary, i.e. dependent on what frequency range you define the channels over, and thus how many channels you get? E.g. in the Wikipedia link, "6GHz" goes up to ~7.1GHz. Otherwise, the channels seem to be more or less centered 20MHz apart in each case.


Yeah, I wasn't claiming some intrinsic benefit of frequencies around 6GHz, but rather that we have administratively decided that the slice of spectrum available for "6GHz" wifi has approximately twice the room of the slice allocated for "5GHz" wifi. In reality, "5GHz" wifi is more like 5.2-5.9GHz (with a hole around 5.4GHz) and "6GHz" wifi is more like 5.9-7.1GHz.

The intrinsic benefit of the frequencies around 6GHz is the reduced penetration through walls, which also reduces congestion.


No, definitely not in practice. 5GHz reaches across multiple rooms with some loss, whereas 6GHz clearly loses more and drops off to zero much faster.

The really big difference here is that 6GHz also comes with the ability to put 320MHz towards one channel, so it's got double the bandwidth of 5GHz on top of the lower penetration. It's really good for things like VR headsets due to the lower interference and higher bandwidth.


6GHz has worse penetration than 5GHz, but the difference is indeed not as pronounced as it is compared to 2.4GHz.

The main benefit is going to be the additional frequency space. 5GHz effectively has 3ish channels, and 6GHz adds another 3-7 to that. Combine it with band steering and dynamic channel allocation, and you and all of your close neighbours can probably all get your own dedicated frequency.


> 5GHz effectively has 3ish channels, and 6GHz adds another 3-7 to that.

It would be useful if vendors shipped with 40MHz channels by default.

A 1x1 40MHz channel using 802.11ax will give you a max PHY rate of 287Mbps (rough math on that figure is sketched below):

* https://www.intel.com/content/www/us/en/support/articles/000...

* https://superuser.com/questions/1619079/what-is-the-maximum-...

Even if you halve that, it's (IMHO) probably sufficient for the vast majority of online activities. And if you have a 2x2 client you double it anyway.
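
For the curious, here's roughly how that 287Mbps figure falls out of the 802.11ax parameters (standard constants as I understand them - treat this as a sanity check, not a reference):

    # 1x1, 40MHz, HE MCS 11 (1024-QAM, rate 5/6), 0.8us guard interval.
    spatial_streams = 1
    data_subcarriers = 468           # HE 40MHz channel
    bits_per_subcarrier = 10         # 1024-QAM
    coding_rate = 5 / 6
    symbol_duration_us = 12.8 + 0.8  # OFDM symbol + short guard interval

    phy_rate_mbps = (spatial_streams * data_subcarriers * bits_per_subcarrier
                     * coding_rate / symbol_duration_us)
    print(f"{phy_rate_mbps:.0f} Mbps")  # -> 287 Mbps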


That’s true. I hadn’t realized that 6GHz has 500MHz of extra spectrum over 5GHz and also doesn’t have to contend with DFS.



