And in a sane world, few people would have a physical internet connection unless they were really pushing serious data (which probably doesn't need to happen from home anyway).
I’ve toyed with shutting off my router and modem at night on a timer because it’s a waste of 15-20 W of electricity for nothing.
Your cellphone's 4G connection might be fast enough to handle a 4K stream, but there isn't enough RF spectrum to handle 500 devices in the same block trying to stream 4K. That's why you still need physical internet connections.
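Rough numbers make the point (all of these are assumptions: ~25 Mbps per 4K stream, one 20 MHz LTE carrier, ~1.5 bits/s/Hz of realistic loaded-sector efficiency):

    # Back-of-envelope: can one cell sector feed 500 simultaneous 4K streams?
    # All numbers below are illustrative assumptions, not measurements.
    STREAM_MBPS = 25          # assumed bitrate of a single 4K stream
    DEVICES = 500             # devices in the same block
    CARRIER_MHZ = 20          # one LTE carrier
    SPECTRAL_EFF = 1.5        # assumed realistic bits/s/Hz for a loaded macro sector

    demand_mbps = STREAM_MBPS * DEVICES
    sector_capacity_mbps = CARRIER_MHZ * SPECTRAL_EFF  # MHz * bits/s/Hz == Mbps

    print(f"aggregate demand : {demand_mbps:,} Mbps")
    print(f"one 20 MHz sector: {sector_capacity_mbps:,.0f} Mbps")
    print(f"carriers needed  : {demand_mbps / sector_capacity_mbps:,.0f}")

Even with generous assumptions you end up needing hundreds of carriers' worth of spectrum for a single block.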
If I option-click the wifi icon on my laptop, it says "NSS 2" -- that is, the Number of Spatial Streams is 2. My laptop and router have identified two different spatial paths between them and are using their multiple antennae to send different data down the different paths. NSS=2 is the limit for my router because it only has 2 antennae, but those routers with a ring of antennae that look like they were designed by the Dark Lord Sauron to top a tower in Mordor can fire correspondingly more simultaneous spatial streams. Cell towers with antenna arrays can do even better.
There is plenty of spectrum, just not for old single-antenna tech.
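A crude way to see the scaling, using an idealized Shannon-style estimate (the 80 MHz channel and 30 dB SNR are just assumed numbers; real Wi-Fi rates come in lower):

    # Sketch of how spatial streams multiply throughput in the *same* channel width.
    # Idealized capacity model: total transmit power is split across the streams.
    import math

    def mimo_capacity_mbps(streams, bandwidth_mhz, snr_db):
        snr = 10 ** (snr_db / 10)
        # each stream gets the full channel width at a share of the SNR
        return streams * bandwidth_mhz * math.log2(1 + snr / streams)

    for nss in (1, 2, 4, 8):
        print(f"NSS={nss}: ~{mimo_capacity_mbps(nss, 80, 30):,.0f} Mbps in an 80 MHz channel")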
>If I option-click the wifi icon on my laptop, it says "NSS 2" -- that is, the Number of Spatial Streams is 2. My laptop and router have identified two different spatial paths between them and are using their multiple antennae to send different data down the different paths.
Doesn't that use 2x the spectrum? If so, that doesn't really solve the problem. Sure, you're getting 2x the speed, but you're also using 2x the spectrum, which is still a shared resource.
Emphatically not. Every chunk of space, propagation direction, and polarization has its own spectrum. Draw a box, count the photons. If you point a satellite dish at different satellites, you see different signals, even if the frequencies are the same.
The same trick works for "virtual antennas" built out of phased arrays of real antennas, except you can steer them in microseconds and without any motors. MIMO leverages this.
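For a sense of how little is involved in "steering" a phased array, here's a toy calculation of the per-element phase shifts for a small linear array (the carrier frequency, spacing, and steering angle are all assumed example values):

    # Toy phased-array calculation: per-element phase shifts needed to steer a
    # uniform linear array toward a given angle. No motors, just different phases.
    import math

    FREQ_HZ = 3.5e9                 # assumed mid-band carrier
    C = 3e8
    wavelength = C / FREQ_HZ
    spacing = wavelength / 2        # classic half-wavelength element spacing
    steer_deg = 25                  # direction we want the beam to point

    for n in range(4):              # a small 4-element array
        phase = (2 * math.pi * n * spacing * math.sin(math.radians(steer_deg))) / wavelength
        print(f"element {n}: phase shift {math.degrees(phase) % 360:6.1f} deg")

Change those phase offsets and the beam points somewhere else, which is why it can happen in microseconds.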
Plus cell connections and tethering are usually data capped. I don't know where the parent is coming from but within the last few months I got gigabit fiber installed and I'll do anything I can to never lose these speeds.
There could come a point where technology has advanced so far that cell tech is all there is. Imagine everything was just wireless and the internet penetrated everywhere.
Wireless is a seriously limited resource. There will never be sufficient spectrum available to support all the fixed services that are needed. And the heavier the loading, the slower the network runs.
On the other hand, whenever you run a length of fiber, you create a whole new spectrum which is available 24 hrs a day.
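To put a rough number on "a whole new spectrum", compare the optical C-band of a single strand against a generous estimate of all licensed sub-6 GHz cellular spectrum in a market (both figures are rough, assumed values):

    # Why a strand of fiber is "new spectrum": the optical C-band alone vs a
    # generous estimate of all sub-6 GHz cellular spectrum in a market.
    FIBER_C_BAND_GHZ = 4_400    # ~1530-1565 nm is roughly 4.4 THz of optical bandwidth
    CELLULAR_SUB6_GHZ = 1       # ~1 GHz of licensed sub-6 spectrum, shared by everyone

    print(f"one fiber strand (C-band only): ~{FIBER_C_BAND_GHZ:,} GHz")
    print(f"all sub-6 cellular spectrum   : ~{CELLULAR_SUB6_GHZ} GHz, shared by the whole city")
    print(f"ratio: ~{FIBER_C_BAND_GHZ // CELLULAR_SUB6_GHZ:,}x, and every new strand adds more")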
That is the promise of 5G though. You personally won't notice any difference but it should support more people in the same area in the same limited spectrum.
20 W idle is insane. You'd probably save more in the long run by just getting a more efficient router, so it's not using as much juice during the times you do have it on.
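Back-of-envelope, assuming $0.30/kWh (adjust for your local rate), the efficient-router route does come out ahead of a night timer:

    # Rough yearly cost of a 20 W router left on 24/7, vs cutting power 8 h a
    # night, vs an assumed ~6 W efficient router. Electricity price is assumed.
    PRICE_PER_KWH = 0.30

    def yearly_cost(watts, hours_per_day=24):
        return watts * hours_per_day * 365 / 1000 * PRICE_PER_KWH

    print(f"20 W, always on  : ${yearly_cost(20):.0f}/yr")
    print(f"20 W, night timer: ${yearly_cost(20, hours_per_day=16):.0f}/yr")
    print(f"6 W, always on   : ${yearly_cost(6):.0f}/yr")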
I turn my wifi off at night. This saves some energy and still allows me to connect physically if I need to for work or whatever.
It may sound insane, but for folks running gigabit or multi-gigabit fiber connections (certainly not the standard, but definitely becoming more available), there's a surprising amount of juice required: the CPU needed to route at gigabit+ speeds, the switch chips, and WiFi radios with 4+ antennae all needing amplification each add to the baseline power draw. If you're running one of those combo cable "modem" + router + WiFi access point jobbies, that's even more radios that need juice.
I didn't start getting a reasonable understanding until I started watching ServeTheHome videos, where they usually toss each device on some sort of Kill-a-Watt style meter, and then start putting the device under load to see how it changes.
The beast my ISP sends out says 5 W power save (when does that activate when something is always connected?), 18 W typical, and 36 W max (probably with USB devices plugged in).
Interesting. I own my modem. I guess that's not an option everywhere. Plus, options for modems are more limited than options for routers (or so it seemed a few years ago).
Cell providers seem to advertise ridiculously high bandwidth numbers (far beyond what a home user might need), but don’t say much about latency. If I want to edit code over ssh or play an online video game, I care more about the latter, and those don’t seem like non-sane things to do, to me.
Indeed, one thing I disliked is that Canadian incumbent providers used to (and maybe continue to) refuse to peer freely at IXs. So my closest POP (by their shit routing) for my VoIP provider was in the US while on mobile, whereas my independent home ISP could use the local POP.
It is likely that your broadband provider will look at the fact that your router is often "losing connection" and infer that your cabling is dodgy, then reduce your connection speed to try to "help". Also, 15-20W is a little high - my modem uses 3.9W.