5G networks meet consumer needs as mobile data growth slows
375 comments
February 12, 2025 · gerdesj
reaperman
There is no way that "15MBs-1" is more clear than "15MB/s", especially in an environment which lacks the ability to give text superscript styling.
Dylan16807
Even worse when it's actually 15Mb/s.
alanfranz
Which could be written as 15Mbps with no doubt whatsoever.
notpushkin
15 Mb·s⁻¹
gerdesj
Sorry mate, an old habit.
Perhaps we should insist on more formatting from above. That's probably a post-processing thing for ... gAI 8)
throwaway314155
Old habit from where? Is this a personal preference or a lesser known convention?
madwolf
A decent quality stream is not 256 kilobytes per second (1/4 megabyte), as you write. You probably meant 256 kilobits, which is only 32 kilobytes. For speech that’s VERY high quality. Teams actually uses either the G.722 or SILK codec, which is just 40 kbit/s. That’s 5 kilobytes per second.
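The bits-versus-bytes arithmetic above can be sanity-checked with a quick sketch (illustrative only; the bitrates are the ones quoted in the comment):

```python
# Convert a codec bitrate in kilobits per second to kilobytes per second.
def kbit_to_kbyte(kbit_per_s: float) -> float:
    return kbit_per_s / 8  # 8 bits per byte

print(kbit_to_kbyte(256))  # 256 kbit/s -> 32.0 kB/s
print(kbit_to_kbyte(40))   # 40 kbit/s (G.722/SILK figure above) -> 5.0 kB/s
```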
gerdesj
256kbps (yes, I was pissed and mistyped B for b, and a few other transgressions) or whatever is sod all these days for throughput, as you well know, so worrying your codecs down to 40kbps means nothing if your jitter buffer is going mad!
Modern home/office internet connections are mostly optimised for throughput but rarely for latency - that's the province of HFT.
You see tales from the kiddies who fixate on "ping" times when trying to optimize their gaming experience. Well, that's nice, but when on earth do you shoot someone with ICMP?
I can remember calling relos in Australia in the 1970s/80s over old school satellite links from the UK or Germany and it nearly needed radio style conventions.
I've been doing VoIP for quite a while and it is so crap watching people put up with shit sound quality and latency on a Teams/Zoom/etc call as the "new" normal. I wheel out Wireshark and pretend to watch it and then fix up the wifi link or the routing (Teams outside VPN - split tunnelling) or whatever.
wcoenen
Tangentially related to the topic of the bandwidth efficiency of Teams: screen sharing in Teams has a very low framerate of about 3 to 4 fps. It is driving me insane, especially when the presenter starts relentlessly scrolling up and down and circling things with the mouse cursor.
I think Microsoft took bandwidth efficiency a bit too far here.
timewizard
That's a 1/4 megabit per second. The codec should be around 32,000 bytes and thus 256,000 bits. The problem is the Internet is not multicast, so if I'm on with 8 participants that's 8 * 32,000 bytes, and our distributed clocks are nowhere near sample-accurate.
If you really want to have some fun, come out to the countryside with me, where 4G is the only option and 120ms is the best end-to-end latency you're going to get. Plus your geolocation puts you half the nation away from where you actually are, which only compounds the problem.
On the other hand I now have an acquired expertise in making applications that are extremely tolerant of high latency environments.
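The aggregate cost of unicast that timewizard describes can be sketched roughly (a simplification using the figures above; real services route through an SFU, which changes the topology but not the basic multiplication):

```python
# Without multicast, each participant's ~32,000 byte/s stream must be
# delivered as separate unicast copies rather than one shared stream.
def total_stream_bytes_per_s(participants: int, codec_bytes_per_s: int) -> int:
    return participants * codec_bytes_per_s

print(total_stream_bytes_per_s(8, 32_000))  # -> 256000 bytes/s across the call
```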
nomel
The first time I learned about UDP multicasting, I thought it was incredible, wondering how many streams were out there that I could tap into, if only I knew the multicast group, and all with such little overhead! Then I tried to send some multicast packets to my friend's house. I kept increasing the hop limit, thinking I was surely sending my little UDP packet around the world at that point. It never made it. :(
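The experiment nomel describes looks roughly like this sketch (group and port are arbitrary example values, not from any real stream). The "hop limit" being raised is the multicast TTL; routers on the public internet generally refuse to forward multicast regardless of its value:

```python
import socket

MCAST_GROUP = "239.1.1.1"   # administratively-scoped multicast range (example)
MCAST_PORT = 5007

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Raise the multicast TTL (hop limit); default is 1, i.e. local subnet only.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 32)
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL))  # 32
# An actual send would be: sock.sendto(b"hello, group", (MCAST_GROUP, MCAST_PORT))
sock.close()
```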
Hikikomori
Mbone.
bigfatkitten
> If you really want to have some fun come out to the country side with me where 4G is the only option and 120ms is the best end to end latency you're going to get.
That's basically me. 80ms on LTE or 25ms on Starlink.
Used to be 1200-1500ms on BGAN, 160ms on 3G.
silisili
Oh you'd love Project Genesis.
It has all the latency associated with cell networks combined with all the latency of routing all traffic through AWS.
As an added bonus, about 10% of sites block you outright because they assume a request coming from an AWS origin IP is a bot.
Karrot_Kream
Latency is an issue but it's not the biggest one. Most real-time protocols can adjust for latency (often using some sort of OOB performance measurement port/protocol.) The issue is jitter. When your latency varies a lot, it's hard for an algorithm to compensate.
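One common way real-time stacks quantify the jitter described here is the interarrival-jitter estimator from RFC 3550 (RTP): a smoothed mean of transit-time differences. A minimal sketch, with made-up transit times for illustration:

```python
# RFC 3550-style interarrival jitter: exponentially smoothed absolute
# difference of consecutive packet transit times, gain 1/16.
def update_jitter(jitter: float, prev_transit: float, transit: float) -> float:
    d = abs(transit - prev_transit)
    return jitter + (d - jitter) / 16.0

transits = [50, 52, 49, 80, 51]  # ms; one late packet spikes the estimate
jitter = 0.0
for prev, cur in zip(transits, transits[1:]):
    jitter = update_jitter(jitter, prev, cur)
print(round(jitter, 2))
```

A jitter buffer sized from this estimate trades added (stable) delay for fewer late-packet dropouts, which is why varying latency is harder to handle than high-but-constant latency.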
fulafel
You can't fix problems caused by network latency by adjustments in real time apps like video/audio calls or gaming.
Maybe you are thinking of working around bufferbloat?
ratorx
I agree for video + audio calls, but I’m not sure for gaming.
Usually multiplayer games have some correction and if you have really stable latency it can work pretty well (of course never as well as 0 latency). But usually, the jitter is the killer because it makes actions delayed AND unpredictable.
gspr
And then comes all the latency _jitter_ inherent in a shared resource like 5G, and on top of that shenanigans like (transparently) MITM-ing TCP streams on a base-station-by-base-station basis.
rsynnott
5G is (at least on paper) _very_ low latency; couple of ms.
ksec
It is not just on paper. A couple of ms (4ms) is real, except it is not normal 5G but a specific superset/version of 5G.
Your network needs to be running end-to-end on 5G-A (3GPP Rel 18) with full SA (Stand Alone) using NR (New Radio) only. Right now, AFAIK only mobile networks in China have managed to run that (along with turning on VoNR). Most other networks are still behind in deployment and switching.
lozf
On that note, the BBC deployed a temporary 5G SA non-public-network to broadcast King Charles' Coronation, giving over 1Gbps uplink over 1km of the procession route. More details are in the award-winning white paper linked from this[0] blog post.
[0]: https://www.bbc.co.uk/rd/blog/2023-05-5g-non-public-network-...
Zigurd
Although the deployments are very limited, wouldn't the 5G mmWave coverage, in NFL stadiums for example, need to have both New Radio (NR) and Stand Alone (SA)? Or is there some bodge for down-rating your mmWave experience to a non-Stand Alone network? I understand you won't get SA end-to-end calling an endpoint on an LTE network, but, with 5G FWA also needing SA to achieve full performance, SA should be pretty widespread in the US, or is it?
lambdaone
It is until you congest it. Then, like all other network technologies, performance collapses. You can still force traffic through with EDC, buffering, and retries, but at massive cost to overall network performance.
richardwhiuk
Cellular technologies can be configured to degrade differently to WiFi. The device gets fewer assigned slices, so the bandwidth reduces, but latency should remain static.
JasserInicide
Barring some revolutionary new material through which we can get EM waves to travel orders of magnitude faster than through current stuff, I don't think we're ever getting around the latency problem.
ahartmetz
It's usually not a speed-of-light problem. It's a problem of everyone optimizing for bandwidth, not latency, because that is the number products are advertised with. The speed of light is 200'000-300'000 km/s in the media used; that should not be very noticeable when talking to someone in the same country.
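The back-of-envelope propagation delay can be sketched quickly (distances are illustrative; this ignores routing, queuing, and serialization, which usually dominate):

```python
# One-way propagation delay in fibre at roughly 200,000 km/s.
def one_way_ms(distance_km: float, speed_km_per_s: float = 200_000) -> float:
    return distance_km / speed_km_per_s * 1000

print(one_way_ms(500))     # within one country: 2.5 ms one way
print(one_way_ms(15_000))  # intercontinental, e.g. UK-Australia: 75.0 ms one way
```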
lambdaone
See my comments about L4S.
viraptor
> Could maximum data speeds—on mobile devices, at home, at work—be approaching “fast enough” for most people for most purposes?
That seems to be the theme across all consumer electronics as well. For an average person, mid-range phones are good enough, bargain-bin laptops are good enough, almost any TV you can buy today is good enough. People may of course desire higher quality, and specific segments will have higher needs, but things being good enough may be a problem for tech and infra companies in the next decade.
aceazzameen
The answer is yes. But it's not just about speed. The higher speeds drain the battery faster.
I say this because we currently use an old 2014 phone as a house phone for the family. It's set to 2G to take calls, and switches to 4G for the actual voice call. We only have to charge it once every 2-3 weeks, if not longer. (Old Samsung phones also had Ultra Power Saving mode, which helps with that.)
2G is being shut down, though. Once that happens and it's forced onto 4G all the time, we'll have to charge it more often. And that sucks. There isn't a single new phone on the market that lasts as long as this old phone with an old battery.
The same principle is why I have my modern personal phone set to 4G instead of 5G. The energy savings are very noticeable.
bityard
I actually miss the concept of house phones. Instead of exclusively person-to-person communication, families would call other families to catch up and sometimes even pass the phone around to talk to the grandparents, aunts and uncles, etc.
ssl-3
It may have been decades ago, but there was a time when it did not seem awkward or even unusual to just...call someone's house, or to simply stop by.
There was usually a purpose in this that was more profound than just being bored or lonely or something, but none was really required.
And maybe the person they were trying to find wasn't home right now, but that was OK. It was not weird to talk to whoever for a minute, or to hang out for awhile.
Nowadays, the ubiquity of personal pocket supercomputers means that the closest most people ever get to that kind of generally-welcome (if unexpected) interaction is a text message.
"Hey, I'm in front of your house. Are you home?"
And maybe that works more efficiently between two individuals than knocking on the door did, but otherwise this present way of doing things mostly just serves to further increase our social isolation.
Sometimes it seems that the closer it is that our technology allows us to be, the further it is that we actually become.
anal_reactor
My siblings are much older than I am and when I was 10 my sister was starting her relationship with now-husband. One time he called our house phone, I picked up, and said "yeah sister is home but she's busy at the moment so you can spend five minutes talking to me instead"
little_bear
I set up an old rotary phone via twilio as a house phone for our kids. They just dial 1-9, each number is programmed to a different family member. Their cousins also have one, so they can call each other whenever they want. It’s great to also have a phone that just rings in the house, for when you don’t care about getting a specific person.
secstate
That's a fascinating observation! I hadn't considered the side effects of calling a common phone and interacting with other people rather than exclusively the one person you wanted to talk/text with. It probably distances you from (or never allows you to know) those adjacent to the person you already know.
pjmlp
I can vouch that passing the phone around is still a thing in southern Europe, as is being suddenly dropped into a group video call where everyone gathers around the phone to be in view.
xattt
I’m looking to set up an Asterisk server that takes calls over Bluetooth on any paired cell phones inside the house, or falls back to a VoIP line if no phones are at home.
Similarly, I find that it’s hard to catch family on their cell and much easier when I call their home.
8note
does that not still happen for you?
passing cell phones around still happens for my family
vel0city
There are some more traditional home phones which work on 4/5G networks with a DECT handset which talks to a cellular base station. You might look into switching to that model to replace your "cell phone as a home phone" concept. It makes it a bit easier to add another handset to the DECT network and often means convenient cradles to charge the handsets while the base station stays in a good signal spot with plenty of power.
Just a thought when it comes time to change out that device.
aceazzameen
That's not a bad idea, thank you! I always hate having to retire perfectly good working hardware just because a spec requirement changed.
derefr
Given that it's a house phone, have you tried enabling Wi-Fi Calling (VoWiFi) in the carrier settings (if you have that option), and then putting the phone in Airplane Mode with wi-fi enabled? AFAIK that should be fairly less impactful on battery.
(Alternately, if you don't have the option to use VoWiFi, you could take literally any phone/tablet/etc.; install a softphone app on it; get a cheap number from a VoIP provider and connect to it; and leave the app running, the wi-fi on, and the cellular radio off. At that point, the device doesn't even need a[n e]SIM card!)
chefandy
Or just port the number to Magic Jack for $20, get a $20 cordless phone at Target that you have to charge once per week if you don’t just keep it on the dock, and pay like $5/mo or less for service. And you can make/receive calls from that number on a mobile phone using their app.
Gigachad
I think it's more that mobile transfer speeds are no longer the bottleneck. It's the platforms and services people are using. If you record a minute of video on your phone, it ends up as something like 1GB. But when you send it on a messaging app, it gets compressed down to 30MB. People would notice and appreciate higher-quality video, but it's too expensive for the service to host and transfer.
hexator
A problem for tech companies but not the world.
gambiting
Get ready to be sold the exact same thing you already own, just "with AI" now.
callc
And obsolescence via “TPM2”!
yieldcrv
I’m all for advancements in the size of various models able to be loaded onto devices, and the amount of fast ram available for them
teamonkey
A problem when the world’s pension funds contain significant holdings of US tech stocks
trevithick
It won't be a problem for them. They'll find a way to make it not enough - disable functionality that people want or need and then charge a subscription fee to enable it. And more ads. Easy peasy.
grafporno
5.6 kbit/s is enough for voice https://en.wikipedia.org/wiki/Half_Rate
nomel
Back in the day, if you knew the secret menu, you could change the default vocoder to use on the network. The cheap cell company I used defaulted to half-rate. I would set my phone to the highest bitrate, with a huge improvement in quality, but at the expense of the towers rejecting my calls around 25% of the time. When I would call landline phones, people would mention how good it sounded.
hypercube33
For a short while Verizon enabled this on our super grandfathered plan. Well not exactly this, it was some more modern codec dubbed HD and it sounded so good it was freaky and unnerving
ekianjo
Good enough but most of these devices are not built to last.
sureIy
I don't understand this "good enough" argument. We never really needed anything we use daily today. Life was "good enough" 100 years ago (if you could afford it), should we have stopped?
4K video reaches you only because it's compressed to crap. It's "good enough" until it's not. 360p TV was good enough at some point too.
theshackleford
> 4K video reaches you only because it's compressed to crap.
Yes, but I assume when they say the "consumer" they mean everyone, not us. Most people I've had in my home couldn't tell you the difference between a 4K bluray and 1080p at 2-odd meters on a 65" panel.
I can be looking at a screen full of compression artifacts that seem like the most obvious thing I've ever seen, and I'll be surrounded by people going "what are you talking about?"
Even if I can get them to notice, the response is almost always the same:
"Oh... yeah, OK, I guess I can see it. I just assumed it was supposed to look like that. *shrug* It's just not noticeable to me."
sureIy
You can't ask someone today to see what they're not used to seeing.
I expect a future of screens that are natively, subtly 3D, where you could see someone's nose hair without focusing. Only then will they notice "why do they look blurry and flat" when comparing it to an old TV.
Today if you get closer to a TV you will see blur. Tomorrow you will see the individual strands of a bird's feathers.
"Good enough" is temporary.
rglullis
> 4K video reaches you only because it's compressed to crap.
Streaming video gets compressed to crap. People are being forced to believe that it is better to have 100% of crap provided in real time instead of waiting just a few extra moments to have the best possible experience.
Here is a trick for movie nights with the family: choose the movie you want to watch, start your torrent and tell everyone "let's go make popcorn!" The 5-10 minutes will get enough of the movie downloaded so you will be able to enjoy a high quality video.
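The popcorn trick comes down to simple arithmetic (all numbers below are illustrative assumptions, not from the comment): a head start at a download rate well above the video's bitrate buys a large playback buffer.

```python
# How many minutes of video does a head start buy, given a download
# rate and the video's bitrate (both in megabits per second)?
def buffered_minutes(download_mbps: float, head_start_min: float, video_mbps: float) -> float:
    downloaded_mbit = download_mbps * head_start_min * 60
    return downloaded_mbit / (video_mbps * 60)

print(buffered_minutes(100, 10, 20))  # -> 50.0 minutes of 20 Mb/s video buffered
```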
Dalewyn
>Streaming video gets compressed to crap.
That's because your source video is crap.
I'm not sure if you realize it, but all forms of digital media playback are streaming. Yes, even that MP4 stored locally on your SSD. There is literally no difference between "playing" that MP4 and "streaming" a Youtube or Netflix video.
Yes, even playing an 8K Blu-Ray video is still streaming.
anal_reactor
Due to a bunch of reasons combined, I'm stuck on a plan where I have 5GB of mobile data per month, and honestly, I never really use it up. I stopped browsing so much social media because it's slop. I don't watch YouTube on mobile because I prefer to do it at home on my TV. I don't stream music because I keep my collection on the SD card. Sometimes I listen to the online radio but it doesn't use much data anyway. Otherwise I browse the news, use navigation, and that's pretty much it.
Once every few months I'm in a situation where I want to watch YouTube on mobile or connect my laptop to mobile hotspot, but then I think "I don't need to be terminally online", or in worst-case scenario, I just pay a little bit extra for the data I use, but again, it happens extremely rarely. BTW quite often my phone randomly loses connection, but then I think "eh, god is telling me to interact with the physical world for five minutes".
At home though, it's a different situation, I need to have good internet connection. Currently I have 4Gbps both ways, and I'm thinking of changing to a cheaper plan, because I can't see myself benefitting from anything more than 1Gbps.
In any case though, my internet connection situation is definitely "good enough", and I do not need any upgrades.
pr337h4m
>Regulators may also have to consider whether fewer operators may be better for a country, with perhaps only a single underlying fixed and mobile network in many places—just as utilities for electricity, water, gas, and the like are often structured around single (or a limited set of) operators.
There are no words to describe how stupid this is.
hedora
It actually works well in most places. Look up the term “common carrier”.
The trick is that the entity that owns the wires has to provide/upgrade the network at cost, and anyone has the right to run a telco on top of the network.
This creates competition for things like pricing plans, and financial incentives for the companies operating in the space to compete on their ability to build out / upgrade the network (or to not do that, but provide cheaper service).
protocolture
Your second and third paragraph are contradictory.
Common carriers become the barrier to network upgrades. Always. Without fail. Monopolies are a bad idea, whether state or privately owned.
Let me give you 2 examples.
In Australia we had Telstra (formerly Telecom, formerly Auspost). Telstra would resell its ADSL services to carriers, and they stank. The carriers couldn't justify price increases to upgrade their networks, and the whole thing stagnated.
We had a market review, and Telstra was legislatively forced to sell ULL instead. So the non-monopolists were now placing their own hardware in Telstra exchanges, which they could upgrade. Which they did. Once they could sell an upgrade (ADSL2+), they could also price in the cost of upgrading peering and transit. We had a huge increase in network speeds. We later forgot this lesson and created the NBN. NBNCo does not sell ULL, and the pennies that ISPs can charge on top of it are causing stagnation again.
ULL works way better than common carrier. In Singapore the government just runs glass. They have competition between carriers to provide faster GPON. 2 gig, 10 gig, 100 gig, whatever. It's just a hardware upgrade away.
10 years from now Australia will realise it screwed up with NBNCo. Again. But they won't as easily be able to go to ULL as they did in the past. NBN's fibre isn't built for it. We will have to tear out splitters and install glass.
The actual result is worse than you suggest. A carrier had to take the government/NBNCo to court to get permission to build residential fibre in apartment buildings over the monopoly. We have NBNCo strategically overbuilding other fibre providers and shutting them down (it's an offence to compete with the NBN, on the order of importing a couple million bucks of cocaine). It's an absolute handbrake on competition and network upgrades. Innovation is only happening in the gaps left behind by the common carrier.
rstuart4133
Yeah, I've noticed the same things with roads. They are common carriers owned mostly by the government, so they never get upgraded. That freeway near my house is always clogged.
Oh wait ... the reason that freeway is always clogged is they are ripping it up, doubling its width. And now I think about it, hasn't the NBN recently upgraded their max speeds from 100 Mb/s to 250 Mb/s, and now to 1 Gb/s? And isn't the NBN currently ripping out the FttN, replacing it with FttP, at no cost to the customer? Sounds like a major upgrade to me. And wasn't the reason we got the NBN that Telstra point blank refused to replace the monopoly copper infrastructure with fibre?
If I didn't know better, I'd think the major policy mistake Australia made in telecom was the Liberals selling off Telstra. In a competitive market, when a new technology comes along, a telecom is forced to upgrade because their competitors would use the new technology to steal their customers. That works fabulously for 5G, where there is competition. But when the Libs sold Telstra it was a monopoly. Telstra just refused to upgrade the copper. The Libs thought they could fix that through legislation, but what happened instead is Telstra fought the legislation tooth and nail, and we ended up in the absurd situation of having buildings full of federal court judges and lawyers fighting to get reasonable ULL access. In the end Telstra did give permission to change the equipment at the ends of the wires. But replacing the wires themselves - never. That was their golden goose. No one was permitted to replace them with a new technology.
Desperate to make the obvious move to fibre, the Libs then offered Telstra, then Optus, then anybody money to build a new fibre network - but they all refused to do so unless the government effectively guaranteed monopoly ownership over the new network.
Sorry, what was your point again? Oh, that's right, public ownership of shared natural monopolies like wires, roads, and water mains is bad. The thing I missed is why a private rent-extracting monopoly beholden to no one except the profit-seeking shareholders owning those things is better.
ClumsyPilot
> This creates competition for things like pricing plans
If the common carrier is doing all the work, what’s the point of the companies on top? What do they add to the system besides cost?
Might as well get rid of them and have a national carrier.
kemitche
The companies on top provide end user customer support, varied pricing models ("unlimited" data vs pay by the GB, etc), and so on. It allows the common carrier to focus solely on the network hardware.
L-four
They add value by producing complicated and convoluted contracts which cannot be compared easily full of gotchas.
celsoazevedo
Common carriers have some upsides, but one downside is that it sometimes removes the incentive for ISPs to deploy their own networks.
I was stuck with a common carrier for years. I could pick different ISPs, which offered different prices and types of support, but they all used the same connection... which was only stable at lower speeds.
timewizard
It actually has nefarious benefits. Look up the term "HTLINGUAL" or "ECHELON." It's certainly nice for the government to have fewer places to shop when destroying our privacy.
The trick is that this is essentially wireless spectrum. Which can be leased for limited periods of time and can easily allow for a more competitive environment than what natural monopolies allow for.
It's also possible to separate the infrastructure operators from the backhaul operators and thus entirely avoid the capital investment costs facing upstart entrants. When that's done, there's even less reason to tolerate monopolistic practices on either side.
Gigachad
Feels a lot like whitelabeling. Where you have 200 companies selling exactly the same product at slightly different price points but where there isn't really any difference in the product.
odo1242
"Common carrier" tends to raise prices for minimum service, though. And once the network is built the carrier is just going to keep their monopoly. You bet they're never upgrading to any new piece of technology until they're legally required to.
therein
It also makes it more vulnerable to legal, bureaucratic and technical threats.
Doesn't make much sense to me to abstract away most of the parts where an entity could build up its competitive advantage and then pretend healthy competition could be built on top.
Imagine if one entity did all the t-shirt manufacturing globally but then you congratulated yourself for creating a market based on altered colors and what is printed on top of these t-shirts.
jethro_tell
This was a common way to do things before the telcos in the USA were deregulated in the 2000s and 2010s. At the time it covered both internet and telephone, but due to the timing of deregulation, it never really took off with real high-speed internet, only DSL and dialup.
I used to work at a place that did both on top of the various telcos. We offered ‘premium service’ with 24-hour customer support and a low customer-to-modem and bandwidth ratio.
Most of our competitors beat us on price but would only offer customer support 9-5, and you might get a busy signal / lower bandwidth in the backhaul during peak hours.
There was a single company that owned the wires and poles, because it’s expensive and complex to build physical infrastructure and hard to compete, but they were barred from selling actual services or undercutting providers because of their position (which depended on jurisdiction).
It solved the problem we have now of everyone complaining about their ISP but only having one option in their area.
We have that problem now specifically because we deregulated common carriers for internet right as it took over the role of telephone service.
SSLy
Real-world practice seems to have worked this out. I am working for such a provider right now, and it is neither cash-starved nor suffocating under undue bureaucracy.
computerthings
And private companies don't even have to be vulnerable; they can just do nasty things willy-nilly, because it might be profitable and they might get away with it. Yeah, there could be ones that don't suck, and then customers could pick those, but when there aren't, when they all collude to be equally shitty and raise prices whenever they can -- which they do -- people have no recourse. They do have recourse when it comes to the government.
And for some things it's just too much duplicated effort and wasted resources, T-shirts are one thing, because we don't really need those, but train lines and utilities etc. are another. I can't tell you where the "boundary" is, but if every electric company had to lay their own cables, there would only be one or two.
And in the opinion of many, including mine, the Deutsche Bundesbahn for example got worse when it got privatized. They kind of exploited the fact that after reunification there were, obviously, two state railroad systems; instead of merging them into one state railroad system, it was privatized, because it made more money for some, not because it benefits the public, the customers. Of course the reasoning was the usual neoliberal spiel, "saving money" and "smaller government", but then that money just ends up not really making things better to the degree privatization made them worse.
Obviously not everything should be state run, far from it. But privatizing everything is a cure actually even worse than the disease, since state-run sinks and swims with how much say the people have, whereas a 100% privatized world just sinks into the abyss.
grahar64
In New Zealand we have a single company that owns all the telecommunications wires. It was split off from a service provider in the 90's because they were a monopoly abusing their position in the market. Now we have a ton of options of ISPs, but only one company to deal with if there are line faults. BTW the line company is the best to deal with; the ISPs are shit.
Same for mobile infrastructure would be great as well.
kiwijamo
In NZ we also have the Rural Connectivity Group (RCG) which operates over 400 cellular/mobile sites in rural areas for the three mobile carriers, capital funded jointly by the NZ Government and the three mobile carriers (with operational costs shared between the three carriers I believe). For context the individual carriers operate around 2,000 of their own sites in urban areas and most towns in direct competition with each other. It has worked really well for the more rural parts of the country, filling in gaps in state highway coverage as well as providing coverage to smaller towns that would be uneconomical for the individual carriers to cover otherwise. I'm talking towns of a handful of households getting high speed 4G coverage. Really proud of NZ as this sort of thing is unheard of in most other countries.
wingworks
Ironically, you often get way faster speeds out on an RCG tower too (probably due to fewer users), vs in the city, where I often get pretty average speeds, be it 4G or 5G.
Marsymars
I dunno, it makes conceptual sense. Networks infrastructure is largely commodity utilities where duplication is effectively a waste of resources. e.g. you wouldn't expect your home to have multiple natural gas connections from competing companies.
Regulators have other ways to incentivize quality/pricing and can mandate competition at levels of the stack other than the underlying infrastructure.
I wouldn't expect that "only a single network" is the right model for all locations, but it will be for some locations, so you need a regulatory framework that ensures quality/cost in the case of a single network anyway.
newsreaderguy
IMO this can be neatly solved with a peer-to-peer market based system similar to Helium https://www.helium.com/mobile.
(I know that helium's original IoT network mostly failed due to lack of pmf, but idk about their 5G stuff)
Network providers get paid for the bandwidth that flows over their nodes, but the protocol also allows for economically incentivizing network expansion and punishing congestion with subsidization / taxing.
You can unify everyone under the same "network", but the infrastructure providers running it are diverse and in competition.
suddenlybananas
I think that it should be run as a public service like utilities and should be as cheap as humanly possible. Why not?
cogman10
I personally like the notion of a common public infrastructure that subleases access. We already sort of do that with mobile carriers where the big 3 provide all access and all the other "carriers" (like google fi) are simply leasing access.
Make it easy for a new wireless company to spawn while maintaining the infrastructure everyone needs.
daedrdev
My public utility is bad at its job because it has literally zero incentive to be cheap, and thus my utilities are expensive
cogman10
> it has literally zero incentive to be cheap
Do private utilities have any incentive to be cheap?
The reason we have utility regulations in the first place is because utilities are natural monopolies with literally zero incentive to be cheap. On the contrary, they are highly incentivized to push up prices as much as possible because they have their customers over a barrel.
natebc
I believe the idea is that you shouldn't have a corporation provide the utility if there's only going to be one.
"public utility" implies it's owned by the public not a profit seeking group of shareholders.
sweeter
A private electric grid is a nightmare. Look at Texas. People pay more, and they get less coverage. It's worse by every metric. The conversation should revolve around how we can fix the government so that it isn't 5 corporations in a trench coat systematically defunding public utilities and social safety nets in hopes of breaking them so they can privatize them and make billions sucking up taxpayer money while doing no work. See the billions in tax funding to AT&T, Google, etc. to put in fiber internet, where they just pocketed the cash and did nothing.
fny
Because competition drives innovation. 5G exists as widely as it does because carriers were driven to meet the standard and provide faster service to their customers.
This article is essentially arguing innovation is dead in this space and there is no need for bandwidth-related improvements. At the same time, there is no 5G provider without a high-speed cap or throttling for hot spots. What would happen if enough people switched to 5G boxes over cable? Maybe T-Mobile can compete with Comcast?
javier2
Well, 5G is unlikely to be built in my area for the next decade, meanwhile 3 operators are building networks in the slightly more populated areas.
dgacmu
Competition drives innovation, but also, we've generally seen that things like municipal broadband are _more_ innovative than an incumbent monopoly carrier. Large chunks of the US don't have much competition at all in wired services, and if we approach that in wireless, we are likely to see the same effects starting where the local monopoly tries to extract maximum dollars out of an aging infrastructure. Lookin' at you, Comcast, lookin' at you.
vel0city
The T-Mobile 5G Rely fixed-wireless home internet plan offers no caps and no throttling.
immibis
Three things are necessary then:
1. It must be well-run.
2. It must be guaranteed to continue to be well-run.
3. If someone can do it better, they must be allowed to do so - and then their improvements have to be folded into the network somehow if there is to be only one network.
javier2
Maybe it is. Building multiple networks for smaller populations comes at enormous cost though. In my country there has been a tradition of this kind of network sharing, where operators are required to allow alternative operators on their physical network for a fee set by the government.
HnUser12
They should study Canada. We’re already running that experiment.
micromacrofoot
It's not that stupid IMO, they could handle it like some places handle electricity — there's a single distributor managing infra but you can select from a number of providers offering different generation rates
Having 5 competing infrastructures trying to blanket the country means you end up with a ton of waste, and the most populated places get priority as the carriers constantly fight each other for the most valuable markets while neglecting the less profitable fringe.
recursive
How confusing. Now I can't tell whether it's very stupid, not stupid, or medium stupid. Too bad there were no words.
Animats
> Of course, sophisticated representations of entire 3D scenes for large groups of users interacting with one another in-world could conceivably push bandwidth requirements up. But at this point, we’re getting into Matrix-like imagined technologies without any solid evidence to suggest a good 4G or 5G connection wouldn’t meet the tech’s bandwidth demands.
Open-world games such as Cyberpunk 2077 already have hours-long downloads for some users. That's when you load the whole world as one download. Doing it incrementally is worse. Microsoft Flight Simulator 2024 can pull 100 to 200 Mb/sec from the asset servers.
They're just flying over the world, without much ground level detail. Metaverse clients go further. My Second Life client, Sharpview, will download 400Mb/s of content, sustained, if you get on a motorcycle and go zooming around Second Life. The content is coming from AWS via Akamai caches, which can deliver content at such rates. If less bandwidth is available, things are blurry, but it still works. The level of asset detail is such that you can stop driving, go into a convenience store, and read the labels on the items.
GTA 6 multiplayer is coming. That's going to need bandwidth.
The Unreal Engine 5 demo, "The Matrix Awakens", is a download of more than a terabyte. That's before decompression.
The CEO of Intel, during the metaverse boom, said that about 1000x more compute and bandwidth was needed to do a Ready Player One / Matrix quality metaverse. It's not quite that bad.
SteveNuts
How many people consuming these services are doing so over a mobile network?
For my area all the mobile network home internet options offer plenty of speed, but the bandwidth limitations are a dealbreaker.
Everyone I know still uses their cable/FTTH as their main internet, and mobile network as a hotspot if their main ISP goes down.
Maakuth
Here in rural Finland, 4G/5G is the only option available to me. I'm getting 50-150Mbps download speed, but often just a dozen Mbps upload. In the night hours it's better, and that's when I do my game downloads and backup uploads. I think there's going to be another municipal FTTH program; let's see if I get a fixed line at that point.
john_minsk
Right now - not many. But at some point in the future, if the metaverse is everywhere, you could pull out a phone and the combined data for the room you are in might be 100GB. Would we want to have 6G then?
drawfloat
Few people play games built for mobile, let alone look to play GTA6 on an iPhone
Animats
For the better games, you'll need goggles or a phone that unfolds to tablet size for mobile use. Both are available, although the folding-screen products still have problems at the fold point.
markedathome
> The Unreal Engine 5 demo, "The Matrix Awakens", is a download of more than a terabyte. That's before decompression.
The PS5 and Xbox Series S/X both had disks that were incapable of holding a terabyte at the launch of The Matrix Awakens. Not sure where you are getting that info from, but the demo was about 30GB on disk on both the Series S/X and PS5, and the later packaged PC release is less than 20GB.
The full PC development system might total a TB with all Unreal Engine, Metahumans, City Sample packs, Matrix Awakens code and assets (audio, mocap, etc) but even then the consumer download will be around the 20-30GB size as noted above.
kmeisthax
On my current mobile plan (Google Fi[0]) the kind of streaming 3D world they think I would want to download on my phone would get me throttled in less than a minute. 200 MB is about a day's usage, if I'm out and about burning through my data plan.
The reason why there isn't as much demand for mobile data as they want is because the carriers have horrendously overpriced it, because they want a business model where they get paid more when you use your phone more. Most consumers work around this business model by just... not using mobile data. Either by downloading everything in advance or deliberately avoiding data-hungry things like video streaming. e.g. I have no interest in paying 10 cents to watch a YouTube video when I'm out of the house, so I'm not going to watch YouTube.
There's a very old article that I can't find anymore which predicted the death of satellite phones, airplane phones, and weirdly enough, 3G; because they were built on the idea of taking places that traditionally don't have network connectivity, and then selling connectivity at exorbitant prices, on the hope that people desperate for connectivity would pay those prices[1]. This doesn't scale. Obviously 3G did not fail, but it survived predominantly because network access got cheaper - not because there was a hidden, untapped market of people who were going to spend tens of dollars per megabyte just to not have to hunt for a phone jack to send an e-mail from their laptop[2].
I get the same vibes from 5G. Oh, yes, sure, we can treat 5G like a landline now and just stream massive amounts of data to it with low latency, but that's a scam. The kinds of scenarios they were pitching, like factories running a bunch of sensors off of 5G, were already possible with properly-spec'd Wi-Fi access points[3]. Everyone in 5G thought they could sell us the same network again but for more money.
[0] While I'm ranting about mobile data usage, I would like to point out that either Android's data usage accounting has gotten significantly worse, or Google Fi's carrier accounting is lying, because they're now consistently about 100-200MB out of sync by the end of the month. Didn't have this problem when I was using an LG G7 ThinQ, but my Pixel 8 Pro does this constantly.
[1] Which it called "permanet", in contrast to the "nearernet" strategy of just waiting until you have a cheap connection and sending everything then.
[2] I'm told similar economics are why you can't buy laptops with cellular modems in them. The licensing agreements that cover cellular SEP only require FRAND pricing on phones and tablets, so only phones and tablets can get affordable cell modems, and Qualcomm treats everything else as a permanet play.
[3] Hell, there's even a 5G spec for "license-assisted access", i.e. spilling 5G radio transmissions into the ISM bands that Wi-Fi normally occupies, so it's literally just weirdly shaped Wi-Fi at this point.
stkdump
> I'm told similar economics are why you can't buy laptops with cellular modems in them
I don't know what you mean. My current laptop (Lenovo L13) has a cellular modem that I don't need. And I am certainly a cost conscious buyer. It's also not the first time that this happened as well.
john_minsk
So true. I remember the first time I got access to 5G was on a short visit to Dubai. I got a SIM card with ~20GB of traffic and was super excited to try a speed test. My brother told me not to, because 5G is so fast that if the speed test doesn't limit the traffic size, it will consume the whole allowance within the 30 seconds the test runs. Guess what? I didn't run the test, because I didn't want to pay another $50 for the data package. What's the point of a 1Gbit connection if I only have 100GB? I'm still kind of on 4G usage if I want 100GB to last me a month...
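The math behind that fear checks out (a rough sketch; the 5 Gbps peak below is an assumed figure, not a measured one):

```python
# Rough sketch: how quickly a speed test can eat a capped data plan.
# link_gbps is the sustained link rate in gigabits per second.
def gb_consumed(link_gbps: float, seconds: float) -> float:
    """Data transferred in gigabytes: (Gb/s / 8) * time."""
    return link_gbps / 8 * seconds

print(gb_consumed(5.0, 30))   # 18.75 GB gone in a 30-second test
print(gb_consumed(0.1, 30))   # 0.375 GB on a 100 Mbps 4G link
```

So a single unthrottled test at multi-gigabit speeds really can burn most of a 20GB allowance.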
Peanuts99
Seems damn expensive. In the UK you can get a SIM card for £20 with unlimited data & calls, run on one of the larger 5G networks. I usually have the opposite problem though; I barely use 5GB in a given month.
ai-christianson
We live in a rural location, so we have redundant 5G/Starlink.
It's getting pretty reasonable these days, with download speeds reaching 0.5 Gbit/s per link, and latency is acceptable at ~20ms.
The main challenge is the upload speed; pretty much all the ISPs allocate much more spectrum for download rather than upload. If we could improve one thing with future wireless tech, I think upload would be a great candidate.
sidewndr46
I can pretty much guarantee you that your 5G connection has more bandwidth for upload than my residential ISP does
ai-christianson
Yeah?
We're getting 30-50 mbit/sec per connection on a good day.
repeekad
In downtown Columbus Ohio the only internet provider (Spectrum) maxes out at maybe 5 mbps up (down 50-100x that), it's not just a rural issue, non-competitive ISPs even in urban cities want you to pay for business accounts to get any kind of upload whatsoever
nightpool
Yes, many residential broadband ISPs top out at 1/10th that.
squeaky-clean
I'm in NYC, Spectrum is my ISP. 500mb/s down, 10mb/s up. I used to live in a building with symmetric 1G fiber from Verizon, but they don't serve my building.
toast0
> The main challenge is the upload speed; pretty much all the ISPs allocate much more spectrum for download rather than upload.
For 5G, a lot of the spectrum is statically split into downstream and upstream in equal bandwidth. But equal radio bandwidth doesn't mean equal data rates. Downstream speeds are typically higher because multiplexing happens at one fixed point, instead of over multiple, potentially moving transmitters.
orev
You identified the problem in your statement: "the ISPs allocate…". The provider gets to choose this, and if more bandwidth is available from a newer technology, their incentive is to allocate it to downloads so they can advertise faster speeds. It's not a technology issue.
throwaway2037
What do you need faster upload speeds for?
chris_va
With 5G, I have to downgrade to LTE constantly to avoid packet loss in urban canyons. Given the even higher frequencies proposed for 6G, I suspect it will be mostly useless.
Now, it's possible that raw GB/s with unobstructed LoS is the underlying optimization metric driving these standards, but I would assume it's something different (e.g. tower capex per connected user).
numpad0
There seem to be some integration issues in 5G Non-Standalone equipment and existing network. Standalone or not, 5G outside of millimeter wavelength bands("mmWave") should behave like an all-around pure upgrade compared to 4G with no downsides, in theory.
BenjiWiebe
5G can also use the same frequency bands as 4G, and when it does, apparently gets slightly increased range over 4G.
msh
In my part of the world 5g actually is rolled out on lower frequencies than 4g so I actually get better coverage.
thrownblown
I just leave mine in LTE and upgrade to 5G only when I know i'm gonna DL something big.
cogman10
This article misses the forest for the trees.
I can grant that a typical usage of wireless bandwidth doesn't require more than 10Mbps. So, what does "even faster" buy you?
The answer is actually pretty simple: at any given frequency you have a limited amount of data that can be transmitted. The more people you have chatting to a tower, the less bandwidth is available for each of them. By having a transmission standard with theoretical capacities of 1Gbps, 10Gbps, or more, you make it so you can serve 10, 100, or 1000 times as many customers their 10Mbps content. It makes it cheaper for the carrier to roll out and gives a better experience for the end users.
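As a back-of-the-envelope sketch (illustrative numbers only — real cells deal with scheduling overhead, interference, and fading):

```python
# A fixed pool of radio capacity divided among users who each want 10 Mbps.
def users_served(cell_capacity_mbps: float, per_user_mbps: float = 10.0) -> int:
    """Ideal-case user count; ignores scheduling overhead and interference."""
    return int(cell_capacity_mbps // per_user_mbps)

print(users_served(300))      # 30 users on a ~300 Mbps 4G-era cell
print(users_served(10_000))   # 1000 users on a ~10 Gbps 5G cell
```

Same per-user experience, vastly more users per tower — which is the carrier-economics argument for headline speeds nobody individually needs.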
the_mitsuhiko
> Transmitting high-end 4K video today requires 15 Mb/s, according to Netflix. Home broadband upgrades from, say, hundreds of Mb/s to 1,000 Mb/s (or 1 Gb/s) typically make little to no noticeable difference for the average end user.
What I find fascinating is that in a lot of situations mobile phones are now way faster than wired internet for lots of people. My parents never upgraded their home internet despite fiber being available. They have 80MBit via DSL. Their phones, however, thanks to regular upgrades, now have unlimited 5G and are almost 10 times as fast as their home internet.
maxsilver
> Transmitting high-end 4K video today requires 15 Mb/s, according to Netflix.
It doesn't really change their argument, but to be fair, Netflix has some of the lowest picture quality of any major streaming service on the market, their version of "high-end 4K" is so heavily compressed, it routinely looks worse than a 15 year old 1080p Blu-Ray.
"High-end" 4K video (assuming HEVC) should really be targeting 30 Mb/s average, with peaks up to 50 Mb/s. Not "15 Mb/s".
nsteel
It's frustrating the author took this falsehood and ran with it all throughout this article.
Dylan16807
Why? The conclusion that "somewhere between 100 Mb/s and 1 Gb/s marks the approximate saturation point" wouldn't be any different.
pcchristie
I watched a 4K documentary on Netflix last night and my wife got sick of me complaining that it looked worse than a 720p YouTube video
pak9rabid
Not to mention I doubt they're even including the bandwidth necessary for 5.1 DD+ audio.
kevin_thibedeau
Audio doesn't require high data rates. 6 streams of uncompressed 16-bit 48 kHz PCM is 4.6 Mb/s. Compression knocks that down into insignificance.
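That 4.6 Mb/s figure is just channels times bit depth times sample rate:

```python
# Uncompressed PCM bitrate: channels * bit depth * sample rate.
def pcm_mbps(channels: int, bits: int, sample_rate_hz: int) -> float:
    return channels * bits * sample_rate_hz / 1_000_000

# 6 channels (5.1 surround) of 16-bit 48 kHz audio:
print(pcm_mbps(6, 16, 48_000))  # 4.608 Mb/s
```

And that's the uncompressed worst case; DD+ compresses it down to a fraction of that.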
ziml77
On one hand it's nice that the option for that fast wireless connection is available. But on the other hand it sucks that having it means the motivation for ISPs to run fiber to homes in sparse towns goes from low down to none, since they can just point people to the wireless options. Wireless doesn't beat the reliability, latency, and consistent speeds of a fiber connection.
dageshi
It doesn't beat it, but honestly it's good enough, based on my experience using a 4G mobile connection as my primary home internet connection.
immibis
It's not that mobile is fast, it's that home internet is slow. It's the same reason home internet in places like Africa, South Korea and Eastern Europe is faster than in the USA and Western Europe: home internet was built out on old technology (cable/DSL) and never upgraded because (cynically) incumbent monopolies won't allow it or (less cynically) governments don't want to pay to rip up all the roads again.
hocuspocus
Several Western European countries have deployed XGS-PON at scale, offering up to 10 Gbps, peaking at ~8 Gbps in practice. Hell I even have access to 25 Gbps P2P fiber here in Switzerland.
Also you can deliver well over 1 Gbps over coax or DSL with modern DOCSIS and G.fast respectively. But most countries have started dismantling copper wirelines.
rsynnott
Very few people have home equipment that can do anything close to 10Gbps, of course; this is all largely future proofing.
Years back, when FTTH started rolling out in Ireland, some of the CPE for the earliest rollouts only had 100Mbit/sec ethernet (on a 1Gbit/sec service)...
harrall
5G can be extremely fast. I get 600 MBit over cellular at home.
…and we only pay for 500 MBit for my home fiber. (Granted, also 500 Mbit upload.)
(T-Mobile, Southern California)
throitallaway
Sure but I'll take the latency, jitter, and reliability of that fiber over cellular any day.
vel0city
The reliability is definitely a bigger question, jitter a bit more questionable, but as far as latency goes 5G fixed wireless can be just fine. YMMV, but on a lot of spots around my town it's pretty comparable latency/jitter-wise as my home fiber connection to similar hosts. And connecting home is often <5ms throughout the city.
reaperducer
5G can be extremely fast. I get 600 MBit over cellular at home.
Is your T-Mobile underprovisioned? Where I am, T-Mobile 5G is 400Mbps at 2am, but slows to 5-10Mbps on weekdays at lunchtime and during rush hours, and on weekends when the bars are full.
Not to mention that the T-Mobile Home Internet router either locks up, or reboots itself at least twice a day.
I put up with the inconvenience because it's either $55 to T-Mobile, $100 to Verizon for even less 5G bandwidth, or $140 to the local cable company.
harrall
Probably. My area used to be a T-Mobile dead zone 5 years ago.
I also have Verizon.
Choice of service varies based on location heavily from my experience. I’m a long time big time camper and I’ve driven through most corners of most Western states:
- 1/3 will have NO cellular service
- 1/3 will have ONLY Verizon. If T-Mobile comes up, it’s unusable
- 1/3 remaining will have both T-Mobile and Verizon
My Verizon is speed capped so I can't compare that. T-Mobile works better in more urban areas for me, but it's unpredictable. In a medium sized coastal town in Oregon, Verizon might be better, but I will then get half-gigabit T-Mobile in a different coastal town in California.
One thing I have learned is that those coverage maps are quite accurate.
bsimpson
Verizon Fios sells gigabit in NYC for $80/mo.
They're constantly running promotions: "get free smartglasses/video game systems/etc. if you sign up for gigabit." Turns out that gigabit is still way more than most people need, even if it's 2025 and you spend hours per day online.
randcraw
That's what I pay for FIOS internet 20 miles north of Philly. I suspect that's their standard rate for 1 Gb/s service everywhere in the US.
dale_glass
Higher bandwidths are good to have. They're great for rare, exceptional circumstances.
10G internet doesn't make your streaming better, but downloads the latest game much faster. It makes for much less painful transfer of a VM image from a remote datacenter to a local machine.
Which is good and bad. The good part is that it makes it easier for the ISPs to provide -- most people won't be filling that 10G pipe, so you can offer 10G without it raising bandwidth usage much at all. You're just making remote workers really happy when they have to download a terabyte of data on a single, very rare occasion instead of it taking all day.
The bad part is that this comfort is harder to justify. Providing 10G to make life more comfortable the 1% of the time it comes into play still costs money.
kelnos
I have 1Gbps down, and the only application I've found to saturate it is downloads from USENET (and with that I need quite a few connections downloading different chunks simultaneously to achieve it).
I have never come remotely close to downloading anything else -- including games -- at 1Gbps.
The source side certainly has the available pipe, but most (all?) providers see little upside to allowing one client/connection to use that much bandwidth.
bsimpson
Part of it is hardware too.
Only the newest routers do gigabit over wifi. If most of your devices are wireless, you'll need to make sure they all have wifi 6 or newer chips to use their full potential.
Even if upgrading your router is a one-time cost, it's still enough effort that most people won't bother.
mikepurvis
This tracks. I recently upgraded from 100mbps to 500mbps (cable), and barely anything is different— even torrents bumped from 5MB/s to barely 10MB/s. And there's no wifi involved there, just a regular desktop on gigabit ethernet.
stkdump
Same here. My ISP recently did a promo to try out 1G/1G for free for a few months. I decided not to buy it after the free trial and went back to my old 500/200 line instead of paying 40% more. Yeah, it takes a minute longer downloading the latest LLM from huggingface, so what.
kookamamie
Steam downloads easily saturate my 1 Gbps. Same for S3 transfers.
__alexs
Steam downloads can easily max 1Gbps for me.
msh
Steam and the ps5 store can fill out my 1 gigabits connection.
sheepdestroyer
Steam can fill up much more
I'm getting my Steam games at 2Gbps, and I suspect my aging ISP's "box" is to blame for the cap (I didn't want to pay my ISP for the new box that officially supports 8Gbps symmetrical, and just got an SFP+ adapter for the old one). I pay 39€/month for what is supposed to go "up to" 8Gbps/500Mbps on that old box.
Games from Google Drive mirrors come at full speed too. Nice when downloading that new Skyrim VR 90GB mod pack refresh.
vel0city
Steam used to max out my internet, but now it's smarter about it and starts to decrypt/expand the download as it's going instead of doing it in phases. This quickly maxes out the IOPS on even NVMe drives at only several hundred megabits for most games I've tried recently.
James_K
I got a 5G capable phone a few months back, and I can't say I've noticed a difference from my old one. (Aside from the new phone being more expensive, worse UI, slower, heavier, unwieldy, filled with ads, and constantly prompting me to create a "Samsung account".)
fkyoureadthedoc
What's any of that have to do with 5G? On 2 bars of 5G right now and I get 650Mbit download speed, it's significantly faster than 4G.
James_K
The last bit is just stuff I wanted to whine about. I obviously know it is faster, you don't need to explain that concept. I have just never had need of any significant internet speed on my phone. I don't download things, and only sometimes stream video. Most of the time I am just checking emails, or calendars, or something trivial like that. Unless I do some kind of benchmark, I can't notice the difference between 4G and 5G.
throw0101c
> I have just never had need of any significant internet speed on my phone. I don't download things, and only sometimes stream video.
But other people do.
And the main resource that is limited with cell service is air time: there are only so many frequencies, and only so many people can send/receive at the same time.
So if someone wants to watch a video, and a particular segment is (say) 100MB, then a device that can do 100MB/s will take 1s to fetch it: that's a time period when other people may not be able to do anything. But if the device can do 500MB/s, then that segment can come down in 0.2s, which means there's now 0.8s worth of airtime for other people to do other things.
You're not going to see any difference if you're watching the video (or streaming music, or check mail), but collectively everyone can get their 'share' of the resource much more quickly.
Faster speeds allow better resource utilization because devices can get on and off the air in a shorter amount of time.
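The airtime argument above is just division (a simplified sketch; real cells schedule many users concurrently rather than one at a time):

```python
# Airtime needed to deliver a fixed-size segment at different link rates.
def airtime_s(segment_mb: float, rate_mb_per_s: float) -> float:
    return segment_mb / rate_mb_per_s

seg = 100  # MB, the example segment size above
print(airtime_s(seg, 100))  # 1.0 s: the channel is busy for a full second
print(airtime_s(seg, 500))  # 0.2 s: 0.8 s freed up for other devices
```

Per-device peak rate is really a proxy for how quickly each device gets off the air.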
dylan604
If 5G lived up to everything it was touted to do, you could use a 5G hotspot for your home internet, which would be a huge positive in areas that only have one ISP available. However, 5G does not live up to the promises, and your traffic is much more heavily shaped than on non-wireless ISPs.
jfengel
It would matter more if you were in a crowded place, with more users taking up the spectrum. But yeah, as with computer speed, ordinary applications maxed out a while back.
agumonkey
In a similar manner, I got 5G recently and used it as my main link, and I'm still at 150GB downloaded (multiple people, multiple laptops, regular OS updates, docker pulls, etc). I'm not even smart about this. Without constant 4K streaming, I realized that my needs will rarely exceed 200GB.
secondcoming
5G allowed me to avoid having to get fibre.
adrr
What provider? I have yet to get over 300Mb/sec on either T-Mobile or Verizon. T-Mobile is supposed to have the fastest speeds according to reports.
throwaway2037
Let's say that 5G is 10x faster than 4G. Why do you need faster than 65MBit download speed on a mobile phone?
Lammy
It spies on you a lot more effectively :)
https://www.fastcompany.com/90314058/5g-means-youll-have-to-...
https://venturebeat.com/mobile/sk-telecom-will-use-5g-to-bui...
https://www.ericsson.com/en/blog/2020/12/5g-positioning--wha...
johnwayne666
The UE could decide not to participate: https://www.androidauthority.com/android-15-location-privacy...
Lammy
That gives a completely false sense of security with 5G systems, because the way 5G achieves that high bandwidth is by literally "steering the beam" to follow you, i.e. precise location surveillance is an implicit part of using it: https://arxiv.org/pdf/2301.13390
“The initial access period includes the transmission of beamformed SSBs that provide the UEs with a basic synchronization signal and demodulation reference signals. This allows for UEs to save power by going inactive and rejoining the network at a later initial access period. At the physical layer, the UE will receive the SSBs from a single antenna or using one or more spatial filters, such as a multi-panel handset used to overcome hand blockage. The UE will use the received SSB for synchronization and determining the control information. The beam reporting stage includes one or more possible SSB CSI reports which are transmitted in the random access channel. The report includes information for the strongest serving cell and may include a set of the next strongest cells within the same band to assist with load balancing. The number of reported additional cells depends on the carrier frequency, the previous state of the UE in the net- work, and the bands being monitored. In a newly-active state, the UE reports the top 6–16 additional cells across each active frequency range. This reporting helps to manage handover and mitigate cell-edge interference. In the final steps, the UE has connected to a serving cell and is ready to start receiving data. Further beam refinement and channel estimation can occur by transmitting reference signals with more precise beams. Although not specified in the standard, a typical CSI-RS would cover smaller portions of the reported SSBs’ directions or combine coherently across a multipath channel. Using more directional or precise beams can increase the SNR–thereby improving the channel estimates and beam alignment. Beam refinement can also be used to adjust the beamforming slightly to track highly mobile UEs.”
Your linked article even agrees: “Carriers can still see which cell towers your device connects to, use the strength and angle of your device’s signal to the tower, and then look up your device’s unique cellular identifier to determine your general location. Your location may never be private when you’re connected to a cellular network”
Fun fact: modern Wi-Fi standards do this too and it's possible to use the backscattered emissions to see through your walls lol https://www.popularmechanics.com/technology/security/a425750...
atian
Yeah…
dclowd9901
I've found 1-bar 4G LTE to actually be enough to do work on at home, to my surprise (on the occasions that my in-the-ground cable connection up and dies on me). The only thing I don't get is Zoom with that, but it's nice to have a good excuse not to be in a meeting.
readthenotes1
Well, I believe you can try audio only to reduce the bandwidth requirements. That was my excuse for anything below five bars...
jowea
Does using the internet use less battery?
hn_throwaway_99
> Is that such a foregone conclusion, though? Many technologies have had phases where customers eagerly embrace every improvement in some parameter—until a saturation point is reached and improvements are ultimately met with a collective shrug.
> Consider a very brief history of airspeed in commercial air travel. Passenger aircraft today fly at around 900 kilometers per hour—and have continued to traverse the skies at the same airspeed range for the past five decades. Although supersonic passenger aircraft found a niche from the 1970s through the early 2000s with the Concorde, commercial supersonic transport is no longer available for the mainstream consumer marketplace today.
OK, "Bad Analogy Award of the Year" for that one. Traveling at supersonic speeds had some fundamental problems, primarily being that the energy required to travel at those speeds is so much more than for subsonic aircraft, and thus the price was much higher for supersonic travel, and the problem of sonic booms meant they were forbidden to travel over land. When the Concorde was in service, London to NYC flights were 10-20x more expensive on the Concorde compared to economy class on a conventional jet, meaning the ~4 hours saved flight time was only worth it for the richest (and folks just seeking the novelty of it). There are plenty of people that would still LOVE to fly the Concorde if the price were much cheaper.
That is, the fundamental variable cost of supersonic travel is much higher than for conventional jets (though that may be changing - I saw that pg posted recently that Boom has found a way to get rid of the sonic boom reaching the ground over land), while that's not true for next gen mobile tech, where it's primarily just the upfront investment cost that needs to be recouped.
pjdesno
Note that existing bandwidth usage has been driven by digitization of existing media formats, for which there was already a technology and industry - first print, then print+images, then audio, then video. People have been producing HD-quality video since the beginning of Technicolor in the 1930s, and while digital technology has greatly affected the production and consumption of video, people still consume video at a rate of one second (and about 30 frames) per second.
There are plenty things that *could* require more bandwidth than video, but it's not clear that a large number of people want to use any of them.
OK so someone has noticed that you "only" need 15MBs-1 for a 4K stream. Jolly good.
Now let's address latency - that entire article only mentioned the word once and it looks like it was by accident.
Latency isn't cool but it is why your Teams/Zoom/whatevs call sounds a bit odd. You are probably used to the weird over-talking episodes you get when a sound stream goes somewhat async. You put up with it but you really should not, with modern gear and connectivity.
A decent quality sound stream consumes roughly 256KBs-1 (yes: 1/4 megabyte per second - not much) but if latency strays away from around 30ms, you'll notice it and when it becomes about a second it will really get on your nerves. To be honest, half a second is quite annoying.
I can easily measure path latency to a random external system with ping to get a base measurement for my internet connection and here it is about 10ms to Quad9. I am on wifi and my connection goes through two switches and a router and a DSL FTTC modem. That leaves at least 20ms (which is luxury) for processing.
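The budget above works out like this (a rough sketch; it treats the measured ping RTT as the network's whole contribution, leaving the remainder for codec, jitter buffer, and the OS audio stack):

```python
# Latency budget for a voice call: target mouth-to-ear delay minus
# the measured network round trip, leaving headroom for processing.
def processing_budget_ms(target_ms: float, ping_rtt_ms: float) -> float:
    return target_ms - ping_rtt_ms

# ~30 ms target, ~10 ms ping to Quad9 as above:
print(processing_budget_ms(30, 10))  # 20 ms left for processing
```

If your wifi or a mad jitter buffer eats that 20ms of headroom, you get the over-talking everyone has learned to tolerate.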