xfinity will advertise 100 Tbps lines with the abysmal 1.5 TB/mo data cap anyway
“you can drive this super sport car for $ per month - but only for 10 miles”
Aren’t fiber lines typically symmetrical? At least that’s how I’ve usually seen them advertised.
You underestimate the fuckery that ISPs will go through to offer the least amount of services for the most possible money.
Don’t be silly son, the free market will signal there is opportunity and prices will drop and quality will go up.
I hate Comcast as much as the next guy but I feel like 1.5TB a month would be reasonable. Even at those speeds you probably wouldn’t be downloading more, just downloading whatever you do now but faster.
E: I was gonna ask why this was so controversial but I just checked my router's stats and, oh yeah, I've only downloaded around half a terabyte over 3 segregated VLANs in the past 2 months. I've uploaded almost double that, which is baffling to me though. Even still, I don't see why anyone would be downloading more than a terabyte in a month unless you're one of those data hoarders, which, fair, but… I'll stop my rambling.
Why the fuck would I want that speed if I can only fully use it for less than a second before hitting the data cap? I’d rather have 100 times less speed with 100 times more cap, so I can actually fully use it however I want.
Also it’s just ridiculous anyway because I don’t even think hard drive write speeds are that fast.
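Quick back-of-envelope on both of those points, using the figures floated in this thread (100 Tbps line, 1.5 TB cap, and a ballpark ~200 MB/s sustained HDD write speed, all illustrative):

```python
# How long a 100 Tbps link could run flat out before hitting a 1.5 TB cap.
link_bits_per_s = 100e12          # 100 Tbps advertised line rate
cap_bytes = 1.5e12                # 1.5 TB monthly data cap

seconds_to_cap = cap_bytes * 8 / link_bits_per_s
print(f"cap exhausted in {seconds_to_cap:.2f} s")  # 0.12 s

# And the disk comparison: the link delivers 12.5 TB/s, while a typical
# SATA HDD sustains roughly 200 MB/s of writes (rough figure).
link_bytes_per_s = link_bits_per_s / 8
hdd_bytes_per_s = 200e6
print(f"link is ~{link_bytes_per_s / hdd_bytes_per_s:,.0f}x faster than one HDD")
```

So yes: a bit over a tenth of a second at full rate, and nothing consumer-grade could even write that stream to disk.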
There should be, that's just how fiber works. If they lay a 10 Gb line in the street, they'll probably sell a 1 Gb connection to 100 households. (Oversubscription ratios vary by provider and location.)
If they give you an uncapped connection to the entire wire, you’ll DoS the rest of the neighborhood
That’s why people are complaining “I bought 1Gb internet, but I’m only getting 100Mb!” - They oversold bandwidth in a busy area. 1Gb would probably be the max speed if everyone else was idle. If they gave everyone uncapped connections the problem would get even worse
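The contention math above, sketched out (the 10 Gb uplink / 100 households / 1 Gb plans are the illustrative numbers from this comment, not any real provider's figures):

```python
# Oversubscription: total sold capacity vs. actual uplink capacity.
uplink_gbps = 10      # shared line in the street
households = 100      # subscribers on that line
plan_gbps = 1         # advertised per-household speed

oversubscription = households * plan_gbps / uplink_gbps
print(f"oversubscription ratio: {oversubscription:.0f}:1")  # 10:1

# Worst case: everyone maxes out at once and the uplink splits evenly.
worst_case_gbps = uplink_gbps / households
print(f"worst-case per-household speed: {worst_case_gbps * 1000:.0f} Mb/s")  # 100 Mb/s
```

Which is exactly the "I bought 1Gb but I'm only getting 100Mb" complaint: everyone saturating their link at once in a 10:1 oversold area leaves each subscriber a tenth of their plan.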
Data caps are simply false advertising - if your infrastructure can only handle X Tb/s then sell lower client speeds or implement some clever QoS.
There are plenty of users for whom 1.5TB is quite restrictive - multi-member households, video/photo editors working with raw data, scientists working with raw data, flatpak users with an Nvidia GPU, or people who self-host their data or do frequent backups, etc.
With the popularity of WFH and our dependence on online services the internet is virtually as vital as water or electricity, and you wouldn’t want to be restricted to having no electricity until the end of the month just because you used the angle grinder for a few afternoons.
I'm on pace for 0.60 TB this month and I'm no heavy user. I only have one 4K TV and a laptop for work that I use all day. My wife is mostly on her phone but is a heavy TV user in the evening. I can imagine people who download and/or torrent most of the content they consume can easily hit 1.5TB
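For a sanity check on that pace: assuming roughly 7 GB per hour for 4K streaming (a commonly cited ballpark; actual bitrates vary by service and codec) and a few hours of evening viewing, the numbers line up:

```python
# Rough monthly data from 4K streaming alone (all figures are assumptions).
gb_per_hour_4k = 7     # common estimate for 4K streaming
hours_per_day = 3      # assumed evening viewing
days = 30

monthly_tb = gb_per_hour_4k * hours_per_day * days / 1000
print(f"~{monthly_tb:.2f} TB/month from 4K streaming alone")  # ~0.63 TB
```

So one 4K TV in regular evening use already lands right around 0.6 TB before counting anything else in the household.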
Broadband is not a speed.
According to the FCC (it was the FCC, not the FTC), which recently raised the defined minimum speed of a broadband connection to 100 Mbps down / 20 Mbps up.
It’s not symmetrical yet though. Which is weird.
Eh, I would say it’s to be expected. A lot of infrastructure still relies on coax/DOCSIS which has its limitations in comparison to an all-fiber backbone. (This post has some good explanations.) However it wouldn’t surprise me if some ISPs argue that “nobody needs that much uplink” and “it helps restrict piracy” when really it’s just them holding out against performing upgrades.
it really shouldn't be though; this is going to be in effect for like the next decade or two. FTTH is literally fresh off the presses for most suburbanites and city dwellers. I see no reason this standard should still be so antiquated.
Literally only incentivizes ISPs to keep rolling out shitty infra that’s slow as balls everywhere that isn’t suburbia.
Distances though? I’ve seen similar breakthroughs in the past but it was only good for networking within the same room.
It’s optical fiber so it’s good for miles. Unlikely to be at home for decades but telcos will use it for connecting networks.
Optical fiber is already 100 gigabit so the article comparing it to your home connection is stupid.
So the scientist improved current fiber speed by 10x, not 1.2 million X.
It’s optical fiber so it’s good for miles.
OM1 through OM4 multimode fiber have full-rate distances of less than 800 meters.
Yes, there's faster single-mode stuff that goes for literal miles, but saying that optical fiber can always go miles is incorrect.
It’s much more than just 100Gb/s.
A single fiber can carry over 90 channels of 400G each. The public is misled by articles like this. It's like saying that scientists have figured out how to deliver the power of the sun, but that technology would be reserved for the power company's generation facilities, not your house.
over 90 channels of 400G each
You mean with 50 GHz channels in the C-band? That would put you at something like 42 Gbaud with DP-64QAM modulation. It probably works, but your reach is going to be pretty shitty because your OSNR requirements will be high, so you can't amplify often. I would think that 58 channels at 75 GHz or even 44 channels at 100 GHz are the more likely deployment scenarios.
On the other hand we aren’t struggling for spectrum yet, so I haven’t really had to make that call yet.
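For anyone following along at home, here's the arithmetic behind those figures. The channel counts are the ones quoted in this thread (usable C-band width depends on guard bands and filter roll-off, so treat these as illustrative, not a real deployment plan):

```python
# Aggregate per-fiber capacity for the grid spacings mentioned above.
scenarios = {"50 GHz": 90, "75 GHz": 58, "100 GHz": 44}  # channels per grid
for grid, channels in scenarios.items():
    print(f"{grid} grid: {channels} x 400G = {channels * 0.4:.1f} Tb/s per fiber")

# Symbol-rate sanity check for 400G on DP-64QAM:
# 2 polarizations x 6 bits/symbol = 12 bits per symbol.
baud_ghz = 42
raw_gbps = baud_ghz * 12
print(f"{baud_ghz} Gbaud DP-64QAM => ~{raw_gbps} Gb/s raw line rate")
# ~504 Gb/s raw vs. a 400G payload leaves roughly 25% for FEC overhead,
# which is in the plausible range for coherent transceivers.
```

So even the conservative 100 GHz scenario is ~17.6 Tb/s on one fiber, which is the point of the "this is telco gear, not your house" comparison above.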
It's not stupid at all. "Broadband" speed is a term that laypeople across the country can at least conceptualize. Articles like this aren't necessarily written exclusively for industry folks. If the population can't relate to the information, how can they hope to pressure telcos for better services?
So it’s fine if an article says Space X develops a new rocket that travels 100x faster than a car?
Because that implies a breakthrough when it’s actually not significantly faster than other rockets: it’s the speed needed to reach the ISS.
10x faster than existing fiber would be accurate reporting, especially given that there are labs that have already transmitted at petabit speeds over optical fiber. So terabit isn't significant; only the method is.
I wonder what non-telco applications will use this
I wonder if something like a sports stadium has video requirements that would get close, with HFR 8K video?
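Rough numbers for that, under some stated assumptions: uncompressed feeds (as in broadcast-style SMPTE ST 2110 workflows), 10-bit 4:2:2 chroma so 20 bits/pixel, and 120 fps for the HFR part:

```python
# Uncompressed bandwidth for one HFR 8K camera feed (all assumptions above).
width, height = 7680, 4320   # 8K UHD resolution
fps = 120                    # high frame rate
bits_per_pixel = 20          # 10-bit 4:2:2

gbps_per_feed = width * height * fps * bits_per_pixel / 1e9
print(f"~{gbps_per_feed:.0f} Gb/s per uncompressed feed")  # ~80 Gb/s
```

So a production with dozens of uncompressed 8K HFR feeds would chew through multi-hundred-gigabit links, though that's still a couple of orders of magnitude short of terabit scale.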
To be fair, it all trickles down to home users eventually. We're starting to see 10+ Gbps fiber in enthusiast home networks and internet connections. Small offices are widely adopting 100 Gbps fiber. It wasn't that long ago that we were adopting 1 gigabit ethernet in home networks, and it won't be long before we see widespread 800+ gigabit fiber.
Streaming video is definitely a big application where more bandwidth will come in handy. I think transferring large AI models in the hundreds of gigabytes may also become a large share of traffic in the near future.
Disaggregated compute might be able to leverage this in the data center. I could use this to get my server, gaming PC and home theater to share memory bandwidth on top of storage, heck maybe some direct memory access between distributed accelerators.
Gotta eat those PCI lanes somehow
And 1.2 million times less likely to be available to the public
Also 1.2 million times less likely to leave the research lab, because even if this is true (a very big if already), it's still "new and exciting and revolutionary improvement #3626462" this week alone. Revolutionary new battery tech comes out twice a week if you believe the pop-sci tech sites; it's 99.9% crap.
Battery advancements aren’t crap. We’ve gotten 5-8% improvement in capacity per year, which compounds to a doubling every 10 to 15 years. Every advancement covered by over sensationalized pop sci articles you’ve ever heard has contributed to that. It’s important not to let sensationalism make you jaded to actual advancements.
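The compounding claim checks out, for what it's worth; here's the doubling time implied by a 5-8% annual improvement:

```python
import math

# Doubling time implied by a steady annual improvement rate:
# solve (1 + r)^t = 2  =>  t = ln(2) / ln(1 + r).
for rate in (0.05, 0.08):
    years = math.log(2) / math.log(1 + rate)
    print(f"{rate:.0%}/yr -> capacity doubles in ~{years:.1f} years")
```

That lands at roughly 9 to 14 years, in line with the "doubling every 10 to 15 years" figure above.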
Now, as for broadband, we haven't pushed out the technologies we already have to the last mile. However, this sort of thing is useful for the backbone and for universities. Universities sometimes have to transfer massive amounts of data, and one of the most efficient ways to do that is still a van full of hard drives.
That's not what I said though. I meant that 99.9% of the "revolutionary new battery technology" articles on blogs, magazines, and whatnot are clickbait crap. I've seen these articles for at least the last 25 years, and beyond Li-ion batteries, not much revolutionary has happened on the battery front. My point was more against the clickbait science and tech news that regurgitates the same dumb crap all the time.
Stuff like this is a bit more believable. Still will be more than a decade before we will see any benefit. First all of the sea cables would get the upgrade, then private companies (banks mainly), then governments (military and such), ISPs will prolly not touch it for as long as possible till governments force em.
No normal consumer user would have any reasonable use case for this kind of bandwidth.
This is data center and backbone network stuff.
Cool I’ll be able to download CoD in just a few hours.