this post was submitted on 06 Feb 2025
366 points (92.4% liked)

Technology

61850 readers
2127 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

cross-posted from: https://lemm.ee/post/54702508

[–] [email protected] 2 points 1 day ago* (last edited 1 day ago) (4 children)

That's entirely speculative. There are diminishing returns. Unless you're going to host your own YouTube, the use case for 50Gbps connections to the home is quite small. 4K video streaming at Ultra HD Blu-ray bitrates doesn't even come close to saturating 1Gbps, and all streaming services compress 4K video significantly more than what Ultra HD Blu-ray offers. The server side is the limit, not home connections.
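The headroom is easy to quantify. A quick sketch using rough, approximate bitrate figures (ballpark numbers, not exact service specs):

```python
# Back-of-the-envelope check with approximate public bitrate figures.
UHD_BLURAY_MBPS = 100   # UHD Blu-ray peak video bitrate, roughly 100 Mbps
STREAMING_4K_MBPS = 25  # high-end 4K streaming, roughly 15-25 Mbps
LINK_MBPS = 1000        # a 1 Gbps home connection

disc_streams = LINK_MBPS // UHD_BLURAY_MBPS       # ~10 disc-quality streams
service_streams = LINK_MBPS // STREAMING_4K_MBPS  # ~40 service-quality streams
print(disc_streams, service_streams)              # 10 40
```

Even a household running ten simultaneous disc-quality 4K streams wouldn't saturate 1Gbps, let alone 50.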

Now, if you want to talk about self-hosting stuff and returning the Internet to a more peer-to-peer architecture, then you need IPv6. Having any kind of NAT in the way is not going to work. Connection speed still isn't that important.

[–] [email protected] 1 points 4 hours ago* (last edited 4 hours ago) (1 children)

Take a look at devcontainers as an idea that might be generalized. They're just Docker containers, so big but not huge. The use case, however ...

devcontainers are a complete, portable development environment, with support from major IDEs. Let's say I want to work on a Java service: I open my IDE, it pulls the latest Java devcontainer with my environment and all my tools, fetches the latest from git, and I'm ready to go. The problem with this use case is that I'm waiting the whole time. I don't want to sit around for a minute or two every time I want to edit a program; the latest copy needs to be here, now, as I open my IDE
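For anyone unfamiliar: a devcontainer is just a `devcontainer.json` checked into the repo. A minimal sketch for the Java example above (the image tag, extension, and post-create command are illustrative, not a specific recommendation):

```json
{
  "name": "java-service",
  "image": "mcr.microsoft.com/devcontainers/java:21",
  "customizations": {
    "vscode": {
      "extensions": ["vscjava.vscode-java-pack"]
    }
  },
  "postCreateCommand": "./mvnw -q dependency:go-offline"
}
```

The IDE reads this, pulls the image, and drops you into a fully provisioned environment.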

But you could generalize this idea. Maybe it's the next ChromeOS-like thing: all you need is something that can run containers, and everything you do starts with downloading a container with everything you need. If something like this happens, there's a great example of needing to be responsive with a lot more data

[–] [email protected] 1 points 4 hours ago (1 children)

Maybe don't rely on cloud garbage for basic development?

[–] [email protected] 1 points 4 hours ago* (last edited 4 hours ago)

Technically I don't. I'm also the guy running CI/CD building devcontainers for my engineers. They no longer have to worry about updating certificates, tools, versions, or security patches, and IT doesn't have to worry about a lot of crap on their laptops that IT doesn't manage. Engineers can use a standard laptop install and just get the latest of everything they need, scanned and verified, as soon as it's available. And since it's all automated, I can support many variations; yes, they can pull any older version from the repo if they need to, and every project can easily be on different versions of different tools and languages

At work, I’m on the same network, but working from home, I still need the responsiveness to do my job

[–] [email protected] 3 points 23 hours ago (1 children)

There could be some new thing that no one has even bothered to think about because of the limitations. Imagine streaming back when taking an hour to download a few kilobytes was considered reasonable; people would have laughed at the very thought of it.

[–] [email protected] 1 points 22 hours ago

We're not using the bandwidth we have. Many US cities have service with 1Gbps download speed available. I have it for my own reasons. Servers are the bottleneck; they rarely even reach half that speed.

If we're not using 1Gbps, why should we believe something would pop up if we had 50Gbps?

Now, direct addressing where everyone can be a server and bandwidth utilization is spread more towards the edges of the network? Then you have something that could saturate 1Gbps. But you can't do that on IPv4.
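The "everyone can be a server" idea comes down to being directly addressable. A minimal sketch, with IPv6 loopback standing in for a globally routable address (behind IPv4 NAT, the listener sits on a private address that outside peers can't reach without port forwarding or hole punching):

```python
import socket

# A peer that accepts a direct inbound connection. With a global IPv6
# address this works end to end; "::1" (loopback) stands in for one here.
srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
srv.bind(("::1", 0))            # port 0: let the OS pick a free port
srv.listen(1)
host, port = srv.getsockname()[:2]

# Another "peer" dials in directly -- no relay, no NAT traversal needed.
cli = socket.create_connection((host, port))
conn, _ = srv.accept()
cli.sendall(b"hello peer")
print(conn.recv(16))            # b'hello peer'
cli.close(); conn.close(); srv.close()
```

Behind NAT, that `bind()` still succeeds, but nobody outside can initiate the connection; with end-to-end IPv6 addressing, any peer can.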

[–] [email protected] 4 points 1 day ago (1 children)

Unless you're going to host your own YouTube....

This is exactly what PeerTube is struggling with. That bandwidth would solve the video federation problem.

See, you get it!

[–] [email protected] 0 points 1 day ago* (last edited 1 day ago) (1 children)

Except we need IPv6 before that's at all viable.

We are not even filling out the bandwidth of pipes we have to the home right now. "If you build it, they will come" does not apply when there's already something there that isn't being fully utilized.

[–] [email protected] 2 points 1 day ago

Oh, maybe. I'm not familiar with bandwidth utilization in China.

[–] [email protected] 2 points 1 day ago (1 children)

How exactly does NAT prevent that? On good hardware it adds insignificant latency.

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago)

It has nothing to do with latency, and everything to do with not being able to directly address things behind NAT.

Edit: and please, nobody argue that NAT increases security. That dumbass argument should have died the moment it was first uttered.