this post was submitted on 01 Apr 2024
54 points (75.5% liked)

Technology

cross-posted from: https://lemmy.world/post/13805928

It's a long vid. I suggest prepping your fav drink before viewing.

It covers Nvidia's new GPU architecture for AI, the NVLink Switch, RAS diagnostics, and other Nvidia announcements.

Nvidia knows it's the backbone of the current AI boom and seems to be going full steam ahead. I'm hoping for more innovations in tools for AI and gaming in the future.

top 20 comments
[–] [email protected] 69 points 7 months ago (3 children)

God, I wish AMD and Intel could get their shit together.

[–] [email protected] 52 points 7 months ago (2 children)

Happily playing modern games and developing shaders on my AMD GPU. 5120×1440 @ 120 Hz, issue-free.

I wish people would get their shit together and realize they’ve fallen victim to marketing

[–] [email protected] 26 points 7 months ago (2 children)

It's not marketing; AMD sucks for ML stuff. I don't just play games. Everything is harder, with fewer features, and less performant on AMD.

[–] [email protected] 23 points 7 months ago (1 children)

The situation is mostly reversed on Linux. Nvidia has fewer features, more bugs, and stuff that plain won't work at all. Even onboard Intel graphics is going to be less buggy than a pretty expensive Nvidia card.

I mention that because language model work is pretty niche and so is Linux ( maybe similar sized niches? ).

[–] [email protected] 11 points 7 months ago* (last edited 7 months ago)

Yeah. Linux boys on AMD are eatin' good.

[–] [email protected] 8 points 7 months ago (1 children)

Really? I've only dabbled with locally run AI for a bit, but performance in something like Ollama or Stable Diffusion has been really great on my 6900 XT.

[–] [email protected] 7 points 7 months ago

The problem isn't that it isn't great, but that Nvidia cards are just better at a given price point, partly thanks to CUDA.

And for gaming and general use, my experience in the last few years has been that Nvidia still has a leg up when it comes to drivers on Windows. I've never had an Nvidia card give me any problems. AMD, not so much.

I'd still happily trade my GTX 1650 for an RX 6400, though, because I recently switched to Linux and it's a whole different world there…

[–] [email protected] 9 points 7 months ago* (last edited 7 months ago) (1 children)

AMD successful at the mid tier?! I'm shocked!

NVIDIA prints money in the enterprise, where businesses will literally lose money without the extra compute, and to a lesser extent in high-end gaming with the details turned up. AMD simply can't compete; it's not marketing, it's a better product.

[–] [email protected] 13 points 7 months ago (1 children)

AMD has never gotten more than 50% of the market, even in the years where their entire product lineup offered better performance/features for less money. I'm talking about the "good old days" here, where software features weren't a big factor for consumers and ML was nonexistent. You have to be delusional to think that Nvidia doesn't hold a very clear mindshare and marketing advantage.

[–] [email protected] 40 points 7 months ago (1 children)

Seriously. AI aside, if you're doing anything 3D-related you're basically shooting yourself in the foot by not going Team Green. The render times and quality are dramatically better. I'd kill to see AMD or Intel pull a Ryzen in the GPU market.

[–] [email protected] 4 points 7 months ago

Yeah, though the quality of Nvidia's pro drivers is a crapshoot. We've had so many issues, especially with OpenGL, that it just isn't funny anymore. Granted, AMD isn't really that much better, but at least the cards cost a fraction of the price, and I have more confidence in AMD fixing the problems than I have in Nvidia.

[–] [email protected] 3 points 7 months ago

Nvidia's success now is much like Intel's success back when CISC was losing to RISC (so they became partially RISC, lol) and Intel won developers' attention because of the IBM PC and its clones. Likewise, Nvidia was the one chosen by OpenAI. Intel's competition was roughly on par with the blue giant (and not only AMD), but one big player decided the winner, and the same goes for Nvidia: AMD, Intel, and Google all have AI accelerators that are not really worse, and are even better from time to time. And now, after Blackwell, Intel presents Gaudi 2, AMD prepares a new CDNA generation, Chinese companies create more cost-effective solutions, and Tenstorrent is starting to catch up too.

Nvidia's and Intel's wins in their respective fields were not really products of their innovation, but rather of good cooperation.

[–] [email protected] 5 points 7 months ago* (last edited 7 months ago) (1 children)

Nvidia has the short term covered, but I'm skeptical they are going to end up leading the AI chip market in about 5 years or so without acquisitions.

Recent research has shown not only efficiency gains but also actual performance gains with binary or ternary weights instead of floats.

This means you don't need FP multiplications: the matrix "multiplications" reduce to additions and subtractions.

It requires being trained from scratch with that architecture in mind, so it will probably be 12 to 18 months before we see leading models with light weights. But once we do, the market may shift toward faster and more energy-efficient options that don't need to rely on Nvidia's legacy of IP for FP ops.
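To make the multiplication-free claim concrete, here's a minimal sketch (not from the video, just an illustration) of a matrix-vector product where the weights are ternary, i.e. restricted to {-1, 0, +1}. The function name `ternary_matvec` and the toy matrix are made up for the example; the point is that each output element is computed purely with additions and subtractions.

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product where W has entries in {-1, 0, +1}.

    No multiplications are needed: each output element is the sum of
    the inputs whose weight is +1 minus the sum of those whose
    weight is -1 (zero weights are simply skipped).
    """
    out = np.empty(W.shape[0], dtype=x.dtype)
    for i, row in enumerate(W):
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out

# Toy example: ternary weights, float activations
W = np.array([[ 1, 0, -1],
              [-1, 1,  1]])
x = np.array([2.0, 3.0, 5.0])

print(ternary_matvec(W, x))  # add/sub only
print(W @ x)                 # reference: ordinary matmul gives the same result
```

In real hardware this is the win: the multiplier units (and the FP formats they operate on) drop out of the inner loop entirely, which is why such models could loosen the dependence on Nvidia-style FP throughput.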

So while an unmatched king in how things are currently done, the magic phrase that brings any monarch to tears is "this too shall pass."

[–] [email protected] 2 points 7 months ago

While I agree with everything you said, I thought the same: that after crypto, Nvidia was done. They did have a dip before the AI jackpot hit.

Once the models hit and the new chips are bought and installed, it will be another year or two. So they are good for 2-3 years. They are also aware of this issue; I think they are working on CPUs and other businesses, so it won't be a total loss. But I agree, sustaining the stock at this price level seems unrealistic as of now.