
If even half of Intel's claims are true, this could be a big shake up in the midrange market that has been entirely abandoned by both Nvidia and AMD.

[email protected] 50 points 3 weeks ago

If they double up the VRAM with a 24GB card, this would be great for a "self hosted LLM" home server.

3060 and 3090 prices have been rising like crazy because Nvidia is VRAM-gouging and AMD inexplicably refuses to compete. Even ancient P40s (basically double-VRAM 1080 Tis with no display output) are getting expensive. 16GB on the A770 is kinda meager, but 24GB is the point where you can fit the Qwen 2.5 32B models that are starting to perform like the big corporate API ones (rough VRAM math sketched below).

And if they could fit 48GB with new ICs... Well, it would sell like mad.
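For context on why 24GB is the magic number, here's a back-of-the-envelope sketch in Python. The `estimate_vram_gb` helper is hypothetical (not from any library), and the 1.2x overhead factor for KV cache, activations, and framework buffers is a rough assumption; real usage varies by runtime and context length.

```python
# Rough VRAM estimate for running a dense transformer LLM locally.
# Hypothetical helper for illustration: weights = params * bytes/weight,
# plus an assumed ~20% overhead for KV cache, activations, and buffers.

def estimate_vram_gb(params_billions: float, bits_per_weight: float,
                     overhead_factor: float = 1.2) -> float:
    """Approximate VRAM (GB) = parameters * bytes per weight * overhead."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * bytes_per_weight * overhead_factor

if __name__ == "__main__":
    for bits in (16, 8, 4):  # fp16, int8, ~4-bit quantization
        print(f"32B model @ {bits}-bit: ~{estimate_vram_gb(32, bits):.0f} GB")
```

At ~4-bit quantization this lands around 19GB for a 32B model: too big for a 16GB card, comfortable on a 24GB one, which matches the comment's point.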

[email protected] 29 points 3 weeks ago

I always wondered who they were making those mid- and low-end cards with a ridiculous amount of VRAM for... It was you.

All this time I thought they were scam cards to fool people who believe that bigger number always = better.

[email protected] 12 points 3 weeks ago

Yeah, AMD and Intel should be shipping high-VRAM SKUs for hobbyists. Doubling the RAM probably wouldn't cost them that much, and they could mark the cards up a bit.

I'd buy the B580 if it had 24GB of RAM; at 12GB, I'll probably give it a pass because my 6650 XT is still fine.

[email protected] 2 points 3 weeks ago

Don't you need Nvidia cards to run AI stuff?

[email protected] 12 points 3 weeks ago

Nah, Ollama works with AMD just fine; you just need enough VRAM for your model.

I'm guessing someone would get Intel working as well if the cards had enough VRAM.
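As a minimal sketch of why the GPU vendor matters less than the VRAM here: Ollama serves a local REST API on its default port 11434 regardless of backend (CUDA on Nvidia, ROCm on supported AMD cards), so client code never touches the GPU stack directly. The model tag `qwen2.5:32b` below is just an example and would need to be pulled first with `ollama pull qwen2.5:32b`.

```python
# Minimal sketch: query a locally running Ollama server from Python.
# Assumes Ollama is installed and serving on its default port (11434).
# Ollama itself picks the GPU backend; this code is vendor-agnostic.
import json
import urllib.request

payload = json.dumps({
    "model": "qwen2.5:32b",  # example tag; pull it first
    "prompt": "Summarize why VRAM size matters for local LLMs.",
    "stream": False,  # ask for one complete JSON response
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```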
