this post was submitted on 21 Jan 2024
324 points (97.1% liked)

Technology

Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

[–] [email protected] 77 points 10 months ago* (last edited 10 months ago) (11 children)

My RTX 4060 has 16GB of VRAM. What on earth makes them think people would go for 12GB?

[–] [email protected] 8 points 10 months ago (6 children)

I have a 2060 Super with 8GB. The VRAM is currently enough for FHD gaming - or at least it isn't the bottleneck - so 12GB might be fine for that use case. BUT I'm also toying around with AI models, and some of the current models already ask for 12GB of VRAM to run the complete model. It's not that I would never get a 12GB card as an upgrade, but you can be sure I'd research all the alternatives first, and then it wouldn't be my first choice but a compromise, as it wouldn't future-proof me in this regard.
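A quick back-of-envelope check on why 12GB feels tight for AI work: just loading a model's weights takes roughly parameters × bytes per parameter of VRAM. This sketch assumes that rule of thumb and ignores activations, context cache, and framework overhead, so real usage is higher:

```python
# Rough VRAM estimate for holding model weights only (an approximation;
# real usage adds activations, context cache, and runtime overhead).

def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GB of VRAM needed just for the weights."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model at common precisions:
print(round(weight_vram_gb(7, 2), 1))    # fp16 (2 bytes/param): ~13.0 GB
print(round(weight_vram_gb(7, 0.5), 1))  # 4-bit quantized: ~3.3 GB
```

So a full-precision 7B model already overflows a 12GB card on weights alone, while an aggressive 4-bit quantization fits with room to spare - which is exactly the trade-off the "complete model" requirements are about.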

[–] [email protected] 1 points 10 months ago (2 children)

Thanks, that was going to be exactly my question. I don't see anyone choosing low memory for video, but I had no idea what AI needs.

[–] [email protected] 2 points 10 months ago (1 children)

You can run Stable Diffusion XL on 8GB of VRAM (to generate images). For beginners there's, e.g., the open-source software Fooocus, which handles quite a lot of the work for you - it sends your prompt to a GPT-2 model (running on your PC) to do some prompt engineering for you, then uses the result to generate your images, and it generally offers several presets etc. to get going easily.

Jan (basically an open-source software that resembles ChatGPT and lets you use several AI models) can run in 8GB, but only with 3B models or quantized 7B models. They recommend at least 16GB for regular 7B models (which they consider the "minimum usable models"). Then there are larger, more sophisticated models that require even more.
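The gap between the ~13GB of fp16 7B weights and the 16GB recommendation makes sense once you count the context (KV) cache the model keeps during a chat. The figures below assume a Llama-style 7B layout (32 layers, hidden size 4096, fp16) purely as an illustration - these are not Jan's exact numbers:

```python
# Why a "regular" (fp16) 7B model wants ~16 GB rather than just the weights:
# the KV cache for the chat context occupies VRAM too.
# Assumed architecture: Llama-style 7B (32 layers, hidden size 4096).

def kv_cache_gb(layers: int, hidden: int, context_len: int,
                bytes_per_val: int = 2) -> float:
    # Per token, each layer stores one key and one value vector of size `hidden`.
    return layers * 2 * hidden * context_len * bytes_per_val / 1024**3

weights_gb = 7e9 * 2 / 1024**3          # fp16 weights: ~13.0 GB
cache_gb = kv_cache_gb(32, 4096, 4096)  # full 4096-token context: ~2.0 GB
print(round(weights_gb + cache_gb, 1))  # ~15.0 GB before any other overhead
```

Add activations and runtime overhead on top, and 16GB is about the floor for running such a model comfortably on the GPU.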

Jan can also run on the CPU in your regular RAM. Since it's chatting with you, it's not too bad when it spits out words slowly, but a GPU would be nice here...
