this post was submitted on 09 Oct 2024
9 points (76.5% liked)

Stable Diffusion


I want to buy a new GPU mainly for SD. The machine-learning space is moving quickly, so I want to avoid buying a brand-new card only to have a fresh model or tool come out and leave it behind the times. On the other hand, I also want to avoid needlessly spending extra thousands of dollars pretending I can get a 'future-proof' card.

I'm currently interested in SD and training LoRAs (etc.). From what I've heard, the general advice is just to go for maximum VRAM; a rough way to check what a given card reports is sketched further down.

  • Is there any extra advice I should know about?
  • Is NVIDIA vs. AMD a critical decision for SD performance?

I'm a hobbyist, so a couple of seconds difference in generation or a few extra hours for training isn't going to ruin my day.
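
For what it's worth, a minimal sketch of the "check what the card reports" idea, assuming a working PyTorch install (which is what most SD front ends sit on); on AMD this further assumes the ROCm build of PyTorch, which exposes devices through the same torch.cuda API:

```python
# Rough check of what PyTorch sees on a given card (VRAM and backend).
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / 1024**3
    print(f"Device: {props.name}, VRAM: {vram_gb:.1f} GB")
    # torch.version.hip is a version string on ROCm builds, None on CUDA builds
    print("ROCm build" if torch.version.hip else "CUDA build")
else:
    print("PyTorch cannot see a CUDA/ROCm device")
```

The total_memory figure here is what SD tooling will actually have to work with, so a card that the relevant build can't even enumerate is effectively useless for this purpose regardless of its VRAM.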

Some example prices in my region, to give a sense of scale:

  • 16GB AMD: $350
  • 16GB NV: $450
  • 24GB AMD: $900
  • 24GB NV: $2000

edit: prices are for new cards; I haven't explored the pros and cons of used GPUs.

[–] [email protected] -3 points 1 month ago (3 children)

Quit it, you are killing the earth with this inane bullshit.

[–] [email protected] 3 points 1 month ago (2 children)

Do you know how much power those GPUs actually use, even if I disconnected my solar panels and ran the card at 100% load 24/7? Protip: it rounds down to zero.

If you're serious about the global environmental crisis, comrade, organize with others to fight industrial-scale culprits instead of wasting your valuable time blaming trivial people.
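
For a sense of scale, a back-of-envelope sketch; the 350 W board power and 24/7 duty cycle below are assumptions for illustration, not measurements of any particular card:

```python
# Back-of-envelope yearly energy for one GPU; numbers are illustrative assumptions.
BOARD_POWER_W = 350      # assumed full-load draw of a 24 GB card
HOURS_PER_DAY = 24       # worst case: generating/training around the clock

kwh_per_day = BOARD_POWER_W * HOURS_PER_DAY / 1000      # 8.4 kWh/day
kwh_per_year = kwh_per_day * 365                        # ~3066 kWh/year

print(f"{kwh_per_day:.1f} kWh/day, {kwh_per_year:.0f} kWh/year")
```

A hobbyist's real duty cycle is far below 24/7, so the actual figure would be a small fraction of this.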

[–] [email protected] 1 points 1 month ago

It's the training that's the issue, more than inference.

[–] [email protected] 0 points 1 month ago

Enjoy your treats!