this post was submitted on 18 Mar 2024
262 points (86.4% liked)

linuxmemes

    Merry Christmas! (feddit.de)
    submitted 7 months ago* (last edited 7 months ago) by [email protected] to c/[email protected]
     
[email protected] 2 points 7 months ago* (last edited 7 months ago)

Nvidia has pretty much always dominated the AI GPU market with its closed-source driver and CUDA. Nothing has changed about that, except that there is now more competition in AI-specific hardware, which you can buy from several vendors. But practically no one has ever used AMD cards with OpenCL for AI or ML. If you were serious about it, you always used Nvidia with CUDA or, nowadays, a dedicated AI accelerator card (DPU).
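
To make the ecosystem point concrete: mainstream ML frameworks treat CUDA as the first-class GPU backend, so targeting Nvidia is the path of least resistance. Here's a minimal sketch (assuming PyTorch is installed; the API calls are standard PyTorch, not anything from the comment itself):

```python
# Minimal sketch: how PyTorch code typically selects a GPU.
# The only built-in GPU backend used here is `cuda` (Nvidia);
# there is no comparable built-in OpenCL path.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")  # Nvidia GPU via the CUDA backend
    print(f"Using {torch.cuda.get_device_name(0)}")
else:
    device = torch.device("cpu")   # fallback when no CUDA device is present
    print("No CUDA device found, falling back to CPU")

# Tensors land on whichever device was selected; the heavy lifting
# (this matmul) runs through CUDA kernels on an Nvidia card.
x = torch.randn(1024, 1024, device=device)
y = x @ x
```

That `cuda`-or-`cpu` branch is boilerplate in countless ML codebases, which is exactly the kind of lock-in being described.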