this post was submitted on 18 Jul 2024

Linux
[–] [email protected] 113 points 3 months ago (3 children)

For newer GPUs from the Turing, Ampere, Ada Lovelace, or Hopper architectures, NVIDIA recommends switching to the open-source GPU kernel modules.

So 20-series onwards.
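If you're not sure which flavor you're running, one quick check is the module's license string: the open kernel module reports "Dual MIT/GPL" while the proprietary one reports "NVIDIA". A sketch, assuming `modinfo` is available and the driver is installed the usual way:

```shell
# Report which NVIDIA kernel module flavor is installed, if any.
# Open module -> license "Dual MIT/GPL"; proprietary -> license "NVIDIA".
if modinfo -F license nvidia 2>/dev/null | grep -q 'MIT/GPL'; then
    echo "open kernel module"
elif modinfo nvidia >/dev/null 2>&1; then
    echo "proprietary kernel module"
else
    echo "no nvidia module found"
fi
```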

[–] [email protected] 18 points 3 months ago (2 children)

My ol' 1070 doesn't make the cut hey... ;-;

[–] [email protected] 32 points 3 months ago* (last edited 3 months ago) (4 children)

Maybe it's just because I'm older and more jaded, but that really feels like the last truly good era for GPUs.

Those 10 series cards had a ton of staying power, and the 480/580 were such damn good value cards.

[–] [email protected] 32 points 3 months ago (1 children)

It's more that that era was a better time for price-to-performance value. The 3000 and 4000 series cards were basically linear upgrades in terms of price to performance.

It's an indicator that there haven't been major innovations in the GPU space, besides perhaps the addition of the AI and Raytracing stuff, if you want to count those as upgrades.

[–] [email protected] 13 points 3 months ago (1 children)

It feels like the crypto mining goldrush really changed the way GPU manufacturers view the market.

[–] [email protected] 6 points 3 months ago

I feel like AI has changed the game. Why sell retail when people are paying you billions to run LLMs in the cloud?

[–] [email protected] 12 points 3 months ago (1 children)

The RTX 3050 (which got a new 6 GB version less than a year ago) is similar to a 1070 Ti in terms of performance, and the 1080s are of course even better. Definitely a ton of staying power, even in 2024.

[–] [email protected] 4 points 3 months ago

I bought a secondhand 1080 a couple of years ago, when the crypto bubble finally burst, and it's still serving my needs just fine. It handled Baldur's Gate 3 on release last year, which was the last "new" game I played on it. Seems like it'll still be good for a few years to come.

[–] [email protected] 8 points 3 months ago (1 children)

That was mostly because the 20 series was so bad: expensive, not enough of a performance jump to justify the price, and raytracing wasn't used in any games (until recently).

The 30 series was supposed to be more of a return to form, then covid + mining ruined things.

[–] [email protected] 2 points 3 months ago

I got a 2060 Super and I must say I'm very happy. I do 3D stuff, so the ray tracing was plenty useful, and despite getting a bit old it fares pretty well in most games. The price was okay at the time (500 €, still a bit high since it was during the bitcoin mining madness =-=")

[–] [email protected] 4 points 3 months ago (1 children)

Still have a beautifully running 1070. 👌

[–] [email protected] 3 points 3 months ago

Comrade. (☞ ͡° ͜ʖ ͡°)☞

[–] [email protected] 3 points 3 months ago

I think it works, but the performance might not be ideal. Keep using the proprietary module.

[–] [email protected] 11 points 3 months ago (1 children)
[–] [email protected] 3 points 3 months ago

Yep! My pre-built 1660 Super I got years ago is still chugging along amazingly as a streaming device for my Steam Deck.

[–] [email protected] 10 points 3 months ago (2 children)

Yes. Everything older is unsupported by the new open modules. Planned obsolescence, y'know?

[–] [email protected] 20 points 3 months ago (1 children)

It's not really planned obsolescence; they changed the way their drivers work with the 16xx/20xx series. Up through the 10xx series, a lot of the algorithms and processing ran in the driver software on the CPU. From Turing onward, most of that moved onto the GPU itself as firmware running on its GSP coprocessor. The 10 series GPUs can't do that.

Like most hardware vendors, Nvidia doesn't want to (and probably isn't allowed to) publish all of their special sauce source code. They can open source the driver and load a binary blob, like most hardware does, but only on the newer cards.

The older cards will have to keep the special sauce in software on the CPU, so those devices will need to stick to the proprietary driver.
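You can actually see this split from userspace: recent drivers expose the GSP firmware version through `nvidia-smi -q`, and on a pre-Turing card (or with GSP offload disabled) the field shows "N/A". A sketch, assuming a recent driver with `nvidia-smi` on the PATH:

```shell
# Show whether the GPU is running its driver logic as GSP firmware.
# Turing-or-newer with GSP enabled -> a firmware version string;
# 10-series cards -> "N/A" (no GSP coprocessor to run it on).
nvidia-smi -q | grep -i 'GSP Firmware' \
    || echo "nvidia-smi unavailable or no GSP field in this driver version"
```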

[–] [email protected] 0 points 3 months ago (1 children)

(and probably isn't allowed to)

I doubt very much it's about whether they are allowed to or not. They're at the top of the hardware supply chain, designing their own chips and having them fabricated. It's them telling other companies, like Gigabyte and EVGA, what they are or aren't allowed to do.

[–] [email protected] 11 points 3 months ago (1 children)

Nvidia also buys and licenses code from other companies. These days they're on top of the chain, but they used to be a lot smaller. Maybe they rewrote their drivers to remove the external code, but I wouldn't be surprised if they still have old external code in their drivers.

AMD tried to open source their code for a display technique (VRR, I think? Not sure exactly what it was) but was prevented from doing so by the standards body, presumably because they used licensed reference code. I don't think this applies to the older 10xx series of cards, but these factors are difficult to work around.

[–] [email protected] 8 points 3 months ago

HDMI 2.1 and the HDMI consortium prevented them from releasing the code. It wasn't even proprietary, just based on a licensed implementation, from what I understood.

[–] [email protected] 13 points 3 months ago

Well, their proprietary driver works fine for older hardware.