I've got the feeling that GPU development is plateauing: new flagships consume an immense amount of power and their sizes are humongous. I do give DLSS, local AI and similar technologies the benefit of the doubt, but they're just not there yet. GPUs should be more efficient and improve in other ways.
I’ve said for a while that AMD will eventually eclipse all of the competition, simply because their design methodology is so different from the others'. Intel has historically relied on simply cramming more into the same space, but they’re reaching theoretical limits on how small their designs can be; they’re constrained by things like atom size and the speed of light across the distance of the chip. AMD, by contrast, has historically used the same dies for as long as possible and relied on improving their efficiency to get gains instead. They were historically a generation (or even two) behind Intel in terms of pure hardware power, but still managed to compete because they used the chips more efficiently. As AMD also begins to approach those theoretical limits, I think they’ll do a much better job of actually eking out more computing power.
And the same goes for GPUs. Nvidia recently resorting to the “just make it bigger and give it more power” design philosophy likely means they’re also reaching theoretical limitations.
AMD never used chips "more efficiently". They hit gold with the Ryzen design, but everything before it, going back to the Athlon, was horrible and more useful as a room heater. And before the Athlon it was even worse. The K6/K6-2 were funny little buggers that extended the life of Socket 7, but they lacked a lot of features, and don't get me started on their DX4/5 stuff, which frequently died in spectacular manners.
Ryzen works because of chiplets and the stacking of the cache. Add some very clever stuff in the pipeline which I don't presume to understand, and the magic is complete. AMD is beating Intel at its own game: its ticks and tocks are way better and, most importantly, executable. That is something Intel hasn't really been able to do for several years, and it only now seems to be returning.
And let's not forget the USB problems with Ryzen 2/3 and the memory compatibility woes of Ryzen's past and, some say, present. Ryzen is good, but it's not "clean".
In GPU design AMD clearly does the same, but executes worse than Nvidia. The 9070 can't even match its own predecessor, the 7900 XTX is again a room heater and anything but efficient, and let's not talk about what came before. The 6xxx series was good enough but troublesome for some, and the Radeon VII was a complete shitfest.
Now, with the 9070, AMD once again, for the umpteenth time, promises that the generation after will fix all its woes, that then it will be able to compete with Nvidia.
Trouble is, they've been saying that for over a decade.
Intel is the one looking at GPU design differently. The only question is: will they continue, or axe the division now that Gelsinger is gone? That would be monumentally stupid, but if we can count on one thing, it's the horrible shortsightedness of corporate America, especially when Wall Street is involved. And with Intel, Wall Street is heavily involved. Vultures are circling.
Just like I rode my 1080ti for a long time, it looks like I'll be running my 3080 for a while lol.
I hope in a few years, when I'm actually ready to upgrade, the GPU market isn't so dire... All signs are pointing to no, unfortunately.
I'm still riding my 1080ti...
980ti here, playing Cyberpunk 2077 at sufficiently high settings without issues (if you call ~30fps 1440p with no path tracing “without issues”, that is).
My 1080ti was probably peak Nvidia for me. It was so good, and I was super happy with it. It truly felt like an upgrade.
If you're still playing at 1080 it's still a capable card tbh.
I gave mine to a friend who still had a 660 when I upgraded to the 3080 lol.
Oh yeah, it still gets the job done.
Keeping my eyes open for used ones to upgrade with now that the new series is out, though. Gives me an excuse to put the 1080 in my server.
Me too
1060 6GB gang here... I will probably get a 3060 or 4060 next time I upgrade, unless I ditch Nvidia (thinking of moving to Linux).
Same here.
Only upgraded when my 1080 died, so I snagged a 3080 for an OK price. Not buying a new card until this one dies. Nvidia can get bent.
Maybe team red next time….
Nvidia is just straight up conning people.
Wow, it looks like it is really a 5060 and not even a 5070. Nvidia definitely shit the bed on this one.
Where's the antitrust regulation?
Trump's America has none. But the FCC is going after public broadcasters. So that's what we get.
For what exactly?
Your mom
Everyone trusts their mum.
I thrust your mums
Using their dominant market position to hurt the end customer... price gouging.