12 GB of VRAM is not a bottleneck in any current game at reasonable settings. There is no playable game/settings combination where the 7800 XT's 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of future bottlenecks? Maybe, but I wouldn't be so sure.
The 4070 Super offers significantly better ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding features, and even slightly better rasterization performance than the 7800 XT. Are those things worth sacrificing for 100€ less and 4 GB more VRAM? For most people they aren't.
AMD's offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, don't even use AMD GPUs. Hell, LTT even made a series of videos about how they had to "suffer" using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.
I have an AMD RX 580 and have bought and recommended AMD GPUs to people since the 9500/9700 Pro days. But my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is that Nvidia can make a better GPU (as the 4090 shows) but chooses not to, while AMD literally can't make a better GPU and chooses to only "competitively" price its cards instead of offering something better. Both companies suck.
$100 less IS the advantage.
It's not enough though, and the sales show it. The 7800 XT is a decent card, but it isn't an amazing offer, just a good one. For some people it is a slightly better value-for-money option. But those Nvidia features have value too, so the value proposition isn't as clear-cut as it should be, considering that AMD is behind.
The Steam stats should tell you what consumers think. And while consumers are not infallible, they are a pretty good indicator. The most popular AMD card is the RX 580, which is arguably one of the best cards of all time. Except it came out six years ago. Did AMD have better marketing back then? No. Did they have the performance crown? Nope. But that didn't stop the 580 from being an amazing card.
The 7800 XT could have been the new 580: a mid/high-end card with decent VRAM. Except you could get the 580 for 200€, while the 7800 XT costs literally three times as much. When your "good" card is that expensive, customers have higher expectations. It isn't just about running games well (cheaper cards can do that too), it's about luxury features like ray tracing and upscaling tech.
Imagine if the 7800 XT were 400€. We wouldn't even be having this conversation. But it isn't. In fact, in Europe it launched at basically the same price as a 4070, and even today it is only 50€-80€ cheaper. If Nvidia is scamming us with inferior offers, why aren't AMD's offers infinitely better value? Because AMD is also scamming us, just very slightly less so.
$100 sure feels a lot more solid than RTX features that a ton of games don't even support. There are plenty of people who just want to play at 4K and couldn't care less about the features you call luxury.
That requires more VRAM, and the 7800 XT and XTX deliver it perfectly.
A ton? Try "most".
I disagree.
D4 on Linux. Literally the only bottleneck is that it eats 11 GB of my 1080 Ti's VRAM for breakfast and then still wants lunch and dinner. Otherwise it plays 4K on high with perfect fps, then starts glitching like crazy once VRAM is exhausted after 10-15 minutes.
Zero issues on a 20 GB card. I understand that shitty code in a single game is not exactly a universal example, but it is a valid reason to want more VRAM.
Found the Nvidia fanboy
This is exactly what I expect. I saw what happened to my friends with their GTX 970s when 3.5 GB of VRAM wasn't enough anymore. Even though the cards were still rasterizing quickly enough, they weren't useful for certain games anymore. That's why I now make sure I get enough VRAM to extend the useful service life of my cards.
And I'm not just talking about buying AMD, I actually do buy them. I started with the HD 5850 with 1 GB, then got my friend's HD 5870, also with 1 GB (don't remember if I ran it in CrossFire or just replaced the 5850), then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran those in CrossFire, then I bought a new R9 380 with 4 GB when a game that was important to me at the time couldn't deal with CrossFire well, then a used RX 580 with 8 GB, and finally the RX 6800 with 16 GB two years ago.
At some point I also bought a used GTX 960 because we were doing some CUDA work at university, but that was pretty late, when those cards weren't current anymore, and it was only used in my Linux server.