I can agree, but with two conditions: benchmarks must always be done at native resolution, and hardware capability / system requirements must not take any upscaling into account.
For example, if a studio publishes the requirements for playing at 1080p, 60 FPS, High RT, that must mean native 1080p, not 1080p with upscaling.
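To make the gap concrete: upscalers render internally at a fraction of the output resolution, so "1080p with upscaling" is really a sub-1080p render. A tiny illustrative helper (the per-axis scale factors below are the commonly cited DLSS 2 presets; individual games can override them):

```python
# Illustrative only: what the GPU actually renders before upscaling.
# Scale factors are the commonly cited per-axis ratios for DLSS 2 presets
# (assumption: individual games may override these).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```

So a "1080p / 60 FPS / High RT" figure measured with DLSS Quality is really a 720p render target.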
Benchmarks should not be disconnected from actual games. If games aren't played at native resolution, then benchmarks shouldn't be limited to native resolution: they should test both native and upscaled rendering, and rate the quality of the upscaling.
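One way to picture that: a harness that sweeps the same scene across native and upscaled configurations, recording both frame rate and an image-quality score. A minimal sketch only; `run_pass` and `score_quality` are hypothetical stand-ins, not a real benchmarking API:

```python
# Minimal sketch of a native-plus-upscaled benchmark matrix.
# run_pass() and score_quality() are hypothetical stand-ins: wire them to a
# real frame-timing loop and an image-quality metric (e.g. comparison against
# a native reference capture).

CONFIGS = [
    ("1080p native", None),
    ("1080p upscaled, quality preset", "Quality"),
    ("1080p upscaled, performance preset", "Performance"),
]

def run_pass(upscaler_mode: str | None) -> float:
    """Hypothetical: run the benchmark scene, return average FPS."""
    raise NotImplementedError

def score_quality(upscaler_mode: str | None) -> float:
    """Hypothetical: rate output quality against the native render, 0-10."""
    raise NotImplementedError

def benchmark() -> None:
    for label, mode in CONFIGS:
        fps = run_pass(mode)
        quality = score_quality(mode) if mode else 10.0  # native is the reference
        print(f"{label}: {fps:.1f} FPS, quality {quality:.1f}/10")
```

The point is that the native row stays in as the reference, and the upscaled rows get both a performance number and a quality rating instead of being excluded.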
Why?
RT + DLSS is less of a cheat than most other graphics effects, especially any other approach to lighting. The entire 3D graphics pipeline has always been fake shortcuts stacked on top of fake shortcuts.