this post was submitted on 21 May 2024
50 points (89.1% liked)

cross-posted from: https://sopuli.xyz/post/12872542

Does anyone really need a 1,000 Hz gaming display?

[–] [email protected] 12 points 5 months ago (1 children)

1000 Hz seems to be close to the limit of human vision, since we stop seeing motion blur above 1000 Hz. Seems like a good endpoint for display technology.

[–] [email protected] 14 points 5 months ago (1 children)

On the other hand, I've heard the same argument so many times for 144 Hz, 165 Hz and then 240 Hz...

Now, 1000 Hz is a much higher frequency, but in terms of frame time it's not that far off: 240 Hz is about 4.2 ms per frame versus 1 ms at 1000 Hz. I wouldn't be surprised if some people could successfully tell the difference in a "blind" test.
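
To make that concrete, here's a quick sketch of the frame-time arithmetic (nothing assumed beyond frame time = 1000 ms / refresh rate):

```python
# Frame time in milliseconds for common refresh rates: frame_time = 1000 / hz
for hz in (60, 144, 165, 240, 480, 1000):
    print(f"{hz:>4} Hz -> {1000 / hz:.2f} ms per frame")
```

Going from 60 Hz to 144 Hz saves almost 10 ms per frame, while going from 240 Hz all the way to 1000 Hz only saves about 3 ms, so the steps get small fast.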

[–] [email protected] 5 points 5 months ago (1 children)

I remember people talking about 1000 Hz being the holy grail for VR headsets, though, so it seems like there's more consensus on 1000 Hz being a good limit. Frame time is just the inverse of the refresh rate.

But yeah, I've personally only used 144 Hz. I think I could see a difference with 240 Hz, but I'm not sure I'd be able to discern 480 or 1000 Hz outside of maybe VR.

[–] [email protected] 9 points 5 months ago (2 children)

There isn't really any definitive science that pins down a specific frame-rate limit for what the eye can perceive.

There are studies, however, that show ranges from 30 to 90 Hz, and studies showing that human perception can detect flicker at up to 500 Hz.

The issue is that nothing that happens in the real world is synchronized with what you perceive, so filling in with more Hz means there are more chances for you to actually perceive the thing.

To complicate matters further, our brains do a lot of filling in for us, and our eyes and brains can still register things you aren't consciously aware of perceiving. So again, more frames is always nice.

Here are some sources:

Canadian Centre for Occupational Health and Safety. (2020). Lighting Ergonomics - Light Flicker.
https://www.ccohs.ca/oshanswers/ergonomics/lighting_flicker.html

Davis J, et al. (2015). Humans perceive flicker artifacts at 500 Hz.
https://doi.org/10.1038/srep07861

Mills M. (2020). How Many Frames per Second (FPS) the Human Eye Can See.
https://itigic.com/how-many-frames-per-second-fps-human-eye-can-see/

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago) (1 children)

Sure, eyes don't have a "global frame refresh" like computers do. That's why we can tell the difference between 24 Hz and 60 Hz video. Every photoreceptor is excited independently and continuously.

Still, there's a practical limit to frame time where 99% of humans wouldn't notice a full-screen flash 99.9% of the time. Being able to shake your head around with a 1000 Hz VR headset and not perceive any motion blur from sample-and-hold seems pretty close to that limit.
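
As a rough back-of-the-envelope sketch of sample-and-hold smear (the 300 deg/s head speed and 20 pixels/degree figures are illustrative assumptions, not the specs of any particular headset):

```python
# With full-persistence sample-and-hold, an eye tracking motion across a static
# frame smears the image by roughly (angular speed) x (frame time).
HEAD_SPEED_DEG_PER_S = 300   # assumed quick head turn
PIXELS_PER_DEGREE = 20       # assumed display density

for hz in (90, 240, 1000):
    frame_time_s = 1 / hz
    blur_deg = HEAD_SPEED_DEG_PER_S * frame_time_s
    blur_px = blur_deg * PIXELS_PER_DEGREE
    print(f"{hz:>4} Hz: ~{blur_deg:.2f} deg (~{blur_px:.0f} px) of smear per frame")
```

Under those assumed numbers, 1000 Hz gets the smear down to a fraction of a degree, which is why it keeps coming up as the point where sample-and-hold blur stops being obvious without low-persistence strobing.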

[–] [email protected] 1 points 5 months ago

That's still going to depend on the frame rate of the source, though. If the source is only putting out 200 fps, having a 1000 Hz display isn't going to matter a whole hell of a lot.
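
A minimal sketch of that point, using a simplified model where (without any interpolation or frame generation) the display just repeats the latest source frame:

```python
# Simplified model: a sample-and-hold display can't show motion steps finer
# than the source frame time, so the effective step between distinct images
# is max(source frame time, refresh time). Ignores judder from uneven repeats.
def effective_step_ms(source_fps: float, display_hz: float) -> float:
    return max(1000 / source_fps, 1000 / display_hz)

for display_hz in (240, 1000):
    print(f"200 fps source on a {display_hz} Hz panel: "
          f"~{effective_step_ms(200, display_hz):.1f} ms between distinct images")
```

Under that model both panels show a new image only every ~5 ms, so the 1000 Hz panel mostly buys lower latency until the game (or interpolation) can actually feed it ~1000 distinct frames per second.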

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago)

If you're using a game that renders each frame at an instant in time, and the aim is to get a better approximation of true motion blur to your eye, the theoretical ceiling for smoother motion is the frame rate at which the fastest-moving thing on screen only moves one pixel per frame, and that's far higher than the rate at which we can distinguish individual images. Well, okay, maybe a bit more, since you could hypothetically have sub-pixel resolution.

But the point is, more rendered frames do buy you something even past the point where they're not individually distinguishable, unless the game's rendering engine can render perfectly accurate motion blur itself.
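
A minimal sketch of that idea, accumulating several instant-in-time sub-frames into one displayed frame to approximate motion blur (the 1-D "screen", speed, and sample counts are made up purely for illustration):

```python
import numpy as np

WIDTH = 32    # 1-D "screen" for illustration
SPEED = 8.0   # assumed object speed, in pixels per displayed frame

def render_instant(t: float) -> np.ndarray:
    """Render the scene at a single instant: one bright 1-pixel dot."""
    frame = np.zeros(WIDTH)
    frame[int(t * SPEED) % WIDTH] = 1.0
    return frame

def render_accumulated(frame_index: int, subframes: int) -> np.ndarray:
    """Average several instants within one display interval; more sub-frames
    give a smoother approximation of the true motion-blurred image."""
    samples = [render_instant(frame_index + i / subframes) for i in range(subframes)]
    return np.mean(samples, axis=0)

for n in (1, 4, 8, 16):
    blurred = render_accumulated(frame_index=0, subframes=n)
    print(f"{n:>2} sub-frames: dot smeared across {np.count_nonzero(blurred)} pixels")
```

With these made-up numbers the smear stops filling in once each sub-frame moves the dot less than a pixel (8 sub-frames here), which is exactly the one-pixel-per-frame ceiling described above.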