this post was submitted on 16 Jun 2024
629 points (95.3% liked)
Greentext
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
- Anon is often crazy.
- Anon is often depressed.
- Anon frequently shares thoughts that are immature, offensive, or incomprehensible.
If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
Can someone please explain why a CRT has zero blur and zero latency when it literally draws each pixel one by one, with the electron beam sweeping across the screen line by line?
Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog form over VGA, pixel by pixel, and each pixel is projected onto the screen in the requested color the instant it arrives.
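For a rough sense of what "pixel by pixel" means in time, here's a back-of-the-envelope sketch assuming the standard 640x480 @ 60 Hz VGA mode (25.175 MHz pixel clock). The mode and figures are just illustrative, not something from the comment above.

```python
# Rough timing of an analog VGA scanout, assuming the standard
# 640x480 @ 60 Hz mode: 25.175 MHz pixel clock, 800x525 total
# including blanking. Illustrative numbers only.

PIXEL_CLOCK_HZ = 25_175_000   # dots per second
H_TOTAL = 800                 # pixels per scanline, incl. horizontal blanking
V_TOTAL = 525                 # scanlines per frame, incl. vertical blanking

pixel_time_ns = 1e9 / PIXEL_CLOCK_HZ            # ~39.7 ns per pixel
line_time_us  = H_TOTAL * pixel_time_ns / 1e3   # ~31.8 us per scanline
frame_time_ms = V_TOTAL * line_time_us / 1e3    # ~16.7 ms per frame

print(f"per pixel:    {pixel_time_ns:6.1f} ns")
print(f"per scanline: {line_time_us:6.1f} us")
print(f"per frame:    {frame_time_ms:6.2f} ms")
```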
Of course there are buffers. Once RAM got cheap enough to hold a buffer representing the whole screen, everyone did that. That was in the late 80s/early 90s.
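To see why that only became practical around then, here's a quick sketch of how much RAM a full-screen buffer takes at a few typical modes of the era. The modes picked are just examples.

```python
# RAM needed for a full-screen frame buffer at a few typical modes
# of the late-80s/early-90s era. Modes chosen for illustration.

modes = [
    ("320x200, 8-bit (VGA mode 13h)",   320, 200, 8),
    ("640x480, 4-bit (VGA 16-color)",   640, 480, 4),
    ("640x480, 8-bit (SVGA 256-color)", 640, 480, 8),
]

for name, width, height, bits_per_pixel in modes:
    kib = width * height * bits_per_pixel / 8 / 1024
    print(f"{name}: {kib:.0f} KiB")   # 62.5, 150, 300 KiB respectively
```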
There are some really bad misconceptions about how latency works on screens.
CRTs (apart from some exceptions) did not have a display buffer: the analog display signal directly controls the output of each electron gun in the CRT, without any digital processing happening in between. The computer on the other end does have frame buffers, just like it does now; but eliminating extra buffers (like those used by modern monitors) does reduce latency.
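As a rough illustration of that last point, here's a sketch of how each extra full-frame buffer in the chain adds on the order of one frame time of delay at 60 Hz. The buffer counts per setup are hypothetical, just to show the arithmetic.

```python
# Each stage that holds a complete frame before passing it on adds
# roughly one frame time of latency. The setups below are hypothetical.

REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ   # ~16.7 ms per frame at 60 Hz

setups = {
    "CRT, analog scanout (no display-side buffer)": 0,
    "Monitor with one internal frame buffer":       1,
    "Monitor with scaler + extra processing buffer": 2,
}

for name, extra_buffers in setups.items():
    added_ms = extra_buffers * frame_ms
    print(f"{name}: +{added_ms:.1f} ms on top of the signal itself")
```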
Doesn't matter. Having a frame buffer means either the buffer has to be complete before it's scanned out, or you get screen tearing. It wasn't like racing the beam anymore.
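To make that trade-off concrete, here's a toy model (purely illustrative, not any real graphics API) of why an unsynchronized buffer swap tears: swap mid-scanout and the top and bottom of the screen show different frames; defer the swap to vertical blanking and you avoid tearing at the cost of up to a frame of delay.

```python
# Toy model of the frame-buffer trade-off discussed above.
# Purely illustrative; none of this maps to a real graphics API.

HEIGHT = 480

frame_a = ["A"] * HEIGHT   # the frame currently being scanned out
frame_b = ["B"] * HEIGHT   # the next frame the renderer just finished

def scan_out(swap_at_line=None):
    """Read the screen line by line, optionally swapping buffers mid-scan."""
    buf, lines = frame_a, []
    for y in range(HEIGHT):
        if swap_at_line is not None and y == swap_at_line:
            buf = frame_b              # unsynchronized swap (no vsync)
        lines.append(buf[y])
    return lines

torn  = scan_out(swap_at_line=HEIGHT // 2)   # swap lands mid-scanout
clean = scan_out(swap_at_line=None)          # swap deferred to vertical blanking

print("torn: ", torn[0],  "...", torn[-1])   # A ... B -> two frames visible: tearing
print("clean:", clean[0], "...", clean[-1])  # A ... A -> one frame, but more delay
```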