[–] [email protected] 5 points 5 months ago (1 children)

Because it draws those "pixels" as the signal reaches the monitor. When half of a frame has been transmitted to a CRT monitor, it's basically halfway done making it visible.

An LCD monitor needs to wait for the entire frame to arrive before it can be processed and then made visible.

Sometimes the monitor will wait for several frames to arrive before it processes them. This enables some temporal processing. When you put a monitor in gaming mode, it disables (some of) this.
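
A toy model of the difference described above, assuming the "LCD buffers the whole frame before showing it" behaviour from this comment (real panels vary); the function names and numbers are illustrative only:

```python
# Toy model of when a given scanline becomes visible, per the comment above.
# Assumes a CRT lights up scanlines as the signal arrives, while the modelled
# LCD buffers the whole frame and only then shows it. Illustrative numbers only.

FRAME_TIME_MS = 1000 / 60   # 60 Hz refresh -> ~16.7 ms per frame
LINES_PER_FRAME = 1080

def crt_visible_at(line: int) -> float:
    """CRT: the beam draws line N as soon as line N has been transmitted."""
    return FRAME_TIME_MS * (line / LINES_PER_FRAME)

def buffered_lcd_visible_at(line: int) -> float:
    """Modelled LCD: nothing is shown until the whole frame has arrived."""
    return FRAME_TIME_MS  # full transfer first, then the panel updates

middle = LINES_PER_FRAME // 2
print(f"CRT shows the middle line after ~{crt_visible_at(middle):.1f} ms")
print(f"Buffered LCD shows it after    ~{buffered_lcd_visible_at(middle):.1f} ms")
```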

[–] [email protected] 5 points 5 months ago (1 children)

If that's how TFTs worked, we wouldn't have vsync settings in games.

[–] [email protected] 2 points 5 months ago* (last edited 5 months ago)

No? AFAIK vsync prevents the GPU from sending half-drawn frames to the monitor, not the monitor from displaying them. The tearing happens in the GPU buffer. Edit: read the edit below
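
A minimal sketch of that point, using a made-up scanout loop (all names and numbers here are illustrative): without vsync a buffer swap can land mid-scanout, so the transmitted image mixes two frames, which is where the tear comes from.

```python
# Toy scanout: the "cable" reads the front buffer line by line. Without vsync
# the renderer may swap buffers mid-scanout, so the transmitted image mixes
# two frames (tearing). With vsync, the swap waits until the scanout finishes.

LINES = 8

def scanout(front_frame, swap_at_line=None, new_frame=None):
    """Return which frame each transmitted line came from."""
    sent = []
    frame = front_frame
    for line in range(LINES):
        if swap_at_line is not None and line == swap_at_line:
            frame = new_frame  # buffer swap lands mid-scanout (no vsync)
        sent.append(frame)
    return sent

print("no vsync :", scanout(front_frame=1, swap_at_line=5, new_frame=2))
print("vsync on :", scanout(front_frame=1))  # swap deferred to the blanking interval
```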

Though I'm not sure how valid the part about latency is. In the worst-case scenario (transferring a frame takes the whole previous frame time), the latency of an LCD can only be double that of a CRT at the same refresh rate, which 120+ Hz already compensates for. And as for the inherent latency of the screen, most gaming LCD monitors have less than 5 ms of input lag, while a CRT on average takes half the frame time to display a pixel, so about 8 ms at 60 Hz.
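
The arithmetic behind those figures, written out as a quick sketch (60 Hz assumed for the 8 ms number, and the "LCD is at worst double" claim taken at face value):

```python
def crt_avg_latency(hz):
    """Average wait until a pixel is drawn on a CRT: half the frame time."""
    return (1000 / hz) / 2

def lcd_worst_latency(hz):
    """Worst case from the comment: the transfer eats a whole frame time,
    i.e. double the CRT figure at the same refresh rate."""
    return 2 * crt_avg_latency(hz)

print(f"CRT at 60 Hz:        {crt_avg_latency(60):.1f} ms")    # ~8.3 ms
print(f"LCD worst at 60 Hz:  {lcd_worst_latency(60):.1f} ms")  # ~16.7 ms
print(f"LCD worst at 120 Hz: {lcd_worst_latency(120):.1f} ms") # ~8.3 ms, back to CRT level
```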

Edit: thought this over again. On a CRT those two happen simultaneously, so the total latency is 8 ms + pixel response time (which I don't know the value of). On LCDs, the transfer time should be (video stream bandwidth / cable bandwidth) × frame time, and that runs consecutively with the time to display it, which is frame time / 2 + pixel response time. That could exceed the CRT's latency.
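
Plugging made-up numbers into that formula (the bandwidth ratio and pixel response time below are assumptions, not measurements):

```python
frame_time = 1000 / 60      # ms, 60 Hz
stream_over_cable_bw = 0.6  # assumed: the stream uses 60% of the cable's bandwidth
pixel_response = 2.0        # ms, assumed panel/phosphor response

transfer_time = stream_over_cable_bw * frame_time          # (stream bw / cable bw) * frame time
lcd_latency = transfer_time + frame_time / 2 + pixel_response  # transfer first, then display
crt_latency = frame_time / 2 + pixel_response                  # both happen simultaneously

print(f"LCD: {lcd_latency:.1f} ms, CRT: {crt_latency:.1f} ms")  # the LCD figure can come out higher
```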

BUT I took the input lag number from my monitor's RTINGS page, and looking into how they measure it, it seems to include both the transfer time and frame time / 2, and it's somehow still below 5 ms? That's weird to me, since for that the transfer would either have to happen within <1 ms (impossible) or the entire premise was wrong and LCDs do start drawing before the entire frame reaches them.

Although I'm pretty sure that's still not the cause of tearing, which happens because a frame is progressively rendered and written to the buffer, not because it's progressively transferred or displayed.