this post was submitted on 24 Jan 2025
1341 points (98.6% liked)
More interestingly, lamps in video games use the same amount of real electricity whether they are on or off.
Not necessarily. On OLED displays (which are definitely a thing for desktop computers and TVs), a light that's turned off uses less power, because the pixels the lamp is displayed on (and the ones around it) are dimmer.
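(Purely illustrative sketch of that idea, with made-up numbers: OLED power scales roughly with how hard each pixel is driven, so a darker lamp region draws less. The region size and per-pixel wattage below are assumptions, not measured figures.)

```python
# Toy estimate of how an OLED region's power scales with content brightness.
# Per-pixel wattage is a made-up illustrative figure, not a datasheet value.

PIXELS_IN_LAMP_REGION = 200 * 200        # assumed on-screen size of the lamp
WATTS_PER_PIXEL_AT_FULL_WHITE = 1.5e-6   # hypothetical draw of one pixel at 100%

def region_power(average_brightness):
    """Approximate power (W) for the region at a 0.0-1.0 average brightness."""
    return PIXELS_IN_LAMP_REGION * WATTS_PER_PIXEL_AT_FULL_WHITE * average_brightness

print(f"lamp on  (~80% bright): {region_power(0.8):.3f} W")   # ~0.048 W
print(f"lamp off (~10% bright): {region_power(0.1):.3f} W")   # ~0.006 W
```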
YELLS IN GPU VERTEX PIPELINE
that consumes electricity. ever think about the poor gpu? about how your words hurt its feelings?
Jokes aside, the power to process a few hundred vertices every frame is insignificant.
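Rough back-of-envelope, using an assumed (deliberately low) vertex throughput rather than any real GPU's spec:

```python
# Back-of-envelope: share of GPU vertex throughput taken by one lamp model.
# The throughput figure is an assumption for illustration, not a specific GPU's spec.

LAMP_VERTICES = 300                            # "a few hundred" vertices for the lamp mesh
FPS = 60
ASSUMED_VERTICES_PER_SECOND = 1_000_000_000    # ~1 billion vertices/s, conservative

lamp_vertices_per_second = LAMP_VERTICES * FPS
share = lamp_vertices_per_second / ASSUMED_VERTICES_PER_SECOND
print(f"{lamp_vertices_per_second} vertices/s -> {share:.6%} of assumed throughput")
# 18000 vertices/s -> 0.001800% of assumed throughput
```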
And traditional LCDs with a backlight use slightly more power to display darkness: the liquid crystal layer is transparent by default and turns opaque/black when a voltage is applied.
Actually, the pixels go completely black and do not consume any electricity at all in that state.
You might be thinking of early OLEDs, which had to stay on at all times to prevent blur/smearing. But panel manufacturers solved that problem a few years ago. Don't remember exactly when the change happened, but I remember first seeing true black OLEDs sometime around 2017/2018.
When a lamp turns off it doesn't become a black hole. The previous commenter was correct, though I appreciate your info about OLED.
The light doesn't become true black; it's dark, but not complete nothingness. So yes, it'll still consume power.
Probably not for most people, due to cost. It's more realistic for portable devices where battery saving is a thing, as there doesn't seem to be much mainstream push for OLED (or equivalent) monitors that aren't top-end (on Newegg, I could only find 240Hz options).
That, and search results are often for other panel technologies (IPS/TN/VA). Lower-spec stuff seems to exist, but you really gotta scrape the bottom of the barrel (portable monitors) to find some niche product.
Monitors no, TVs very much so.
Very much so... what? At a quick glance, they're expensive AF (riddled with "smart" features and now AI, gigantic on top of 4K, etc.) too.
Sure, I guess there's actually a chance a few people impulsively bought one at a big-box store (or "on sale" for the full price of a non-OLED TV), but it's more likely they bought "LED", which is marketing speak for an LED-backlit LCD (at best with local dimming, which isn't even close to OLED turning pixels off).
I'm not sure sub-£550 (~$700) at reasonable sizes (42") really counts as expensive AF anymore (not cheap, but not expensive AF). But each to their own.
Alright sure, maybe. But LCD screens are ubiquitous, and most people probably aren't looking to buy more displays. In a similar vein, early 4K adopters probably don't have much reason... if they can just be happy with what they already have.
It is good enough to be the last thing to upgrade, especially looking at the chunk of cost it'd be when lumped in with PC/console cost. (Also, selling is probably not for everyone, even if less-modern HDTVs had any resale value, and at ~42" you might not even get any quick takers, even for free.)
A quick look at the Steam survey: ~56% of users are still using 1080p and ~20% are using 1440p. If OLED stays almost exclusive to 4K and/or 240Hz, many will likely continue to ignore it.
Also, if you don't have the hardware + content, it doesn't really make sense. That's additional cost, and you may even need to look specifically for content that works well with OLED (if not created with it in mind). Higher-speed broadband availability/cost and streaming enshittification (+ encoding quality) may be factors here too.
And burn-in seems to still be a thing, at least with some types/models.
So I see this as a long way off for mass adoption, similar to VR. And, more to my point, it's more of an exception than a norm.
EDIT: Also just saw QDEL; it seems a year away still, but may fix burn-in and cost (especially if it's pushed to the lower end, which print manufacturing may allow). Though who knows; I'm also seeing tandem OLED (except it seems to make cost worse).
A few things:
You sound like you're already at the higher end, which is obviously not who I was talking about. Perhaps I should've said "for most people", but cost is really a multiplier here, so maybe similar tech will become the norm some day due to advancements (as I mentioned in the edit).
Part of my thinking with the survey (aside from not being high-end) was that people could be using Big Picture mode for living-room OLED gaming, but seemingly aren't (unless they have older OLEDs that aren't 4K?). Some people even still like their retro stuff (even 4:3 content) on CRT tech, rather than filters and/or upscalers.
Also just saw a video (L1T) about two options for $180 4K HDR IPS displays. Not sure if this is a new low, but I'll keep waiting (though I may be an outlier, going for free content that isn't the highest quality even by 1080p standards), also because it's on Amazon.
I think you know what I mean. A daylight scene is going to look great on the display I mentioned above (and there may be higher-end non-OLED options too). Side by side there might be a difference, but it's diminishing returns for the actual experience.
Where OLED-like tech excels is darker content (near-perfect if not perfect black, which is what IPS etc. won't match). I could see somebody buying this tech for horror games/content (especially Dead Space with its diegetic UI). Maybe for space content, but even then the stars need to be sparse or very under-exposed (white stars, dimmer clusters/interstellar clouds if any) to get a contiguous field of perfect black between the stars.
So stylistic choices really make or break it here. For example, I actually do have an OLED display (a phone I got free* because the screen is cracked), and in the movie WALL-E there are just a few bits with near-perfect darkness that work really well (some transitional moments, WALL-E's trailer when unlit, robot PoVs where the letterboxing looks like it's part of the mask)... but even there it usually isn't space, as most of the shots it's used in are pretty bright (some in the intro are darker), like the rest of the movie.
My mention of burn-in was not that I think it's a huge issue, but that it's still a worry. Searching on it, I was still seeing videos about burn-in; one from a year ago was about a then-new display that had it because mismatched-aspect content caused the panel to overdrive too much (which is unfortunate, as that should be a great use case). Wear leveling still sounds a bit scary long-term to me, especially at a higher cost.
Other model-dependent issues I was seeing were VRR flicker and font rendering (sub-pixel arrangement). Also saw someone complaining about HDR support in general (games and even creation tools, Windows, etc.) from that same year ago (it could be better now, but I'm betting this also leaves a lot of older titles unplayable unless some mod/tonemapper etc. can be used).
*= the person who gave it to me seemingly didn't even know what OLED is, and forgot I'd pointed it out.
Highly depends on the rendering engine and whether you're looking at it, as it could be unrendered (culled) if you look away, meaning less energy used.
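To make that "unrender when you look away" idea concrete, here's a minimal sketch of view-frustum-style culling, assuming a normalized camera forward vector and treating each object as a point (real engines test bounding volumes against full frustum planes):

```python
import math

# Simplified "is it on screen?" check: objects outside the camera's field of
# view get culled and skip most rendering work. This point-vs-view-cone test
# is only meant to illustrate the idea.

def is_visible(cam_pos, cam_forward, obj_pos, fov_degrees=90.0):
    to_obj = [o - c for o, c in zip(obj_pos, cam_pos)]
    dist = math.sqrt(sum(v * v for v in to_obj))
    if dist == 0:
        return True
    cos_angle = sum(f * v for f, v in zip(cam_forward, to_obj)) / dist
    return cos_angle >= math.cos(math.radians(fov_degrees / 2))

camera = (0.0, 0.0, 0.0)
forward = (0.0, 0.0, 1.0)                              # camera looking down +Z
print(is_visible(camera, forward, (0.5, 0.0, 5.0)))    # True  -> rendered
print(is_visible(camera, forward, (0.0, 0.0, -5.0)))   # False -> culled, less GPU work
```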