He's not wrong.
-
[email protected]replied to BlueFootedPetey last edited by
if you have an oled display, a brighter video game costs more energy because each pixel's LED has to put out more light.
if you have an lcd display, there's a backlight that always runs at the same brightness, with liquid crystals blocking parts of the light to make the image. a brighter scene doesn't take more power, since the backlight doesn't use more energy.
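rough back-of-the-envelope sketch of that difference, where every wattage is an illustrative guess rather than a measurement of any real panel:

```python
# Back-of-envelope model of display power vs. scene brightness.
# Every number below is an illustrative assumption, not a measurement.

def oled_power_w(apl, base_w=25, full_white_extra_w=75):
    """OLED: panel power scales roughly with average picture level (APL, 0..1)."""
    return base_w + full_white_extra_w * apl

def lcd_power_w(apl, backlight_w=60, electronics_w=15):
    """LCD (no local dimming): backlight runs at constant brightness, so APL barely matters."""
    return backlight_w + electronics_w

for apl in (0.1, 0.5, 0.9):  # dark scene, average scene, bright scene
    print(f"APL {apl:.1f}: OLED ~{oled_power_w(apl):.0f} W, LCD ~{lcd_power_w(apl):.0f} W")
```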
-
[email protected]replied to [email protected] last edited by
it's not that specific. most modern displays are oled, and most efficient games use prebaked lighting, so the average gamer is probably playing a prebaked-lighting game on an oled display.
-
On an LCD display, the backlight is always on, but the crystals need power to align and let the backlight through.
A full white screen would in theory use more electricity than a full black screen. How much more, I don't actually know, but I'd like to find out.
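The honest answer is probably "not much on a plain LCD, since the backlight dominates", but if anyone wants to measure it themselves, a wall power meter plus a script that holds the screen full black and then full white makes the comparison easy. Minimal sketch with tkinter from the Python standard library (the 30-second hold is just my guess at how long a meter needs to settle):

```python
# Hold a full-black screen, then a full-white one, so you can compare readings
# on a wall power meter. 30 s per colour is an arbitrary choice.
import tkinter as tk

HOLD_SECONDS = 30

root = tk.Tk()
root.attributes("-fullscreen", True)
root.configure(bg="black")

def go_white():
    root.configure(bg="white")                     # second reading: full white
    root.after(HOLD_SECONDS * 1000, root.destroy)  # then close the window

root.after(HOLD_SECONDS * 1000, go_white)          # first reading: full black
root.mainloop()
```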
-
[email protected]replied to [email protected] last edited by
Alright sure, maybe. But LCD HDTVs are ubiquitous, and most people probably aren't looking to buy more displays. In a similar vein, early 4K adopters probably don't have reason to buy a new display... if they can just be happy with what they already have.
It is good enough to be the last thing to upgrade, especially looking at the chunk of cost it'd add when lumped in with PC/console cost. (Also, selling is probably not for everyone even if less-modern HDTVs had any resale value, and at ~42" you might not even get any quick takers even if it's free.)
A quick look at the Steam survey shows ~56% of users are still using 1080p and ~20% are using 1440p. If OLED remains almost exclusive to 4K and/or 240Hz, many will likely continue to ignore it.
Also, if you don't have the hardware+content, it doesn't really make sense. That's additional cost, and you may even need to look specifically for content that works well with OLED (if not created with it in mind). Broadband availability/cost and streaming enshittification may be factors here too.
So I see this as a long way off for mass adoption, similar to VR.
-
[email protected]replied to [email protected] last edited by
I guess if you disable the computer's fan, yes.
-
[email protected]replied to [email protected] last edited by
Greetings fellow time-traveler. What model of entropy-reversing computer fan do you use?
-
[email protected]replied to [email protected] last edited by
A few things:
- I disagree that LCD is good enough, especially for living room gaming. Switching to OLED is the best and most significant upgrade I've ever done, by a long way.
- In terms of the Steam survey, again no arguments from me: OLED monitors are rare. I was arguing that OLED TVs are not.
- There's no such thing as content that works well with OLED; everything looks significantly better, especially with HDR, which almost everything has supported for a significant period of time.
- As someone that has been using an OLED TV for 5+ years, burn-in really isn't an issue. There's not a trace of it on either of my TVs, or on any of my portable devices with OLEDs. The only time I've ever experienced burn-in on an OLED was a Nexus 5, which is so long ago that it's almost irrelevant, and even then the only reason it ended up with burn-in is because I enabled the developer option to keep the screen on at all times, resulting in the status bar burning into the screen. All modern OLED displays take burn-in into account and run screen cleaning occasionally, which isn't noticeable as the screen just appears black. So unless someone is running a news channel with a static logo 24/7, they're not going to have issues with burn-in. It's worth noting I have an OLED TV on my desk too (that one was indeed on sale, for ~400 IIRC), and it has static content such as an Apple logo (work laptop) on it for hours each day, with no burn-in.
-
[email protected]replied to [email protected] last edited by
Why reverse entropy? I just throw the computer in the trash when it burns out so I can buy a new one every month. Mass consumer society is so greaaat.
-
The problem with those virtual lamps is that when you look away, the light turns off but the heat doesn't.
-
[email protected]replied to [email protected] last edited by
Looks like we're from different galaxies, as I've never seen an OLED display for a PC with my own eyes (I know they exist, but they're extremely rare where I am).
-
[email protected]replied to [email protected] last edited by
You sound like you're already at the higher end, so obviously not who I was talking about. Perhaps I should've said "for most people", but really cost is a multiplier here, so maybe similar tech will become the norm some day due to advancements (as I mentioned in the edit).
Part of my thinking (aside from the not-high-end point) with the survey was that people could be using Big Picture mode for living-room OLED gaming, but seemingly aren't (unless they have an older OLED that isn't 4K?). Some people even still prefer their retro stuff (even 4:3 content) on CRT tech, rather than filters and/or upscalers.
Also, I just saw a video (L1T) about two options for $180 4K HDR IPS displays. Not sure if that's a new low, but I'll keep waiting (though I may be an outlier, going for free content that isn't the highest quality even by 1080p standards), also because it's on Amazon.
> There's no such thing as content that works well with OLED
I think you know what I mean. A daylight scene is going to look great on the display I mentioned above (and there may be higher-end non-OLED options too). Side-by-side there might be a difference, but it's diminishing returns for the actual experience.
Where OLED-like tech excels is darker content (near-perfect if not perfect black, which IPS etc. won't match). I could see somebody buying this tech for horror games/content (especially Dead Space with its diegetic UI). Maybe for space content, but even then the stars need to be sparse or very under-exposed (white stars, dimmer clusters/interstellar clouds if any) to get a contiguous field of perfect black between the stars.
So stylistic choices really make or break it here. For example, I actually do have an OLED display (a phone I got free* because the screen is cracked), and in the movie Wall-E there are just a few bits with near-perfect darkness that work really well (some transitional moments, Wall-E's trailer when unlit, robot PoVs where the letterboxing looks like it's part of the mask)... but even there it usually isn't space, as most of the space shots are pretty bright (some in the intro are darker), like the rest of the movie.
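If anyone wants to sanity-check a particular scene, counting how much of a frame is actually near-black gives a rough idea of how much the per-pixel "off" would pay off. Quick sketch with Pillow; the file name and the luma threshold are placeholders I picked, not anything standard:

```python
# Count how much of a frame is "effectively black" (where OLED's per-pixel off
# actually matters). "frame.png" and THRESHOLD are placeholder assumptions.
from PIL import Image

THRESHOLD = 16  # 8-bit luma below this counts as near-black

img = Image.open("frame.png").convert("L")   # rough luma via grayscale conversion
pixels = list(img.getdata())
near_black = sum(1 for p in pixels if p < THRESHOLD)
print(f"{100 * near_black / len(pixels):.1f}% of the frame is near-black")
```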
My mention of burn-in wasn't that I think it's a huge issue, but that it's still a worry. Searching on it, I was still seeing videos about burn-in; one from a year ago was about a then-new display that developed it because mismatched-aspect content caused the panel to over-drive too much (which is unfortunate, as that should be a great use-case). Wear leveling still sounds a bit scary long-term to me, especially at the higher cost.
Other model-dependent issues I was seeing were VRR flicker and font rendering (sub-pixel arrangement). I also saw someone complaining about HDR support in general (games and even creation tools, Windows, etc.) from that same period a year ago (it could be better now, but I'm betting this also leaves a lot of older titles that are unplayable unless some mod/tonemapper etc. can be used).
*= the person who gave it to me seemingly didn't even know what OLED is, and forgot that I'd pointed it out