"Final Step to Achieving "Dream OLED" LG Display Becomes World's First to Verify Commercialization of Blue Phosphorescent OLED Panels"
Posted by Dakhil@reddit | hardware | 231 comments
Weird_Tower76@reddit
Ok so does it mean they're closer to QD OLED in terms of color gamut or just brighter? If WOLED or whatever this tech is called can compete with QD OLED on colors (and especially if it's brighter, which LG generally wins on), then LG will win the OLED market pretty easily. Right now, QD OLED just looks better even if it's generally not as bright on monitors.
ExtensionChance6353@reddit
QDOLED is just a myth that they’re still trying to push that can never happen. I’ll explain: QDLED is a technique whereby instead of the usual 3 (red, green, blue) colors, they use 4 (+white). However, in order for this to work, all previous tech (blu-ray players, cable, DVR boxes, satellite transmissions, video games) would have to be programmed to put that extra white diode to use. The rest of the time, as in anytime you’re watching/playing something meant for the general public, you’re still just using RGB. *Sony & Magnavox tried the same thing on CRT TVs in the early ‘90s, introducing a white light to the cyan, magenta, yellow. It was touted and believed to be the best thing ever…until it was discovered that the white light simply never came on and was a waste of time and money.
StickiStickman@reddit
It still easily hits 1000 nits. Anyone who needs more than that needs to get their eyes checked. Even 600 nits is usually too bright for me even in a well lit room
ryanvsrobots@reddit
All of these monitors only do max 270 nits full screen, which is not very good.
HulksInvinciblePants@reddit
Good in what sense? Peak 100% window is not a reflection of real world content. I certainly wouldn’t want to push my excel sheets that high.
Strazdas1@reddit
It is a reflection of real world content when you use it for productivity.
HulksInvinciblePants@reddit
I just don’t really believe anyone here has a gauge of what nits actually mean. I run two calibrations on my monitor. 120nits for dark room and 180 for bright. 250+ is excessive outside of direct sunlight for a monitor near your face. Your excel sheet shouldn’t hurt your eyes.
Strazdas1@reddit
Nits is not a great measure, but people use it quite commonly and it's easy to spell/say.
180 would be fucking unusable in a bright room for me. I'm looking at 250 as a bare minimum for productivity, much more for gaming/movies.
Direct sunlight is 1000+ nits territory if you want to actually see what's on the screen. Just remember that a bright room itself is about 10,000 nits. Sunlight is 100,000+ nits. It's not getting anywhere near close to hurting eyes outside of a dark room scenario, and most people don't have dark rooms for their computers.
HulksInvinciblePants@reddit
I use cd/m2, but as you said nits is just easier for conversation. There are tons of factors in play, sunlight being a significant one. But with my two window room, and my 180 calibration, I’m fine until the sun goes down at which point it’s too bright.
My living room TV, which has 3 large windows directly facing it, is calibrated at 160 for all SDR content. Oddly enough properly mastered HDR content is a bit dimmer (as midtones are usually closer to 100), but my peaks are around 1500.
ryanvsrobots@reddit
Good compared to the other monitor technologies.
HulksInvinciblePants@reddit
Except, once again, it’s not reflective of real world content.
ryanvsrobots@reddit
I have a monitor that can do 100% 600 nits. I have no idea what you're talking about.
HulksInvinciblePants@reddit
I mean you’re not even speaking in complete terms, so it’s no wonder you don’t know what I’m talking about. I highly doubt you’re pushing 600nit APL on a monitor near your face. I also doubt you’ve confirmed it with a spectro.
ryanvsrobots@reddit
https://www.rtings.com/monitor/reviews/innocn/27m2v 800 nits sustained 100%
HulksInvinciblePants@reddit
Not full screen dude. Again, you’re not even understanding what you’re reading.
ryanvsrobots@reddit
It's not highlights, it's 100% of the screen. What do you not understand about "Sustained 100% Window"
HulksInvinciblePants@reddit
You have to have a pretty catastrophic misunderstanding of HDR mastering to think a game is outputting 800 nits across the entire image.
ryanvsrobots@reddit
That's a different test dude and that one is over 1000 nits
HulksInvinciblePants@reddit
That was what you were implying. I know what test patterns are and do bud. However, you claimed you were looking at a 600nit APL in BF, and that’s simply not true. You used the Destiny line as a counter. It was just wrong.
ryanvsrobots@reddit
I was talking about "Sustained 100% Window," that's why I said "Sustained 100% Window." Not talking about test patterns, I'm talking about "Sustained 100% Window" brightness. BF snow scene is mostly a white sustained window. It's not that complicated.
HulksInvinciblePants@reddit
It certainly is more complicated than that. Again, you’re not applying these concepts correctly. Test charts do not represent real world signal calls.
A snowy tundra in BF's HDR container is not requesting the television to hit 600+ nits on white. It's without a doubt much lower. The ability to process a signal anywhere in between 0-1000 is the dynamic part of High Dynamic Range.
Look how washed out an image can look with an average far lower:
https://cdn.arstechnica.net/wp-content/uploads/2018/02/hdr-15a-scaled.jpg
ryanvsrobots@reddit
It's requesting significantly more than 270, the brightness limiting is very obvious. And who said I'm abiding by what the container is requesting?
ryanvsrobots@reddit
It can and will if the display is capable. You'd know if you have ever used a display capable of it. The full range of SMPTE ST 2084 is 10,000 nits.
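For reference, here is a minimal sketch of the ST 2084 (PQ) decode curve mentioned above, using the constants published in the standard. Nothing display-specific, just the math of the container; the example signal values are my own illustration.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF: maps a normalized signal
# value in [0, 1] to absolute luminance in nits, topping out at 10,000.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded value (0..1) to luminance in nits (cd/m^2)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))  # 10000.0 nits: the very top of the container
print(pq_eotf(0.5))  # ~92 nits: mid-signal values sit far below peak
```

Note how a signal value of 0.5 already decodes to only ~92 nits, which is why a scene's average level can be far below the container's peak.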
HulksInvinciblePants@reddit
Trying to sound authoritative on the subject after numerous stumbles, up to this point, is peak Reddit. I'm a professional in the space. I have 8 displays of various types and capabilities, all calibrated. I have and regularly use measuring equipment that demonstrates what the values translate to in the real world. It's 100% not related to what you're describing.
You haven’t said a single accurate thing on the subject of HDR. You simply ignored the word for word description of RTINGS’s HDR methodology that I quoted. It’s not an APL reading.
ryanvsrobots@reddit
I haven't stumbled, you're just talking about something different than I am. Are you on ADHD meds? You're hyper fixating on something and assuming that's what I'm talking about. I'm not. Mastering is completely irrelevant. APL is completely irrelevant.
And congrats on having a job but you're not acting professional at all. Peak reddit is using appeal to authority because of a job you may or may not actually have, despite the data being plainly visible. I am a colorist for TV and it doesn't matter because that's not what I am talking about. If you want to talk about something else, find someone else to talk at.
Strazdas1@reddit
I constantly have issues with one of my monitors because it peaks at 360 nits and in many situations (such as a bright day outside) it's not enough.
Nicholas-Steel@reddit
What area size? I expect such high brightnesses would be over a 5% or smaller area of the screen.
veryrandomo@reddit
For QD-OLED monitors it's only 1000 nits in a 2% window
veryrandomo@reddit
Yeah it "easily" hits 1000 nits... if 98% of the rest of your screen is entirely black/turned off.
Saralentine@reddit
“Can’t see more than 30 FPS” vibes.
Equivalent-Bet-8771@reddit
That's not how technology works. If the panel can hit 1000 nits then it will have a long life at 100 nits. There is always a need to push the brightness further to increase the performance of the panel.
You are in the wrong subreddit bud.
Turtvaiz@reddit
Or is it you that needs their eyes checked if it is "too bright"?
cocktails4@reddit
My A95L is so bright I don't know if I really want it any brighter. Like damn. Do we need TV to sear our retinas?
Dood567@reddit
QD OLED is doing pretty damn good compared to WRGB anyways. Brightness in OLED has two parts.
Full screen brightness is difficult because of the power draw, e.g. going full-field white.
Peak brightness can be difficult in really small patches if the individual pixels aren't bright enough. This is what's more noticeable with bright flashes and stuff. The peak brightness numbers measured off an OLED come from 10-25% window measurement a lot of the time. That's a sweet spot between having enough pixels grouped together to put out a lot of light, and not having so much power draw across a 100% filled window that you need to dim the pixels a bit.
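A toy model of that trade-off (purely illustrative numbers on my part, not any manufacturer's actual ABL behavior; the 1000 and 270 nit figures are just borrowed from the monitors discussed above):

```python
# Toy ABL model: assume the panel has a rough power budget set by what it can
# sustain full-field, plus a per-pixel ceiling it can hit in small windows.
# Both numbers below are assumptions for illustration only.
PEAK_NITS = 1000        # assumed per-pixel limit at small window sizes
FULL_FIELD_NITS = 270   # assumed sustainable luminance with 100% of the screen lit

def max_sustained_nits(window_fraction: float) -> float:
    """Rough max luminance for a white window covering this fraction of the screen."""
    # Spread the full-field power budget over the lit area, capped by the pixel limit.
    return min(PEAK_NITS, FULL_FIELD_NITS / window_fraction)

for w in (0.02, 0.10, 0.25, 0.50, 1.00):
    print(f"{int(w * 100):3d}% window -> ~{max_sustained_nits(w):.0f} nits")
```

In this simplified model the curve flattens out somewhere around the 10-25% window sizes, which is roughly where the measured "sweet spot" shows up in reviews.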
CoUsT@reddit
All current monitors and TVs are insanely darker than outdoor sunny light, and yet that doesn't burn our retinas. We could probably have 10x brighter displays and it should be fine, and probably better for our eye health, because apparently lack of light causes shortsightedness, plus it would make things look more natural.
In the end brightness is adjustable so that's good I guess.
Jensen2075@reddit
That's b/c we're not staring at the sun.
djent_in_my_tent@reddit
Yeah, I’m over here trying to figure out what the fuck must be wrong with my eyes because I use my QD-OLED monitor at 5% brightness
Not out of trying to preserve it — it’s my genuine preference
HulksInvinciblePants@reddit
This isn’t so much about brightness as it is removing the white sub-pixel and its drawbacks.
BFBooger@reddit
Sometimes I get the impression that people put their TV in direct sunlight or something.
With all the comments here about 1000 nits not being good enough, most of them reference the sun. Yeah, I get it, your smartphone needs high peak brightness. But your living room TV? The room might be bright, but it's not right in the direct sun.
Some outdoor sports-bar sort of TVs, sure, those need to be bright, but they don't need the greatest quality HDR or response times or black levels, so just some high brightness LCD tech is fine. A bar owner would be a bit crazy to pay for more than a cheap durable bright screen with decent viewing angles. Better off to have 3x $400 screens than one $1200 screen for that situation, so I don't see why this sort of 'needs to be very bright' requirement comes into the home entertainment/gaming discussion.
Weird_Tower76@reddit
That's how I feel about my 2000 nit modded S90D but I don't get that in monitor form
rubiconlexicon@reddit
The 4 stack WOLED panels are already catching up to QDOLED colour gamut, although still a little behind.
unknown_nut@reddit
It's already pretty close with their recent LG G5. I hope it beats QD OLED because the raised black is noticeable even in a dark room. I have both WOLED and QDOLED monitors next to each other in a dark room.
LosingReligions523@reddit
new LG G5 will use this new panel.
Pros:
Yeah, it is pretty much a huuuuuge upgrade over the rest of the OLEDs at the moment.
HulksInvinciblePants@reddit
G5 is still WRGB…
Weird_Tower76@reddit
Damn. If this was 48" and 240hz I'd replace my monitor and go TV mounted again.
pholan@reddit
LG’s G5 uses their Primary RGB Tandem panel without a white subpixel, so it should have similar color volume to QD OLED, and early reviews suggest it can get monstrously bright. Early reports suggest it has issues with banding in colors very near black but I’m not sure if that can be fixed in firmware or if it will need a hardware revision.
CeeeeeJaaaaay@reddit
G5 is still RGBW
pholan@reddit
As far as I can tell that’s only true for its largest and smallest model. For all the other models it’s using a color filtered white OLED emitter without a dedicated white subpixel.
CeeeeeJaaaaay@reddit
https://youtu.be/Hl7yTFtKois?si=4Ui9TW4dgHNoG6zr
2:55
If they dropped the white subpixel it would have been much bigger news.
LG Display is exploring production of an RGB panel for the end of this year, so we might see 2026 monitors and perhaps TVs with it.
HulksInvinciblePants@reddit
It would have been huge.
pholan@reddit
Well, I was wrong. I was under the impression that they’d taken advantage of the higher brightness of their new primary RGB tandem emitter to ditch the white subpixel. I guess that evolution is reserved for their monitor line early next year or very late this year.
JtheNinja@reddit
It allows lower energy use (and better burn in resistance?) for a given brightness. This could - COULD - allow them to stop using the white subpixel, which is a big reason their panels have better brightness but worse gamut volume than QD-OLED. I believe LG Display has RGB-only OLED panels on their roadmap, so this is likely part of the plan for that.
AnthMosk@reddit
Well. Guess my next TV will be in 2030 or so
the_nin_collector@reddit
Why? Enjoy what they have now, get a new TV in 2030 if you want.
Pointless to always wait for the next thing, the next thing is now, and the next thing will always be later as well.
Strazdas1@reddit
I can't enjoy what they have now. My use case is such that I would deal with burn-in issues in a matter of months. Hopefully the "dream OLED" will solve that. One can "dream".
the_nin_collector@reddit
Burn-in happens around 15,000-30,000 hours.
Six months of 24/7 use is only about 4,400 hours. You can literally leave a modern OLED TV on for a year and a half, nonstop, without worrying about burn-in.
If you left an IPS monitor on for 5,000 hours without turning it off, it would have a higher chance of burn-in than a modern OLED, because modern OLEDs have built-in burn-in protection that will shift pixels, cycle power, and do half a dozen other things.
I would love to hear what your "use case" is that means you can't get an OLED.
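Quick back-of-the-envelope numbers on those hour figures (my own arithmetic; the 15,000-30,000 hour range is the claim above, not a measured spec):

```python
# How long it takes to accumulate the claimed burn-in hour range at different
# daily usage levels. Pure arithmetic, no panel data involved.
HOURS_TO_BURN_IN = (15_000, 30_000)  # the range claimed above

for hours_per_day in (4, 8, 16, 24):
    per_year = hours_per_day * 365
    low, high = (h / per_year for h in HOURS_TO_BURN_IN)
    print(f"{hours_per_day:2d} h/day -> {per_year:5d} h/year, "
          f"hits that range in {low:.1f}-{high:.1f} years")
```

Even at 16 hours a day, that range works out to roughly 2.6-5 years of use.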
Positive-Bonus5303@reddit
That statement is worthless without defining the usage parameters and specifying what amount of burn-in you consider to be burn-in.
If you use the rtings.com burn-in tests as a reference, expect to be able to find burn-in after a few months of heavy usage.
the_nin_collector@reddit
What's heavy usage? I have had my LG BX since 2020.
Some days I use it 12-14 hours as a PC monitor and for playing games. Hours a day on the internet, emails, reddit. I am a teacher and get 4 months off a year, so during my semester break I am on this thing a good 10-15 hours a day, multiple times a week. How much heavier is it going to get?
Positive-Bonus5303@reddit
Throw some test images on it; unless you run yours very dim there should be burn-in. Every rtings OLED burn-in test shows burn-in after a few months.
https://www.rtings.com/tv/learn/longevity-results-after-10-months
A main driver for burn-in is brightness. I assume you don't drive yours near the max (which I wouldn't either). Halving the brightness will more than double the time it takes to experience the same amount of burn-in.
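That relationship is often described with an acceleration-factor model, roughly lifetime ∝ 1 / luminance^n. A sketch with an assumed exponent (real panels vary; this is illustrative only, not a measurement of any specific display):

```python
# Rough OLED lifetime scaling: LT ~ 1 / L^n, where n is an acceleration factor
# typically quoted somewhere around 1.5-2 for OLED emitters. n here is assumed.
def lifetime_multiplier(brightness_ratio: float, n: float = 1.8) -> float:
    """How many times longer the panel lasts when driven at brightness_ratio of reference."""
    return (1 / brightness_ratio) ** n

print(lifetime_multiplier(0.5))   # ~3.5x: halving brightness more than doubles lifetime
print(lifetime_multiplier(0.85))  # ~1.3x: even a 15% drive reduction helps noticeably
```

With any exponent above 1, halving the drive level more than doubles the time to the same degradation, which is the point being made here.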
the_nin_collector@reddit
I run mine at 99 brightness.
Positive-Bonus5303@reddit
Then the question would be, what do you do differently? Given your proclaimed usage time and brightness settings there should be burn-in. Have you tried some test images?
the_nin_collector@reddit
I mean... If I use my PC for 6 hours on a slow day, upwards of 14 hours on a weekend or holiday, and don't see any burn-in, why go out of my way to find it? I don't see it. I don't notice it.
It's like you all WANT burn in.
I don't know. There are anti-burn-in measures. Maybe they are working. Maybe because of my ADHD, going from reddit, to a movie, to a game every 30 minutes is more than enough to mitigate a lot of the burn-in risk.
If I am using Word to type or edit a paper I will use SDR brightness at 50, or if I am in a long Zoom call for work I will use SDR brightness at 50, but 98% of the time I use HDR settings with brightness at 99.
Strazdas1@reddit
Burn-in happens based on brightness and the image displayed too. My use case is bright static UI elements 16 hours a day.
Also that 15-30k number assumes you use the mitigation features. That would not be possible (and would actually be a dealbreaker) for me.
Other than shifting pixels, which won't do much in my case, all the others require the monitor to be off at the time.
SJGucky@reddit
It will also be my next PC monitor.
But I also have an LG OLED right now. It is just not as bright as newer models and has a bit of burn-in and 1x dead pixel. :D
Neither the burn-in nor the dead pixel is noticeable unless you specifically search for it.
taicy5623@reddit
My C2 has a max of like 800 nits.
I'm having to downgrade the dynamic range of the TV because 800 nits is still bright enough to trigger my astigmatism.
goa604@reddit
Exactly like me minus the dead pixel. I got it for 500€ NEW
BioshockEnthusiast@reddit
Also kinda pointless to get a new TV if the market doesn't have a current option that carries enough value to warrant an upgrade.
astro_plane@reddit
OLEDS are awesome but they’re too expensive. They won’t catch on until the price goes down. I got my almost new C2 for a very good deal so that’s the only reason I own one, they’re pretty much the modern day PVM’s imo.
nVideuh@reddit
You think they’re too expensive? They’re cheap now compared to what they were when they first came into the market. Sony OLEDs are even more expensive but have better image processing than LG.
BlackBlueBlueBlack@reddit
Yeah but they're still expensive
R1chterScale@reddit
The burn-in is also a deal breaker for monitors if you're gonna do any office work on a PC.
the_nin_collector@reddit
Maybe for an office, but why would you want an OLED for an office anyway?
But I have used an OLED exclusively as my PC monitor for 20,000 hours now, an LG BX, and not a single issue. Not a dead pixel, not the least amount of burn-in.
R1chterScale@reddit
Not "an office", "office work" a lot of people will use their at home monitors for things like excel, programming, and the like
BioshockEnthusiast@reddit
Understandable. I've got a bunch of really decent non-OLED monitors that I'm happy with, and honestly I expect them to last years. I'll look at replacing them when I need to replace them. I can't be the only one especially now with the tariff bullshit. Everyone I know personally and professionally has battened down the hatches in terms of IT expenditure.
EducationalLiving725@reddit
My next TV will be in the next couple of months, and it will be miniled Bravia 5. Oled is SUPER overhyped currently, and miniled in the same price bracket will be better. Moving from C2 77"
SeraphicalChaos@reddit
I don't think it's overhyped; both technologies have their pros and cons. OLED is hard to beat in a dark room or while gaming.
It's not for me though... I essentially use my TV as a dumb computer (HTPC) monitor and OLED doesn't really fit well in the long term with static elements, so it makes for an unlikely purchase with my use case. I want to keep my TV for longer than 6-7 years and the thought of having to toss it because of burn-in just doesn't sit well with me. I also refuse to be that person who has to baby tech, using it on its terms, in order to keep it working properly.
EducationalLiving725@reddit
I mainly game (PC -> HDMI) and watch anime with subs.
In both these scenarios miniled is far brighter, juicier and superior. Maybe if I'd watch some noir cinema - I'd start to love perfect blacks\grays, but well...
MonoShadow@reddit
Subs bloom like a mf on mini-LED. Depending on the setup, the whole bottom of the monitor can be 100% on even in dark scenes.
I've been using a C2 as a PC monitor for 3 years now. I like it. But I can create a perfect env for it, aka a dark room; it's also glossy, so a lot of reflections.
At the same time, if you tried both tech and lean towards one, then more power to you.
Strazdas1@reddit
Sectioned blacks can help. Although it is annoying when subs have this glowing halo around them in an otherwise dark scene. But it's more like 10% of total screen area being lit from subs, not 100%.
EducationalLiving725@reddit
Anime & Games are full screen, without cinematic black bars - so, no problems with bloom at all.
SeraphicalChaos@reddit
Not sure you deserved all the downvotes. Anime is usually full of pretty bright colors and hardly ever fully dark scenes. Properly set subs won't cause much if any blooming. Maybe we got a bunch of Goblin Slayer fans on this sub 😏.
One of the biggest sells for LED LCD is that they can get quite a bit brighter (almost blindingly so on the newer, high end models) than their OLED counterparts. If that's what you value, then you've got a valid claim.
An OLED will still have the edge in response time / motion handling over LCDs though.
EducationalLiving725@reddit
Herd mentality I guess. Especially, when I owned both oled and qled, and saw everything by myself.
Gretaphor@reddit
Omg, "herd mentality I guess"
Nah dude, people just think it's a stupid take. Not herd mentality. You have a shit outlook/opinion.
God damn you think you're a genius and it's hilariously pathetic.
Keulapaska@reddit
Grey/semi-transparent subs instead of pure white help a lot, as does not watching off-axis. Sure it's a bit annoying that they will "change" (appear to change? idk how it works) colour based on what's on the screen and appear grey in high brightness scenes and white in darker scenes.
-Goatzilla-@reddit
OLED is the standard for watching movies and cinematic TV shows at home in a dark room. Mini LED is better for everything else.
EducationalLiving725@reddit
yeah, I dont watch this slop
Excellent-Knee3507@reddit
https://youtu.be/Xb2k7hcDKPs
atomicthumbs@reddit
movies are slop?
EducationalLiving725@reddit
Yes? Almost everything that has been made in the last 10 years or so.
conquer69@reddit
How would you know? You said you didn't watch it.
Ar0ndight@reddit
You're downvoted to hell but that's just the OLED cabal; for some reason people are super tribalistic when it comes to this stuff (and I say that as an OLED C1 owner).
A good miniled display with enough dimming zones is better for most uses. Only in a dark room, watching very dark content, does OLED edge it out. I have both that C1 and a MBP and there's no arguing to me that the miniled display of the MacBook is simply better. Content looks better on it, in no small part because of how bright it gets, while blooming is pretty much non-existent outside of very edge scenarios.
HulksInvinciblePants@reddit
I mean, many of us own or use multiple displays. I have 2 OLED’s, 1 plasma, 1 full array LED, 1 mini LED, a CRT PVM, and a projector.
I have a previous career in color management software and follow display technology closely. When I see people talking about brightness in a vacuum, it's a pretty clear indicator to me that they think Quality = Brightness. Unfortunately that's not how it works.
Without any stats behind what you consider “better”, that designation holds no weight. There are literally a dozen factors that have to be considered when comparing like for like. Being brighter is a preference, especially when it's outside spec. It doesn't make something better. If a film is mastered in HDR with 100 nit midtones, boosting APL to 350 is simply a manipulation.
TheAgentOfTheNine@reddit
miniled is nice for high brightness content, but it still pales against the cheapest oled in contrast and blacks.
And content tends to be on the darker side.
I am 100% getting an OLED/QD-OLED tv for the next one this or next year.
Alive_Worth_2032@reddit
Some of the newer ones are crazy good vs the past. Sure it's not OLED, but several thousand backlight zones mitigate a lot of the delta that existed in the past.
While they will never be truly as black as an OLED, and there will always be some minor blooming and bleed, higher brightness can in many cases make the perceived black level comparable to OLED.
Contrast is as much about perception as real world measurement. Higher brightness improves perceived contrast as well, just as with blacks.
The human eye and brain are already making up an imaginary reality. There is more to perceived image quality than clinical measurements.
I feel like a lot of people who are salivating over OLED have never actually put it side by side with a top of the line LCD in a real world setting. They both have things they excel at. If you have a dark room the OLED will win; if you are in a daylight setting the LCD will often win.
And I am talking about winning here in the sense of what people will perceive is the better looking display.
chapstickbomber@reddit
My G9 miniLED literally tans my face.
EducationalLiving725@reddit
In my case almost all content is bright, and OLED just not bright enough.
mduell@reddit
What has you dropping a C2 in favor of miniled?
EducationalLiving725@reddit
Not enough brightness
AnthMosk@reddit
I got a Samsung S90D a few months ago. Was shopping to go bigger than 65" but the price delta to go bigger is still so insane.
Capable-Silver-7436@reddit
same. my current oled will be 10 by then anyway
Intelligent_Top_328@reddit
After this dream there will be another dream.
This is so dumb. There is no end game.
Yearlaren@reddit
There has to be an "end game". Displays can't keep improving forever.
Asleep-Card3861@reddit
Depends what one considers a display. To some degree design is never complete, as there are so many factors pushing one way or another, sometimes at odds with each other. Sure, at some point there are likely diminishing returns, but the juggling of factors will likely continue.
There is probably some wild tech yet to come. Like a self assembling ‘screen paint’. You paint a surface and its nano particles communicate between themselves to display a screen that harvests the wireless display signal to power them and utilises cameras within the space to track your eyes and provide depth cues
Yearlaren@reddit
Even considering all the possible opinions on what a display is, nothing can improve forever.
Asleep-Card3861@reddit
I didn’t say improve forever. Variations could go on for a long time though; forever is an unfathomably long time. Displays have been around since the 1920s, so roughly 100 years. I wouldn’t be surprised if in the next 100 years changes are so great that the need, concept, and use of displays becomes irrelevant, rendering the notion of ‘forever’ moot.
WuWaCamellya@reddit
We have really always had the same end goal, it has just been slow getting there. Once we have true RGB stripe panels that's literally it. Any other improvements would just be, idk, burn-in improvements? More resolution and refresh rate options at more sizes? Maybe brightness, but my eyes get seared if I go above like 80% on my QD OLED so idk if that much more is needed. Idk, I just feel like the only real image quality related thing left is a proper RGB stripe subpixel layout; aside from that we are there.
reallynotnick@reddit
We could push for more subpixels per pixel for an even wider color gamut, though I’m not sure there would be a huge desire for that as rec 2020 is quite good. I read something awhile back where they were proposing a color gamut that covered all visible light and to get close to covering that we’d need more pure colored sub-pixels I think they proposed like a cyan, yellow-green and magenta.
JtheNinja@reddit
https://www.tftcentral.co.uk/articles/pointers_gamut.htm
Rec2020 is about the practical limit of what can be done with 3 physical RGB lights. It’s possible to tweak the primaries slightly to get more XYZ coverage, but the result clips off some of DCI-P3 in exchange for some neon cyan colors that rarely occur IRL. So not really worth it. Anything wider than Rec2020 - and it’s questionable how useful that would really be - would require 4+ primaries.
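For reference, the BT.2020 primaries in CIE 1931 xy (standard published values). They correspond to monochromatic sources at roughly 630, 532 and 467 nm, i.e. they already sit on the spectral locus, which is why a three-emitter gamut can't get meaningfully wider:

```python
# BT.2020 chromaticity coordinates (CIE 1931 xy), as published in the standard.
# The primaries are monochromatic (~630 nm red, ~532 nm green, ~467 nm blue),
# so they lie on the spectral locus itself.
REC2020 = {
    "red":   (0.708, 0.292),
    "green": (0.170, 0.797),
    "blue":  (0.131, 0.046),
    "white": (0.3127, 0.3290),  # D65
}
```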
Equivalent-Bet-8771@reddit
No we are not there. These panels are still not bright enough under sunlight and they still get very very hot near max brightness.
TK3600@reddit
That only matters for phones.
Thotaz@reddit
So you close the curtains and turn off the light and sit in complete darkness every time you use your TV in the living room? What does the rest of the family say to that?
Strazdas1@reddit
What I learned talking with people like that is that they build a separate room specifically for the display. Because, you know, if you can't afford a home theater you shouldn't have a screen.
TK3600@reddit
My monitor literally has a window behind it every day, no difference whatsoever.
Strazdas1@reddit
Or people who don't live in black holes.
Equivalent-Bet-8771@reddit
Of course you never take the laptop out of the underground cave.
TK3600@reddit
Unnecessarily aggressive, but ok.
Equivalent-Bet-8771@reddit
I have to be. You're downplaying a cool technological innovation because you're short-sighted and simply don't care.
gayfucboi@reddit
Phones are pushing nearly 2000 nits these days. It matters. If you can drive these panels less aggressively then the burn-in problem becomes smaller.
TK3600@reddit
One day we need a radiator for monitor lol.
GhostsinGlass@reddit
Some nutters watercool theirs.
kirsed@reddit
Pretty sure a lot of OLED monitors do have a fan and I would assume that's connected to a radiator.
StrategyEven3974@reddit
It matters massively for Laptops.
I want to be able to work on my laptop in direct sunlight and have full perfect color reproduction at 4k 120p
Strazdas1@reddit
so literally the most important aspect?
rubiconlexicon@reddit
You say that as if we're gonna have 10k nit peak brightness or full BT.2020 coverage any time soon, even once RGB OLED panels are introduced.
arandomguy111@reddit
There's a difference between endgame in the sense of only expecting iterative improvements to current technology vs. disruptive technology.
For example LCDs (non FALD) are now what you can term the endgame. Yes they will keep getting better but you aren't likely to get much benefit by holding out another year or even a few years. Something disruptive to that would be FALD or OLEDs.
While with OLEDs next years model can still be significantly better in terms of capability and/or cost.
Ok-Wasabi2873@reddit
There was with Trinitron. Loved it except for the wire that you could see.
Asleep-Card3861@reddit
they were lovely displays, but those wires irked me something fierce.
some top tier plasmas were decent, Panasonic in that case.
noiserr@reddit
I regret getting rid of my CRTs. There was just something magical about them that I now miss.
Jeep-Eep@reddit
It took until 2022-3 or so for gaming LCDs to match high grade CRTs in good condition, and even then the price can be a little wince worthy.
wpm@reddit
They can still be found for cheap on local marketplaces if the seller didn't do any homework. Even so, I have no regrets on the few hundo I blew on my tiny Sony 8" Trinitron PVM. The magic is still there. They're definitely almost useless for modern stuff, but some things just demand a CRT, or just look better on them.
cocktails4@reddit
My laundromat has this massive Sony Wega built into the wall that probably hasn't been touched in 20 years. I want to ask the owner if it still works. Probably weighs 300 lbs...I don't even know how I'd get it down.
eugcomax@reddit
microled is the end game
DesperateAdvantage76@reddit
The endgame is optical antennas, which directly create any frequency of optical light needed for each pixel.
armady1@reddit
No, the true endgame is direct display neural injection which displays the image within your brain as an overlay on top of your normal vision.
Jeep-Eep@reddit
Fuck that, I am not dealing with the neural jack analog of Adaptive Sync technology shitting itself ON TOP of MSRA for gaming.
ReplacementLivid8738@reddit
Hope we still have ublock by then
FlygonBreloom@reddit
Holy crap, I never even considered that. That would be a huge boon for sharpness, and colour fidelity.
Jeep-Eep@reddit
Eh, at some point we'll get monitors to DAC-level maturity - you can splurge if you want to, but there will be a Sabre 32 equivalent panel, aka one that looks incredible and is not offensively pricey, that will go until it dies.
Daffan@reddit
End games are real, imo they are coming fast for everything. My wireless gaming mouse is almost at endgame, I don't see how anything can be much more perceptible to humans in that area at least.
ThinVast@reddit
According to UDC's roadmap, after phosphorescent OLED comes plasmonic OLED, promising even higher efficiency levels.
jedrider@reddit
Even worse. Now everywhere will look like Times Square or Shibuya in Japan.
ProtoplanetaryNebula@reddit
Of course. It's like when colour TV was invented, they didn't stop there and retire. Things just keep improving.
Vb_33@reddit
So only 15% less power consumption? This is still a compromise and short of the 100% luminous efficiency of dream OLED, no?
nephelokokkygia@reddit
I'm sorry but 15% is a LOT. That's almost 1/6. If you applied that reduction to a work schedule, it'd be like going from eight hours per day to under seven.
Positive-Bonus5303@reddit
additionally the 'top x%' are responsible for most of the burn-in. Wouldn't surprise me if those 15% end up doubling the degradation time given the same brightness.
Silent-Selection8161@reddit
Yeah but "modest progress made towards long term goals" isn't gonna get you to click now is it?
nday76@reddit
Does Dream OLED mean no burn-in?
MrMichaelJames@reddit
I have a lg oled 65” that I bought in 2018 that still has zero burn in. It’s used everyday. So almost 7 years old and still going strong. It’s had numerous game consoles and tv watching and no issues. I’m actually amazed but it keeps on going.
1eejit@reddit
My 2015 OLED also has no burn in at all either. I guess it's not really an issue for normal use cases.
Strazdas1@reddit
Do you also run bright static UI elements for 16 hours a day?
1eejit@reddit
...No
Strazdas1@reddit
That is normal use case for me. So until OLEDs can survive that, OLEDs are not for me.
Apprehensive_Seat_61@reddit
Don't kid yourself.
upvotesthenrages@reddit
It's far worse on monitors, pretty much because you will have tons of static objects that are displayed a huge % of the time.
With a TV that's far more rare.
reallynotnick@reddit
I wouldn’t be surprised if it has lost some brightness though, which one can argue is just even burn-in across the whole screen.
MrMichaelJames@reddit
Maybe but we don’t notice it. I’m sure if you put day 1 next to now it would show but on a whole there is nothing noticeable.
RedIndianRobin@reddit
My guy, there are mitigations in place in modern OLEDs such that you won't see any burn-in for 5 years, and almost all OLEDs now have at least a 3-year burn-in warranty. 1440p and 4K OLEDs are seeing a steep rise in popularity.
Strazdas1@reddit
The "mitigation features" are features that are a dealbreaker to begin with.
VastTension6022@reddit
Except that the "mitigations" are severely limited brightness that no LED based technology has to worry about.
RedIndianRobin@reddit
LEDs can have all the brightness in the world yet they still have mediocre HDR. OLEDs are the only display tech that can do true HDR.
trololololo2137@reddit
only laptop on the market with proper HDR is a mini LED, oled is too dim :^)
RedIndianRobin@reddit
Try harder. They're fine in a dark room. Besides, mini LEDs can never match the contrast ratio of an OLED, which is a far more important metric in HDR performance. I had the Neo G8 and it had mediocre HDR performance. The day I upgraded to an OLED, I understood what real HDR even is.
veryrandomo@reddit
The Neo G8 is also a mediocre mini-LED that frankly gets outclassed in HDR by budget $300 VA Mini-LEDS with a quarter of the zones.
Frexxia@reddit
Local dimming is fine for HDR, with the exception of extreme situations like star fields.
RedIndianRobin@reddit
I had a MiniLED with high zone count FALD, the Neo G8. While it was good, it still lacked the contrast OLEDs can give.
JtheNinja@reddit
Meanwhile, at Sony HQ they’re going back to LCD-based designs for their flagships TVs…
RedIndianRobin@reddit
They can have it. I'm not going back to any LCD tech in the future. Will ride out OLEDs until MicroLED reaches consumer market.
RobsterCrawSoup@reddit
There is such a gap in understanding between the people who are happy if a display lasts them 3 years and people like me who aren't really interested in a display if it won't last closer to a decade. I also know that because my computer is used for work 80% of time and browsing and games only 20% of the time, that my use case is a worst case for burn-in and the mitigation systems might help but they don't get these displays the kind of longevity that matters to some consumers. Since my TV is on infrequently and doesn't tend to display a static image, I'd be ok with a OLED TV, but for my computer, which is on, with mostly static UI, windows, and text for hours and hours each day, it would absolutely still be a problem.
Especially now that, in terms of resolution, color accuracy, refresh rate, latency, and pixel response times, we are soo close to having real "end game" displays, it makes it all the worse that OLED has a much shorter lifespan. If the tech is no longer going to grow obsolete, it is a shame that it doesn't last, when it could be perfectly adequate for decades if it did.
I'm typing this now on a 15 year old IPS display. I would like my next displays to last at least half as long. OLED is sooo tempting, but I just don't want a display whose picture quality will degrade over just a few years. That is why I keep hoping to see QDEL or microLED.
RedIndianRobin@reddit
Yeah if your PC is mostly for work, then OLEDs are the worst possible tech to buy. I hope MicroLED reaches consumer space soon.
DoTheThing_Again@reddit
Every tv technology has “burn-in”
TechnicallyNerd@reddit
What? With very rare exceptions, LCD panels don't suffer from permanent image retention issues at all.
DoTheThing_Again@reddit
Lcd and oled have different types of “burn-in”. As does plasma and crt. The word burn-in isn’t even the precise language for oled or lcd but it is a carry over word from the crt days.
Oled, led, cfl and even lcd ink all degrade.
TechnicallyNerd@reddit
Sure. That's why I used the phrase "permanent image retention" rather than the more colloquial "burn-in". Given OLED image retention issues are due to the diodes in each individual pixel getting dimmer over time rather than literally "burning" the image into the display with ye old CRTs, the more accurate terminology would be "burn-out".
Yes, everything known to mankind other than the proton (maybe) decays with time. But the speed and nature of the degradation matters. Please stop being pedantic for a moment and acknowledge that the comment asking about "OLED burn-in" is referring specifically to the permanent image retention issues induced by the non-uniform degradation of individual pixel luminance on OLED panels. LCD panels do not have self-emissive pixels and instead utilize a shared LED backlight. While the LED backlight does get dimmer with time due to aging, since the full panel is sharing a single light source this only results in a reduction in brightness rather than the permanent image retention seen on OLEDs.
DoTheThing_Again@reddit
Yes i will stop being pedantic. But my point is that people often misvalue objects that have a well defined expiration.
Realistic_Village184@reddit
That's just how language works. "Hard drive" is an umbrella term that includes SSD's in colloquial language. That's not "misvaluing"; it's just how people communicate.
It's like when someone asks if you can roll up the window or rewind the video. Obviously those terms aren't "precise" anymore if you're holding to the origins of those terms, but no one does because that's fundamentally not how language and human brains work.
DoTheThing_Again@reddit
I think we are talking past each other.
I am referring to years ago when people undervalued SSD vs HDD because SSDs had well defined write cycles and people wrongly miscalculated their everyday level of read/write load. People thought their SSD would die early, but that was very far from true, and HDD lasted longer than it should have in consumer products.
Realistic_Village184@reddit
Oh, I did misunderstand what you meant. My apologies. Early SSD's did have short lifespans, though. That was a legitimate concern in the early days of SSD adoption, especially from bargain bin suppliers.
DoTheThing_Again@reddit
In the EARLY days yes. But you people were saying that into the early 2010s when they were already mature
Strazdas1@reddit
SSDs matured somewhere around 2015. Before that there was high chance of buying a very short lifespan one.
Strazdas1@reddit
SSD is a hard drive. HDD is also a hard drive. If you were to say hard drive is furniture, SSD and HDD would be table and chair.
Frexxia@reddit
What
DoTheThing_Again@reddit
Lcd has ink in it, did you not know that?
Frexxia@reddit
No, there's no ink in an LCD panel. There's however a very thin film of liquid crystal.
DoTheThing_Again@reddit
Every single TV and large display I have ever owned has an ink color filter as part of the panel. I know some tech doesn't… but I know LCD definitely does. Point is that it all degrades; what we should be asking is how long it takes. And frankly, for normal use… they all last very long.
Frexxia@reddit
The process of creating color filters may involve ink, but I find calling that "LCD ink" incredibly strange.
DoTheThing_Again@reddit
You are right, i did say that weird
ryanvsrobots@reddit
It's not weird, it's wrong. There's no ink.
JtheNinja@reddit
You’re really glossing over how much faster OLED degradation happens in the real world compared to LCD and backlight wear.
DoTheThing_Again@reddit
I am really not. Many LED TVs actually last less than OLEDs; rtings did a long study on this. They found that higher end LED TVs lasted longer, but affordable LED TVs would just lose their backlight completely.
And a further point: if you are buying a high end QLED… you can afford an OLED and get the better picture anyway. But that is not a hard and fast rule.
Oled burn-in concern reminds me of all the people who thought they were gonna write a terabyte a month on the ssd for years, and so stuck to hdd.
Realistic_Village184@reddit
You're cherry-picking. It's not really meaningful to say that a bottom-budget cheapo LCD TV has components that fail. That's very different from OLED being a technology that inherently develops burn-in over time.
DoTheThing_Again@reddit
My point is that it should not be viewed as inherently different. OLED, having a better defined lifecycle, should not be seen as a negative compared to the wide-variance lifecycle of LED.
Realistic_Village184@reddit
You're missing the point. One technology has inherent risk of burn-in due to how the technology works. The other doesn't.
The fact that someone can make a super cheap LCD panel that falls apart in a few months doesn't change that.
Qweasdy@reddit
While I agree that LCDs don't typically "burn in" like OLEDs do, they do often degrade over time. Backlight bleed as panels age is pretty common, especially with modern edge lit LCDs. My previous LCD panel I retired because of a big splotchy greyness across ~30% of the screen when displaying dark images.
RTINGS has been running a 2 year longevity test of 100 TVs (OLED and LCD) and they've shown I'm not alone in this. LCDs typically last longer than OLEDs before seeing image quality issues, but they're not immortal, as many seem to think they are.
Strazdas1@reddit
Image degradation exists but the mechanics are very different. An LCD will degrade no matter what content I use it for or how many hours a day. An OLED will get absolutely destroyed in a short amount of time with my "bright UI elements 16 hours a day" use case.
GhostsinGlass@reddit
You didn't answer his question and that "burn-in" phenomena is leagues apart between the different technologies to the point where it's discussed with some at a model level (OLED) and a complete non-issue in other technologies.
Grow up.
bizude@reddit
LG's current lineup is pretty resistant to burn-in, if you don't interrupt the automatic cleaning functions. I put in over 12K hours on my last monitor and it showed no signs of burn-in despite being used mainly for WFH.
DeliciousIncident@reddit
Go read the article, it explains what that means.
JtheNinja@reddit
No. They didn’t even remove the fluorescent OLED from the entire tandem stack, just from one layer. The press release says “while maintaining a similar level of stability to existing OLED panels.” PH-OLED typically has worse lifetime than F-OLED, hence why they likely did one of each type. They managed to get something with similar brightness and burn-in resistance as a pure F-OLED stack while having somewhat reduced energy use.
wizfactor@reddit
It’s going to be difficult not pulling the trigger on a 4K/5K OLED monitor knowing that the true endgame OLED tech is just a couple of years away.
dabias@reddit
RGB OLED monitors should be coming next year, using the above technology. It's already coming to TVs right now. As far as the panel is concerned, RGB tandem could be pretty much endgame - the brightness increase is the biggest in years, and some form of blue phosphorescence is used.
azzy_mazzy@reddit
LG G5 is still WOLED, all newly released “primary RGB tandem” OLEDs still have the white sub-pixel
dabias@reddit
Yeah, I seem to have mixed it up. The G5 and so on are getting the 4-stack RGB tandem layers, with RGBW pixels, like you said. However, RGB pixels are coming to monitors next year. I would presume that is made possible by RGB tandem, as the 1440p RGB OLED monitor that was already announced has 335 nits SDR brightness instead of 250.
YakPuzzleheaded1957@reddit
Honestly these yearly OLED improvements seem marginal at best. The next big leap will be Micro-LED, that'll be the true endgame for a long time
conquer69@reddit
I think QDEL will make microled obsolete for regular displays.
azzy_mazzy@reddit
Micro LED will probably take much longer than expected, and maybe never reach wide adoption, given both LG and Samsung are scaling back investments.
gayfucboi@reddit
Compared to my LG G1, the 10% window is basically rumored to be about 90% brighter.
Over 5 years that's a massive improvement, and firmly puts it in competition with Micro LED displays.
I still won't replace my panel until it breaks, but for a bright room, it's a no-brainer buy.
YakPuzzleheaded1957@reddit
Samsung's Micro LED can hit 4000 nits peak brightness, and up to 10,000 in the future. Even if you take today's brightest OLED panels and double their peak brightness, it still doesn't come close.
TheAgentOfTheNine@reddit
Nah man, they got way brighter and this tandem stuff puts them up there with QD-OLED in color volume. The last 2 years have been pretty good improvement-wise.
The 5 or so before, tho.. yeah, pretty stagnant.
Yebi@reddit
I'd expect marginal improvements on that, too. The first version is unlikely to be perfect
Frexxia@reddit
There will never be an actual "endgame". They'll chase something else after.
Throwawaway314159265@reddit
Endgame will be when I can wirelessly connect my optic nerves to my PC and experience latency and fidelity indistinguishable from reality!
goodnames679@reddit
Endgame will be when you log out from your VR and you think real life’s graphics suck
FlygonBreloom@reddit
That's arguably already the case for a lot of VR users.
sh1boleth@reddit
Buy and enjoy. Got a 4K 240Hz 32" OLED monitor last year and I've been very happy.
cocktails4@reddit
And by then it will probably be competing with MicroLED.
TehBeast@reddit
Just buy it now and enjoy. Current OLED is still stunning.
Cute-Elderberry-7866@reddit
If I've learned anything, it's that it all takes longer than you think. Unless you have unlimited money, I wouldn't wait. Not until they show you the TV with a price tag.
VastTension6022@reddit
The endgame display tech isn't oled so you'll be waiting for that too :^)
EnesEffUU@reddit
Display tech has been improving pretty rapidly year over year for the last few years. I'd say just get the best you can now if you really need it, then in 2 years you can decide if the upgrade is worth it, instead of just wasting 2 years waiting for what might be coming. You could literally die within the next 2 years or face some serious change in your circumstances, just enjoy the now.
GenZia@reddit
Personally, I think QDEL is probably the endgame for display technologies.
No burn-ins, no flickering, no backlight, and practically infinite contrast ratio. Plus, it can be manufactured with inkjet printing (like standard LCD panels) and doesn't require vacuum deposition, a major cost component in OLED displays.
Strangely enough, no one seems to be talking about it, at least no one prominent, which is a bit odd considering how far the technology has come in just a few years:
QDEL Was Hiding in Plain Sight at CES 2025
For perspective, QDEL looked like a lab project just 2 years ago:
https://www.youtube.com/watch?v=eONWY3kbZc0
JtheNinja@reddit
Stop huffing the Nanosys marketing hype around no burn in on QDEL. That’s what they hope to achieve in the future. Current blue QD materials degrade even faster than OLED, which is why this is not on sale today and why it doesn’t get much interest. Barring a material breakthrough, QDEL’s only advantage over QD-OLED is that it’s cheaper to build. QD-OLED uses QDs as well so will have the same gamut, but has OLED’s superior degradation resistance so it will have better brightness and less burn-in.
The whole hype is based on a dubious hope that blue emissive QD lifetimes will improve faster than blue OLED lifetimes. If that doesn’t happen, all QDEL will be able to do is be a cheaper QD-OLED with worse brightness. Which might still be a viable product as a budget display, but it won’t be any sort of end game.
Dood567@reddit
TCL is starting their inkjet OLED production later this year too. Looking forward to hopefully cheaper panels soon
gayfucboi@reddit
One of the biggest problems they have is manufacturing a 100% pixel-perfect panel with inkjet printing and actually placing those pixels in alignment across a full panel.
In theory it sounds possible, in practice the failure rate is too high to actually manufacture them so far.
specter491@reddit
Great and I just spent $800 on a top of the line oled monitor
msolace@reddit
too bad oled is TRASH.......
I mean the picture's cool and all, but burn-in is 100% still a thing, and I dunno bout you but I cannot afford a $2000+ monitor for my gaming PC just to swap to another monitor to actually do work all day with text. It needs to be able to handle 6+ hours of text a day without ever an issue.
If someone figures out how to get your spouse to stop ordering something from amazon every two minutes, maybe i could afford extra "for fun" monitors :P
HerpidyDerpi@reddit
Whatever happened to microled? Faster switching. No burn in. High refresh rates....
iDontSeedMyTorrents@reddit
For any display that isn't tiny or wall-sized, it's still in the labs. Still too many difficulties in cost and manufacturability.
HerpidyDerpi@reddit
You should seed that shit....
JtheNinja@reddit
Still can’t be manufactured at scale and reasonable price points. This article is a great run down of where microLED sits atm: https://arstechnica.com/gadgets/2025/02/an-update-on-highly-anticipated-and-elusive-micro-led-displays/
There have been some promising concepts like UV microLEDs with printed quantum dots for manufacturing wiggle room, or using low-res microLED as an LCD backlight (a 540p microLED screen behind an LCD is effectively 518,400 dimming zones).
But for now, they’re not a thing and it will still be a few years. Some of the analysts in that article also mentioned how OLED has mindshare with the general public at this point. Tech enthusiasts will come out if you just say “look we made microLED”, but most people won’t immediately understand why it’s better than the OLED that they know.
ThinVast@reddit
The article only mentions efficiency/power consumption with blue PHOLED because that is its only benefit compared to the blue fluorescent OLED used in current displays. The lifetime of blue PHOLED, and possibly color gamut as well, is worse than the current blue F-OLED used in displays. So blue PHOLED will mainly benefit displays like phones, where long lifetime isn't as important compared to a TV. Blue PHOLED in TVs can still help to increase brightness and relax ABL, but then again if the lifetime is really bad, display manufacturers may not want to use it in TVs yet. The challenge to bringing blue PHOLED to the market has been getting its lifetime to acceptable levels. Right now, they're at a point where the lifetime is good enough for devices like phones, but with more research they may eventually get its lifetime up to par with F-OLED.
reallynotnick@reddit
Final step, yet still has a layer of non-phosphorescent blue since the lifetime of the new layer is poor.