Nanoscale OLEDs: scientists reduce the size of OLED pixels to just 300 nm
Posted by Balance-@reddit | hardware | View on Reddit | 128 comments
Researchers at the University of Würzburg have developed individually addressable organic light-emitting diode (OLED) pixels measuring just 300 × 300 nanometers—smaller than the wavelength of visible light and representing the smallest individually controllable OLED pixels reported to date.
The Core Innovation
The key challenge in scaling OLEDs down to nanoscale dimensions is that sharp electrode edges create intense local electric fields, leading to unbalanced charge injection, poor efficiency, and device failure through metallic filament formation. The team solved this by introducing an insulating layer that selectively covers the electrode edges while leaving a precisely defined nanoaperture at the center. This ensures charge carriers inject uniformly through flat regions with homogeneous electric fields, rather than concentrating at problematic edges and corners.
Performance Achievements
The nano-OLEDs demonstrate remarkable stability and performance. They achieve external quantum efficiencies around 1%, maximum brightness of 3,000 cd/m², and response times fast enough to support frame rates beyond standard video (>60 fps). The devices use gold patch antennas as bottom electrodes, which serve dual purposes: efficient hole injection and plasmonic light extraction. By coupling molecular emission to plasmonic modes of the nanoscale gold antenna, the researchers achieved efficient light outcoupling despite the extremely small pixel size, where conventional emission would be severely limited by the (pixel size/wavelength)² relationship.
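For a sense of scale, here is a minimal sketch of that (pixel size/wavelength)² factor, assuming green emission at 550 nm; the exact prefactor depends on the device stack and is not taken from the paper:

```python
# Back-of-the-envelope look at the (pixel size / wavelength)^2 factor the
# summary mentions: once the emitting area shrinks below the emission
# wavelength, conventional outcoupling drops roughly with this ratio squared.
# The 550 nm wavelength is an assumed mid-visible value, not from the paper.

wavelength_nm = 550
pixel_nm = 300

suppression = (pixel_nm / wavelength_nm) ** 2
print(f"(d/lambda)^2 = ({pixel_nm}/{wavelength_nm})^2 ~= {suppression:.2f}")
# -> ~0.30: a sub-wavelength pixel radiates only a fraction of what a
#    wavelength-sized emitter would, which is the bottleneck the plasmonic
#    gold patch antenna is used to work around.
```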
Broader Impact
This work represents a significant advance for ultra-high-density displays (potentially exceeding 10,000 pixels per inch) for augmented and virtual reality applications, as well as for photonic integrated circuits. The nanoaperture fabrication process proved highly reproducible with over 90% device yield, demonstrating practical scalability. While further optimization of organic layer stacks and antenna designs could improve performance, the demonstrated approach provides a clear path to overcome fundamental electronic and optical bottlenecks in nanoscale optoelectronic devices.
Source: https://www.science.org/doi/10.1126/sciadv.adz8579
In popular media:
- https://www.notebookcheck.net/Nanoscale-OLEDs-scientists-reduce-the-size-of-OLED-pixels-to-just-300-nm.1148743.0.html
- https://www.tomshardware.com/monitors/researchers-create-worlds-smallest-pixel-measuring-just-300-nanometers-across-could-be-used-to-create-a-1080p-display-measuring-1mm
kensaundm31@reddit
Let me know when burnout is not an issue...
surf_greatriver_v4@reddit
Always one comment, isn't there
gajodavenida@reddit
I mean, it's the only real problem with OLED
DeeJayDelicious@reddit
But only in theory. And only with old models and extreme use cases.
SirMaster@reddit
How old? My 2 year old OLED gaming monitor has a ton of burn-in and my use case is video gaming, media playback, and software development.
StickiStickman@reddit
And my 3 year old OLED has next to 0 burn in even though I mostly use it for programming and gaming.
Strazdas1@reddit
Why even bother buying a screen this bright then?
StickiStickman@reddit
For no motion blur and perfect blacks? Also, higher sustainable brightness usually means better longevity.
Strazdas1@reddit
But you can have no motion blur and perfect blacks by simply choosing a lower-brightness version of the same technology?
Burn-in is a unique issue for OLEDs (well, we had a form of it in CRTs, but no one's using those anymore).
SirMaster@reddit
50% brightness most of the time which is about 120 nits according to my meter.
Mr_s3rius@reddit
This is what makes me scared of buying one.
My current monitor is 7 years old. Sure it's degraded by now, but it's uniformly degraded. I can deal with redshift and worse brightness. I don't know if I can deal with burn in.
Dood567@reddit
No, it eventually happens to all of them. All they can do is delay it, or pack enough brightness into the panel so it can keep uniformly dimming or refreshing the pixels for a long time before it looks bad.
Whirblewind@reddit
Nope, not in theory, and no, not only old models, and no, not just in extreme use-cases.
gajodavenida@reddit
Would a Super AMOLED screen be considered an old model? Burn-in is super common
HuckDFaters@reddit
Price is the only real problem with OLED. When OLED TVs get cheap enough that the average consumer can afford to replace it every 2-3 years then no one is going to care about burn in. People cycle through OLED phones faster than they can develop any visible burn in.
Nuck_Chorris_Stache@reddit
There are certain types of technologies you tend to expect to replace every few years simply because of the speed at which new technology becomes better.
TVs and monitors are not really one of those things. I mean, they do improve, but not nearly as quickly as say your graphics card or CPU. Having a 10-15 year old monitor or TV is fairly common.
Sound equipment I'd also put in that category. A lot of people have old speakers. Some people might even still use speakers from like 1960.
ConsistencyWelder@reddit
One of my old TVs lasted 13 years. So I was pretty disappointed when the new OLED only lasted 2 years. That is not acceptable.
gajodavenida@reddit
That's a bad thing... we need less e-waste, please and thank you
surf_greatriver_v4@reddit
Yeah, we all know, there's always someone here to point it out, and sometimes they start to write out small essays about why they won't use OLED because of their specific use case
Framed-Photo@reddit
Problems don't get solved when people stop talking about them, that's why it's good that it gets brought up.
If anything, the fact that it gets brought up so often is a good sign that it'll continue to be worked on and publicized. I know there's definitely problems in tech that I WISH would get talked about nearly as much as OLED burn in, it might actually get things addressed.
throwaway0102x@reddit
Also, as a happy OLED monitor owner, let's not pretend that burn-in isn't a real problem that's exacerbated by how expensive OLED panels generally are.
Framed-Photo@reddit
Yeah like if the panels were super budget friendly I don't think people would have nearly as much of a concern about burn in. But needing to change a bunch of your computer habits and preferred workflows just to avoid damaging your monitor, and the monitor costs 2x-3x as much as a normal LCD? Yeah it's a bit much lol.
gajodavenida@reddit
Yeah, it's a useless comment to be honest. Still really want it to become an issue of the past, though!
DyingKino@reddit
VRR flicker and price are real problems too.
AreYouAWiiizard@reddit
Yeah, I notice flicker on my VA sometimes and find it annoying; OLED VRR flicker would stand out like crazy, and so far that's the thing stopping me from upgrading. Doesn't really help that there's been little progress in that area while mini-LED options are starting to look very attractive.
ConsistencyWelder@reddit
Well, the brightness is an issue with OLED as well. The brighter they get, the sooner they'll burn out.
I had an LG OLED that I had to retire because of burn out. It maxed out at 700 nits peak brightness yet it still burned out in 2 years. I replaced it with a Mini LED with 3,000 nits peak brightness. Wowzers, I was worried I would be doing a downgrade in image quality. Now I'm addicted to the brightness.
And no risk of burn in or burn out.
SirMaster@reddit
It's not the only problem. Near-black performance is a problem too, in terms of things like overshoot, uniformity (mura), etc. Often the granularity between off and the lowest level is also too great of a jump.
FlygonBreloom@reddit
And yet nobody remarked on CRT burn in when CRTs were a thing, outside of remembering to use screensavers.
Meanwhile I've had LCDs that have burnt in (to my actual surprise).
Nuck_Chorris_Stache@reddit
That's because CRT was pretty much the only viable technology at the time.
LCD had its own problems that took time to solve. You might think LCD response times now are 'bad', but they run circles around the LCDs that first started to come out competing with CRT.
I remember people saying you wanted to look for an LCD with a 16ms response time, to keep up with a 60Hz refresh rate.
diemitchell@reddit
CRT wasn't used because it was good, but because there wasn't a good alternative .-.
Soggy_Association491@reddit
Because CRT burn in can be avoided with the usage of screensavers while OLED cannot?
Essteethree@reddit
This was my thought as well. I'm certainly no scientist, but wouldn't burn-out be an even bigger issue with such tiny pixels?
-WingsForLife-@reddit
At this size couldn't you start doing redundant pixels? If it's bright enough I think you wouldn't be able to notice the pixels that are off
PriscFalzirolli@reddit
It depends on their brightness and voltage levels. But you aren't going to use these for large screens anyway.
Balance-@reddit (OP)
Yes, you're reading that right, 1000 Hz with 0.1 ms response time.
Cheerful_Champion@reddit
Can't wait to see product based on this tech in never
The_Edeffin@reddit
I don't know why you say that. Displays are one area that is continuing to advance and get cheaper at a breakneck speed. It might not be soon, but we'll probably hit those numbers within 5-10 years.
Cheerful_Champion@reddit
I'm not really convinced about that. Display tech is full of superior ideas that were never introduced or died, because they were too expensive or too problematic.
CRT had better response times, refresh rate and blacks than LCD.
Plasma had better response times and refresh rate than LCD or OLED. It had better colors than LCD.
That's just to name the most known cases. Now this tech sounds great, but it might be too expensive to produce, especially since it uses gold as a main component.
Even OLED, which is considered the gold standard for image quality, still has major issues with longevity and brightness after decades of development.
The_Edeffin@reddit
OLEDs are pretty close to solving those issues. They may never fully solve them, but they certainly are becoming minor concerns.
And of course there will always be trade-offs. New tech needs to start somewhere and work through its issues. But for the vast majority of consumers/enthusiasts, display tech as a whole has only really steadily improved over time, and at a pretty breakneck pace (while also dropping costs).
vandreulv@reddit
It's those tradeoffs that make OLED a non-starter for a lot of people. Namely PWM.
azzy_mazzy@reddit
PWM is not used in all OLED products like TVs and monitors.
vandreulv@reddit
Then burn-in becomes the next tradeoff that makes OLED a non-starter for a lot of people.
makinenxd@reddit
It's just panic created by people. It's not an issue and it's being blown out of proportion. It's a thing on CRTs and LCDs also and you don't see people panic over that.
vandreulv@reddit
LCDs don't burn in at the rate OLEDs do.
We don't use CRTs anymore.
Playing Iraqi Information Minister to someone who knows far more about this than you do isn't gonna work the way you think it will.
OLED burns in. Period.
Cr4zyPi3t@reddit
Yes but CRT screens were huge and Plasma was not very energy efficient compared to LCD.
Cheerful_Champion@reddit
And yet years later OLED is not much more energy efficient.
Cr4zyPi3t@reddit
Did you even own plasma and OLED TVs? My old plasma used 400W+ while my OLED needs just a bit over 100W while being brighter and bigger (and including a small computer which replaced my SAT receiver that also used energy).
Cheerful_Champion@reddit
Did you own anything other than early models? LG C5 draws up to 340W. Late Plasma models drew up to 350-400W (Panasonic ST60 drew 350W) while offering similar full screen brightness or even a bit higher than C5 (top Pioneer model went as high as 299 cd/m2 vs 202 of C5). Yeah, OLED is sooo much more energy efficient.
melberi@reddit
You quoted numbers from two different plasmas. Panasonic ST60 full screen brightness was just 82 cd/m2 (https://www.rtings.com/tv/reviews/panasonic/st60). 65" screen has ~17 % more area than 60". By these numbers, the OLED has ~2.9x the light output for similar power draw. I would class this as being much more energy efficient.
Now can you source "top Pioneer model" brightness and power consumption so a proper comparison with that can be made?
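For reference, a quick sketch of that arithmetic, using only the figures quoted in this thread (the brightness and power numbers are the commenters' and not independently verified):

```python
# Rough reproduction of the comparison above, using only figures quoted in
# this thread (not independently verified). Screen areas are computed from
# the diagonal assuming 16:9 panels.

def area_16x9(diagonal_in):
    """Screen area (square inches) of a 16:9 panel with the given diagonal."""
    w = diagonal_in * 16 / (337 ** 0.5)   # sqrt(16^2 + 9^2) = sqrt(337)
    h = diagonal_in * 9 / (337 ** 0.5)
    return w * h

plasma_nits, plasma_diag, plasma_watts = 82, 60, 350    # Panasonic ST60, as quoted
oled_nits,   oled_diag,   oled_watts   = 202, 65, 340   # LG C5, as quoted

plasma_output = plasma_nits * area_16x9(plasma_diag)    # brightness x area
oled_output   = oled_nits * area_16x9(oled_diag)

print(f"area ratio, 65 in vs 60 in: {area_16x9(65) / area_16x9(60):.2f}")   # ~1.17
print(f"light output ratio, OLED vs plasma: {oled_output / plasma_output:.1f}x "
      f"at {oled_watts} W vs {plasma_watts} W")                             # ~2.9x
```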
Cheerful_Champion@reddit
KRP-500M, it maxed out at 400W. There's also Samsung PNF8500 that maxes out at 310W and gets 190 nits.
These are plasmas that are more than a decade old, and it turns out if you want a similarly bright OLED you get similar power draw.
TRIPMINE_Guy@reddit
This really isn't true though. My monitor from 1998 has higher motion resolution and black levels than an LCD from today. Arguably better near-black performance than an OLED if properly calibrated. These 1000 Hz displays have no purpose to exist outside of people who play super undemanding games.
Jiopaba@reddit
And also not being as deep as they are wide, you know? Kind of a big thing to leave out of your analysis, as if we've been going completely backwards for no reason and only just reaching parity with our ancient predecessors who were so much more advanced.
I get why people get nostalgic for CRTs, and the transition to flat panel monitors was for sure a step down in quality to start, but it was unquestionably the right move.
Perhaps you don't recall the era of blowing $300 in 90s money on a 17" monitor that was a tapered cube 20" to a side and weighed like a slab of lead?
TRIPMINE_Guy@reddit
No, I wasn't alive back then. When I said my monitor from 1998, I meant that's the manufacture date. For sure a CRT would be really expensive today, way above OLED, especially when you factor in economies of scale and the reduced demand they'd face, on top of the weight and hands-on manufacturing.
Jiopaba@reddit
That's fair. I was young at the time myself, but my point was that a 70" OLED is less than a thousand dollars and can be mounted on your wall. A 70" CRT even in its heyday would have cost many thousands of dollars and weighed thousands of pounds. You need a vacuum of commensurate size and enormously thick glass.
CRTs might have looked nice, but they were heavy, expensive, and huge. Barring a completely unforeseeable development in the way they work, which just never manifested in our world, there was no future for them.
Cellphones, smart watches, laptops, screens in cars, these things would never have existed in anything like the scale they do now without the whole world consigning CRT to the dustbin of history in favor of the more promising tech.
CRT as a technology was at the peak of a one hundred year development life cycle when the absolute best displays were slightly edging out good modern OLEDs, and most display tech now has less than twenty years of serious development behind it.
I truly think in another ten years we'll have crossed a threshold to the point that not even the most stubborn or critical will still point to CRTs as the pinnacle of display tech.
Anyway, sorry for the rant. I'm just passionate about this topic.
VampyrByte@reddit
The largest CRT ever produced was 43", weighed 200 kg (440 lb), could display a 480p image at 60 Hz and cost a whopping $40,000 in 1990!
I think OLED is beyond it these days to be honest. Especially with software like the blurbusters CRT beam simulator for high refresh rate OLED. It's incredible the leap forward in the last decade in displays with OLED going mainstream.
TRIPMINE_Guy@reddit
Is that Blur Busters thing even usable? Last I heard it had huge compute overhead because the GPU needs to calculate colors.
VampyrByte@reddit
Oh for sure it is usable. I was using it just after it was released for, sort of, PS1 and SNES emulation (PS2 and GameCube era stuff didn't work so well) in RetroArch and it was incredible. At the time I had an Intel 8086K and an Nvidia GTX 1080. This was at 240Hz. So most people with a gaming PC can probably make use of it if you have a high refresh rate display.
It needs to be implemented in some more software to be truly useful. I see that there is now a video player that has integrated it, and it is supposed to be coming to ShaderGlass.
TRIPMINE_Guy@reddit
Hmm, see, I'm worried about whether it's usable on modern games though. I don't mean games that can't even hit good framepacing without it, but just modern games in general. I've been thinking about buying an OLED this Black Friday and using that emulator. Do you think 4K or 1440p would be better? I honestly think GPU demand may be comparable if you consider the GPU will have to do extra work for the 480 Hz compared to 240 Hz, so it's just a matter of do I want higher static and slow-movement resolution or higher fast-motion resolution.
VampyrByte@reddit
I don't believe there is a supported way to run it on modern games just yet, and even if you could you likely wouldn't need to so much, since you can run them at higher frame rates anyway. If those framerates aren't achievable, then it is unlikely that CRT beam simulation would be achievable either, but you might still be able to use something cheaper to reduce motion blur like black frame insertion. The truth is though these displays are so good you don't need any of this stuff for modern games running at high frame rates. It really excels at 60Hz content that was intended for CRT displays like retro consoles.
I have a 1440p OLED, but hitting 240Hz constantly in modern content just isn't possible without having to sacrifice visuals very hard. 4K would be worse. But even without using freesync, it just doesn't matter much as it does on an LCD. These displays are so damn good it just doesn't matter.
The CRT beam sim is still niche and needs adapting into more software to work well. I am aching to get it working somehow with Factorio, but I don't believe it is possible yet.
TRIPMINE_Guy@reddit
Oh, also you could look into FED and SED displays if you want to see what the future of CRT could have been. They were tied up in patent lawsuits so never saw the light of day.
TRIPMINE_Guy@reddit
Yeah, I mean as soon as OLEDs have enough Hz to emulate a raster scanout like CRT and emulate plasma motion I'd agree, and I can see that happening in ten years with how fast they are increasing the Hz every few years. OLEDs are just fundamentally not great with low-fps content like film and demanding games, but with enough Hz you can just emulate the methods that are better for low-fps content.
Qweasdy@reddit
"Display technology isn't advancing because CRTs had some advantages" is certainly a novel take I'll give you that.
rubiconlexicon@reddit
BFI, CRT shaders (both need very high Hz), as well as the more underappreciated quality of sidestepping VRR flicker by making VRR obsolete through raw refresh rate, as it sure doesn't seem like improving power delivery to actually eliminate VRR flicker is a priority for anyone right now.
TRIPMINE_Guy@reddit
I'm aware of those CRT shaders and follow their development. Yeah, those motion emulators will be neat once they are developed, but from what I read those CRT motion emulators use GPU power to calculate the scanout? I don't know the performance difference, but it seems quite counterproductive to have a large amount of GPU compute dedicated to the emulation and then still needing headroom on top due to the need for perfect framepacing with strobed displays.
The_Edeffin@reddit
CRTs vs LCD is a bad comparison. There were trade-offs. But for most people the size reduction was well worth it. LCDs have come very far since. However, OLED is the current king still, and honestly if you are saying CRTs are better than OLED in 95% of use cases you are fooling yourself. CRTs have definitely been superseded. They are only useful for games that were designed/mastered for their unique look, or for collectors/nostalgia.
Ignoring CRTs, which again had huge trade-offs, both LCD and OLED have been marching pretty steadily to greater and greater quality at decreased prices. And that will, hopefully, keep going on.
TRIPMINE_Guy@reddit
I have probably around the fifth-best consumer CRT monitor in existence right now. I've also used 240 Hz and 144 Hz 4K OLEDs. The sharpness and ANSI contrast certainly leave a bit to be desired, but you need 240 fps to even start to be competitive with the CRT's motion. 120 Hz 960p vs 240 Hz 4K and the CRT still looks sharper while panning, albeit it's minor. I can use supersampling and downscale beyond 4K resolutions as well. It's not bad, and honestly my biggest complaint is size. If I could get a 27-inch 16:9 high-spec CRT I wouldn't even touch OLED, honestly.
zdy132@reddit
Eh surely we can get it mass produced in less than 80 years. If I were alive then, my old eyes would definitely love the treat.
III-V@reddit
One of the primary components is gold, so I wouldn't count on it.
gvargh@reddit
... as do shittons of mass-produced packaged semiconductors
the_nin_collector@reddit
Yeah. GPUs and motherboards have about 0.1g of gold in them each.
Eclipsetube@reddit
Let’s say a whole screen would use 1g of gold that would be 1300€ JUST FOR THE GOLD
zenithtreader@reddit
Today's gold price is ~128 USD per gram.
the_nin_collector@reddit
No. You are off by a decimal point. 130.
When 60-inch OLEDs cost $3,000-5,000 that's not really insane... and if we're talking top-of-the-line, highest-end OLED screens, more. So 1g of gold is totally reasonable.
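A quick sanity check on the gold math in this subthread, taking the ~130 USD/gram figure quoted above as given:

```python
# Sanity check on the gold math above, taking the ~130 USD/gram figure
# quoted in this thread as given (spot prices move; this is not a quote).

gold_usd_per_gram = 130

for grams, item in [(1.0, "hypothetical 1 g per screen"),
                    (0.2, "typical PC (as quoted above)"),
                    (0.1, "typical GPU or motherboard (as quoted above)")]:
    print(f"{item}: ~${grams * gold_usd_per_gram:.0f} of gold")

# 1 g per panel works out to roughly $130 of material, not ~1300 --
# noticeable, but small next to a $3,000+ high-end OLED TV.
```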
the_nin_collector@reddit
The average computer has about 0.2 grams of gold in it.
Using gold in electronics is nothing new at all.
HiroYeeeto@reddit
How much more is needed relative to a normal oled monitor?
Insidious_Ursine@reddit
I give it less than ten.
RxBrad@reddit
I'm gonna use my graphene batteries to get this up and running. Planning on a Feb 30, 2027 launch.
szczszqweqwe@reddit
We already have OLED monitors over 500 Hz; nanoscale OLEDs will get to the market in a few years, well, unless another tech comes along with similar results but cheaper.
TechnologyEither@reddit
I really hope AR contacts will become a thing. No idea how we will power them.
Sh1rvallah@reddit
Have you ever worn contact lenses? I cannot fathom it being possible to make tech contacts comfortable.
TechnologyEither@reddit
Yeah, I would forget they were there tbh; I had trouble remembering to take them out.
Cheerful_Champion@reddit
Wow, I'm envious. After a day of wearing contacts I feel like someone brushed my eyeballs with a stiff brush. I put them in when I'm doing something that requires them (usually sports) and take them out instantly once I'm done.
Sh1rvallah@reddit
Best time I had was the few years I used the daily disposable ones. Eyes start to feel dry and you just wash your hands and toss them.
ser_Skele@reddit
Don't worry, it will only take 20-30 years. I remember my dad telling me about laser projectors that were just around the corner. This was in the 1990s. It still took them 30 years 🤭 Too bad my dad ain't alive anymore, would've loved to show those to him 🥲
leferi@reddit
would be nice for VR maybe, although could be overkill
New_Amomongo@reddit
Looking forward to 218 PPI across a 41-inch screen @ 8K with the color consistency and brightness of a 32-inch 6K Pro Display XDR.
Oinkidoinkidoink@reddit
While being lovely and all, OLED technology is dead if producers can't find a way to lower costs and fabricate screen sizes bigger than 83".
LEDs are getting better every year. As well as cheaper and bigger.
loozerr@reddit
Yeah it's always a pain to use screens smaller than 83". Especially as computer monitors, phones, car displays and smart watches.
Strazdas1@reddit
I can't believe I am forced to use a 65" TV. Woe is me.
Nuck_Chorris_Stache@reddit
It's not the size that counts, it's how you use it.
loozerr@reddit
I wouldn't wish that on anyone!
ConsistencyWelder@reddit
Won't that make them burn out faster too? If there's less organic material to light up?
Nuck_Chorris_Stache@reddit
That was my thought. It's the main thing holding me back from OLED.
Deciheximal144@reddit
How can it be smaller than the wavelength of light it emits?
jmlinden7@reddit
You only need an electron to move around to emit light, and electrons aren't that big.
The bigger problem is that since the light has such a big wavelength, it might interfere with the light from neighboring pixels and cause blurriness
BoringElection5652@reddit
If you could shoot it in exact directions, it could help create a high-resolution light-field display, though. Like a super high-res lenticular display that can send different images in 1,000 directions instead of just two. For use cases like glasses-free 3D displays, or perhaps improvements in holograms, 3D monitors and VR displays.
jmlinden7@reddit
You're describing a VCSEL
https://en.wikipedia.org/wiki/Vertical-cavity_surface-emitting_laser
It's a small diode that emits a laser in an exact direction. It's what Apple uses for FaceID on their phones.
BuchMaister@reddit
I doubt the human eye could resolve that blur at those sizes.
kashyap69@reddit
It's not about the human eye but diffraction
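For scale, here is a rough Rayleigh-criterion estimate of what the eye can resolve, with pupil diameter and viewing distance assumed for illustration:

```python
# Rough Rayleigh-criterion estimate of what the eye can resolve, to put the
# 300 nm pixels (and any diffraction blur between them) in context.
# Pupil diameter and viewing distance are assumptions for illustration.

wavelength_m = 550e-9      # mid-visible light
pupil_m      = 3e-3        # ~3 mm pupil in bright conditions (assumed)
distance_m   = 0.30        # ~30 cm viewing distance (assumed)

theta = 1.22 * wavelength_m / pupil_m          # angular resolution, radians
smallest_feature_m = theta * distance_m

print(f"angular resolution ~ {theta * 1e3:.2f} mrad")
print(f"smallest resolvable feature at 30 cm ~ {smallest_feature_m * 1e6:.0f} um")
# -> roughly 60-70 um, i.e. a couple of hundred 300 nm pixels wide, so any
#    diffraction blur between neighbouring pixels sits far below what the
#    eye itself can resolve.
```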
KR4T0S@reddit
Pixels that are smaller than the wavelengths of light, that sounds impossible even. If they can improve the efficiency and colour volume this stuff could be a revolution for wearable screens.
PurepointDog@reddit
I think it might be more about "if you want the photons to interact with a thing, the wavelength must be smaller than the thing". For example, optical microscopes have a limit on the smallest things they can see.
jmlinden7@reddit
That's because of blurriness not because they can't interact
That's how your phone antenna can pick up a radio wave that's multiple feet long even though the antenna itself is only a few inches at most.
PurepointDog@reddit
Microscopes don't have pixels. That's not how stuff works.
Radio waves "jump through" walls because they're so much longer than the size of the walls.
Kryohi@reddit
Well it's more complex than that, otherwise how would visible light pass through thick glass?
You can only block (thus use them to "see" something) photons that have a range of energies that interact with the material, so they must have energy levels separated by a similar amount of energy, whether it's the energy of electronic orbitals or of molecular vibrations/rotation. Anything much higher or much lower than that will pass through without interactions.
jmlinden7@reddit
Exactly, phone antennas absorb energy from radio waves despite being much smaller than the waves, because of electron interactions.
However, if you tried to use a radio telescope to take pictures of a bunch of phone antennas, the image would come out blurry AF because the wavelengths overlap.
R3Dpenguin@reddit
The only physics I learned was in high school, but radios are also smaller than the wavelength of AM, so it doesn't sound all that strange? Maybe an actual physicist can explain if there's something about light that makes it more challenging.
blind-panic@reddit
It's pretty hard to design an efficient antenna that is much smaller than the wavelength, though there has been a lot of progress; I think these are called 'electrically small' antennas. For this reason, historically antennas are sized based on wavelength, which is why you'll see ham radio antennas that can be tens of feet or more.
VastTension6022@reddit
Individual electrons emit photons of large wavelengths. I think the problem is the way we visualize and graph "waves" and frequency.
TheMightyBunt@reddit
Radio and visible light are the same thing. Photons at different wavelengths.
Wavelength can be any size relative to whatever is emitting a photon.
rmccue@reddit
But note that a lot of antennas are half-wavelength, since you get efficiency gains: https://en.wikipedia.org/wiki/Dipole_antenna
3G6A5W338E@reddit
Gonna love integer scaling arbitrary resolutions to this thing.
94358io4897453867345@reddit
A technology that has burn-in is defective
duncandun@reddit
To be pedantic, this isn't nanoscale; that'd be 1-100 nm.
battler624@reddit
How many PPI is that? 85K? Am I making a mistake somewhere?
loser7500000@reddit
Depends on fill ratio and subpixel layout. A 25% fill ratio with a square 2x2 subpixel layout would give a PPI of ~21,000. 570M pix/inch² would be ~24,000 PPI, so similar ballpark.
TemptedTemplar@reddit
If it can do a 1080p image at 1mm diameter, then a 1-inch screen would be ~275mm².
~570 million pixels per square inch.
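A quick sketch of where those PPI figures come from; the fill factor and subpixel layout are the assumptions from the comments above, not from the paper:

```python
# Rough PPI arithmetic for 300 nm devices, under two assumptions discussed
# above (neither is from the paper).

INCH_UM = 25_400  # micrometres per inch

# 1) Naive upper bound: one 300 nm device per pixel, edge to edge.
raw_pitch_um = 0.3
print(f"raw 300 nm pitch: ~{INCH_UM / raw_pitch_um:,.0f} PPI")           # ~84,667 (the '85K' guess)

# 2) loser7500000's scenario: 25% area fill (so a 600 nm cell per subpixel)
#    and a 2x2 subpixel layout, giving a 1.2 um pixel pitch.
subpixel_cell_um = 0.3 * 2
pixel_pitch_um = subpixel_cell_um * 2
print(f"25% fill, 2x2 subpixels: ~{INCH_UM / pixel_pitch_um:,.0f} PPI")  # ~21,000
```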
battler624@reddit
Maybe we can finally get 200 PPI OLEDs for monitors.
TRIPMINE_Guy@reddit
220 PPI OLEDs should be coming in the next year.
battler624@reddit
I believe only in 27" and not bigger. I was hoping for a 32" one, even if 180 PPI+.
doctorcapslock@reddit
I just want a 21:9 1440p with dual-mode 2880p.
1440p for games, 2880p for desktop.
jenny_905@reddit
Cool. Can they make them cheap next please?
sukihasmu@reddit
And now you can have a shitload of dead pixels but it doesn't matter because you can't see them.
TheRealSeeThruHead@reddit
I just want to use it to fake crt
DaddaMongo@reddit
OLED contact lenses for VR etc.
imtheproof@reddit
Better hope there's no bug that causes the brightness to randomly skyrocket.
youreblockingmyshot@reddit
The ultimate flash bang! Can’t even avoid it by closing your eyes.
Flaimbot@reddit
that's not a bug, it's a feature
diemitchell@reddit
No, because of the focal point.