[Monitors Unboxed] 1440p 500Hz QD-OLED Monitor Round-Up: What Model Is Best?
Posted by wickedplayer494@reddit | hardware | View on Reddit | 67 comments
youreblockingmyshot@reddit
I’m gonna have to see if they have a video for lower refresh rate 4K OLEDs. I don’t play esports titles on a single monitor, so I’ve never seen anything close to 500fps in the games that I play.
Logical-Database4510@reddit
500hz is what Framegen is for, really.
Bring on the downvotes, but 🤷‍♂️ with a base framerate of 144 or so it looks really nice on a 500Hz display.
JJ3qnkpK@reddit
That makes sense, and isn't downvote worthy. Much like upscaling enabled 4k to become mainstream (across both consoles and PC), framegen is enabling bonkers frame rates to become mainstream as well. Both techs benefit other uses, but I bet 4x framegen from 144hz feels pretty neat.
jammsession@reddit
It does not make sense. If you want ultra low latency, and that's why you want 500Hz instead of 144Hz, you definitely don't want the added latency from frame gen.
You maybe want frame gen to pump your 80fps Civ game up to 144fps, because you don't care about the added lag but want smooth scrolling.
youreblockingmyshot@reddit
I just don’t see the point of pushing 3-4 guesses on where things ought to be with increased latency. I’m more than happy to get 120-240 with just DLSS upscaling.
It doesn’t hurt anyone if you want to run it, I just don’t think I’d invest in a 500hz monitor with generated frames being the only way to achieve it.
-Purrfection-@reddit
It's not going to be much latency when the base framerate is already above 120.
If you're an esports player then sure, you can tell, but I bet like 80-90% of people wouldn't be able to tell the difference side by side between real 500fps and fake 4x 500fps.
zeronic@reddit
To be fair, i don't think most PCs these days (even stupidly powerful ones) can hit 500fps in recent titles without FG, so unless we're talking about much older games, FG would be necessary.
Games are leaning more and more on DLSS/FG in lieu of optimization these days.
zeronic@reddit
As someone with a QD-OLED panel, you really don't want anything below like 120-165Hz. The pixel response time makes frame rates below 100 feel like hot garbage, at least to me personally.
Meanwhile i have LCD panels that feel totally fine at 30fps; you absolutely need high frame rates for OLED/QD-OLED to feel good, in my opinion. At least if you're sensitive to that sort of thing like i am.
Loose_Skill6641@reddit
they're all basically the same because they all use the same panels
Specialist-Buffalo-8@reddit
This is the greatest misconception/misunderstanding.
Anyone can plug a pre-produced panel into electricity. What then? How do you properly implement accurate EOTF tracking, delta Es < 2, etc.?
Cool, the panel now has power; that's half the story. The other half is the software implementation that makes OLED shine.
jammsession@reddit
Serious question, why should software make my OLED shine? I personally just want a plain old ultra dumb screen. Hopefully with DisplayPort 2.0 so I don't have to use DSC!
IMHO it is the GPU's job and not the monitor's.
raydialseeker@reddit
From what I understand, if possible, always buy an Alienware. They have the best after sales support.
BlackenedGem@reddit
Lmao
Flukemaster@reddit
Dell/Alienware (consumer) support is pretty average. The unfortunate reality is the others are much worse lol
raydialseeker@reddit
All I've heard is Samsung has the worst QC. Gigabyte, Asus and AOC have terrible support.
Alienware seems to be doing better than all the other brands at both of those.
EmptyVolition242@reddit
We really have come a long way.
reddit_equals_censor@reddit
you mean massively regress from working monitors to planned obsolescence e-waste that may have noticeable burn-in after just 500 hours of playing a game?
yeah such progress /s /s /s
mimimumu69@reddit
IPS and VA monitors are still being made for those who want them.
And saying "planned obsolescence" seems to imply that manufacturers are making OLEDs burn in on purpose. Any proof of that?
reddit_equals_censor@reddit
actually, new ips and va mini-led monitor releases and development have massively slowed, so no, you are wrong there already.
and in regards to planned obsolescence YES it is planned obsolescence, because the industry DELIBERATELY prevented burn-in free technology from getting released.
over 15 years ago they nuked SED tech, which was ready to get released (basically flat crt as in truly flat).
and many years ago instead of pushing out samsung qned as quickly as possible, samsung canceled the pilot line and instead decided to push qd-oled down people's throats for years.
samsung qned is nano rod tech, free from any burn-in, if you are not aware.
so samsung DELIBERATELY put samsung qned on ice and happily pushed oled planned obsolescence garbage with 0 resell value and, on a good day, 1/3 the lifetime instead.
so YES it is 100% planned obsolescence.
i suggest that you do a bit more research on the topic.
the pilot line for qned was cancelled in 2022 and only recently has samsung resumed development.
so samsung DELIBERATELY delayed qned to push planned obsolescence oled for AT LEAST 4 more years.
oioioi9537@reddit
Qned/micro led isn't being pushed because it's too expensive to produce iirc. Acting like it's some grand conspiracy is just tinfoil hat territory when Samsung probably would've made way more bank with microled/qned had it actually been a viable product. It's the same thing with most other technologies: scaling up and cost balancing is the biggest and last major hurdle for all of them, and many cool things simply fail to pass that stage or sit in constant development hell because of it, e.g. solid state batteries.
reddit_equals_censor@reddit
well you just showed that you have 0 idea about samsung qned and its challenges to bring it to market.
the actual costs for samsung qned would be cheaper than oled.
it would be NOWHERE NEAR the absurdity that is micro-led costs and the massive issues, where they have to make small panels and stitch them together to create a tv to get anything remotely manufacturable that can yield.
that is the reason why they are freaking developing qned: because it's cheap oled-level performance with 0 burn-in + brighter....
can you please understand the basics of a tech before writing nonsense.
the issue with qned is NOT cost, just as it isn't cost with qdel.
the issue with qned is solving problems to get it to mass production, for example nano rod alignment, etc...
and samsung did freeze development for a burn-in free technology to squeeze qd-oled for a long time with 0 resell value and guaranteed burn-in.
so again YES this is planned obsolescence and YES samsung qned would be cheap.
oioioi9537@reddit
You basically agreed with me lol. Manufacturing issues are part of why qned has been expensive to make work, and it didn't work, so they bailed. Cost isn't just about bill of materials; production/manufacturing all count towards it. If something isn't necessarily expensive to make but is low yield, that can still make it expensive to produce.
nittanyofthings@reddit
Wait until inkjet OLED really brings down prices.
HamCheezeSliderz@reddit
Is that because small panels and large panels cost a lot more due to higher rates of defects or lack of tooling?
exomachina@reddit
OLED panels are made on large sheets and then cut to size, and only a certain number of different sized panels can be made on the same sheet. There's essentially these big ass machines that move the sheets around to different stations for the different sizes and densities, and then they are cut.
Printed OLED manufacturing basically streamlines the panel assembly process and reduces the complexity of the production systems needed to move them around the factory. You essentially have a single machine for each size/density working on a single large sheet that can be easily cut with minimal material waste.
HamCheezeSliderz@reddit
So before, the sizes cut were the ones that could extract the greatest margins, and now with printing any size can be made "on demand" to a certain degree, and it'll become cheaper going forward?
exomachina@reddit
Yea, and once tooling is mature it'll facilitate smaller plants that just want to produce smaller panels, instead of a huge plant that has to divide its main production into smaller panels.
HamCheezeSliderz@reddit
That sounds incredible. I had noticed OLED monitors come down in price these last few years, mirroring televisions, but I thought it was momentary.
FlatTyres@reddit
Once we get to 600 Hz we get the ultimate gaming AND media monitor - 600 is an integer multiple of 24, 25, 30, 50 and 60, which means judder-free playback (not talking about motion interpolation) of 24p films as well as both 50 Hz (25p, 50i, 50p) and 60 Hz (30p, 60i, 60p) made-for-broadcast content.
I'm still waiting for browser-based video streaming players to support VRR as a QMS-like feature in full screen mode in the meantime.
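To make the divisibility point concrete, here's a minimal sketch (plain Python; the judder_free helper is mine, not from the thread) of which content rates fit evenly into each refresh rate:

```python
# Which common content frame rates divide evenly into a given refresh rate?
# Judder-free playback needs an integer number of refreshes per content frame.
CONTENT_RATES = [24, 25, 30, 50, 60]

def judder_free(refresh_hz: int) -> list[int]:
    """Return the content rates that fit evenly into refresh_hz."""
    return [rate for rate in CONTENT_RATES if refresh_hz % rate == 0]

for hz in (480, 500, 600):
    print(hz, "->", judder_free(hz))
# 480 -> [24, 30, 60]          (misses the 50 Hz broadcast world)
# 500 -> [25, 50]              (misses 24p film and 60 Hz content)
# 600 -> [24, 25, 30, 50, 60]  (covers everything)
```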
loozerr@reddit
480Hz is divisible by all of those except for 50 and 25.
FlatTyres@reddit
That's the problem. The majority of the world broadcasts at 50 Hz and 480 Hz doesn't fit for the majority. 500 Hz works for 50 Hz regions but doesn't work for 60 Hz content or 24p.
loozerr@reddit
Barely a problem. 24/30/60 is the standard for digital media even in PAL regions.
FlatTyres@reddit
I don't understand how you got upvoted for this - the majority of professionally filmed content in former PAL and SECAM regions (most of the world) IS recorded and broadcast at 25p, 50i and 50p. That's how it will continue to be, so it is still a problem.
rubiconlexicon@reddit
Yeah that's lame, nobody wants to have to do that. This is why 600Hz (and multiples of it) are the holy grail refresh rates.
FlatTyres@reddit
I still switch manually - I default to 120 Hz on desktop but switch to 100 Hz for 25p and 50p content. Waiting for browser-based VRR fullscreen playback.
FlatTyres@reddit
Not at all for broadcast media. You won't find a studio in a single 50 Hz region shooting at 30p, 60i or 60p, digitally or back when tape was used. You'll only find 24p used for films intended to release internationally.
Phones default to 30p and 60p because most of the flagships are designed by Apple (US) and Samsung (South Korea).
OttawaDog@reddit
So? It's not like these monitors are fixed frequency.
If you need to do precise frame pacing work at 50/25, you can just set the monitor to a compatible frequency.
clearlybreghldalzee@reddit
VRR seems to apply to nearly any fullscreen app in Linux's GNOME. On Windows some players can trick the driver, like mpv with the Vulkan option.
FlatTyres@reddit
I tried to get VRR working with Chrome, Edge and VLC for playback on Windows and VRR became very erratic in web-based playback. I do intend to experiment with Linux some time. So far the default Media Player app in Windows works perfectly with VRR.
MrLancaster@reddit
I literally don't even care about numbers like this anymore. 1440p/4k 120/144hz and I'm good to go.
DezimodnarII@reddit
I have a hard time believing even pro gamers could distinguish between 240hz and 500hz in a blind test.
Admixues@reddit
I have a 240Hz strobed TN monitor and a 480Hz OLED. I can play just fine on both, but I'd rather not go back; it hurts my eyes to adjust, and it literally looks like a slideshow at first.
I'm not a pro, but I've peaked at Champ 5 on all roles in Overwatch, so I'm somewhat decent.
Thotaz@reddit
I call BS on it hurting your eyes. Your phone and your TV don't run at 480 Hz. Whenever you watch a random video/stream online it's at 60 Hz or less. Whenever you see a random billboard or display outside it's at 60 Hz or less.
You want to tell me that you avoid all those things or just deal with it hurting your eyes?
raydialseeker@reddit
Because those are things you watch content on; end-to-end latency is irrelevant there.
Wanna test how smooth a 240Hz OLED is? Just open Notepad, type out a sentence, and drag the window in a circle. Start slow. You'll come to realize that you can see sample-and-hold after-images at a VERY low speed. The faster you go, the worse it gets.
This is why Blur Busters has 1000Hz as the eventual end goal for sample-and-hold displays. Some OLEDs already push 720Hz, and NVIDIA is onto something with G-Sync Pulsar.
Thotaz@reddit
They said their eyes hurt. That's not a latency thing, and besides even if we want to be generous and give them that, that still doesn't explain the phone which is obviously interactive.
It's also very unlikely that Overwatch on a 480 Hz OLED display is their first and only gaming experience. Statistically they've likely played a console game at 30 or 60 FPS at some point in their life. Do you want me to believe it hurt them then? Or that having experienced 480 Hz OLED physically changed them so they couldn't get used to a lower framerate again?
Look, I'm not denying that high framerates are nice. I'm a bit of a framerate snob myself, meaning that if I can't get my desired framerate then I'd rather just not play at all. However, this idea that moving from 480 FPS -> 240 hurts and looks like a slideshow is obviously ridiculous.
wtallis@reddit
To me, the easiest way to understand how seemingly-outrageous refresh rates could still be providing perceptible benefits is to put motion in context of pixels per frame: if your mouse cursor or some other object is moving across the screen, and takes a full second to cross the screen, how many pixels does it jump every frame when running at 60Hz? At 2560x1440, it's ~42 pixels per frame for horizontal movement, which means typical screens have vastly more spatial resolution than temporal resolution. Fast motion (but not anywhere close to being too fast for our eyes to track) fundamentally cannot be displayed clearly and smoothly at slow frame rates; the only options are smearing or stuttering.
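That arithmetic is easy to reproduce; a minimal sketch in Python, reusing the comment's 2560-pixel width and one-second screen crossing:

```python
# How far an object jumps per frame if it crosses a 2560px-wide screen
# in one second. On a sample-and-hold display this jump is also roughly
# the width of the eye-tracking motion blur.
WIDTH_PX = 2560            # horizontal resolution from the comment
SPEED_PX_PER_S = WIDTH_PX  # crosses the screen in exactly one second

for hz in (60, 144, 240, 500, 1000):
    print(f"{hz:4d} Hz -> {SPEED_PX_PER_S / hz:5.1f} px per frame")
# 60 Hz -> ~42.7 px, 240 Hz -> ~10.7 px, 1000 Hz -> ~2.6 px
```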
Strazdas1@reddit
There is a huge difference between real-time gaming and passive watching.
DezimodnarII@reddit
Interesting, I thought since the jump from 120 to 240 was a lot less noticeable than going from 60 to 120, the jump from 240 to 480 would be even less once again. But maybe it's still more significant than I thought.
theunspillablebeans@reddit
Honestly it just depends person to person. When I tried showing my parents high refresh rate for the first time, they just didn't understand what they were looking for. What's significant and noticeable to one person might look negligible or identical to another. There is no objective indication of how you'll find it without trying it yourself. If your past experience is anything to go by, it's probably not worth the effort.
raydialseeker@reddit
Any person can notice motion artifacts on a 240hz display.
Open a notepad. Type something out. Drag and hold that window and spin it in a circle. At a relatively low speed you'll start seeing a sample and hold trail and the text will become blurry in motion.
FinancialRip2008@reddit
i would think that's more from the picture quality differences than the refresh
rubiconlexicon@reddit
I always see comments like this on reddit and they just leave me at a loss for words, as if 240Hz sample-and-hold motion clarity isn't absolutely abysmal next to even a 60Hz CRT. Are you also one of those people who says nobody can feel a 15ms latency difference from frame gen, despite the AG split-latency double-blind test showing that a lot of people can reliably discern latency differences as low as single-digit milliseconds? You mention blind tests to back up your claim, but the results of these tests would probably be a rude awakening for you.
-Purrfection-@reddit
It's people that don't understand the concept of motion clarity. They think the only thing higher FPS buys is smoothness, and there they definitely have a point: diminishing returns hit fast.
But mostly when people have this debate they're talking past each other, as there are multiple components to what FPS is.
eubox@reddit
they definitely can lol
even non pros could see the difference between 240 and 480+
Loose_Skill6641@reddit
they don't need to be able to tell
As LTT showed in the video below, even 1ms of extra latency affects your K/D ratio, therefore higher refresh rates absolutely improve performance because latency decreases
https://m.youtube.com/watch?v=5qjSGEOEaXo&pp=ygULTFRUIGxhdGVuY3k%3D
0g7t4m4zp3@reddit
Wow, 500 Hz? Do we actually need it?
KARMAAACS@reddit
If you play eSports games like Valorant, LoL or CS2, absolutely, more Hz is better; you can easily reach 1,000+ FPS with current hardware if you have the right settings in eSports games. Effectively you kill any blur and enhance motion clarity, whilst also making it easier to track targets, as your brain is getting more information per second and you remove delays and bottlenecks, allowing for better plays to be made.
That being said, yes, there are diminishing returns past 600 Hz: 1.67ms per frame at 600Hz vs 0.83ms at 1,200Hz is not as stark a difference as 16.67ms per frame at 60Hz vs 4.17ms at 240Hz. But regardless, it's still better to have more frames and higher refresh rates. In addition, OLED has way faster, almost perfect pixel transitions compared to LCD monitors, and as a result you can use these high refresh rate monitors at lower refresh rates, like in heavily demanding games, without the overshoot or undershoot you get on high refresh rate LCDs, where fixed overdrive can be a problem.
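The diminishing-returns point is plain frame-time arithmetic; a quick sketch:

```python
# Frame time in milliseconds, and the absolute time saved by each jump.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for low, high in ((60, 240), (600, 1200)):
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} Hz -> {high} Hz saves {saved:.2f} ms per frame")
# 60 Hz -> 240 Hz saves 12.50 ms; 600 Hz -> 1200 Hz saves only 0.83 ms
```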
What is a problem is that with OLEDs you can have VRR flicker, since gamma of the OLEDs is tied to the refresh rate, but this is really only an issue when you have erratic fast changes in frame rate.
There's basically no downside to high refresh rate OLED monitors, other than price or cost, they're effectively the perfect monitor technology until we have MicroLED commercially viable for consumers.
Feath3rblade@reddit
Burn in is still a downside for OLED, at least if you use your monitor for things other than gaming and content consumption
gabeandjanet@reddit
You should check out TFTCentral and other panel review sites for mini-LED LCD failure rates.
They are many times higher than OLED failures, including burn-in.
Unless you plan to buy a ten year old LCD panel with a CCFL backlight, all modern LCD panels have crazy failure rates due to mini-LED backlights being trash.
KARMAAACS@reddit
Burn-in is basically a non-issue, for the simple fact that if you run pixel refreshes and let the monitor do its pixel cleaning to dim all the pixels evenly, you basically won't ever see burn-in; the display will only lose its brightness slowly over time.
Modern OLEDs are in the 400-500 nit range. Assuming you lose 5% of your brightness annually to this process, by year 5 you're at roughly 350 nits from 450 nits. That's an exaggerated example, as most OLEDs don't lose that much brightness from the pixel refresh and cleaning process. But 350 nits is still more than usable. And by the time you lose enough brightness, say year 7 or 8, you're likely able to replace it very cheaply, or at the very least an equivalent OLED would be much better.
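A minimal sketch of that brightness math, assuming the comment's deliberately exaggerated flat 5% compounding loss per year:

```python
# Compounding brightness loss: nits = start * (1 - annual_loss) ** years
START_NITS = 450
ANNUAL_LOSS = 0.05  # the comment's deliberately exaggerated assumption

for year in (1, 3, 5, 8):
    nits = START_NITS * (1 - ANNUAL_LOSS) ** year
    print(f"year {year}: ~{nits:.0f} nits")
# year 5: ~348 nits -- still comfortably usable for desktop work
```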
I mean, IPS and VA panels still dim over time too, and an edge-lit LED backlight can break on a similar timescale to burn-in.
Believe me, I was a big sceptic of OLED monitors, believing that burn-in would be a huge concern. But after watching Hardware Unboxed and OptimumTech try to burn in their OLEDs on purpose by disabling all the burn-in and image retention mitigations (that they could through the OSD), and only really seeing changes after 24 months of heavy usage (outside of test patterns, which showed burn-in for months), it's basically a non-issue.
gabeandjanet@reddit
Agreed with everything except them being perfect tech.
They are still sample-and-hold, just like LCD, which is why they need such high refresh rates to get good motion clarity.
A strobed/pulsed panel with otherwise the same characteristics would look clearer in motion at 100Hz than a 300Hz OLED panel does.
That is the true endgame
gabeandjanet@reddit
As long as monitors use sample-and-hold refresh to display images, we benefit.
I have a 360Hz OLED monitor and the motion clarity is ridiculously better at 300+ fps vs at 120 fps.
It's also much less fatiguing for the eyes due to less blur.
Sample and hold blur is nasty.
Browsing is a joy at 360Hz too, because text stays fully legible while scrolling. Going back to e.g. 120Hz is jarring, because you have to refocus and look for where you were reading on the page every time you scroll.
Also I have a few racing games that run at 300+ fps, and when I play them for a while and then go back to a more demanding game that runs at, say, 100 fps, the 100 fps game looks way less smooth (and wayyy more blurry) in comparison.
Been playing Division 2 for the past week, which runs at 200-280 fps on my setup at 1440p, and it's so awesome. Super smooth, super clear in motion; it's a joy.
Not saying 120 fps is unplayable but high refresh is a significant upgrade
Strazdas1@reddit
we actually need 1000hz+ for clear motion.
Plank_With_A_Nail_In@reddit
We only need food and shelter, we don't need monitors at all.
This is all want and always has been.
goodnames679@reddit
Not everyone does, just depends on your use case.
Do you play almost exclusively single player games? If so, you probably won't benefit a ton from going above ~165ish, and honestly could get away with lower.
Do you play competitive games reasonably well? You could probably benefit from bouncing up to 240.
Are you trying to compete at the top ranks of an esports title? You could go up to 500 and easily justify it.
MumrikDK@reddit
Maybe at 500Hz the windows cursor actually looks smooth and crisp moving across the screen for once.
MonoShadow@reddit
For gaming it's a nice-to-have for motion clarity. I doubt many games can run natively at 500FPS, but this is what MFG is actually for. Some other stuff, like CRT emulators, also benefits from higher refresh rates.
So in general, the higher the better, even if you won't push to the limit natively.