BOE shows 31.5” 8K 120Hz LCD panel at SID Display Week, mass production later this year
Posted by Balance-@reddit | hardware | View on Reddit | 154 comments
At SID Display Week (DW 2025), BOE showed a very good-looking 31.5” 8K LCD monitor panel. It supports up to 120Hz in 8K, or can run at 240Hz in 4K. The firm told us it expects mass production later this year.
speller26@reddit
Honestly looking forward to 8k just for the perfect scaling at both 2160p and 1440p
gahlo@reddit
Huh, the first justification I've seen for 8k that I can see the usefulness of.
SmartOpinion69@reddit
32" 8k is excellent for multitasking with 3-4 windows on your screen at a time. 4k ultimately isn't good enough for small text when you consider that many programs have relatively large GUIs that take up a fixed amount of screen to begin with.
SERIVUBSEV@reddit
The fact that people feel the need for such justification speaks to how small of an upgrade these are going to be.
No one had to rationalize upgrading from SD to 1080p and then to 4K, because it was so obvious.
Skylancer727@reddit
I say the main benefit is it means the death of antialiasing. At 8K the image is so clear antialiasing is completely unnecessary as any pixel crawl is too small to be seen.
It would finally be the end of temporal smearing in games. At least when AI cheats like DLSS and FSR are no longer needed to run it.
Canadian_Border_Czar@reddit
The problem goes beyond just more resolution. The industry as a whole has pivoted away from ever-increasing resolution.
YouTube, Netflix, and more have paywalled high bitrate content delivery and many streaming services use compression even when providing high resolution content.
A Blu-ray disc can already be hundreds of gigabytes, so unless we're planning to transition to some sort of DRM protected ultra high transfer rate flash drive for movies, there's not a lot of practical applications for 8K content.
pdp10@reddit
Well, 100GB, and there are 128GB discs but I don't believe they've been used for commercial content releases. A typical bitrate for UHD Blu-ray is 80 Mbit/s at 2160p, with H.265.
Fun fact: commercial UHD Blu-rays can be high frame-rate (over 30 FPS), but there are only about a dozen titles that are.
bogglingsnog@reddit
I find a lot of folks don't really get that excited about 4k. It's just not that often that you absolutely need the extra detail it brings. Now, for multitasking I find it absolutely amazing, but as it turns out an awful lot of people don't really multitask much on computers...
Canadian_Border_Czar@reddit
I have a 4K TV, and I like it, but I'm not watching movies in a dark home theatre room with a huge sound system. There's no way to get the full immersion.
As for gaming.. it's hard to get excited about 4K when the hardware required for an acceptable playing experience is insanely expensive.
The enshittification of everything has ground technological progression to a turtle's pace. They all make more money by keeping things like 8K niche.
bogglingsnog@reddit
I do bluetooth streaming to 2 pairs of headphones and it really amps up the movie experience. Also move the couch close for the big screen effect.
4k gaming is quite a mess, but I'd rather play on a big screen at lower graphics settings than a little screen with pretty graphics. Aside from games and CAD though, it's not particularly hard to drive a 4k display.
iDontSeedMyTorrents@reddit
The justification is literally the same as your other examples. People want more screen real estate and better image quality. The fact that all other common resolutions scale perfectly with 8K is a sublime icing on the cake. It allows for a single monitor to do anything you want.
Lincolns_Revenge@reddit
Except that the jump from 4K to 8K isn't like the others. You're unlikely to see the difference in sharpness at any common screen size and viewing distance like you did with the previous steps. And as far as screen real estate goes, you'll be using settings to increase the size of nearly all text and graphic elements to make them more easily viewable.
-Purrfection-@reddit
Difference between 120ppi and 280ppi 'unlikely to see the difference'?
dahauns@reddit
ppi isn't the deciding factor, ppd is. Depending on viewing distance, you're reaching the limits of human visual acuity.
See: https://phrogz.net/tmp/ScreenDensityCalculator.html#find:density,pxW:7680,pxH:4320,size:31,sizeUnit:in,axis:diag,distance:0.5,distUnit:m
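dahauns' point can be sanity-checked with a few lines of Python. This is a rough on-axis estimate for a flat panel; the 0.5 m distance matches the linked calculator, and the ~60 ppd figure for 20/20 acuity is the usual rule of thumb, not an exact limit:

```python
import math

def ppd(px_w: int, px_h: int, diag_in: float, dist_m: float) -> float:
    """Approximate pixels per degree for a flat panel viewed on-axis."""
    ppi = math.hypot(px_w, px_h) / diag_in          # pixel density
    width_in = px_w / ppi                           # physical width
    dist_in = dist_m / 0.0254
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    return px_w / fov_deg                           # average ppd across the width

# 31.5" panel at 0.5 m viewing distance
print(round(ppd(7680, 4320, 31.5, 0.5)))  # 8K: ~110 ppd
print(round(ppd(3840, 2160, 31.5, 0.5)))  # 4K: ~55 ppd, near the ~60 ppd 20/20 threshold
```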
Zaptruder@reddit
Yeah... most people don't have such acute vision. Of course it depends on screen size and viewing distance....
But most users aren't quite filling up their field of view with pixels... hell, most aren't even getting 45 degrees FOV on retina - where 4k benefit is approximately maximal for 20/20 vision.
There's simply a limit to the benefits of acuity, and it's between 2k to 4k for normal users.
Even if you're not willing to admit the limits of visual acuity at 4k to 8k - surely you can agree that we're past the point of any returns if we go from 1 million pixels to 10 million pixels (horizontal, not total) - as the former is already more information than any eyeball can take in. So at what point do you think we hit diminishing, then no perceivable, returns?
-Purrfection-@reddit
I don't know, but side by side, between my Mac display and a 4K 32 inch, I can tell the difference, especially in text rendering. Not that the difference is massive or that the 4K one is bad or anything. I don't think I could see any use for something above retina, so 220ppi.
I don't use glasses or contacts.
Zaptruder@reddit
Really depends on your sitting distance and main use case.
Text is the biggest beneficiary of higher resolution displays as it provides the ideal use case for it (small sharp high contrast features) - but it'll basically go from clear text rendering to clearer text rendering.
Even so, there's diminishing returns - whether or not people downvoting me want to admit that is immaterial to the simple fact that we have a limited number of cones to detect high resolution information, and that our lenses are imperfect!
djent_in_my_tent@reddit
I’m with you on this. I have a 32” 4k OLED — which doesn’t even have the optimal sub pixel layout — and not only can I not discern individual pixels, I found that I need to run it at 125% DPI in windows and about 110% in most apps (firefox, libre writer….)
I’m happy for the advancement in tech but I have zero interest in 8k lol
I’m in my thirties and I’ve had lasik to correct severe nearsightedness. If I was 18 and could still sit as close to a screen as I used to, maybe I’d feel differently
Yebi@reddit
The idea of not being able to make out individual pixels being the apex of visual clarity is "the eye can't see more than 30 FPS" levels of bullshit
djent_in_my_tent@reddit
Well you do you, but I measure slightly better than 20/20, my monitor is about 140 dpi, my phone is about 460 dpi, and at equal distance they look equally clear to me on fine text.
iDontSeedMyTorrents@reddit
And yet there are still reasons to move to 8K. Even if you don't deem the increase in clarity worthwhile, plenty of others do, myself included. I've already mentioned perfect integer scaling, but even non-perfect scaling looks much better on a hiDPI display. The higher your DPI, the less non-standard subpixel layouts matter. It's no different from your smartphone in these regards. And there are people that would be very glad for the ability to display full 4K+ video in their editing work, for instance. Of course diminishing returns exist when it comes to only image quality, but 8K gives you flexibility that does not exist with 4K and below all in one package.
dparks1234@reddit
It also perfectly divides into 480p unlike 4K and 1080p.
240p 480p 720p 1080p 1440p 4K
All scale perfectly to 8K. The only resolution that’s really missing is “5K”, but that only really existed to cover integer-doubled 1440p. 8K has so many clean integer divisors that it’s almost like the CRT scaling days, where everything looked native.
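A quick divisibility check bears this out. The snippet below assumes 4:3 frames for 240p/480p (pillarboxed on a 16:9 panel) and 16:9 for the rest:

```python
# Common source formats: width x height
sources = {
    "240p (4:3)": (320, 240),
    "480p (4:3)": (640, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}
W, H = 7680, 4320  # 8K UHD

for name, (w, h) in sources.items():
    k = H // h                         # integer scale factor
    assert H % h == 0 and w * k <= W   # divides evenly and fits the panel
    print(f"{name}: x{k} -> {w * k}x{h * k}")
```

Every format scales by a whole number; the 4:3 ones land at 5760x4320 with black bars on the sides.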
Lukeforce123@reddit
Now it just needs to be 600hz for perfect 24, 25, 30 and 60 fps video playback
SmartOpinion69@reddit
wouldn't variable refresh rate fix this problem anyway?
terraphantm@reddit
With VRR is that really needed these days?
DNosnibor@reddit
How common is 25 FPS though? 120Hz should be fine, it covers 24, 30, and 60. And just use VRR to go down to 100 for 25Hz video if you really need it.
dparks1234@reddit
The PAL standard is 25FPS or 50 fields per second
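The 600Hz figure is just the least common multiple of the usual film and video rates, PAL included; a one-liner confirms it (`math.lcm` needs Python 3.9+):

```python
from math import lcm

# 600 Hz divides evenly by every common film/video rate, PAL included
print(lcm(24, 25, 30, 50, 60))  # -> 600

# 120 Hz covers 24/30/60, but not the PAL rates 25/50
print(lcm(24, 30, 60))  # -> 120
```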
joelypolly@reddit
I look forward to having phosphor effects emulated using an 8K display
tukatu0@reddit
? You can already do that easily. It's called the "death to pixels" filter, using a combo of shaders. https://old.reddit.com/r/RetroArch/comments/1jnam48/cyberlab_megatron_miniled_death_to_pixels_4k_hdr/ if you click on the guy's profile you will see a lot of stuff.
If you're talking about motion clarity, well, that's not that hard either. Add the CRT beam sim: 240hz is enough to emulate CRT TVs from the 80s, 480 for TVs from the 00s, and 960hz for CRT monitor clarity.
LightMaleficent5844@reddit
Very cool.
Shadow647@reddit
Nobody's going to watch 480p content on a $3000 8K monitor.
dparks1234@reddit
Gamers will
kickass404@reddit
How else am I going to enjoy my 1990s content as it was intended to be watched? 😆
-Purrfection-@reddit
Also 360p, 540p
Kamishini_No_Yari_@reddit
That makes this a very desirable monitor. I'll be keeping an eye on it to see if it's worth buying
matrixhaj@reddit
Or just play at 4k? For the price of an 8k display you can already get the hw
BlueGoliath@reddit
Does 8K even offer any visual improvements?
iDontSeedMyTorrents@reddit
Absolutely. Even a 27" 4K panel is only 163 ppi.
Realistic_Village184@reddit
163 PPI is close to the point at which individual pixels are no longer distinguishable to the human eye. 330 PPI is cited as the optimal density for print since you hold books and newspapers close to your face, but 163 PPI is pretty sharp for a monitor that is around three feet from your eyes.
I'd guess that there's some minor increase in visual fidelity unless you're 4+ feet away from the monitor, in which case I doubt anyone can see the difference between 4k and 8k for a 31.5" display.
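For reference, the densities being thrown around in this thread come straight from the diagonal pixel count:

```python
import math

def ppi(px_w: int, px_h: int, diag_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(px_w, px_h) / diag_in

print(round(ppi(3840, 2160, 27.0)))   # 27" 4K   -> 163
print(round(ppi(7680, 4320, 31.5)))   # 31.5" 8K -> 280
```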
iDontSeedMyTorrents@reddit
You don't have to discern individual pixels to notice an increase in clarity.
Hayden247@reddit
And for video games, stuff like aliasing is certainly noticeable even after individual pixels are difficult to pick out. Modern TAA tries to combat it, but it's either not enough or it just smears the image instead, where you'll want more resolution (and frame rate) to regain clarity.
armady1@reddit
Not sure about 31.5”, but there is 100% a major difference in clarity and crispness at 32” 4K vs. 6K, and 27” 4k vs. 5k, as far as UI and text rendering go. It's not even close; it's a world of difference regardless of the operating system.
RuinousRubric@reddit
The human visual system does a tremendous amount of image processing, and can tease out some kinds of information about features substantially finer than is implied by the human eye's optical resolution. Vernier hyperacuity is particularly important to raster displays, and is why it should take displays with a PPI an order of magnitude higher than current monitors to finally eliminate aliasing artifacts.
BlueGoliath@reddit
How does that translate to visual quality though? I'd imagine there is a cost vs benefit ratio there.
hollow_bridge@reddit
https://techcrunch.com/wp-content/uploads/2017/05/ppi_for_everyone.png
Scale the image so that it does not exceed the ppi of the device you are using, then move yourself or the device until the 1cm measurement is perceptible, and decide for yourself.
iDontSeedMyTorrents@reddit
You may not notice it in games, but everything else you do is much nicer and crisper.
Of course there's a cost/benefit to it - there is to everything. 4K to 8K is definitely into diminishing returns and won't be as noticeable as 1440p to 4K, but it's still tangible to people like me. What's even better is the integer scaling benefits for gaming which will be very useful for a much larger number of people.
I'd argue 8K-class displays are the last major resolution milestone for computer monitors.
BlueGoliath@reddit
The more I look into it the more convinced I am that 8K is just some audiophile equivalent obsession but for displays.
-Purrfection-@reddit
Just take a look at any desktop Mac display, it's obviously better.
iDontSeedMyTorrents@reddit
It's really not. Aside from integer scaling, a hiDPI display also minimizes any subpixel layout effects, like color fringing on QD-OLED.
You don't have to want it yourself but there are benefits many people will actually like.
BioshockEnthusiast@reddit
I wonder if we will ever bother as a species going past 8k in actual retail products. The diminishing returns are already hitting hard at 8k.
armady1@reddit
Text and UI elements look way crisper at higher PPI, anyone who says otherwise really needs their eyes checked imo. The easiest comparison is comparing a 5k 27” display side by side with a 4K 27” display regardless of operating system.
yourrandomnobody@reddit
This infographic might give you an idea:https://i.imgur.com/mSCrYtl.png
Source: https://phrogz.net/tmp/ScreenDensityCalculator.html
BlueGoliath@reddit
According to info online a 32 inch monitor at 4K is 137 PPI. So 8K at a practical monitor size is audiophile level obsession it sounds like.
yourrandomnobody@reddit
21-32" 8K is the minimum pixel density we should go for.
4k 32" is a mediocre target to be at.
Read through these links above, which I've provided.
BlueGoliath@reddit
Ah, I thought those numbers were PPI. My bad.
iDontSeedMyTorrents@reddit
That's because 24" 1080p is garbage. It's only ~92 ppi. 32" 4K is 50% denser.
account312@reddit
And 1080p blows away 640x480. That doesn't mean it's very good.
Raikaru@reddit
Text looks sharper is pretty much the best benefit
DeliciousIncident@reddit
Tbh that's good enough ppi given the distance you would be viewing the 27" display from.
Standard-Potential-6@reddit
Not for fonts. I still remember dreaming of 300dpi in the 1990s.
Sevastous-of-Caria@reddit
My 15.3 1080p? It's a bit surreal to me that these cheap gaming laptops have better ppi than industry leading 4k panels.
reallynotnick@reddit
That really depends on screen size and viewing distance.
carpeggio@reddit
I'm a little aloof about scaling, as I've never dabbled in it or had to deal with it, having never gotten a 4k monitor.
Is what you're talking about expected to be more commonplace in the industry as 4k and higher monitors become more common?
I ask because it makes total sense to avoid scaling issues but keep flexibility of use cases (without overkill - not many can game at 8k, etc.)
frostygrin@reddit
Integer scaling isn't the norm, no. Monitors don't do it, and graphics cards have it as a special mode in the settings.
slither378962@reddit
Remember to turn off upscaling in your games too!
marxr87@reddit
exactly! everyone misses this when they complain that either we aren't ready for 8k (gpu-wise) or that it's unnecessary. Perfect scaling is the reason 8k is the dream. And then multiples after that, which will take forever (16k, 32k). 120hz and maybe eventually 240hz is pretty much all anyone realistically needs. After that it's all about colors and peak brightness etc. 12bit, 10k nits when?
GilroyWarlord@reddit
Definitely interested in 8k 120hz. My 5090 is about to pull overtime. DLSS and Frame Gen are a must to reach these levels
kasakka1@reddit
32" is a stupid format for 8K. Give me that in a 40-50" and I could make a real desktop powerhouse out of it.
SmartOpinion69@reddit
40 is like the absolute max that i would be able to tolerate.
there comes a point where the monitor is so big that you're better off moving close to the monitor or just getting a curved wide screen.
WaterLillith@reddit
I have no desire for 8K for content consumption. For content creation it's useful
SmartOpinion69@reddit
8k is for when you have a bunch of windows open and need to see them all at the same time. 4k doesn't provide enough real estate when you tile windows
theorist9@reddit
Dell has had an 8k 31.5" (280 ppi) for many years, though it's limited to 60 Hz. It's superb for text work--those who've used it say they notice the sharpness difference vs. Apple's Retina monitors (220 ppi).
It's glossy, which anything with such high ppi probably needs to be. Matte coatings reduce text sharpness, and this becomes more noticeable as pixel density increases. Its downside is a lousy AR coating--nowhere near as good as that on Apple's glossy screens--making it far too reflective.
It's not practical to use 280 ppi on Macs, because you'd need to scale it to 3:1 to get a reasonable UI size (it's very small at 280 ppi), and macOS only offers 2:1.
If Apple offered 3:1 scaling, the 8k 31.5" BOE could be spectacular (if it has a glossy screen and a good AR coating). Until then, I'd rather see a 43" 8k (=220 ppi), which would be great for working on large spreadsheets (on my 27", for spreadsheets with a large number of rows and columns, to see the entire spreadsheet at once requires such a low zoom that the text becomes hard or impossible to read).
Yodawithboobs@reddit
5k 32 inch 120 Hz to 144 Hz Mini LED would be perfect.
battler624@reddit
5K 32" would be damn terrific.
JtheNinja@reddit
I’d love a 5K 32”. I really prefer the larger size, but they’re just SOOO rare with higher PPI that I end up settling for 27” 4K instead. Acer showed off a 32” 5K, but it’s an edge-lit LCD.
There’s also the Apple ProDisplay XDR, but 60hz and 512 dimming zones for $5200 is not quite it. It’s surprisingly easy to make work with a Windows box though, doubly so if you’ve got a Mac handy to configure it, since it keeps settings between devices. But if you’re fine with the default color space settings and leaving Windows HDR on full time, you don’t even need that. Just a bidirectional DP/USB-C cable and you’re off to the races.
data4dayz@reddit
It doesn’t help in terms of dimming zones / MiniLED but there’s a Dell 32” 6K that’s cheaper than the XDR. Still expensive but you can get it in pretty good condition for half price on eBay and BHphoto.
ASUS and LG (and Acer too??) have 6K60hz announced. The ASUS one just got pushed back to Q3 potentially according to an ASUS community correspondent on Reddit in the monitors subreddit.
The LG is vaporware imo, idk when that thing is gonna come out or if it ever actually will. Same with the ASUS 8K, which apparently only came out in China.
I got the Dell 6K, pretty solid but I can’t single monitor it like I thought I would. I might get another side panel either another 32” 4K or a 24” 4K portrait or a DualUp or something.
innovator12@reddit
4K 32" with 130% scaling is a pretty good experience for text, but does horrible things to screenshots scaled through the web browser. Probably want 6K to get 200% scaling and fix that.
But you're right, there are far too few >4K options.
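The screenshot problem comes down to fractional scale factors: logical pixel edges land between device pixels, so bitmaps have to be resampled. A tiny sketch of the idea:

```python
# At integer scale factors every logical pixel edge lands exactly on a
# device pixel edge; at fractional factors bitmaps must be resampled.
for scale in (1.0, 1.3, 1.5, 2.0):
    edges = [i * scale for i in range(10)]  # first few logical pixel edges
    clean = all(e.is_integer() for e in edges)
    print(f"{int(scale * 100)}%: {'clean integer mapping' if clean else 'resampling needed'}")
```

Only 100% and 200% map cleanly, which is why a 6K panel at 200% avoids the blur that 4K at 130% produces.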
quack_quack_mofo@reddit
It's not out yet, but check out Acer Predator XB323QX. 5K 31.5". IPS though, which is good enough for me
JtheNinja@reddit
That's the one I mentioned in my comment. It is, tragically, an edge lit LCD and will have useless HDR performance as a result. I just hope other people buy it so we get a miniLED or OLED model in the same form factor
quack_quack_mofo@reddit
Is edge lit only bad for HDR? And fine for SDR?
JtheNinja@reddit
Well, the contrast ratio kinda sucks for SDR too. It's just not the colossal "HDR effectively does not work" level problem you get with HDR content.
Vb_33@reddit
It's an odd resolution since it's meant for professional use. 8K divides evenly into 4K.
gahlo@reddit
"5K":1440p as 4K:1080p
mduell@reddit
I’d love a 6K60 that can do 3K at higher rate (240 would be the math, but 1xx is fine too).
reddit_equals_censor@reddit
no, you absolutely do NOT.
what you want is a proper scaler, so that the panel isn't gimped because the company wanted to save some pennies, or, when that isn't the thing holding things back, display standards with enough bandwidth to not hold things back either.
what you actually want in your case is a "6k" 240 hz display that also does integer scaling to 3k.
the "dual mode" monitor part is just bullshit.
it is mostly display manufacturers wanting to save some pennies on the scalers to be able to market "240 hz", while actually selling a 4k 120 hz screen with a 1080p 240 hz mode, which is also broken, because they couldn't even be bothered to use integer scaling for the 1080p mode.
it is about scamming people through marketing bullshit.
it is about REMOVING features from you.
REMOVING 6k 240 hz from you to sell you 6k 60 hz only, then with a "240" on the box.
if the panel can do 240 hz response-time wise, then it is about saving a small bit of money on the scaler.
and if it is a bandwidth problem of the current highest bandwidth connections, then the display industry is, for no reason whatsoever, lagging behind the bandwidth displays demand.
-Purrfection-@reddit
Is it really about saving money, or is it just that there are no scalers available that can do 4k 480?
reddit_equals_censor@reddit
well you are talking hypothetical here, because the monitor above is just 8k 60hz and 4k 240 hz.
and of course we got 4k 240 hz scalers.
also scalers get developed, if the display companies want them to exist.
so if we got panels and cables, that could do 4k 480 hz, but we don't have scalers, then that is a CHOICE! by the display industry already.
it was just made earlier on.
and again about saving money: delaying the development of 4k 480 hz scalers, but still being able to throw a "480" on the box, even if it is shitty 1080p 480 hz without integer scaling.
so YES your example would also be about saving money, just at another point.
if you wanna be REALLY REALLY generous, then you could excuse your example for very small display companies, that aren't developing their own scalers and blame the big companies, but i am for giving 0 generosity to the industry full of lies and scams.
__
if you're wondering about lies and scams. lie: fake g2g average response times, where monitors with 12 ms real response times get a "1 ms" claim put on the box and in marketing.
and scam: oled planned obsolescence and dead pixel units being claimed and sold as "functional", while they are actually broken.
-Purrfection-@reddit
Ok? I mean why don't they just develop 16k 10000hz right now, it's their choice not to.
reddit_equals_censor@reddit
because we don't have 16k 10k hz panels yet? nor are they coming up on the horizon.
even oled can't do 10k hz as oled has roughly a 0.3 ms response time as measured by hardware unboxed. this enables a 3000 hz refresh rate.
and i don't know the smallest pixel size that we can get with oled or lcd right now to get a 16k 32 inch display for example.
so please remember, that i was specifically talking about the scalers.
if we have the panels, and the panels can do the refresh rate and have the resolution, then they should not be held back by some shitty scalers.
that is very basic stuff.
again the industry is selling "dual mode" displays, instead of putting a better ALREADY EXISTING scaler in displays now, that would run max refresh rate at max resolution, or close to it.
it isn't used, because this way they can throw the high refresh rate number on the box, despite the fact that you shouldn't use it: not only is it at a very very low resolution, but they couldn't give enough of a shit to have it use integer scaling either.
and if we are talking about having scalers ready for products as they come out, YES they are in charge when they start developing scalers, that aren't holding back the panels and display connection standards as they come out.
we are not talking about developing something impressive and completely new like your mentioned 16k 10k hz display.
we're talking about creating the chips to drive the actual hard part, which is the panel.
mduell@reddit
I mean I can’t push 6K240 on any graphics card I’ll buy, plus the bandwidth is insane, so I’m fine with 60Hz at high res.
reddit_equals_censor@reddit
do you mean performance wise, or connector bandwidth wise?
because if the bandwidth issue doesn't exist (i don't know the max cable length you can get)
but gpu performance wise itself, you still get the benefit in lots of older games then and for any future graphics card upgrade.
a monitor can be used for 10 + years easily, while you might replace a graphics card in 3 or 4 years possibly.
and in idk 7 years your graphics card might drive that 6k 240 hz, and you'd be pissed off at it being just 60 hz then.
in other words, get some benefits now and grow into the monitor over time nicely as well gaming wise.
__
btw you want at least 120 hz and not 60 hz, because at 6k 120 hz you can get freesync + lfc and thus have a much smoother experience.
mduell@reddit
Both.
Own_Nefariousness@reddit
Dude same. Dual Mode 6k/3k for 31.5 inch and 5k/2k for 26.5 inch are my dream displays. It's the reason I hope Nvidia starts chasing 8k again, because that would mean 6k and 5k are much easier to do. Not to mention that DLSS works better when there is a higher base resolution. 6k at DLSS High Performance would run at 14,400 pixels shy of 2k Native. For reference, 2k is the resolution 8k DLSS Ultra Performance runs at.
Vb_33@reddit
The 5090 is a significantly better 8k GPU than the 4090, so progress toward 8k will happen regardless. The problem is that the end of Moore's law has killed raster gains, so progress will be slower.
-Purrfection-@reddit
Yeah the difference between the 4090 and 5090 is way bigger at higher resolutions like 8k. It's easier to keep all the CUDA cores occupied at any one time.
Hamza9575@reddit
The 5090 does Stellar Blade at 55fps at 8k native max settings, and that game was released just now. So the 5090 is an 8k gpu for many games.
tukatu0@reddit
Stellar Blade runs at 4k 30 (up to 40) on a 3060. It's not the hardware that's the problem, it's the software that will end up gimping everything
hurtfulthingsourway@reddit
Now DisplayPort and HDMI need to step up their game. GPUs that support 8K without compression max out at 75Hz, and they are few. I really want an 8K display, but HDMI and DisplayPort are not ready: UHBR20 is too slow, and HDMI 2.2 is too slow as well without compression. Maybe in a few years we can have really nice displays.
As far as I know, for smaller monitors 8K is the limit for being able to see pixels, and for larger monitors I think it was 12K. Really no point in higher resolution after that for anything other than TVs and projectors.
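The bandwidth complaint checks out on the back of an envelope. Assuming 10-bit RGB and ignoring blanking overhead (so the real requirement is somewhat higher):

```python
def video_gbps(w: int, h: int, hz: int, bits_per_channel: int = 10) -> float:
    """Uncompressed RGB video payload in Gbit/s, blanking overhead ignored."""
    return w * h * hz * bits_per_channel * 3 / 1e9

# 8K vs the ~80 Gbit/s of DP UHBR20 and ~96 Gbit/s of HDMI 2.2
print(round(video_gbps(7680, 4320, 120)))  # -> 119, over both links uncompressed
print(round(video_gbps(7680, 4320, 60)))   # -> 60, feasible without DSC
```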
Valuable_Associate54@reddit
Wonder if this will use that GPMI standard China's been cooking
capybooya@reddit
What is the problem with compression? People are already running that just fine on DP1.4.
callanrocks@reddit
People are running it fine but it's silly to go crazy on resolution and introduce lossy compression into the mix.
PastaPandaSimon@reddit
DSC is visually lossless. We are as ready for 8K as we were for 4K 240hz when they launched.
terraphantm@reddit
Visually lossless is a fancy way of saying lossy. But I do think we're reaching a point where the compression has to be accepted; the bandwidth required for 8k at high refresh rates is tremendous.
eleven010@reddit
I agree that it might be hard to see the artifacts with DSC, but for me, visually lossless is not the same as lossless: whether artifacts are visible depends on the visual acuity of the user.
Without compression, the user's visual acuity doesn't come into question, as nothing is being compressed and reconstructed.
PastaPandaSimon@reddit
Visually lossless is meant to mean that there aren't visible artifacts, and compression is done to data that does not affect output perceivable by the human eye.
Say, jpg or video compression would bin similar pixels into blocks of the same color, resulting in visible artifacts. DSC does not seem to do that.
There were a couple of tests by major YouTubers who tried to spot or even measure compression artifacts with the help of additional instruments, and they concluded that they could not find any evidence of them in visible light.
AccomplishedRip4871@reddit
The only real issue with DSC is the bad alt+tab experience from full screen applications, plus the lack of DLDSR support on NVIDIA GPUs before Blackwell.
terraphantm@reddit
I think software issues like that will probably be more feasible to fix than sending that sort of bandwidth uncompressed. And as you note, the DLDSR issue is fixed with Blackwell -- so that advantage is gone.
exsinner@reddit
Older games with exclusive fullscreen? Just get a borderless window mod and be done with it.
AccomplishedRip4871@reddit
He asked what the problem with compression is, i gave him examples.
Assaulter@reddit
Random question but maybe you'll know, if you play cs2 on 4:3 stretched is there any way to get that res with borderless fullscreen? Since it seems i need to be fullscreen for it
AccomplishedRip4871@reddit
No, and it's not an issue specific to OLED monitors; normal LCDs can't do that either. Your closest result to this would be going with a WOLED monitor that has 2 modes, 1440p 240hz and 1080p 480hz, and getting used to playing at full HD.
exsinner@reddit
Cool! Dldsr at 8k is a good example as well.
StarbeamII@reddit
Also interesting in the article:
ryanvsrobots@reddit
That means it will have something like the rainbow effect in projectors. Going to be a nightmare in terms of eye strain.
bexamous@reddit
Gimmie 40". I tried 32" again; it's so tiny. 40" is perfect.
Qaxar@reddit
Anything smaller than 72" makes no sense for 8K.
DeliciousIncident@reddit
That's nice, but according to wikipedia, neither the latest HDMI nor DisplayPort standards support 8k 120Hz without compression (DSC) yet, so we can't really use it.
capybooya@reddit
DSC is perfectly fine though, the only problem I've heard of is it reduces the total number of outputs so you might not be able to run all the ports on your graphics card when you enable it.
ibeerianhamhock@reddit
Yeah, it is considered visually lossless as in people can't identify the correct source image (compressed vs. uncompressed) more than 75% of the time. Seems like an odd standard, but it basically means it might be ever so slightly better than chance that people can predict which signal is compressed.
dannybates@reddit
Yeah, I can toggle DSC off and on and I can't tell a difference at all
noiserr@reddit
8K will be nice for testing high end GPUs. It removes the CPU from being a bottleneck.
PXLShoot3r@reddit
Yeah that's not how it works
battler624@reddit
It actually does but at 8K you'll have other issues (such as vram)
Metalligod666@reddit
There ain't no way 8k is using 32gb of a 5090's vram. Maybe around 24gb in certain titles, but I would assume around 20gb for most current-day games. I don't know if this is something someone can test accurately rn with resolution scaling.
tecedu@reddit
Yes, it can run out of the 32gb of vram. A couple of people have tested in VR. Last time Linus tested, he was running out of 24gb at native res.
battler624@reddit
It won't fill 32gb but you can't do a GPU test without GPUs to use.
It will fill the 24gb tho, it happened with the 3090.
capybooya@reddit
Testing supersampling in VR will probably do the same. Way too few of those benchmarks though.
reddit_equals_censor@reddit
we already got driver-level super resolution, we don't need native 8k for testing.
noiserr@reddit
You're still rendering the final image at 4K, which saves on memory bandwidth used by ROPs. So it's not a true 8K test.
Blueberryburntpie@reddit
So test the 8GB GPUs with the 8K resolution? /s
capybooya@reddit
That's great for professional cases and designers.
4K is perfectly fine, even at 32". I guess ideally 6K could bring some barely noticeable improvement for fonts and photos, possibly in motion in gaming, but I'm not sure, and that's way down the list of priorities right now. I'd rather have higher refresh rates at max resolution, or good black levels with some tech that doesn't degrade and burn in like OLED.
Vb_33@reddit
1080p monitors are fine for typical use but there's no doubt that the pixel density 8k provides would improve the experience over a 4k monitor. Technology is moving forward and that's a great thing.
exilus92@reddit
1080p is certainly "usable", but it's not as good as 1440p for productivity. The gap between them can be very large depending on which applications you use. A coder might not care, but it's night and day if you work on CAD drawings. It's also fucking great for vertical monitors, because 1440p at 27" is just big enough to fit the full toolbars in popular productivity software (e.g. anything made by Microsoft). 4K is even better, but I'd argue it's mostly aesthetics compared to 1440p, diminishing returns and all that; it also brings a few scaling issues.
ThankGodImBipolar@reddit
IIRC even 1440p at 32” has approximately the same PPI as a 24” 1080p monitor. The highest DPI monitor I’ve used daily was a 27” 1440p monitor and I traded it for another 32” 1440p monitor the second I could. You can do a little better than that nowadays but it’s still totally serviceable and much better than 27” at 1080p (which I used in ultrawide format for a while).
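The PPI claim above is easy to verify; a quick sketch (PPI is just the diagonal pixel count divided by the diagonal size in inches):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92
print(round(ppi(2560, 1440, 32)))  # ~92 -- effectively the same density
print(round(ppi(2560, 1440, 27)))  # ~109
```

So 32" 1440p really does match 24" 1080p almost exactly, while 27" 1440p is noticeably denser.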
capybooya@reddit
I still use a 1440p 32" alongside my 4K monitor, and I used 1440p 32" and 1600p 30" for many years. Yeah, it's perfectly serviceable. But side by side with the 4K one there's a stark difference: text is so much smoother and the pixel grid is gone (although your mind kind of filters the grid away when you don't have a reference). That's why I imagine going beyond 4K will have extremely diminishing returns, though I don't rule out some benefits either, given my previous experience. All the way to 8K at this size seems ridiculous though.
PastaPandaSimon@reddit
I'm so excited for this combined with the improvements to DLSS. Just imagine near-perfect upscaling from 1440p to 8K while using hardware resources closer to what native 1440p gaming takes. We are so close to that reality.
I use a 4K 32-inch monitor, which is 90% of the way to perfect sharpness, but not quite there. Going up to 8K would finally complete the illusion that you're not looking at a pixel-based display at all, just a perfectly sharp image.
HandofWinter@reddit
This is what I've been waiting for, holy shit. It's close to perfect, and I'll probably be willing to pay a semi-unreasonable amount for something like this.
thelastsupper316@reddit
Whyyyy 8K? It's so dumb. 8K TVs died and the industry is completely focused on 4K R&D right now. Only Samsung still sells them, and I think they might finally drop them next year because they're selling fewer every year.
Agreeable-Weather-89@reddit
In terms of resolution, if 4K makes sense at 43", which it does, then 86"+ would suit 8K, and certainly 100"+.
I do agree 8k at 32" is overkill but I could see 8K at 55" for a PC monitor.
The big problem is content, there's very little professional content at 8k and there's no point in a resolution which has no content.
I just wish that instead of chasing resolution and hardware stats, the industry would fix the clusterfuck of HDR standards.
Artoriuz@reddit
The "there's no content" argument doesn't really matter that much for a computer display.
UI elements, fonts and images all benefit greatly from a higher resolution.
Text looks a lot better on a modern smartphone than it does on a 32" 4k monitor for example, even considering viewing distance.
Of course, how important this is varies from person to person, but there's still a lot of room for improvement.
reddit_equals_censor@reddit
in a lot of these cases text clarity is affected by subpixel layouts more than anything else.
the phone could use a weird subpixel layout, but it gets properly addressed by the phone's os, OR DOES IT?
and on pc the 4k screen could be an oled with a weird subpixel layout instead of RGB, which makes text vastly worse, and not just clarity but also text edge artifacts.
just throwing resolution at it generally makes those problems much better, but the subpixel situation is important to remember here. it certainly isn't just a resolution thing.
iDontSeedMyTorrents@reddit
Subpixel layout really is meaningless at high DPI. That's why it doesn't matter at all on phones nowadays.
Artoriuz@reddit
With enough pixel density the subpixel layout stops being important as you don't need to do subpixel rendering anymore (grayscale starts being good enough).
iDontSeedMyTorrents@reddit
Just wrong. Even for gaming, 8K would allow you easy integer scaling for 1080p, 1440p, and 4K. You will never have to compromise your preference for resolution or frame rate because of limits from your monitor.
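The integer-scaling point checks out arithmetically; a small sketch showing that 8K divides evenly into each common resolution, so every lower-res pixel maps to an exact NxN block of panel pixels:

```python
# 8K (7680x4320) divides evenly into the common lower resolutions,
# giving clean integer scaling with no interpolation blur.
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    assert 7680 % w == 0 and 4320 % h == 0
    print(f"{name} -> {7680 // w}x{4320 // h} integer scaling")
```

1440p is the notable win here: it scales 3x into 8K, whereas on a 4K panel it lands on an awkward 1.5x factor.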
thelastsupper316@reddit
I mean HDR is kinda fixed now on most TVs and TCL and Hisense support all HDR standards other than the failed advanced HDR by Technicolor.
Agreeable-Weather-89@reddit
But not all apps support it, and some services only offer HDR on certain subscription tiers.
Then, with so many standards, there's probably a backend headache, and I know I've seen articles about HDR mastering issues here and there.
Content creators also rarely support it, even big technical ones.
Windows 10/11 still has HDR issues.
thelastsupper316@reddit
HDR on YouTube is still a mess, and yes, some services give you fucking 720p low-bitrate stereo unless you pay a lot extra. But HDR right now is a software issue, not a hardware one, and TV settings are definitely not the easiest thing for most users.
Agreeable-Weather-89@reddit
Exactly.
The panels themselves are good, great even, it's the software which lets everyone down.
I have more settings just for brightness on my TV than I do on my phone, despite my phone being used in far more environments.
Most settings are useless additions to make up for bad defaults and unless you mess with those settings you lose so much 'performance'. It's the same with sound.
This isn't directed at you but at TV companies.
You are in control of the hardware; you know what speakers you are putting in these TVs. Just make it sound good. Stop bombarding me with settings, half of which make it worse and the other half of which do nothing. The only three sound settings should be:
- Output
- Voice enhance (just because of accessibility)
- Is the TV wall mounted

Everything else is useless.
DesperateAdvantage76@reddit
8K at 32" means you no longer need antialiasing; it has no effect, since the eye can't discern individual pixels.
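Whether pixels are truly indiscernible depends on viewing distance and eyesight, but a rough sketch supports the claim (assumptions: 60 cm viewing distance and the common ~1 arcminute rule of thumb for normal visual acuity):

```python
import math

# Angular size of one pixel on a 32" 8K panel viewed from 60 cm.
# Rule of thumb (assumed, varies per person): features below about
# 1 arcminute sit at the limit of normal visual acuity.
ppi = math.hypot(7680, 4320) / 32          # ~275 pixels per inch
pixel_mm = 25.4 / ppi                      # ~0.092 mm pixel pitch
arcmin = math.degrees(math.atan2(pixel_mm, 600)) * 60

print(f"{arcmin:.2f} arcmin per pixel")    # ~0.53, below the ~1' threshold
```

At roughly half an arcminute per pixel, individual pixels (and hence stair-step edges) fall below the usual acuity threshold at desktop distances, which is the basis of the "no AA needed" argument.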
capybooya@reddit
You'd need to upscale anyway though, and DLSS and related techs remove most of the aliasing in that process.
DesperateAdvantage76@reddit
On newer games yes, but many of my favorite games I render at native resolution. League, D2R, WoW, etc.
folowerofzaros@reddit
You do, because modern TAA is used for more than jaggies.
DesperateAdvantage76@reddit
I'm specifically talking about the jagged edges you see from being able to view individual pixels, not artifacts from upscaling, lighting, etc.
ProtoplanetaryNebula@reddit
8K TVs are not necessary, but the reason they didn't take off is that the price was too high to justify when 8K content was nonexistent. If the price had been a bit closer, some people would have bought them anyway, figuring content might come through eventually and they'd be ready.