HDMI 2.2 standard finalized: doubles bandwidth to 96 Gbps, 16K resolution support
Posted by EmergencySwitch@reddit | hardware | View on Reddit | 234 comments
heylistenman@reddit
Will the average consumer need this much bandwidth or have we reached a limit on noticeable improvements? Call me a boomer but I struggle to see why screens need more than 4K 120hz. Perhaps for fringe cases like pro-gamers or very large commercial screens.
tukatu0@reddit
Very much so https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
And more recently https://blurbusters.com/massive-upgrade-with-120-vs-480-hz-oled-much-more-visible-than-60-vs-120-hz-even-for-office/
sabrathos@reddit
I agree, but also I think we don't need this much bandwidth. We need eye-tracked foveated rendering!
When I'm playing alone on my desktop computer, I shouldn't need the cable to carry full detail for every single part of the screen. I would love to have the immediate fovea be 8K, the area surrounding that be 4K, and then the rest of the screen be 1080p-ish.
Give me that at 1000Hz refresh rate, please (with frame generation up from ~120fps for non-esports titles). If I need to swap to non-eyetracked mode for whatever reason, I'm happy for it to max out at 4K240Hz.
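As a rough sketch of why that saves so much bandwidth (the region sizes and densities below are illustrative guesses, not from any spec or product):

```python
# Pixel budget per frame: full-field 8K vs. a three-tier foveated frame.
# Region sizes and densities are illustrative assumptions, not from any spec.

FULL_8K = 7680 * 4320  # ~33.2 MP per frame

# Assume ~5% of screen area at native 8K density, ~15% at quarter (4K-like)
# density, and the remaining 80% at 1/16 (1080p-like) density.
foveated = FULL_8K * (0.05 * 1.0 + 0.15 * 0.25 + 0.80 * 0.0625)

print(f"full field: {FULL_8K / 1e6:.1f} MP/frame")
print(f"foveated  : {foveated / 1e6:.1f} MP/frame "
      f"({foveated / FULL_8K:.0%} of the full-field payload)")

# Raw (uncompressed, no blanking) data rate at 30 bits/pixel, 1000 Hz:
for name, px in (("full field", FULL_8K), ("foveated", foveated)):
    print(f"{name:10}: ~{px * 30 * 1000 / 1e9:,.0f} Gbps at 30 bpp, 1000 Hz")
```

Under these made-up numbers the foveated stream is roughly a 7x reduction versus full-field, though at 1000Hz it would still outrun a 96Gbps link without compression.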
Strazdas1@reddit
anything dependent on eye-tracking will be instantly useless the moment more than one person needs to see the screen.
sabrathos@reddit
Don't let perfect be the enemy of good. If your 8K1000Hz monitor with foveated rendering has to fall back to 4K240Hz when accommodating multiple people, that seems perfectly fine by me. The vast, vast majority of the usage of my monitor is personal.
Strazdas1@reddit
I agree, there's no point in doing eye tracking when regular screens can be good enough without it. However, doing eye tracking with more than one person in the room means it's a flat-out downgrade for them.
sabrathos@reddit
It's not a flat downgrade... at least compared to a non-eyetracked world. It's a baseline assured quality. Whatever maximum we can achieve full-field with the current bandwidth and rendering capability can be the fallback, and we're still inevitably going to keep developing to pump up bandwidths so that we eventually reach 8K 1000Hz full-screen. But in single-user scenarios we can improve the experience dramatically with fovea-aware rendering, making efficient use of both current bandwidth and rendering ceilings.
For your computer monitor, be honest: what percentage of the time do you have multiple people looking at it? For the vast majority of people, it's well under 1%. And if the multi-user fallback is the limit of the cable anyway? This seems like an argument from purity, not one from utility.
tukatu0@reddit
Encoders don't work like that. Even if you don't need to render it, the screen and its output are separate.
I myself am ok with fake frame tech. I f""" loathe upscaling though. The quality decrease is apparent even through f""" 1080p youtube compression, yet it's so praised. I'll stick to 8x frame gen that you can turn off, thank you. There's a whole discussion about devs forcing it on one day.
You say you are ok with only 4k 240hz, but that's already not true in a sense. All LCD VR displays strobe. It's the reason they are stuck at 100 nits without HDR. The Quest 3 has strobing equivalent to 3300hz (0.3ms). So even at just 2000p per eye, it's more like 4k 3000fps that you are actually already playing at. Even the first gen displays like the Rift and Vive strobed from 90hz to 540hz or so.
I guess you may be referring to latency, being ok with noticeable lag. But like I said, we are already there. They just do a bad job advertising it; it's not mentioned at all, by anyone, oddly.
Of course fake frames to 1000hz/fps would allow HDR and getting that brightness up a lot, so they need to strive for it. And all that other jazz that would come.
sabrathos@reddit
Uh... You kind of went off the deep end a bit.
I'm saying that a codec designed for foveated rendering would alleviate a huge percentage of our bandwidth pains. A nice middle ground between today's 4K 240Hz and an ideal future's 16K 1500+Hz would be a "smart" 1000Hz codec with foveated rendering information.
I said I'd be satisfied with having "just" 4K 240Hz as a fallback mode in case I have to present something to other people on my monitor. I personally would be using the solo 1000Hz foveated mode for 99.99[...]9% of the time.
I don't think you realized I was supporting the cause, lol. I'm a champion of high framerates, as well as low persistence displays.
I was spitballing 120Hz as a reasonable baseline for both input lag and minimizing artifacts as you project up to 1000Hz.
And yes, the hope would be high refresh, non-strobed (or at least selectively-strobed) displays so we can get 2000+ nits of brightness.
(As a side note, DLSS4 upscaling on my 4K 240Hz display with native 240fps content looks very good; I wouldn't discount it. The artifacts are dramatically reduced compared to 1440p.)
tukatu0@reddit
Ooooh, well why didn't you say so from the beginning. Bwahaha. It didn't cross my mind. You would think someone would have already made it, but it must not be easy. It's especially in the interest of youtube to make it. The countless videos that are just 1 hour long of a few frames repeated over and over would be the easiest to just reduce to 144p while keeping a small section at native level.
There is a lot to say on the fps part but frankly it doesn't matter until the tech is here. Around 2029 most likely.
your_mind_aches@reddit
I mean that's Blur Busters, of course they would say that.
Tons of people are now saying the screen on the Switch 2 looks amazing because of the wider gamut, and will be powering through the insanely blurry image due to the terrible pixel response time, not realizing just how good things could look if the display was good.
CarVac@reddit
But does it have to be 4k240?
I'm a huge proponent of 4k for desktop use but for gaming I'm extremely happy with rock solid 1080p240 with backlight strobing.
willis936@reddit
I'd settle for 8K960.
ScepticMatt@reddit
While flicker fusion means that we cannot see detail beyond 120 Hz (depending on a few factors, which is why darker cinema projection gets by easier with 24 Hz), there are good reasons to have higher refresh rates:
Time-Maintenance2165@reddit
That's objectively false. The reality is that it's far more complicated. We can notice effects at up to ~1000 Hz.
And there are many people who can instantly tell if they're playing on a 240 Hz or 480 Hz monitor. 120 Hz isn't even close to the limit.
ScepticMatt@reddit
Yes I don't deny the fact that we can tell 120 hz from 240 hz. From a 10,000 hz display even. That was my point.
But our eye has some sort of frame rate around 100 Hz (exact value depending on brightness, size, eccentricity, color). If we had a "perfect" 120 Hz display (sub-ms persistence, eye tracking, etc.), we in fact wouldn't notice more motion detail.
https://en.m.wikipedia.org/wiki/Flicker_fusion_threshold
Strazdas1@reddit
The Air Force did some testing with pilots and found that at 1/215th of a second (as in a 215hz display) they could obtain enough information about the shape of a plane to tell which model it was.
ScepticMatt@reddit
That's a different, albeit related statement. If you flash multiple different images of planes right one after another faster than CFF, they would blend together.
sabrathos@reddit
You're assigning way too much to the flicker fusion threshold.
You said yourself in the original post that higher refresh rates lower the stroboscopic effect. 120Hz is nowhere near fast enough to make this look natural, and lowering the persistence amplifies the visibility of this effect, not reduces it.
As long as our display is not actually updating so that content moves in 1px jumps per frame, our eyes will see "ghost jumps". The slower the refresh rate, the more extreme these jumps will be. To get perfect 1px/frame jumps at 120Hz would be only 120px/s motion, which is very, very slow. The lower the persistence, the more obvious the jumps, as anyone who's played games on a virtual screen inside something like the Quest 3 can attest to.
People aren't misunderstanding your post; I think they're legitimately downvoting your claims of the flicker fusion threshold representing something it doesn't (I didn't downvote, for the record). It's certainly a useful phenomenon to know of, but it's not some magical refresh value at which all content will suddenly look lifelike if we could only make a good enough display.
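To put those "ghost jumps" in numbers (a quick sketch; the panning speeds are arbitrary examples, not measurements):

```python
# Per-frame jump size for content panning across the screen: the bigger the
# jump, the more visible the stroboscopic stepping on a low-persistence display.

speeds_px_per_s = [120, 1000, 4000]   # slow scroll, moderate pan, fast flick
refresh_rates_hz = [120, 240, 1000]

for v in speeds_px_per_s:
    steps = ", ".join(f"{v / hz:5.1f} px @ {hz} Hz" for hz in refresh_rates_hz)
    print(f"{v:4d} px/s -> jump per frame: {steps}")
```

Only the 120 px/s row lands on the ideal 1px-per-frame motion at 120Hz; anything faster visibly steps.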
ScepticMatt@reddit
CFF is still the "refresh rate of the eye".
All these "ghost jumps", stroboscopic effect, phantom array effect etc in the end have one cause: temporal aliasing.
Just like with spatial or acoustic aliasing.
But in order to implement good temporal anti-aliasing (motion blur) you need very fast eye tracking that captures saccades. You need to filter out (i.e. blur) the difference between object movement and eye movement. Importantly, that does not mean blurring an object the eye is following on the screen; in that case it's the static background that needs to be blurred.
Also, you need a screen running at least twice the CFF to implement this filtering, per the Nyquist theorem.
sabrathos@reddit
Ah, I see now why you mentioned eye tracking in your later comment. Yes, with eye-tracking and content motion vectors, the content itself can add blur to counteract the stroboscopic effect while keeping moving content you're looking at clear.
I'm in agreement with your actual underlying point, just not the particular framing you used to communicate it. There is no quantized refreshing going on, and our rods and cones activate and decay in an analog way. The sensitivity with which this happens can be exploited by our displays/lights for certain perceptual phenomena, like flicker fusion.
If we actually had a "refresh rate", we would necessarily see the stroboscopic effect IRL instead of persistence of vision motion blur, and when looking at traditional displays we would have a whole bunch of out-of-phase issues with our eyes sampling at a different rate than the content is updating on-screen. This is why just dropping "CFF is the 'refresh rate of the eye'" and "we cannot see detail beyond 120Hz" without strong further qualification leads to just as many wrong assumptions as it does right assumptions, and isn't a great framing IMO (and isn't just nitpicking, but rather is exactly why people were misled as to what you were trying to communicate).
ScepticMatt@reddit
Yes, it's just semantics, but CFF is the limit of the temporal signal we can perceive.
Another analogue would be a camera, where the detail captured is not the megapixel count of the sensor but the line pairs per mm of the whole system including the lens.
And in a display, the temporal detail displayed is not just the refresh rate but also depends on image persistence and contrast. A recent example would be the Switch 2 LCD, which, while supporting 120 Hz, has worse motion detail than an OLED running at 30 Hz.
Morningst4r@reddit
Just because your brain will interpret low refresh as being cohesive doesn't mean you won't notice or appreciate higher rates. You can see the detail in faster refreshes; your brain is just filling in the gaps to make you not notice it's missing at lower frame rates.
ScepticMatt@reddit
The main reason you see more detail at higher refresh rates is lower persistence blur. This is why a 360 Hz OLED looks about as detailed as a 480 Hz LCD (both sample and hold).
Time-Maintenance2165@reddit
That's the sort of thing that I have a hard time believing until it's been empirically validated. And even then, it might only be true for an average human. I can't imagine that this sort of thing would be identical for every human, and there are going to be many with a far higher threshold than average.
ScepticMatt@reddit
As for your question about individual variation:
https://pubmed.ncbi.nlm.nih.gov/38557652/
Note that CFF strongly depends on contrast, size in terms of viewing angle and periphery of the object viewed. So don't be surprised to see a CFF value of 60 Hz in this experiment
Time-Maintenance2165@reddit
That's not evaluating what I was asking about. That's looking at the minimum, not the maximum.
ScepticMatt@reddit
If you display a full white screen followed by a full black screen, there comes a point where it transitions from visible flicker to looking like constant illumination (consciously, at least; the flicker might still trigger migraines).
You can test it yourself here. Note that in order to achieve a flicker rate above the critical flicker fusion rate, you would need a 240 Hz+ display for a 120 Hz simulated flicker rate.
https://www.testufo.com/flicker
Time-Maintenance2165@reddit
As I read more about it, the flicker fusion threshold seems to be a study of the minimum at which something appears smooth/steady. It's not looking at the maximum at which additional smoothness is no longer perceptible.
I also don't see anything in that link to support your claim about 120 Hz being perfect if it has the right persistence and eye tracking.
It is very misleading to say "we cannot see detail beyond 120 Hz".
tukatu0@reddit
Like the comment below pointed out, it's absolutely false. Even just 50 years ago those cinemas were probably 50 nits. Today with HDR 1000, even that same content should be around 4x as easy to see.
Using the temporal side as an example: scroll down to "Blur Busters Law Is Also a Vicious Cycle" in https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/ although you might need to read half the article to understand. The section right below it also has a whitepaper on 10,000fps.
I touched on HDR 1000 because of the brightness aspect. But even the same 100 nits max content with Rec. 2020 would have way more visible detail getting blurred by low fps.
GenZia@reddit
That's a fair point.
The difference between 120 and 240 is a mere 4.16ms. That's like going from 30 to 35 Hz.
While it's definitely noticeable, at least in my experience, it's not worth spitting out 2x more frames.
A classic example of the law of diminishing returns.
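The frame-time math behind that (a quick sketch; the Hz pairs are just examples):

```python
# Frame-time deltas shrink as refresh rate climbs, which is the
# diminishing-returns argument above.

pairs = [(30, 35), (60, 120), (120, 240), (240, 480)]

for lo, hi in pairs:
    delta_ms = 1000 / lo - 1000 / hi
    print(f"{lo:3d} Hz -> {hi:3d} Hz: frame time shrinks by {delta_ms:.2f} ms")
```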
innovator12@reddit
Twice as many frames means it's twice as easy to track your mouse cursor while moving rapidly across the screen.
Latency isn't everything.
GenZia@reddit
I wouldn’t be too sure about that “twice as easy” claim.
We’re humans, not machines!
A 4000Hz polling rate mouse won’t make you four times better a player than a 1000Hz one, even if it's technically four times faster.
P.S. It’s interesting how easily people buy into marketing fluff these days. No critical thinking whatsoever, apparently.
Strazdas1@reddit
if that 4000 hz polling rate mouse comes with a 4000hz display output it would certainly be extremely significant.
innovator12@reddit
Apparently you missed my point: when moving the mouse cursor rapidly at 60Hz, gaps between each position which is actually drawn are quite large. At 120Hz and the same speed, the cursor will quite literally be drawn twice as many times, halving the distance between cursor images. The same applies when jumping from 120Hz to 240Hz.
Time-Maintenance2165@reddit
You're right he's exaggerating slightly.
Though he's still right in concept so not sure what your rant about marketing is about.
GenZia@reddit
“Twice as easy” isn’t a slight exaggeration, for starters.
As for my one-sentence long “rant,” it seems people will buy anything that’s black, glows with RGB, and comes plastered with slogans like “Gamer,” “Gaming,” “Extreme,” “Pro X,” “Ultimate,” “Turbo,” and “Next-Gen.”
Take what you will.
Time-Maintenance2165@reddit
Yes, it is.
Boring irrelevant tangent.
Time-Maintenance2165@reddit
Yes, it is.
Irrelevant tangent.
Morningst4r@reddit
Why does it have to make you play better to be worthwhile? I see a lot of these “owns” aimed at mythical people that think they’ll become pros with good gear, but I care if the game looks and feels better to play, which it does at higher frame rates.
tukatu0@reddit
You are wrong. https://testufo.com will teach you what frame rate is. It should be immediately apparent.
bctoy@reddit
I haven't played at that low of a framerate for quite some time.
But a similar reduction in frametime on the display I'm using, 57Hz vs 72Hz (3Hz below 60Hz/75Hz for Gsync), is a huge difference.
apoketo@reddit
The diminishing returns for feel/latency and motion clarity are different though.
240hz+fps is still 4.16ms of blur. Or like looking at an image with a ~4px blur applied when it's moving @ 1000px/sec, which isn't very fast compared to FPS mouse sweeps. Meanwhile Index and Quest have 0.3ms strobe lengths, the motion clarity equivalent of 3333hz.
We're likely past the diminishing returns for feel (Hz wise) but for motion it's likely ~1500hz.
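The blur math in numbers (a sketch mirroring the figures above; the 1000 px/s tracking speed is just an example):

```python
# Sample-and-hold motion blur is roughly persistence (s) x tracked speed (px/s).

speed_px_per_s = 1000

displays = {
    "240 Hz sample-and-hold": 1 / 240,       # ~4.17 ms persistence
    "1000 Hz sample-and-hold": 1 / 1000,     # ~1 ms
    "VR strobed, 0.3 ms pulse": 0.0003,      # Quest/Index-style low persistence
}

for name, persistence_s in displays.items():
    blur_px = persistence_s * speed_px_per_s
    print(f"{name:25}: ~{blur_px:.1f} px of blur at {speed_px_per_s} px/s "
          f"(clarity of a {1 / persistence_s:,.0f} Hz sample-and-hold panel)")
```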
Buggyworm@reddit
So you just need to shave 4.16 ms from a pipeline, just like when you go from 30 to 35 fps. It's harder to do when your base frametime is already low, sure, but technically speaking it takes the same amount of resource saving
DeepJudgment@reddit
I recently got a 55" 4K TV and I can definitely see why 8K is already a thing. It's all about pixel density. If I were to get, say, a 75" TV, I would want it to be more than 4K.
bctoy@reddit
Same here, an S90C 55'' being used as a gaming monitor, and a jump to 8K would be a huge upgrade.
The new DLSS transformer model however is helping out a lot; otherwise Cyberpunk looked like a blurry mess even with DLSS Quality at 4K.
mr_doms_porn@reddit
Here's a couple:
PC VR (each eye has its own screen, so double the bandwidth; refresh rate really matters, with 90hz being the consensus minimum)
Large screens (4K looks amazing on monitors and smaller TVs, but the bigger the screen the less good it looks. TVs continue to get cheaper and screen sizes keep going up. It's likely that eventually smaller screens will stop increasing in resolution and it will become different resolutions for different sizes)
HDR (HDR signals increase the bandwidth requirements of a video signal considerably, it is often the first thing to stop working when you use a cable that is just a little too long. As HDR becomes more common and newer formats come out, this will only become more important)
Audio (Audio is often overlooked because it tends to be much less bandwidth than video but newer formats take up more. Dolby Atmos and DTS:X include high quality audio alongside a lot of data in order to track objects)
wichwigga@reddit
I just want to be able to not use DSC for any resolution refresh rate combo because there are so many small inconveniences that come with it. Obviously I'm not talking about image quality but things like alt tabbing in games, Nvidia being an asshole and not supporting integer scaling when DSC is enabled...
SANICTHEGOTTAGOFAST@reddit
Blame Nvidia, not DSC. AMD cards (and presumably Blackwell) have no such limitations.
wichwigga@reddit
I do blame Nvidia but that doesn't really solve the problem does it
SANICTHEGOTTAGOFAST@reddit
So the industry should stop using DSC because your GPU vendor half-assed support?
METROID4@reddit
It is a bit of a problem, yes, when the GPU manufacturer that's at like 92% market share is the one that has this limitation on almost all the cards they've sold so far.
DSC is neat and all, but I don't think it's fair to rely on it too much, just like how you shouldn't rely on frame gen to calculate how many FPS a card can push, or soon we'll have game devs targeting 60 FPS requirements assuming you hit 15 FPS and use 4x MFG.
Sure, a lot of people keep parroting that basically no one can ever tell the difference between DSC and native, yet there are always plenty of people saying they can. Are they all lying? Reminds me of how often people also said you absolutely cannot see above 24 or 30 or 60 FPS as a human, so there's no need for any monitor to exist higher than 30 or 60Hz, right?
pianobench007@reddit
You and I will be receiving unnecessary downvotes. But generally speaking we are correct. The end user will NEVER* need a 16K display or even an 8K display. Simply because as they increase resolution and screen size, the font shrinks tremendously and a lot of media does not adjust quickly to the 8K or even 16K resolution increase.
Even when you apply the Windows zoom at the recommended setting. For example, on a 17 inch display with 4K resolution, Windows will recommend a 250% zoom so you can read the text on NYTIMES dot COM. Or something like that.
But that is just for text. When it comes to media, the zoom in sometimes does not scale well. So you could end up with an 8K or 16K resolution but your media has been designed for 1080P.
It will be like how I try to go back and play a game made during the 640x480 resolution days on my 1440P display. It just looks terrible. Even worse is the 320x200 resolution standard of the before time. The DOS games that I still revisit on my 1440P @ 144hz display are just terrible because of this.
I have to lower my own display resolution so that I can actually play these old games. I believe the same is true for 16K and 8K playing 1080P games.
TheWobling@reddit
Refresh rates above 120 are still very much noticeable and are good for fast-paced games.
Time-Maintenance2165@reddit
They are.
As an interesting tangent, people can tell the difference between 200 fps and 400 fps on a 120 Hz display.
RefuseAbject187@reddit
Likely dumb question but how does one see a higher fps on a lower Hz display? Aliasing?
vandreulv@reddit
Frame latency. You're less likely to have the game skip a frame and delay updating to your movements. 400fps, even when displayed at 60fps, means the most up-to-date information will always be in the next frame you see.
Time-Maintenance2165@reddit
It has nothing to do with aliasing. It's all about a reduction of input lag. If you're running a 120 Hz display at 120 fps, then the information on screen can be up to 8 ms old. At 200 fps, that drops to 5 ms. At 400 fps, it drops to 2.5 ms.
It's also due to the fact that fps isn't an exact constant. Even if you have 200 fps average, your 1% lows can be below 120 fps. That means that twice per second, you'll have information that's displayed for 2 frames (16 ms). That's just like dropping to 60 fps.
And subjectively, it is instantly noticeable. 400 fps does just feel noticeably smoother. It's not something you need unless you're competitive; it just feels so good.
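The arithmetic behind those numbers (a rough sketch; it ignores game and driver pipeline latency):

```python
# Worst-case staleness of the frame a 120 Hz display grabs at each refresh,
# plus the hold time when a dip below the refresh rate misses a refresh.

display_hz = 120
refresh_ms = 1000 / display_hz

for fps in (120, 200, 400):
    frame_interval_ms = 1000 / fps
    print(f"{fps:3d} fps: new frame every {frame_interval_ms:4.1f} ms, "
          f"so the displayed frame is at most ~{frame_interval_ms:4.1f} ms stale")

# A frame that misses its refresh sits on screen for two refresh periods:
print(f"missed refresh: frame held for {2 * refresh_ms:.1f} ms (feels like 60 fps)")
```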
arandomguy111@reddit
Telling the difference is not exactly the same as seeing a difference. Higher FPS for most games, especially for most competitive esports-type titles, will result in lower latency.
Also they would likely not be playing with vsync. So you technically can show multiple different frames on screen during a refresh cycle (this is also what causes tearing).
bphase@reddit
Input latency, probably the amount of tearing or type of it as well
NormalKey8897@reddit
we've been playing quake3 on 70hz monitors with 70 and 150fps and it's VERY noticeable, I'm not sure how that is surprising :D
taicy5623@reddit
I've got a 240hz monitor and the jump from 120 to 240 is more useful for all but completely eliminating the need for VRR on my OLED than any actual improvement in perceived latency.
vini_2003@reddit
But at that point you won't be doing 4K, so... still, nice to have more bandwidth I guess.
Time-Maintenance2165@reddit
For competitive games, yes you can be. CS2 and dota can very easily run above 120 fps.
vini_2003@reddit
Please tell me where, in my comment, I said "we don't need more than 120 FPS".
Please, do point it out. I even said: "nice to have more bandwidth", which allows lower resolutions to have more FPS.
Time-Maintenance2165@reddit
You said they won't be doing 4k above 120 fps. I'm saying that's wrong. With those competitive games (and I'm sure others but I didn't explicitly check), fps of 200-300 is very doable at 4k. And I'm not talking about 4k upscaling. I'm talking native 4k.
vini_2003@reddit
That's fair, I'll take this L. You are correct.
VastTension6022@reddit
You forget we'll eventually have 1000Hz displays with 16x framegen
hugeyakmen@reddit
Perhaps not doing 4k... yet
MonoShadow@reddit
GPUs will get faster. Plus not all games target 30FPS on consoles. And if you can reach 120FPS natively, might as well use Frame Gen to max out the screen. After all this is the proper use for this tech.
shoneysbreakfast@reddit
You should try a 5K display some time, there is a noticeable improvement in clarity over a 4K display of the same size. With 2.2 they will be able to do 5K 120hz or 240hz with DSC which will be very very nice even for non-gaming reasons. There are diminishing returns once you’re up into the 8K+ range but more bandwidth will always be a good thing.
The_Umlaut_Equation@reddit
Bandwidth required is ultimately a function of number of pixels, number of colours, and number of frames.
Many use cases start to hit diminishing returns at the 4K/5K mark. Very few people in these circles seem to accept this though, and act like the limits of human physiology are 20+ years of advances away.
Frame rate is subjective to a degree, but again the diminishing returns are obvious.
Colour depth and HDR take a bit more data but not massive amounts. Even if you go to 12 bits per channel colour depth, 16x more shades per channel than the standard 8 bits, that's only 50% more bits and you basically max out human vision.
8K resolution, 12 bits per channel, 240Hz is 334Gbps of bandwidth uncompressed, and I'd argue that's well past the point of massive diminishing returns for 99.99999% of the population. 5K at 10 bits per channel depth, 144Hz is 76Gbps uncompressed.
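For reference, the raw pixel-rate arithmetic (a sketch; it ignores blanking intervals and link-encoding overhead, which is why the quoted figures above run roughly 15-20% higher):

```python
# Uncompressed video data rate: width x height x bits-per-pixel x refresh rate.

def raw_gbps(width, height, bits_per_channel, hz):
    return width * height * bits_per_channel * 3 * hz / 1e9

configs = [
    ("4K 10-bit 240 Hz", 3840, 2160, 10, 240),
    ("5K 10-bit 144 Hz", 5120, 2880, 10, 144),
    ("8K 12-bit 240 Hz", 7680, 4320, 12, 240),
]

for name, w, h, bpc, hz in configs:
    print(f"{name}: ~{raw_gbps(w, h, bpc, hz):.0f} Gbps raw pixel data")
```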
reallynotnick@reddit
Yeah, I want 7680x3200 at 120hz, 38-42” (basically an ultra wide 6K monitor). That for me is effectively my end game from a productivity standpoint.
innovator12@reddit
Would love to if there were affordable options.
Doesn't have a lot to do with HDMI though because DisplayPort already serves the need in the PC market.
shoneysbreakfast@reddit
HDMI 2.2 will have more bandwidth than the latest DP 2.1a (96Gbps vs 80Gbps) and GPUs and monitors have been using both forever so a new HDMI spec is pretty relevant to the PC market.
Not_Yet_Italian_1990@reddit
Forever? I don't think Ada supports anything higher than DP 1.4 and those GPUs were launched in 2022/2023...
shoneysbreakfast@reddit
They have been using both HDMI and DP forever, therefore a new HDMI spec is relevant to PC.
Not_Yet_Italian_1990@reddit
Oh. Gotcha.
Truth be told, I'd actually kill for a TV that had at least one display port but I don't know that I've ever seen one that does. It sorta seems like HDMI survives by pure backwards compatibility inertia. But I'd gladly give up one of my, like... 4 HDMI ports for a display port input.
CANT_BEAT_PINWHEEL@reddit
VR has always been bandwidth starved. Admittedly it’s a weird industry that’s so niche that even nvidia shipped their high end card this generation unable to run vr and neither gpu maker ever highlights the benefits of upgrading for vr. To them it’s treated like a dying legacy system or an extremely niche use case. What makes it weird is that two of the richest tech companies in the world are single handedly trying to keep vr relevant but are failing to convince other tech companies to even give a minimum standard of support (working drivers at launch).
arislaan@reddit
FWIW my 5090 hasn't had any issues running VR games. Had it since March.
CANT_BEAT_PINWHEEL@reddit
Yeah, credit to nvidia for fixing it very quickly. They also fixed last year's issue of not being able to change refresh rates without restarting SteamVR in less than a month, iirc.
My 9070 xt on the other hand still requires me to restart steamvr. I haven’t tested motion smoothing to check recently but I don’t know if they’ve fixed the massive memory leak with motion smoothing either.
Impressive_Good_8247@reddit
Motion clarity is trash on low refresh rate monitors.
SolaceInScrutiny@reddit
You're a boomer
Madeiran@reddit
I’m a casual gamer and I can tell the difference between 4K120 and 4K240
SyrupyMolassesMMM@reddit
16k - that's it. That's basically the max anyone can ever realistically need based on the size of TVs that are possible against the wall sizes we build, and the distance we would comfortably sit from them.
Even 8k is pushing the limit of human eye resolution. We’re literally approaching ‘end game’ for certain technologies here.
It'll still develop on for specialised commercial use, and then other stuff will get routed through the same cable.
But ultimately, there will never be a home market use for a video cable that passes > 16k resolution.
I guess there's no real upper bound to fps though; so there's always that.
Strazdas1@reddit
human eye does not have a fixed resolution. There is no hard limit to what it can see, and it will vary greatly with many factors depending on the eye itself and the environment. Anyway, 16k is nowhere near enough to be real life replication and thus isn't enough for real VR.
SyrupyMolassesMMM@reddit
You have no idea what you're talking about :) everything in biology has limits due to cell sizes, molecule sizes etc. shit, even physics has limits, i.e. the Planck length etc.
Google it. You'll find that for home viewing at normal viewing distance on a normal 'big' tv that will fit on a wall, 16k is equivalent to human eye perception.
Strazdas1@reddit
well true, technically the limit would be the size of a single cone cell in your eye. But at the rate resolution in displays is advancing, we will reach that oh in about a million years.
SyrupyMolassesMMM@reddit
At normal viewing distance, with a normal sized tv, it's about 16k. Google it. You're completely wrong I'm afraid :)
Vr is a whole different thing as that's jammed right up against your eye.
Strazdas1@reddit
I wouldn't trust some random googled website for this answer as far as I could throw it. And I can't throw very far.
SyrupyMolassesMMM@reddit
Check chat gpt? Ask an expert? Lol
Strazdas1@reddit
asking GPT would be an even worse option. Want an expert opinion? Look at what blurbusters put out.
emeraldamomo@reddit
You would be amazed how many people still game at 1080p.
SyrupyMolassesMMM@reddit
Yeh man, a couple years ago the 'mode' graphics card on steam was still a 1060…I think it's up to a 3060ti or something now, but it's pretty wild.
Shit, I was on a 1060 until 2 years ago…
chronoreverse@reddit
Even for FPS, 1000Hz is supposedly where even sample and hold motion blur is negligible. So you could say that's the max anyone could realistically need based on what eyes can actually see.
The_Umlaut_Equation@reddit
I wonder at what point proponents will accept you can't see atomic size pixels. 4K pixels under multiple use cases are below the angular resolution of someone with 20:20 vision.
Strazdas1@reddit
I can see atomic size pixels all around me in real life.
Roph@reddit
If I drew a 1px thick line on a 4K TV, it'd be so thin that you'd be unable to see it? When did you last get checked?
The_Umlaut_Equation@reddit
Not everyone sits 1 inch away from a 200" TV, despite what this place believes.
When did you last get checked?
mr_doms_porn@reddit
The size of pixels depends on the screen size as much as the resolution, 8K and 16K will make a bigger difference for larger screens.
Morningst4r@reddit
You can definitely see more detail in high PPI displays, it’s just more subtle to the point lower PPI tends to look “good enough”.
Consistent_Cat3451@reddit
I wonder if new GPUs and new consoles will come with this
cocktails4@reddit
Eventually.
I'm still pissed that my 4070 Ti only has DisplayPort 1.4. I have to use the single HDMI 2.1 port to get 4k/60.
Not_Yet_Italian_1990@reddit
Doesn't HDMI 2.1 allow for up to 4k/120 and 4k/144 with DSC, though?
In any event, I completely agree that the Ada should've had DP 2.0.
cocktails4@reddit
Yeh, I use HDMI 2.1 for my main 4k/120 display and DP for my two other 4K/60 displays. But HDMI seems to have a really long handshaking delay when the monitors wake up. The two DP-connected displays are on and displaying in a couple seconds, the HDMI-connected display takes significantly longer.
Strazdas1@reddit
Yep. I have displays with both DP and HDMI connected and HDMI takes a lot longer to go from wake signal to displaying.
Not_Yet_Italian_1990@reddit
Gotcha. Yeah, that has always seemed to be the case with me, too.
DisplayPort just seems to have better quality of life features, for whatever reason. It's mostly little things, like registering an image on my screen more quickly when I boot my computer so I can go into the BIOS. I had to slow down the BIOS select screen because HDMI took so long to wake the monitor up. That just seems to work better on DP.
Hopefully the new HDMI standard changes this, but it might just be intrinsic in the technology.
PXLShoot3r@reddit
Dafuq are you talking about? 4K/60 is no problem with 1.4.
cocktails4@reddit
Sorry, meant 4k/120 and 10-bit.
Primus_is_OK_I_guess@reddit
Also shouldn't be a problem. Are you disabling DSC?
cocktails4@reddit
I don't trust that DSC is actually visually lossless for the editing work that I do, so yes.
MDCCCLV@reddit
There's an absolute difference between watching netflix and editing stuff; visual fidelity matters there.
Strazdas1@reddit
Yeah. Netflix is already compressed so it won't matter. Editing or real-time rendering visuals will be impacted. DSC is not lossless compression; if you are compressing that way you are losing data.
raydialseeker@reddit
Well it is.
crocron@reddit
Stop with the bullshit. There is a noticeable difference between DSC and non-DSC. "Visually lossless" is a marketing term and nothing else. From my previous comment containing the relevant parts:
bctoy@reddit
The study that came up with this doesn't inspire confidence either that it'll be 'visually lossless'.
https://en.wikipedia.org/wiki/Display_Stream_Compression
And then, I looked up the 75% number above and here's another paper giving the details that even that wasn't enough for many individuals in the study.
-Perspectives on the definition of visually lossless quality for mobile and large format displays
Blacky-Noir@reddit
Gosh, THANK YOU for that.
I always was dubious of "visually lossless", especially when in the past "effectively lossless" was 100% wrong 100% of the time. But e-v-e-r-y single reviewer and outlet I've seen, even the usually serious ones, has said it's true and there was no difference.
After years of that, I was almost getting convinced.
Thanks for setting the record straight.
raydialseeker@reddit
https://developer.nvidia.com/blog/dsc-higher-fidelity-vr/
This is my reference point.
crocron@reddit
The article does not define what "visually lossless" means. This is the given definition in ISO/IEC 29170-2:2015 - "when all the observers fail to correctly identify the reference image more than 75% of the trials".
The main issues of the definition are that
It's not lossless at all, and they had to change the definition of lossless for it to sound more marketable.
75% as a lower bound is way too high.
I agree that DSC and non-DSC are difficult to differentiate on still images, but with non-static elements (like moving your mouse, playing games, or moving a 3D model in SolidWorks), they are easily discernible.
JohnDoe_CA@reddit
What kind of artifacts do you see on moving images?
DSC doesn’t have any inter-frame compression tools (unlike video compression standards like h264), so it should be less sensitive to moving objects.
crocron@reddit
Ironically, I think that's the reason. If they are compressing for static images, then artifacts arise when moving. Furthermore, video compression trades off more static-frame artifacts for inter-frame "smoothness".
I'm planning to test this out. I currently have a lossless 3 second 1080p video of a 3D model rotating (literally just a sphere with a face wrapped around it). I'll be transforming it into 2 different videos with ffmpeg.
1. Convert the video losslessly into its frames, convert each frame to a JPEG with default quality, then merge the lossily compressed frames losslessly back into a video.
2. Lossily compress the video with AV1 encoding at default quality.
Feel free to reply back for the result.
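One way to script that comparison (a sketch only; file names, frame rate, and quality settings are placeholder choices, and per-frame JPEG is just a stand-in for intra-only lossy compression, not actual DSC):

```python
import os
import subprocess

SRC = "sphere_1080p_lossless.mkv"   # hypothetical lossless source clip

def run(cmd):
    subprocess.run(cmd, check=True)

os.makedirs("frames", exist_ok=True)
os.makedirs("jpeg", exist_ok=True)

# 1. Dump the source to lossless PNG frames.
run(["ffmpeg", "-i", SRC, "frames/%04d.png"])

# 2. Per-frame lossy pass: PNG -> JPEG at a default-ish quality.
run(["ffmpeg", "-i", "frames/%04d.png", "-q:v", "5", "jpeg/%04d.jpg"])

# 3. Rebuild the JPEG frames into a losslessly wrapped video for viewing.
run(["ffmpeg", "-framerate", "60", "-i", "jpeg/%04d.jpg",
     "-c:v", "libx264", "-qp", "0", "intra_only_lossy.mkv"])

# 4. For comparison, compress the original clip with AV1 (inter-frame codec).
run(["ffmpeg", "-i", SRC, "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",
     "av1_version.mkv"])
```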
JohnDoe_CA@reddit
FWIW, If you’re using JPEG as a proxy for DSC: it’s nothing like that. JPEG still compresses by converting to DCT and quantizing residuals. DSC is more like PNG. It predicts based on a weighted set of neighboring pixels and then quantizes residuals, controlled by the fill level of a virtual FIFO.
crocron@reddit
Thanks for the info. Is there any lossy image compression that's close enough to DSC (as PNG is lossless)? Would JXL with distance 1 be a sufficient stand-in for DSC (my updated method)?
If that's the case, I'll probably, for step 1, decode the video to PNG, convert it to JXL with "-d 1", and then convert it back to PNG and video.
JohnDoe_CA@reddit
I’m not aware of any. DSC was designed for minimal silicon area impact. JXL is better than JPEG and still uses frequency domain conversion.
raydialseeker@reddit
What issues do you see with a cursor, for example?
crocron@reddit
If the cursor moves fast between high-contrast, highly detailed elements, both the cursor and the elements get blurry in between. Fortunately, this rarely happens outside of complex 3D models and extra-detailed parts of something like hair or denim patterns in drawings.
raydialseeker@reddit
Thanks! I'll try to look out for it. Most of my work is on websites or spreadsheets and I don't notice it while gaming. From all the A/B testing I've done (10-bit DSC vs 8-bit without, @240hz 1440p) I've struggled to tell the difference. Do you know a particular site or interaction I can use to test it? I used the blurbusters UFO test.
crocron@reddit
If you have a 3D model viewer (SolidWorks, FreeCAD, MasterCAM, etc.), get a complex model or a model with a lot of overlaps (like a mesh filter, an extra-fine sieve, or some metamaterial), enable wire-frame edges when viewing, and move your cursor in between the edges (rotating the model works, too). You'll notice some blurry artifacts where the edges of the cursor and the model move in and out. This is the worst-case scenario for cursor-related artifacts.
A less noticeable but similar case is in highly detailed art. Use this artist's work (https://redd.it/1f7a0k6) or any of Junji Ito's detailed work. At a certain zoom level (assuming the image is of sufficient resolution), moving the cursor results in fringing at the edges. For Junji Ito's work, any criss-cross used for shading is sufficient, and for https://redd.it/1f7a0k6, the sword engraving is slightly noticeable. It's not as bad as the 3D model case, but when you're drawing something, it gets really distracting.
I don't know how it would be for 10-bit DSC vs 8-bit no-DSC, but it's noticeable on 10-bit DSC vs 10-bit no-DSC. As previously mentioned, I've been a hobby digital artist for almost 2 decades, and am more likely to be sensitive to these artifacts.
raydialseeker@reddit
That's fascinating. I'll turn the nanite view on and give it a shot. Thanks for all the info and apologies for being confidently incorrect earlier.
crocron@reddit
No problem. I'd like to apologize as well; re-reading my own response, I was way too aggressive (the original comment was aimed at a different user repeating the same thing ad nauseam in another post).
reddit_equals_censor@reddit
nvidia marketing bullshit :D holy smokes.
they are still claiming that the 12-pin nvidia fire hazard is "user error" :D
and "8 GB vram is perfectly fine" and "fake interpolation frame gen is as good as real fps" :D
i could go on...
there is also a bigger issue. vr lenses are very rarely clear enough for the lens not to be the main issue before dsc artifacts become easily noticeable.
something that does NOT apply to desktop displays, of course.
vr should also move so fast resolution- and refresh-wise that dsc, used for a while until we fix it, can be much more easily accepted than on desktops.
pushing 2x the resolution of 4k uhd equivalent per eye (so 4x 4k uhd overall) at 240 hz for example is extremely hard, and that is just inching toward being good enough to do actual work in vr.
joha4270@reddit
It can absolutely be noticed in some cases. I'm running a monitor at 30Hz because DSC was driving me nuts (scrolling colored text on a black background).
IguassuIronman@reddit
Why would anyone want to drop big money on a GPU and monitor only to compress the video signal? Especially when it's only needed because one of the vendors cheaped out
Primus_is_OK_I_guess@reddit
Because very few monitors support DP 2.1, given that it's a relatively new standard, and you could not tell the difference between output with DSC and without in a side by side comparison.
panchovix@reddit
DSC is quite noticeable on my Samsung G8, especially on "line" definitions on some webpages.
In motion it's not noticeable, yes.
JunosArmpits@reddit
What is different exactly? Could you take a picture of it?
conquer69@reddit
The issue isn't with the compression but the loss of features. Like losing DLDSR and alt tabbing being delayed.
Morningst4r@reddit
Only some monitors lose DLDSR due to DSC. My monitor supports both
reddit_equals_censor@reddit
dsc should not be considered at all here. dsc is a lossy compression. it shouldn't exist in the chain, except as a temporary workaround, or for some advertisement installations or whatever, where quality doesn't matter.
so YES 4k 120 hz is a problem at proper bit rates.
Primus_is_OK_I_guess@reddit
The vast majority of people cannot tell the difference, even side by side. You're being ridiculous.
Nicholas-Steel@reddit
Unless you have hardware that disables features if DSC is in use (iirc some monitors disable VRR when using DSC).
tukatu0@reddit
Unless you pull out evidence of actual testing, we have no reason to believe you. The makers themselves don't call it lossless, so why should you?
reddit_equals_censor@reddit
in the responses to the comment above there are several people that point out that they can clearly see the visual difference between dsc off and dsc on.
so why are you going after people who want working hardware, instead of the display makers and graphics card makers saving some pennies by not putting high enough bandwidth connections on their products?
and why are you defending marketing bullshit as well?
it is clearly NOT visually lossless; again, the people responding to the comment above are a great basic example.
you are being ridiculous, trying to defend giant companies that are trying to create acceptance for degraded visuals.
cocktails4@reddit
Shit, maybe it was some other reason. I know there was something that 1.4 couldn't do. VRR?
Unkechaug@reddit
The fact this post spawned a set of questions and clarifications is indicative of the problem with these standards specs. HDMI and USB are complete shit shows. At least with prior standards and different cables, the shape could help you better understand capabilities. Now we get to have an ever changing set of names (including renames) and everything-is-an-exception that contradicts the term “standards”.
your_mind_aches@reddit
I agree.
However, I'd rather have this problem than the alternative where I have a ton of cables that work with nothing and have to get new cables when I upgrade to something else. It's a waste.
I still use the USB-C cables that came with old devices I don't use anymore. Being able to reuse USB-C cables even if they're just 60W or 480Mbps is way better than a bunch of old junk rotting in a cupboard like all my pre-standardized cables.
cocktails4@reddit
Absolutely agree. Even basic functionality is riddled with caveats that you have to just figure out when things don't work the way you expect them to. My home theater has been a pain in my ass in that regard.
Simon_787@reddit
I'm running 4K/240 with DSC.
The black screen when switching resolutions sucks, but that's about it.
Keulapaska@reddit
You mean to get 4k120/144 10-bit? Cause DisplayPort 1.4 can do 4k97 10-bit or 120 8-bit without DSC; with DSC, 4k240 is a thing, idk what the limit is.
cocktails4@reddit
I meant 4k/120/10bit.
uneducatedramen@reddit
My English might not be good cuz I read it like dp1.4 doesn't support 4k/60? I mean on paper it does 4k/120
JumpCritical9460@reddit
HDMI 2.2 ports and cables that adhere to only some of the standard incoming.
scrndude@reddit
https://tftcentral.co.uk/articles/when-hdmi-2-1-isnt-hdmi-2-1
Still drives me nuts that cables can get 2.1 certification while supporting none of the 2.1 features
_Ganon@reddit
Yeah this was an awful decision for average consumers. It makes it complicated as fuck for consumers to know they're actually buying what they want or need. It makes me irrationally angry. Corps probably wanted to be able to claim compliance with latest standards without supporting anything added in the latest standards. Someone(s) on the HDMI board is to blame: https://hdmiforum.org/about/hdmi-forum-board-directors/
USB faces similar issues, and MicroSD needs to make a simpler system than continually adding more letters and numbers.
Very frustrating. It could be easy but that's bad for business.
dafdiego777@reddit
as dumb as wifi 5/6/6e/7 is, it really feels like they are the only ones who have figured this out
chefchef97@reddit
And they were the ones that started with the worst naming scheme of the lot
USB and HDMI have continually missed open goals for the last decade
Vengeful111@reddit
You mean you don't like USB3.2 Gen 2x2? And that USB4 doesn't actually mean anything since everything useful about it is optional?
Strazdas1@reddit
I prefer USB3.2 Gen 2x2 over a USB4 port that's actually performing like USB2.
Lyuseefur@reddit
Vint Cerf is one of the reasons. Standards are a good thing.
When WiMax was starting up, the specifications group REFUSED WiMax because it would cause issues.
Today it’s evolved into one standards body for mobile:
https://www.3gpp.org/about-us/introducing-3gpp
And for WiFi
https://www.ieee802.org/11/
RampantAI@reddit
What, you don’t like the microSDXC UHS-III C10 U3 (KHTML, like Gecko) V90 A2 SD card designation?
deep_chungus@reddit
i get why they did it too but it's just dumb, as if people are reading the fucking box if they don't give a shit about cable quality anyway
i only recently learned about cable feature levels, and like, i'm going to wander down to the local electronics shop and buy the most expensive cable, when if i wanted peace of mind i'd actually have to look up (i can't believe this is a thing they have forced me to do) fucking cable reviews
TrptJim@reddit
Can this even be considered a standard anymore? What standard is being upheld, if so many things are optional?
Where's the ground truth here? What do HDMI versions even mean anymore?
HDMI and USB standards need a complete reboot at this point because they are completely useless to the average customer.
GhostMotley@reddit
The entire point of a new standard should be to enforce a minimum set of features and/or requirements.
We see this shit a lot with HDMI and USB.
Lyuseefur@reddit
But hey, at least you’re not buying yet another cable to throw away.
Oh wait.
firagabird@reddit
The backwards compatibility with 2.0 is nice and all, but a certified 2.1 cable should support all 2.1 features, and same with 2.2. That might be why the new version isn't called HDMI 3.0.
Lyuseefur@reddit
And this isn’t even the worst part.
Warcraft_Fan@reddit
So if I bought a 2.1 cable and it doesn't work on current 2.1 display because of weird rules, I'm shit out of luck?
Or the other way, a 2.1 compliant display doesn't have any of the actual 2.1 features?
sticknotstick@reddit
The first one. I can’t say definitively but every HDMI 2.1 port I’ve seen offers full HDMI 2.1 bandwidth, with eARC being the “optional” feature on those. HDMI 2.1 cables on the other hand are a shitshow in terms of bandwidth
bubbleawsome@reddit
A few early HDMI 2.1 ports had 40Gbps bandwidth instead of 48Gbps. I think the only limitation of those ports is that they top out at 10-bit color at 4k120 4:4:4; 12-bit color and 144hz wouldn't work with them.
sticknotstick@reddit
Good shout, thanks for the info
Baalii@reddit
HDMI 2.2 16k!!!!! 480Hz!!!!!! (UHBR0.5)
suni08@reddit
NUHBR (not ultra-high bitrate)
BlueGoliath@reddit
Why are all the cable governing bodies letting this happen?
gweilojoe@reddit
For HDMI, it’s because they get a small cut of every certified cable produced (either through member fees or certification fees) and they want to continue with more money coming in year after year. If they set the implementation standard based on every feature in the latest spec, cables would be more expensive and they would sell fewer cables. The spec implementation scheme is built in a way to maximize money-making, but under the guise of making things more friendly for people’s wallets. Also, the fact that no manufacturer is allowed to actually state “HDMI 2.0” or “HDMI 2.1” on the packaging or marketing materials (and get certified) also boggles my mind.
Blacky-Noir@reddit
In the short term. Long term, this does more damage than the money they're making now is worth.
gweilojoe@reddit
Yeah, standards groups that have established a standard are not great long-term thinkers. USB-IF is the best example with their 5Gbps (looks at a cable) USB 3.0 cables and (looks at another cable) USB 3.1 Gen 1 cables and (looks at another another cable) USB 3.2 Gen 1x1 cables.
Blacky-Noir@reddit
Definitely. I've heard from people at a couple of start-ups (or pre-start-ups) working on novel devices that their main question and friction point, both in the tech and in getting investors, is making sure their customers have a good enough USB port to run the thing.
And no amount of clear labeling or customer-education campaigning can work here, because of how messy USB has become.
Morningst4r@reddit
Not everything needs all the features and it all adds cost. A fully featured USB cable costs a lot to make and validate.
pholan@reddit
True, a cable that’s good enough to charge your phone and run CarPlay just requires a fairly loosely specced data pair, the CC line if it's USB-C on both ends, and a pair of power lines sufficient to handle 3A. For a full-featured cable you add an E-Marker, 4 high-speed pairs, and the power lines need to support 5A.
Redstarmn@reddit
and 180hz of that will be AI generated frames.
Pandaisblue@reddit
Bonus points if it's unclear and unindicated which cable/port does what
Tobi97l@reddit
I wonder if my fiber hdmi cable will support it even if it's not certified.
AndersaurusR3X@reddit
There must be some way to do this and keep a reasonable cable length. These new cables are way too short.
absolutetkd@reddit
So since HDMI 2.1 is now very old and obsolete and icky, does this mean we can finally have support for it in Linux on AMD?
mr_doms_porn@reddit
And I just realised why I have to use display port adapters to get any of my setups to work right...
taicy5623@reddit
I switched to Nvidia for this, and now I'm regretting it because it's still less of a pain to run my display at 1440p with VRR off, since OLED + VRR = massive flickering.
Preisschild@reddit
Why not just use DisplayPort?
Kryohi@reddit
Or a DP to HDMI adapter, as suggested here: https://gitlab.freedesktop.org/drm/amd/-/issues/1417
taicy5623@reddit
Already tried that, even booted into windows to flash some off brand firmware and it still didn't expose VRR.
taicy5623@reddit
The most price-effective OLED TVs don't tend to have DP.
BlueGoliath@reddit
Year of AMD on Linux?!?!?!?
spazturtle@reddit
AMD will probably just do what Intel did and drop native HDMI output and just use a DisplayPort to HDMI adapter.
kasakka1@reddit
"How about some perpetual payola first?" -HDMI Forum
ChipChester@reddit
Still no closed-caption data channel?
QuazyPat@reddit
That you, Alec?
ChipChester@reddit
Who is Alec?
QuazyPat@reddit
Alec from Technology Connections on YouTube. He released a video a few weeks ago lamenting the absence of proper closed captioning support in modern DVD/Blu-ray players when you use HDMI. Figured you might have watched it. But if not, now you can!
https://youtu.be/OSCOQ6vnLwU?si=pcfkQno1Lp2VMV_m
ChipChester@reddit
Nope, not me. Caption company owner since a decade before HDMI was introduced.
ZeroWashu@reddit
The fun part was that I had never noticed the issue, as I had a Sony Blu-ray player that he identified late in the video as providing support; there are quite a few Sony models that do. For half the video I was going, "I don't have that problem - are you sure?" Then boom, Sony.
It is one fascinating aspect of his channel: while many manufacturers build similar products, there is more than meets the eye in regards to which of the possible features are actually available.
Ohyton@reddit
I love this channel though I have no real interest in most of the topics. But it's presented in such a way that highlights the interesting bits of design of things you never thought about.
Dishwasher stuff for instance.
your_mind_aches@reddit
I was ready to break out the PS5 and try a bunch of old DVDs to check it, then he said it works fine on the PS5 lol
pandaSmore@reddit
Ahhh a fellow Technology Connections unenjoyer.
schnaab@reddit
You perhaps? Answer the question! Are you Alec?!
BlueGoliath@reddit
Is Alec in the room with us right now?
Yearlaren@reddit
Kinda puzzling that they're adding support for 16k and yet they're not calling it HDMI 3.
ExcitementUseful311@reddit
Considering that large 4K monitors with anything over 120 hertz are still few and far between, I don't see the great advantage to HDMI 2.2. We're not even to the point of having widely available 4K with zero ping. I guess it is great that 8k or maybe even 16k content will come, but I'm uncertain graphics cards for now will be able to come anywhere close to even running anything in that range. TV streaming is still not all 4K UHD so I'm having a hard time seeing the point. Who knows, maybe someone will start producing fantastic 8k or 16k content, but I don't expect it for at least another 5 years or longer. I'd happily buy a 16k TV if they were as competitively priced as today's high end 4K TVs.
Nicholas-Steel@reddit
There are, afaik, lots of 27" and larger 4K monitors with 240+ Hz.
Tycho66@reddit
Yay, I can pay $60 for a 3 ft cable again.
dom6770@reddit
I just paid 80 € for a 10m HDMI 2.1 cable because I can't otherwise play in 4k with 120 Hz, HDR and VRR.
but welp, it works now. Hopefully it will continue to work for years.
REV2939@reddit
Only to find out it doesn't meet the spec and uses copper-clad aluminum wires.
SunnyCloudyRainy@reddit
Could be worse. There are copper-clad steel HDMI 1.4 cables out there.
GamezombieCZ@reddit
H-D-M-I... Oh nevermind.
Nuck_Chorris_Stache@reddit
What about Medium Definition Multimedia Auxiliary (MDMA)
nisaaru@reddit
I wonder how many products get broken HDMI 2.2 support again. That seems to be tradition by now.
broknbottle@reddit
HDMI sucks. I wish it would go away and DisplayPort on TVs were a thing.
surf_greatriver_v4@reddit
ah sweet, 30cm max cable length here we come
mulletarian@reddit
/r/tvtoohigh in shambles
SnugglyCoderGuy@reddit
Gone. Reduced to atoms.
willis936@reddit
I think the market already has the fix. These are pretty miraculous for their price.
https://youtu.be/O9QPecpLcnA
kasakka1@reddit
"What do you mean you can't connect your new TV/monitor? We included a 30cm HDMI 2.2 cable in the box? Don't you have your devices wedged right against the display?"
mycall@reddit
I have seen so many mac minis just like that
1TillMidNight@reddit
Active cables are now part of the standard.
You will likely have official cables that are 50ft+.
moschles@reddit
Do you want 16K or not?
nicuramar@reddit
Only if you want the highest resolutions.
EternalFlame117343@reddit
Yeah that's good and all, but my RTX 6000 Pro can barely run 30fps at 720p without DLSS4.
These guys are making useless technology that nobody can use.
CaptainDouchington@reddit
Finally I can hook up my PC to this jumbotron I have just chilling in my garage.
armady1@reddit
what a waste of a resolution. i prefer high ppi so will be looking forward to 16k 24” monitors
BlueGoliath@reddit
Someone is thinking of the PPIphiles, finally.
Not_Yet_Italian_1990@reddit
Any word in this announcement on when 16k phones will become available?
armady1@reddit
soon. they’ll be 16k 30hz and people will complain but you don’t need a higher refresh than that anyway it’s a waste of battery
Meedas_@reddit
So... Who will be the first one in here to buy a 16K TV? 🤔
TournamentCarrot0@reddit
Does 16k get us into (simulated) 3d territory for viewing?
exomachina@reddit
That's a different display technology called Light Field. Google and HP are currently doing it with custom 8K displays, but the entire experience relies on a suite of cameras and sensors also built into the display. It has nothing to do with resolution or cable bandwidth.
BinaryJay@reddit
Surely that will be enough for a long while, what else do they have to throw at us?