ANTGAMER teases 1000Hz monitor with 2026 release plans
Posted by Vb_33@reddit | hardware | View on Reddit | 124 comments
legice@reddit
I got a 120Hz monitor 5 years ago, despite all the yOu CaNt SeE a DiFfErEnCe talk, and yes, you clearly can. I just didn't expect it to feel so much better. Anything above that is marginal, though, but at least the tech is there.
Also, I cap my games at 90fps max; even 120 just doesn't feel worth the hardware power.
Dull-Tea8669@reddit
I think the noticeable difference ends around 165. I have a 240 and have been trying so hard, but can't find the difference when playing.
perdyqueue@reddit
It still makes me laugh. Back when I got my first "high refresh" 120Hz monitor, around 2011 I think, people were going on about how it's not visible to the human eye. Ridiculous, and people keep making the same claim about each increase in Hz, even though every increase so far has been objectively well within human limits.
rubiconlexicon@reddit
You're looking forward to ever higher refresh rates for the motion clarity and BFI/CRT shader ratio benefits, I'm looking forward to it because it's the only obvious solution to OLED VRR flicker on the horizon (side-stepping VRR altogether by making it obsolete with raw refresh rate).
DuranteA@reddit
I agree - it just doesn't seem like a sufficient number of people is bothered by VRR brightness fluctuations (on either LCD or OLED) for the manufacturers to get serious about fixing it. It's absolutely ludicrous that some of the first G-sync displays did better at this than some monitors you can buy today, so the only real solution seems to be getting rid of it with sufficiently small frametimes.
I do think VRR is basically already obsolete at 500 Hz, at least for me, since I can't see 2 ms judder. So hopefully by the time I need to replace my monitor some OLED displays with higher resolutions and 480 Hz are out.
windozeFanboi@reddit
I agree, I would also disable VRR on a 500Hz display, assuming the game ALSO runs at similarly high FPS...
I don't know if a 60FPS game would look as good on a 500Hz fixed display or 120FPS VRR display.
rubiconlexicon@reddit
I'm on 480Hz OLED and I simply can't see tearing or judder even at low variable fps, never mind at a stable fps that is an integer divisor of refresh rate (e.g. 60, 80, 96, 120). I look forward to the return of VRR one day, but its effective absence isn't the end of the world in the >240Hz era.
I am however worried about how playing GTA 6 on a console is going to go, because iirc consoles force vsync on at all times.
yourrandomnobody@reddit
VRR will never be "obsolete" with the current severe software limitation we face.
rubiconlexicon@reddit
Not sure what you're talking about. Windows refresh rate limit? I already find tearing and judder unnoticeable at 480Hz, so I'm sure W11's 1000Hz (?) limit will be plenty.
yourrandomnobody@reddit
What I meant by "software limitations" is that the majority of singleplayer titles can barely run at 200fps, let alone the 500+ FPS necessary if you want to chase a "fixed refresh rate, no tearing" scenario.
Not only that: if you don't subjectively perceive the benefits of GPU synchronization (VRR/Adaptive-Sync/FreeSync/G-Sync), that doesn't mean there are no benefits for others who are sensitive to it.
VRR may be obsolete for your use-case, but not objectively for everyone. :)
As for your 2nd part, I don't know where you got the information that W11 has an inherent refresh rate limit akin to W10/W7 (which are capped to 500Hz).
I wouldn't be surprised if this is another tactic by MS to bolster their new OS release, but for now that's unknown. 1000Hz is not anywhere near enough for the best objective eye ergonomic experience.
rubiconlexicon@reddit
You misunderstood. Refresh rate reduces the visibility of tearing and judder independent of frame rate. 37fps no-sync (no vsync, no VRR) looks drastically worse at 60Hz than it does at 480Hz.
I do perceive those benefits and am sensitive to them. But at such a high refresh rate you can't find tearing even if you go looking for it with the SK vertical bar tearing indicators. It just isn't there – the tear lines become horizontally too short to perceive due to sheer refresh rate.
BlueGoliath@reddit
Ah yes, E-Sport players are totally going to turn on frame gen.
Jags_95@reddit
Necros literally plays with framegen enabled and gets between 500 and 700fps in Marvel Rivals. Once you're past 300fps before framegen, you can't tell the added input latency. This whole topic is so overblown because the worst-case examples have been showcased on big YouTube channels at low framerates, but in reality, with a super high framerate as a baseline before framegen, it's not going to be noticeable, especially if your baseline is 500fps-level input lag and you're generating up to 1000fps.
windozeFanboi@reddit
If you can tell the difference between 200FPS and 500FPS (5ms -> 2ms frame time, a 3ms difference), then you can tell the difference when enabling FG at 500FPS.
That's assuming the framerate is rock steady too, because if it's wildly variable, FG also feels worse input-lag-wise.
Is 5ms or less of input lag dramatic? Perhaps not. Is it noticeable? Yes.
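For reference, a quick sketch of the frame-time arithmetic behind those numbers (nothing here is measured, it's just the 1000/fps conversion):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (200, 500, 1000):
    print(f"{fps} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# 200 FPS -> 5.0 ms, 500 FPS -> 2.0 ms, 1000 FPS -> 1.0 ms: the 200 -> 500 FPS
# jump is the 3 ms difference being discussed above.
```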
AwesomeBantha@reddit
Never thought I would see Necros referenced here lmao
Back in my day homeboy was a Genji one trick on a below average PC farming crazy highlight reels
Jags_95@reddit
Haha, tbh I didn't expect many people to know who I was talking about, but yeah, he's a great Genji.
CrzyJek@reddit
That's great and all, but the only real legitimate reason to go so high in Hz is to reduce actual input latency for competitive reasons. By the time you go beyond 360Hz, the difference in fluidity is almost imperceptible. Above 500, anyone who says they can tell a difference is lying. So enabling frame gen is sort of pointless, as it always adds latency, even if minimal.
DistortedLotus@reddit
The same shit was said about 240Hz vs 144, 360 vs 240, and 500 vs 360. People keep pushing the goalposts of what supposedly can't be seen with every upgrade, and they get proven wrong every time.
One study on fighter pilots found they could identify images flashed on a screen for as short as 1/250th of a second, and a flash of light in as little as 1/1000th of a second. The actual theoretical limit for perceived refresh rate is well above 1000hz, since 1000hz would still have less clarity than reality.
tukatu0@reddit
The limit for fps is somewhere around 10,000fps. You only need that for VR or hyper-realistic esports.
Those people are emotionally tied to their fps numbers. They want to believe 120fps is perfect and 60fps is for peasants (thanks, Nvidia), even though in reality both are horribly blurry once you raise your control sensitivity enough to allow the kinds of movements you're capable of in real life.
Jags_95@reddit
I would highly insist you buy a 500Hz OLED and try it for a week. I'm not saying that in terms of value it's going to be worth it for everyone, but going from a 240Hz to a 500Hz OLED was genuinely very noticeable. I can also tell the difference in motion clarity from a 360Hz OLED to a 500Hz OLED, but it's very, very minor compared to 240Hz. The sample-and-hold blur reduction alone was very nice to see at 500Hz in Valorant and Overwatch 2. When it comes to frame gen, yeah, I agree it's pointless, but he's one of the best players and he is playing at a very high level with it enabled, so my point to the other guy was that having a very high base framerate with low latency makes it very hard to notice with framegen 2x and Nvidia Reflex.
dparks1234@reddit
It’s to improve motion clarity
bogglingsnog@reddit
At that point your mouse and keyboard latency probably matters more XD
Jags_95@reddit
Yeah dude having good peripherals matters as well.
MonoShadow@reddit
The issue is messaging. People are just pushing against it. I see the value of FG and I'd argue the tech in question is the intended use case of Frame Gen.
It's rich-get-richer tech. Some people have recently started comparing it to "win more" cards in card games. At such high frame rates (500 to 1000, for example), input lag is so small and each image is on screen for such a short time that artifacts and the added input lag are virtually imperceptible. In return you get improved motion clarity on sample-and-hold displays. And in today's world this is the only way to saturate a screen like that, unless you're playing Quake 3.
BUT. This is not how nVidia sells it, and at this point it's not how some devs use it. MonHun Wilds got some shit for running like arse; their solution? Frame gen. nVidia outright lied about the 5070 matching the 4090 based on FrameGen, and a lot of the talk is about 30 to 60 fps, or even lower, which is not this tech's sweet spot.
This is why the issues are often "overblown": it's less a level-headed discussion and more PR bullshit plus people pushing back against it.
P.S. I have a thought which I can't really formulate in writing right now, about separating "new" tech into a perf/rendering category (what people use the shorthand "FPS" for, even if it isn't) and a monitor/display category (VRR, autoHDR, etc.), which is more or less about overcoming limitations of modern displays or their interactions with software. In this hypothetical scenario I'd drop FG into the monitor bucket.
varateshh@reddit
No joke, I have seen ex-esport pro streamers have that toggled on for Marvel Rivals. I guess the impact is minimal when going from 200 FPS to 400 FPS on newest Nvidia GPUs.
unknown_nut@reddit
Like, who wouldn't turn it on if they got 500 fps native? That'll turn it into nearly 1000, and you'll still get input latency at or near the 500 fps level. It's a no-brainer.
Vb_33@reddit (OP)
People don't understand the nuance of frame gen. It's either "thing good" or "thing bad", there can be no nuance.
windozeFanboi@reddit
I don't want framegen in multiplayer FPS games, period.
For offline games I'd argue it's a godsend, assuming you're not upscaling the laggiest piece-of-shit games from a 30FPS base framerate at 4K; there the input lag cost will be dramatic.
windozeFanboi@reddit
There is a cost to framegen, though. It may manage to go from 300FPS to 500FPS but very much struggle to take 500FPS to 600FPS... It's somewhat of a fixed cost at a given resolution; obviously the cost varies if the framebuffer is only 720p compared to 4K.
At, let's say, a 2ms fixed cost, you cannot break over the 500FPS limit no matter how low your graphics settings go with framegen. Idk the actual numbers so I won't speculate further than this, but there is a limit for each graphics card, depending on compute or bandwidth.
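To make the "fixed cost implies a hard ceiling" point concrete, here is a deliberately simplified serial model of 2x frame gen (the 2 ms cost is the comment's hypothetical figure; real pipelines overlap work and costs vary with resolution and GPU, so treat this as an illustration only):

```python
# Simplified model: each real frame pays its render time plus a fixed
# frame-generation cost and yields two displayed frames (2x frame gen).
def output_fps(base_fps: float, fg_cost_ms: float) -> float:
    per_real_frame_ms = 1000.0 / base_fps + fg_cost_ms
    return 2.0 * 1000.0 / per_real_frame_ms

for base in (100, 300, 500, 100_000):
    print(f"base {base:>6} FPS -> output {output_fps(base, fg_cost_ms=2.0):4.0f} FPS")
# base 100 -> 167, base 300 -> 375, base 500 -> 500, base 100000 -> 995:
# even with nearly free rendering, output FPS can never exceed 2000 / fg_cost_ms
# in this model, i.e. a fixed per-frame cost sets a ceiling that no amount of
# lowering graphics settings can break.
```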
Jags_95@reddit
Necros plays with 2x framegen and he's a high level player so yeah its useable.
windozeFanboi@reddit
What resolution? Framegen cost at 4K can be more dramatic than framegen at 1080p...
4K framegen is ABSOLUTELY NOT esports friendly, definitely not on my 4090 in The Finals; idk about other games.
Warskull@reddit
At a certain level, input lag stops making much of a difference.
Turning on frame gen and increasing the input lag when you are at 30 FPS is a huge deal. When you already have 200 FPS, the impact is much smaller. Meanwhile, frame gen is giving you an advantage by reducing motion blur.
General_Session_4450@reddit
Additionally, the actual difference between frames at 200 FPS is generally much smaller as well, making it easier for frame gen to generate frames with fewer artifacts.
CrashedMyCommodore@reddit
I mean, frame gen was always designed to be used at higher FPS and all three companies have said as much.
It was never intended to be a magic crutch for low fps, as there's not enough information for it to go off of.
Warskull@reddit
It was, but game developers saw frame gen and immediately thought "another reason we can skip optimizing," and companies like Capcom push you to run frame gen on their 30 FPS game to get it to 60.
In addition, many gamers don't have monitors that can make sufficient use of framegen yet. Many people are still on sub-200Hz. You really need 4K 120Hz+ or 1440p 200Hz+ to make good use of framegen. So they misuse it and say it sucks.
FragrantGas9@reddit
It was never intended to be a magic crutch for low fps.
The engineers would agree with you but the management and marketing teams fully intended for it to be used to push new product generations with cut down die sizes and better margins by moving practically unplayable low framerates into reasonable territory for marketing slides.
Strazdas1@reddit
When your base is 400 FPS it doesnt matter.
reddit_equals_censor@reddit
I suggest using accurate terms here.
YES, frame generation would be highly desired by e-sports players,
BUT the issue is that ONLY reprojection-based real frame generation would be, while Nvidia's fake-graph scam interpolation frame gen is worthless garbage.
So please at least use the term "interpolation frame gen".
Reprojection frame generation reduces latency and improves responsiveness, as it creates real frames.
Reprojection-based real frame generation can also incorporate full player movement data, and even enemy or other moving-object data in more advanced versions.
Fake interpolation frame generation has no positional data at all; it just throws its hands in the air and guesses an in-between frame with zero positional data and thus zero player input.
A casual LTT video that explains reprojection-based real frame generation:
https://www.youtube.com/watch?v=IvqrlgKuowE
___
Understanding the difference is important, because it seems the graphics industry is hell-bent on scamming people with fake-graph fake technologies that no one asked for, instead of real reprojection frame generation, and maybe enough people's awareness could change that.
yourrandomnobody@reddit
Funny that you got downvoted when you're correct... this platform is a cesspit of mediocrity.
venturepulse@reddit
When frames are that frequent, frame gen may actually work well.
Pheonix1025@reddit
I’m so excited for this! 1,000Hz seems to be the practical limit for the human eye, so once we reach that it’ll be really cool to see where monitor tech goes next.
thelastsupper316@reddit
No, we need 1000Hz 1080p OLED, then that's endgame.
Pheonix1025@reddit
I shudder to think of the bandwidth requirements for 4k/1000Hz
Nuck_Chorris_Stache@reddit
fibre optic display cables
Scheeseman99@reddit
I figure at some point it'd be easier to send the raw frames, depth buffers and motion vectors over the cable and interpolate them on the display instead of the GPU?
panzermuffin@reddit
Slots directly into your PC. Actual desktop-PCs are making a comeback.
Pillokun@reddit
oled? u mean micro led :D
Nuck_Chorris_Stache@reddit
Or possibly QDEL
thelastsupper316@reddit
Not going to be available in 1440p or 4K below 89 for another 7 years, probably. I just don't see the tech working unless we get some big technological breakthroughs.
Pillokun@reddit
Don't be a party pooper...
We were waiting for micro LED even before OLED became a thing on desktop; don't acknowledge the Wait™.
Kyrond@reddit
There isn't a single limit; it depends on the size of the screen, the number of pixels, and the viewing distance, just like resolution.
Nuck_Chorris_Stache@reddit
It depends on the size of the chair.
Pheonix1025@reddit
Hmm, are you saying the limit is different for 1080p vs 4k? Is there a large difference in those limits?
tukatu0@reddit
The limit is 1 pixel of movement per frame.
Scale the resolution up and you need proportionally more frames. 1080fps vs 1440fps isn't going to be that different, but 2160fps (4K), or even 4320fps for 8K, will be.
Kyrond@reddit
The highest useful refresh rate is the one that lets you show an object moving pixel by pixel. You can do that today by scrolling very slowly. On the other hand, if you want to create a blurry moving image, just scroll really fast and try to read the text. But let's assume some average speed.
If you use the same video source on 4K and 1080p, the 4K monitor would need double the refresh rate (it has double the pixels in each dimension). But the difference would be the same as dropping the resolution to 1080p, so for objects in motion on an average-sized screen (let's say 27") it wouldn't matter too much.
But if you showed the same image on an 85" TV or 100" projector while sitting close, you could tell.
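A back-of-the-envelope version of the "one pixel of movement per refresh" idea from the two comments above (the full-screen-per-second panning speed is an arbitrary assumption, just to show how the number scales with resolution):

```python
# Refresh rate needed so that eye-tracked motion advances at most one pixel
# per refresh, i.e. the motion steps are no coarser than the pixel grid.
def hz_for_one_pixel_step(speed_screen_widths_per_s: float, horizontal_pixels: int) -> float:
    pixels_per_second = speed_screen_widths_per_s * horizontal_pixels
    return pixels_per_second  # 1 px per refresh -> required Hz equals px/s

# Example: an object panning across the whole screen in one second.
for name, width in (("1080p", 1920), ("1440p", 2560), ("4K", 3840)):
    print(f"{name}: ~{hz_for_one_pixel_step(1.0, width):.0f} Hz")
# 1080p needs ~1920 Hz and 4K ~3840 Hz for the same on-screen motion, which is
# why the "limit" scales with pixel count (and with screen size / viewing distance).
```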
VictoriusII@reddit
The point at which persistence blur becomes practically undetectable depends on resolution and FOV coverage. For PC monitors this limit is probably about 1kHz, while VR screens might need 10kHz. This is also the point at which certain stroboscopic effects associated with sample-and-hold monitors disappear. You can read more about it in this article.
As for where monitor tech will head when motion blur is a solved issue, it'll probably focus on increasing resolutions, color gamut and brightness.
KR4T0S@reddit
We will probably exceed the rec 2020 standard next year in at least one TV so we might be looking at the limits of RGB soon too. RGBY comeback!
tukatu0@reddit
Good news is rec 2020 is only 50% of human eye sight. So f yeah next target is decided.
Vb_33@reddit (OP)
Rec 4040
yourrandomnobody@reddit
10kHz is a relatively decent value for all displays, regardless of VR or standard 13-32" sized options.
Pheonix1025@reddit
Oh TIL about VR screens! That’s so cool
SaltVomit@reddit
Lol I love how this keeps changing as we progress to better tech.
Like 20 years ago people were saying it doesn't matter if you get over 60fps because your eyes can only "see" 60fps lol
Strazdas1@reddit
And 20 years ago they would have been laughed out of the room. 20 years ago we were playing 85 fps on CRTs.
Pillokun@reddit
What? Running at 60Hz made your eyes and brain spasm on a computer monitor; around 75Hz was the bare minimum to not notice the flickering from the CRT. But yeah, 60fps for console gaming was what most were used to. Not us PC gamers, though, we were at 45 with max settings :P But if you were willing, you could get much more if you didn't max out the settings on PC.
Remember, PC and console gaming were not the same even back in the 90s.
HoldCtrlW@reddit
There is no difference between 60hz and 24hz. At least I can't see it.
fullmetaljackass@reddit
Those people were just idiots. A decent CRT could go over 100Hz and it made an obvious difference compared to 60Hz.
Jeep-Eep@reddit
I mean, to be fair, if the color, response and HDR on this are good, you could ride it at your target res until it went belly up, which could be more than a decade.
yourrandomnobody@reddit
1000Hz is nowhere near the practical limit for chasing the lowest possible eye-tracked motion blur and the highest possible sample rate to emulate analog reality with digital computer displays.
It's 4000Hz at minimum.
ExplodingFistz@reddit
5090 can't even do 1000 FPS in esports games. Maybe 7090
Pillokun@reddit
It comes close, and in some cases it does, but at that level (i.e. 1080p low) you are CPU/RAM speed bound.
ActuallyTiberSeptim@reddit
Meanwhile, I'm totally happy when I get 90fps in 1440p with my 6750 XT. 😅
apoketo@reddit
This graph is my guess of how refresh rate scales with motion clarity.
WeWillLetYouKow@reddit
2000 Hz here we come!
bubblesort33@reddit
I remember the last time a game hit 1000 FPS it was in a menu (New World by Amazon Game Studios?), and that KILLED GPUs, lol.
Like really, though, what game has anyone ever gotten over 500 FPS in? Rainbow Six Siege? You can't even use frame generation to get from 500 to 1000, because the time it takes to generate a fake frame is likely 2ms (1 second / 500) or more, when a frame at 500 fps already only takes 2ms to render naturally.
Strazdas1@reddit
I played Oblivion at 3000+ fps on a 60Hz monitor because the coil whine made music as the FPS changed.
tukatu0@reddit
Boomer shooters. Some 2D indies, maybe. Older Source games, if they weren't CPU bound.
bubblesort33@reddit
Yeah, and that seems like an incredibly niche market.
tukatu0@reddit
I'm considering this just for medical reasons alone. I don't slow-pan while scrolling; I like to flick the screen around, which causes eye strain even at 240Hz. Try moving your mouse as fast as you can while tracking it with your eyes, and tell me how they feel after 10 minutes.
Two problems arise: this thing might not be available outside China until late 2026 or 2027, and they also advertise a 0.8ms GtG time. By then 4K 360Hz with a 1080p 720Hz mode might be a thing. Lower fps, but a lot better visuals.
bubblesort33@reddit
Have you tried 240hz on an OLED, or just an LCD? Soon we'll have 500hz monitors, and I'm doubtful really anyone will be able to tell the difference between that and 1000hz.
tukatu0@reddit
I forgot to mention: the current Meta Quests have strobing that is equivalent to 3000fps in some respects. I find it really strange they never advertised that during their office-use marketing. I bet it would have gotten a lot of redditors on board.
tukatu0@reddit
You need to increase your movement speed in order to see the difference clearly. For example, try this at 240Hz: https://testufo.com/framerates-text#pps=1440&count=2 At least I can't read it; or rather, it strains my eyes heavily after minutes. I chose this one specifically because it's not even that fast; a fast reader should be able to read half the text. But I flick even faster, so the blur is even higher.
At 240Hz that is 6 pixels of blur per frame. That means to your eyes each letter is stretched about 6 pixels in both directions, for 12 pixels in total.
If you have a 120Hz or so monitor, decrease the speed by half to 720 pixels of movement per second; that way you get the same 6 pixels of motion blur. And again at 60Hz, drop to 360px/s.
Take a look at this article, it might give you a better idea: https://blurbusters.com/massive-upgrade-with-120-vs-480-hz-oled-much-more-visible-than-60-vs-120-hz-even-for-office/ It has illustrations with the amount of blur you would expect for each refresh rate, specifically at 960px/s of movement, which is just a scroll, not fast enough to be called a flick by anyone. The static picture is equivalent to 1000Hz. You can calculate how much blur something produces when you have those numbers: pixels of movement per second divided by the refresh rate.
There is also this one of the mouse: https://i.ibb.co/qLKVGmFF/static-eye-vs-moving-mouse-cursor.png It's the default speed from the mouse cursor test, a tab you can click on too.
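The arithmetic behind those numbers, written out (figures taken from the comment; the rule of thumb is eye-tracked persistence blur in pixels per frame = motion speed in px/s divided by refresh rate in Hz):

```python
# Eye-tracked persistence blur on a sample-and-hold display.
def blur_px_per_frame(speed_px_per_s: float, refresh_hz: float) -> float:
    return speed_px_per_s / refresh_hz

print(blur_px_per_frame(1440, 240))   # 6.0 -> the testufo link above at 240 Hz
print(blur_px_per_frame(720, 120))    # 6.0 -> same blur at half the speed on 120 Hz
print(blur_px_per_frame(360, 60))     # 6.0 -> and again at 60 Hz
print(blur_px_per_frame(960, 1000))   # ~1  -> a 960 px/s scroll is ~1 px of blur at 1000 Hz
```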
Yourdataisunclean@reddit
Why? Is there any evidence humans can benefit from display rates that high?
Cheap-Plane2796@reddit
The people responding to you think that higher refresh rate is about smoothness or reaction times. It is not.
All LCD and OLED panels use sample-and-hold to refresh.
Meaning every refresh tick they sample the latest image from the framebuffer, display it, and then HOLD that static image on screen until the next refresh.
So imagine a football being kicked on TV and moving across the entire screen in half a second.
It's a movie about a football team, so it's played at 24 fps.
At frame 1 the football is somewhere on the far left side of the screen, then it STAYS there until the next frame, when the TV shows where the ball would be about 40 milliseconds later. By this time the football will already be a few cm further right on your screen.
By the third frame it'll be another few cm to the right.
The ball stays in one spot for 99 percent of each frame time and then teleports to the next spot instantly.
That's not how motion works in real life; a real ball doesn't teleport in chunks between hanging still in the air. Our eyes follow the continuous motion of the ball.
With the TV, our eyes try to track the movement and keep moving to where they think the ball should be, but the ball isn't moving until it teleports again.
Our brain can't make sense of this mismatch, so we perceive the object as being out of focus and blurry.
Any panning scene or fast lateral motion in movies suffers greatly from this. But most scenes are fairly static.
In first-person games, racing games, isometric games, 2D platformers, etc., it's ALL panning and fast motion.
Now we get to why high refresh rates matter: the higher the refresh rate, the shorter the hold phase of sample-and-hold, the more samples, and the less jumping between frames.
This means less sample-and-hold blur.
There is a huge, easily perceptible difference in motion clarity due to sample-and-hold between 120 and 360Hz; the benefits continue well past 500Hz, and it probably takes close to 1000Hz to get real-life-like motion clarity out of a sample-and-hold display.
You can test this for yourself simply by opening a long page of small text in a web browser, scrolling down, and trying to read the text. It'll turn to illegible soup on a 60Hz monitor, is still impossible to read at 120Hz, but is quite clear and easy to read at 360Hz.
Or go to the Blur Busters website, start the UFO test pattern, and check out the difference at 24, 30, 90, 100Hz, etc., up to as high as your monitor supports.
You'll keep seeing meaningful improvement until at least 300+ Hz.
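As a rough sketch of the football example in numbers (the 1920-pixel screen width is an assumption for illustration): the per-refresh jump, and therefore the eye-tracking mismatch, shrinks in direct proportion to refresh rate.

```python
# Per-refresh "teleport" distance and hold time for an object crossing the
# screen in half a second, as in the football example above.
SCREEN_WIDTH_PX = 1920                  # assumed 1080p-wide screen
speed_px_per_s = SCREEN_WIDTH_PX / 0.5  # 3840 px/s

for hz in (24, 60, 120, 360, 1000):
    hold_ms = 1000.0 / hz               # how long each static frame is held
    jump_px = speed_px_per_s / hz       # how far the ball jumps between frames
    print(f"{hz:>4} Hz: held {hold_ms:5.1f} ms, jumps {jump_px:6.1f} px per refresh")
# 24 Hz holds each frame ~42 ms and jumps 160 px at a time; 1000 Hz holds ~1 ms
# and jumps ~4 px, which is why sample-and-hold blur keeps shrinking with refresh rate.
```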
Pillokun@reddit
I run my LCD at 390Hz and it is still illegible soup when scrolling down, say, a thread here on Reddit. I had a 240Hz OLED and it went back because it felt worse than the 1080p 390Hz LCD. I'm thinking of getting a 500Hz QD-OLED, but that Asus with the 720Hz 1080p mode is forcing me to wait and see how it performs.
Cheap-Plane2796@reddit
With LCD you have the 10+ ms pixel response time (the quoted GtG response is marketing bullshit) blurring the shit out of moving images. But that is a separate issue.
OLED doesn't have pixel-response-time blur, but it is still sample-and-hold.
A 1000Hz LCD panel is really stupid due to pixel response, so I'd take a 240Hz OLED panel over a 1000Hz LCD one, but for OLED higher refresh is valuable.
Pillokun@reddit
Yeah, I still remember when they went from black-to-black to gray-to-gray because it was "more realistic" according to the brands, since LCDs were never truly black anyway :P
But the thing is, the 390Hz LCD felt better than the 240Hz OLED; there was a certain instantaneity with OLED, but at the same time it felt really restrictive compared to the 390Hz LCD.
The freedom of control, i.e. being more connected to the game, was greater on the LCD.
Yourdataisunclean@reddit
Thank you, this is more what I was looking for.
Ideally someone will do work like this paper on retinal resolution and figure out the max limit to shoot for: https://arxiv.org/abs/2410.06068
yourrandomnobody@reddit
https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
https://forums.blurbusters.com/viewtopic.php?t=6519#p64573
A good goal value is ~8kHz.
lighthawk16@reddit
Yes
Yourdataisunclean@reddit
Anything you can point to? Actually curious.
Brapplezz@reddit
Blurbusters.com
Ever seen the UFO test? Made by that dude. To my knowledge he was the first to test a 500Hz+ monitor, years ago. Once past 240Hz the returns are diminishing, but at 1000Hz there should be 0 motion blur on a traditional LCD.
The rabbit hole of response times, refresh rates, panel types, backlight strobing and other monitor stuff is insane.
Dangerman1337@reddit
AFAIK 500Hz with a responsive OLED is basically non-existent.
Brapplezz@reddit
Yep. Basically all OLEDs have a GtG response time of ~0.3ms. They also aren't sample-and-hold, so clarity is immediately boosted. I can get my IPS at 120Hz to almost have the clarity of a 240Hz OLED by using backlight strobing. Almost.
I want a 240Hz OLED with BFI, pls.
DZCreeper@reddit
The vast majority of OLED are still sample and hold.
https://blurbusters.com/faq/oled-motion-blur/
Pillokun@reddit
Yep, pretty sure all LCDs/OLEDs are sample-and-hold.
Yourdataisunclean@reddit
I'll check that out. I've mostly seen studies showing most humans don't show much measurable performance difference when you go past 144Hz.
Kyrond@reddit
That is for input lag, which is not the main reason for high refresh rates; the main benefit is the clarity of things in motion. Try scrolling faster and faster while reading text: at some point it becomes blurry. At higher refresh rates, it stays clear at higher speeds.
If you had a 1000 Hz monitor, you could probably read anything you could keep your eyes on.
salartarium@reddit
https://www.nature.com/articles/srep07861
Olobnion@reddit
Why would an ant gamer care about humans?
cptadder@reddit
And anything above 200Hz is already pushing it; to be honest, anything over 120 and you're getting into the top 1 percentile of usefulness.
varateshh@reddit
I think a lot of people think >144 Hz is useless because for years we had trash tier panels where the pixel response time could not keep up with the refresh rate. This is demonstrated by LG 120 Hz OLED TVs matching many old 240 Hz monitors in terms of motion clarity.
ParthProLegend@reddit
A 500Hz monitor feels more fluid than 240Hz, as confirmed even by pros. Though I can't get that FPS natively, DLSS/FSR/LS will take me to at least 250FPS in a large number of games, so with VRR it's one frame every 2 milliseconds. Isn't that good enough? 1000Hz is the peak: whatever you do, you will see and respond to the frame your body and mind can register, meaning at 1000Hz you become limited only by your body and mind, as you have an up-to-date frame at every 1ms interval. Compare that to a 250Hz monitor, where you have an up-to-date frame only every 4ms. Think of it like this: if your response time is 203ms, you will respond to the frame that was shown at 200ms, while with 1000Hz you will respond to the frame shown at 203ms.
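A small sketch of the frame-timing argument being made here (the 203 ms reaction time is the comment's example; this ignores everything else in the input chain):

```python
# The newest frame you can have reacted to is the one shown at the last refresh
# boundary at or before your reaction time, so lower refresh rates mean you are
# reacting to slightly staler information.
def reacted_frame_age_ms(reaction_ms: float, refresh_hz: float) -> float:
    frame_ms = 1000.0 / refresh_hz
    shown_at_ms = (reaction_ms // frame_ms) * frame_ms  # last refresh before reacting
    return reaction_ms - shown_at_ms

for hz in (250, 500, 1000):
    print(f"{hz} Hz: the frame reacted to is {reacted_frame_age_ms(203, hz):.0f} ms old")
# 250 Hz -> 3 ms old (frame shown at 200 ms), 500 Hz -> 1 ms, 1000 Hz -> 0 ms
# (frame shown at 203 ms), matching the 203 ms example above.
```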
Green_Struggle_1815@reddit
While I totally agree, look at the mouse market: 2k/4k/8k polling, and consumers think it's noticeably better than 1k :D
Yourdataisunclean@reddit
Yeah, I haven't seen anything solid that supports going past 120/144hz. Perhaps the ANTGAMERS are going after the pigeon gamer market.
jedimindtriks@reddit
It's not just fps, it's latency: the higher the Hz, the lower the latency from a mouse click to it showing on screen. We have tons of tests showing there are people who can tell a major difference.
Especially in esports titles like CS2.
VictoriusII@reddit
https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
__Rosso__@reddit
Technically yes, but it's such a small gain that it won't even have an effect on esports players.
But as in any sport, virtual or otherwise, even the tiniest advantages, no matter how insignificant, are welcomed.
Dastenis@reddit
Do we need a 1000 Hz monitor, and what's the point of having one?
Jeep-Eep@reddit
Enough with the kilohertz whale fodder; can we have high-grade HDR, top colour gamut, ~200Hz, and fast response times for a reasonable price already?
Igor369@reddit
What is the point? Quadruple black frame insertion?
aqpstory@reddit
One thing is CRT scanline emulation, though Blur Busters claims it's better than black frame insertion; I'm not sure why that would be.
It's probably also kind of like megapixels in a camera: it's partially for marketing, but for a higher framerate to be meaningful, other specs of the monitor also need to be improved (and we'd hope they're doing that).
yourrandomnobody@reddit
The reason the Chief Blur Buster keeps pushing "CRT shader emulation" is that manufacturers don't want to implement hardware-level BFI at 60Hz.
It's primarily for retro games; this is the main reason the Blur Busters 2.0 certification exists, for 60fps @ 60Hz emulator content.
CarVac@reddit
The main reason you want scanning BFI instead of full-screen BFI is that it reduces room-illumination flicker.
A CRT scanning at 60Hz is illuminating the room around 90% of the time (NTSC vblank is ~8%), while an LCD with backlight strobing is illuminating the room maybe 5% of the time, so the flicker in your peripheral vision is stronger from even 120Hz BFI than it is on a 60Hz CRT.
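Rough duty-cycle arithmetic behind that comparison (the ~8% vblank figure is from the comment; the 0.4 ms strobe pulse width is an assumed example, since real strobing monitors vary):

```python
# Fraction of the time the display is emitting light into the room.
def crt_rolling_scan_duty(vblank_fraction: float) -> float:
    # A rolling scan lights *some* part of the screen except during vertical blanking.
    return 1.0 - vblank_fraction

def strobed_backlight_duty(pulse_ms: float, refresh_hz: float) -> float:
    return pulse_ms / (1000.0 / refresh_hz)

print(f"60 Hz CRT, ~8% vblank:            {crt_rolling_scan_duty(0.08):.0%} lit")
print(f"120 Hz strobed LCD, 0.4 ms pulse: {strobed_backlight_duty(0.4, 120):.0%} lit")
# ~92% vs ~5%: the strobed backlight spends far more of each refresh fully dark,
# which is why its flicker reads as stronger in peripheral vision than a 60 Hz CRT's.
```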
bubblesort33@reddit
I'm not happy until 90% of my frames are black.
blarpie@reddit
Well, being able to play old games with no blur is nice, and for now that will probably be the best use for it when it comes out, until horsepower catches up.
Now, if they added Blur Busters' rolling-scan method into the firmware, then maybe, but then you hit the issue that OLED needs more nits to use BFI solutions.
FlatTyres@reddit
The only ridiculously high refresh rate I'm interested in is 600 Hz for video - the lowest common multiple for judder-free pulldown playback of 24p, 25p, 30p, 50i, 50p, 60i and 60p video (not talking about any form of motion interpolation).
I don't care enough about 48p films to desire a 1200 Hz screen but I suppose if I really did want to watch a 48 Hz film, I'd drop the screen down from 600 Hz to either 144 Hz or 240 Hz.
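The 600 Hz figure is just the least common multiple of those common frame/field rates, and the 1200 Hz aside for 48p follows the same way; a quick check (requires Python 3.9+ for multi-argument math.lcm):

```python
import math

# Integer pulldown is judder-free when the refresh rate is an exact multiple of
# the content rate, so the lowest rate covering them all is their LCM.
rates = (24, 25, 30, 50, 60)   # 50i/60i fields land on the 50/60 cadence
print(math.lcm(*rates))        # 600  -> every listed rate divides 600 evenly
print(math.lcm(48, *rates))    # 1200 -> adding 48p pushes the LCM to 1200
```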
adaminc@reddit
I imagine this tech might be useful in the future for some sort of layered LCD like screen to give you a 3d effect.
W4DER@reddit
Over 9000!
Coming soon...
AgentUnknown821@reddit
Geez… I used to meme on people who bragged about their high-refresh monitors by exaggerating the numbers… now it's an actual thing.
binarypie@reddit
Was 1KHz too confusing?
Prince_Uncharming@reddit
1 is less than 1000.
Yes, some people will be stupid and think 500hz is better than 1khz.
binarypie@reddit
We still don't know if this is objectively better or not. It could be really fast with shit picture or other stability issues.
Prince_Uncharming@reddit
That’s not the point I was making, at all.
None of that matters anyways, obviously the assumption is a like-for-like comparison. All else equal, yes, 1khz is better.
Affectionate-Memory4@reddit
I've seen people ask things like "5070 better than 4090? Number bigger so yes? Nvidia wouldn't lie to me right?" Obviously absurd example but it illustrates the point. People associate big numbers with being better.