Why do I see a ton of people with v-sync disabled?
Posted by Careful-Inspector932@reddit | buildapc | 425 comments
I recently bought myself a gaming PC and noticed huge screen tearing. V-sync came to my rescue, and since then I haven't had any problems. I also tried AMD FreeSync from AMD Adrenalin with v-sync disabled, but there was still a little screen tearing.
I hear many people saying to disable v-sync, but... how can you deal with that screen tearing, even at the cost of some fps?
xxxTheBongSquadxxx@reddit
Screen tearing is really only a problem if the FPS exceeds your refresh rate. Mine sure doesn't for most recent releases.
Elliove@reddit
No, this has nothing to do with FPS. If frames aren't in sync with refreshes - you get tearing with FPS both below and above refresh rate; if frames are in sync with refreshes - you get no tearing with FPS both below and above refresh rate.
ToMagotz@reddit
Wait, so there's no reason not to limit fps? The GPU works less hard that way too
IronicCard@reddit
Fps past your monitor's refresh rate is redundant and not even going to be displayed. Limiting it is pretty much always a good idea. It helps prevent crazy .1%/1% lows.
FinalShellShock@reddit
It's not entirely redundant in competitive games, where it can still reduce input latency, but it is minor and won't make a big difference for most average users.
Agzarah@reddit
It won't reduce input latency per se as your input isn't changing.
What it does is make sure you are seeing the absolute latest info and can respond more accurately to the data, rather than a frame which was requested almost a full cycle behind.
For example, at 100fps on a 50hz panel you'll get data that was sent to the gpu 0.01 seconds ago, rather than 0.02 seconds ago with 50fps on 50hz. 50% of the data won't ever get rendered, but what does is more recent.
(I know people don't use those rates, but it makes the numbers clearer to represent)
It might sound crazy small, but it has impact.
What's key though is consistency, and why locking the fps to multiples of the refresh rate can give smoother gameplay than allowing it to spike
hypexeled@reddit
It also feels more smooth/responsive. I can notice a clear difference at 120hz between being at 120fps and 240fps.
SteamySnuggler@reddit
Can you feel the difference if you turn off the fps counter though?
laserbot@reddit
My wallet is lucky that my eyes are stupid and can't tell the difference between 60 and 120, let alone 120 and 240.
NotAtAllHandsomeJack@reddit
Man, I’m a special kind of stupid. Sometimes 60hz looks like a slideshow, sometimes it looks smoother than a buttered up cue ball.
Weakness_Prize@reddit
Sameee. Especially in VR, even between like 30 and 60. Although I'm also just used to low framerates from other games I suppose
Naetharu@reddit
If you're getting 30fps on VR you notice because you'll be vomiting on the floor.
A high and consistent fps in VR is critical else it simulates the effect of being poisoned and the brain responds in kind.
Weakness_Prize@reddit
Except that that isn't always the case. I've dealt with it plenty.
Naetharu@reddit
It 100% is the case. Very well documented. Was one of the major hurdles in getting commercial VR working.
Weakness_Prize@reddit
Cool. Personal experience; I've dealt with lower than 30 FPS framerate in VR for extended periods of time, and it wasn't nearly that bad. Sucks ass, but didn't feel like I was dying.
Now, when SteamVR starts crashing and flashes in my face, that is enough to make me just about puke.
weedemgangsta@reddit
you remind me of a buddy who has been complaining that his temporary tv is only 60hz, meanwhile i just upgraded to a 60fps capable device and i feel so spoiled by it. ill never go above 60fps i dont want to ruin my eyes
ImYourDade@reddit
I think it more depends on the kind of game. It's very very apparent in something like cs where you're spinning around flicking and moving, or any fps probably. But if you're playing something like balatro it means pretty much absolutely nothing to get more than like 30 fps
118shadow118@reddit
It's probably down to 1% lows, average 60 fps with high 1% lows is gonna look a lot smoother than avg 60 with low 1% lows (meaning more stuttery)
If you use some onscreen performance metric apps like afterburner, you can bring up the frametime graph. The smoother the line is gonna be, the smoother the game is gonna feel
You-Asked-Me@reddit
I think that is probably due to drops or variations in frame-rate. It's harder to tell the difference between constant 60fps and constant 120fps, but when you have 120fps that dips down to 60 and then back to 120, we notice the changes a lot more.
NotAtAllHandsomeJack@reddit
You’re giving me too much generous assumption. Just stoopid.
But nah, I only really play one game (iracing on triple screens), I can notice when gsync isn’t running/low frame rates.
On a desktop tho? Nah.
butt_soap@reddit
That's crazy to me. 60 feels like a slideshow to me after going to 144
49lives@reddit
Your eyes aren't stupid if you have three monitors with 60/120/240 hz, and you move the mouse cursor in circles on all three while going back and forth. You will most definitely notice.
nonton1909@reddit
Maybe you just forgot to turn on 120 hz when you tried it? Typically monitors are set on 60 by default
Wolf_Fang1414@reddit
For me, it's always a when-I-go-back thing. I didn't really notice the jump between 30-60, nor 60-144. But I noticed it when going back.
wegotthisonekidmongo@reddit
Right. I notice nothing of what anyone is talking about. And I am glad my eyes are not that sensitive to motion.
that_1-guy_@reddit
Because of how games work, it will reduce input latency, as the game sees your input sooner and renders it sooner
Agzarah@reddit
No, the gpu is going to have zero impact on how quickly the input is registered and then processed by the cpu.
It may give an illusion to lower latency, because you are reacting to a more recent data point. But the actual input will remain the same
AggravatingScheme327@reddit
Wrong, limiting framerate prevents the CPU from queuing frames that the GPU hasn't rendered yet. Without a framerate limiter if you just let the game bounce off of VSYNC, you get 3 frames of latency before VSYNC imposes any sort of limit.
Plini9901@reddit
If it's triple buffered.
Faranocks@reddit
No. Physics refresh rate (or whatever is controlling the character in the engine) is almost never more than the rendered refresh rate. The CPU will queue up inputs and process them at the start of a new frame. Some competitive games have the latest input sent with the last local tick to reduce the impact of FPS on a competitive advantage.
Subtick in CS2 essentially adds a timestamp to when the input was pressed locally. At the same time, CS2 still only processes inputs with every new frame. This is why locking FPS to 30 allows for some movement BS. The CPU waits to process the inputs until the next frame.
tinysydneh@reddit
You can have it processing frames beyond what it is actually rendering, but how well this works is heavily dependent on the actual engine. Some are actually better decoupled so this stops working.
Faranocks@reddit
Examples please? I haven't heard of a physics engine tickrate exceeding rendered refresh rates. Exceptions for server sided physics control.
tinysydneh@reddit
Sorry, when I said "rendered" I meant displayed. It's not uncommon for frames to render/process without actually being displayed. Poor choice of words on my part.
Faranocks@reddit
Yes. We are not disagreeing then. Screen tearing occurs because of too many frames (frame buffer is overwritten as the monitor is rendering a frame.) Not all frames rendered are written to the frame buffer when the monitor is drawing an image though.
tinysydneh@reddit
Yep, just offering context for the most part!
salt-of-hartshorn@reddit
Input latency is the round trip time between making an input and seeing the results of that input rendered on the screen, not the time between an input being made and the CPU handling the hardware interrupt.
that_1-guy_@reddit
Games run on the CPU, the CPU still makes every single frame
Gpus don't just magically create an image of the game
If you don't understand the pipeline of how a frame is generated and how a game is computed I don't know how on earth you think you know where input latency comes from (other than physical ofc)
IronicCard@reddit
I just want to butt in and say that for competitive games, if you have a GPU that can handle a lot more than your display, then it does help. It's not "redundant" exactly, but my mind isn't jumping to that when the start of the thread is someone talking about how their fps doesn't meet their monitor's refresh rate on modern titles anyway. I didn't think about esports after that point, and even then my reasoning comes from a point of stability.
CubingGiraffe@reddit
You do get lower input latency though. 300fps on 60hz registers the action starting hundreds of frames earlier than 60fps on 60hz.
Situation A.) you are on 60fps@60hz. You click. The game takes 1/60 of a second to process that information and begin the animation and backend that completes the action of your click.
Situation B.) You are on 120fps@120hz. You click. The game takes 1/120 of a second to process that information.
Situation C.) you are on 120fps@60hz. You click. The game takes 1/120 of a second to process that information.
It's milliseconds, and you may not SEE the difference in input latency, but it is certainly there.
eddietheperson@reddit
The frame rate and the speed that the game registers mouse clicks are completely unrelated. Let’s say your gpu is only able to push 1 frame a second. Why would the rest of your computer/game wait until the next frame is drawn to poll where the mouse should be? Based on your theory, if my GPU could produce 100000 frames a second, it would magically now be able to increase the poll rate of my mouse, which is handled by the CPU, not the GPU. Not to mention, mice have set polling rates that are constant, no matter what is happening on the screen.
Faranocks@reddit
It absolutely does reduce input latency. Input latency for most games is in some way directly tied to framerate, tying an input to the current or next frame (depends on how it's implemented). The more frames the sooner the input is processed.
Screen tearing happens because of how the display buffer is sent. If you render two frames every single screen refresh, on average your monitor will output roughly half the first frame, and then half the second frame. At higher FPSs (5-6x refresh rate) you can end up updating the display buffer 2-4 times each refresh.
300fps on a 60hz feels significantly more fluid than 60fps, or even 120fps. It's not even close. Open up a game like CS or Valorant, lock your monitor refresh to 60 and play with 300+ fps compared to locked 60. Even better implementations of locked FPS don't feel anywhere near as fluid, even with the abundant screen tearing.
For non-competitive games, fluidity matters less than visual fidelity, and locking FPS to reduce/remove screen tearing can be a good thing. At higher FPSs locking frame rates can be good as being half a frame behind is a fraction of a ms rather than several ms.
jlreyess@reddit
So it does reduce input latency by showing the latest input.
SolomonG@reddit
Nah, lots of older games (or older engines) only update things like cursor location when a frame is rendered, regardless of if it is displayed.
Your mouse movement isn't fluid from a game's perspective; it jumps from point to point to point, and there are more points with more frames.
zeldapkmn@reddit
What multiples?
Like 120 FPS for 144 Hz?
Agzarah@reddit
Those are factors of. Not multiples.
zeldapkmn@reddit
Lesson learned not to post on Reddit when first waking up
Agzarah@reddit
I'm still learning that lesson :(
IronicCard@reddit
100%, but I do feel the potential for frame hitching is worse than slightly worse response time. I feel it's better to limit fps based on GPU usage rather than the monitor, but not everyone has a good enough GPU for that to always be viable, unfortunately. Even my mind jumps to 120hz-144hz being standard, but plenty of people still use 60hz as well, especially at higher resolutions.
Green-Leading-263@reddit
Load of bollocks, what bs. More frames always means a frame is getting to the monitor quicker, providing they aren't sat waiting to be displayed.
IronicCard@reddit
I've got no clue what you're trying to say man. The refresh rate is specifically the rate in which the monitor refreshes itself every second. Fps exceeding that doesn't display. As others stated though it will display the most "modern" one which does help but it's kind of rare for someone to benefit much from it. And the only reason you'd care to limit fps is as I originally stated being able to get more stable performance over maximum. 400 fps is great in CS2 but whenever it dips it's kind of annoying.
Green-Leading-263@reddit
You are wrong. FPS over your hz absolutely is faster than matching them, like 400fps at 400hz. A frame made slightly sooner will come to the screen sooner, therefore latency is less.
IronicCard@reddit
I mean you're right it's just past a certain point very early on the returns are diminishing. At 144 fps past that the reduction falls heavily. At 240fps it almost stops getting better. And GPU utilization going to 100% is going to increase input latency. I didn't think to clarify and I still believe what I said will work for the vast majority of people. The 1% lows you get are so atrocious at 100% GPU utilization that my reasoning is the stability from limiting FPS is nice. I didn't think about anyone using 60hz monitors for esports.
Green-Leading-263@reddit
Input latency happens because your GPU is throwing frames out faster than the monitor can display them, so you end up with frames waiting to be displayed. Reflex/Low Latency mode prevents this and makes it more responsive. It's a noticeable difference.
bertrenolds5@reddit
Yea but it makes the game smoother. I go almost double my native refresh rate. V sync disabled.
Moscato359@reddit
"Fps past your monitor's refresh rate is redundant and not even going to be displayed."
This is not true.
The screen is filled from top down, and if the new frame finishes prior to the old frame being completed, it starts filling the rest of the screen from top down.
This is what causes tearing.
salt-of-hartshorn@reddit
This is wrong. You'll get a rolling shutter effect where, at high enough FPS, you have more than 2 frames displayed on screen at the same time. It absolutely is displayed, the monitor just can't display all of the frame before the next one comes in.
TyraelmxMKIII@reddit
Finally some sane people that don't tell everyone to "uncap fps to get 100% gpu usage because you always want 100%gpu usage" type of bs.
TheMidwinterFires@reddit
Well they're wrong, it's not "entirely redundant". FPS above refresh rate will still provide a smoother experience
Big-Resort-4930@reddit
No it won't. Smoother =/= lower latency; you reach peak smoothness when you hit your display's refresh rate, and everything beyond that will be more jerky and uneven (visually).
You will get lower latency at the cost of smoothness and stability, so it is absolutely redundant for 999 ppl in 1000.
Faranocks@reddit
Different argument.
llcheezburgerll@reddit
hey i paid top dollar for my high end gpu and want to use all the way! /s
neighborhood-karen@reddit
FPS past the refresh rate isn't redundant, since it decreases input latency - the game is physically being updated more frequently than otherwise, so you are getting an actively smoother and more responsive experience with high fps
DSpry@reddit
If you limit with RTSS, you can force everything, including Windows itself, to only push the number you selected. I like doing this cause sometimes it wants to use more than necessary to render my default wallpaper. I like to think of these situations exactly like how it handles ram. "Only 8gbs? I mean we can run on 2-3 but I won't be happy…. Now you got 32?! Cool, I'm gonna use 7-8 now cause I can."
Lokeze@reddit
Technically you get a slight bump in response time the higher your fps is, but there are diminishing returns for that and the difference is negligible for 99.9999% of people.
CasualCucumbrrrrrt@reddit
No this statement is not true. Higher fps = lower latency. Even when going above your monitors max refresh rate.
Steezle@reddit
If you have a super high refresh rate, screen tearing will be less significant. And in an esport where you want to see the latest pixels, it may be a trade off worth the minor picture quality loss.
Elliove@reddit
Pretty much. As for VRR - you want to keep frame times within the VRR window, and an FPS limiter helps with that, so you get no tearing and no VSync input lag within the VRR range. As for input lag as a whole - it used to be a case of trying to get as much FPS as possible, but these days in-game limiters are smart enough to reduce latency using your PC's "excessive power", and Nvidia users also have Reflex. Long story short, a good FPS limiter puts some of the delay before input/simulation, which reduces the time between inputs and on-screen response. In-game limiters often do that, Reflex does that, I imagine Anti-Lag 2 does that as well, and then there's RTSS back edge sync, Special K's Latent Sync and SK's VRR low latency limiter too, and if you go way back, you could do that for D3D9 games using GeDoSaTo's "predictive limiting" feature.
So, tl;dr - FPS limiters are currently the best way to achieve smooth and responsive gameplay, and in-game limiters (that competitive games typically provide) usually reduce latency further than external limiters (Adrenalin, RTSS, Special K - they all can inject the delays only on the rendering threads, while modern games run input/simulation on a separate thread, so if you strive for the lowest input latency, then try the in-game limiter first).
salt-of-hartshorn@reddit
If you want responsive gameplay the best thing to do is to turn off vsync, disable VRR, disable compositing, and uncap FPS. You'll get a lot of tearing but that configuration is what minimizes latency.
Elliove@reddit
Wdym by "disable compositing", and how does it increase responsiveness?
salt-of-hartshorn@reddit
Compositing being a step in rendering where the game is first rendered to a buffer external to the game that is part of the desktop interface. I'm not a windows user but IIRC fullscreen on Windows disables it for that window.
Elliove@reddit
Yeah, it seems you haven't used Windows in quite a long time. Windows 8 introduced the DXGI Flip Model, which removed the need for the extra copy operations that used to add latency - the compositor pretty much works in passthrough mode.
Glittery_Kittens@reddit
By “FPS limiter” you mean the one present in the Nvidia/AMD control panel right?
I’ve been running an FPS limit of 151 on my 155hz monitor for a long time. I have no idea if that’s the best way to do it but it seems to work pretty well. I’m not playing super graphics intensive games though generally.
CaravieR@reddit
If your game has one in-built into the settings then it's preferable to use that one. Not always the case but it's a good rule of thumb to follow for the smoothest experience.
NachOliva@reddit
It is hard for me to understand how adding delay reduces overall input delay.
Are we not talking about frame timing? If that is the case, I understand that locking fps, whether in-game or through vsync or VRR, is the fix, as it stabilizes frames.
But unlocked fps should still have less latency. In my experience, the visual effect of screen tearing/frame skipping disappears once the system can render maybe over double the maximum of your monitor?
Why isn't unlocked + very high system fps a better way to achieve such a goal?
Elliove@reddit
Simple example - at 60 FPS, a single frame takes 16.7ms for the CPU. It starts with processing your inputs, then changes the position of objects correspondingly, draws a frame, and sends it to the GPU. At 1000 FPS, a single frame takes 1ms. A simple FPS limiter lets the CPU do its job, then adds a delay - so if it only took the CPU 1ms to draw a frame, with a 60 FPS lock it will add 15.7ms after the CPU has done its job, and that 15.7ms will be a delay between the CPU drawing the frame and the GPU starting to work on it. If, however, that exact same delay is put before the CPU processes the inputs and draws the frame, then you'll have 60 FPS with the same input delay as 1000 FPS. Google how Reflex works, there will be graphs, tests, whatnot.
Unlocked FPS can, in theory, provide lower total latency than smart FPS limiter would. But you'll be hard pressed to notice the difference at high FPS competitive games usually run at due to diminishing returns. There's just no point in going outside of VRR range on 240Hz+ people use for competitive gaming, because the difference will be laughable. Here's a test from BlurBusters, and this was with just the in-game limiter, no Reflex.
NachOliva@reddit
Been playing competitive games for a long time and took the ride from very crappy netbooks on amd apus to now running a modern system. I consider myself to be very sensitive to both latency/delay and screen tearing.
I found this research: I understand them to say the difference, although laughable, is there, and in competitive gaming can be significant.
"Latency and refresh rate effects are more pronounced when target motion is complicated and unpredictable, where timely and accurate visual feedback become more critical for aiming".
For that minimal advantage, I just can't yet agree that there is no point in going outside the VRR range.
If your game runs over 120fps on a 60hz display would you agree it should be a better experience running the game uncapped rather than capped near 60hz?
All I'm saying is that capping games definitely makes the game run visually smooth, but I still think it's not the rule when talking about responsiveness and latency.
Elliove@reddit
We're talking VRR here. You're unlikely to even find a VRR display limited to 60Hz, and the example I provided shows 240Hz, which is a way more realistic scenario for modern competitive gaming. You'd need to 10x the FPS to win a single ms of latency, according to the test I linked.
NachOliva@reddit
Aside from VRR, I'm trying to point out a reason why people could prefer vsync off, as OP is asking.
Why would someone just "stay within VRR range" when their system can render double the display's refresh rate for smooth gameplay, solving the frame timing issue?
Even if there's little difference in delay, I still can't get why you would not consider that "better".
Elliove@reddit
With a decent FPS limiter, you shouldn't have stutters to begin with, so double the FPS shouldn't feel smoother. So here I am on 60Hz, with a game that I can run at over 1000 FPS - what's the actual point of having over 60 FPS, if I limit FPS with Reflex or Latent Sync? It won't make things smoother, won't reduce latency, I'd basically be burning electricity for no gains.
NachOliva@reddit
Lower overall system latency maybe?
I remember someone saying some games benefit from higher fps for input stuff (maybe helps with 1000+hz peripherals).
It may be placebo for me, I have done this test couple times and I agree It is hard to notice, but I have stayed uncapped for a long time in most games and setting fps caps throws me off in games where my system can render stupid amount of fps.
Elliove@reddit
Technically, all of the games do by default, because CPU polls the inputs every frame, unless told otherwise. But then we come back to modern smart FPS limiting - in-game limiters, Reflex, Anti-Lag 2, RTSS back edge sync, SK's Latent Sync and low latency limiter, etc, even GeDoSaTo's predictive limiting feature could do that. Such limiters can inject some delay before CPU starts polling inputs for the next frame. So, say, taking 1ms to poll inputs and draw a frame at 1000 FPS, will be no different between waiting 15.7ms, and then doing the same for 1ms - in both scenarios there will be just 1ms simulation-to-render latency. I love such smart things, because I totally don't want games running at unreasonable FPS for no benefit (and some of them, like Touhou or fighting games, should be kept locked to 60 due to game logic being tied to FPS).
Either way, whatever works for you best and provides best experience - stick to that, and enjoy your games!
NachOliva@reddit
I am sure I am sticking to what works for me.
It's just tough to see someone say with such eloquence that something I've been doing for years has absolutely no benefit, even when esports forums/threads and my gut feeling say otherwise.
You reminded me of Skyrim and Forza saga which had some ugly issues with uncapping fps.
NachOliva@reddit
I just feel that competitive scenarios are being left out from your realistic approach.
AggravatingScheme327@reddit
Correct, no reason to not limit framerate. You get improved pacing and reduced latency.
Weekly_Inspector_504@reddit
So you would limit a 4090 so it performs like a 4070?
ToMagotz@reddit
Well the graphics should still be in 4090 level, just no unnecessary power draw?
RunningLowOnBrain@reddit
Unless it's an old/badly coded game where inputs are tied to framerate (many rhythm games)
Rad_YT@reddit
Personally I recommend locking your fps to 2x your refresh rate for competitive games (reduce input latency) despite not seeing the frames, and then lock your fps to your refresh rate for less competitive games
Mercureece@reddit
Unless it's a competitive game like CS or Valorant, where the increase in FPS might also increase responsiveness/decrease input delay - then no. But I could be wrong
Elliove@reddit
Competitive games usually have smart built-in FPS limiters, so you aren't losing much (if any) latency if using in-game FPS limiter as opposed to having unlocked FPS.
Skysr70@reddit
lol
NachOliva@reddit
we're definitely talking about system latency here, are you talking about tickrate maybe?
sautdepage@reddit
Intuitively, nothing would be more optimal than starting rendering the frame so that it finishes exactly on time to display on next screen refresh.
Like so:
> | XXX | XXX |
Intuitively, leaving frame rate uncapped won't be as good as this (because some frames would overlap with display refresh). So it's kind of a brute-force approach.
In reality, it's probably hard to time it perfectly. Still with a VRR display, I would expect net result to be close to optimal, possibly better than brute force uncapped. But I'm probably missing something.
pretty_random_dude@reddit
Most games rely on redraw per tick, or rather per frame - e.g. input is processed per frame, physics per frame, etc. So the more fps you have, the more responsive the game becomes, and hence the display.
PhattyR6@reddit
It is almost always best to cap your FPS.
Evens out frame times, reduces latency, reduced power usage/temps/noise.
Milk_Cream_Sweet_Pig@reddit
There is a reason. Exceeding your refresh rate will cause tearing. You're still better off capping your fps to 2 or 3 below your maximum refresh.
CreamBzh@reddit
Even if you lock frames to your monitor refresh rate without sync, the tear will always be at the same place, making it even more noticeable. But I still don't use vsync; I lock frames 5 fps under or above and somehow don't notice it
PruneIndividual6272@reddit
yes - but with a lower framerate the frames stay longer and the timing difference between gpu and monitor can be bigger, which makes the tearing much more visible. When you move your mouse without V-Sync and the picture gets updated right after half the screen was updated, you get a tear that is offset by as many pixels as you moved the picture in one frame - let's say that was 6 pixels. When your framerate is twice as high, the tear would only be 3 pixels for the same mouse input.
Elliove@reddit
The offset will become lower, but the amount of tearlines increases, so it's up to discussion what is more distracting. Ofc at some point high refresh rate and high FPS just make tearing too hard to notice in general.
FROGxDELIVER@reddit
I've only heard streamers that play Widow in Overwatch say it's good. Everyone else turns it off for fps.
At the end of the day, it's preference - whatever makes you comfortable. But most pro players want less input latency, not a smoother picture.
Elliove@reddit
As shown here, there's just no point in going for crazy high FPS. And as for VSync, it only adds latency when frame times exceed the VRR range. Mind you, back when they did this test, there was no Reflex, and Reflex would lock to 224 FPS, which is the saner target for a 240Hz display to compensate for frame time variance.
FROGxDELIVER@reddit
https://www.reddit.com/r/esports/s/e4yf4JMM9I
I think maybe the tide will shift eventually with the newer generation of gamers, but for the past 10+ years, it's always been the argument that less fps = bad, especially at the highest level where that does make a difference.
It would be cool if that shifts, but I haven't seen it in pro play yet.
Elliove@reddit
Google up "anomalous electrical input lag". Many competitive players believe that stuff, and try to fix it. Whatever they believe helps them - they should keep doing, but I'd seriously doubt any technical information that comes from pro players.
ColKrismiss@reddit
From a technical standpoint I have no reason to disagree with you. From a practical experience standpoint, I have NEVER noticed screen tearing without having a frame rate above my monitors refresh rate
Elliove@reddit
If you're using VRR, then that's why - VRR reduces tearing significantly, as long as your frame times are stable, and don't exceed the VRR range.
ColKrismiss@reddit
Oh for sure I am using that and I understand that, but even back in the early 2000s when I was first getting into PC gaming I noticed this. VSYNC was all the rage back then ("lower fps but it looks like MORE fps" was the big punchline for VSYNC) but my PCs were so cheap and underpowered I couldn't stand the input latency. I never turned it on cause I could never see screen tearing.
Even now though, my son's PC is made from my old parts with an old monitor with no VRR. His GTX970 won't push newer games past the 60hz refresh rate of his monitor and we see no tearing. On older games where it can blast past 60fps, we see lots of tearing.
Elliove@reddit
Thing is, the scenario you've described is simply not possible. Take one of those games where you can see tearing with FPS over 60, use RTSS or something to limit FPS to, say, 55 - and you'll see the tearing just as well. I imagine the difference between games you see was caused by some of the games running in borderless mode - since the composer Windows uses is itself VSynced, unless told otherwise, simply using borderless mode in games will result in no tearing at any FPS even with in-game VSync off. That's essentially the alternative to Nvidia's Fast Sync; and since I mentioned it, try forcing Fast Sync to those games that you want/can run at over 60 FPS on 60Hz screen, and you'll see that FPS is not limited, but there's no tearing - further proving the point that FPS has nothing to do with tearing, it's about syncing front buffer updates with monitor's refreshes.
Careful-Inspector932@reddit (OP)
So if I got it correctly, if I set my refresh rate to max (75hz) and cap my fps to 60, I shouldn't see any screen tearing
perilousrob@reddit
first things first.. does your monitor have FreeSync or G-SYNC? If so, use that & disable vsync.
V-sync works by trying to force your system to produce the framerate the monitor is using. e.g. 60fps to a 60hz monitor. It always adds some level of input lag, but it does eliminate screen tearing.
G-SYNC works by matching your (specially G-SYNC enabled!) monitor's refresh rate to what your graphics card is producing. G-SYNC monitors have a hardware doo-hicky that communicates with your NVIDIA graphics card more directly to handle that sync-up. 'G-SYNC Compatible' monitors don't have the hardware bit, and will try to match sync within a given range (usually 48hz up to monitor max hz) but not give the same improvements to input lag, stuttering, etc that a full G-SYNC or G-SYNC Ultimate (offering 1hz to max monitor hz) certified monitor will.
FreeSync (by AMD) gives essentially the same results as G-SYNC, but without the monitor needing an entire piece of NVIDIA hardware added, and they don't charge monitor manufacturers for it.
If you have an nvidia card & a full g-sync monitor, disable v-sync and use g-sync only.
if you have an nvidia card and a g-sync compatible or freesync monitor, disable vsync and use freesync or g-sync, either is fine.
If you have an AMD card, and a g-sync compatible or freesync monitor... disable vsync and use freesync.
If your monitor doesn't support either g-sync or freesync, use vsync or fast v-sync (if available).
Full G-SYNC is better, IMO, but only a little and mostly people won't notice that small difference. sorry for the rambling answer, i know it's a bit muddled but i'm running on just a couple hours sleep today and my brain is frazzled ;)
thatdeaththo@reddit
Nvidia advises to use G-SYNC with VSYNC on in the Nvidia Control Panel. Here are the recommended settings from Blur Busters.
https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/
PhotoProxima@reddit
Other way around. The monitor refresh rate is adjusted to match the frames produced by the computer.
bertrenolds5@reddit
Cap it at 75, if not double your native refresh rate. V sync off. You should be watching your 1% lows
Mrcod1997@reddit
If you are on an Nvidia card, there is an option for fast vsync in the control panel under 3d settings. Try that.
Careful-Inspector932@reddit (OP)
Unfortunately i'm not, team red here
AnxietyPretend5215@reddit
If your monitor's refresh rate is 75 and you set a frame rate cap of 60, that should minimize or remove tearing.
Supposedly V-Sync is also supposed to help smooth out frame times or something? But at least for Nvidia, if you have G-Sync and V-Sync enabled, my understanding is that V-Sync won't even activate unless you go over your monitor's refresh rate.
So having V-Sync on at the same time is basically just functioning as a fail-safe because frame limiters aren't perfect. It sounds like in your situation v-sync likely won't add anything beyond being capped 15fps below your max.
Can't speak to AMD software/driver wise.
airmantharp@reddit
V-Sync does vertical synchronization... and that's it. Different effects at different framerates.
If all of your frames are faster than one monitor refresh cycle, it can smooth out effective frametimes - i.e. what is shown on the monitor, by enforcing a limit.
SeaBet5180@reddit
So if I have a 240 hz monitor, I don't need vsync on? I usually crank everything to max in games and am running well below 240, let's say in arma 3
strawlem7331@reddit
Sort of, and it's a lot - let's say you have a 240hz monitor with a response time of 3ms displaying a game at 120 fps, and everything else is equal; that means your monitor is displaying 1 frame every 6ms, which introduces stutter like if you played a 30 fps game at 60 fps, only smoother, because 240hz has a refresh rate many times higher than 60hz, and you may or may not notice depending on whether you have freesync/gsync enabled and/or your fps/refresh rate. By doing this, you also start to increase the latency in your monitor response times. This is because you are forcing your monitor to display the same frame more than once.
For example, if you run a game at 30 fps on a 240hz monitor, the game will look smoother than on a 60hz because it is showing the 8 frames at a refresh rate much faster than a 60hz can show 2; however, because the 60fps monitor only presents 2 frames per refresh rate, the input latency is much less than the 240 hz. This is what baffles me about "competitive" gaming monitors. It's a marketing scheme targeted towards gamers and it's very misleading.
So what happens when you hit an fps not divisible by 240hz? You start getting partial frames or an odd number of frames. This is normally what people talk about when they mention screen tearing in games. Take the same hz from the previous example and let's say you are running a game at 90fps. That means the monitor is trying to present about 2.7 frames every time the monitor refreshes; this is where screen tearing starts taking effect.
Since fps fluctuates constantly while playing a game, you technically can have screen tearing and stuttering at the same time because the current fps is not evenly divisible by the monitors refresh rate AND the monitor has to render multiple frames every refresh.
This is also partly why I won't go above 144hz at 4k and why 60fps at 4k is sort of a goldilocks zone - it makes no sense today because most gpus will struggle to display 4k (using ultra settings) at 120fps, let alone 144.
The nice thing about 144hz monitors is that it displays 2 frames per refresh rate at 72 fps allowing me to have silky smooth gameplay as long as I can meet or pass it AND still have relatively low input latency while gsync with fast vsync will throw out any extra frames in addition to making the gameplay unnoticeable from 144fps at any fps between 72 and 144.
You can argue the same thing about 60 fps with 120 hz, but it's not as smooth as 144, and the tism makes me see red with any stutter... so with that being said, your mileage will vary
AnxietyPretend5215@reddit
I think it's something that comes down to tolerance and preference. Also, some form of VRR (FreeSync, G-Sync, or alternatives highly recommended).
As long as you're able to maintain a consistent frame rate within your monitors VRR range (ex. 48hz - 240hz) and don't experience large jumps in frame times due to dropping from like 120fps to 60fps for example you should be mostly safe. But the opportunity for screen tearing is present.
Honestly, there's no harm in trying the no V-Sync approach to see if it vibes with you. If not, it's pretty quick to get the blur busters method set back up.
strawlem7331@reddit
Not true from the Nvidia side - even with gsync and lower frames you can have tearing. The only thing I've seen that acts like unconstrained fps is fast vsync, which just tosses any extra frames. If you override the application settings, then generally you can turn off vsync in the app and let Nvidia handle it.
grahamulax@reddit
Yup, noticed this with my 175hz monitor as well. I haven't touched settings in a while (a curse kinda, since I love that stuff, but I got a 4090 for msrp) and realized this was happening in newer games that are actually "next gen" in terms of graphics and systems - MH Wilds, Expedition 33 and Oblivion. Vsync fixed it all! I don't use DLSS either, but I use nvidia low reflex not OC mode, and also VRR if it supports it. MH Wilds had the most tearing easily, but Vsync was the way.
Another fun feature in the nvidia app I never used: when I was in VR with the Oblivion remaster, I was getting hot and was gonna open my window up when I realized my GPU was hitting 85C (which is when the GPU goes into low power mode). So, manual fan adjustment to 2500rpm when I play games, and by DAMN it went to 45C, which keeps it out of low power mode, which gives me more frames.
AnxietyPretend5215@reddit
You are correct, screen tearing is still possible with just VRR and a frame cap but it's not as likely to happen.
The way I understand it, it would require a significant jump in frame rate (frame time), for example 120 fps (like 8ms frame time?) to 60 fps (16ms frame time), for something like that to occur. It's a whole interaction between the GPU > Driver > Display pipeline that I'm too stupid to understand intimately.
That's why I specifically mentioned minimizing it. Some form of V-Sync being enabled is the safer option, but I've had mine off for a long time now without any tearing.
BonsaiOnSteroids@reddit
A refresh rate of 75hz and 60 fps will literally lock you out of ever reaching 75hz with v-sync enabled. V-sync drops to the next divisor if your fps stays below your refresh rate for too long. So you will always run at 37.5 Hz and FPS, assuming you never fall below 37.5 FPS
AnxietyPretend5215@reddit
Triple buffering (or driver buffering) prevents “lock to half”. The old “divisor lock” (75 Hz → 37.5 Hz) only happens with plain double-buffer V-Sync when your GPU is so slow it misses every v-blank and gets forced into exact fractions of the refresh. Modern PC games and NVIDIA’s control panel will use triple buffering (or more) so the GPU can keep rendering ahead.
I regularly use either NVCP or RTSS to limit my 240hz monitor to 60, 80, or 120 FPS even with V-Sync enabled. Depending on the game, I will follow the blur busters approach and I've never encountered what you're describing right now.
Biscuits25@reddit
It can definitely help with frame timing. I have a freesync monitor and I don't get any tearing in No Man's Sky, but if I turn it on, the game is noticeably smoother. If I don't have it on, I get occasional hiccups that are really annoying. Every game is different though; I usually try to keep it off as it does cost a little performance.
AnxietyPretend5215@reddit
Yeah, I've done the whole blur busters thing and I know they've done extensive testing but it definitely feels like a game by game type situation.
Mp11646243@reddit
What GPU are you running? And what resolution is the monitor? V-sync will be much worse if you have a low end gpu and are trying to render in 4k
Careful-Inspector932@reddit (OP)
Monitor: 2K 75hz
GPU: 7900XT
CPU: 5800X3D
RAM: 32GB DDR4
(playing in 2k)
resetallthethings@reddit
by 2k you mean 1440p, because that's what is typically referred to as 2k
you have a good system, you should get a better monitor
I suspect your monitor doesn't have built in Freesync, hence seeing all the screen tearing without vsync enabled. I would spend a little money on a better 1440p, higher refresh rate (144hz or better) monitor that supports freesync.
I have been playing on AMD systems with freesync enabled monitors for the past 3 years and have never seen any screen tearing
Careful-Inspector932@reddit (OP)
Yes, I meant 1440p, and no, it doesn't have FreeSync integrated - it's only activatable through AMD Adrenalin
resetallthethings@reddit
then it isn't - the monitor needs to support FreeSync in order for the driver to do anything with it to reduce screen tearing/match the monitor
PuzzleheadedTutor807@reddit
If your refresh rate is 75hz, turning on vsync will cap your fps at 75 to match the monitor. That is literally its only job: to sync the video with the screen. Higher refresh will always feel a little smoother.
Careful-Inspector932@reddit (OP)
And if some games run between 60 and 75? Should I cap at 60?
PuzzleheadedTutor807@reddit
Once vsync is on, it will override any frame cap you have implemented to try and match the refresh rate. Of course this only works if the frames can go higher than the refresh. If they are lower, it will just try to keep them in sync with the refresh, so half a frame does not arrive before the update and half after (which is what causes the tearing), by delaying frames. This could create slight (but often noticeable) input lag.
Zomb1eMau5@reddit
You should cap your FR a little below the max hz of your screen, e.g. 140 FPS for 144hz
bobsim1@reddit
Basically correct. But I'd rather also set the refresh rate to 60. Vsync also always adds some latency; that's why people disable it.
Nogflog@reddit
I have terrible screen tearing on Oblivion despite my FPS cap at 60 and refresh at 60 Hz (without Vsync)
What is the issue?
KING_of_Trainers69@reddit
If you have an adaptive sync (freesync/gsync) display you should cap at 58fps. If your display is a bog standard display you should either use Vsync or put up with tearing.
nipple_salad_69@reddit
don't cap at your monitor's max refresh rate, you're gonna have a bad time. if you wanna limit your framerate at your monitor's max, use vsync
Nogflog@reddit
tyty
censors_are_bad@reddit
The issue is that any time vsync is off (assuming you aren't using VRR like gsync), you WILL get tearing.
xxxTheBongSquadxxx is highly upvoted but their main claim is completely wrong, unless they mean "I'm personally not bothered by the constant tearing when the FPS is lower than refresh rate", which would be reasonable but I doubt anyone is understanding it that way.
The behavior xxxTheBongSquadxxx describes is what you would expect when VRR is on but allowing tearing rather than doing vsync. (You can control this with settings in the driver, "Enhanced Sync" for AMD, "GSYNC + VSYNC = On" for nVidia if I remember correctly.)
Highly upvoted but misleading/incorrect comments have become quite common on reddit in the last couple years or so (right about when the API changes hit).
Nogflog@reddit
Yeah I had initially just commented 'No?' to BongSquads comment, but I reframed it to get an explanation. Thank you for confirming my suspicion. I'm just gonna play with Vsync on lol
NoName2091@reddit
Are you getting 60fps?
Nogflog@reddit
Most of the time, but it will frequently dip in the over-world. Is that the issue?
NoName2091@reddit
No, this has nothing to do with FPS. If frames aren't in sync with refreshes - you get tearing with FPS both below and above refresh rate; if frames are in sync with refreshes - you get no tearing with FPS both below and above refresh rate.
From another comment.
Nogflog@reddit
so basically, use VSync is what ur saying?
phoenix4ce@reddit
Try capping at 59. It's purely anecdotal but I've found capping a frame or two below your actual refresh rate can make the difference. But I also use Gsync so I'm not sure if it'll be different in your case.
Nogflog@reddit
ill give it a shot thx
PuffyBloomerBandit@reddit
the fuck it is. screen tearing constantly happens regardless of your frame rate with vsync disabled. some people just don't notice it because they've been playing without it for so long.
CubemonkeyNYC@reddit
Please delete this comment. It's entirely incorrect.
sdcar1985@reddit
It tears regardless sometimes. If Vsync isn't on, I get tearing in every single one of my games and it drives me nuts.
tyrannictoe@reddit
I have never seen a blatantly wrong comment with this many upvotes. You can get screen tearing at 29 fps on a 60Hz display.
burninatorist@reddit
You need V-sync turned off for your Variable Refresh Rate tech in your monitor to work (they can conflict with eachother). Some people say you need vsync on for VRR to work, they are wrong.
crazydavebacon1@reddit
Sucks when you can’t get 60fps anymore huh😂
DivineSaur@reddit
This isn't true, you can still get tearing with vsync off even if below refresh rate.
Careful-Inspector932@reddit (OP)
preamble: I run mostly every game over my monitor's refresh rate (w/o upscaling) except for some new entries like Oblivion Remastered, Stalker 2 or modded Cyberpunk 2077
I tried 75hz + 60 cap -> exactly what you said: no screen tearing.
Unfortunately it was not as smooth as before (I was playing at 60hz + 60 cap).
So I came to this conclusion: 75hz + no cap + v-sync on, which means no screen tearing and the best smoothness, except in new games where fps goes from 75 to ~60. I don't know what to do in that case
AvocadoMaleficent410@reddit
Totally correct, I have 4k 240hz. There is no way any hardware reaches that limit, even in older games.
antimuggy@reddit
You think there’s no hardware capable of playing any game at 4K 240+fps?
iAmBalfrog@reddit
Without generated frames, most triple A games will struggle to consistently reach half of that with a 9800x3d/5090, I mean hell Assassins Creed Shadows with that barely averages 70 with MFG 4x off.
antimuggy@reddit
Ok but he said any game, even older ones. This is just not true, not to mention he said any hardware. You know there’s a 5080 right?
iAmBalfrog@reddit
I mean it feels slightly facetious, yes, games like LoL or CSGO might hit 240fps on 4k, but games like Farcry 6, Cyberpunk, Black Myth Wukong, Warzone, Escape from Tarkov, Elden Ring, those, without RT even on will struggle to breach 200hz at 4k, let alone 240hz, with the literal best of the best watercooled/overclocked 5090 and 9800x3d.
The 5080 is worse than a 5090, why would I be using 5080 statistics, it just proves my point further that 4k 240hz is near impossible to reach on games that are at all graphically intensive.
antimuggy@reddit
but you still are not considering the full scope of the question
ANY game. there are thousands of games that will run at 4K 240fps plus on that hardware. Of course some of the most demanding “benchmark” titles won’t. They were never meant to
iAmBalfrog@reddit
I'm going to give you the benefit of the doubt and say you're not trolling, but it feels like it.
You're purposefully ignoring the wider context of, the person who has a 4k 240hz screen, isn't likely buying it to comment on the FPS of Stardew Valley, most people who have a 4k 240hz screen, are likely utilising and playing modern demanding graphically intensive games. I don't really care if Dave the Diver runs at 400fps on a 5090, I do care if Elden Ring does.
If I say this years Sauber Formula 1 car wont beat anything on track this year, you going "Welll akshuallyyy it'd beat a 1l Seat Ibiza on a track" is pointless irrelevant and missing context, I would say purposefully.
Whataboutism isn't big nor clever, if you were unable to read the wider context from the statement about 4k 240hz being unachievable with modern hardware, you are the problem.
bobsim1@reddit
You're purposefully ignoring AAA games of the last 10 years which look as good as new games, and many people still want to play older titles they didn't get to before.
iAmBalfrog@reddit
Am I in a parallel universe, when I read the comment saying modern hardware doesn't hit 4k 240hz, I KNEW he was talking about modernish triple A titles, not a fucking pokemon romhack, not dave the diver, not dinkum, not team fortress 1, not half life 1, I get there are games that do and will run 4k 240hz, it just feels like a land of participation trophies where you have to account for people who take the literal meaning of a facetious reddit comment.
bobsim1@reddit
Its just your choice to only include the newest demanding games. Doom Eternal is AAA from 5 years ago. Looks as good as new AAA games and it ran 4k 100fps on a 2080ti.
iAmBalfrog@reddit
It's not my choice, I never made the comment, if someone says that say, a Ford Focus is too small, because they have 4 children and 2 dogs, I don't feel the need to respond to that person saying "A ford focus is a roomy car for me and my partner who have no kids and no dogs", the context, matters and the context can be easily inferred, when people are saying that modern hardware does not hit the limits of a 4k 240hz monitor, you can infer the context, the games they play, and the resolution they're playing them at pretty easily.
They likely aren't playing Starcraft 2 on low, nor Vampire Survivors on medium, nor Doom on high. They likely are playing the very popular games I mentioned that do not hit 4k 240hz with the modern top tier hardware. Now could his wording have been better, if he said triple A games from the last 5 years, maybe, do I want to live in a world where I need a sign that says "don't jump in canal" when walking next to a canal, no. But I guess you've got to cater to the fucking idiots nowadays.
bobsim1@reddit
You said most AAA games will struggle to get half of 240hz.
iAmBalfrog@reddit
I asked ChatGPT:
Cyberpunk: 106-130
Elden Ring: 120
Starfield: 129
Assassins Creed Shadows: ~100
RDR2: 147
Hogwarts Legacy : 83-97
Microsoft flight sim: 151
Dying Light 2: 200+
F1 24: <160
Star Wars Jedi Survivor: ~100
Horizon Forbidden West: ~146
Spiderman Remastered: 127
BMWukong: 91
Star Wars Outlaws: 97
God of War: 192
About 130fps average across the 15? I get it, "Akshuallllyyyy 130 is bigger than half of 240", but do you get the general picture? Or do I need to draw it for you, mate. In fact, just a follow up: if it makes you feel better, I also asked it what happens if you disable DLSS, and it drops to around a ~100 FPS average.
I also asked it about the 15 top AAA games 5 years before then, so Witcher 3, the original Horizon, Overwatch etc, and the average across those was 127 FPS without DLSS. So I guess akshually higher than half of 240, but I think my point stands.
Vynlovanth@reddit
You’d be surprised how many people buy the latest and greatest and keep playing what they’ve been playing for years.
People play BOTH modern and older games. They need the hardware for modern games but go back to play older games they enjoy too. Shocking I know.
Hardly whataboutism, more like you can’t fathom people play games or use their PC differently than you.
iAmBalfrog@reddit
Mate, I play League of Legends, I play Dave the Diver, but when people say "Oh modern hardware can't run 4k 240hz" I don't presume they're talking about an N64 emulator playing Donkey Kong 64, I assume they're talking about modern graphically intensive games. It would take someone well and truly inept in the space to read "Oh that guy thinks Team Fortress 2 won't run at 240hz 4k"
antimuggy@reddit
There’s plenty of people who just get the most expensive things just for the shit of it. Plenty of people who do that and can’t really afford any of it. Plenty of spoiled kids who do it because they can. Those people are probably not all hardcore gamers who meet your standards.
AvocadoMaleficent410@reddit
I have a 5080 and I can not run Stalker 2 or BG3 or KCD2 at 240hz without DLSS.
THEYoungDuh@reddit
The 2 most popular games on earth get 400+fps at 4k
AvocadoMaleficent410@reddit
Sorry i don't play trash
guthixguthix502@reddit
What a stupid low iq comment by a degenerate individual.
guthixguthix502@reddit
What a stupid comment.
Nogflog@reddit
No?
Mestyo@reddit
It's not about the cost of fps, but about the added latency. V-sync adds a few frames of additional delay between my inputs and movement on the monitor.
It's a lot better to just lock the frame rate (ideally in the game engine, to further save performance) at or just below the max refresh rate.
Plini9901@reddit
It's worth noting that the delay added depends on the VSYNC type and refresh rate. Let's use 120Hz: that's ~8ms for each frame. A single buffer of VSYNC adds another 8ms, double buffering adds 16ms, and triple adds 24ms. Repeat for any refresh rate. For what it's worth, the downside of VSYNC (latency) becomes far less noticeable the higher you go.
AggravatingScheme327@reddit
That's not why VSYNC adds latency. VSYNC adds latency because it allows the CPU to work ahead of the GPU. You queue up 3 frames before VSYNC stops the CPU from queuing work for the GPU to draw.
Plini9901@reddit
It is literally why it adds latency. You're just describing another aspect of the job.
In theory, yes. In practice, even dropping 1 frame off from the frame limiter will cause screen tears. There's a reason VSYNC still exists. Most frame limiters are dogshit and are only usable now thanks to VRR.
A1_Killer@reddit
Why in game engine as opposed to gpu software (eg amd adrenaline)?
Mestyo@reddit
Both works, in fact I believe using both is ideal, but they do somewhat different things.
A limit in GPU software prevents the GPU from sending more than N frames to the monitor. A limit in the game engine instructs the game not to produce more than N frames. This can save game logic cycles, or align with internal timings.
The GPU software only knows how many frames are coming through, while the game engine can apply all sorts of optimisations with the knowledge of a target frame rate.
Lock frame rate in the game to have the PC do less work, then also lock it in GPU software to prevent tearing in case a game can't be locked or "accidentally" produces too many frames.
If I'm on a 144hz monitor, I apply a 142 fps GPU software limit, and a 141 limit in-game.
MDCCCLV@reddit
Why not 144?
Mestyo@reddit
Honestly, I don't remember. I learned at some point that it helps VRR monitors function better. Since the GPU software limit isn't a guarantee, leaving bit of a buffer would make screen tearing virtually impossible.
UtkuOfficial@reddit
Because sometimes gpu software fucks up. For example, when I lock my fps to 85 in Expedition 33 using Nvidia Center, it actually lowers my GPU usage and gives me a mixed 80-85 fps. But when I lock it in game, it's a constant 85.
Use ingame solutions if possible.
Jaybonaut@reddit
Note that V-sync in the Nvidia control panel is required to be on for G-sync to work (and off in-game.)
Bentok@reddit
No? G-Sync works below monitor-refresh-rate FPS, so if you cap it you'll never need V-Sync.
Cap FPS, G-Sync on, Reflex on, V-Sync off is best for latency and frames.
4ut0M0d3r4t0r@reddit
The best for latency is always uncapped. For why V-Sync is enabled in NVCP, here's the excerpt from blurbusters:
Jaybonaut@reddit
I also put Low Latency mode on Ultra, which is said to lower latency for these settings further (going by the description.)
kovnev@reddit
This solution doesn't work for me, as my GPU chills out way too much if I set a 60fps cap. Then when a sudden performance spike comes, I get frame drops while it gets its shit together.
And I really notice the input lag with vsync on, and the tearing with it off. Just wish it was sorted once and for all. 25+ years this has been happening now.
GarbageOffice@reddit
This has been thoroughly tested and the best option is to enable Vsync with Gsync and DISABLE any fps limiter. This always has the best result.
jon553@reddit
Nope. Vsync will always add input latency if you reach your monitor's refresh rate. So ideally, you cap your fps a few frames below your monitor's max refresh rate and then enable vsync + gsync. Battlenonsense thoroughly tested this.
GarbageOffice@reddit
Here
Though I forgot to add that you need reflex on.
jon553@reddit
Yeah reflex is a dynamic frame rate cap. It caps framerate dynamically to give some headroom to the gpu since latency tends to increase when the gpu usage is at 99%. So yes that would work but you can still use an in-game frame limit without adverse effects.
ImBoredToo@reddit
On or "Fast"
getSome010@reddit
I have no idea. So many people do and it looks god awful idk how people play like that
NooTNooTnoX@reddit
I think variable refresh rate is what you need in most scenarios, because it syncs the display to whatever framerate you have, as long as it's within the monitor's VRR range. It has to be supported by both the monitor and the video card; in AMD's case it's called FreeSync.
CrazyElk123@reddit
Gsync + vsync enabled in nvidia drivers fixes it completely for me. No need for ingame vsync.
Sethdrew_@reddit
Yup, nailed it. A lot of people miss this, but when using G-Sync, V-Sync DOES need to be enabled in the Nvidia Control Panel and that's it. OFF while in game.
podrae@reddit
Yup, as someone who despises tearing and has been down this rabbit hole, I can say without a doubt that this is indeed the correct method. Set max fps in the driver to a couple of frames below max along with the above. Vsync helps here and gives no input lag when combined with gsync. The lag comes when it's enabled on its own.
CapitalShoulder4031@reddit
I think that's an Nvidia thing because AMD does not require v sync to be on in the control panel for g sync or free sync to work.
Yelov@reddit
Gsync does work without vsync, but technically you're still susceptible to tearing.
ShallowMess@reddit
I'm salty about v-sync. When Borderlands 3 released it was borderline (sorry) unplayable on my gf's PC, but worked fine on mine. I refunded both copies since I thought the issue was monkey devs. Turns out we just needed to disable v-sync on my gf's PC and the fps went from 5 to 60+. Still have no idea why, but I've been very sceptical about v-sync since.
RiceRocketRider@reddit
I’ve never really noticed tearing before, so I don't turn on V-sync unless there is another reason (like when, for some reason, it fixed a problem with enemies being invisible in Just Cause 2).
Over9000Zeros@reddit
My monitor is G-sync enabled so that requires V-sync to be off.
JackOuttaHell@reddit
All I can tell is that VSync is mostly disabled for FPS shooters since it adds a good amount of input delay, especially in competitive games.
Speaking personally, I've never experienced any kind of screen tearing when using FreeSync/G-Sync (used both AMD and NVIDIA, came from an RX 580 and upgraded to an RTX 2070, but got the opportunity to test an RX 7800 XT)
Maybe it depends on what kind of FreeSync your screen has (correct me if I'm wrong), because besides FreeSync there's also FreeSync Premium (which is the type of FreeSync my LG UltraGear has).
foilrider@reddit
> Speaking personally, I've never experienced any kind of screen tearing when using FreeSync/G-Sync
That is exactly the point of those features, they give the better image quality (i.e., avoid screen tearing) without waiting for extra unneeded frames to draw.
Yelov@reddit
They don't fully remove tearing by themselves. They should be used alongside vsync to get rid of tearing.
GarbageOffice@reddit
You're not using Vsync and Gsync the right way then. You should always enable Vsync combined with Gsync and DISABLE fps limiters of any kind for best results. Including competitive games.
connorconnor12@reddit
Not sure why you’re being downvoted. This is the way
salt-of-hartshorn@reddit
VRR also adds input latency, fyi.
JackOuttaHell@reddit
Didn't notice any input delay while using vrr 🤔 But will check later 🤔
salt-of-hartshorn@reddit
It's very small. On the order of ~3ms IIRC
Colardocookie@reddit
Personally speaking I only get screen tearing with freesync/G-Sync enabled. Been like that with every pc I've ever had. LG G4, Asus PG27AQDP, Gsync alienware laptop and, freesync alienware monitor. Currently have a 5090 but had a 6900xt before that and a vega 64 before that. Always the same.
EccentricFox@reddit
Same, I've never had any tearing issues with Free Sync both with an AMD and Nvidia GPU; I'd put it as like the top feature if buying a new monitor because it really clears up this problem entirely. I even find really dipping down in FPS to like 50 to still look smooth in a certain way.
sledgehammer_44@reddit
Competitive or not.. I go crazy when I see gun flashes tear like a barcode across my screen, especially at high fps.
ThereAndFapAgain2@reddit
Doesn't sound like their display has VRR.
Hellcatty_9@reddit
Yeah I don't know how you guys don't get any screen tearing, I get screen tearing all the time when I disable vsync, even if it's within the refresh rate of the monitor. (I have a Samsung Odyssey with 1440p and 180hz). I also don't have any additional input lag when playing with vsync on, don't know how that is a problem
BisonSafe@reddit
Turn on V-Sync and G-Sync or FreeSync in the Nvidia control panel or the AMD equivalent, lock your frame rate 2-3 fps under your monitor's refresh rate, and turn off V-Sync in game.
Enabling V-Sync and G-Sync (or FreeSync) in the control panel while disabling V-Sync in-game prevents conflicts between the game engine and your monitor’s adaptive sync technology. Locking your framerate a few FPS below your monitor’s refresh rate helps avoid input lag and microstutters by keeping the GPU workload consistent and ensuring G-Sync/FreeSync stays active.
bobsim1@reddit
You surely have additional input lag. That's how vsync works. You just don't notice it. One frame is 16ms at 60hz or 6ms at 180hz. Vsync delays the frames to make sure each one is complete.
Elliove@reddit
The graphics card doesn't send incomplete frames, this is not how it works. VSync makes the card wait for VBlank, so the monitor does not change the frame it's displaying during the refresh cycle. This is where the delay comes from, and VRR pretty much makes VBlank dynamic, so every frame the card finishes is ready to be displayed right away. This is why on VRR displays there's no noticeable input latency difference between VSync on and off, and that's kinda the point of VRR, it was made to make VSync work better.
bobsim1@reddit
You're right. Vsync is a fix for a problem which VRR negates completely. It's the monitor that makes the frames incomplete by switching to the next frame when it arrives despite the earlier frame not being fully shown.
Elliove@reddit
That's the thing - VRR does not completely remove tearing. VRR was created to be used with VSync, not instead of VSync. With VRR on and VSync off, tearing is reduced, but not removed completely - you still need VSync for that. Check out this, under "Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn’t G-SYNC suppose to fix that?" - there are all the explanations and examples, and it applies to FreeSync just as well.
RatherShrektastic@reddit
It's been 8 years since the original blur busters article about this, yet we still get the weekly v-sync post that brings out dozens of people confidently spreading misinformation. This thread made my blood boil. Holy crap.
Elliove@reddit
I believe the biggest issue is that people assume that FreeSync is way too different from G-Sync, thus disregard the G-Sync article.
desert_vulpes@reddit
Thank you!! I didn’t understand this and wasn’t able to word it to find the answer. I’ve had a 4080 for a couple years and despite having a GSync monitor and being able to throw far more than max frames, I’d still get tearing without VSync. This makes so much more sense.
nonton1909@reddit
You have input lag, you just don't feel it. (Google it if you don't believe me.)
R1ddl3@reddit
Variable refresh rate eliminates the problem. Pretty sure your monitor would have gsync/freesync compatibility, maybe you just haven't enabled it?
makegr666@reddit
Gsync or Freesync works with Vsync enabled in your GPU control panel, and limiting your FPS 2-3 fps below your monitor's refresh rate: 141 if you have 144hz, 177 if you have 180hz, and then disabling Vsync ingame.
Also, disable triple buffering in the control panel, and you'll never have screen tearing (except in super duper rare cases - only one game I've ever played gave me it) nor input lag.
resetallthethings@reddit
Are your display settings set correctly in windows, in your graphics driver and on your monitor itself?
you need to make sure your monitor is set to max hz in windows, often it will default to 60hz even if a higher mode is available.
On the AMD side I think it typically enables freesync by default. Not sure about Nvidia, but neither will be enabled, or at least used, if the monitor itself is not set up to use gsync or freesync respectively.
thatdeaththo@reddit
Nvidia advises to use G-SYNC with VSYNC on in the Nvidia Control Panel. Here are the recommended settings from Blur Busters.
https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/
BisonSafe@reddit
What I do: in global settings in the Nvidia panel I enable V-sync and lock my framerate 2-3 frames under my screen's refresh rate. I also enable G-sync or AMD FreeSync. In game I leave V-sync off. This way you have no tearing and also no added latency from V-sync.
Striking-Variety-645@reddit
I have vsync on in my nvidia control panel but vsync off in game, and also gsync enabled both in the monitor settings and the nvidia panel, and it works perfectly - 0 input lag and 0 screen tearing.
TechWhizGuy@reddit
Because we have VRR
karmazynowy_piekarz@reddit
I lock my games at 120 FPS even though I run a 5090. My TV just can't handle more, so there is no point.
I don't need vsync when I'm within margins.
super-loner@reddit
Basically OP needs to experience modern high refresh rate display with VRR tech.
TechWhizGuy@reddit
Top comments are all wrong or poor
Economy-Regret1353@reddit
When you have gsync/freesync, V-sync doesn't matter.
GarbageOffice@reddit
It does because when you use it together, it keeps the fps slightly below your refresh rate, which gives you no tearing + the least input lag.
CapitalShoulder4031@reddit
Just limit your FPS to 3 below your native Hz in the control panel. No V-sync needed.
thornierlamb@reddit
You still need it for gsync to work properly. If you don’t use it you just get a worse experience with the same input lag.
SnooDucks3047@reddit
If you put Low Latency Mode on Ultra, it will limit your fps to 3 below your native Hz.
Young_420@reddit
I can only speak for nvidia cards. This is a bit of a rabbit hole, but the general advice is this: G-Sync is good but cannot eliminate screen tearing on its own. So the setup should be: gsync on, vsync on (in the Nvidia control panel, off in game), then limit your fps 3 frames below your screen's max refresh rate. You limit the frames because the latency from vsync will only take effect if your fps goes over your monitor's max refresh rate. Note that vsync will increase latency at any fps if gsync is not active.
ConsistencyWelder@reddit
Vsync takes too much of a hit to performance. It's made obsolete by FreeSync imho anyway.
dorting@reddit
You should cap to 3 fps less than your monitor's refresh rate and use freesync; freesync gives problems when your fps exceeds your monitor's Hz.
swaggalicious86@reddit
Vsync adds input lag which makes games feel bad to play. I sometimes get a bit of screen tearing but it doesn't really bother me
eurosonly@reddit
Get you a monitor with that Mach 4 refresh rate. That's the part nobody's told you.
green9206@reddit
I cannot imagine playing without vsync.
Frizz89@reddit
You will get vertical tearing no matter what, even with VRR, while V-Sync is OFF - it will just be less noticeable.
Ideal setup for a tear free experience is Gsync + Vsync + Nvidia Reflex/Boost + Triple Buffering and whatever the AMD equivalent is.
cheeseypoofs85@reddit
Vsync is pretty much a thing of the past. It's not needed for monitors with VRR technology, which is most of them these days. It's also used for frame generation in some scenarios.
ghostfreckle611@reddit
Your monitor screen has to support Freesync/G-Sync to take advantage of it.
BoardsofGrips@reddit
I have a 360 hz monitor, I force V-Sync off. One less variable that can cause problems or latency
aMapleSyrupCaN7@reddit
I don't know if my info is out of date, but after watching dozens of videos when I got into pc gaming, I'm pretty sure you can just use FreeSync/GSync (granted, GSync can be more expensive) instead of V-Sync, which would solve screen-tearing without introducing input lag.
So if you get the right monitor for your gpu, V-Sync is just a non-optimal solution.
kovnev@reddit
It's improved over the decades, but I swear I can still notice mouse lag with vsync on, compared to off - in almost all fps games.
My competitive days are long gone, so I no longer care as much, and prefer no tearing.
SirAmicks@reddit
So I have a question while I’m here. What is the advantage of Freesync if you’re using a high refresh rate monitor? Does it only help for games that go above the monitor’s refresh rate?
Droid8Apple@reddit
Does your monitor support freesync? It usually needs to be turned on within the monitor's settings (not from Windows). Use the buttons to look through and make sure it's on.
Also good to make sure you're using a DisplayPort cable as opposed to HDMI, unless you're positive the monitor's HDMI version, GPU's HDMI version, and HDMI cable all support your desired resolution at the desired refresh rate.
Lastly - I'd highly recommend giving life a try with your Adrenalin drivers installed as "driver only" (drop-down box of additional options when installing the drivers). I spent 6 weeks of troubleshooting when I switched to AMD because I was having an unbelievable amount of issues both in and out of games. Turned out to be Adrenalin causing all of them. I've been trouble-free for over a year.
nonton1909@reddit
V-sync creates input delay, so for competitive games it should always be off. For single player games it doesn't matter too much I guess, but I also turn it off almost always. And about screen tearing - if your PC is powerful enough to run the game properly there won't be any screen tearing and everything will look smooth. If turning on v-sync makes the game smoother it means you have frame drops or long frames.
Main-Society4465@reddit
What you want to do is lock your framerate exactly 3 frames below whatever your monitor's refresh rate is.
You can look up technical reasons for this.
"1. If you're using G-SYNC and your framerate can be sustained above your current physical refresh rate 99% of the time, leave LLM and Reflex off and set a manual in-game (lowest latency) or external (steadiest frametime) limiter a minimum of 3 frames below the refresh rate to keep G-SYNC active and within its working range.
You also won't get the scanout lag that comes with V-sync this way. (or as much)
DAlucard420@reddit
Honestly no idea what G-Sync really does, but I keep it off because it causes most of the games I play to stutter really badly.
acewing905@reddit
I usually cap everything to 60 FPS on my 60 Hz monitor and make sure the settings are dialed in so that it never goes below. Even more than screen tearing, I despise frame rates that fluctuate. They drive me mad
However, recently I learned that screen tearing isn't always a foregone conclusion either. Just a short while ago I was playing Assassin's Creed Shadows, and because my GPU wasn't totally up to the job with the settings I wanted, as an experiment, I turned off V-sync and lowered the FPS cap to 40. My conclusion has been that, at least in this game, screen tearing is negligible. It's noticeable occasionally like when synchronizing a viewpoint, but other than that it's been smooth sailing. So at least it's not important to the same extent in all games
zachjd-@reddit
They make it very very confusing for gamers. I don't blame anyone if their settings are not correct.
Psicrow@reddit
The one downside to vsync is that it can lead to a small amount of input lag. If a frame is delayed to align with your refresh rate, that is a few extra ms before your input translates to an action. Very minor nowadays, especially with a 144hz monitor, but it was still very significant with 1080p 60hz monitors.
Rawjent@reddit
V-sync causes input delay and most people have it off because pc players are on 240hz by now, which most games even on a decent build won't hit even at 1440p let alone 4k.
Silly_Personality_73@reddit
Games like Zero dawn and FBW always tear without Vsync no matter if it's in Gsync range or not, so can't do that. Wish I could.
geminimini@reddit
Screen tearing can be fixed by capping fps to be just under the refresh rate of the monitor. Vsync introduces input lag. It's a no brainer to use the former method if you're competitive
GarbageOffice@reddit
No. If your monitor has Gsync/Free sync, the best method is to keep it on with Vsync enabled. It has been tested and compared to all other combinations and it has the best results in terms of input delay and tearing. You have to disable any fps limiters though.
Bentok@reddit
Source? Because that's not true, GSync with a frame limit is enough.
GarbageOffice@reddit
This guy tests and explains the combinations https://youtu.be/5mWMP96UdGU?si=582FfhnuXa5FfAOU
CapitalShoulder4031@reddit
I believe that's an Nvidia/G-sync thing. AMD and FreeSync don't need v-sync on in the control panel for FreeSync to work.
Elliove@reddit
Tearing has nothing to do with FPS, so capping it doesn't fix tearing. With VRR and FPS cap tearing can be reduced, but only VSync can remove it completely. With FPS cap alone, it's tearing galore without VSync.
MrMunday@reddit
People like seeing bigger numbers and they don’t notice tearing coz they don’t really notice anything.
Bigger number better. So put up the counter and turn off vsync, voila! My computer is more worth it coz number higher
ThereAndFapAgain2@reddit
Because a lot of people are using VRR these days, even cheaper monitors support it now, and with VRR screen tearing just isn't a thing.
CapitalShoulder4031@reddit
Cap your FPS at 3 below your native Hz. So if you run 144hz, set it to 141. G-Sync/FreeSync doesn't kick in at the max monitor refresh rate or within 2 below the max. So if your FPS fluctuates between 140-144, G-Sync and FreeSync cut in and out constantly, potentially causing stutters.
ThereAndFapAgain2@reddit
Yeah, totally agree with this, had a global FPS cap at 238fps for years.
CapitalShoulder4031@reddit
If you are 240 then it needs to be 237 fps :P. At 238 your g sync is cutting off.
ThereAndFapAgain2@reddit
238 and even 239 work perfectly well on my monitor, but I have an actual Gsync monitor, not just a "Gsync compatible" monitor, and Gsync works all the way up to the max refresh rate of the display.
The reason for capping on a display such as mine is that frame rate caps often don't reliably cap exactly to that frame rate, so capping to 60fps, sometimes you will get 61 or 62 fps or even 59 or 58 fps.
The issue when it comes to VRR is not the times it goes a frame or two below, but the frame or two above, since that knocks it out of the VRR window.
CapitalShoulder4031@reddit
Very interesting. The rule for g sync or free sync (regardless if g sync ultimate or compatible) was always -3 I thought. Unless g sync now stays on at max refresh, I don't see how g sync would still be working for you if you sit at 238 FPS.
ThereAndFapAgain2@reddit
So, as I understand it, proper Gsync works from 1hz all the way up to the max refresh rate the display is rated for. So on a 240hz display, gsync is still active at 240fps; it only stops being active at 241fps.
This is why capping to 238 will be just fine because you are giving a 1 frame buffer, then a second frame as security.
I've been doing this for years with my gsync displays.
On my ROG Ally though, I have taken to giving a 5fps buffer from the max refresh in order to be sure that FreeSync will work properly. Gsync is definitely the better and more sophisticated tech when we are talking about proper gsync, since that means the display has a dedicated gsync chip managing the VRR.
CapitalShoulder4031@reddit
Interesting. Blur Busters stated a few years ago that -3 was required. Maybe gsync has improved since then.
ThereAndFapAgain2@reddit
Having used both Gsync and Freesync as well as "gsync compatible", I can confidently say that proper Gsync is basically night and day over everything else.
But that's pretty much true with all Nvidia tech, it's significantly better than the competitors, but you have to pay an increasingly higher price to have access to it.
CapitalShoulder4031@reddit
You should try a g sync ultimate monitor. Even with my 6950xt the ultimate felt way better than my normal g sync monitor with an Nvidia card. I can only imagine Nvidia card AND g sync ultimate.
ThereAndFapAgain2@reddit
The 6950XT does not support Gsync Ultimate, but the display you were using might have been a "gsync compatible" display, which typically means it also supports FreeSync, and just through being a more modern display it seemed to perform better.
The only way to experience actual Nvidia tech is to own Nvidia hardware for the most part, and again that is why they charge so much.
They do have the best tech, they know they have the best tech, and they charge based on those two factors.
CapitalShoulder4031@reddit
When I get my 5090 here in a few months I will definitely have to check it out lol
Elliove@reddit
Correction: you can still get tearing with VRR and FPS within VRR range when frame times of separate frames go outside of VRR range. You still need VSync to remove tearing completely.
ThereAndFapAgain2@reddit
Right but the second a frame is outside of the VRR window, VRR is no longer in use so you aren't really getting tearing "with" VRR, you just aren't staying inside the VRR window.
Honestly though, I hate how there is a window at all. Actual gsync works all the way down to 1hz and all the way up to the max refresh rate of 240hz on my monitor, but on my TV, which is 4k 120hz but only "gsync compatible", it's only from 48hz up to 120hz.
Thrimmar@reddit
VRR in general replaces v-sync, as it makes sure that your screen and the fps are not mismatched. If you don't wish to use it, I would recommend locking your fps so your screen's refresh rate is an exact multiple of it. Example: I use a 240Hz screen, and I like to lock my fps to 120, 80, 60, or 48, as they give even frame pacing on my screen.
People that use 144Hz screens often hate on 60fps as it looks too laggy, but if they locked the fps to 48 or 72 it would look smoother.
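Those caps are just the refresh rate divided by a whole number, so every game frame is held for the same count of refreshes. A small sketch of how you could list the candidates (hypothetical helper, not from any tool mentioned in this thread):

```python
# Caps where the refresh rate is an exact multiple of the FPS, i.e. even
# frame pacing on a fixed-refresh display.
def evenly_paced_caps(refresh_hz: int, min_fps: int = 30) -> list[int]:
    return [refresh_hz // n for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(evenly_paced_caps(240))  # [240, 120, 80, 60, 48, 40, 30]
print(evenly_paced_caps(144))  # [144, 72, 48, 36]
```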
Elliove@reddit
VRR does not replace VSync, it's made to be used with VSync. With VRR on and VSync off, you can still get tearing
Shap6@reddit
Only if your fps goes above or below your monitors VRR range
Elliove@reddit
No, FPS doesn't matter, it's about frame times. Here you can see examples of tearing with FPS within VRR range.
Bentok@reddit
That's all well and good, that "sudden frame time variance" can technically cause tearing, but funnily enough I've literally never had that happen so... yeah, I'll stay with GSync on, VSync off.
CapitalShoulder4031@reddit
Maybe for Nvidia but not for AMD FreeSync. I've been using v-sync off in the control panel and FreeSync on for the last 2 years, with no issues and no tearing. Gotta limit your FPS to 3 below the native Hz though, to stay in G-Sync/FreeSync constantly.
Elliove@reddit
It works the same for G-Sync and FreeSync. FPS -3 is bad advice, because it doesn't take the refresh rate into account; better stick to the formula used by Special K, which is refresh-(refresh*refresh/3600), i.e. 224 FPS on 240Hz.
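For reference, that formula written out as a quick calculation (this only reproduces the expression quoted above; whether it beats a flat -3 is the argument in this thread, not something the snippet settles):

```python
# Cap = refresh - refresh^2 / 3600: the headroom scales with the refresh
# rate instead of being a flat 3 FPS.
def vrr_fps_cap(refresh_hz: float) -> float:
    return refresh_hz - (refresh_hz * refresh_hz) / 3600.0

for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> cap at ~{vrr_fps_cap(hz):.1f} FPS")
# 60 -> 59.0, 120 -> 116.0, 144 -> 138.2, 240 -> 224.0
```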
CapitalShoulder4031@reddit
-3 has been a staple in the G-Sync space. Why would it be bad advice? Have you seen Blur Busters' review on G-Sync and how it actually operates in real time? They use extremely high refresh rate cameras to record every split frame on a screen for their testing. They have confirmed -3 FPS has less stutter.
Elliove@reddit
"Has been" exactly, because people didn't know any better. These days it''s the formula I quoted.
Demywemy@reddit
I get zero tearing by using Freesync with Vsync off.
Elliove@reddit
Then keep it this way, and enjoy your games. I did say that you can get tearing in such a scenario, I didn't say that you absolutely must have tearing. This is about frame times - as long as they are stable and stay within the VRR range, VRR will keep the tearline out of the screen. It's for the frames that don't make it within the frame time window that you need VSync to remove tearing - and only those frames will have slightly increased latency as opposed to VSync off.
Demywemy@reddit
VRR means FPS is synced with the screen's refresh rate. You shouldn't be getting tearing unless you're outside of the VRR window, either below or above.
CapitalShoulder4031@reddit
I don't think the person realizes that they also need to limit their FPS in the control panel to 3 below the refresh rate. So if you are at 144hz, limit FPS to 141. G-Sync/FreeSync turns off when hitting max refresh and 2 FPS below it. So if your FPS fluctuates from 130 to 144, FreeSync will cut on and off constantly at 142 FPS and above, causing some stutters.
desilent@reddit
Blurbusters says the best setting (at least for NVIDIA) is to force vsync in control panel while simultaneously locking fps in the same cp 3-5fps below your maximum refresh rate.
This is for gsync monitors (aka gsync turned on)
Turn off vsync in game
shinodaxseo@reddit
With Freesync and high refresh rate monitor I don't have any problem of screen tearing
Maleficent_Space_946@reddit
Does Nvidia work with FreeSync as well? Or should I buy an AMD GPU?
shinodaxseo@reddit
Yes it works
l0stIzalith@reddit
I use g-sync with v-sync enabled in nvidia control panel.
Ryan92394@reddit
V sync adds latency.
Moscato359@reddit
I don't use vsync because I don't like input lag. At 165hz or higher, you don't even feel screen tears.
At 240hz, they basically are imperceptible.
awesomeboxlord@reddit
I usually disable it cause it increases input latency slightly
zarco92@reddit
If using Gsync/Freesync and want no screen tearing, you should disable vsync in game and enable it system wide with the Nvidia Control panel, and cap the framerate to a few fps below the max refresh rate. This is tried and true thing for Nvidia cards. For AMD cards I'm not sure if it works the same.
ahandmadegrin@reddit
If you're using frame gen v sync has to be disabled. That's not an issue if you have a gsync/freesync monitor since that tech matches your monitor refresh rate with the game's fps.
sliiiiiimmmmm@reddit
I'm no expert but in most cases it adds latency. As some have pointed out you're better off limiting frame rate.
SirThunderDump@reddit
V-Sync can cause bad framerate issues/stutters, and while it can be a good solution if you can cap your framerate and guarantee staying above that framerate, the better solution is usually VRR.
VRR (free sync or gsync) gets rid of tearing, maximizes frame output, and (usually) reduces input lag.
If you’re playing a game at 60 FPS with Vsync, and a single frame isn't finished rendering at display time, the previous frame gets displayed again, which you will experience as stutter. If this happens frequently, you get a very stuttery game that appears to flicker between 30fps and 60fps.
willkydd@reddit
I haven't seen tearing since I got G-sync, so v-sync is off forever for me. Unfortunately that means my monitor has to have a fan which sighs condescendingly under heavy load.
IcemanEG@reddit
Global gsync + vsync on, turn the setting for vsync off in game is the way.
hypnohighzer@reddit
I have V-Sync turned off because my monitors have G-Sync, Nvidia's version of V-Sync. If you have a monitor with FreeSync, that is meant for AMD cards and also does the same thing. It all syncs the refresh rate of the monitor and card.
Jaybonaut@reddit
NOTE: anyone who can take advantage of Nvidia's GSync - you are required to turn on V-sync in your global settings and have it turned off in-game (among other settings.) If you do not have V-sync turned on in the driver and then off in-game then G-sync will not work.
jon553@reddit
Weird, is vsync on by default in the NCP? I've never turned on vsync in-game or in the control panel but gsync has always worked for me. I'm pretty sure I have it off in the control panel.
Jaybonaut@reddit
Link
I would add Preferred Refresh Rate set to Highest Available and Low Latency Mode set to Ultra.
A number of guides out there mention lowering max frames to 3 fps below max Hz of your monitor, etc otherwise, but I believe with my changes it should be automatic. My example is say Overwatch 2 - I leave the fps limiter off in-game and it will lock it to 157-158 on my 165 Hz monitor. It can easily do 300-400+ otherwise. No tearing.
CapitalShoulder4031@reddit
Sounds like your monitor isn't FreeSync or G-sync compatible. A lot of people have switched to G-sync instead of using v-sync, thus allowing them to keep v-sync off with no screen tearing.
farmeunit@reddit
I never really get screen tearing so it's always disabled for me.
AOEIU@reddit
If Freesync is working you should not get any tearing (unless your frame rate is dropping below your monitor's range, often 40fps).
Personally Freesync does not work for me with 2 monitors. I can't figure it out, but it just doesn't.
This tool lets you easily test if Freesync is actually working. The animation should be totally smooth at 55fps. https://github.com/Nixola/VRRTest
EddieV223@reddit
V-sync is off so G-sync can do its thing.
xl129@reddit
Screen tearing doesn't bother me that much for some reason so I turn it off.
larrylarrylar@reddit
I use vsync in games built for older GPUs because I know my computer will run them at insanely high frame rates if I don’t cap it in some way.
braybobagins@reddit
I have a 7800x3d. Why would I want 1% lows at 160, when I could have 1% lows at 190?
burninatorist@reddit
Has no one heard of VRR? You need V-sync turned off for your fancy Variable Refresh Rate tech in your monitor to work (they can conflict with each other). Some people say you need vsync on for VRR to work; they are INCORRECT.
SynthesizedTime@reddit
I just cap the frame rate instead. Afaik it gives you less input lag that way.
dmick36@reddit
Doesn’t v-sync cap at 60hz too?
useless_panda09@reddit
v-sync will force your gpu to sync up with updates (refreshes) to your monitor’s refresh rate. this is what removes screen tearing since your monitor and gpu are basically in sync. this also means that v-sync effectively locks your FPS to your refresh rate so if you had a higher refresh rate, say 180hz, it would lock to 180hz not 60hz
dmick36@reddit
Well damn I always disabled it and locked the frames to 144hz.
useless_panda09@reddit
That usually also can reduce screen tearing, but the main benefit of locking your fps to your refresh rate is because it prevents your gpu from boosting to a high clock speed and a high temperature unnecessarily.
GrifterDingo@reddit
It depends. I run my games with vsync on and the refresh rate set to 100 HZ in the game settings.
Methyl_The_Sneasel@reddit
Because it MASSIVELY increases input delay, if you play competitive games, input delay is a HUGE nono.
Also, if your refresh rate is high enough, it's barely an issue anyways.
machine4891@reddit
I heard them mostly advising to turn off in-game v-sync in favor of the driver one. They're definitely not playing with constant screen tearing, lol.
jon553@reddit
Most people played with gsync which effectively gets rid of screen tearing in most games without input delay that vsync adds.
jon553@reddit
Most of the time, gsync + an fps cap a bit below the monitor's max refresh rate is enough to get rid of any tearing. If not, then also enable vsync. But never have vsync on without an fps cap below the monitor's refresh rate. Otherwise, it will add significant input latency.
Kicka14@reddit
Monkey see monkey do
BacklogGamingJunkie@reddit
I always limit the fps to 117-119fps on my LG C3 42” OLED since this tv has a max refresh of 120hz. No sense in making my hardware work harder producing frames I’m not actually seeing past 120fps anyways
gljivicad@reddit
I did it by default since the old days when we all had 60hz monitors but some games benefited from having more frame rate despite you not being able to see it (for example jumping higher in cod2). But I never knew vsync locks the frame rate to your monitor refresh rate, I thought it was to 60fps. So I kept turning it off on every game I ever played, thinking it’s helping me not be stuck at 60 lmao
Knarz97@reddit
Most monitors have G or Free Sync now so it’s not needed.
Roemeeeer@reddit
In some games, the input lag can get unbearable with v-sync. So I always disable it and, if possible, just lock the framerate.
Psytrense@reddit
You need vsync for gsync. So many newbs think they're still playing CS 1.6 from 2000 and should play with vsync off.
GolldenFalcon@reddit
I've never noticed screen tearing in the two decades that I have been playing video games on a PC. I permanently have v-sync off.
Telominas@reddit
If you have multiple similar features they can clash with each other. For example, if I run a similar feature both in the screen settings and turn it on in games, they can clash. I think most people turn it off in games because of that. Or, if you don't need it, you're saving resources.
PristineHalf1809@reddit
Poor guy
spadePerfect@reddit
I use VRR and you need to enable VRR on a system level and disable it in games to work properly.
Kooldragon87@reddit
I have a gsync monitor so I don't need vsync
Elliove@reddit
VRR was created to remove input latency and stutters of VSync. Using FreeSync without VSync doesn't even make sense.
resetallthethings@reddit
you keep saying this, but other than the one resource you link, which talks solely about Gsync, I have not seen anything anywhere from the past 5 years which suggests using vsync along with freesync
I have used a variety of AMD systems using cards from 580, 6700xt, 6900xt, 7900xt, 9070xt and 9070 on windows and now linux with displays ranging from 1080p/60hz to 1440p/240hz to 4k VRR enabled Amazon fire TV.
I always have vsync disabled in the driver and game settings and have never seen 1 screen tear across all those configurations.
It's also not as if I don't know what screen tearing is. I am very sensitive to it in fact. Wrestled with it extensively back in the pre-VRR days of the late oughts through early 2010s, and in fact RMA'd an HD 6950 at one point that was reliably producing screen tearing regardless of any configuration.
Elliove@reddit
Ok, here's the author of that article explaining everything, and confirming that everything applies to FreeSync just as well, and that's why there's no need for a separate article and series of tests.
Sure, here's the Special K wiki recommending setting the presentation interval to 1 for VRR.
resetallthethings@reddit
that's from 5 years ago... with some contradictions in what he says to begin with. And I mention the 5 years because both the freesync/VRR of monitors and the driver-level implementation have certainly changed/improved.
neither of those takes away from my point: if freesync + forced vsync was the only "correct" way to do things, why would I have never noticed a single screen tear? And why wouldn't there be a ton more clickbaity youtube videos from desperate content creators educating people on how to "properly" do these things?
I'm not even saying you or the people you are citing are wrong, it just seems like even if they are technically right, there doesn't seem to be enough of a tangible benefit to setting it up that way for the majority of people and setups.
Elliove@reddit
Please, do specify those contradictions, I'll ask some people knowledgeable on the topic to explain things better.
Not really, no. In fact, to be able to use VRR at all, Windows even enables tearing in the compositor. The only thing that kinda helped the case was the introduction of Reflex - it uses a formula that takes the refresh rate into account, i.e. on 240Hz it would be a 224 FPS limit, which significantly reduces the number of frames that go outside the VRR frame time window and, as a result, reduces the visible tearing with VRR on and VSync off. But it can still happen, and totally does.
Probably because you haven't tested things extensively, unlike creators of BlurBusters and Special K wiki. On the upper VRR range, the tearline typically appears near the bottom of the screen (as shown in G-Sync 101 article), and in most games you barely ever look there. If you keep testing, I'm sure you'll run into enough frame time variances to notice the tearing with VSync off.
There aren't any downsides to this either. With VRR and frame times mostly within VRR range, you either get hard-to-notice occasional additional input latency with VSync on, or hard-to-notice occasional screen tearing with VSync off. So if we're talking facts, then to remove tearing completely you want to keep VSync on with VRR. If we're talking how people feel about things, then it might be the same as it is currently with DLAA on Transformer model presets - it has serious issues, quite visible and distracting in many games, but people don't notice any of that, they just praise that stuff.
ime1em@reddit
i don't like the mouse lag
Viriidian@reddit
Input lag. Before I mostly just dealt with screen tearing, with g/free sync now you cap a frame below the refresh rate and get the best of both worlds
Apartment_Latter@reddit
I'm gonna use all the frames i paid for
ItsRoxxy_@reddit
Latency. If you have a GSYNC or freesync display you’ll also never use Vsync since gsync/freesync are just better.
Not_A_Great_Human@reddit
If I don't get screen tearing with it off ....why turn it on?
Skinner1968@reddit
I’ve used G-sync since around 2019 now and haven’t used v-sync with its input lag since
FeuFeuAngel@reddit
Some games run better with or without vsync.
Firm_Transportation3@reddit
Not sure why, but I've never had any issues with tearing, vsync or no vsync.
UnlimitedDeep@reddit
Crazy how a bunch of the top comments are completely wrong
Br41th@reddit
I haven't seen screen tearing since 2017, what monitor you using?
ClerklierBrush0@reddit
Input lag, makes precise shooters unplayable. On valorant I try to double my monitor fps and it reduces tearing so I can keep vsync off
Targetm12@reddit
Because g sync and free sync exists and they eliminate screen tearing without adding latency
coolboy856@reddit
For any possible 60hz users in the thread:
Many monitors can reach higher refresh rates than they are rated for. I have a Samsung 60hz monitor from like 8 years ago that's been displaying at 75hz for pretty much its whole lifetime.
You can do it by modifying settings in the Nvidia control panel, there are lots of tutorials online.
191x7@reddit
Freesync when in the refresh range + fast sync when above. V-sync introduces too much lag.
ZeroCable@reddit
Lots of people have screen tearing when they try to run 1ms response time on their monitors. Usually the tech isn't good enough to play a fast-paced game at 1ms, so I usually run 3ms or 5ms to eliminate ghosting and tearing, then disable V-sync so that I get every frame I'm supposed to rather than letting v-sync delay frames.
Lust_Republic@reddit
I prefer low latency. It also depends on the game and fps. The tearing isn't really that noticeable.
PhotoProxima@reddit
The input lag is unbearable. Fucking awful.
JimmiVP@reddit
V-Sync lowers the fps, so if there are no problems then don't turn it on.
pakitos@reddit
I grew up playing Midtown Madness 2 and Vsync off had a massive difference when running away with and without the gold so I just keep it off ever since.
Dependent_Opening_99@reddit
Why would you want to use v-sync when there is freesync/g-sync? V-sync adds input lag, like a few frames, which is A LOT.
Also, when your pc can't keep stable 60fps, v-sync will drop it to 30fps (considering you are using 60hz monitor) when you could have played with 59fps using g-sync.
jazix01@reddit
V-Sync is one of the first things I disable in any new game. It has a tendency to cause input latency and mouse stuttering.
-WitchfinderGeneral-@reddit
People say it adds latency but I guess I am just not nearly perceptive enough with video games to ever notice this. I use Vsync all the time when I use my computer with the TV since the TV doesn’t have Gsync. I also use my computer for production and if there’s even the slightest amount of audio latency, I’ll go insane but I never seem to mind/notice latency for video games. If it doesn’t bother you and you don’t notice the difference then it’s not even worth thinking or worrying about. The screen tearing is a much bigger nuisance than a few milliseconds in my opinion.
sadsalad21@reddit
v-sync is like a duct tape. fixes one thing, breaks two others.
RolandMT32@reddit
Currently I have a monitor that supports Nvidia G-Sync, which is a variable refresh rate technology. It's able to synchronize the refresh rate with the frame rate from a game (within limits, of course).
DaddySanctus@reddit
I don’t know about AMD. For NVIDIA, I’ve always followed the Blur Buster method. V-Sync ON + GSync ON + FPS Limit in NVIDIA control panel, and V-Sync OFF in-game.
kardall@reddit
Screen tearing occurs when the actual framerate of the game is not perfectly equally divisible by 30.
30/60/90/120 etc..
When there are dips and spikes, you can get partially rendered bits of frames and that is what screen tearing is. It's parts of an image that were rendered that the monitor just can't fully display due to its refresh rate.
That's why some games have an FPS lock now so you can cap it at like 120fps.
You can have v-sync off, and as long as your game is at or above 120fps, the game itself will cap it at 120fps (a faux v-sync if you will).
suki10@reddit
I game on a TV that doesn't support 120Hz and I constantly need V-Sync on.
Maltitol@reddit
I simply cannot stand screen tearing. Unfortunately the cost to avoid it and still have a good gaming experience is quite high. I had to get a GSync monitor that had to refresh at 240hz and I had to get a RTX 4080 to power it. If you don’t care, you don’t care. But I do, so I paid for it.
CrazyKyle987@reddit
I think we all have our own things that bother us. For me it’s screen tearing (like you) and micro stuttering. For others it’s the resolution not matching the screen or lack of AA or anything else.
I think some people might literally not notice and that’s why they have no issues with leaving vsync off or the frame rate uncapped
Mp11646243@reddit
V-sync creates massive input delay. You can limit your frames in game or through a host of other apps if you are experiencing bad screen tearing. Are you using a 60hz monitor or something? V-sync, g-sync, freesync all should be disabled in most situations. Disable adaptive sync on your monitor as well.
Over_Iron_1066@reddit
Vsync = input latency, on mnk you might as well put your mouse in a bowl of jello.
Just get a gsync or free sync monitor.
ComWolfyX@reddit
Its not that they turn it off its that they enable fast vsync, vrr, gsync or some other form of syncing
SandsofFlowingTime@reddit
I may just be an outlier here, but unless it is really bad, I actually don't notice minor screen tearing. If I do notice it, it's super minor and I start to question if I even actually saw it tear
ruet_ahead@reddit
Different settings for different games and different performance results.
Th3AnT0in3@reddit
Screen tearing happens every time your fps does not perfectly match the refresh rate of your monitor (higher or lower), so basically 99.9% of the time when you play games.
But the higher the refresh rate, the less you see it, because each tear is on screen for less time AND a higher fps implies a smaller tearing effect that's less noticeable.
Using V-sync removes the tearing effect by telling your GPU to send a frame only when it's finished AT THE MOMENT the monitor is supposed to display one. So it adds input lag, because you have to wait a little longer to see that same frame.
But G-sync/Free-sync is different, because it's the screen that waits for the next frame until it's ready (the fps has to be slightly lower than your monitor's refresh rate), so you add almost no input lag and you remove the tearing effect.
TyphonNeuron@reddit
I don't care about tearing at all. The more fps the better.
Elliove@reddit
Unless you live in 2025 with the rest of us, because these days we have smart FPS limiters that let you have 60 FPS with the same latency as 1000 FPS (provided your PC is capable of drawing the frames that fast to begin with).
TyphonNeuron@reddit
That's not the point. The point is that OP wonders why people prefer disabling vsync, in order to get more frames at the cost of experiencing screen tearing. An opinion which OP doesn't share, as he would like to get less frames but no screen tearing. And asks other people what their preference is. Hence the thread.
11_Seb_11@reddit
Probably because they own a monitor which supports Nvidia GSync or its AMD equivalent?
sleepytechnology@reddit
I used to get screen tearing all the time in the mid 2010's with my 60hz display. Everyone always told me it was because my framerate was above my refresh rate.
Well nowadays I play on a $300 170hz display with both VSYNC and GSYNC off and in comp games where I hit 400-800fps... No screen tearing. I don't even seem to get it at lower fps. My understanding of it is very confusing but it seems like with high refresh rate displays (at least 144hz+) that screen tearing just doesn't happen? Would love to hear some ideas why I don't experience it anymore no matter what.
LimesFruit@reddit
Vsync adds latency. In some games that is a problem, some not so much.
tATuParagate@reddit
I've tried everything all these comments say and I still get occasional screen tearing, I don't get it. I don't notice enough latency with vsync on to give a shit. I guess it's game by game issue and maybe my low latency settings on control panel and in my monitor settings help it. I'm taking the road less traveled cause yall are crazy
Moscato359@reddit
Screen tearing doesn't even matter if you have a fast monitor, because the tear lasts for less time.
60hz monitor tears for up to 16ms
240hz monitor tears for up to 4ms
f0xy713@reddit
Input latency. It's much better to cap FPS at your monitor's max refresh rate using something like RivaTuner than to use vsync.
DerGeist91@reddit
I think the very first time I disabled it, was when I played the first dead space. There was an incredible input delay on my mouse, and v-sync was the cause of it. That is why I always have it disabled.
steaksoldier@reddit
Adaptive sync is a very common feature on most monitors, kinda eliminates the need for vsync.
IndyPFL@reddit
You just need your fps capped a little below your monitor's refresh rate. V-sync does that but can also introduce input delay, which an fps cap via your graphics drivers or in-game settings usually won't.
Le-Misanthrope@reddit
Others have already stated ways to minimize or get rid of tearing. The higher your refresh rate is, the less noticeable tearing is. So when I switched from a standard 60hz monitor to a 1440p 170hz monitor, I hardly noticed tearing at 100fps+ and only if I dropped lower than that. The other route was to enable Gsync, which is compatible with most if not all modern TVs and monitors. However, if you enable Gsync you then want to cap your fps to slightly below your refresh rate. So for me I can cap it to 115fps on my TV and 165fps on my monitor. You now no longer get tearing. Or avoid all of this and use Vsync. Lol
I still occasionally enable it on story games. Hell, even with the supposed latency Vsync causes, I used to be Diamond in R6 Siege on a 60hz monitor... Obviously the jump to a higher refresh rate felt worlds better. But it did not make my skills better. It just made my eyes bleed less.
netscorer1@reddit
From what I understand, turning V-Sync on kills VRR. If your monitor supports VRR tech, you really don't need V-Sync
CurlCascade@reddit
V-sync adds a bit of input latency since it holds a new frame back to the next interval rather than showing it immediately.
Some people value lower input latency over screen tearing, or just don't see the screen tearing, or only care about the FPS number.
People also copy reviewers, who turn it off because it makes measuring performance harder.
SantasWarmLap@reddit
You need a 120Hz monitor or higher.
SimpleMaintenance433@reddit
In short, V-sync reduces frame rates so people often only use it if they really need to.
Prize-Confusion3971@reddit
Well I have an OLED which means VRR flicker. It's really annoying. My OLED also has a 360hz refresh rate. Not really worried about it so I disable it because I hate the flicker when web browsing
cre3dentials@reddit
I've been on 240 and then 360 hertz for years. I only play games, that hit those frame rates. No matter how hard I look, I just can't see any screen tearing at these refresh rates. It's a different story though. Since vsync introduces a lot of latency, it is never worth using without combining it with adaptive sync. In that case it behaves differently and the latency penalty is minimal.
bananabanana9876@reddit
Latency. They deal with screen tearing by capping fps 1 hz below the monitor refresh rate.
Elliove@reddit
Capping FPS does not remove tearing.
RankedFarting@reddit
If you get tearing with freesync then you are above the actual freesync range of fps.
Vsync can lead to increased input lag. What most people do is activate freesync and then cap their FPS 3 frames below their monitor's refresh rate. In game you turn off vsync. This way you never get into vsync range and instead are always within the range of freesync.
Rasutoerikusa@reddit
I've never seen screen tearing in modern pc gaming even with v-sync off. And the added latency is super annoying, so no point keeping it on
Slow-Secretary4262@reddit
Never enabled vsync or VRR and never seen any visible problem on screen
ZombiFeynman@reddit
Freesync should eliminate tearing. In a very oversimplified explanation you have:
1) Nothing. Your GPU draws to the framebuffer (a part of the VRAM where the contents displayed on the screen are), and the monitor gets its info from it at a fixed rate (the refresh rate). If the GPU is writing a new frame as the monitor is being updated you see part of the old frame and part of the new, which causes the tearing.
2) V-Sync. Your monitor keeps working at its fixed refresh rate, but now the GPU waits for the monitor to finish reading a frame before it writes a new one. There's a wait, so there's an increase in input lag.
3) VRR (Freesync, GSync, etc). The reverse of 2). Now the monitor waits for the GPU to tell it that a new frame is ready, so the pacing is set by the GPU instead of the fixed refresh rate. As soon as a frame is ready it can be displayed (as long as you don't go over the maximum refresh rate, of course).
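If it helps, those three cases can be sketched as a toy timing exercise - all the numbers below are made up purely for illustration, and real presentation pipelines buffer and pace frames in far more involved ways:

```python
import math

# Assume a fixed 60 Hz refresh (~16.7 ms) and hypothetical GPU
# frame-completion times in milliseconds.
REFRESH_MS = 1000 / 60
frame_done = [3.0, 21.0, 35.5, 49.0]

# 1) Nothing: the new frame is shown the instant it's done, even mid-scanout,
#    so any frame not landing exactly on a refresh boundary tears.
tears = [t for t in frame_done if t % REFRESH_MS > 0.01]

# 2) V-Sync: each frame waits for the next refresh boundary; no tear,
#    but the wait is added input lag.
vsync_wait = [math.ceil(t / REFRESH_MS) * REFRESH_MS - t for t in frame_done]

# 3) VRR: the monitor starts a refresh when the frame is ready (within its
#    supported range), so these frames get neither the tear nor the wait.
print(f"no sync: {len(tears)}/{len(frame_done)} frames tear")
print("v-sync waits (ms):", [round(w, 1) for w in vsync_wait])
```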
blacklotusY@reddit
People generally disable v-sync because it creates input lag, as it forces your graphics card to wait for the monitor’s refresh cycle before displaying frames. This is really bad for people that play games, especially online games that require consistent FPS.
Whenever I play any game, the first thing I always do is go into options and change all the settings, including disable v-sync. turn off motion blur, depth of field, reflection and all of those, etc.
Rockozo@reddit
for competitive games if you get way over your refresh rate, the screen tearing is harder to notice.
Ok_Seaworthiness6534@reddit
Screen tearing is very visible on 60/75/100hz monitors; I run a 155hz one and never had to turn on any type of sync :)
THEYoungDuh@reddit
Vsync caps your frames. More frames more better, it's EZ math.
Talking for competitive shooters like CS where having the most up to date information is important
antimuggy@reddit
Ok but he said any game, even older ones. This is just not true, not to mention he said any hardware. You know there’s a 5080 right?
200YRedWine@reddit
I really don't like coil whine at 800fps when my monitor's refresh rate is 144hz.
Pocok5@reddit
We don't get screen tearing with adaptive sync. Check your monitor and gpu settings - and make sure you're using the right port, some monitors don't accept gsync/freesync on all the inputs.
badassbolsac@reddit
because people don’t like to cap their fps even though going above your monitors refresh rate is pointless in my opinion.
Desperate-Steak-6425@reddit
If you have a 60Hz monitor, you leave Vsync off for lower input lag.
Other than that there are cases where you need to go above your refresh rate for FSR FG to get frame pacing right.
heavy-minium@reddit
Most do it for no good reason. Those that have a good reason are either übergamers that need to shave off every nanosecond of potential input/output lag, or those that need to benchmark performance.
janluigibuffon@reddit
I have seen VRR displays that are not as smooth as mine with just v-sync. Always on - admittedly, I don't play competitive games
RichardK1234@reddit
I don't notice screen tearing as much as I notice the reduced input latency from higher FPS.
Granted, I have a 60hz panel and cannot visually see the difference, but you can definitely feel the lower latency.
Moist-Station-Bravo@reddit
Does your monitor support variable refresh rate (VRR)? If so, enable that, and also enable it in your graphics card settings - then you will see why we all do it.