Why do so many monitors not seem to actually carry the G-Sync module? What does "G-Sync Compatible" actually mean?
Posted by Wonderboyjr@reddit | buildapc | 128 comments
I'm guessing it's partially cost or that the module means no Freesync?
DrKrFfXx@reddit
Funny because the very first Gsync monitor was actually a DIY kit.
BagboyBrown@reddit
if your monitor doesn't have gsync or freesync it's cheap af.
InfinitelySub@reddit
G-Sync Compatible just means that it's software-based sync, versus the hardware-based G-Sync module.
Wonderboyjr@reddit (OP)
So kinda just a scummy reuse of the G-Sync name? Why even call it that when it is still just Freesync?
iAmAddicted2R_ddit@reddit
Long ago in the bad old days, you couldn't use Freesync with an Nvidia GPU at all; you had to buy an AMD GPU or do without. So people got very used to the idea that Freesync was the AMD adaptive sync and G-Sync was the Nvidia adaptive sync, and simultaneously, those who cared enough to understand what's going on under the hood were aware that G-Sync is/was a more "premium" frame sync solution than Freesync. When Nvidia finally decided to hitch their wagon to adaptive sync, they wouldn't have wanted to plainly call it Freesync due to the first point, and the second point means they would have been giving up valuable brand cachet if they came up with a new, third thing.
TinkatonSmash@reddit
You're missing one detail. At least at first (not sure if it's still this way), you couldn't use just any Freesync monitor. You had to use an Nvidia-approved one. They would test a monitor to make sure it met their standards, and then enable it in the driver. "G-Sync Compatible" labels were an easy way to know whether the monitor was approved or not.
Wonderboyjr@reddit (OP)
No, I get that. I too immediately associate Freesync with AMD, so I wasn't suggesting Nvidia call it that. But I still think it's disingenuous to use the name G-Sync when it has nothing to do with the technology.
Choconolait@reddit
Considering how Nvidia also sells frame generation under the Super Sampling name, it seems the marketing team at Nvidia doesn't care about how the tech actually works.
ZekasZ@reddit
Huh? Aren't DLSS and frame gen separate technologies?
Just_Maintenance@reddit
When NVIDIA added framegen they turned DLSS into the brand that all their AI-powered features fall under.
DLSS has three components:
- DLSS upscaling
- DLSS frame generation
- DLSS denoising (ray reconstruction)
AIgoonermaxxing@reddit
AMD and Intel do this too, unfortunately. AMD's frame generation is officially called "FidelityFX Super Resolution Frame Generation", not FidelityFX Frame Generation. Same goes for Intel's Xe Super Sampling Frame Generation.
Honestly, I kinda get why Nvidia did it (brand recognition), but with how bad FSR's reputation was prior to FSR 4 I think AMD would have actually benefited from calling it FFG (FidelityFX Frame Generation).
KillEvilThings@reddit
Lol what the fuck no they didn't? Did you read your own post?
They still have "FRAME GENERATION" in the name. Anyone with two brain cells can comprehend that and ignore the technobabble.
Calling FG supersampling on Nvidia's end is deliberately disingenuous.
Affectionate-Memory4@reddit
Yup. Personally I think these should be called DLSS, DLFG, and DLRR.
ZekasZ@reddit
What the fuck that's so dumb
MarcBelmaati@reddit
As far as I understand, a G-Sync Compatible display is a FreeSync display that has been tested to make sure it works with NVIDIA, so all G-Sync Compatible displays are FreeSync, but not all FreeSync displays are G-Sync Compatible.
francesco_lomi@reddit
A secret, more sinister third thing
survfate@reddit
Fun fact: I posted a thread 7 years ago showing there was a way to use Freesync with an Nvidia GPU by using an AMD iGPU as the output. At the time I was quite sure that Nvidia was eventually gonna adopt Adaptive Sync as its software solution, and here we are.
nicholsml@reddit
It's important to point out that it was NVIDIA that did not allow their GPUs to work with FreeSync. FreeSync has always been free for manufacturers and consumers to use.
AMD did help develop Freesync, and at launch only AMD GPUs worked with it... but that was implementation, and Nvidia was always allowed to use it for free. Nvidia finally started supporting it in 2019, I think.
InfinitelySub@reddit
Because it's been rated by Nvidia to hit a minimum requirement for the technology. The actual module can just do it better, but it's $$$ to get there. G-Sync Compatible is good enough for most people.
Wonderboyjr@reddit (OP)
Gotcha. Yeah, I still think it's confusing to use the G-Sync name rather than saying Nvidia Certified or something like that.
EnlargedChonk@reddit
Fun fact: There is also G-Sync Compatible Certified to further the confusion.
So you get:
G-Sync Compatible = you can turn on G-Sync with this display; it should work, but it's not quite up to Nvidia's arbitrary specs, e.g. the range is too small
G-Sync Compatible Certified = you can turn on G-Sync with this display; it should most definitely work and meets all of Nvidia's requirements
"real" G-Sync = has Nvidia's hardware module, should most definitely work, meets all requirements, and has some extra features that may or may not matter.
Not to mention some older freesync "only" monitors from before the g-sync compatible label will also work with nvidia these days. They just predate that label/marketing.
McGondy@reddit
Idk, I agree they want to brand it but using the same/similar name for a different implementation is very tricksy. I get why they're doing it, but it's bloody confusing.
LewAshby309@reddit
It's not.
The technology is called "adaptive sync".
Freesync is what AMD uses to brand all sync technologies for their partners. Gsync is what Nvidia calls all of its sync technologies.
When nvidia decided to support software adaptive sync they named it gsync compatible to have a difference to gsync monitors with a hardware module.
AMD did similar naming schemes (for different reasons) with freesync, freesync 2, freesync premium, freesync premium pro,...
Wonderboyjr@reddit (OP)
Ah yes I meant Adaptive Sync. And I understand G-Sync is what Nvidia calls their sync technology. But in order for that technology to work, it requires their module to be installed in the monitor and the GPU. So calling it "G-Sync Compatible" doesn't make any sense other than from a marketing standpoint.
LewAshby309@reddit
Gsync and Gsync Ultimate are Nvidia's own sync technologies that require a hardware module to run.
Gsync compatible uses the VESA standard "adaptive sync" which is software based. Gsync compatible does not need the hardware module. AMD uses adaptive sync as well and calls it freesync. Adaptive sync/Freesync/Gsync Compatible are software based.
That's the difference. No dedicated hardware for gsync compatible.
ciaranlisheen@reddit
It’s not a software vs hardware thing, afaik both use a similar hardware solution. The nvidia sync module used to be much better than the freesync modules that were available at the time, but then they caught up so nvidia dropped their own one.
HanseaticHamburglar@reddit
Freesync is a software-based solution. The only requirements are that the GPU and monitor support the VESA variable refresh standard. G-Sync was a proprietary hardware module that monitor manufacturers had to pay exorbitant license fees to install in their products.
That is why it was considered better: it was a custom-built hardware solution that originally outcompeted the software-based approach.
It is also by its nature anti-consumer, and I'm glad it's not the standard today.
ciaranlisheen@reddit
The variable sync in standard monitors still uses a hardware based implementation though. It still has a physical chip or silicon in the main scaler that operates the sync.
The only difference is that the nvidia one used to be much more capable and as you said came with intense licence fees.
They are still both hardware solutions
HanseaticHamburglar@reddit
we are talking about gpu manufacturer solutions.
AMD does not manufacture and market chips for VESA variable sync.
Nvidia manufactures and markets gsync modules, increasing costs to monitor manufacturers.
Freesync uses the industry standard hardware.
Gsync does not.
Your argument is stupid as hell. It might as well be "freesync and gsync both need monitors and gpus, they are obviously both Hardware solutions"
ffs
ciaranlisheen@reddit
I don't think you understand what hardware and software mean; it's not just the product vs its capabilities. There are hardware ASICs for the sync modules inside all monitors. If they put a general-purpose CPU in them and then wrote software to run the sync, that could be described as a software solution.
CrateDane@reddit
"Used to be" in the sense that the controller in a monitor can nowadays be just as capable as a separate module. But it can vary between monitors depending on the controller inside.
ciaranlisheen@reddit
Yes exactly, although I don't know if any non-G-Sync variable sync monitor has ever had low framerate compensation all the way down to 1Hz.
Deeppurp@reddit
It's all hardware-based support; it's just either an Nvidia module or a stock VESA module.
ukimafija@reddit
Gsync used to come with that module, while AMD was using VESA adaptive sync and branded it Freesync. After a lot of years, Nvidia finally decided to join AMD and to charge monitor manufacturers $5 a pop to get the G-Sync Compatible badge sign-off. Although, since the standard is open and works pretty well on newer displays, that is the default way monitors are produced today... it just works... and it's great, because the graphics card drivers pick it up immediately and it gets enabled right away. You just click and play... you only enable it and set it up the first time. Afterwards, it's off to the races...
dehydrogen@reddit
I have a gsync monitor with an actual gsync module inside of it and it actually performs worse (ghosting) than my gsync compatible monitor.
iirc, the gsync module was stated to allegedly relieve some processing stress from the gpu because it allows the monitor to put in some work. I have no idea how to measure this alleged benefit or if it ever actually existed. Just one of the dumb things you may have heard online back in 2017.
Wonderboyjr@reddit (OP)
Sounds like I was greatly overestimating the benefit of G-Sync vs what we have now.
PsychologyGGG@reddit
Because it’s generally a non factor in performance and costs money
queenbiscuit311@reddit
it's because it's basically pointless in a world where freesync with LFC is not meaningfully different from a full gsync module
Sufficient_Fan3660@reddit
money
aragorn18@reddit
Cost.
The G-sync module costs a couple hundred dollars and the open source Freesync is generally good enough.
Wonderboyjr@reddit (OP)
Oh man, I had no idea. I'd be interested to see a module monitor in person to see why/if it justifies costing so much.
LavishnessCapital380@reddit
LTT has a video on g-sync and the requirements for monitor certification, it will explain a good deal of it.
Pyrostemplar@reddit
Just because it costs money, it doesn't mean that it is better. Or needed for anything.
I remember an instance where a high end monitor manufacturer stated that they didn't sell gsync monitors, because they'd have to use a 3rd party standard module that was inferior to the ones they used.
Gsync is dead anyway.
12amoore@reddit
How is g sync dead?
Lemonade1947@reddit
Because freesync is better and cheaper. The open, free standard always wins.
12amoore@reddit
Maybe you need to clarify and say the "G-Sync module" is dead. But G-Sync itself and the way it works is certainly not even close to dead... it's been a staple for years for getting no tearing without v-sync's input lag.
Pyrostemplar@reddit
No. GSync is dead. Because GSync was a proprietary HW + SW solution.
What you now have is GSync as a "feature brand", appearing as "GSync Compatible", which basically means Freesync or VESA Adaptive Sync.
Good riddance.
FlameFrost__@reddit
Big W for AMD I guess. Makes me happy.
Lemonade1947@reddit
hate to break it to you buddy but gsync without the module is just branding. It's all based on the same underlying technology.
G-Sync was created by Nvidia at a time before an open standard existed. AMD, knowing they didn't have the clout of Nvidia, created an open standard and said "hey everyone, this is free to implement", and then a lot of companies did, just to have another selling point on the box. It worked so well that it got folded into the VESA standard, and the rest is history.
hardolaf@reddit
The standard actually existed, but monitor manufacturers didn't want to invest in putting the feature into ASICs. So Nvidia made an FPGA-based solution that could be thrown into a monitor, which manufacturers could add or not add at the last minute based on what orders they were receiving. And the manufacturers got to do massive markups for it.
Pyrostemplar@reddit
About as dead as Monty Python's parrot.
nVidia may say it is just resting, I mean, GSync compatible, but it is dead.
AFAIK, in the past couple years, no significant GSync monitor has been launched.
Syrath36@reddit
Not sure it's dead. Nvidia has G-Sync Pulsar coming soon(tm). There have been a few monitors announced, but the uptake is rather slow. We'll see about Pulsar; it's supposed to not cost as much hardware-wise, but the rollout seems to have been delayed a couple of times.
That said, G-Sync Ultimate monitors still perform better than adaptive sync by some margin, like the AW3423DW vs the F version, although with the costs these days the marginal increase probably isn't worth it to a lot of people.
Deathspiral222@reddit
You should post this on the audiophile subs. :)
residenthamster@reddit
man, he will be crucified in there.
Tiflotin@reddit
Pretty sure the only thing it does better is allow syncing at any refresh rate, 1Hz to whatever your monitor's max is. G-Sync Compatible and Freesync iirc do not work below ~40fps.
Silly_Personality_73@reddit
Mine works above 48fps.
LouBerryManCakes@reddit
Same, and when the framerate is lower than that, it simply refreshes at double the framerate. Like 42 FPS will be 84 Hz, which is effectively the same thing. Draw the frame twice, problem solved.
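To make that frame-doubling idea concrete, here is a minimal sketch of low framerate compensation; the VRR window numbers are illustrative assumptions, not from any particular monitor:

```python
# Toy LFC: when fps falls below the panel's VRR floor, repeat each frame
# until the effective refresh rate is back inside the supported window.
# vrr_min is a made-up example value.
def lfc(fps, vrr_min=48):
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier  # (effective Hz, times each frame is drawn)

print(lfc(42))  # (84, 2): 42 fps shown at 84 Hz, each frame drawn twice
print(lfc(20))  # (60, 3): each frame drawn three times
```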
BoltActionPiano@reddit
I've had a few G-Sync Compatible monitors exhibit bad enough flicker problems that I had to turn VRR off in Linux.
Turtvaiz@reddit
It also does variable overdrive in LCDs
MajkTajsonik@reddit
It does work through LFC.
Kittelsen@reddit
I tried reading up on it a couple of years ago and came to the conclusion that it is outdated and not worthwhile anymore. Just go for a gsync compatible monitor.
hardolaf@reddit
The module uses an expensive FPGA whereas the monitors which support VRR (under different marketing names) use ASICs that do the same thing.
xenphor@reddit
It would be nice if they could decouple the FPGA from the monitor so people could just buy the module and use it with any display.
HardwareSoup@reddit
That would require OEMs to make a module slot/adapter, and the firmware to go along (expensive), for something that almost nobody would care about in the first place.
Adaptive sync/Freesync is already the solution.
xenphor@reddit
When I had a Gsync native monitor, I noticed that when using Framegen, frame times were a bit better compared to the Freesync version of the same monitor, so there is a benefit. People also say that VRR flicker isn't as bad too.
archangel0512@reddit
I have one of these and the main difference I noted is it has zero VRR flicker.
dbcher@reddit
Same, I have 2 GSync monitors and 1 gsync compatible. The compatible one has VRR flicker which I find annoying (and it makes me nauseous) while my GSync monitors give me no issues.
xXSNOOOPXx69@reddit
My G-Sync module monitor's VRR starts at 14Hz... much lower than non-module monitors.
Smooth-Sentence5606@reddit
and is that really useful? do you dip down to 14 fps?
xenphor@reddit
It's useful for N64 emulation.
a4840639@reddit
Potentially it can be better for movie watching, but I have not been able to get G-Sync to work with any media player on Windows. And you can probably just use low framerate compensation instead of actually feeding the low-refresh-rate content directly to the monitor… On the other hand, the HDMI Forum VRR implementation on macOS does adjust the refresh rate to the media content.
TreesLikeGodsFingers@reddit
Nobody wants to dip down to 14fps, but it happens. When it does happen, it's nice that you don't get tearing on top of it. I have a 5070 Ti and a Samsung G9; they work really well together.
Herrjeminewtf@reddit
14 fps is unplayable anyway, who cares if you have tearing on top of it.
Smooth-Sentence5606@reddit
I'm not sure I've ever dipped below 60-70 fps, ever.
xXSNOOOPXx69@reddit
No I don't lol 😀 The G-Sync module just has a wider range... that's just what I'm trying to say.
zarco92@reddit
In general it doesn't. I have an old-ish true Gsync monitor and a Gsync compatible one side by side and the latter is what I use for gaming.
No-Actuator-6245@reddit
The G-Sync module also had a data bandwidth limitation; something like 4K 144Hz or 1440p 240Hz was the most it could run. Many high-end modern monitors exceed this, so the existing module would actually limit their performance, and for lower-tier monitors the additional cost doesn't make sense. I recall reading that Nvidia had started designing a replacement module that could handle higher bandwidth, but it got scrapped. Ultimately the G-Sync module has been made obsolete because it didn't keep up with modern monitor requirements. I have a monitor with the module; it works perfectly, but it is obsolete tech now.
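As a rough back-of-envelope check on those limits (assuming 8-bit RGB and ignoring blanking overhead, which adds a few percent in practice):

```python
# Uncompressed pixel-data rate for a given display mode, in Gbit/s.
def gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(gbps(3840, 2160, 144))  # ~28.7 Gbps
print(gbps(2560, 1440, 240))  # ~21.2 Gbps
# For scale: DisplayPort 1.4 carries about 25.9 Gbps of payload without
# DSC, which is roughly where the module's ceiling sat.
```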
Deathspiral222@reddit
This is exactly what happened to me! I turned gsync off at the highest 4k refresh rates because I got flickering. Waste of money and false advertising offering a 4k monitor with these specs without telling you you can’t actually use them all at once.
boodopboochi@reddit
In a word, G-Sync was priced greedily, and the generic version "G-Sync Compatible" killed it once it came out.
There was a time (circa 2014) when most monitors did not have any sync technology. Nvidia came out with G-Sync and marketed it as "if you own an Nvidia GPU and you want a compatible monitor with adaptive sync, then it must have G-Sync." And they charged a premium for it. Once monitor manufacturers learned that FreeSync and other adaptive sync technologies worked on Nvidia GPUs but didn't cost extra, we began to see "G-Sync Compatible" monitors at lower cost, and that ended "G-Sync" as a feature.
Long-Broccoli-3363@reddit
I'm actually not sure of this. Back in the day I had a monitor that you could slap a G-Sync module into (this was like... 2012? 2014?), and someone did a teardown of the price vs components of the module you could buy. It was something like $97 in parts at bulk that they sold for $150 at the time, so it was not a terrible markup.
From what I understand, the tech behind it was pretty insane at the time and required quite a board/power to actually implement it.
zagblorg@reddit
In practice, not much difference. I have a G-Sync compatible and a Freesync 2, both Dells. Both worked fine with my old 1080ti, but obviously only the Freesync one works with my current 9070XT.
itsapotatosalad@reddit
Most of the money goes to nvidia for the gsync license i’d bet.
HirsuteHacker@reddit
Not just good enough, freesync is objectively good
Laughing_Orange@reddit
Freesync is good, nobody except Nvidia fanboys says otherwise.
G-Sync Ultimate is technically superior to Freesync, but it costs a lot more for not that much of a perceived improvement.
jabbrwock1@reddit
I'm not even sure it's the actual hardware that costs much more money. It might just be that you have to pay a hefty licensing fee just to use the name.
digitalsmear@reddit
It's the hardware.
Area51_Spurs@reddit
No way it costs a couple hundred dollars to add G-Sync.
My Bravia 7 has G-Sync and no way Sony is spending an extra $200+ on a feature like 0.01% of users are using. lol
aragorn18@reddit
TVs handle G-sync over HDMI. That's handled differently and doesn't require a module.
Area51_Spurs@reddit
Ah
Deathspiral222@reddit
I have one with the module. I honestly can’t tell the difference except that at the highest refresh rates I get a flickering issue so I turned it off.
KillerxKiller00@reddit
The G-Sync module is a thing of the past: back in 2024, Nvidia announced that they would ditch the module and use a Mediatek solution instead.
Sander001@reddit
How do I verify it's active?
Creationship@reddit
In the Nvidia Control Panel, under the G-Sync settings: click Display at the top and check the indicator option. It will show "G-SYNC" in green in the top corner when active. This resets after every driver update.
KillerxKiller00@reddit
Just go into the nvidia app or control panel and see the g-sync part
dakkottadavviss@reddit
The short answer is g-sync hardware module is very expensive and comes with high licensing fees to use. The benefits were it had much higher capabilities than most freesync monitors and higher standards.
Freesync / adaptive sync monitors are effectively like unlicensed versions of G-Sync now. There are monitors out there that can exceed the capabilities of hardware G-Sync, but the vast majority are much lesser. The main thing is they do the job well enough for most people, and the cost to add the feature is very little. These monitors do have some trouble with flickering while using VRR, which hardware G-Sync is supposed to solve; it's very easily reproducible on the vast majority of G-Sync Compatible monitors under the right conditions. How often you run into those conditions really determines whether it's a deal breaker or not. Most people don't care.
There is new g-sync technology that does have much more advanced capability. See g-sync pulsar which uses backlight strobing with VRR to have a nearly blur-free experience.
PsyOmega@reddit
gsync hardware allows a few things like:
If your fps fluctuates, the monitor keeps up 1:1 without flickering. With Freesync you get flickering when your Hz changes dramatically.
My Freesync monitor, don't get me wrong, is good! But it has annoying artifacts when the Hz changes or on rapid color shifts. My hardware G-Sync monitor has zero flaws in presentation whatsoever.
withoutapaddle@reddit
I have an old gsync monitor with the hardware module and it still flickers when the framerate changes drastically (eg unreal engine traversal stutter).
Acrobatic_Fee_6974@reddit
I had one of the last-generation G-Sync modules in my old IPS, an AG273QG. I remember it was one of the few monitors on the market with a variable overdrive algorithm at the time, but that's a lot more common in newer Freesync scalers now. I upgraded to an OLED, so variable overdrive was no longer needed.
andrew_2k@reddit
"I upgraded to an OLED, so variable overdrive was no longer needed."
Can you elaborate please? I'm planning to get an OLED next week, and most of them seem to be "only" G-Sync Compatible, while on IPS I had to return a monitor for a fully G-Sync-supported one (it's a personal issue of mine; I'm very sensitive to tearing etc.), and I'd really like to avoid these issues with an OLED.
I have an Nvidia GPU and so far I'm settled on the MSI MAG 271QP QD-OLED X24.
pss395@reddit
Variable overdrive is not variable refresh rate. It's the monitor's motion handling, which affects things like ghosting and image trailing. With an LCD display you need a good overdrive setting to mitigate these things, while OLED naturally has very little of them, so you don't really need overdrive.
Variable refresh rate (Free/Gsync) still works just fine on OLED.
andrew_2k@reddit
Thanks for the response.
This means I don't have to worry about full G-Sync compatibility?
I was also wondering if I can expect OLED monitors to look "sharper" even when they are the same resolution just because of the technology and pixel layout.
xfloggingkylex@reddit
Not at all; G-Sync is variable refresh rate. Overdrive is a response-time setting for LCDs. If you've ever watched a Monitors Unboxed video or looked through RTINGS, you'll see they test different overdrive settings that affect how fast the monitor handles pixel transitions. Faster overdrive settings allow for transitions fast enough to keep up with 240Hz, as an example, but often have downsides like ghosting or overshoot, especially at lower fps numbers.
This typically leads to a situation where at 200+ fps, the "extreme" overdrive setting might be ideal for the fastest transitions but at 100hz or below, extreme has way too much overshoot and instead, the "fast" setting may be the best option.
Variable Overdrive handles this by adjusting how much overdrive is applied based on the current frame rate so you are getting slower overdrive at lower fps and higher overdrive at higher fps to get the best possible setting for any given frame rate.
OLED doesn't have this issue because transitions are basically immediate at all refresh rates.
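A hypothetical sketch of what variable overdrive does conceptually; the level names and thresholds below are invented for illustration, and real scaler firmware tunes this per panel:

```python
# Pick an overdrive strength from the current refresh rate instead of a
# single fixed user setting. Curve values are illustrative only.
OVERDRIVE_CURVE = [
    (100, "normal"),            # low fps: gentle overdrive to avoid overshoot
    (200, "fast"),
    (float("inf"), "extreme"),  # high fps: fastest transitions
]

def pick_overdrive(current_hz):
    for ceiling, level in OVERDRIVE_CURVE:
        if current_hz <= ceiling:
            return level

print(pick_overdrive(90))   # normal
print(pick_overdrive(150))  # fast
print(pick_overdrive(240))  # extreme
```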
andrew_2k@reddit
Thanks, very excited to pick it up tomorrow.
oscardssmith@reddit
It will be sharper, but that's nothing to do with pixel layout. OLED has faster response than IPS, so you get a cleaner picture no matter the framerate.
andrew_2k@reddit
Yep, I was only expecting it as a placebo effect, so that's perfect.
Thanks for the information.
semidegenerate@reddit
I have the MPG 271QRX, which I’m pretty sure uses the same underlying QD-OLED panel, just a slightly better trim, paired with an RTX 4080. I’ve had it for a year now, and I’m completely thrilled with it. VRR has worked flawlessly, and the visual quality is amazing.
This is my first OLED panel. Coming from an early gen 1440p IPS 144/165hz monitor, it’s a massive upgrade.
andrew_2k@reddit
Yep, I'll be coming over from an older-gen 165Hz IPS too, so I'm very excited about the big jump.
Actually the monitor will be arriving tomorrow as well, as I got access to a company account to purchase it for less due to removed tax. Very, very excited now, especially for games like The Witcher.
Acrobatic_Fee_6974@reddit
OLEDs have perfect pixel response times across the refresh range, so they don't need overdrive to speed up colour transitions like LCDs do, and hence they don't need variable overdrive to avoid overshoot.
BasonPiano@reddit
I mean, I don't know what variable overdrive is, but I have a 4070 Ti and a MSI MAG 321UP and haven't had any problems.
Skarth@reddit
There's a hardware and software version of g-sync.
The hardware one has a module with ram for a frame buffer, so as the monitor receives new frames, it can locally store one frame to display while waiting for the next one.
The software-based version has the monitor wait for a new frame instead, but it cannot sync as well at low framerates.
The hardware version is technically better, but costs more, and isn't better enough in most cases to be worth the cost.
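A toy, runnable illustration of that buffering difference; this is a conceptual sketch, not real scaler firmware:

```python
# With a local frame buffer the display can repeat the last stored frame
# whenever the source stalls; without one it must refresh on its own
# schedule. None below means no new frame arrived in that interval.
def scanout(frames, has_buffer=True):
    last, shown = None, []
    for f in frames:
        if f is not None:
            last = f
            shown.append(f)
        elif has_buffer and last is not None:
            shown.append(last)      # redraw the stored frame from module RAM
        else:
            shown.append("stall")   # panel forced to repeat on its own timing
    return shown

print(scanout(["A", None, "B", None, None]))                    # ['A', 'A', 'B', 'B', 'B']
print(scanout(["A", None, "B", None, None], has_buffer=False))  # ['A', 'stall', 'B', 'stall', 'stall']
```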
Liam2349@reddit
It was mostly about certification because most adaptive sync monitors have some pretty glaring issues. Nvidia was first to market with this tech and they did it correctly from the start.
When you buy an actual G-Sync monitor it should work correctly.
number8888@reddit
The module is no longer needed, since adaptive sync features are now baked into monitor controllers. Pretty much all monitors that have VRR would be "GSYNC Compatible" anyway.
The original module is really old anyway; IIRC it only works via DisplayPort and doesn't support HDR. I think they came out with a GSYNC Ultimate tier that added those features, but I'm not sure if it needed a new version of the module card. Also, having a module card would bump up the price of the monitor, so there's no reason to continue adding them to the monitor hardware when there's no benefit.
PanicOtaku@reddit
Also, if you have the real G-sync module installed, installing new firmware for the monitor becomes impossible outside of a factory, because you need proprietary tools to flash the G-sync chip.
Price-x-Field@reddit
I don’t know if this is just me but I feel like gsync and free sync just make my game a stuttery mess and I have a much better time just locking my game to the refresh rate of my monitor.
chr0n0phage@reddit
It seems you're out of the loop on the importance of the Nvidia-specific scaler. In 2016 it was a benefit, but we've moved on; adaptive sync is in all displays now.
kermityfrog2@reddit
I’m out of the loop. I bought an expensive G-sync only monitor (3440x1440 IPS 120hz) and I don’t want to change it. What graphics card do I have to use with it? Am I stuck with Nvidia? It’s oldish now so doesn’t have freesync or adaptive sync.
Wonderboyjr@reddit (OP)
You're right, I didn't realize G-Sync is basically old news at this point. And yet companies still use the name to promote their monitors.
Hiro4ka11@reddit
"G-Sync Compatible" simply refers to the use of standard Adaptive Sync (such as FreeSync) that passes Nvidia's validation, it does not actually contain a G-Sync module, but it functions well for the majority of users without charging more.
LazyDawge@reddit
Nowadays it means nothing. 10 years ago it meant everything if you had an Nvidia GPU, and the price premium was very high.
Emerald_Flame@reddit
GSync Compatible: This is Nvidia's implementation of the DisplayPort/HDMI VRR standards. Just like FreeSync is AMD's implementation of the same standards. It also means it's been tested by Nvidia to be compatible with their GPUs and doesn't exhibit issues like blank frames, flicker, etc. that can happen but is pretty rare these days.
GSync: Has Nvidia's actual GSync module in it. The biggest technical difference is that it supports variable overdrive which can reduce ghosting. Other than that it's mostly just that GSync validation testing includes a bunch of other things like color accuracy, color gamut, response times, etc. So to get the certification it has to be a pretty good monitor.
GSync Ultimate: Basically just GSync with the module but with proper HDR support.
kuug@reddit
G-sync compatible is just Nvidia trying to take control of Freesync marketing. They’re not technically the same thing. But Nvidia knew freesync and freesync monitors were an AMD selling point and they flexed on monitor companies to give them favorable conditions.
MrSqueak@reddit
It means nvidia beat down a manufacturer and made a monitor more expensive to make it do something amd does for free.
Pinossaur@reddit
Never made much sense to me. You'd pay extra to have the ability to run games smoothly at lower framerates, but if you had the money to buy a monitor with a G-Sync module, you more than likely already had a powerful enough computer to not need it.
qwaccmaster69@reddit
The other day I accidentally turned on V-Sync for a game on a 15-year-old monitor, and it crashed my GPU; my SSD and Windows went into diagnosis mode for some reason. (It was running fine again after 2 restarts.)
LewAshby309@reddit
I think you confuse what the term "freesync" means.
The technology is called "adaptive sync" which is a VESA standard.
Freesync is what AMD uses to brand all sync technologies for their partners. Gsync is what nvidia calls all sync technologies.
When nvidia decided to support software adaptive sync they named it gsync compatible to have a difference to gsync monitors with a hardware module.
AMD did similar naming schemes (for different reasons) with freesync, freesync 2, freesync premium, freesync premium pro,...
Wonderboyjr@reddit (OP)
You're right, I was confusing the two.
ckn@reddit
FWIW I have an Acer Predator with G-Sync and 2x Iiyama with Freesync, and I run my machines heavily and have never noticed a difference in performance on any of the RTX x090 cards I own...
I'll be buying the cheaper Iiyama monitor next time...