Nvidia is ending support for Maxwell, Pascal, and Volta in the upcoming driver branch
Posted by BlueGoliath@reddit | hardware | View on Reddit | 247 comments
BlueGoliath@reddit (OP)
What really irks me about this is you won't be able to use these cards as backups in case you need to do an RMA. I guess Jensen really wants people to go out and buy 3050s.
cholitrada@reddit
The youngest of them (Pascal, which is over 9yo) struggles with modern standard features: Async compute, DX12, etc. I don't count Volta because it's a prosumer architecture. I'd be surprised if its users haven't upgraded by now.
If you want a backup card, a 3050 would genuinely be better, because even a 3060 is just ~7% slower than the 1080 Ti while having much better efficiency and features. Cards from the Maxwell era wouldn't even be in the same ballpark.
3050 MSRP is 249 USD. I've seen it go down to sub-150 here in Canada. If you're willing to go with a used GPU it's even easier to beat the 1080 Ti.
Tech's gotta move forward at some point brother. 9 years is an eternity for hardware.
AntiGrieferGames@reddit
is this the 8gb version or the 6gb version? because i don't think the 6gb version beats the 1080 ti except for efficiency
Even a 4060 costs 100 dollars more and has longer driver support (my guess)
BlueGoliath@reddit (OP)
If we were talking about Fermi I would agree with you. Maxwell I think is different because it is/was the first "modern" GPU generation and can still do almost everything you throw at it without even being the flagship.
Never mind Pascal, which is still used by a lot of people.
TSP-FriendlyFire@reddit
Maxwell was last considered "modern" years ago; at this point Turing is the new baseline. There are just so many features missing in older cards, you can't really expect good support for them going forward.
The point of the word "modern" is that it changes over time; that's how it stays, well, modern.
JonWood007@reddit
Yeah, current baseline is RTX 2060/3050 on nvidia side, RX 6600 on AMD side.
JonWood007@reddit
Dude, it's antiquated. It doesn't even have modern features. It's a decade old at this point. It's as old as the venerated 8800 GT was when that stuff came out, if not older. Time to upgrade.
cholitrada@reddit
But they lack crucial modern features that are present in even entry-level cards like the GTX 16xx series. They're not good backup cards unless you don't play any games newer than 2016 (and at that point, you don't need a new card). That's my point: they are being hampered by things unrelated to their processing power.
DX12 doesn't like Pascal/Maxwell. Any game that uses DX12 makes these cards struggle, even if they have the horsepower to run it. I experienced it hands-on: when Elden Ring came out, my 1080 struggled to keep a stable 60fps despite there being no CPU bottleneck.
Budget gamers WANT DX12 because it helps immensely with CPU usage compared to DX11, which in turn extends their PC's lifespan.
Lots of people still use Pascal, true. But that doesn't mean they should. If you are still on a Pascal card, you should find a used Turing/Ampere/RDNA2 card. ALL of them would be a huge upgrade in terms of security and features for ~150 bucks or so after you sell your old card.
lusuroculadestec@reddit
Nvidia will continue to release driver updates on the old branches, but it's mostly just security updates and application fixes. You'll still be getting updated drivers for a few years. Even after that point, nothing stops you from using the last supported driver indefinitely.
Qesa@reddit
They're still perfectly functional. And the legacy branch still includes security updates and support for new OS's. They just don't get new features or game-ready drivers.
BlueGoliath@reddit (OP)
For Windows yes. Unfortunately Linux is developed by... less intelligent people... so if you attempt to run a GPU that isn't supported anymore it might crash. Hopefully they could make it so you could at least get software rendering working.
steik@reddit
Bro what? Linux is basically where old GPUs go to pasture. Go to r/homelab and see if older GPUs are a problem (they are not).
starlevel01@reddit
not these generations of nvidia, the signed firmware means that they can't be clocked up from their minimum clock speed outside of the proprietary driver.
GetFuckedYouTwat@reddit
Why do you say that?
BlueGoliath@reddit (OP)
Unlike Windows, where drivers can be freely attached and detached from the NT kernel, Linux requires that drivers either be compiled into the kernel or loaded as a DKMS module. DKMS modules need a reboot to work, require downloading the module sources, and the build process is lengthy on slower CPUs.
So if software rendering doesn't work and you're on 585, you're screwed.
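For context on the DKMS workflow being described: it roughly boils down to the standard `dkms` command sequence below. This is an illustrative sketch only; the module name and version number are placeholders, not a real Nvidia release, and the commands need root plus matching kernel headers to actually run.

```shell
# Register, build, and install an out-of-tree driver module via DKMS
# (module name/version "nvidia/580.00" is a hypothetical placeholder)
sudo dkms add nvidia/580.00       # register the module's source tree with DKMS
sudo dkms build nvidia/580.00     # compile it against the running kernel's headers
sudo dkms install nvidia/580.00   # install the built module for this kernel

sudo modprobe nvidia              # load it (a reboot is only needed if the old
                                  # module is still in use and can't be unloaded)

dkms status                       # verify which modules are built for which kernels
```

The `dkms build` step is the one that drags on older CPUs, and DKMS re-runs it automatically every time the kernel is upgraded.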
GetFuckedYouTwat@reddit
How old are the CPUs you're talking about for a reboot? One of mine is from 2017 (8 years) and takes roughly 45-60 seconds to reboot on Linux. That includes the BIOS screen and OS selection.
BlueGoliath@reddit (OP)
I'm talking about compiling the module, not rebooting.
bexamous@reddit
They will likely keep doing updates to R580 for years that add support for new kernels and fix security issues.
E.g. Fermi was last supported in R390. The first R390 release was 390.12 on January 4, 2018; the last was 390.157 on November 22, 2022. Almost 5 years later.
BlueGoliath@reddit (OP)
My understanding is that updates for legacy LTS branches are slow and low priority. If the Linux kernel breaks something and you use a rolling release distro you're out of luck.
Regardless, if your GPU suddenly dies and you're using 585, or whatever, what then? If software rendering works then it's somewhat OK but if not you're left with a brick of a PC.
bexamous@reddit
I think security updates are usually quick; when a security issue is made public, the drivers for all branches are posted at the same time.
But kernel compatibility I'm sure is not quick.
You may still get basic functionality with nouveau.
viperabyss@reddit
Maxwell is more than 10 years old at this point, and Pascal is well past its 8th birthday. How long are companies expected to dedicate engineering resources to supporting these cards?
And as others have said, it just means the cards won't receive new updates; the older drivers will keep working perfectly fine.
JonWood007@reddit
Yeah, I say this as someone who has a 1060 under my desk in case my current GPU breaks, but that card is ancient at this point. It had enough support; 8 years is typical for Nvidia, it had a good run. Anything older than RTX 2000 for Nvidia and RX 6000 for AMD doesn't even have modern features at this point.
JonWood007@reddit
Eh, they're not paperweights. You just won't get updates, which might mean newer games might not work. I can put my old HD 5850 in my PC and use it for testing; it stopped getting updates when Windows 10 dropped 10 years ago.
ScTiger1311@reddit
What are the consequences of this? Will it be like when a phone stops getting support, where it's a security vulnerability? Or does it just mean no more game-ready drivers, which barely matter?
Like if someone has a 2080ti, that's still a pretty good card, but it would be discontinued here.
WJMazepas@reddit
2080Ti is Turing, so it will still receive support.
But cards from the 1080 Ti down will stop receiving those game-ready drivers and new features. They probably won't receive security updates either.
AntiGrieferGames@reddit
the gtx 1650 is the same, which fun fact is a low-end one. the gt 1030 however is sadly not.
BlueGoliath@reddit (OP)
It will probably get occasional security and maintenance updates but no new features or Game Ready drivers.
Verite_Rendition@reddit
Correct. Unless NVIDIA changes their driver development and support model, the last driver branch to support a given architecture will be released as a Long Term Support Branch (LTSB), which gets 3 years in total of updates and support. In practice, that means 4-6 months as a mainline driver receiving minor feature updates, and then the last 2.5 years or so as a legacy driver receiving just security updates and critical bug fixes.
For reference, the last time NVIDIA retired a GPU architecture was Kepler in October of 2021. The last driver branch for that was R470, which saw new updates up until mid-2024 (NVIDIA actually supported it until September 30th, but there were no security issues that warranted a new driver release between July and September).
To get an idea of what this might look like, you can see NVIDIA's complete driver branch history for Windows going back some 13 years on their website: https://www.nvidia.com/en-us/drivers/rtx-enterprise-and-quadro-driver-branch-history/
Reactor-Licker@reddit
2080 Ti is Turing, which isn’t affected.
ScTiger1311@reddit
Oops, my bad for not looking deep enough. You're right. Still, a 1080ti is not a horrible card. It's maybe time to upgrade but it really depends on the needs of the user.
Vb_33@reddit
It's not about power, it's about age.
TSP-FriendlyFire@reddit
It's roughly equivalent to a 4060 but without support for all of the modern trappings like DLSS. You're looking at a pretty low price to upgrade from, especially if you consider used 30 and 40 series cards.
I think it's not outrageous for Nvidia to stop adding new driver features for these cards. You can still run older drivers with them, they don't cease to function overnight, they just won't be covered by game-ready driver optimizations and so on.
kuddlesworth9419@reddit
10 series finally being put to pasture.
AntiGrieferGames@reddit
Rest in peace Pascal driver support 2016 - 2025.
NonameideaonlyF@reddit
What game-ready driver are you on currently
kuddlesworth9419@reddit
566.36 is stable for me with no real problems. I was getting black screens and crashing on a bunch of games on newer drivers so I went back to this one.
Direct_Witness1248@reddit
Same here on 40 series.
Helpdesk_Guy@reddit
Please don't. Nvidia has been sneakily dialing down older gens' drivers on the regular for years, to cripple those cards and quickly degrade their performance, to push faster purchases of newer cards or even the latest gen.
Only the latest gen gets the speediest drivers, for as long as it is CURRENT and the newest gen. Everything else in the line gets sneakily toned down in performance over time …
That's actually a MAJOR advantage for users of older GPU hardware: AMD hasn't been doing that, and even years after purchase you get the best performance with the latest drivers. Of course, provided no other issues and bugs are involved, like black screens, texture flickering et al. …
And you can easily go back in time and play with years-old driver revisions, if that works for you.
enkoo@reddit
Can you post some benchmarks showing all that?
BloodyLlama@reddit
Yeah, usually there are small differences between driver versions in various games and benchmarks, but that's true with the newest GPUs too. The fastest 3DMark drivers for a 5090 are not the latest drivers.
ehaliewicz@reddit
Note that this doesn't prove performance is decreased intentionally, there can be other reasons. Graphics card drivers are incredibly huge and absurdly complex, I wouldn't be surprised if there are many unintended regressions with every new version.
kuddlesworth9419@reddit
I have never done a benchmark on which drivers would give the best performance but I guess I might have to one day.
ResponsibleJudge3172@reddit
Then why is AMD not stomping Nvidia in the games that came out back when the old GPUs were new? In fact, why is this hate directed at the company giving more driver support?
talkingwires@reddit
The dichotomy of Reddit.
PastaPandaSimon@reddit
Not resting quite yet. That final branch is yet to get started, and will likely see new driver releases for about a year until Nvidia jumps to the next branch in ~late 2026/early 2027. Even at that point, those cards will have a mature and functional driver rather than becoming paperweights.
reddit_equals_censor@reddit
if you can't force people into "upgrading" through unusable raytracing and fake graphs, i guess you do it by dropping driver support then, right?
the 1080 ti is perfectly usable today, especially as it just barely has enough vram to still be usable, unlike the broken 8 GB graphics cards nvidia is launching right now.
the 1080 ti is faster than the rtx 2070 in 1080p gaming, with a perfectly playable 64 fps average in a 1-year-old revisit of the card:
https://youtu.be/hmMWNrRHiNY?feature=shared&t=355
so in the absence of improved performance, and with vram regression, that is all that is left to do: artificially break older working cards to try to force people to downgrade to worse hardware.
as again the 5060 8 GB is WORSE than the 1080 ti 11 GB, because of the vram alone.
disgusting shit.
AntiGrieferGames@reddit
You don't need those. They still work fine after the end of driver support, depending on whether the game engine is compatible.
Sopel97@reddit
how is this a forcing change?
Helpdesk_Guy@reddit
Nope — “Make it F. What a fail!”
Jokes aside. That's really not how it's done though, not in real life at businesses.
Since nVidia figured out ages ago that it's way more fruitful in the long run to just *pretend* to issue driver updates, lead your user base into the false belief that they'd get actual driver improvements when updating even for older cards, and bring them up to be subconsciously afraid of losing out on (actually virtually non-existent) driver improvements. You know how it's done: through that good old classic bit of psychological trickery, FOMO.
… and then use the de facto access to their older hardware (voluntarily and freely granted by them) to sneakily cripple it with every new release, ever so slightly that no one seems to notice. You don't even have to really cripple those driver revisions in actual performance; just quietly removing the fixes and patches for given games will do it already. It will effectively break support for older games in the long run.
Et voilà, you have granted yourself a nigh-assured, never-ending stream of ever-consuming clients, who will then keep upgrading as the older hardware seemingly becomes lackluster quickly … And no one seems to notice.
Not having actual 'access' to your consumers' hardware (to sneakily cripple it through newer driver revisions) is likely Nvidia's worst possible nightmare. That's why they 'support' their cards so long.
Yet the support is actually not for you or your hardware, but for them and their quarterly profits' sake alone.
Quite frankly, I'm still waiting for the day when Nvidia decides to declare their drivers intellectual property, falsely rambles about alleged protection of their IP (likely under the disguise of preventing IP theft through reverse engineering from the Far East), officially forbids everyone to mirror or host them (or make them available through third parties), and collectively wipes the Interwebs of all older driver revisions, basically instantly bricking any older hardware.
… only to turn around and offer the very latest driver revision exclusively for everyone, immediately removing the previous one when a new one comes online. It would be an appalling dystopian reality coming true, yet an alarming one most businesses today fantasize about and eventually totally lust after.
diceman2037@reddit
everything you said is headcanon and not based in reality.
NeroClaudius199907@reddit
If 8gb is broken graphics, it means everyone still using sub-8gb should upgrade. The 1080 Ti can still chug along.
CriticalCat8142@reddit
9-11 year old cards. I kinda don't see the issue, especially since they lack the important modern features.
Nicholas-Steel@reddit
It's gonna turn a lot of perfectly functioning hardware into e-waste when Windows breaks compatibility with the drivers that work with these cards.
A lot of indie games available and likely a lot of those that will be available in the future... don't need high GPU performance nor the latest features.
AntiGrieferGames@reddit
I know for a fact that Win 11 still works with Fermi cards, so i don't think it will break them.
AntiGrieferGames@reddit
Sad news about Maxwell and Pascal, but the most impressive thing was Maxwell, with over 11 years of driver support.
Wish they would still maintain driver support for Pascal :(
I_Dont_Have_Corona@reddit
Bruh I just ordered a used 1070 for my secondary PC as my RX 570 was rarely receiving any updates… Talk about timing.
psydroid@reddit
I got a used GTX 1650 4G, as I knew CUDA 13 was abandoning my current Nvidia GPUs.
Utinnni@reddit
Is it possible for these to still be maintained as an open source project?
psydroid@reddit
https://nouveau.freedesktop.org/
Aggravating_Ring_714@reddit
Still much better than AMD’s support for old cards 👍🏻
-Outrageous-Vanilla-@reddit
AMD drivers on Linux are always receiving improvements.
And with Proton, gaming is faster than on Windows.
Aggravating_Ring_714@reddit
Ok but how many gamers are on windows vs linux with desktop amd/nvidia cards?
-Outrageous-Vanilla-@reddit
https://www.reddit.com/r/technology/comments/1lofgad/windows_seemingly_lost_400_million_users_in_the/
Aggravating_Ring_714@reddit
“Update: 7/1 7 am (ET): Since the publication of this article, Microsoft has updated the blog post in question, and now claims that it still has over 1.4 billion monthly active devices. The rest of the article remains as published below.”
You already think 400 million mfs are gaming on linux with desktop amd or nvidia cards?
-Outrageous-Vanilla-@reddit
I don't know why you are so strongly defending a multi-billion-dollar company that sells your data while you use their products.
It is mind-boggling to me.
Aggravating_Ring_714@reddit
I mean you’re defending AMD like crazy insinuating a shit ton of people use dedicated gpus to game on linux while this is as niche as it gets. I can assure you that 400m people did not switch from windows to linux for gaming or anything else.
-Outrageous-Vanilla-@reddit
I am saying that on Linux, AMD still has support for old hardware and you can play the latest games faster than on Windows.
psydroid@reddit
I don't have any modern AMD GPU, but I've noticed that Nouveau works quite well with my Nvidia GPUs on Debian 13 at first glance. I haven't really tried gaming so far.
As for Windows, it's a dead-end and bleeding users left and right, if not to GNU/Linux then primarily to Linux-based Android.
Aggravating_Ring_714@reddit
You should add: Faster in SELECT games on AMD gpus. https://youtu.be/4LI-1Zdk-Ys?si=Ac4X-bt_ar-kqc4B
ShadowRomeo@reddit
AMD Radeon ended support for Polaris and Vega way too soon, back in 2023 when they were only 6-7 years old, whereas Nvidia is ending Pascal and Maxwell at 9-11 years old in mid-2025.
I think this just proves that Nvidia is really the one that supports GPU architectures far longer than AMD, despite AMD getting the credit for its supposed "Fine Wine" branding.
GumshoosMerchant@reddit
polaris and vega are still supported, just on a slower driver update cycle
the most recent driver was from may https://www.amd.com/en/resources/support-articles/release-notes/RN-RAD-WIN-25-5-1-POLARIS-VEGA.html
ShadowRomeo@reddit
That isn't real game-ready driver support though, more like security and bug fixes to keep things functioning. But official support for newer upcoming games?
Yeah, that has been long gone since 2023.
AMD Begins Polaris and Vega GPU Retirement Process, Reduces Ongoing Driver Support
MdxBhmt@reddit
Is there any 2024+ game that is not functioning on vega and polaris from lack of AMD support?
ResponsibleJudge3172@reddit
Lack of support doesn't brick your cards. Just like an older Nvidia GPU will continue not only to work, but to get security patches.
MdxBhmt@reddit
well, it's the guy I am answering who is implying polaris isn't going to be supported in newer upcoming games.
Strazdas1@reddit
it won't be supported. that doesn't mean it can't run them, as long as games don't require hardware features they don't have. just that there will be no active development to support it.
MdxBhmt@reddit
Polaris and vega are still getting bug fixes and have active driver support. It is not as active as RDNA, but people like you are making it sound worse than it actually is.
'Supported' as 'it runs' is what AMD is doing. 'Supported' as 'it is optimized for' is a whole different discussion that we won't be getting to the bottom of it from just looking at patch notes.
Strazdas1@reddit
Polaris and Vega are getting community-made bug fixes, not AMD-made bug fixes.
MdxBhmt@reddit
This is an absurd take.
https://www.phoronix.com/news/Linux-6.17-AMDGPU
This is from the last 24h on linux.
25.5.1 on windows.
On what are you hinging your 'community made bug fixes' and 'no AMD made bug fixes'??
Elketh@reddit
Surely you can't possibly believe that 'Game Ready' drivers are actually doing anything to optimize the latest titles for architectures like Maxwell and Pascal, right? The only time those architectures are ever mentioned in the changelogs is for bug or security fixes. The support they've been getting is absolutely no different to legacy AMD cards, bar the fact that as usual Nvidia are better at PR than AMD. They let you update your driver once or twice a month to make it feel like you're getting some sort of upgrade, but I thought most tech-savvy people know that downloading the latest driver isn't actually going to do anything to help the performance of your 980 Ti in whatever brand new game(s) it's being released for.
ShadowRomeo@reddit
Uhh, yeah? That's literally their purpose and the main reason they are still listed by most game devs in their minimum requirements. Obviously, moving forward that is now gone as well, as Nvidia just pulled the plug on Pascal/Maxwell.
Some Nvidia GPUs older than Maxwell, even after game-ready driver support ended, still received security driver updates a year later when necessary. But that doesn't amount to actual game driver support, just like AMD's legacy drivers.
AMD's legacy drivers aren't equivalent to their official Adrenalin game-ready drivers; they're literally just slow drivers focused on security stuff, and this is also a thing with Nvidia:
Nvidia Delivers Important Security Update Driver for Kepler GPUs | Tom's Hardware
MdxBhmt@reddit
I mean, no. Legacy means no/zero expectation of driver updates. See
https://www.amd.com/en/resources/support-articles/release-notes/RN-RAD-WIN-22-6-1-LEGACY.html
(some gpus were 10 years old at that point). Polaris and vega are de facto not at that stage.
Helpdesk_Guy@reddit
I vividly remember that very same kind of naïveté from back in the day. I think I lost it at age ten to twelve or so?
Gosh, what a trip down memory lane, so refreshing! Are you also somehow under the impression that newspapers still print the truth, or are obligated to publish every reader's comment that comes in?
MdxBhmt@reddit
Polaris and vega are not legacy status, but extended support.
Legacy is a whole different level.
laxounet@reddit
Hmmm to be fair I had issues with MH Wilds on my RX590 build, until I installed this exact driver. So there are definitely some game fixes / optimizations as well.
ShadowRomeo@reddit
Monster Hunter Wilds doesn't run "fine" on any hardware in general.
laxounet@reddit
Sorry, I didn't express myself very well: it was unplayable before that driver, with texture glitches all over the place, holes in the ground, and sand waterfalls. The driver fixed all those bugs.
cyberloner@reddit
it is not supported at all.... no new game support for it
SireEvalish@reddit
It's become clear that it really should be "Poor initial drivers"
hackenclaw@reddit
IMO, 10 years should be the minimum standard for support.
Microsoft remains the gold standard in terms of support for consumer products such as Windows.
Unlike those phones... looking at you, android phones.
reactcore@reddit
Agreed. Windows 10 IoT Enterprise LTSC will be supported until 2032. I started using 10 on the release day in 2015, and will keep using the LTSC until 2032.
That means 17 years of continuous support.
That’s just incredible.
Alive_Worth_2032@reddit
The difference is that AMD launched cards using Polaris much later than that. Support dates should be based on when the last product utilizing an architecture is released, not the first.
The RX 590 got less than 5 years of support, and I bet they launched something after 2018 as well.
noiserr@reddit
Not really. You can play the latest Indiana Jones on Vega 64. On Linux. Can't do that on other pre RT GPUs.
adelphepothia@reddit
I think you're mixing up community support with official vendor support. AMD supports open source much better than other vendors, and this means their hardware can be supported for much longer if people are willing to do the work, but it's not AMD themselves doing it. Only their linux driver is open source as well, and that's a small percentage of users compared to windows.
noiserr@reddit
AMD Open Sourced their driver. That's precisely the benefit of AMD's drivers.
lolatwargaming@reddit
On the back of Joshua Ashton, a volunteer contributor
With these mental gymnastics we should also thank nvidia for making real time RT reality
noiserr@reddit
How is that related to the topic of driver support?
Nvidia should thank AMD for inventing HBM; that's where they are making tons of money. But that's beside the point.
Elketh@reddit
AMD's developers also work on the open source driver. It's not just a bunch of random community members developing it (though many such people do also contribute). The point of the open source driver is that it doesn't have to be just AMD working on it. Developers from Valve, Red Hat, Google and other companies have also put a lot of work into it, but AMD have contributed a lot to RADV as well. In fact, they recently made the decision to end development of their proprietary OpenGL and Vulkan drivers for Linux in favor of focusing on Mesa and making it the "official" option.
The combination of all these talented developers working together is the very reason AMD drivers on Linux are so impressive. The open source model is just better for graphics driver development, and it's a shame it's limited to the Linux space.
adelphepothia@reddit
Yes, I don't think there's any debate on the benefits of open source. I was trying to qualify and give context to what the person I was replying to was saying. For those that do benefit, it's clearly superior, but the range of users that are supported is quite limited, so I wouldn't use it as an example of AMD having better support. If AMD were to open source their windows driver as well it would lend much more weight to the idea that they themselves are providing better long term support, but as they haven't I'm not sold on the idea that they've done it solely to improve the quality of their drivers.
No-Broccoli123@reddit
Woah Linux so goood
Spiritual-Cookie-709@reddit
If you really wanna use such a one-off example, the Turing GTX 16 series can run Doom: The Dark Ages
https://youtu.be/iQ5I7i1kS8g
noiserr@reddit
1660 came out 2 years after Vega64. Not sure what your point is.
Vaxtez@reddit
End of an era. The 1080 Ti was an absolute trooper of a GPU, and if you don't play modern AAA games it will still go on for years to come.
HulksInvinciblePants@reddit
To be honest, the best driver for all these models is almost certainly an older revision. For example, Ampere still seems to have a sweet spot on 537.58, which was released in late 2023. Sadly, Windows now forces users onto 560 unless steps are purposely taken to stop it.
I remember installing the latest supported driver on my 6800 Ultra, attempting to play Black Mesa Source. The end result was completely broken, until I went back to a much earlier release.
i_max2k2@reddit
I’m on Windows 10 and I likely won’t be moving to Windows 11, hoping to try Steam OS and see how that goes
Tuna-Fish2@reddit
I hope you're on an AMD GPU. NV on steamos is still a total crapshoot.
For those on NV, windows 10 enterprise LTSC is still an option.
i_max2k2@reddit
Thank you, unfortunately in some way, I’m on a 5090.
Tuna-Fish2@reddit
Then the only option is enterprise LTSC. You can install it on top of a normal win 10 install so that you save all your data, and that gets you updates until 2027.
i_max2k2@reddit
Oh I had no idea, this is happening tonight lol
Tuna-Fish2@reddit
There's a guide somewhere in /r/windowsLTSC
Do make backups just in case before you potentially nuke your main drive.
i_max2k2@reddit
Thank you very much! I'll look it up there
Tuna-Fish2@reddit
I actually looked up the proper guide, it's the first answer on this:
https://learn.microsoft.com/en-us/answers/questions/544156/windows-10-pro-to-ltsc
Strazdas1@reddit
as far as i know, windows only forces a driver if you don't have one already installed. if you have an older version it does not force it. at least for me it never forced an update.
HulksInvinciblePants@reddit
It 100% has a minimum version it will force install if you’re behind it.
https://www.reddit.com/r/techsupport/comments/1jnzgbd/driver_56094_keeps_reinstalling_despite_removing/
Strazdas1@reddit
this appears to be a bug in windows update that is preventable on non-home version of windows: https://www.reddit.com/r/Amd/comments/1hdjxba/guide_how_to_resolve_windows_update_installing/
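For reference, the workaround linked above reportedly boils down to Microsoft's documented Windows Update group policy for excluding drivers. A minimal sketch as a .reg fragment (this is illustrative; it applies the policy value `ExcludeWUDriversInQualityUpdate`, which is honored on Pro/Enterprise editions and may be ignored on Home):

```reg
Windows Registry Editor Version 5.00

; Tell Windows Update not to bundle driver updates with quality updates
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
"ExcludeWUDriversInQualityUpdate"=dword:00000001
```

After importing and rebooting, Windows Update should stop pushing GPU driver versions over a manually pinned one; drivers can still be installed by hand from the vendor's installer.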
Helpdesk_Guy@reddit
Yeah … Remember how the Green leather jacket wearing Hobbit™ decided to show everyone the bold, naked F!, as they were forced to comply with a GTX 970 owners' settlement, and dropped driver support for the GTX 970 the very same day afterwards as a nice gesture of "appreciation" for the lawsuit, making it legacy?
It's not only that the GTX 970's driver branch was merely dragged along completely unaltered since (with no further driver improvements whatsoever); nVidia even rolled back the (evidently quite shady) driver mechanics which, from release until the fallout about the shady 3.5 GByte came to light, managed to at least tame a game's VRAM usage (through trickery of internal texture compression and LoD cheating in the driver).
Immediately after, games were no longer steered toward the speedy 3.5 GByte first with the slow-attached 0.5 GByte as a last resort. Nvidia made it so that from then on, the slow last 0.5 GByte was used *first* instead, with the speedy 3.5 GByte occupied only after those 512 MByte were full.
The result made the card age horribly and in no time, quickly degrading its usefulness, even when the full 4 GByte of VRAM (and thus the last 0.5 GB) wasn't actually needed.
I have a friend who loved to play GTA 5 with his GTX 970 back then; it was quite fun. The moment the lawsuit was lost, the drivers went into the shitter: he had lots of in-game stutter and the game became barely playable …
So … Thanks Nvidia, you effing pr!cks, for punishing your paying consumers for your own wrongdoings and crippling their hardware when you got caught cheating!
Cynical_Cyanide@reddit
Surely this would be trivial to benchmark. I had a 970 around this time and I have no memory of everything turning to crap with drivers. Do you have any proof of performance suffering in real practice?
Minimum-Account-1893@reddit
I believe you.
When people want to bash something, they will find a way. Reality is often the opposite of an individually perceived reality, which differs per person. Reality itself... there's only 1 of 'em.
Cynical_Cyanide@reddit
Look - I could have been lucky, or I could simply have been ignorant. I welcome the opportunity to be shown that there was something that screwed me over that I somehow missed. God knows I have no love for Nvidia's corporate interests.
... But honestly, by the time that the VRAM issue for the 970 came to that legal head, it was already as evident as it was going to be (again, as far as I could tell). I welcome the opportunity to be wrong, every surprising datapoint is an opportunity to learn.
UsernameAvaylable@reddit
Are you using an LLM for your tirates? I thought that you seem to have a chip on your shoulder and looked into your comment history and you have so many of those low entropy rants...
DILF_FEET_PICS@reddit
Tirades*
StickiStickman@reddit
Reddit moment
No-Broccoli123@reddit
Touch grass
exomachina@reddit
How is it the end of an era? 1080ti hasn't gotten any specific driver updates in years.
PastaPandaSimon@reddit
It still is. That branch is yet to come, and will likely see new driver releases for about a year until Nvidia jumps to the next branch around late 2026/early 2027. Even at that point, those cards will be left with a mature, functional driver rather than becoming a paperweight.
Hour-Firefighter6665@reddit
For someone who owns a ROG Strix 1080 Ti, bought brand new from PC Case Gear: I'm pretty sure my Australian model just got its last driver update. I'm very impressed with my GPU, a beast back in 2017. R.I.P. 1080 Ti.
NeroClaudius199907@reddit
My maxwell, pascal and volta friends its safe to upgrade now
Zenith251@reddit
Mfg? Does anyone actually use that garbage?
NeroClaudius199907@reddit
It's only garbage until AMD brings their version and more people get to try it. You haven't even tried it yourself lol
W_ender@reddit
4x MFG is really only usable in 120+ FPS situations on 320 Hz+ screens, because below 120 base FPS games start to suffer from added input latency
NeroClaudius199907@reddit
What about 3x? And 2x? What frame rates do they need before activating? And which input latency exactly? I'm sure you tested it
W_ender@reddit
Surely you can find all the testing on the internet, but I guess you will lower your standards as much as possible to be contrarian to my take, or pull some shit like "yeah yeah at 3x there are some graphical artifacts and maybe the game doesn't feel as smooth as native fps and input latency isn't relevant because I'm an old, disabled, blind, non-competitive gamer but woohoo big fps number (the only game tested is Cyberpunk)". It's kind of ridiculous how gullible and easily affected by marketing redditors are
NeroClaudius199907@reddit
Quick question, Do you think dlss is better than native taa, smaa sometimes? or upscaling is always worse than native?
W_ender@reddit
Upscaling AA methods are better than anything else we have on the market; whether it's worse than native or not depends on the TAA implementation. For example, Lies of P is miles better at native than with any upscaler because it barely has any TAA
StickiStickman@reddit
With Reflex + MFG the latency is literally still lower than "native"
Zenith251@reddit
Don't need to. I know what it does. I've seen what it does.
I said no such thing and I'll ask you to not put words in my mouth.
Upscaling is a neat tech, and RT can be a neat tech.
Altixis@reddit
This is where you instructed everyone to stop reading
Zenith251@reddit
Brother, I read reviews, I watch reviews. I know what the tech does.
BloodyLlama@reddit
I use it when I have a consistent 100+ FPS, but I have a 240Hz monitor and a 5090. I had a 5080 before, and it was rare that it could run games fast enough for MFG to be usable.
Zenith251@reddit
Hence why I cast shade at MFG. It's not useful for most 50 series users, in most scenarios. That is unless you're playing games that juuuuust get around that 100-200FPS, but you have a 240hz or better monitor.
nate448@reddit
You can pry my B stock blower 2080ti out of my cold dead hands. Lmk when I can get double the performance for $550
lolatwargaming@reddit
lmao these entitled gamers think people actually give af about them
Altixis@reddit
You draw strange conclusions from comments that you read
Not_Yet_Italian_1990@reddit
If Nvidia could figure out the driver issues, maybe.
reticulate@reddit
No idea why this is marked as controversial.
24H2 broke something fundamental in Nvidia's driver stack and they've spent several months trying to fix it. I've reverted back to 566.36 and won't be budging until they can guarantee they've solved the problem.
BlueGoliath@reddit (OP)
Shh don't mention those or you might get disappeared.
NeroClaudius199907@reddit
and the burning cables as well
Zeroth-unit@reddit
Me a 1070 Ti owner: "That 9060 XT is looking pretty good right now."
StickiStickman@reddit
Looks worse than a 5070 in every way including price.
No-Broccoli123@reddit
Not so smart then
NeroClaudius199907@reddit
Buy it, it costs less than what you paid for the 1070 Ti
https://www.newegg.com/asrock-challenger-rx9060xt-cl-16go-radeon-rx-9060-xt-16gb-graphics-card-double-fans/p/N82E16814930140?Item=N82E16814930140
Thorusss@reddit
Honest question: what does that change? The existing drivers will continue to work, new features are not expected for such old cards anyway. One might not get the newest game specific performance optimizations, but even new games typically work just fine (with exceptions). I mean it is not like the end of Operating System updates, where Internet Browsing becomes more dangerous.
ThrowawayusGenerica@reddit
The main issues are that if there are drivers bugs that cause issues with newer games they won't get fixed (and don't say it'll never be relevant, not all new releases are AAA - I'm happily playing Victoria 3 on my 1080 ti), and eventually Windows is going to drop support for these older driver versions. Neither of these are likely to be immediate major concerns, but it's not nothing.
ShadowRomeo@reddit
The RTX 2080 Ti, despite the massive hate it got at launch, is now the new 1080 Ti of this generation. Despite being 7 years old, it still supports some of the newest features such as the DLSS 4 upscaler, and it is still on par with modern hardware like the PS5 Pro.
Dreamerlax@reddit
It being to able to support DLSS upscaling is its saving grace.
ShadowRomeo@reddit
Yep, Turing despite being 7+ years old still gets official DLSS 4 Upscaler support, whereas AMD Radeon can't even make RDNA 3 or lower support their newer FSR 4.
That is enough proof of how future-proof the RTX Turing architecture is, with its Tensor Cores that were underestimated and laughed at by most of us, including myself, back when they were new.
I am glad to be proven wrong on that point. It definitely changed my view on tech hardware in general: not everything is about rasterization, like most tech reviewers at the time were pushing, except for Digital Foundry of course.
Dreamerlax@reddit
Yep. It's about the performance of a 3060 Ti right? But with a bit more VRAM which definitely helps.
IT_IS_I_THE_GREAT@reddit
Much faster than the 3060 Ti; it's on par with the 3070 but with more VRAM. The 3060 Ti is on par with the 2080 Super
ResponsibleJudge3172@reddit
The gap between the 3060 Ti and 3070 was as small as the gains Blackwell has over Lovelace
lolatwargaming@reddit
Ok and? 2080 Ti trades blows with the 3070, as in it’s faster in some cases
ResponsibleJudge3172@reddit
So it's not "much faster", the gap is around 15% between 3060ti and 3070
Capable-Silver-7436@reddit
heck it still beats a 5060ti 8GB because of the vram
Capable-Silver-7436@reddit
Yeah, the 2080 Ti is probably the best card I've ever bought, even having also owned a 1080 Ti
lolatwargaming@reddit
Imagine all the suckers who bought into RDNA while nvidia was creating all this good tech lmao
Jon_TWR@reddit
2080 Ti was a decent card, but with a bad price and not a big raster performance bump over the 1080 Ti…however, it did age really well, like the 1080 Ti before it.
Hellknightx@reddit
I just finally replaced my EVGA 2080 Ti last month after it started failing. Felt bad having to get a non-EVGA card.
Jon_TWR@reddit
I replaced my EVGA 2080 Ti for a 4080 Super in like November, not quiiiite as much performance as a 5080, but close, and not only was it cheaper, it came with Indiana Jones and the Great Circle!
f3n2x@reddit
The 1080 Ti did not age well at all. At launch it dominated the 2080, but later this advantage swung the other direction, and for several years now the 2080 has been far ahead, even before considering the advantages of DLSS. Pascal doesn't do well in games that aren't specifically optimized for it.
lolatwargaming@reddit
Pascal had shit support for HDR, which would cost about 10% in performance. Turing could do HDR natively with very little cost. Turing is of the modern era, pascal is obsolete
Stewge@reddit
The 2080Ti was 30% faster in raster which is decent.
The problem was the pricing. It was the start of the "pay more to get more" generation. So you got 30% more raster performance, but for 30% more money. Previously gamers would be treated to 30% more performance for 0% more money. If you rode the mid-range models since the Geforce 2 MX, you could almost expect performance doubling every 2 years for the same money!
I do think the 2080ti will age better now though, if only because DLSS allows it to push much further into the future than the 10 series. Using a lower DLSS scale seems far more tolerable from an image quality standpoint, than more traditional methods of dropping resolution and settings.
Alive_Worth_2032@reddit
And it was a really good OC card if you raised the power limit through mods/BIOS. Nvidia held that thing back a lot versus the rest of the Turing stack. It would have needed like 50-75W extra at stock to have the same power per area of silicon as the 2080, for example.
lolatwargaming@reddit
Yeah my bios flashed 2080 Ti basically halved the performance delta between it and a 3080, so a 3080 was like 10% faster after my oc’d bios flashed 2080 Ti
ResponsibleJudge3172@reddit
Before RTX 30 launched, the 2080 Ti had already shown clear signs of widening the gap, with a 45-50% lead over the 1080 Ti.
ShadowRomeo@reddit
It's a product with advanced, future-facing features that were useless at the time it was released; they only shined once the RTX 30 series Ampere launched, and it aged gracefully throughout the RTX 40 series Lovelace era.
fixminer@reddit
Sure, but the same is true for the 2080 which was $300 cheaper. And if you waited a year the 2080 super was even better for the same price. Not to mention if you waited for the 3080. You paid for features that weren't usable until better value cards launched.
Gambler_720@reddit
The 2080 has not aged nearly as well due to lower VRAM. There was a very limited time window in which anyone was able to buy a 3080 at a normal price. So in hindsight the 2080 Ti was indeed a good buy thanks in large part to the crypto mining era.
fixminer@reddit
If you saved the $300 difference and sold the 2080ti for maybe $200 you would have been able to buy a used 3080 by the time 8GB became a notable limitation at 1440p. Paying 43% more for 15% more performance and 3GB extra VRAM wasn't a good deal.
Gambler_720@reddit
Buying a used GPU isn't a good argument to make especially in this case since so many 3080s in the used market were heavily used for mining. People who buy used pretty much always buy used so from that context buying a new 2080 wouldn't be on the menu to begin with.
Also it isn't exactly a good idea to upgrade to a 10GB card from 8GB if VRAM was the main reason for the upgrade. Going the 2080 route you won't even be able to upgrade to the 4070 without exceeding $1000 total. You would have had to wait for the 4060 Ti 16GB and that would only get you into a marginally better situation than the 2080 Ti.
KARMAAACS@reddit
Yeah when you put it like you have, honestly, it was a good buy in hindsight. Good amount of VRAM, good features later on and decent performance in terms of age. However, it is starting to seem pretty dated, the 3070 and cards around that performance are starting to age now, what used to be Ultra/High settings is more like Medium now and soon low will be needed for 60 FPS at 1440p/1080p depending on the game.
ShadowRomeo@reddit
At 1440p maybe, but the majority of gamers play at 1080p, and the 2080 Ti is definitely still very competitive at that resolution. It also matches the performance of the PS5 Pro, the most powerful console released, and its DLSS 4 upscaler support will keep it relevant going forward, at least until the release of the next-gen consoles.
NeroClaudius199907@reddit
It even has access to frame generation through FSR 3.1, coupled with DLSS. It's golden
shugthedug3@reddit
I bought a (used) Titan RTX which is basically a 2080 Ti with more VRAM and it's still enough for me.
You can see why these cards were so expensive, the die is massive.
Alive_Worth_2032@reddit
Turing also has the benefit of some products in the stack being sold for a LONG time after most of the stack went EOL.
They kept making and selling 16xx and 2060s for god knows how long into the pandemic.
Capable-Silver-7436@reddit
even after the pandemic right?
Alive_Worth_2032@reddit
I think that was mostly just leftover stock. They stopped production for the 2060 in late 2022 iirc, the lower end SKUs might have continued a bit longer.
chefchef97@reddit
The 2080Ti can never be the inheritor of the 1080ti's crown
It doesn't matter how performant it still is today when its price was so ludicrous you could've upgraded twice in that time and ended up ahead today
Gambler_720@reddit
That would be a valid point normally. But this timeline was not normal. There was a very limited window to upgrade to the 3080 and after that you had to wait another 18 months to be able to buy a GPU at a normal price.
Capable-Silver-7436@reddit
and the 3080 still had the tiny vram issue
Gambler_720@reddit
Yup. With more VRAM 3080 could have legit been the GOAT.
Capable-Silver-7436@reddit
It had a rare 20GB variant even... if that was the base it would have been fucking perfect
Gambler_720@reddit
Yup and it didn't need to be $700, an $800 20GB base variant would have been killer but then nobody would buy the 3090 lol
Keulapaska@reddit
If there was 20GB variant it would've cost 1200+ msrp.
Capable-Silver-7436@reddit
Heck, in RT it's still a little ahead of the PS5 Pro. I think in raster too, but I'm not 100% sure about that. And yeah, it supports better upscaling than the Pro. It's still a very usable card, and better than the 3070/Ti, the 4060/5060, and even the 8GB version of the 5060 Ti because of its acceptable amount of VRAM
LuluButt3rs@reddit
RTX Fine Wine technology
SkylessRocket@reddit
The 2080 Ti has aged better than the 1080 Ti because of DLSS alone.
Sevastous-of-Caria@reddit
The big selling point might seem to be DLSS at first, but the actual kicker is DX12 Ultimate support
chr0n0phage@reddit
These types of posts and the comments make me think that people believe the cards will now just stop working.
ResponsibleJudge3172@reddit
They think game ready drivers are game support drivers, as in, the game won't launch due to drivers
Azuretare@reddit
I'm worried about the security implications of ending support
zyck_titan@reddit
Kepler got security updates as recently as last year.
They stopped driver support for Kepler in 2021.
Security driver updates and game ready driver updates are different.
Azuretare@reddit
That's really good then, I didn't know that when I wrote this
exomachina@reddit
Why? There have been no public exploits targeting GPUs that I can find. Every CVE patch from the last 5 years has been from proof of concepts submitted by security labs. If something exists it hasn't been publicly disclosed and isn't patched by any driver.
lolatwargaming@reddit
Don’t use ancient hardware?
exomachina@reddit
A lot of people still fail to understand that if a new driver doesn't specifically address an issue you're having with a specific game, you have 0 reason to update the driver other than to get rid of an outdated driver warning. In the last 5 years there have only been 2 fixes for Pascal desktop GPUs. RDR2 crash in 2020 and a Gears of War 4 crash in April this year caused by a regression from the previous driver. If your GPU is more than 2 years old, chances are updating drivers will do more harm than good.
StumptownRetro@reddit
My GTX 1080 is crying.
GenZia@reddit
Maxwell OG, a.k.a. the 750 Ti, had a hell of a good run, I should say.
That's 11 years worth of driver updates!
exomachina@reddit
The 750 Ti was a garbage GPU that was overhyped on Reddit because it was considered a "console killer" at ~$150 when paired with a ~$100 G3258 CPU. All these console-killer builds were being hyped up and recommended when you could go on eBay or Craigslist, find a used office tower with a 3770K and 16GB of RAM for like $150, and just throw a 280X or 760 Ti in there for the same price as a console-killer build.
Zenodeon@reddit
Damn, I just upgraded from a 1080 to a 3090 last week. This is gonna increase the already-high used newer-gen GPU prices.
randomkidlol@reddit
Maxwell and Pascal, sure, but Volta is cutting it a bit short, especially considering there were no consumer Volta cards. The V100, Quadro GV100, and Titan V were top-of-the-line workstation and datacentre cards at the time.
Ninja_Weedle@reddit
Well, hopefully the final 58x.xx drivers will be good at least
ThisAccountIsStolen@reddit
Feels like I'm running an ARC A770 over here with all the driver bugs. They fix one thing and just break three more. Currently YouTube flashes randomly when in full screen (and it's not the ambient mode bug, I've had that disabled for ages) and different games that didn't stutter on 572.70 or 572.83 now do, and games that stuttered on those don't on the newest.
Nvidia hasn't dropped the ball this badly on driver stability since before the release of any of the three architectures about to be deprecated. But yeah, if removing these older archs helps them fix this mess, I'm fine with it.
zerinho6@reddit
I don't think that YouTube thing is an Nvidia issue. I have an RX 6600 and YouTube just feels like it's drugged sometimes. I have one particular bug where the "shining background" of a video will sometimes turn fully white and cover 70% of the screen, one where the player just turns full black while the audio keeps playing, and another where I click on a video, hear it playing in the background, but I'm still on the home page.
ThisAccountIsStolen@reddit
It began only with the driver update, and went away when I reverted to 572.83, so this one is definitely on the driver.
AMD GPUs have a bunch of issues with YouTube's implementation of HW acceleration, though, and that's not new. Most of them can usually be solved by disabling MPO in the registry, but some do persist regardless. My 6800XT system in the living room has had these issues too.
SpaceCadetEdelman@reddit
I thought they already did this a few years ago? They announced something like this after QuadroRTX was released..?
lusuroculadestec@reddit
Nvidia still supports the Maxwell-based Quadro cards and newer on the latest driver.
evangelism2@reddit
Gonna see a lot of anger over this, but the 10 series is over 8 years old now. Time to upgrade, dog.
ShadowRomeo@reddit
Nvidia's 10 series Pascal launched in May 2016; those cards are over 9 years old now and will be a whole decade old in May 2026.
evangelism2@reddit
I was going based on the 1080ti's release date, which is considered a GOAT, but a dying one. But true, and it backs my point even further.
Fatal_Neurology@reddit
In a vacuum, sure. But it's been nothing but raw deals on raw deals in the GPU marketplace for four years now. I'm not going to expect anyone to engage with this market until we're in a bit better place than the current moment.
evangelism2@reddit
You can get a brand new 5060 Ti that can play new triple-A games at 60 FPS for $370.
The state of the GPU market is not ideal, but it's not as bad as YouTube would lead you to believe.
Kezika@reddit
That, and there's also running lower-power cards as a second GPU if you're driving 4+ monitors, since you can still only have a max of 4 monitors per GPU even if it has more than 4 physical connectors. Then there's stuff that can benefit from offloading work to a second GPU, like VTubing.
I use my older GTX 980 Ti for both of those things, and it still works wonderfully for those applications, with a 2070 Super as my primary.
Strazdas1@reddit
It's not like those cards will cease to work; they just won't get any updates anymore.
Helpdesk_Guy@reddit
There's a pretty easy solution for all of this: stop rewarding CRIMINAL gangster companies like Nvidia with purchases!
Oh wait, never mind … I forgot about the average Nvidia user! That's the one who practically begs to be bent over by Nvidia, obviously loves to take it hard, and is even deeply grateful while throwing Jensen hundreds of dollars for incremental generational performance increases.
Something, something self-restraint. You literally get what you pay for …
Fatal_Neurology@reddit
This is kind of a stupid argument, because unless you got in on the week-1 release of the 9070 XT, that card has been priced near or at 150% of its MSRP at US Micro Centers for the vast majority of the time since launch. $850 to $900 for a "$600" card isn't competitive or reasonable. And nothing from Intel is at xx80-tier performance.
Give us a real fucking alternative to Nvidia if you want us to stop buying from them. And when we did have an alternative, with a bunch of $600 9070 XTs on the shelf? Guess what, we bought the fuck out of them. So just ease off on this.
JamesEdward34@reddit
There are still about 3 1/2 years of possible tariffs and conflicts due to the current US admin. Waiting rn seems like a bad idea
Capable-Silver-7436@reddit
9+ year old cards, I'm not surprised. Heck, by 2027 or so the 2000 series will probably be dropped too
Strazdas1@reddit
I feel like I read this news a couple months ago already?
hackenclaw@reddit
Why didn't they end Maxwell first, followed by Pascal/Volta 2 years later?
GarbageFeline@reddit
Probably because Pascal was a close evolution of Maxwell so it's likely that Maxwell was still supported only because Pascal was still supported.
BlueGoliath@reddit (OP)
No that isn't true.
GarbageFeline@reddit
From the wikipedia page:
While I'm not sure how much that influences the development of drivers and support, these architectures are usually evolutions of each other so it's possible that some parts of the driver apply equally to different generations at the same time.
I never claimed this was true or false, just that it could be likely.
Kepler_L2@reddit
Pascal was just a die shrink of Maxwell
BlueGoliath@reddit (OP)
Not remotely true.
BlueGoliath@reddit (OP)
All Nvidia GPUs since Turing have a dedicated "GPU System Processor". Nvidia has been wanting to align all their hardware to use it for a while. See:
https://github.com/NVIDIA/open-gpu-kernel-modules/issues/19
FlygonBreloom@reddit
Damn. It's going to royally suck if my GTX 960 and GTX 1060 get hit by security issues.
diceman2037@reddit
security updates will continue on the 580 branch
FlygonBreloom@reddit
I darn well hope so hahaha.
Sopel97@reddit
paranoid
Confidentium@reddit
Man, that sucks. 😕 Those cards have been awesome for budget builds. But I know they're old: Pascal is 9 years old, and Maxwell is 11. That's a long time in terms of free software support!
cyberloner@reddit
Keep supporting old GPUs and new GPUs aren't gonna sell... xD
mduell@reddit
This is for Unix; I see no indication about Windows.
az226@reddit
They also said CUDA 12.9 will be the last version to support Volta.
BlueGoliath@reddit (OP)
Both are roughly based on the same driver branches and sometimes get updates on the same day. It applies to both, I promise you.
BlueGoliath@reddit (OP)
https://videocardz.com/newz/nvidias-next-major-gpu-driver-branch-to-drop-support-for-geforce-gtx-700-900-and-10-series
Slimeballs at videocardz not giving me credit.
Glass_Strain_2453@reddit
It's a shame they are deprecating these architectures all at once but I can't really complain with how many years it has been.
What I can complain about is the current state of their drivers. Hopefully they'll improve drastically now that Nvidia has fewer GPUs to support.
mysticode@reddit
Dang I guess I chose a really good time to replace my 1070ti last week!
kimo71@reddit
With a classic GPU, Nvidia should show some respect and just keep it going as long as it's still okay at 60 FPS 1080p. Nvidia, chasing that fucking dollar.