Report claims Nvidia will not be releasing any new RTX gaming GPUs in 2026, RTX 60 series likely debuting in 2028
Posted by Forsaken_Arm5698@reddit | hardware | 540 comments
Saisinko@reddit
In one sense I get mad,
In another I realize I'm playing Rimworld and Terraria.
uicheeck@reddit
Current-gen GPUs are incredibly powerful machines. If anything, it's actually better for gamers if studios keep system requirements at the same level for a couple of years. What I'm afraid of is that they'll stop producing gaming GPUs at all.
letsgoiowa@reddit
They won't and they simply don't care. See: Monster Hunter, Outer Worlds 2, Borderlands 4. "Just upgrade your computer!" they say.
The target is a low internal rendering resolution and 30 real FPS. Just AI upscale the rest! Bleh
hackenclaw@reddit
Just don't buy lousy games that have super high requirements while having average graphics quality.
silbervogei@reddit
Right, that's a nice sentiment, but take something like GTA V. I can't even remember what GPU I had at the time, I think it was probably a GTX 580, and I couldn't believe how badly it ran, so much so that I bought a GTX 970, which actually wasn't much better.
letsgoiowa@reddit
Sure but it becomes a trend over time and eventually there's going to be something you'll be sad to miss out on
InformalBullfrog11@reddit
There have been plenty of good games launched in the last 15 years. I'm sure no one has played all of them.
Strazdas1@reddit
So they already included the optimization? You do realize that most optimization happens through internal effects being rendered at lower resolution and upscaled?
MrVibeThemes@reddit
How? The pricing is absurd. Forget the MSRP, we're paying multiples of MSRP now.
ldn-ldn@reddit
They won't. They have too many wafers and not enough 3rd party components for server devices. There's only so much HBM memory available.
hackenclaw@reddit
Yep, a single datacenter GPU using GDDR7 probably needs 4x more RAM.
HBM takes another 3x on top of that, assuming HBM takes 3x the wafer area vs GDDR7 for the same capacity.
Right now we are facing a VRAM shortage; the memory for all those extra GPUs has to come from somewhere.
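A rough back-of-the-envelope version of that math; all of the multipliers below are the assumptions from the comment above, not measured figures:

```python
# Rough wafer-area math behind the comment above.
# Every multiplier here is an assumption from the thread, not a measured figure.

gaming_vram_gb = 16          # typical current gaming card (assumption)
dc_capacity_multiplier = 4   # "datacenter GPU probably needs 4x more RAM"
hbm_area_per_gb = 3          # "HBM takes 3x wafer vs GDDR7 for the same capacity"

gaming_wafer_units = gaming_vram_gb * 1  # GDDR7 baseline: 1 area unit per GB
dc_wafer_units = gaming_vram_gb * dc_capacity_multiplier * hbm_area_per_gb

print(f"Gaming card memory wafer area: {gaming_wafer_units} units")
print(f"Datacenter card memory wafer area: {dc_wafer_units} units "
      f"(~{dc_wafer_units // gaming_wafer_units}x a gaming card)")
```

Under those assumptions, one datacenter card consumes roughly 12x the memory wafer area of one gaming card, which is where the shortage pressure comes from.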
Ragnogrimmus@reddit
Not only that, but the current offerings can handle everything, especially with DLSS and FSR super sampling or super resolution. Unless you want 8K at 100 fps.
MDCCCLV@reddit
How hard would it be to make it user upgradable with more vram?
Strazdas1@reddit
extremely hard.
MDCCCLV@reddit
This is an alarming comment to get suddenly in your inbox. Do you frequent the Century Club?
Strazdas1@reddit
I will have to disappoint you - i do not.
dabocx@reddit
Keeping the speed high gets harder due to worse signaling. It's the same reason Apple uses built-in memory, and so does Strix Halo. It's easier to keep faster speeds stable.
ElectronicStretch277@reddit
I believe Strix Halo uses soldered memory, not on-die. That also helps stability, but not to the extent Apple's implementation does (where they do put the memory on the package).
Triedfindingname@reddit
There was a GPU that was supposed to be released any time now with upgradeable VRAM, called Bolt.
Anyway, they talked a good game about shaking up the market. Now it just sounds like they'll be all AI cards.
Who the hell knows anymore.
Kiwi_CunderThunt@reddit
Physically it's not difficult, but on paper it's a problem, due mainly to signal integrity.
GDDR operates at a far higher frequency than the DDR your system uses, and it needs that speed for parallel processing and swapping large amounts of data at a time.
monocasa@reddit
GDDR doesn't really have a spec that allows it to be on DIMMs. Something like CAMM could probably be developed, but that'd take longer than the AI bubble is likely to last.
Strazdas1@reddit
Who upvoted this nonsense? Stagnation is the worst thing that can happen to hardware. Keeping requirements the same for years is a horrible thing.
uicheeck@reddit
And it's a bad thing because... what? Consoles keep the same hardware for years and nothing bad has happened yet.
Strazdas1@reddit
It's worse because we lose any attempts at progress and pioneering, embrace conformism and laziness, and all of this results in bad products.
Consoles have been damaging the gaming industry for over two decades now.
MrMPFR@reddit
I thought it was the absurd game development cost explosion.
Impossible to justify keeping new games locked to the latest and greatest when they cost hundreds of millions. Can't recoup that cost without maximizing TAM.
Hope this changes in the future but it doesn't look like it.
Strazdas1@reddit
The costs would decrease with adoption of new technology though. RT is a great example of this, where it can save months of development time (and thus costs) over traditional lighting.
MrMPFR@reddit
RT adoption did nothing to stop the AAA cost explosion; something more drastic is needed, and it doesn't look like it'll happen anytime soon considering the strong anti-AI stances we hear from big AAA right now. Hope that changes when the tech matures.
Perhaps AA will be the ones that push tech forward in the future.
Regardless, all the low-level APIs need to die. They're outdated and a nightmare to work with. Fingers crossed we see a new API and not just DX12 and Vulkan in perpetuity.
Strazdas1@reddit
We already had AA games with AI voices in them. They can posture all they want; the moment they think it's going to be cheaper to use AI, they will use it.
The APIs are not going to die, if only because you'll want your GPUs to be backwards compatible with 30 years of gaming history.
MrMPFR@reddit
They don't need that baggage for new games. Devs can redesign their engines + mandatory RT is already a pretty significant cutoff.
As soon as it happens everyone is going to jump on board. DX12 and Vulkan are a nightmare to support.
Strazdas1@reddit
Games are slow to adapt. People were still making new DX9 games when DX12 released.
Lol, that's the funniest thing I've read all morning. All engine talent has been fired from dev studios and replaced by people who know how to click a few buttons in the UE user interface. It's "cheaper talent." Game developers could hardly be expected to redesign their engines even back when everyone had their own engine; they're not coming anywhere close to that now.
I mean, yeah. Devs got what they wanted. They wanted more direct access to hardware, they got it, and now they don't know what to do with it because the abstraction in DX11 was easier.
MrMPFR@reddit
Sure. But it's better to start now than never. Realistically though, such an API reset probably isn't happening any earlier than the end of the decade + add another 4-6 years for adoption and familiarization.
Then it's Epic's problem.
It's not just that. There are many idiosyncrasies in those APIs. This is clearly laid out in the blog post + getting rid of those would make it simpler to use.
MrMPFR@reddit
Consoles have been stagnant. In general the entire industry has been extremely stagnant for far too long. Things gotta change at some point.
PS6 will change that, but an extended cross-gen period will kill that momentum, deferring any real progress to the tail end of that generation and the PS7 generation.
Valmar33@reddit
Current-gen GPUs are extremely powerful ~ while current-gen AAA games are so horrifically unoptimized in probably the worst ways they ever can be that you'd never know it.
einmaldrin_alleshin@reddit
Give a programmer a resource, and they'll use 100% of it. There's just no incentive to optimize beyond the point where the software runs on the target spec.
But if the goal is to get the game onto a switch or deck, they'll manage.
Strazdas1@reddit
You'd be surprised how many developers outright refuse to make such ports because they don't want to cut out pieces of their work to make it run.
Valmar33@reddit
In this case, it seems to be the PS5 Pro and a 30fps budget.
They're not paid to care about optimization ~ they're paid just to pump out a game so it can be released as quickly as possible. In which case we can blame management and the publishers giving them absurd deadlines with abysmal budgets.
If these programmers were given an appropriate time frame, resources, a 60fps or 120fps budget, and told to scope the game to X, Y and Z without that ever changing during development, I'm sure they could make something excellent.
Because then the game is scoped for that. Usually, for a port, though, it means cutting so many corners.
Rise ran well on the Switch because it was designed for it from the ground up ~ Wilds, as a port, would be a struggle to run without severe compromises being made, in which case it will just look much uglier than Rise.
Strazdas1@reddit
You wouldn't know optimization if it hit your monitor dead on. Every time gaming optimizes something, gamers cry in unison about it being bad.
YF422@reddit
If the market plateaus for a few years until this resolves, game devs won't have any excuse for not focusing on optimisation, as people are going to be using existing hardware for much longer, bar hardware-failure replacements. Hell, if they're forced to optimise games out of necessity, it's at least one positive outcome in this sea of bullshit.
Also fuck AI, I'm sick of them all trying to shove this useless shit at us while it fucks up the hardware market.
Aaron_Judge_ToothGap@reddit
They won't. Nvidia, AMD, and now Intel know better than to put all their eggs in one basket.
Imagine if the AI craze crumbled overnight. Do you know how much these companies would be scrambling to go back to creating gpus?
kkrko@reddit
Yeah, Nvidia knowing not to put all their eggs into the gaming basket is what allowed them to ride the AI wave.
HughMongusMikeOxlong@reddit
They started the AI wave, they didn't just ride it.
Not a single AI company could exist currently without Nvidia. They did literal magic to enable software engineers to use SIMT/SIMD for vector math. GPUs were not general purpose, and not many people could write their own shaders.
DougChristiansen@reddit
More than just gamers use these cards. Some of us work and play on them; I was really looking forward to the 5080 Super's 24GB of VRAM for work in Unreal Engine. I'm considering the older 7900 XTX now, as 16GB cards max out fast in scene development.
uicheeck@reddit
Do you think the 7900 XTX will be much better than the 9070 XT, btw? I'm kinda in the market for a GPU for Unreal too, and there's a €200 difference between them. I don't know whether the extra VRAM makes enough of a difference to overpay that much for an older card.
DougChristiansen@reddit
For the 24GB of VRAM, yes, I think it may well be better. I can hit the wall with 16GB on larger scenes pretty fast. For my specific use case VRAM is king. I'm even considering the B70 Pro as it will launch with 32GB. A 16GB card is fine for many/most things, but I think more VRAM will be helpful.
I was really, really looking forward to a 24GB 5080 Super; that would have been a really sweet card for hobbyists/small creators. AMD should have really considered dropping a 24-32GB 9090 XTX card for us. They overthink the GPU wars and don't realize they can compete on value too.
uicheeck@reddit
Yeah, agreed, the absence of anything more powerful than the 9070 XT was a shock to me. They just ceded everything to Nvidia, while their top card was not that much slower at a third of the price.
jhenryscott@reddit
It’ll help Stop these heathens from “upgrading” from their already good hardware tbh
Rude-Wheel470@reddit
I upgraded from my 5090 FE to an ASUS 5090 Matrix. Well worth the upgrade.
jhenryscott@reddit
You really ought to look into ditching the Matrix, as it's now outdated, and going with something like an MSI Lightning Z.
BabySnipes@reddit
Why even bother with low-end cards. Get the Pro 6000.
kkrko@reddit
Drivers?
ElectronicStretch277@reddit
It's 11% faster even without drivers. It's the same hardware, just more of it so it'll run the games.
ComplexEntertainer13@reddit
And that's with a relatively bad cooler and low power target (considering it has more compute units). Imagine that thing with 800W+ power and a water block!
Liatin11@reddit
It's my money and I'll buy a god damn RTX 69420 if I wanna!
r3volts@reddit
Current?
My 2080ti handles everything I play at 1440p no worries.
Obviously I would always appreciate some more power, but is it worth the price to upgrade? Absolutely not.
I_Dont_Rage_Quit@reddit
They are powerful, but they are not powerful enough for native 4K ray tracing gaming. Even a 5090 is brought to its knees by native 4K ray tracing in AAA games. If you are really a true PC enthusiast, you need that shit lol
BigBananaBerries@reddit
They could maybe get back to optimising games again, like the old days.
996forever@reddit
The old days when GPUs would be obsolete in 2-3 years, and when the highest settings were really intended for future hardware?
BigBananaBerries@reddit
That was much longer ago than the 5/10 years back when game optimisation was still a priority with publishers. I was going to say devs, but it's the publishers pushing the nonsense we see these days.
996forever@reddit
It’s easy to forget now, but mid to late 2010s was all about complaining about “poor console ports”. AC Unity was a famous poster child. Witcher 3 at launch with hairworks. The insane discourse when RTRT and DLSS first came out in 2018.
BigBananaBerries@reddit
I remember it but most ports weren't that bad. You'd maybe have weird things like button prompts still being X/Y or whatever but for the most part performance wasn't bad. Granted, there were some like Arkham Knight that were a complete mess but it wasn't the majority.
TripleS82@reddit
DLSS makes things doable for a while. Considering you need a $1,000 GPU to do native 4K in high-end AAA games anyway, I'm sure most will settle for upscaling.
gandalfmarston@reddit
But why do you need a gaming GPU if the AAA industry is shit now?
IguassuIronman@reddit
Even if you don't need it, it's hard to avoid the FOMO. What if you do need it in a couple of years and can't get anything? Not to mention the fact that you're probably not going to get much more for your dollar, based on the last couple of generations.
I used my 3770k/1070 until 2022. I wonder if I'll also get a decade out of my 12700/3070ti...
vandreulv@reddit
What FOMO? There's always going to be something better in the future. Enjoy what you have now for a little bit longer. That "what if..." is always there regardless of the circumstances. The real question is... would you even notice the difference if you weren't watching benchmark results or that little FPS counter in the corner?
Puzzleheaded_Race539@reddit
I'm still on a 980 and can't play some of the latest games as they don't even get to the menu screen. So that is kinda noticeable lol.
InsertCookiesHere@reddit
I would bet on the 12700K aging very well. Perhaps not as well as the 3770K, as there were a lot of years in there where Intel was barely moving and I wouldn't count on that happening again, but I imagine you'll get quite a few strong years.
The 3070 Ti though.... Pascal was legendary; the 3070 Ti has already been staring at the ugly realities of limited VRAM for a while now. The arrival of the next console generation is probably going to spell the end of its relevance beyond low quality pretty quickly.
Tman1677@reddit
You think the next console generation is going to beef up on VRAM in this market? I honestly think we could see serious delays on the next console generation at all, considering how little progress AMD has made since RDNA2. When we do get a new console generation, I could easily see it built around low quantities of extremely high-performance RAM, as the architecture will be built with datacenter HBM in mind, but costs per GB are exorbitant.
That being said, all of this is speculation so who cares, enjoy your GPU. The 3070 is already five years old, it's easily going to last another five at least on medium settings, and probably beyond that on low.
IguassuIronman@reddit
I've also got a DDR4 board which gimps me somewhat. Oh well, all I can do at this point is wait
varateshh@reddit
Depends on what DDR4 you bought. A Ryzen 5xxx with tuned 3800-4000 memory still performs relatively well. The same thing happened with DDR3 - if you had CL 9/11 2133-2400 MHz then there was little point in upgrading until the 8700k arrived. The earlier DDR4 Intels were truly dire.
doodullbop@reddit
I feel like the GDDR6X that Ampere used is gonna be the weak link on that kind of time frame. It was for me anyway, my 3080 became unusable due to vram degradation. Retired it in November, so right around 5 years since purchase. I was never comfortable with the high 90's memory junction temps, but that's supposedly "in spec".
ButtPlugForPM@reddit
Really,
there is NO game that a 5080 isn't maxing out,
and there's likely to be NO game this year that will.
A 5080 can get you well past 160fps at 1440p on almost any title, barring a few pieces of unoptimized shit..
and that's BEFORE you use DLSS.
So there isn't really a NEED for a 6080 right now.
magbarn@reddit
When I'm dropping >$1000 on a GPU I expect great 4K ultra performance with RT @ 120FPS or better. We're still not getting that with a 5080.
Olobnion@reddit
I have a high-resolution VR headset and I'm running regular games in VR using the UEVR mod. Basically, I need a GPU that can run games in 7120x3560 and never dip under 75 Hz. Ideally with path tracing.
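For scale, a rough pixel-throughput comparison; the headset resolution and 75 Hz floor are from the comment above, and the 4K/60 monitor baseline is just an illustrative reference:

```python
# Rough pixel-throughput comparison: the UEVR headset target vs. a 4K/60 monitor.
# Figures are from the comment above plus an assumed 4K/60 reference point.

vr_pixels_per_frame = 7120 * 3560        # ~25.3 million pixels
vr_pixels_per_second = vr_pixels_per_frame * 75

monitor_pixels_per_frame = 3840 * 2160   # ~8.3 million pixels
monitor_pixels_per_second = monitor_pixels_per_frame * 60

print(f"VR target: {vr_pixels_per_second / 1e9:.2f} Gpix/s")
print(f"4K60 monitor: {monitor_pixels_per_second / 1e9:.2f} Gpix/s")
print(f"Ratio: {vr_pixels_per_second / monitor_pixels_per_second:.1f}x")
```

That works out to roughly 3.8x the pixel throughput of 4K/60, before any path tracing, which is why even a 5090 struggles with this use case.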
ButtPlugForPM@reddit
Yeah VR makes up 2.1 percent of the gaming sector...
that's literally a niche beyond niche market.
TenshiBR@reddit
Playing Total War: Warhammer 2 here, on a 5090. To my surprise the game makes the FPS tank to 30 or even lower sometimes on the campaign map. In battles, I can't use MSAA 2x, 4x, or 8x, only FXAA, and FPS goes down to 90 in some city battles. Truth be told, this was already the case with my previous 3080 Ti.
Before this I was playing Endless Legend 2...
Although I keep jumping between other games as well, which are more GPU-hungry, your point still stands: there are many good games which aren't demanding at all.
magbarn@reddit
Geez, do you have a Bulldozer paired with your 5090? Sounds like a StarCraft 2 type of game that only uses 1-2 cores.
zeronic@reddit
The only reason I'm interested in this new generation is potentially more VRAM for local LLMs (more than 32GB, maybe 48?), and the 6090 being a massive meme number so I can't possibly not buy it.
I'm not holding my breath for more VRAM than the current 5090 though. 32GB is already more than 99% of people need, and I'm sure Nvidia would prefer to artificially segment the market at a fairly low number so people aren't buying consumer cards for that purpose.
ComplexEntertainer13@reddit
The way things are going, I wouldn't be surprised if the 6090 is something like 34GB at most. They probably won't do a 512-bit bus early on the next node, and GDDR7 has enough headroom that they can deliver more bandwidth than the 5090 despite just 384-bit.
Which was the main reason they went with 512-bit on the 5090, since the 4090 is rather bandwidth-starved as it is. There are some titles at high resolutions where its performance advantage over the 3090 is a lot less than you'd expect as a result.
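The bandwidth math behind that: bandwidth scales with bus width times per-pin data rate, so a narrower bus can still come out ahead if the memory is clocked high enough. A quick sketch; the 512-bit / 28 Gbps figures match the 5090 as shipped, while the 40 Gbps 384-bit configuration is purely hypothetical:

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps -> GB/s.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_5090 = bandwidth_gbs(512, 28)          # shipping 5090 config: ~1792 GB/s
hypothetical_384 = bandwidth_gbs(384, 40)  # hypothetical faster GDDR7 bin: ~1920 GB/s

print(f"512-bit @ 28 Gbps: {rtx_5090:.0f} GB/s")
print(f"384-bit @ 40 Gbps: {hypothetical_384:.0f} GB/s")
```

So a 384-bit card only needs GDDR7 running above roughly 37 Gbps to match the 5090's bandwidth.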
Osgiliath@reddit
I bought a 5090 and I find myself just playing project zomboid with Super Nintendo level graphics
thunk_stuff@reddit
This is why I bought an AMD 395+ system with iGPU and am selling my circa 2021 desktop rig. It's plenty good enough for the gaming I do, much smaller PC, and it sips power.
f3n2x@reddit
I've played like 40h of OpenTTD in the last few weeks on my 4090, haha.
But to be fair I also play AAA regularly.
trewbarton@reddit
such an absolute mood.
Zeroth-unit@reddit
Me and my rust bucket of a 1070 Ti still going strong playing Factorio and occasionally Rimworld. If anything I've sunk more hours into those 2 games than the entire rest of my Steam library combined and they don't even utilize double-digit percentages of the GPU.
JonWood007@reddit
6650 XT here. Playing new games I feel like my biggest bottleneck is VRAM. And given they WERE charging $350-400 for 16 GB BEFORE this crapshow and now even the 8 GB GPUs are hitting that price range, I'm fine standing pat. The longer they continue optimizing games for 8 GB, the better, that's the big bottleneck of my current card's longevity as I see it.
traveleon@reddit
I have my 5080 and I’m still playing other games like Zelda and Pokemon Legends Arceus lol.
kog@reddit
Yes, currently playing WoW classic and Terraria, haha
Spirited-Ad5093@reddit
It's sad to see that aside from MFG, the 5000 generation has nothing going for it..... my 4090 can hold on for a long while yet.
LoudBoulder@reddit
I've realized I don't care anymore. There is no way I'll pay $1000+ for a gpu anyway so all these new cards and releases and benchmarks are just fluff that doesn't concern me anymore.
Building computers used to be fun
CrzyJek@reddit
Just do what I did and trade PC building for 3D printing. I swear it's cheaper 😭
onlyforsellingthisPC@reddit
For now. Have you seen the prices on Pis?
That said, Klipper/Mainsail/webcam still works just fine on a Pi 3, but even those are going up in price.
littlefrank@reddit
You don't need a Pi for 3D printing (although OctoPrint rocks).
onlyforsellingthisPC@reddit
Fair enough, there are expansions to popular boards like the SKR Mini that offer (similar) functionality to a Pi; at this point, do that.
I'm on Mainsail/Klipper but I also only run a single printer (an Ender 3 V2 of Theseus).
I like the UI / find it easier to manage on Octo, but I can't track down the gremlins that grind everything to a halt when I remote into my home network.
goobdaddi@reddit
Any hobby costs a lot of money. The real issue is people making a reasonable choice of what they need to buy versus want to buy. Sounds like you know your limit, which is a good thing.
SnavsMatiq@reddit
Yes, any hobby needs money, but it would be disingenuous to act like PCs have always been this expensive. It's a cheap excuse for an expensive issue.
Phioltes@reddit
I splurged on a 4090. I'd always wanted a top-of-the-line machine since I started building computers back in middle school, when I built my first AMD Athlon 64 system. Got my first big girl paycheck after finishing residency and built a beast. It's probably going to be my last.
SubstantialPoet8468@reddit
Dude i got a 6900 xtx for $600 in 2021
Krigen89@reddit
It's still fun. Just unaffordable.
mycall@reddit
Just get someone else to buy it for you 🤠
Beautiful_Ninja@reddit
My 5090 FE MSRP purchase is going to go down as legendary value if I'm getting 4 years out of it.
bwat47@reddit
hanging on to my 5800x3d/4080 for dear life
plantsandramen@reddit
5800x3d and 9070xt here, and it's the best CPU I've ever owned. Legendary up there with the 2600k, but I think the 5800x3d could last even longer and be relevant due to GPU market stagnating for consumers, and RAM being insanely expensive now.
Own_Bet5189@reddit
Yeah. 5800x3D and 4080S here.
My 5900X wouldn't play Starfield, so I got a 5800x3D for cheap after Zen 4 dropped. I'm so glad I did, and got 32 GB RAM as well. I'm aiming for AM6.
itsabearcannon@reddit
Gonna have to call BS on that one since Starfield’s minimum PC CPU requirement is a Ryzen 5 2600X and the “recommended” is a 3600X.
The 5900X absolutely can play Starfield. If it wouldn’t on your system, that was a you issue, not the 5900X.
No_Weight5486@reddit
Honestly, the only explanation I can think of for that guy is that his 4080 Super is getting bottlenecked by DDR4 (aside from the game maybe being poorly optimized). Look, I have a 5070 Ti ASUS TUF OC and out of the box it performs about 10% better than the other models (350W BIOS), so we're basically above a 5080. I'm running it with a 14600K and DDR5, which already beats the 5800X3D at stock… and keep in mind I still have room to upgrade the CPU later and push performance even higher. I haven't even changed CPU despite the years, and it's still a fantastic chip; plus, overclocked to 5.8 GHz, it's really tempting. Let's see what happens when the new sockets come out. (Maybe I get a new CPU? :O)
Own_Bet5189@reddit
I couldn't get Starfield to run. It just crashed every time. I tried all the things and gave up and played other games. Unrelated to this, and because I had extra money and I only game, I got a 5800x3D. I used the 5900x to build a rig for my daughter. I did NOT buy it to play Starfield.
I eventually tried Starfield and found it ran. This had nothing to do with FPS benchmarks and only with game stability. The 5900x was unplayable and the 5800x3D has had 0 issues.
Pocket_RPG@reddit
Your 5800X3D deserves a better home, send it to me to pair with my 5070 Ti and it'll feel more special 🥹 ngl I'm envious. I missed AMD's last drop of them on Amazon for $350. I was just about to buy one and it sold out.
GenZia@reddit
I got my 5700X3D for $200 and paired it up with dirt cheap 2x16GB 3200 CL16 that cost me $70.
Turns out to be one of the best decisions of my life.
PlattypusRex@reddit
Me too, and I did it on a complete whim just because it was such a good deal. Had a 3800X before that.
Pocket_RPG@reddit
Yup. Trying to do that nowadays is about the same price as a new low end pc
TriumphantPWN@reddit
Ouch 500 on ebay
Pocket_RPG@reddit
Try like 1,200 on retail websites too.
TriumphantPWN@reddit
I got mine on Amazon July 1st 2023 for 290, jeez
Pocket_RPG@reddit
Wish I would've gotten one then, but I didn't have plans to upgrade at that time :/ I hope AMD drops a fresh batch again now that prices are surging for most PC parts. I planned on going AM5 this year but at this point it's not even worth it.
TriumphantPWN@reddit
I upgraded because my 3600 and 2070s weren't cutting it for starfield 1440p144 lol, now I'm trapped for the foreseeable future with these specs
Pocket_RPG@reddit
My first build was an R5 5600X with a 12GB ASUS 3060 OC and 16GB of DDR4 with a super low clock. I recently upgraded to a 5070 Ti and 32GB of DDR4 CL36 before prices went stupid, and now I'm in the same boat, because no more AM4 X3D CPUs will hit the market. :/
TriumphantPWN@reddit
5700x3d is around 350 on eBay right now, that might be your best bet. Can't imagine why the price would ever go down though
Pocket_RPG@reddit
I don't trust used PC parts, let alone from eBay. I'd rather not get let down, even if it's such a minute chance. My setup rn isn't terrible, but my R5 5600X is a major choke point. I wanted X3D, but at this point I'll probably skip further AM4 upgrades and piecemeal together an AM5 build to eventually replace my current 2020 setup (minus the 5070 Ti, had to get that while it was at MSRP).
Jon_TWR@reddit
Me and my 5700x3d/4080 Super/32gb DDR4 sweating…
TriumphantPWN@reddit
I have the same combo here, upgraded from 3600/2070s in 2023
Spir0rion@reddit
pays 2k for a gpu
calls 4 years out of it a win
sonnytron@reddit
People who buy $2000 GPUs aren't people who upgrade every 4 years. An x090 buyer is someone who resells and shifts up to the next x090 GPU. I promise you a lot of people with a 5090 upgraded from a 4090, which itself was an upgrade from a 3090. So basically they sold their 3090 for close to $1000 and paid about $500 out of pocket for a 4090, then sold their 4090 for close to $1500 and paid about $500 out of pocket for a 5090. They actually get x090 performance for two to three years for $400-500.
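A minimal sketch of that sell-and-upgrade ledger; the purchase prices are rounded launch MSRPs and the resale figures are the commenter's rough numbers, not market data:

```python
# Net out-of-pocket cost of the sell-and-upgrade pattern described above.
# Purchase prices are rounded launch MSRPs; resale values are the commenter's rough figures.
upgrades = [
    {"step": "3090 -> 4090", "new_price": 1600, "resale_of_old": 1000},
    {"step": "4090 -> 5090", "new_price": 2000, "resale_of_old": 1500},
]

for u in upgrades:
    net = u["new_price"] - u["resale_of_old"]
    print(f'{u["step"]}: paid {u["new_price"]}, sold old card for '
          f'{u["resale_of_old"]}, net out of pocket {net}')
```

Under those assumptions the net cost per generation lands around $500-600, roughly in line with the $400-500 figure in the comment.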
goldcakes@reddit
Yup I'm a video editor. I sell and upgrade every generation, 2080 Ti -> 3090 -> 4090 -> 5090. Time is money, and I'm working with 6K footage more and more so need the power and VRAM.
BenFoldsFourLoko@reddit
Lots of people in PC gaming with more money than sense
Look at the standard reddit advice to buy a 5080 over a 5070 Ti lmao. You are actually a noob (no judgement) or a fool (big judgement) if you did
Spir0rion@reddit
I love it when people recommend buying a 5070ti for 1000 when you can have a 9070xt for 700 as well
ColaBottleBaby@reddit
Woah woah woah. Dont go around recommending R*deon cards around here buddy
Spir0rion@reddit
Dude just casually dropping the hard R
GenZia@reddit
I'm planning to keep my $600 4070 Super for at least 6-7 years, if not a whole decade.
Of course, I don't mind playing at medium to high, as opposed to ultra.
In fact, Digital Foundry has made a whole video about why cranking up the graphics to ultra is almost always unnecessary.
Pamander@reddit
See I know this is true but the sick fuck in my brain tells me to crank it up anyways.
Strazdas1@reddit
There is nothing wrong with that. People extract different value from it. In fact, if you game on average 4 hours a day then it will be cheaper in terms of dollars/hour than any other form of entertainment by far over those 4 years.
TK3600@reddit
Pays 2k, somehow sells it for 2k 3 years later.
twofort_@reddit
4 years of it being top of the line card is pretty neat though.
EnoughWarning666@reddit
Just checked how much 5090s are currently going for here in Canada...
https://www.memoryexpress.com/Category/VideoCards?FilterID=f45809a5-4cd7-f27a-ade0-61742d10ac98
Guess I'll stick with my 3080 for a while longer!
LevelUp84@reddit
it's his money, so it's all good.
996forever@reddit
Everything is good, no discussion needed so let's close down this sub
AmazingSugar1@reddit
The slow conditioning of the next gen pc gamer continues
SirMaster@reddit
Back in the day flagship GPUs were $500 but they came out yearly…
So it’s a similar cost per year. Actually cheaper now considering inflation…
Beautiful_Ninja@reddit
Yeah, I would upgrade sooner if Nvidia has a product available. But if nothing's going to be faster, I might as well enjoy the cost amortization.
_Lucille_@reddit
3080 MSRP at launch, gone through COVID spikes, crypto, and now AI.
Still able to play games really well.
ColaBottleBaby@reddit
My 3060ti still works great on 1080p lmao
magbarn@reddit
Scoring a 3080 at MSRP was still one of the hardest GPU gets at that time. I tried for over 2 years and ended up paying twice as much for a 3090 as I got tired of waiting.
lessthanadam@reddit
I'm in the same boat. I upgrade every 3 generations, so I came from a 980. I plan on going big and getting a 6090 for my next upgrade. Luckily I've been saving money since 2020.
peepeeinthepotty@reddit
Same man. Most lucky purchase I’ve ever had especially for 1440p gaming. Even got an EVGA so it’s got some years left in it in terms of build quality.
ixvst01@reddit
5090 owners will be set for the next decade. It’ll most likely outperform the 6080 and maybe be on par with a 7080.
Tystros@reddit
In 10 years, trying to run the then-best local AI models on a 5090 won't work at all. In 10 years you'll need at least 128 GB of VRAM for even the smallest then-released AI models.
moofunk@reddit
GPUs probably will stop being used for AI workloads, and we'll see highly beefed up NPUs or more suitable architectures for that instead.
MrMPFR@reddit
Not gonna happen. If anything, it looks like NPUs have been a huge joke and gone unused, and now everyone is rushing to do what NVIDIA did back in 2018 and integrate the HW directly into the GPU.
moofunk@reddit
They are absolutely used, but maybe not so visibly on /r/hardware's preferred gaming PCs. Google's TPU is such a thing.
GPUs have a memory architecture that is actually terrible for AI, which is why you need very fast, power hungry GPUs to do the work. Integrating tensor cores into a GPU may be a currently practical way of adding AI workflows to GPUs that still don't have them, but the future prospect is a slower release cadence of GPUs, because of the constant requirement to be on the bleeding edge.
NPUs are much more efficient and not suffering from this problem, which is why I'm thinking that eventually, they'll scale up and move past GPUs.
MrMPFR@reddit
DC HW =/= consumer-class Copilot-compliant NPU. Guess it's a misunderstanding.
Yeah, and that TPU is losing to NVIDIA's latest GPU in tokens/$, but let's be honest, that has nothing to do with it being a GPU. It's just a parallel compute and ML accelerator.
Von Neumann is not a good idea in general if it can be avoided. Dataflow architectures are quite interesting, especially those that move beyond systolic arrays. You're right that GPUs will eventually get disrupted.
Strazdas1@reddit
This sort of claim keeps popping up, but every time people try to make it a reality they fall flat.
Tystros@reddit
Do we see any indication at all yet of that happening, though? I don't see anything like that on the horizon so far for the consumer/prosumer space (something you plug into a PCIe slot in your desktop PC).
moofunk@reddit
Tenstorrent's architecture is an obvious indicator of where things are going to be in 10-15 years. Vastly different and economical way of managing memory and having a much more balanced scaling model that requires no external silicon. Even if Tenstorrent themselves are not successful, they are laying some very important architectural foundations for future cheap scalable AI systems. I'm sure there are others in a similar position that will duplicate their work.
Basically, their current solution is datacenter level interconnect in a PCI card form factor at the cost of a high end gaming GPU. Then you are free to interconnect up to 16 of them at the cost of the cables and the PCs to install them in. The chips talk to each other directly. It should eventually scale up to hundreds or thousands of chips, where the chip costs will be the dominant factor.
Modern AI servers following Nvidia's GPU-centric memory model are as expensive in interconnect as the chips themselves and are becoming more so, and there is significant additional hardware and custom silicon required to scale up.
I don't believe this constant bleeding edge rush is going to hold up, and it will be extremely cost prohibitive to continue.
Tystros@reddit
Interesting, I hadn't really registered that Tenstorrent has already released PCIe cards that you can buy and plug into a PC... That's nice, so that indeed gives a possibility for something on the horizon that might work, if they can get their software stack working well enough. Since currently all local AI stuff is optimized for CUDA, it would be really tricky to use the Tenstorrent cards in practice.
moofunk@reddit
Their software stack is the big problem, so everything available is custom-built for specific setups.
It's not so much the development of the software stack itself (there's plenty of movement), but the ability to compile PyTorch models automatically for specific chip configurations is apparently an extremely complex problem that they haven't yet solved and that is unique to the architecture.
So every model is hand-coded for now.
amidoes@reddit
Who gives a fuck about AI? You won't need even half of that 128GB to play any game
Tystros@reddit
I didn't say I'm a gamer. I care about hardware for AI.
VastTension6022@reddit
This post is specifically about gaming GPUs. Nvidia will happily release many AI GPUs.
Strazdas1@reddit
No, it is not. Those GPUs haven't been specifically about gaming since 2007, when CUDA launched.
diabetic_debate@reddit
For Nvidia at least, all GPUs are AI GPUs; not all AI GPUs are gaming GPUs. We use a lot of AI features and CUDA for non-gaming workloads on gaming GPUs, as we can't afford to shell out for the 'AI' GPUs. So even if the title says gaming GPUs, this affects non-gaming work as well.
Strazdas1@reddit
Billions of people, apparently. There are over 2 billion AI users in the world now.
diabetic_debate@reddit
I don't play games; the only reason I have a 5070 Ti is for AI workloads. A lot of us do, you should check out /r/LocalLLaMA.
m3thodm4n021@reddit
Ya no thanks lol
IguassuIronman@reddit
So totally fine for the large portion of people who don't give a shit about AI models?
Tystros@reddit
I'm quite sure that in 10 years there will be many more consumers who use a GPU to run AI models locally at home than who use a GPU to play games
IguassuIronman@reddit
I'd take that bet all day long, presuming we're talking about desktop form factor dGPUs.
Tystros@reddit
Yeah, talking about desktop form factor dGPUs, or generally compute cards you plug into a PCIe slot, because maybe in 10 years Nvidia will primarily build GPUs without a graphics output, and then they might not be called GPUs any more, even if the architecture is still the same as now.
IguassuIronman@reddit
I think it is incredibly unlikely such a thing takes off in a widespread sense in the consumer space, even if chatbots or the like remain popular. The use case for a consumer playing games (entertainment) is going to be much broader than talking to a chatbot or whatever. At most I could see it having a similar market to people buying a GPU to do rendering or similar as a hobby, which is to say a market much smaller than gaming.
m3thodm4n021@reddit
Most people use their GPUs to play games, not to run local LLMs. I couldn't give a shit less about that.
Future_Noir_@reddit
They won't even sell you a GPU at that point.
Tystros@reddit
I'm more optimistic than that. in 10 years, the industry will hopefully have managed to scale everything up enough to serve both demand from datacenters and consumers/prosumers simultaneously
Future_Noir_@reddit
I hope so
Lamborghini4616@reddit
Sure everyone is ai slop bro
airfryerfuntime@reddit
What a wild claim to make, lol.
dparks1234@reddit
5080 is slower than a 4090, it’s not unreasonable
junon@reddit
That's because 5 series wasn't a die shrink.
InformalEngine4972@reddit
You don't need to shrink a die to get better performance. Back in the day you got a node shrink every 2 years and a new architecture every 2 years. Both gave about equal gains.
Ok_Spirit9482@reddit
Most of the low-hanging fruit has been picked at this point; there is only so much optimization you can do.
InformalEngine4972@reddit
Intel said the same a decade ago. Now somehow AMD and Apple are competitive and CPUs are getting nice architectural upgrades, the latest one being X3D CPUs. That was also 30% better on the same node.
Innovation never stops.
Strazdas1@reddit
And Intel was right. The architectural upgrades from AMD are mostly just more cores, and they aren't all that much better. Apple's approach is interesting, but they did a lot more than just CPU architecture to make it work.
MrMPFR@reddit
Isn't the quality loss supposed to be minimal? This isn't FP4 or MXFP4.
Strazdas1@reddit
FP4 is what I was talking about. Quality loss is debatable. For something like a video upscaler it may not matter. For machine learning tasks, I wouldn't use an FP4 model if I could avoid it.
MrMPFR@reddit
Yeah, FP4 is mostly useless, but NVFP4 is different.
NVIDIA did talk about it in a technical blog + there was an LLM paper a couple of weeks ago that showed almost no quality loss compared to FP8.
Sh1rvallah@reddit
6080 will just be that much more cut down
VastTension6022@reddit
Dies won't be shrinking that much going forward.
InformalEngine4972@reddit
4000 -> 5000 was the worst generational leap in existence. A 5000-series core is not even 1% faster than a 4000-series core. All they did was add more cores and more power.
magbarn@reddit
I still think the 2XXX series was the worst release with massive increases in price and very little in performance. Fast forward over 7 years later, DLSS makes them viable, but their RT performance will forever be pathetic.
InformalEngine4972@reddit
The 2080 wasn't that bad tbh. A 20% compute increase plus the massive RT and DLSS capabilities. In hindsight it aged way better than all the AMD cards that came years after. It was the right choice.
ASYMT0TIC@reddit
At least the 5090 gets additional VRAM. I noticed that 4090s were selling for $2k used on eBay, and that the store down the street from me had a 5090 for sale new... for $1999. Baffling, but whatever, I'll take it.
varateshh@reddit
Some enthusiasts have modded their 4090s' VRAM for a 15-39% uplift in performance. I suspect the VRAM on the 5090 is the main reason the card had a performance uplift at all.
TheNiebuhr@reddit
Yeah, Crapwell is easily the worst generation in Nvidia history.
It's so bad that AMD has basically caught up to them in performance per compute unit, when they were quite behind with RDNA3.
And Intel with Xe3 cores looks like it has overtaken both...
996forever@reddit
Comparing "performance per compute unit" is pointless with vastly different performance tiers. It doesn't scale linearly.
Finmail@reddit
It's a 10% difference between the two. The 50 series was never intended to be a raw horsepower upgrade; it was designed to push efficiency and AI.
All signs have pointed towards frame generation being the next big leap in card performance. It's cheaper at this point to invest in more AI cores than to invest in the manufacturing needed to keep shrinking chip dies.
The 4090 is a beast and will be relevant for many years to come, but the 50 series, including the 5080, will eventually show its relevance if we remain on our current trajectory.
iDontSeedMyTorrents@reddit
Same node, basically identical architecture. That's not going to be the next gen.
cocktails4@reddit
Just have to hope the power connector doesn't overheat and burn down your home.
MrMPFR@reddit
Man when will NVIDIA stop supporting that stupid power connector. Pumping 600W through it and having zero circuitry on the other side since 40 series (30 series had some). Should've just stuck with good ol' 8pin PCIe.
Strazdas1@reddit
you have to hope that something that has never happened in 4 years of product lifetime continues not to happen?
Tai9ch@reddit
I wouldn't dream too hard about going a decade.
We'll get VRAM overproduction at some point as the current AI bubble pops or wobbles, even if it's just GDDR7 and HBM3 once those are the old generations. When that happens we'll absolutely see consumer GPUs with 64+ GB, and the 5090 will be a mid-range card. I'd bet this happens before 2030.
SirMaster@reddit
But with a longer time gap between generations, there’s greater potential for the performance jump to be larger as well.
iDontSeedMyTorrents@reddit
Not necessarily. New architectures are worked on years in advance and pretty much set in stone long before they ever launch.
SirMaster@reddit
Architecture isn't really the main reason for big performance uplifts though. It's process node size.
There was little difference between the 4000 and 5000 series per mm^2 or per watt, because both used 4N.
Making the next gen in 2028 instead of 2026 means they will be able to use a smaller, more refined process node, which would translate into higher potential performance.
IguassuIronman@reddit
Assuming they redesign for the smaller node instead of launching the same product a year later, after VRAM supply catches up
iDontSeedMyTorrents@reddit
They weren't ever going to launch in 2026. There's also no guarantee that a new process is design compatible, meaning they very well might have to spend tons more time and money for new tapeouts and verification.
InformalEngine4972@reddit
You can easily get 10-30% better performance on the same node with architectural upgrades.
certainlystormy@reddit
dude a 5090 should last like 9 years 😭
Strazdas1@reddit
No it shouldn't. A 5090 lasting 9 years would indicate a massive stagnation issue in the industry, at catastrophic levels.
MrMPFR@reddit
Sounds about right. PS6 delayed 1 year to late 2028. ~6 years of cross-gen (thank the handheld and even less reason to upgrade than with the PS5). ~2034 when cross-gen ends.
Hope I'm wrong. Can't see how it'll be any different unless the PS6 does something new the 5090 just can't. Can't see that rn + even if the HW is more sophisticated, the 5090 is a compute and VRAM monster.
hackenclaw@reddit
the power connector might not tho lol.
koryaa@reddit
You've probably got the same chance of dying of a heart attack or stroke from stress if you surrender yourself to every hysteria.
StevannFr@reddit
And what about the 4090? Six years later, it's still the second best GPU on the market.
SirMaster@reddit
6 years?
StevannFr@reddit
In 2028 if nothing comes out before then
Strazdas1@reddit
60 series will launch in 2027.
lolhello2u@reddit
chill, some of us are still milking our 1080s
Strazdas1@reddit
That cow ran dry 4 years ago, it's time to stop the hand motions.
temo987@reddit
Given Nvidia has dropped driver support, you won't be doing that for long...
burtmacklin15@reddit
I was too until last year when I moved to a 9070 XT (with 6-pin power connectors). I have zero regrets.
mrandish@reddit
For sure. I was milking an overclocked 1080 Ti until I found a 4070 Ti Super for under MSRP 18 months ago. I was already thinking I'd be milking that for at least 4 years.
This news just confirms my plan of choice won't be a choice :-).
aurantiafeles@reddit
1060 6GB with a 5700X3D and 32GB DDR4 here. Mostly plays what I want; Expedition 33 was okay. Bloodborne emulated is like 45 fps, which is fine.
Alowan@reddit
1080ti represent!
lolhello2u@reddit
approaching 10 years of service... o7
dragenn@reddit
Does nothing!
Winning...
e_c_e_stuff@reddit
I will never beat the value of the 3090 FE I got for free
certainlystormy@reddit
how the fuck do you get a free 3090
e_c_e_stuff@reddit
As a PhD student in computer architecture/chip design, I went to an Nvidia career/recruiting event where they raffled one off, and I ended up winning it. I had been trying to buy a 3080 for a few months at that point, so it was amazing.
certainlystormy@reddit
aw nice :D
alpacadaver@reddit
That is technically correct
Swoly_Deadlift@reddit
Paid $1440 for my 4090 in December 2022. Welcome to the club, it's a great one to be in.
DiggingNoMore@reddit
My RTX 5080 powers my 1600x1200 monitor.
kasakka1@reddit
The lack of DP 2.1 is my only complaint about the 4090 specs. It sucks so much that it was not included.
But the flip side is that most games I play aren't running even at 4K 120 fps with DLSS Balanced/Quality so what am I really complaining about?
Currently playing Silent Hill F and it's hovering around 80 fps with DLSS Quality.
cocktails4@reddit
After one of the recent W11 updates, my desktop HDMI connection stopped waking up. Both of my DP monitors would turn on instantly, but the HDMI monitor would require me to power cycle it twice. Super annoying. I bought a 5070 Ti the other week and at least I can do DP 2.1 now, so there's that. And the old 4070 Ti went in my old SFF box to use in the living room.
4x4Mimo@reddit
It feels like the 4090 came out just 2 years ago. Can't believe it's been that long
Lamborghini4616@reddit
Just keep playing at 4k? You don't have to buy everything that comes out
Swoly_Deadlift@reddit
I’m primarily a Mac user who only uses PC for gaming. I’ve been waiting years for 220ppi gaming monitors so I can run a single monitor for both computers.
sonnytron@reddit
My wife just told me I better not sell my 5090 or she’ll leave me. I tried to wake myself up.
vialabo@reddit
Not regretting my 4090 at all. 5090 sounds awesome too.
rohit275@reddit
Literally a GPU that has appreciated in value 3+ years after it launched at what most of us thought was way too high of an MSRP lmao. Mind boggling.
CrzyJek@reddit
That's a lot of cards these days. My 7900xtx did the same. It's the VRAM capacity people want.
rohit275@reddit
Oh I know, I get it as someone into ML research myself. It's just annoying how supply has not kept up with demand, and AI data centers are ruining personal computing for everyone. I am conflicted.
udderlymoovelous@reddit
I'm hanging onto my 1080Ti as long as I can
ILoveTheAtomicBomb@reddit
For real lol, sitting pretty with my 5090 with all this memory nonsense going on right now. Don't think anyone saw OpenAI deciding to buy everything out till 2028-29, fucking garbage company
cultoftheilluminati@reddit
The worst part is that they don’t even have fucking money to do this. It’s just IOUs lol.
ILoveTheAtomicBomb@reddit
Actual insanity lol. And every company decided that’s okay? I hope this shit crashes hard
Chrystoler@reddit
My MSRP Founders Edition 3080 is going strong, $699 in January of 2021, with 1440p still chugging along fine.
An MSRP 5090 is absolutely a banger deal looking at it now lol
IguassuIronman@reddit
I feel like my 3070ti is just below where I'd like to be at 1440P. Which is a bummer because I'd really like to jump to 4k in the next year or two...
Chrystoler@reddit
I can definitely see that, I can't see jumping to 4k anytime soon with the way things are. I'd rather have high refresh rate 1440p than lower hz 4k (especially with 1440p OLED monitors being more and more affordable). At least dlss is helping keep stuff smooth so far
IguassuIronman@reddit
I'm hoping 4k OLEDs will do the same with regard to price. If a 4k 32" is ~$500 it's probably cheap enough for me to buy in. I don't really see much value in chasing super high refresh rates (especially in AAA games) so even the 240Hz of most current panels is excessive
rcook55@reddit
I'll be holding onto my 4090 for a loooong time.
Secure-Tradition793@reddit
Bezos hopes 5090 remains the most powerful consumer GPU forever.
tonyspilony@reddit
As long as it doesn't explode
RobinsonHuso12@reddit
My 3060 is still perfect
Tystros@reddit
yeah same, I'm really happy I got a 5090 FE at MSRP.
anyokes@reddit
You really think it's going to be outdated in 4 years? You're mad in the head.
ritz_are_the_shitz@reddit
I'm in the same boat, felt like a terrible decision at the time last August, but it's only getting better.
Agret_Brisignr@reddit
Bought a 5070ti and I will rock it until it dies or I make way more money than I do now.
No_Weight5486@reddit
How long did your GPU last? I upgraded my 2070 to a 5070 Ti only because of Monster Hunter Wilds (since it’s badly optimized…). How long do you think this GPU will last at 1440p?
A17012022@reddit
At this rate I'll be keeping my 5070ti for 10 years
crymo27@reddit
It's very well possible, my GTX 1070 says hi.
kuddlesworth9419@reddit
I also still use my 1070. Only games I can't play are the ones that require RayTracing. There are so many games out there these days, so I just play the ones I can play and ignore the others.
Puzzleheaded_Race539@reddit
Ye, I'm still doing fine with a 980. I just can't play Helldivers or some of the very new AAA stuff (which is mostly trash anyway).
I'm just trying to decide whether to upgrade to this gen or wait another two years, as I want something that's going to be future-proof for local LLMs for a good while.
All the LLM stuff is pretty new, so it might not be a good time to upgrade.
kuddlesworth9419@reddit
I've given up on building a new PC. It's just not worth buying anything these days, everything is overpriced in my opinion. It doesn't help it's not terribly exciting anymore either with very minor gains gen on gen. I got a new vintage lens for my camera instead.
No_Weight5486@reddit
When it comes to CPUs the situation is actually pretty good. If you want to save money, you can go with AM4. If you want to save money but still have modern performance and features, you can go with LGA1700; even a 14600K is a fantastic CPU, for example (you can use fast DDR4 or DDR5). For GPUs I agree: the good ones are extremely expensive (5070 Ti, 5080, 5090…). Maybe you could look at the used market instead? A 2080 Ti? Or a nice 3080 Ti?
No_Percentage_2@reddit
FF7 Rebirth 🥲
tommypickles5149@reddit
10 Series is the 🐐 for a reason
droptableadventures@reddit
It's OK, NVIDIA is fixing that by dropping 10 series support from the drivers.
Which is doubly ridiculous given they're bringing back the 3060, so it's now actually only two generations behind current.
Strazdas1@reddit
Nvidia is dropping support for 9-year-old cards. This is not unexpected.
droptableadventures@reddit
It's not unexpected, but given how many people are running them, it's a bit of a dick move.
Also, 1080ti was on sale up until about 7 years ago.
Strazdas1@reddit
It is not a dick move and it is not unaffordable. Inflation-adjusted, the prices haven't changed except for the 5090. People are just more whiny and less willing to spend on their hobbies.
droptableadventures@reddit
The prices on the rest of the 50 series have barely moved because there's none in stock to buy anywhere.
Strazdas1@reddit
That's just not true. My local retailer has every 50 series card available.
Sailor_Spaghetti@reddit
This might be a shock, but your local retailer is just one place and doesn't necessarily reflect what other places look like.
Strazdas1@reddit
Are you saying that my small eastern european country is the only country in the world without shortages of GPU? what would make us so special?
Sailor_Spaghetti@reddit
I don't even think it's a country thing, you just might be in a locale where gaming is more of a niche hobby.
Strazdas1@reddit
Considering my town hosts one of the major european gaming conventions, i doubt it.
Sailor_Spaghetti@reddit
It could also be that people are being priced out of buying hardware.
Strazdas1@reddit
Unlikely. Wages have outpaced inflation here, and inflation-adjusted, Nvidia cards aren't that much more expensive anyway.
996forever@reddit
The 20 series (2060 Super - 2080 Ti), despite its launch prices (which look pedestrian now), ended up being the real longevity generation.
Theoboli@reddit
Still rocking my RTX 2070, but it’s starting to show signs of age and I’m not sure I can wait till 2027, let alone 2028. At the same time, looking at the current quality/price ratio of a new 5000-series, ewww. In any case, the 2070 will have been by far my longest serving GPU o7
tommypickles5149@reddit
The 1080 is the king of longevity imo but overall the 20 series is definitely a contender. Still have a 2070 Super in my box.
Strazdas1@reddit
lol no. 1080 aged horribly. 2060 aged far better.
LiliaBlossom@reddit
lol no. I have a 1080 Ti and also an old spare 2060 laying around. I'd even take the non-Ti 1080 over the 2060, because of the extra VRAM, and honestly RT on the 2060 is so bad you can't play proper ray tracing games on it anyway, and the 1080 is a lot more powerful in rasterisation. The 1080 Ti is goated for non-RTX games; the oversized VRAM and the super slow adoption of ray tracing made it stay relevant for far longer than it should have.
Strazdas1@reddit
Enable DLSS and the 2060 beats the 1080 Ti with resources left to spare. The 1080 Ti is trash for any modern game.
ThrowawayusGenerica@reddit
Barely better than Pascal in raster, for an inflated price, with RTX that wasn't ready for prime time unless you bought a 2080 Ti - but having access to DLSS might just carry it for years to come.
996forever@reddit
Yes, longevity and forward thinking instruction set.
PnWoo@reddit
Yeah, I run pretty much everything OK with my 1070.
Baardi@reddit
Same with my 1070 Ti. The problem with upgrading is that despite getting significantly more fps, newer GPUs are really loud, so I'm enjoying my quiet PC as long as I can.
LiliaBlossom@reddit
just retired my 1080Ti after 9 years 💪
No-Performance37@reddit
Maybe in 5 years I can upgrade to a 5090.
Gohardgrandpa@reddit
I'll be holding onto my Pny 5070ti for a long time. I've got a massive backlog of games to play thru and I'm not upgrading anything again until I get thru those games.
MeraArasaki@reddit
Honestly same
Good thing JRPGs are usually not that difficult to run
Dpek1234@reddit
I used a GTX 1050 Ti until late 2024, because I got my PC during 2021 and the GPU shortage (and was actually considering whether I really needed an upgrade, same as when I used a 2010 laptop with an iGPU).
My RTX 3060 12GB will be enough for a very long time if it needs to be.
Jokerit208@reddit
I'm good with my 5090 for a little bit.
Specific_Quarter_455@reddit
This is good because most people don't replace graphics cards with each new series anyway.
Hour_Firefighter_707@reddit
Apple could do the funniest thing ever here by officially integrating Proton into macOS or supporting the Asahi Linux project.
Because AMD probably pushed RDNA 5 back to 2029 hearing this news
Low_Excitement_1715@reddit
I’m pretty sure Apple could get Valve to enable Steam Play/Proton just by asking nicely and making some minor promises about not ripping support out from under Steam for a while. Problem is, Apple doesn’t care at all.
As for Asahi, I’ve got that running here, and while it’s amazing and great, the GPU performance is much lower than under macOS, I wouldn’t call it a great gaming platform. (Asahi runs only on M1/M2, with some limited M3 progress and no GPU support on M3.)
jonydevidson@reddit
Given how much development GTK got and how much better the performance is on MacOS nowadays, Apple cares.
Metal API is awesome to work with.
Strazdas1@reddit
The Metal API is the worst to work with, according to everyone I know who has actually worked with it.
MrMPFR@reddit
Even worse than Vulkan? Yikes!
This entire API bloat situation is a joke. I really hope the nextgen APIs can move to a different paradigm like this one https://www.sebastianaaltonen.com/blog/no-graphics-api
Strazdas1@reddit
What's described in that blog is beyond my understanding of the APIs, but the way it was explained to me, this is more of a nice theoretical framework that does not really work in reality when you consider all the legacy that needs to be supported.
MrMPFR@reddit
Check the "Min spec hardware section". It's fairly reasonable. If games ditch PS4 and Pascal then everything should be able to work. Yes this is just a proposal for how things should be, not an actual API implementation.
It's mostly about embracing bindless API design and ditching shading languages and all the bloat (abstraction layers) that was required back when DX12 and Vulkan was agreed. DX should be more like CUDA and stop supporting HW that's outdated anyway.
The end of the blog explains it better than I can:
Strazdas1@reddit
The issue is not what you think. It's not hardware. It's software. Your new GPU will still need to support that outdated game from 2013, because otherwise there's going to be a shitstorm from people who have no idea what they are talking about. See: the 32-bit PhysX support debacle.
That means you still need to support all that old bloat in the driver.
What was achieved in that post is a framework for an API, not an API itself. You are expected to hang all your functions on top of it, so if you want to support DX12 software you're going to be hanging all those functions in that modern API and end up with all the same issues.
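For readers following the bindless argument above, here is a rough conceptual sketch of the difference between slot-based binding and bindless resource tables. The C++ types and names are made up for illustration; this is not any real graphics API.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Made-up types for illustration; not any real graphics API.
struct Texture { uint32_t id; };

// Slot-based model: a tiny fixed set of binding slots that the CPU must
// rebind before each draw. Much of the old API surface exists to manage this.
struct SlotBasedContext {
    Texture slots[8]{};
    void bind(int slot, Texture t) { slots[slot] = t; }
};

// Bindless model: every resource lives in one big table and shaders
// address it by integer handle, so draws just pass indices around.
struct BindlessContext {
    std::vector<Texture> table;
    uint32_t add(Texture t) {
        table.push_back(t);
        return static_cast<uint32_t>(table.size() - 1);
    }
    Texture get(uint32_t handle) const { return table[handle]; }
};

int main() {
    SlotBasedContext slotCtx;
    slotCtx.bind(0, Texture{42});            // must be redone per draw/material

    BindlessContext ctx;
    uint32_t albedo = ctx.add(Texture{42});  // register once, reuse the handle
    uint32_t normal = ctx.add(Texture{43});
    std::cout << ctx.get(albedo).id << " " << ctx.get(normal).id << "\n"; // 42 43
}
```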
MrMPFR@reddit
I never talked about HW changes and neither did the blog.
The suggestion was to change the API for all software moving forward and drop backwards compatibility. So it's an API reset.
Strazdas1@reddit
and that is not going to happen.
MrMPFR@reddit
I think it was specifically about the HW support. Not that a complete HW redesign is required to address it.
There are many things in DX12 and Vulkan that just aren't needed any more. Things have changed a lot in the last 10-15 years.
When DX12 launched in 2015, the oldest GPUs with support were Fermi (2010) and GCN1 (2012).
A cutoff of 2018 for NVIDIA HW and 2020 for AMD HW is reasonable for a future graphics API (late 2020s, early 2030s).
Strazdas1@reddit
you think it's reasonable and I think it's reasonable, but just look at what happened when hardware support for a non-critical feature whose last supported release was in 2011 was dropped. I'm talking about the 32-bit PhysX acceleration.
MrMPFR@reddit
The new API doesn't even need to implement any HW changes. It just needs to embrace bindless resources and how modern GPUs already work. Zero HW changes needed and nothing that makes them incompatible with previous games.
Everything will still be backwards compatible; it's not a PhysX situation again, just that pre-RTX and pre-RDNA 2 cards won't work with the new API.
Over time RTRT already forces that cutoff, so it really changes nothing, and anyone who won't push graphics will just stay on DX11 and older like we've already seen.
NVIDIA has dropped driver support for Pascal and Maxwell and AMD will prob kill off RDNA1 soon. Can't see why a new engine by the late 2020s has to support outdated HW. What am I missing here?
Strazdas1@reddit
It does not need to implement a HW change, but it needs to drop support for old software. If you want it to remain compatible with old software then you have to bring most of the issues of the old APIs back in.
RTRT? Real Time Ray Tracing? Haven't heard this abbreviation before.
Well, they dropped development for those cards. The drivers are still receiving security updates and whatnot. AMD already dropped RDNA1 last year.
MrMPFR@reddit
So legacy bloat will hold back PC gaming forever. Great news.
Yes RTRT.
Indeed I was referring to game support drivers.
Didn't know that. AMD impressive 6 year product support xD.
Strazdas1@reddit
As long as the policy and consumer outlook in the PC scene is "support everything forever", yeah. It's why companies like Apple have it so much easier. Transition to ARM? Just tell your developers "make an ARM version next year or we ban you from the store".
Based on how AI codes now, I'd say it would result in a lot of bloat itself. The way it reaches results is not efficient and sometimes takes really strange left turns. I've once seen code to classify people that considered being a redhead a sign of being from the Netherlands, and yet at the end of the day the results were almost as good as real human workers'.
I think in the thread on this sub about the AMD driver support ending, people were mentioning hardware that was sold new as recently as 18 months before the driver support got dropped.
MrMPFR@reddit
That's a real shame.
So that's not a solution either. :(
Yes that 6 years is best case. People here and elsewhere have to stop with the Radeon glazing.
Low_Excitement_1715@reddit
Good points, I was unclear.
Clarification: Apple cares about games in the App Store. Apple does not care about Steam. Apple arguably doesn't care about games outside of the App Store in general. Tolerated, not interesting.
jonydevidson@reddit
They very much do.
https://developer.apple.com/games/game-porting-toolkit/
Macs are pretty good at new games. New MacBook Air is pretty good for a laptop with NO FANS.
Macbook Pros are not bang for buck like Air or Mac Mini. But with the increases in memory and storage and GPU prices they might as well be.
Macs now game pretty well, support adaptive refresh rate, AI upscaling and frame gen. For native games you can ship precompiled shaders. For emulated games via Crossover, you can use Nvidia DLSS (yes including Nvidia Frame Generation).
Low_Excitement_1715@reddit
I am aware. I have an M4 Pro mini and an M4 Max MBP. They are very competent when compared to my other laptops and SFFs, but don't come close to comparing to any of my desktops. Massive progress, sure. Massive achievement? Not so much.
Is my MBP comparable to other laptops, and does it beat most non-gaming laptops? Yes. Does it smoke everyone in the space? No. Pretty much any competent "gaming" laptop with a real GPU will smoke it. I bought my MBP for other uses, but pretending my MBP can handle games at high resolution, high quality settings, and getting a smooth or high frame rate? That's bending the truth more than I'd be comfortable with.
jonydevidson@reddit
My M3 Max runs like a mobile RTX 3080, so about half as fast as a desktop RTX 3080. I have an RTX 3070 laptop as well and the M3 Max is a bit faster than that one, so I imagine mobile 3080 is the ballpark. CPU-bound games run much better than on that laptop, though.
With 64 GB and 2 TB the price for it was like 4000 EUR two years ago, so if GPU power is your primary metric, you could have gotten a mobile RTX 4090 laptop for the same money. I had one of those too, but macOS is much better and smoother than Windows (especially in the last 2 years), the computers last a lot longer without slowing down, are better built and overall are better products.
But it's no longer like it was 5 years ago where Mac gaming sucked, so even when you drop money on a Mac, you can still game very well. For anything that requires a massive GPU, Geforce NOW works much better than Windows: it supports cloud G-sync for no tearing gameplay and AV1 streaming up to 120fps. Latency is so small it almost feels like native and because of AV1 there are almost no visible artifacts, especially in dark scenes.
magbarn@reddit
I have the MBP 16 M3 Max and even in BG3 which is one of the better optimized games for MacOS, my 5 year old Legion 7 3080 smokes it in graphics quality, fps, and fps stability. It’s a much smoother experience. The M3max can somewhat match the 3080 mobile at times, but there’s still major issues with occasional stuttering and the fps is all over the place.
jonydevidson@reddit
Latest Crossover Preview lets you use DLSS4.
magbarn@reddit
It looks like ass. Nothing like real DLSS4. $4000 for the cheapest full fat M4Max and you still don't even get OLED. I love my M3Max MBP16 for work, photo/video editing, but man it's not a gaming machine by far. I carry both my gaming laptop and mbp with me while traveling.
NeroClaudius199907@reddit
70% of Apple hardware is slower than a 980 Ti
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam?platform=mac
Unless the hardware becomes as fast as current hardware, people won't bat an eye at Apple.
Low_Excitement_1715@reddit
I'm happy for you? I was saying what my hardware is doing. I'm glad your M3 Max is still making you happy. I expected a little more from my M4 Max. I don't have a 3080, but I do have a 3090, and it positively walks circles around anything I've been able to get my M4 Max to do. That 3090 is actually a spare right now, all my desktops have higher end graphics installed.
Is the M3 Max/M4 Max garbage? Of course not. I never said that.
Is it a high end gaming option, capable of running most games at 4K with good graphical quality settings and >60 fps? Also no. Most games I've tried at the native res seem to run out of fillrate long before the pixels are all satisfied. If I run at a lower resolution and upscale, that works for frame rate, but that's not something I have to do with the competitors.
If you're saying Apple has capable and competent mid-range graphics horsepower, I agree and we have no argument. If you're trying to tell me that the M4 Max rivals a 7800XT or 5080, you're looking at some really cherry-picked benchmarks under very contrived circumstances.
26295@reddit
The problem isn’t that apple doesn’t care. The problem is that it cares, and it cares about having as much as a monopoly as possible in software distribution in their platform. If apple could get away with locking macOS as they’ve done with iOS they would have done it already.
Low_Excitement_1715@reddit
They're working on it. Can't roll it out all at once or the users rebel.
MrMPFR@reddit
Nah RDNA 5 is still happening in 2027-2028. It's just RDNA3.5+ iGPU seemingly in perpetuity.
Strazdas1@reddit
Nah, Apple wants everyone to use Metal and will actively try to stop anyone using Proton or Asahi.
noiserr@reddit
Nobody cares about overpriced Apple hardware though. Gamers want cheap.
Wonderful-Sail-1126@reddit
"overpriced". They're insanely good values for the hardware. M4 Mac Mini 16GB for $499 on sale frequently. M4 Macbook Air at $799 on sale. You can't get the same value anywhere for an ultra efficient fast well built machine.
noiserr@reddit
with an obsolete 256GB SSD, and
Seanspeed@reddit
Unless consoles get delayed, RDNA5 will be out at the least by the time they do.
Sen91@reddit
2039! NO, 2040!!
Ar_phis@reddit
So we are mad at Nvidia for not releasing any new cards soon, instead of getting mad at Nvidia for releasing new cards that costs too much?
Silicon doesn't get cheaper, memory is going through the roof, and even simple copper is up by ~40% or so. What was the scenario for new cards in this environment, an RTX 6080 with an MSRP of $1,800 and a 20% increase in performance?
SnavsMatiq@reddit
Mad at nvidia because holding the 60xx line drags the used market to a halt. Let them make a crazy expensive card, let the whales buy in so the annual hermit crab trade can commence.
BilboBaggSkin@reddit
Who’s to say we won’t be paying next gen prices for current gen GPUs?
ChefLeBoef@reddit
How many integrators and peripheral makers will go bankrupt before then? I am thinking cooler, power supply and case manufacturers with low margins will not exist by then. Add the unpredictable tariffs and say goodbye to most of them.
Krigen89@reddit
Why? They can still sell 50x0 gen GPUs. Or could sell AMD.
SnavsMatiq@reddit
It's about everything being more expensive. Small tech shops don't have great margins; something like this could very easily tip their books into the negative.
Blueberryburntpie@reddit
There's always selling to the datacenter market.
Instead of selling 750W PSUs to the consumer plebs, it's selling 7.5-75 kW PSUs to OpenAI.
FinancialRip2008@reddit
this is great news. game design has fallen way behind hardware capabilities. let's take however long to develop great games that use old hardware tech to its fullest. the more game devs learn to optimize for the ps5 and similar hardware (steam deck) the better for everyone who just cares about gaming. devs and consumers.
SnavsMatiq@reddit
Unlikely, for the simple reason that optimization isn't profitable. Mainstream game making nowadays is focused on one thing and one thing only - profit
NeroClaudius199907@reddit
Games are already being optimized for consoles first
dabocx@reddit
Good opportunity for AMD if they actually can deliver rdna 5 in 2027 in volume. My faith is low though
siazdghw@reddit
People say that every generation. "Nvidia left room for AMD to shine" - they haven't done that in over a decade, and now it's clear that consumer GPUs aren't going to be AMD's priority either
SnavsMatiq@reddit
There's no room for anyone to shine right now, there's no stock.
LiliaBlossom@reddit
but they did, aside from raytracing, and people need to stop acting like raytracing is the only thing that makes a card good when there's only a couple of games that have it
imaginary_num6er@reddit
RDNA4 was supposed to launch quickly so it would help AMD focus their efforts on UDNA5
996forever@reddit
Every other radeon generation is supposed to be a "stop gap while they focus on the real deal".
TK3600@reddit
RDNA4 is the real deal. It is matured and optimised RDNA. CDNA on other hand is new, possibly half baked until few iterations later.
996forever@reddit
CDNA? CDNA is already on CDNA4. It is data centre only and can’t even output to a monitor.
A “real deal” generation still only having 2 dies in total and nonexistent in mobile is really, really bad. RDNA2 scaled far better both up and down.
Strazdas1@reddit
He means the CDNA after RDNA gets merged into it. What we sometimes call RDNA5.
996forever@reddit
The merge does not appear to be happening despite the rumours if we look at where CDNA4 is at the moment 🤷♂️
MrMPFR@reddit
Look at LLVM uploads. CDNA5 and how RDNA5 derives a lot from CDNA5. Things are getting merged.
itsabearcannon@reddit
AMD is going to deliver what they always do: NVIDIA’s options, somewhat competitive in raster but double digits slower in RT, with fewer software features, for $100 cheaper.
Strazdas1@reddit
if you are in the US; otherwise all things the same but $100 more expensive.
IBM296@reddit
Yup. Massive opportunity for them, but nahh AMD gonna launch in 2028 and be behind Nvidia again lol
ryanvsrobots@reddit
Massive? Gaming is still a small opportunity compared to DC.
Strazdas1@reddit
gaming is a massive opportunity compared to the amount of DC business AMD can convince to use their cards.
JapariParkRanger@reddit
With what fabs?
upbeatchief@reddit
And what ram?
NectarineSame7303@reddit
AMD does not have the memory volume purchasing power that Nvidia and Intel have, not even close.
DragonPup@reddit
Also an opportunity for Intel. The B580 was a great value, but can they capitalize on the mid tier market?
Wonderful-Sail-1126@reddit
If the next-gen is made on TSMC, they have no shot at any volume. Heck, even Intel's own fabs don't have any more capacity.
TheLastOfGus@reddit
Well, with the B770 cancelled and the future Celestial GPUs currently reported as cancelled or delayed indefinitely, there isn't much for them to capitalise on, sadly.
Intel is reportedly taking the stance that the cards aren't a "financial viability" and believes the project wouldn't be a worthwhile use of its resources - aka they want to ride the AI bubble so they will reallocate resources to focus on that. The BMG-G31 die that was going to be used in the B770 is now going to be used only for the Arc Pro B70 workstation card with 32GB VRAM, so it'll cost a lot in the current/future market.
Homerlncognito@reddit
B580 is 5070 die size and memory capacity. Scaling higher doesn't make sense, we're lucky Intel is still selling them at very reasonable prices.
dantemp@reddit
AMD has had the opportunity to offer something better than Nvidia forever, and every time since Turing what they did was wait for Nvidia to release, then give us a GPU that's about 10% better in raster price-to-performance and awful in every other aspect. I see no indication they will change course based on their CES presentation; if anything they doubled down on following Nvidia's lead, except worse
Beautiful_Ninja@reddit
AMD is going to be fighting with everyone else for RAM, I don't see how they'd be able to pull off a massive volume launch if Nvidia can't get enough RAM to satisfy their demand.
Shanare_@reddit
They are going to be fighting for fab space first. TSMC doesn't have capacity
AIgoonermaxxing@reddit
Is Samsung doing anything worth noting right now? I saw a rumor posted here about Nvidia considering going back to them. If TSMC's new nodes are prohibitively expensive and Samsung has something comparable to TSMC's 5 nm, they might have to be an option.
Jonny_H@reddit
Maybe? But using a new fab and PDK isn't a trivial process but a multi-year project with significant investment, and lots of things like layout/low-level optimizations would all be invalidated.
So to make it worth it they'd have to undercut tsmc significantly for the costs to pay off, plus the risk of the market possibly being very different by the time things actually get to the level of being released.
dabocx@reddit
Samsung 2nm is shaping up to be pretty good, no clue if the yields are solid enough yet
upbeatchief@reddit
any high-end node is going to sell in this economy, even if bad yield rates make it cost twice as much as the fully booked competitor, because the competitor is not going to have capacity until 2028.
Klutzy-Residen@reddit
The cost of the GPU itself is pretty much nothing compared to the final cost of the product in the enterprise market, especially these days when they have both insane memory costs and margins.
Roxalon_Prime@reddit
Some deranged people in the AI space (or should I say bubble) blame TSMC for not being impressed with Altman's "7 trillion for fabs" plan and only investing a conservative 52 fucking billion dollars in capex in 2026
Blueberryburntpie@reddit
Altman wanted TSMC to use their own money to gamble on spamming fabs. TSMC has been in the fab business for decades and seen all of the booms-and-busts.
Had OpenAI had the money to use Apple's strategy of just throwing money at TSMC, they would have gotten somewhere.
wilkonk@reddit
yeah, they can't do miracles, the supply is the supply. Even if they wanted to set a low MSRP, with demand high and nothing else on the market the things will still end up expensive af.
Yebi@reddit
They've had tons of opportunities and fucked up every single one; I'm not holding my breath
goodnames679@reddit
AMD have never had such a fantastic opportunity to miss an opportunity
From-UoM@reddit
AMD will never release before Nvidia because they always wait for Nvidia's pricing.
Just look at rdna4 during CES
Firefox72@reddit
I mean if they have the chance to release months ahead of Nvidia they absolutely will.
ResponsibleJudge3172@reddit
Rtx 60 will launch H2 2027. Ignore nonsense rumors
sonnytron@reddit
Best they can do is match Nvidia’s strategy, raise their prices but give a semi “aw shucks we sure tried” shrug.
cadaada@reddit
We were told that for RDNA 4 when no Super versions were announced for Ada
HuntKey2603@reddit
no one is going to buy those compared to Nvidia. They were in that situation before.
glizzygobbler247@reddit
I mean prices are already up and are reported to keep rising; I don't see how they can deliver new products when it's hard to keep up with current ones
ls612@reddit
Sitting here with a system I built in 2022 and a 4090 I installed in 2023 and I'm legitimately wondering if I will be more worried about hardware reliability and longevity than I am about my hardware becoming obsolete. Like a 6090 is probably only going to be a 2x jump in performance over the 4090 so why would I build a new system? But on the other hand at that point my components will have seen 6 years of daily use and my GPU over 5 years of service and almost everything will be out of warranty. What a world we live in.
Quealdlor@reddit
with 9070 XT or 5070 Ti you can last till 2034
you can just change the CPU and RAM when they become faster and cheaper in 3 years
by 2027 there will be RDNA 5, and by 2028 there will be RTX 6080, Zen 7 and DDR6 - you will be able to build a computer with roughly 2x better performance and memory for the same price as today
even Zen 6 will most likely double performance/price this year already compared to Zen 5
Strazdas1@reddit
no new GPUs in 2026 was the basic expectation, not sure what's to report here. 60 series cadence would be 2027, though, not 2028.
MrMPFR@reddit
Seems like it's a misunderstanding. FY28 = 2027. People mix up the fiscal year with the calendar one.
Strazdas1@reddit
Ah, that makes sense. For Nvidia, FY28 will run 2027-02-01 to 2028-01-31. That's mostly 2027.
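To make the mapping concrete, a tiny sketch (the Feb-Jan fiscal calendar is per Nvidia's filings; the helper and its names are just illustrative):

```cpp
#include <iostream>

// Maps an Nvidia fiscal year label (e.g. FY2028) to the calendar span it
// covers, assuming the fiscal year runs Feb 1 of (FY - 1) to Jan 31 of FY.
struct CalendarSpan {
    int startYear; // February of this year
    int endYear;   // January of this year
};

CalendarSpan nvidiaFiscalToCalendar(int fiscalYear) {
    return CalendarSpan{fiscalYear - 1, fiscalYear};
}

int main() {
    CalendarSpan span = nvidiaFiscalToCalendar(2028);
    std::cout << "FY2028 = Feb " << span.startYear
              << " to Jan " << span.endYear << "\n"; // Feb 2027 to Jan 2028
}
```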
TophxSmash@reddit
love how these comments randomly believe this rumor.
BidnessBoy@reddit
I mean it doesn’t exactly seem that far fetched given what we knew about the Super refreshes for the 50 cards (VRAM upgrade)
Strazdas1@reddit
it seems very far fetched. Supers only ever existed in 2 generations to begin with; they were never guaranteed.
Any_Towel1456@reddit
Strange. I just read a 5090 Ti/Titan is expected Q3 2026.
https://overclocking.com/et-si-nvidia-lancait-une-nouvelle-rtx-50-en-2026/
Phelixx@reddit
I’m honestly fine with this because the 50 series cards are already expensive as hell and they can’t stock them. So if they just keep producing these cards it’s all good with me. It will be years before anyone can reasonably buy them anyways.
Not bad for studios to, you know, optimize their games either. No one can buy new rigs, so they are going to need to get creative or they won't have sales.
I’m thinking I’ll be hanging on to my 5070TI, the gods willing, for 8-10 years at this rate.
great_airflow@reddit
They should release cheaper cards tbh, or prosumer cards that justify the pricing - like if the 6070 is sub-$800 and the 6080 is sub-$1k at release, that'll be sweet. And the RTX 7000 Pro, if there's one, should be around 2-3k. But obviously, knowing the state of the market, these 60 series cards will cost as much as a kidney
Wooden-Ad-8204@reddit
So far, so good; on PC I mainly play Diablo 4 and Overwatch 2
laffer1@reddit
There is no vram to release products
damalixxer@reddit
I'm ok with this. I got my beast 5090, 64gb ram setup (last year before the current crazy price hike). I just feel bad for the people who aren't able to upgrade or build PCs in the current climate!
Deditch@reddit
remember when people were yelling at hardware unboxed
Voodoo2-SLi@reddit
Rumors about RTX60 in 2028 may be based on a misunderstanding. The original source probably meant nVidia's fiscal year rather than the calendar year. RTX60 in FY2028 largely corresponds to the calendar year 2027, as nVidia's FY2028 runs Feb 2027 to Jan 2028.
Source: 3DCenter.org
welsalex@reddit
That makes the most sense. No one is waiting over 2 years without releasing anything....
Inb4 next week the articles say 2029/2030. Seems to go up every time it's mentioned.
Jared_pop21@reddit
Does it matter if they do or don’t release then? Extra value for the cards we currently have seems nice.
randomkidlol@reddit
if they pull an intel where management starts hamstringing engineering to maximize profits by selling old shit at a premium, competition is bound to catch up eventually. based on jensen's old interviews about him pushing his engineers to start competing against their past selves, i highly doubt they will pull an intel.
MrMPFR@reddit
This news should prob be a separate post so it doesn't get buried here.
NectarineSame7303@reddit
Big doubt honestly, they'll just raise the prices and people will still buy it because eventually the 50 series stock will run out.
wainp@reddit
How much you want to bet this'll be used to attempt to strong-arm everyone into streaming gaming services?
Awhispersecho1@reddit
You will own nothing and everything will be a service. This is how it starts, raise prices until you price everyone out of the market and then eliminate the market
whalesum@reddit
Thank god I bought a 5080 at the end of november last year
Consistent_Story903@reddit
I bought a 4090 in Sep 2023 during a short period when there were some models available at or near MSRP. Feeling like I'm gonna get my money's worth on this one.
Exact_Library1144@reddit
Yeah I pulled the trigger on my first PC in over a decade in June 2025. Managed to get a 5080 for £810 through a friend in the hardware industry, and my 32GB CL30 6000MT/s RAM was less than £100 at the time.
Really feels like I couldn’t have timed it any better.
whalesum@reddit
Thats awesome. I managed to get a new pny 5080 OC for ~600 USD with discounts and then in a month it tripled in price lol
Exact_Library1144@reddit
That’s an excellent get, a no brainer if you have the money even without the benefit of hindsight.
whalesum@reddit
For real. I bought most of my computer parts from Walmart. Used my 10% employee discount on top of a PayPal 20% cash back offer they had with their Pay in 4. I made out like a bandit :D
vidati@reddit
This is a good opportunity for AMD to not worry about the next Gen and just focus on getting the software up to speed with Nvidia.
FSR4 is great, let's do it on the RX 6000/7000 series. Better implementation of Redstone across more games.
awr90@reddit
6000/7000 series lack the hardware to implement fsr4. Can’t believe people still don’t know this.
Tai9ch@reddit
RTX 6969, now with 4GB of GDDR5!
lan-devo@reddit
Just you wait for the new tech 360p to 8k scaling with 3.5 GB ram and 64 bit bus
NeroClaudius199907@reddit
VRAM is increasing across the board, except for the 60 class, which is stagnating.
lan-devo@reddit
Yes, but with the GDDR situation I wouldn't be too optimistic about it; more likely it stays the same except for the top model. Remaking the 3060 8 GB, TSMC saturated - in the end the increase in the whole chip+RAM price will be painful. The next top model will make the 5090's release price seem cheap
9isgt0@reddit
honestly, at this point idc anymore. i usually chased high-ultra graphics on my pc, but my steam deck made me realize games can be fun even in lower graphical settings. And there are plenty of older titles that are amazing. RX7800XT for next 5 years 😬
Xlxlredditor@reddit
Plus a 7800XT is still really nice
DHFearnot@reddit
New Xbox will be slightly faster than a 5090 and start Console Masterrace
Xlxlredditor@reddit
Source? Or maybe this is just obvious sarcasm I'm totally missing
Minimum_Exchange_622@reddit
The crazier part is this: my 4090 is out of its 3-year warranty, it's a 3-year-old GPU, in 2028 it's going to be almost 6 years old and it's still going to be a god-tier GPU for gaming... it's not even 1080 Ti legend level, it's 100 miles ahead of it
Xeryl@reddit
I saw the writing on the wall and bought an RTX 5080 in November (upgrading from a 3080 Ti). Had purchased a 4K monitor earlier in the year and was holding out for news about Ti/Super versions, but decided to bite the bullet. Tbh the rumours for the Ti/Supers didn't show them being a particularly urgent jump over a normal 5080 anyway.
Only problem now is the rest of my PC; not sure how affordable it'll be to upgrade in the foreseeable future, and it's 6-7 years old and I think my CPU struggles with 4K gaming (i7-10700K).
mrchicano209@reddit
I’m glad I was able to snatch a 4080 Super at MSRP. Thing is has been a real beast and expecting it to last me a long while.
Shanare_@reddit
GPU makers forgetting what industry made them this capable
ILoveTheAtomicBomb@reddit
Lmao. I love how Reddit thinks they're responsible for these companies and need to answer to em
gringewood@reddit
I think he means gaming, not Reddit. They kind of have a point; around 2020-21 Nvidia was getting around 50% of its revenue from gaming.
noiserr@reddit
PC gaming definitely didn't make AMD powerful. Consoles and datacenter did.
gringewood@reddit
We’re talking about nvidia?
noiserr@reddit
Top comment in this thread:
https://www.reddit.com/r/hardware/comments/1qxo1er/report_claims_nvidia_will_not_be_releasing_any/o3xve0d/
ILoveTheAtomicBomb@reddit
Even then that was prob just crypto bros trying to snag whatever they can. Gamers have never been a priority and it makes me laugh when they think they're some force companies need to answer to
Exist50@reddit
You really should look at Nvidia's financials pre AI boom. It was the backbone of Nvidia's business for most of its history.
ILoveTheAtomicBomb@reddit
Split pretty evenly between enterprise and consumers where I’d still bet consumers were counted as crypto
Either way, the fact people are trying to argue Nvidia owes them something because they might have purchased a gpu once in that time period is pretty funny. They owe you nothing lol. You bought a product, simple as that
Exist50@reddit
No, it was heavily consumer weighted. And if the furthest back you can remember is crypto, that's not doing you any favors...
Not going to weigh in on that myself. Just pointing out that it is accurate to claim that gaming put them on the map and drove the company to be in a position to intercept AI.
ILoveTheAtomicBomb@reddit
Poster I originally replied said 2020-21, height of crypto boom where Nvidia put those sales under consumers. Anything before that, sure, gamers had a little to do with their sales and never to the point where I’ll agree gamers made Nvidia lol
Exist50@reddit
Uh, no. Who else do you think Nvidia was selling to before DC took off? Professional graphics was always a significantly smaller market vs gaming, and iGPUs ate the basic display adapter market ages ago.
Exist50@reddit
And that was after the Mellanox acquisition. The ratio was even higher before.
nameonreddit__@reddit
He means gaming not reddit
zakats@reddit
I'd further amend that to 'individual consumers'
Tystros@reddit
yeah reddit is filled with gamers who think they're the center of the universe
MikeFrett@reddit
Everyone on the Internet thinks they are, or should be, the center of attention. That is how Social Media was born.
voidspace021@reddit
It is true though, Nvidia would not exist without gaming
MVIVN@reddit
I’m still running a 2080 Ti in the pc I built in 2019. I was planning to do a new build this year, but with all the fuckery going on with RAM and GPU prices, that’s on the back burner now. Besides, I do probably 95% of my gaming handheld these days on either my Steam Deck or my Switch 2, and the other 5% is split between the PS5 and PC. I think I’ve come to accept, after years of being primarily a desktop PC gamer, that now I just prefer convenience over cutting edge maxed out settings
xRedStaRx@reddit
I can sell my 4 year old 4090 today for more than I paid for it new back then, it's unprecedented.
koryaa@reddit
you could sell gtx10xx cards 5 years later for double the money.
shecho18@reddit
Can someone give me another diagnosis as to why I should care?!
jianh1989@reddit
Only thing I wonder is, what can I do if one day my current GPU fails by itself?
imaginary_num6er@reddit
Next coming up:
AMD confirmed they will be launching in 2029 since Nvidia is launching in late 2028
ConsistencyWelder@reddit
I knew this sub would find a way to spin this into bad news for AMD.
Pretty-Emphasis8160@reddit
3 years per generation (that too with proper gains per generation only limited to the top end)? Keep it up Nvidia and you'll find yourself in the same position as the US car industry in 5 to 10 years when Chinese companies get ahead.
gypsygib@reddit
Looks like the Nintendo model, where devs have to do more with older gen hardware will become common across the industry.
SargathusWA@reddit
Well, maybe developers will optimize their games now, because they cannot rely anymore on new powerful GPUs that they expect us to buy.
hank81@reddit
One gen every 2 years since the first Riva 128. Have you forgotten? Rubin should be scheduled for 2027 as usual.
LargeSinkholesInNYC@reddit
Just buy Nvidia stocks to fund your gaming laptop.
Liesthroughisteeth@reddit
And boy, are they going to cost you!!!
WorriedGiraffe2793@reddit
Big AAA games will need to keep releasing for the base PS5 for a couple more years given the PS6 hasn't been released yet.
plasticbug@reddit
I can't wait for nVidia to become irrelevant.. I mean it probably won't happen, but as a gamer, and a long time supporter of nVidia (I bought many of their cards, even before they were cool, starting with Riva 128), I feel abandoned.
Alas, I fear AMD and INTC are also chasing the AI money, rather than tech enthusiasts.
msolace@reddit
good, now let's just hope the 60 series isn't really a 50 series Super and is a real upgrade.
Bucser@reddit
I guess my RTX3080 has to chug along until 2029
Key_Fill_4857@reddit
If this is true I'm super happy I lost control and got a 5090.
mrandish@reddit
While this seems weird because it's a deviation from the pattern, it actually makes sense and not just due to the obvious "better to put scarce wafers/RAM/capital toward higher margin AI products." There are three other reasons:
3D GPUs have been rapidly evolving for over 30 years. We're now at a point where the silicon designs for real-time graphics pipelines have reached maturity. Most of the big wins have already been harvested. Of course, new design innovation will still happen but the performance leaps won't be as large, as frequent or as easy as they once were.
That makes gen-over-gen improvements increasingly reliant on new process nodes but the end of Moore's Law and Dennard Scaling have reduced those gains to typically sub-10%. And the gains no longer improve gate count, speed and cost all at once like they once did. There was a magical time when a major new node might deliver >20% more gates AND >20% more speed AND be >20% cheaper. Now it's more like ~9% more usable gates OR 11% faster BUT >20% more expensive.
For several years around roughly 2020 there was an extraordinary financial bubble driving irrational amounts of capital investment in large game studios. That drove unsustainable hiring and a bunch of acquisitions at insane valuations. The inevitable crash happened and we're still in the early parts of a long hangover and industry reset. It'll still be a few years before we fully recover to pre-bubble levels. Until then there'll be fewer AAA games which push the bleeding edge so amazingly that they simply can't properly shine without brand new GPU silicon. Sure, it's always possible to saturate performance by increasing resolution, FPS and maxing settings and that's never bad but there's not going to be many titles which make an obvious OMFG! difference from across the room.
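Putting rough numbers on the node-economics point (the ~9% and ~20% figures are the illustrative ones from the comment above, not real foundry pricing):

```cpp
#include <iostream>

// Illustrative only: if a node shrink gives ~9% more usable gates per wafer
// but the wafer costs ~20% more, cost per gate goes up rather than down.
int main() {
    double gateGain = 1.09;          // ~9% more usable gates (assumed)
    double waferCostIncrease = 1.20; // ~20% higher wafer cost (assumed)
    double relativeCostPerGate = waferCostIncrease / gateGain;
    std::cout << "Relative cost per gate on the new node: "
              << relativeCostPerGate << "x\n"; // ~1.10x, i.e. ~10% worse
}
```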
noonetoldmeismelled@reddit
I've been wanting min requirements to pretty much not really push past the PS4 for a long time. Crappy memory prices, Steam Deck, Nintendo Switch 2, continued popularity of the PS4 and we're getting that. The next leap will be when LPDDR6 makes it into budget laptops. I'm betting on a Steam Deck 2 waiting on LPDDR6 being at a reasonable cost for it
sonsofevil@reddit
Nice, then I don’t have to buy new hardware :) forever live my 4080S
Melbuf@reddit
maybe this will make game deves optimize their games
hahahahahahahahaha
dataplague@reddit
More time to save and enjoy my 5090
dsl2000@reddit
My guess right now is they will release 50 Supers in 2027 and use them as a launchpad for price hikes, so that another round of price hikes with the 60 series in late 2028 / early 2029 doesn't look as bad.
Valmarr@reddit
2028? That means over six years without a full-fledged generation. Because (apart from the RTX 5090) Blackwell cannot be called a real new generation when the new xx80 card is clearly weaker than the previous xx90, and the new xx70 is weaker than the previous xx80.
Dangerman1337@reddit
The RTX 60 series needs to be a big leap, like "something that does 2x a 4090 at least" offerings.
127-0-0-1_1@reddit
What does “need” mean?
If “need” is “need to make you upgrade”, fair enough, everyone has their own standards.
If “need” is “need to launch a successful product that sells many units and makes a lot of money” then absolutely not, they do not need to do that at all.
uppercuticus@reddit
Or what? People will stop buying? People will go to AMD? Jensen thinks he's benevolent for even selling to us plebs.
rchiwawa@reddit
Well, doubling my card's perf in all metrics is my dead minimum for replacement so...
🤞
I just hope my shitty EK block survives on the 4090 that long... I should probably look into snagging a spare and hunkering down
defensivedig0@reddit
I'm sure they'll be brought to their knees by you not purchasing. Considering people bought gpus through the crypto boom and people have bought the 40 and the 50 series... I don't think how "good" or "cost effective" or "better than the previous generation" a GPU is matters literally at all.
rchiwawa@reddit
Just stating my decades long standard while throwing out some commentary, jag-off. The sales at seemingly any price make your case and the rest is just irrelevant noise.
defensivedig0@reddit
Jeez, sorry. Didn't mean to offend you so badly.
rchiwawa@reddit
You didn't. Your lack of objectivity and prose earned you the title for that moment.
JJ3qnkpK@reddit
I, for one, will make a disappointed comment on reddit. That'll show 'em.
_Fibbles_@reddit
That's a pretty stupid definition for a generation. Especially since it depends what you're measuring. The 5080 has 35% more TOPs than a 4090.
Ill-Shake5731@reddit
and the xx60 cards are only good because the previous 40 series regressed really badly in comparison to the 30 series
Dull_Reply5229@reddit
Would be nice if that puts it in line with PCIe 6, AM6 and DDR6. Would be the perfect time to build a full next-gen PC 😀
keikun17@reddit
AMD : Do nothing.. Lose
txdv@reddit
Intel has the actual capacity to catch up
spellstrike@reddit
imagine if they had stayed in the memory business...
theineffablebob@reddit
They're getting back into it through a joint venture with Softbank and Saimemory. We might see it in 2028
spellstrike@reddit
according to this article prototypes perhaps soon but no commercialization till 2030 or later.
https://www.trendforce.com/news/2026/02/03/news-intel-reenters-dram-race-a-closer-look-at-the-z-angle-memory-collaboration-with-softbank/
Exist50@reddit
With what? The products they already killed?
buzzkill_aldrin@reddit
And they shelved the B770
Deckz@reddit
Top comment is a stupid one, never change reddit
zakats@reddit
And that's a good reason to be an asshole?
Deckz@reddit
yes
zakats@reddit
Weak
Darksider123@reddit
Especially this sub with the ever present brand wars.
Deckz@reddit
It's exhausting, I should know better than to click on the comments anymore. It's always the same recycled trash.
deefop@reddit
They're responding to precisely the same market forces as Nvidia. Unless they literally can't sell products to data centers for Ai, that's exactly what they'll do.
And the ram shortages are likely to fuck them over even if providing gaming gpus to nerds is their fondest wish.
pi-by-two@reddit
There are people who genuinely think AMD could release a card with 128 gigs of VRAM that performs like two 5090s for $499 if they just wanted to. For some strange reason, they just choose not to do it.
Intrepid_Lecture@reddit
Yep... adding to this... if 16GB RAM chips now cost $200 instead of $50... that previously $300 video card can easily get a $100 price hike while making AMD (or nVidia) less money.
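A quick back-of-the-envelope version of that BOM math, using only the hypothetical dollar figures from the comment above (the "other costs" figure is an added assumption):

```cpp
#include <iostream>

// Hypothetical numbers: memory jumps from $50 to $200 on a card that used to
// sell for $300, and the vendor only raises the price by $100.
int main() {
    double oldPrice = 300.0, oldMemCost = 50.0, otherCosts = 180.0; // otherCosts assumed
    double newPrice = 400.0, newMemCost = 200.0;

    double oldMargin = oldPrice - oldMemCost - otherCosts; // $70 per card
    double newMargin = newPrice - newMemCost - otherCosts; // $20 per card

    std::cout << "Old margin: $" << oldMargin
              << ", new margin after the $100 hike: $" << newMargin << "\n";
}
```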
imaginary_num6er@reddit
No, they’re going to launch in 2029 now
someguy50@reddit
And it’s be slightly worse than Nvidia option at a similar price
cheesecaker000@reddit
The old Nvidia -$50
cloudone@reddit
Wdym by do nothing? You don’t like MI450?
JapariParkRanger@reddit
Lose what? Gamers don't buy them, and just like nvidia they make an order of magnitude more money selling to data centers. With supply of everything constrained right now, there's zero chance any player is going to prioritize anything but that AI money.
angry_RL_player@reddit
Apologize to AMD right now
TripleS82@reddit
Yeah, I’ll be good with my 4070ti for a while.
Benji998@reddit
I'm perfectly content with my 3080 for a few more years. Only thing is if it breaks, that would be annoying.
Thorteris@reddit
Little did I know when I bought my 5080 last year that it would last me 3 years minimum
DeadPhoenix86@reddit
2028 seems generous. I will not be surprised if the AI craze keeps up and they exit the market altogether. I hope not, but I have this feeling this is the way they're going forward.
-CynicalPole-@reddit
Ofc, why waste silicon on gamers?.. AI, AI, AI, AI!
HisDivineOrder@reddit
Gamers can rent some time on GeForce Now, Xbox Cloud Gaming, or Luna after all. Imagine how annoyed Google is now that they're missing that gravy train, with the AI boys having deleted all the memory from the market and accidentally forced cloud gaming into finally becoming a real thing.
GalvenMin@reddit
Nvidia? You mean the AI company?
SXOSXO@reddit
I can't afford them anyway.
balrog687@reddit
On a positive note, once the bubble finally explodes, RAM prices will go down, so we will get lots of cheap VRAM
ibstrd@reddit
I get what you mean, though I still remember when Ethereum miners were left with meager opportunities for mining and I expected a flood of cheap GPUs to hit the used market... and then they just didn't. The prices barely moved.
I'll keep my hopes very low to avoid the same disappointment.
vankamme@reddit
Hopefully this will force developers to optimize their games more….
RedditNotFreeSpeech@reddit
And just like that, the 3090 on my shelf just went up in value another $100
equitymans@reddit
I'm actually still not convinced we'll see the 6000 series at all, haha; consumer GPUs may be dead before that. If not, it'll be the last gen, I'm betting. It makes no sense margin-wise for Nvidia to keep them going at all, lol; they're just doing it to be kind at this point and not get a total hate train
LastChancellor@reddit
lol lmao, I guess Intel just got an extra year to get their iGPUs to catch up to 50 series 😂
SEI_JAKU@reddit
If this UDNA (or whatever it's called at this point) ever actually comes out any time soon, we might get somewhere... but probably not with the market the way it is.
vialabo@reddit
Going to need that long to save for the 6090.
ZoteTheMitey@reddit
Just over 3 years of use out of my 4090. I don't see needing to replace it for quite a while at 3440x1440 165Hz. Especially considering I only play single-player games besides Elden Ring and Dark Souls.
Hang in there, old friend.
Balls_B_Itchy@reddit
I’ll have grandkids by the time I upgrade my 4090FE
purplelives@reddit
I'm glad I bought my 5090 FE (got super lucky on BB canada) last year for msrp. Sure, I don't use it a whole lot these days, but since I always wait 2 generations to upgrade, it looks like I'll be getting 5 years out of it by the time 70 series rolls around in 2030...
Method__Man@reddit
On the plus side all my parts are appreciating in value
On the down side imma tech YouTuber so I buy loads of stuff constantly... I'm legit worried about being able to keep up if this impacts gaming laptops (my primary focus)
Darksider123@reddit
My god this comment section is such an eyesore
12A1313IT@reddit
Yeah, people want better GPUs but graphics have already topped out. You don't need 100000 TFLOPS to play Valorant, League of Legends, etc. NVIDIA was way ahead of its time and already gave you the best possible
Pe-Te_FIN@reddit
2028 sounds fine for me. 4090 has still plenty of grunt left, still running pretty much everything maxed at 4K with DLSS quality.
alabasterskim@reddit
The good thing is this means devs will have to actually optimize their high end games
elbobo19@reddit
anyone that got a 4090 at launch at MSRP made one of the best tech purchases ever
SirMaster@reddit
Fine by me lol. No having to worry about paying crazy money for something new, for a while at least.
Flimsy_Swordfish_415@reddit
so no $1100 6060 in 2026?
wordswillneverhurtme@reddit
honestly sounds good since most game devs always "push the limit"... Maybe they'll optimize more. Plus I'll get more juice out of my 5080, though I didn't plan on upgrading it for another 5-6 years anyway
Spicy-hot_Ramen@reddit
Good, maybe at least some substantial boost will be present in rtx 60
SteveBored@reddit
Pc hardware is so boring these days. I’ve pretty much lost interest
Malygos_Spellweaver@reddit
The most exciting hardware is the lower power ones, Panther Lake and B390 look amazing.
caiteha@reddit
I don't know, I kinda like it so folks don't have to upgrade and devs can focus more on optimizing for current gen of cards.
viladrau@reddit
Come on Jensen! How about a 5070ti 8GB + Neural Texture Compression marketing? /jk
WolfeJib69@reddit
I’m gonna keep my 4080 super forever
RobsterCrawSoup@reddit
I think at this point I've managed to free my mind from the now anachronistic expectations we grew to have about the rate of advancement in the fields of photolithography, transistor design, and computer architecture. So I don't get mad or surprised when the time between generations of products widens and the improvement between generations narrows, but it still makes me a little sad to think about it. The slowing of technological progress is definitely nothing to celebrate, but on the upside, it means holding onto old hardware for much longer makes much more sense and you're missing out on much less.
GreatnessToTheMoon@reddit
GPUs have hit a bit of a wall anyway. More time between them is good
Actual-Suit-2997@reddit
highkey totally get it, spending on noctua fans for a mid-range build is like putting premium gas in a beat-up car lol
Valmarr@reddit
2028? That means over six years without a full-fledged generation. Because, apart from the RTX 5090, Blackwell cannot be called a real new generation when the new xx80 card is clearly weaker than the previous xx90, and the new xx70 is weaker than the previous xx80.
Forsaken_Arm5698@reddit (OP)
Big upgrade like 30 -> 40? or a milder one like 40 -> 50 ?