NVIDIA GeForce RTX 50 SUPER rumored to appear before 2026 - VideoCardz.com
Posted by KARMAAACS@reddit | hardware | View on Reddit | 183 comments
tedsmosh@reddit
Love that they are releasing a new card, while millions of 50 series cards have been unusable for 6+ months with no solution.
Negative-Key-6243@reddit
Are the RTX 50 cards turning into paperweights? How is that even possible???
jedidude75@reddit
Guessing no 5090 Super/TI this time around either though.
Vb_33@reddit
There is 0 competition for the 5090, it's way way faster than a 5080 and AMD's best is slower than the 5080.
SummonerYizus@reddit
The next gen Xbox will be as powerful as a 5080 and use an AMD chip. So I'm assuming AMD is releasing a new GPU in 2026
_BaaMMM_@reddit
even the 4090 > 5080.
Omotai@reddit
I think releasing a 48 GB 5090 is probably way too dangerous for their workstation cards. I can't see them doing it.
RogueIsCrap@reddit
High end gamers want more performance, not VRAM. 32GB is already more than enough for gaming, but the 5090 is barely adequate in new PT games, even with DLSS upscaling.
DerpSenpai@reddit
They literally cannot make a bigger 5090
Rude_Pie_5729@reddit
It would be around 10-15 percent. That's pretty solid!
jv9mmm@reddit
Can their workstation cards pool memory over nvlink? Because if they can, that alone would be enough to protect their workstation card line.
Plank_With_A_Nail_In@reddit
5090's aren't just being bought by gamers.
JtheNinja@reddit
Nvidia would rather they were only bought by gamers, and making a 5090S with 48GB will only make this problem worse. There are lots of workstation/compute tasks where the drivers don't matter and ECC isn't worth the premium; people only pay the Pro card markup for the extra VRAM
Beneficial_Two_2509@reddit
What? Nvidia, like every other company, cares only about their bottom line. If they wanted them only bought by gamers, they wouldn't charge $2k (in reality $3500). If they only had gamer sales, they'd go bankrupt on that card alone. They love that scalpers started bot-snatching the 3090 and 4090 because they could show their investors "look! We sold 100% of our stock in .3 seconds!". Then, they saw that people were actually paying scalper prices so they "joined in" and went from charging $699 for the 2080 to $1999 for the 5090.
Without scalpers, Nvidia never would have had the audacity to up the price on flagship cards by $1,300. That's why their sales went through the roof since the COVID scalping days, and they built their new architecture almost strictly for AI performance geared towards AI devs like Elon Musk, who preordered $13 billion in Blackwell chips.
NeverLookBothWays@reddit
Aside from gaming I’m also looking at VRAM for LLMs and stable diffusion, and the RTX 6000 Pro is absurdly expensive ($10k). 48GB on the Blackwell architecture would be a nice in-between.
Noreng@reddit
In many ways, the 5090 could barely be considered adequate, actually. VRAM requirements seem to increase at least as fast as, if not faster than, actual performance requirements.
amazingspiderlesbian@reddit
I don't know. I've literally never seen more than 45% VRAM usage on my 5090 except for 2 games.
Modded Cyberpunk 2077 with path tracing and like 30 4K-8K texture packs installed, which used like 19GB.
And pathtraced Indiana Jones at 4k which used like 17gb
Ethrealin@reddit
I did manage to run out of 24 gigs on a 4090 with the 4K pedestrian faces mod, but that was about it. 32 GB does sound like a hefty, 1080 Ti-like buffer: you'd want a new GPU for the latest titles comfortably before needing more VRAM.
amazingspiderlesbian@reddit
Cyberpunk seems to like choke and die even though it's not using the whole amount of VRAM, in my experience. If that's the game you're talking about.
Like on my 5080 I would get VRAM performance issues even though the game was only using 14ish gigabytes but was reserving 16. It seems like if the reserved amount goes over the VRAM buffer limit it'll die, even if it's not using all of it.
Like I can see the allocated VRAM amount in Cyberpunk with all the texture mods is like 22-24. Maybe peaking past 24 a bit, which would fold your 4090. But it's only actually using like 18
Ethrealin@reddit
That seems about right to me (and yes, I was referring to Cyberpunk). My game started to choke at about 22 gigs displayed in Afterburner, and removing the 4K pedestrians mod lowered it to sub-20 gigs.
Noreng@reddit
If the PS6 or next gen Xbox gets 32GB or more, you can be pretty sure 24GB will be troublesome, and 32GB reasonable
amazingspiderlesbian@reddit
Yeah, I can see VRAM requirements going up a few years after the PS6 launch, when all the PS6/next-gen Xbox exclusive games start getting finished and published and the cross-gen period is over.
But even then I wouldn't expect more than a doubling of VRAM requirements. Because currently you don't even need 16GB or more, unless you're using like path tracing and high res texture packs combined, and I don't think even the PS6 and next Xbox will be strong enough to use PT
Noreng@reddit
I wouldn't be surprised if the 5090 is capable of competing with the 7080 in half a decade's time.
capybooya@reddit
Absolutely. Although I fear that as cost is an ever bigger challenge, they might cheap out and go with 24GB and count on AI to sort out the rest (which even in the most optimistic scenario probably won't work well toward the end of the generation in 2034...).
panchovix@reddit
Wan 2.2 released today and you need like 60GB VRAM to run it fully on GPU (if not more) at fp16 lol.
Only 80GB+ VRAM chads can do it.
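The rough figure checks out with simple arithmetic: fp16 weights take two bytes per parameter, so weight memory scales linearly with model size. A minimal sketch, with an illustrative parameter count (not Wan 2.2's exact spec) and a made-up function name:

```python
def fp16_weight_gib(params_billion: float) -> float:
    """VRAM needed for model weights alone at fp16 (2 bytes per parameter)."""
    return params_billion * 1e9 * 2 / 1024**3

# An illustrative ~28B-parameter model: weights alone already overflow a
# 5090's 32GB, before activations and the latent working set are counted.
print(round(fp16_weight_gib(28), 1))  # → 52.2
```

Add activations and the denoising working set on top of that and the "60GB+ or bust" complaint above is plausible.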
NeroClaudius199907@reddit
That's why Jensen invented MFG.
At 4K, all the path tracing games on a 5090 are like ~32fps.
Even if the 6090 improves things by 60%, you'll still need DLSS
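The arithmetic behind that claim, using the rough figures from the comment above (a ~32 fps path-traced 4K baseline and a hypothetical 60% generational uplift):

```python
# Rough figures from the thread, not benchmarks: ~32 fps path-traced 4K
# on a 5090, and a hypothetical 60% uplift for a next-gen card.
base_fps = 32
native_fps = base_fps * 1.60
print(native_fps)  # → 51.2, still under a 60 fps native target
```

Hence the point: even a big generational jump leaves native path tracing short of 60 fps, so upscaling/frame generation stays in the picture.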
Dangerman1337@reddit
They'll do 48GB for a 6090/6090 Ti next gen.
Vb_33@reddit
4GB modules would actually have to be manufactured first, and I don't imagine that'll happen any time soon. There is one difference in the modern era though: even GDDR memory is feeding the AI revolution, so perhaps that demand could accelerate progress.
Dangerman1337@reddit
I mean that Kepler-backed MLID leak with a 128GB AT0 SKU is only viable with 4GB modules. 3GB to 4GB in the span of two years isn't impossible (hell, I wouldn't be surprised to see Pro cards using 5GB modules in 2029 or so).
Vb_33@reddit
You're referring to this diagram?
Assuming it's real, there are indeed 32Gbit memory modules referenced in it, but they're paired with 184 CUs as well as PCIe 6, and apparently aimed at the COVID (?) market, i.e. not desktop gaming. The desktop gaming big chip is using 24Gbit memory modules and apparently only has 36GB of memory, PCIe 5 support and 150 CUs. It's an interesting diagram for sure, I hope RDNA5 is a home run.
Caffdy@reddit
don't threaten me with a good time
NerdProcrastinating@reddit
RTX 6000 Pro Blackwell is effectively the RTX 5090 Super (priced).
capybooya@reddit
There never is. Well, I guess with the exception of the 3090 Ti, but that was kind of a joke, done only to justify increasing the price during the mining boom.
_BaaMMM_@reddit
why when you can sell more gb100s or whatever enterprise card for 10x
LuluButterFive@reddit
Just 4x more for the RTX 6000 Pro Blackwell
Firefox72@reddit
The 5070 Super will be my next GPU if it manifests with that 18GB of VRAM.
TomiMan7@reddit
You would go to just a 5070 from a 6700xt? Makes no sense.
Firefox72@reddit
Because it's an 80% uplift in raster performance? Over 100% in RT, and in the heavier games even more. It makes RT usable when it's currently not on my 6700XT.
I'd get access to DLSS4 and Ray Reconstruction versus having to use FSR3, which will never get better.
TomiMan7@reddit
Get the 9070XT. It's closer to a 5080, you get more than 100% uplift, you get FSR4 and all the other AI stuff, without the stupid 2x6 melting connector, and you don't have to shell out $1,000 on that either.
NGGKroze@reddit
I did replace my 6700XT with a 4070S (basically a 5070) when the 4070S released, and to tell you, the power is there, the RT is there, the upscaling is there, as well as efficiency, but the 12GB really starts to limit me in some scenarios.
I'm going for the 5070 Ti 24GB, as LLMs will love it as well.
Rude_Pie_5729@reddit
The fact that DLSS 4 balanced and performance are decent definitely helps mitigate the sparse vram capacity.
HateMyPizza@reddit
I replaced my 6700XT with a 9070 and couldn't be happier. One of the most efficient GPUs, has 16GB of VRAM, really powerful. The only downside for me is memory temperature.
Antagonin@reddit
Why not? You won't ever need more than 64KB. /s
TheCh0rt@reddit
640KB!
I remember Star Trek: 25th Anniversary took up a whopping 560KB and it took FOREVER to get my config.sys drivers lean enough to run it.
bluntspoon@reddit
Holy crap I’d forgotten about having to do that!
FrankLReddit@reddit
Load High!
FlygonBreloom@reddit
Apparently BLAST PROCESSING DMA from RAM to VRAM is good enough for any GPU.
morgothinropthrow@reddit
Will it be worth it to upgrade from a 5070 to a 5070 Super?
Lamborghini4616@reddit
Gotta consoom
JerichoVankowicz@reddit
I got a 5070 and it is a really strong card, like top 5-10% of the Steam charts. I won't give money to Jensen for their mistake just to get the Super series. I will wait at least 2 years to get the 60 series.
Lamborghini4616@reddit
You know you don't have to buy a card every generation right?
morgothinropthrow@reddit
These 18 gigs sound nice doesn't it
Lamborghini4616@reddit
Not when you already have a 5070
morgothinropthrow@reddit
I could sell my card for good money. I am a Sunday gamer and I have played only 20 hours on it while undervolted. I am really not trolling. If I update my monitor, which I bought 2 years ago, I could go for a 4K card like the 5070 Ti Super.
Skrattinn@reddit
Depends on your target resolution. My own 5080 is already cutting it a bit short in a few games at 4k with DLAA. Meanwhile, 1440p with DLSS upscaling will likely be fine on 12GB cards until whenever the PS6 comes out.
PS6 won't likely come out for another 2-3 years. I'd much rather wait and upgrade shortly before that since those cards will likely have the same memory config.
Chimbondaowns@reddit
Jensen does need a new jacket.
Jeep-Eep@reddit
That thing will be the real competition to the 9070.
morgothinropthrow@reddit
Turn RT on 9070 to get 25 fps 🤡
DepravedPrecedence@reddit
RT in 2025 🤡 🤡 🤡
morgothinropthrow@reddit
TFW pure raster in 2025 ??? Are people ragebaiting
RedIndianRobin@reddit
I think they meant the 9070 can handle RT just fine.
JerichoVankowicz@reddit
He is right, 30 fps RT lol. I had a 9070 and instantly went to a 5070. Now I can play ultra native full HD with max ray tracing at 50-60 fps. Best decision ever.
Vb_33@reddit
Technically the 5070 already is. It's cheaper, has the Nvidia feature set, and it's close in performance. Only downside is VRAM.
salcedoge@reddit
The 5070 unironically being the okay budget option is pretty funny.
People clowned AMD for pricing the 9070xt and 9070 too close but imo it actually worked because I’ve seen way too many people overpay for the standard 9070 because all the reviews shat on the 5070 and it shared a lot of goodwill from the xt variant
wilkonk@reddit
the 9070 is actually the better card at the same price though... especially if you undervolt it, it's not far off the XT.
PovertyTax@reddit
Don't count on it... the 5080 has 16GB of VRAM after all
Prince_Uncharming@reddit
3GB GDDR7 means the 5070 would jump from 12 to 18gb. A theoretical 5080 super would go from 16 to 24.
Antagonin@reddit
I don't see a reason why they couldn't cut them down to a 160-bit bus, to reach a nice and round 15GB, ofc for $100 more.
bubblesort33@reddit
Because you'd get something slower than an RTX 5070, but with 3gb more VRAM.
Antagonin@reddit
I know it might seem a bit counterintuitive at first, but you're forgetting one important fact...
This is ngreedia we are talking about.
KinG131@reddit
It'd literally cost them more money to re-engineer the bit bus than they'd save on the 1 vram chip. They're not doing this to be the good guys, they're doing this because it's a good business decision.
Antagonin@reddit
What reengineering? Every 32 bit MC is independent, they can literally just cut them post-manufacture.
Vb_33@reddit
That would be a smaller chip, so it would be weaker; it would have to be a 5060 Ti, but now it would have less VRAM than it already does.
Noreng@reddit
Introducing the 5060 Ti Super for $499
sharkyzarous@reddit
it might mine too if it comes before currency crash :)
Primus_is_OK_I_guess@reddit
I'd bet it will cost nearly as much as a 5070ti.
ExplodingFistz@reddit
Probably $650 so it doesn't cannibalize either of the adjacent cards.
Wardious@reddit
Me too, i cant replace my 3060 ti with a 12GB card.
TheMegaDriver2@reddit
You can just get an 8 GB GPU. AMD and Nvidia both agree that this is enough. Don't know why they even bother selling other configs.
hyxon4@reddit
I hope so. It's time to replace my GTX 1070, but I'm not switching from an 8 GB to a 12 GB card after 9 years.
BitRunner64@reddit
I solved this problem by getting a 9070 XT 16 GB.
randomIndividual21@reddit
Both AMD and Nvidia sucked this gen and the last. It's not like the 9070XT is much better value than the 5070 Ti. I got the former, but would definitely have opted for the 5070 Ti if it weren't for its crazy inflated price at launch. The 80 watts extra and the lack of FSR4 make me regret it a bit imo.
HotRoderX@reddit
so you play one of the like six games in existence with FSR4.
ThankGodImBipolar@reddit
Wouldn’t you upgrade your card so that you DON’T have to use upscaling anymore?? And the upcoming games where you might want upscaling will probably have FSR 4; that’s how it worked for 2 and 3 when they weren’t supported in anything either.
Rude_Pie_5729@reddit
The problem is, DLSS 4 quality and FSR 4 quality help mitigate some of the undesirable effects of TAA. So many people consider a decent upscaler a necessity these days.
Stiryx@reddit
Not OP but I have a 480hz monitor so I need all the frames I can get.
Thrashy@reddit
I hate that it's such a hacky band-aid, but Optiscaler really unlocks the card's potential in games that haven't or won't get official FSR4 support, and it's made it much less of a loss to miss out on the broad support of DLSS.
Rude_Pie_5729@reddit
From my understanding, with Optiscaler, you can use fsr4 in games that have dlss 2 or later support. Does it work with almost EVERY game that fits that criteria?
Derpface123@reddit
How well does it work? Any weird artifacts?
Thrashy@reddit
Granted that my use of it has been somewhat limited, but the only time I've seen any oddities are when enabling its built-in framegen (which is not great). For regular upscaling, it's seamless.
dorting@reddit
Optiscaler just works when you install it correctly, no artifacts.
Ultravis66@reddit
I disagree, I think AMD did a good job this time around. You can buy either card, 9070 or 9070XT, and get reasonably good performance for the price. If I were in the market right now, it's the card I would buy.
I know people who own it and are very pleased with it. Everyone I know games at 1440p except one person at 4K, but they're using an older AMD card and have not upgraded yet.
wewd@reddit
I'm playing RDR2 on a Dual UHD (7680x2160) monitor with a 9070 XT, using the Hardware Unboxed settings and getting 85 fps average at native resolution, without any weird stuff enabled in Adrenalin. I'm very pleased with the card.
Ultravis66@reddit
I waited and waited and waited for amd to release this card but couldn’t wait any longer, so I ended up with a 4070 ti super. Good enough for me. I was gaming on a dying msi laptop running an old 2060 mobile.
_BaaMMM_@reddit
5070 ti constantly popping up at msrp has me tempted. might just wait for the super idk
goodnames679@reddit
At this point I'd personally just tough it out for the super. The temptation is real, but the generation as a whole is underwhelming.
I'm personally holding out on this gen entirely. In a year or two I'll do a full new PC with the next generation of cards and AM6.
hyxon4@reddit
I wish CUDA weren't proprietary.
ijustlurkhere_@reddit
I was about to click 'buy' on a 5070 ti, i guess i'll wait.
Shidell@reddit
I'm gonna pull the trigger on the pny 5070 ti oc @ 750 @ best buy. Rumors, scalping... too many unknowns. At this price, I'm just going in.
tedsmosh@reddit
Careful with any 50 series, so many of them have turned into paperweights in the last 6 months and Nvidia is ignoring it.
ButtPlugForPM@reddit
If they're smart:
a 5080 with 20 percent more shaders and cores, plus 24GB, and it will sell well.
Nicholas-Steel@reddit
From what I've read over the last couple months AMD's upcoming RDNA5 graphics cards are playing catch up with Nvidia so Nvidia likely just needs to lower prices (in addition to increased VRAM capacity) to sustain their momentum in the market.
Method__Man@reddit
AMD is already caught up. Dollar per frame it's much better. Really, AMD is only behind on path tracing, which in those GPU segments isn't really relevant. You are looking at a 5090 or 4090 if you want to properly utilize path tracing
SchizoNaught@reddit
the issue for gamers, with amd, is a lack of quality drivers from AMD on anything other than the 7900XTX or 9070XT... and very little game developer attention
Method__Man@reddit
wtf are you smoking. AMD drivers are WAY more stable than Nvidia's now. This isn't even a debate.
Blackwell drivers are a fucking debacle that even Nvidia admitted to
SchizoNaught@reddit
For the bigger or newer cards? Maybe. But for the 7600XT, I can assure you they stopped caring.
feanor512@reddit
Waiting to upgrade my 6900XT 16GB until the rumored 9070XTX 32GB or 5070Ti Super 24GB come out.
RedIndianRobin@reddit
There's no such thing as a 9070XTX 32GB lmao. Where did you hear that from? MILD?
SchizoNaught@reddit
reading is hard, i get it. but they said "rumored". they didn't claim that it exists.
chiplover3000@reddit
Don't care, it will be too expensive.
BasedDaemonTargaryen@reddit
Scalped + overpriced + shit stock for months until it stabilizes and then 6000 series will be 6 months away as well.
Sharp_eee@reddit
You reckon it will happen again like this? I’m trying to work out what to do with timing. I was going to get a 5070ti/5080 at the end of the year as that’s when I’ll start to have some free time again to game. No point me buying now as I don’t have time to use it. If I wait though, I could get hit with the new release and higher priced cards. Alternatively, most people could be wrong and they will release at a decent price…same odds as the king of Nvidia selling his leather jacket collection.
BasedDaemonTargaryen@reddit
I think it'll definitely get scalped. Not sure about stock though. Most likely we'll see the cards around february again and we'll see a repeat of the 50 series except now there's the AI guys tryna get them all due to the insane amounts of VRAM (especially if they release the 5070ti super).
Sharp_eee@reddit
Hard to know, but you are probably right. Might be better off buying around Black Friday, when the current 5070 Ti will be at its lowest. Hard to buy when a new release is literally right around the corner. It will be the right move though if it's a repeat of the last launch. It would be like those who bought the cheap 4080s just a couple months prior to the 5000 drop.
BasedDaemonTargaryen@reddit
Unless you need the extra VRAM for productivity purposes, 16GB is more than enough even for 4K, so you're not gonna be missing out on much anyways.
Sharp_eee@reddit
I don’t need it for productivity, but the main use for my PC is a sim rig that pushes 3 x 1440p screens. At the moment the sim I use only needs 12gb or so of VRAM, but the engine is old and utilizes the GPU and CPU in weird ways. They are building a new graphics engine which will utilize modern GPU features better and it could push VRAM usage high. There are some other sims that do. I also have a 4K OLED for more traditional gaming.
BasedDaemonTargaryen@reddit
Ooh that's a good point. It's hard to say in that case until you know how much VRAM the new engine will use. But surely dropping the texture quality from ultra high to one tier lower will do the trick in the worst case scenario.
Sharp_eee@reddit
This is true - for normal gaming anyway. Settings are little different in each sim. They might have an option like that though in the new graphics engine. Lots of guesswork with buying new hardware these days.
UltimateSlayer3001@reddit
Here we go, time for the same ride we’ve been doing since the 20 series launch lmao.
Decent_Abrocoma_6766@reddit
Does anyone else agree with me that I feel a bit betrayed that this is happening so soon? I just bought a 5070 Ti, and yet there's going to be a better-value card coming out. This puts me in a difficult spot of potentially returning my card or just sucking it up and carrying on.
Solid-Transition4402@reddit
Nah, I feel the same. My return window is up though, and at least 16 gigs will be enough for a while, but it does suck. A 24-gig card would ensure parity on texture quality settings with the inevitable PS6 generation.
k0unitX@reddit
I understand that everyone loves complaining about getting shafted by VRAM capacity, but this obsession about talking about nothing but VRAM lately is getting dangerous
The reality is 99.9% of games on Steam can be played at 4K max settings with 8GB VRAM just fine, and certainly with 12GB. Not everyone is trying to play Indiana Jones at 4K max on repeat every single day.
All of this VRAM talk will push uninformed buyers to get a 5060 with 16GB VRAM over a 5070 with 12, while it's extremely likely they will have an overall superior gaming experience with the 5070.
When can we start talking about CUDA cores again? I'm much more upset how the 5070ti, 5080 are cut down compared to the 5090 in terms of CUDA cores than these boring repetitive VRAM discussions.
Nicholas-Steel@reddit
2025 games and older maybe, sure, but people want their cards to sustain their desired texture quality and such over a period of multiple years when looking to buy a new graphics card. Guess what excess VRAM capacity allows for?
k0unitX@reddit
Hate to break it to you but developers will need to target 8 - 12GB of VRAM for the foreseeable future
Nicholas-Steel@reddit
Yes, and the games will look abysmal at low texture quality. I dunno why anyone would want to play a game where all the ground, walls, ceiling and model surfaces are smudged.
Rustic_gan123@reddit
During 2025, yes. Over the next few years it is far from certain that 8 GB will be enough, given the release of new-generation consoles and the corresponding revision of target specs for developers. Plus, NVIDIA will most likely switch to a new process node, and AMD to a new architecture, so the next generation should make a bigger leap than the 40xx and 50xx did (at least I hope so; it's unknown whether NVIDIA and AMD will pull the same manipulations...)
only_r3ad_the_titl3@reddit
Also, HUB regularly uses settings to prove 8 GB isn't enough where even the 5060 Ti 16 GB struggles to get playable framerates. However, they don't do the same when it comes to RT.
UltimateSlayer3001@reddit
I’m gonna need a $500 equivalent to a 9070xt; gone are the days of $750 middle-of-the-pack GPUs. Especially with how horribly-optimized games are being shoveled out of the woodwork these days, it’s not worth it even as a thought.
InevitableSherbert36@reddit
Original source: TweakTown.
Darksider123@reddit
Tweaktown is a terrible source
Salty_Tonight8521@reddit
Do you guys think it is worth it to wait for 5070ti super if I'm gonna mainly game at 1440p and don't really care about AI?
ghostsilver@reddit
16GB should be plenty for 1440p for several years at least. No need for the extra VRAM from the Super.
The non-TI Super would be interesting though.
morgothinropthrow@reddit
Idk, my 5070 12GB slays everything at ultra 60fps at 1440p with an R5 9600X and isn't using 100% of its resources. I will upgrade it when it isn't enough anymore, so around 2 years in the future
Locke357@reddit
I have a feeling pricing will be an issue
However if it makes a brief window of reduced prices for non-super variants... now that would be swell
1mVeryH4ppy@reddit
Does it matter... you will still need to choose between instantly sold out FE cards or overpriced AIB models.
rrbrn@reddit
Downvoted but right.
rrbrn@reddit
Everyone waiting for the Super versions means months waiting until we’ll see them at MSRP…
__________________99@reddit
Nobody gives a shit. The only thing we want is a 5080 Ti for something to fill that huge performance gap between the 5080 and 5090.
THXFLS@reddit
Eh, I'll still consider it, but I'd definitely rather they turned the RTX Pro 5000 into a 5080 Ti.
HobartTasmania@reddit
Well, there's generally only two things to consider in cases like this, which was always the case in the past:
(1) How powerful the GPU is, determines the maximum resolution you can comfortably game at.
(2) The resolution you are gaming at, determines how much VRAM you need to have. With texture compression these days, then who really knows for sure how much you need to have now.
Therefore, there's not much point having one of those when you don't have the other.
Morningst4r@reddit
That needs a whole new die, so chances are the 6080 will be the best card to slot into that gap
human-0@reddit
Why is there a 5090 D V2 that has less memory and worse performance than a 5090, and then why create a 5080 Super that's nearly identical to the crippled 5090 D V2?
THXFLS@reddit
5090 D v2 still has a 50% wider memory bus and 2x the cores.
shugthedug3@reddit
Surprising if true but very welcome if they come with expected VRAM increases.
only_r3ad_the_titl3@reddit
Why is this a slap in the face? 3GB chips becoming more available isn't something unknown, so this update has been rumored basically since the cards launched. It also won't make your current card worse.
MrGunny94@reddit
Just recently made the switch from an XTX to a 5080, and to me, thus far, 16GB is more than enough.
Might upgrade next generation to a 90-class if I see that it isn't enough VRAM by then, though I doubt it
killermojo@reddit
What res?
Bluemischief123@reddit
I did the same thing and playing at 4k 16gb vs 24gb made no actual performance difference (or limitation I should say) for me personally so far.
MrGunny94@reddit
Same at both 4K and at 3440x1440 (ultrawide)
LLMprophet@reddit
I went from 3080 to 5080 at 1440p.
hackenclaw@reddit
the 8GB $300 card needs to die already, it is ridiculous that this can go as expensive as a 5070 laptop. wtf
ThankGodImBipolar@reddit
Back in the day, a move like this would have heavily damaged Nvidia’s reputation, since they’re fucking over their strongest consumers (day one adopters) so quickly after launch. Is the market just too big (and/or potential profit too small) for Nvidia to really give a fuck nowadays??
MyWifeHasAPhatAss@reddit
This is a bad take and not thought out at all.
A swift & effective resolution to the largest criticism is now equated with not giving a fuck? Making adjustments and giving people exactly what they are asking for is called listening to feedback. They don't need to delay that response on behalf of jealous fee-fees or childish reactions like this one. This doesn't hurt anyone's GPU, and if they are that bothered by not having the newest one, they can "upgrade" like anyone else. It's never been easier to do that; most people got more money for their used 4080s & 4090s than they paid for them brand new. That's still happening for 4090s and 5080s.
Demand far outweighed supply at launch and for several months - being a launch day customer was a matter of luck, not an indication of being nvidias strongest customers LOL.
ThankGodImBipolar@reddit
I feel like your comment is written as if I own or have ever spent mine/somebody else’s money on the 5000 series of GPUs, and I just want to be clear that that is not the case; I own a 6600XT. I also didn’t spend money on the 2000 series or 4000 series where this happened as well, and the “take” in my comment was based on the reaction that I saw when Nvidia pulled the same move on non-Super purchasers of those series. The complaining was loudest during the 2000 series, it was less for the 4000 series, and nobody had commented on it under this thread when I posted it, so I thought there was an interesting discussion to have.
A swift & effective resolution to the largest criticism is now equated with not giving a fuck?
I think the important distinction here is that the “largest criticism” with these products was a choice that was made by Nvidia that made their products less useful/valuable for the people who bought them. Let’s not pretend that Nvidia didn’t know that people would be unhappy with a 12GB 5070; people were unhappy with a 10GB 3080 back in 2020. I don’t believe that Nvidia fixing a manufactured problem is a cause for praise (quite the opposite actually).
being a launch day customer was a matter of luck, not an indication of being nvidias strongest customers LOL.
This is also not really what it’s about. Being a part of the bleeding edge means risking a potentially degraded software experience compared to last gen. Nvidia has been real good about that lately (which may be related to the strength of demand at launch), but you sign up to be a beta tester when you buy hardware based on brand new architectures, and everyone who bought a 5000 series card without getting that experience previously learnt that lesson the hard way.
Curious whether your take is actually thought out better than mine or not
MyWifeHasAPhatAss@reddit
>I feel like your comment is written as if I own or have ever spent mine/somebody else’s money on the 5000 series of GPUs
Respectfully(sincerely, not sarcastically), I would say to re-read it then. I specifically avoided pinning it to your perspective, saying things like "doesnt hurt anyone's gpu", "if they are that bothered...they can upgrade", etc. I noticed you didnt specifically say you bought one, so I got ahead of it.
Your comment about the 50 series VRAM doesn't really track for me. You framed it like people didn't have full control over their choice to buy a Blackwell GPU, or were otherwise deceived about the VRAM specs when they clicked the button to buy it... That's victimizing the customers in an unnecessary and imo untrue way. People are fully welcome to not buy a product they deem not good enough. I was one of the people trying hard to get a 5080 within $100 of MSRP and was just unsuccessful. You are also playing both sides of the fence: unhappy about low VRAM and now simultaneously complaining about the rumor that there'll be options with more VRAM soon.
ThankGodImBipolar@reddit
I don’t really disagree with your argument, but I try to be sympathetic as well. Several of my friends are running Pascal cards, for example - it would be hard to blame them for upgrading 8 years later, even if the 5070 still had a disappointing amount of VRAM. Neither of them have, but if they did, I could understand why they might be upset.
And from a practical perspective, if Nvidia is going to be making GB205 dies no matter what, it’d be nice to see them going into cards that will last as long as possible. Making a 5070 with 12GB of RAM isn’t planned obsolescence, because Nvidia ultimately isn’t the party that makes the 5070 obsolete - but it is intentionally myopic, in order to encourage user spending (+waste) and to prevent another Pascal situation.
Like you said though, not buying will always be an option. The 9070XT is also an option. And previous generation high end cards can be an option. Not releasing gimped versions of your cards to slightly pad your margins for a year - also an option. Even if you can blame the consumer for buying cards that they ultimately weren’t happy with (which I surely did somewhere in my comment history the last time this happened), I still feel like this launch strategy is pointless (for the general public) and wasteful, and Nvidia deserves to get dragged for it.
panchovix@reddit
I mean, it's not that "rare". They released the 3090 Ti (Jan-March 2022) and then a card ~60% faster in the same year (4090, Oct 2022).
surf_greatriver_v4@reddit
they have like 90% of the consumer dGPU market, and to most buyers they're the only GPU maker they even know of
that's why they'll be fine
H3LLGHa5T@reddit
meh, I'll probably wait for the 6000 series refresh or the AMD equivalent when they drop, performance uplift from the 4000 series was too small anyway.
dumbdarkcat@reddit
Will they do a Blackwell N3 refresh? Could lower the power draw by 15-20% while having a bit better performance.
KARMAAACS@reddit (OP)
Not a chance. NVIDIA is not going to waste money on something like that when their next architecture is already brewing on 3nm or 2nm, and everything they have now is already in high demand (except for the garbage 8GB cards).
NeroClaudius199907@reddit
Aren't the 8GB cards going to sell the most units, like every previous gen?
KARMAAACS@reddit (OP)
Sure, but their yields and dies per wafer are way higher than the larger dies', so relative to their supply they're probably underperforming demand compared to the 5090.
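(To put rough numbers on the dies-per-wafer point: this is an illustrative sketch using the standard first-order gross-die estimate on a 300 mm wafer. The die areas below, ~181 mm² for GB206/5060-class and ~750 mm² for GB202/5090-class, are approximate public figures, and the formula ignores defect yield, scribe lines, and partial edge dies.)

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order gross die estimate: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius * radius
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Approximate die sizes: GB206 (5060) ~181 mm^2, GB202 (5090) ~750 mm^2
small = gross_dies_per_wafer(181)
big = gross_dies_per_wafer(750)
print(small, big)  # roughly 5x as many small dies per wafer
```

So each wafer yields on the order of five times as many 5060-class dies as 5090-class dies, which is why "sells the most units" and "highest demand relative to supply" aren't the same claim.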
NeroClaudius199907@reddit
Yields this yields that...people are poor. 5090s cost $2000+
KARMAAACS@reddit (OP)
Yes but their demand is high relative to how many dies there are, unlike 5050s and 5060s.
NeroClaudius199907@reddit
I disagree, here's why: Steam initial sales (similar timeframe)
RTX 5060 (0.34%) has nearly identical adoption to the RTX 4060 (0.33%) and 4060M (0.28%) (May-June data)
RTX 5090 sits at 0.19% from January to July, compared to 0.33% for the 4090 from October to February
That doesn’t point to massive 5090 demand; it suggests limited availability, not outsized interest.
It's even shown in the JPR dGPU shipment decrease.
_elijahwright@reddit
I think there are probably going to be more people buying 5090s for local inference than there are 4090s. it's not worth paying scalper prices unless you desperately need CUDA and tensor cores, a larger memory bus, more VRAM, larger L2, etc. there are still shortages even if the 5090 isn't at MSRP because of AI workflows
KARMAAACS@reddit (OP)
You're misinterpreting what I am saying.
What I said was that relative to how many dies there are, 5090 has higher demand. That doesn't mean 5090 sells more units. It means that 5090 is sold out or sells for a high price due to lack of supply.
If you REALLY believe that the 5090 is not in high demand, then I suggest you try and find one in stock and at MSRP. Also most 5090s are not going to gamers, they're going toward AI in China and other regions, hence why it won't really show in Steam Hardware Survey, because they're not going into gaming rigs.
Vb_33@reddit
8GB cards sell the most out of any of their cards, enthusiasts are disconnected from reality here.
NeroClaudius199907@reddit
That's the plan for Rubin + new features.
Blazr5402@reddit
5060 Super with 12 GB of RAM could be a great card if it's price-competitive with the 16GB 9060XT. Less VRAM would be an alright tradeoff for Nvidia's more mature AI suite.
l1qq@reddit
I will own a 5070ti Super or 5080 Super on day 1. The lack of VRAM was the only thing keeping me from buying already.
awr90@reddit
You aren’t getting a 70 ti super this gen. It’ll be 5070 super, 5080 super.
Plank_With_A_Nail_In@reddit
https://www.techpowerup.com/gpu-specs/geforce-rtx-5070-ti-super.c4312
l1qq@reddit
It's going along with the same rumors as the rest, but nevertheless I'll be getting a Super card with 20GB+ of VRAM on launch day.
upbeatchief@reddit
I highly doubt that a 5070 Ti Super is coming. Their only real way of improving the card without outright replacing the 5080 in performance is with 24GB of VRAM, and that would also make it too competitive in AI workloads.
A $1300 (actual street price) 5080 with 24GB. Yeah, I think that will be their offering.
Vb_33@reddit
5070 Ti Super is confirmed. It's the same exact chip as the 5080 Super, just with defective sections disabled.
ThankGodImBipolar@reddit
I feel like your comment is written as if I own or have ever spent mine/somebody else’s money on the 5000 series of GPUs, and I just want to be clear that that is not the case; I own a 6600XT. I also didn’t spend money on the 2000 series or 4000 series where this happened as well, and the “take” in my comment was based on the reaction that I saw when Nvidia pulled the same move on non-Super purchasers of those series. The complaining was loudest during the 2000 series, it was less for the 4000 series, and nobody had commented on it under this thread when I posted it, so I thought there was an interesting discussion to have.
I think the important distinction here is that the “largest criticism” with these products was a choice that was made by Nvidia that made their products less useful/valuable for the people who bought them. Let’s not pretend that Nvidia didn’t know that people would be unhappy with a 12GB 5070; people were unhappy with a 10GB 3080 back in 2020. I don’t believe that Nvidia fixing a manufactured problem is a cause for praise (quite the opposite actually).
This is also not really what it’s about. Being a part of the bleeding edge means risking a potentially degraded software experience compared to last gen. Nvidia has been real good about that lately (which may be related to the strength of demand at launch), but you sign up to be a beta tester when you buy hardware based on brand new architectures, and everyone who bought a 5000 series card without getting that experience previously learnt that lesson the hard way.
Curious whether your take is actually thought out better than mine or not
IgnorantGenius@reddit
Get ready to pay $4000.
chipsnapper@reddit
I already know it’s not gonna happen, but if they’d move 5070 Super off of 12V-2x6 it’d be a killer card with zero downsides.
joe1134206@reddit
There's always bus width, cuda core count, die size
MrDunkingDeutschman@reddit
12V-2x6 @ 250W has zero downsides.