Nvidia RTX 50-series Blackwell designs and specifications expected to be finalized this month — RTX 5080 rumored to have 400W TGP
Posted by imaginary_num6er@reddit | hardware | View on Reddit | 89 comments
DoctorMckay202@reddit
I've been squeezing the last drops out of my 2080 for the past year, ever since I upgraded to a Samsung Neo G9 screen. The poor thing can't handle the VRAM that games demand at full resolution.
Hopefully the 5080 FE drops at around $1,200 and I can put it to rest.
bubblesort33@reddit
Remember the 420w RTX 4080 rumors? https://www.techpowerup.com/295614/nvidia-rtx-4080-rumored-to-feature-420-w-tdp#:~:text=The%20RTX%204090%20has%20been,power%20requirement%20of%20420%20W.
They did drop it to the correct values like a month before official launch announcement, though. https://www.tomshardware.com/news/40-percent-power-reduction-rtx-4080-4070
LAwLzaWU1A@reddit
I am not that big of a fan of kopite, or "leakers" in general. A lot of them tend to make multiple statements that contradict each other (like claiming several different TDPs) and then when one of them is correct it is easy to go "oh, they knew!". Making multiple claims and then getting one right is not that difficult. MLID is especially notorious for this.
As for kopite and the 40 series, they also said Nvidia would bring back the Titan branding, and the incorrect TDP you mentioned, and that the RTX 4070 would have 10GB of memory, just to mention a few of the things they got wrong. Kopite seems a bit more reliable than several other leakers, but it is important to not take this as fact. There is a quite high likelihood that it is wrong.
MrBirdman18@reddit
AGF, another leaker who is typically very reliable, has dismissed kopite's wattage claims as overstated.
ResponsibleJudge3172@reddit
There is a difference between contradicting yourself, and updating when news updates.
The first 4090 Time Spy leaks used conservative clocks; the final 4090 shipped with higher clocks. So the first leak was a lie, right? No, Nvidia was still dialing in the final marketing specs.
Same with the 3080 20GB: it never launched, but a sample turned up on the second-hand market two years later, showing that it definitely existed.
Another example: leakers reported AD103 with 84 SMs. When Nvidia later got hacked, the leaked files showed the same. But it launched with 80 SMs. Why? A bug affecting 2 TPCs (which hold 4 SMs) was leaked but never really discussed. In Nvidia's eyes it wasn't worth a respin, because it was a negligible 5% of the SMs, and AD103 as-is was already competing with Navi 31. You will find those missing units if you look at a die shot.
Even a month before launch, what will be considered a 5070 or 5070 Ti, or whether a 5070 Ti will even exist, will constantly be changing.
LAwLzaWU1A@reddit
1) Several leakers, including Kopite, have made mistakes that can't simply be excused by "the company changed their plans." Things like clock speed adjustments can change as final specifications are dialed in, but getting something like the memory configuration or core count wrong is a different matter. These are foundational aspects of hardware that are typically set well in advance and aren't subject to last-minute changes. So when leaks get these major details wrong, it’s not just about shifting plans—it's about the leak itself being inaccurate.
2) Even if what you’re saying is true—that leaks are "correct" at the time but companies change their plans—that still means leaks are unreliable. People look at leaks because they want to know what to expect, and if the excuse for getting things wrong is always "they changed their minds," then the leak might as well be speculation. Constantly falling back on "plans changed" makes leaks feel more like guesses. It's awfully convenient for leakers who frequently get things wrong to blame it on Nvidia or whoever supposedly changing their minds.
The point is, even if some details in a leak might be accurate at one point in time, that doesn’t make the information trustworthy for consumers or enthusiasts who rely on it to get a clear idea of what’s coming. The fact that these "updates" are often vague or conflicting adds to the unreliability of the leaks themselves.
BoltTusk@reddit
I’m a fan of RGT that claimed “Alchemist+” and 3x performance uplift from RDNA2 to RDNA3 lol
WHY_DO_I_SHOUT@reddit
Could Alchemist+ be Xe-LPG?
Successful_Winner838@reddit
Yeah, but the logistics of it make sense this time. While it's a newer version, they are still on the same node and looking to increase core count by 33% for the 5090. Short of figuring out another optimization on the level of Maxwell's tile-based rasterization, I don't see how you avoid boosting power by at least another 20-25% at the same tier compared to Ada.
Jeep-Eep@reddit
They're also hedging their bets against Blackwell Next being late, so if they have to stand their ground against RDNA 5 they won't embarrass themselves.
Successful_Winner838@reddit
How do you mean? There'd be no point in releasing a new generation that wasn't faster than the last one.
Strazdas1@reddit
I think what he means is that the 5000 series may have to last longer and will end up competing with RDNA 5 before the 6000 series launches. So they want to increase power to make them more competitive with AMD's next generation too.
Successful_Winner838@reddit
That still makes no sense. They're not boosting power to make it some sort of monster, they're doing it just to make it fast enough to be worth making it a new generation. The 5090 might be 50% faster than the 4090. If they were truly worried about what AMD launches 2 years from now, they would have held out a few months and moved to 3nm. This isn't that at all. This is them saving money by staying on an old node. This is Nvidia saying they aren't worried about AMD at all.
IglooDweller@reddit
They probably also want to avoid bad publicity linked to melting components and might be over-engineering some components.
capybooya@reddit
Definitely increased power. Hopefully not too much, but they're gonna need it to get within the expected minimum 5090 performance increase. I doubt they will have the transistor budget for more cache, so probably memory bandwidth, frequency, power, and probably some AI/RT acceleration unit tweaking as they'll want to brag about new features or at least capabilities.
Successful_Winner838@reddit
Zero chance they go backwards on the cache. I'd even expect more of it per core. Something as big as what's rumored would probably need double the bandwidth of a 4090 if they gimped the cache. I think this will either set a new die-size record, or it's going to be an MCM design.
hackenclaw@reddit
I think GDDR7's bandwidth lets Nvidia halt the growth in cache size for a generation. So I wouldn't be surprised if the 50 series has a similar cache size.
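For a rough sense of why GDDR7 could cover a generation's worth of bandwidth growth on its own, here's a quick sketch. The 512-bit bus is from the rumors discussed in this thread; the 28 Gbps GDDR7 data rate is my own assumption for a first-gen part, not a confirmed spec:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Aggregate memory bandwidth in GB/s: bus width in bits times
    per-pin data rate in Gbps, divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

rtx_4090 = bandwidth_gb_s(384, 21.0)      # 4090: 384-bit GDDR6X @ 21 Gbps
rumored_5090 = bandwidth_gb_s(512, 28.0)  # rumored 512-bit bus, assumed 28 Gbps GDDR7

print(rtx_4090)                 # 1008.0 GB/s
print(rumored_5090)             # 1792.0 GB/s
print(rumored_5090 / rtx_4090)  # ~1.78x
```

Under those assumptions, raw bandwidth grows ~78%, which is why the cache wouldn't need to grow to keep a bigger die fed.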
GenZia@reddit
At the risk of sounding like a broken record, the shift from 4N (custom N5P) to 4NP (custom N4P) is a baby step compared to the massive leap from Samsung 8nm (Ampere) to 4N (Ada).
I would've dismissed these claims myself, had Blackwell been on N3.
For all intents and purposes, Blackwell is going to be a "tick" for Nvidia, as opposed to a "tock," to borrow Intel's nomenclature.
NewRedditIsVeryUgly@reddit
Tick-Tock means a release every year. The 4000 series was 2 years ago. This is more like Tick-No real competition-Tock. They just have no reason to pay for the most expensive node, no one is going to beat them.
XYHopGuy@reddit
has nvidia ever paid for the most expensive node?
ArseBurner@reddit
A100?
XYHopGuy@reddit
7nm in 2020? Nope
ArseBurner@reddit
IIRC tick tock was about separating major architecture changes from die shrinks, and the architecture changes were tocks, while die shrinks were ticks.
Super series is so minor it wouldn't even count as a tick. It'll be more like 4770k to 4790k.
bctoy@reddit
A tock is an arch change while a tick is a node change, so the other way round, since Blackwell was rumored to be the first major shader change since Maxwell/Pascal.
YNWA_1213@reddit
At the end of the day, all that matters to the gaming crowd is perf/$ (to a large extent). If 4090 performance all of a sudden was now a flat $1k, it’d be a major leap forward for a majority of the market. From the opposite side, look at the hate the 4060 got for being more of an efficiency gain than a performance gain, but if a 5060 gave 4070 perf @ 4070 power for 4060 pricing, it’d represent the first jump forward for that tier in years.
unga_bunga_mage@reddit
Best NVIDIA can do is 5070 for the price of a 4090 with the performance of a 4060. Consumers are going to love it because AI.
Massive_Parsley_5000@reddit
NV doesn't have any pressure to do this though (other than maybe the DOJ, lol...). AMD has seemingly ceded the high-end market completely this go-around, based on the rumors. NV has the x8 and x9 tier cards completely to themselves this gen.
YNWA_1213@reddit
Yeah, I think the major hope for most is AMD is competitive enough so that the 5060 is a cheaper 4070.
Jeep-Eep@reddit
It's also a tick that risks facing potentially one hell of a tock from the opposition if anything goes wrong with their own tock.
Nointies@reddit
Hahahahahahah
AMD gave up on the high end with RDNA 4, and now you're gassing up RDNA 5?
Wild.
Jeep-Eep@reddit
Only one gen; AMD has skipped the high end for a generation before and returned, as with Polaris or RDNA 1.
Nointies@reddit
You do realize that by the time AMD releases RDNA 5, they wouldn't be up against the blackwell 'tick', they would be up against the next 'tock', right?
Jeep-Eep@reddit
They're hedging against the possibility that that plan does not play out; Team green isn't getting complacent here.
Nointies@reddit
What?
Jeep-Eep@reddit
I'm saying they don't want to be caught with their pants down if blackwell next has serious delays.
Nointies@reddit
There's no indication it will so you're just... ???
I have no idea how you got here from your weird gassing up on RDNA 5, an architecture we know nothing about.
Jeep-Eep@reddit
My guess here is that they'd be going with the Blackwell Super-like designs out of the gate so they don't have logistical snarls if things go wrong; I think consumer Blackwell is monolithic, I might be wrong. If next isn't monolithic, that is a significant source of risk of delays right there.
ResponsibleJudge3172@reddit
Last time around, the only reason people assumed the 40 series would be both hot and inefficient is that they are not AMD (don't ask me how that makes sense, but people literally lumped Intel/Nvidia together vs. AMD and made tons of posts about how those two don't care about efficiency and will build barely-faster systems that use double the power).
The claims make sense this time around in that 4NP is not a big leap, and things like 512 bit bus and higher clocks will consume a lot of power.
sittingmongoose@reddit
The 4080 was supposed to be very high wattage. They dropped it close to launch when they realized there was no point. That’s why most of the 4080 models had 4090 coolers on them.
Gold-Foundation1277@reddit
I am planning to build a PC. What should be the safest option of CPU to pair with 5000 series now ?
PostExtreme7699@reddit
Nvidia doesn't need to release any more GPUs; they're going to delay the 5000 release as long as they can.
ofon@reddit
no idea what you're talking about man... a new generation, even with just some new features and modest (~20%) performance gains at most tiers, still generates new sales.
This is likely going to be a pretty big generation as they're gonna jack the power consumption a lot.
Awankartas@reddit
I prepared $2000+ for 5090. Hope it will be enough. I'm scared Nvidia will just do nvidia and they will jack up the prices to $3000
Deckz@reddit
If you're just playing games, you basically have to be insane to spend 2 grand on a GPU. That, or rich; I'd only buy one if I made $150k+ a year and lived in a cheaper area. If I lived in NY or Boston, more like $300k.
GISJonsey@reddit
$150k a year isn't rich.
Strazdas1@reddit
It is very rich almost everywhere except for a few cities.
BarrieTheShagger@reddit
That's nearly top 10% of all Americans as of 2021 statistics, so yes, rich. Using household income instead pushes the bar higher, but it's still top 25%, so by definition quite rich. Not super rich like millionaire/billionaire territory, but 99% of the world is below that, including a large portion of the West.
GISJonsey@reddit
$150k in 2021 dollars is $175k in 2024 dollars.
I guess it's just definitions. To me, the top one person out of every ten isn't rich; it's more like the top one out of every hundred.
Yahoo finance defines upper middle income all the way up to $150,000.
SilasDG@reddit
It's all perspective. People who haven't dealt with the rich, think 150k is rich.
It's like when you're 10 and you think $100 is a lot of money, only to grow up and realize it's nothing in the grand scheme.
$150k is upper middle class for sure, but on the grand scale, the top 0.1% make $2.8 million a year. $150k doesn't scratch the surface of rich.
PainterRude1394@reddit
Making 150k doesn't make you rich. For example, you could have 150k of debt.
Deckz@reddit
Depends on where you live, that kind of salary in the right neighborhood you can dump a huge portion into a 401k and ira. Or if you don't have a family.
Vb_33@reddit
Depends how important gaming is for you as a hobby. Gaming even PC gaming is pretty cheap compared to many hobbies like for example cars. $2000 once every 4 years for a GPU isn't necessarily a big deal for American middle class.
Sad_Animal_134@reddit
I think it's more justified at $2,000 every 6 years. Going from a 3090 Ti to a 5090 doesn't sound justified to me. Going from a 20-series card to a 5090 sounds like an actual upgrade.
But I'm sitting here with a 2070 super because so far I still haven't seen a need to upgrade. Next year has some big titles I'm interested in though so we'll see
wolvAUS@reddit
Fun fact. A lot of AI folks are shifting away from NVIDIA and picking up Macs. Why? Because you can customise them to have an insane amount of VRAM (180 GB). Decent for the cost.
TattooedBrogrammer@reddit
Can you give me more info, I’m in the market for a new AI machine, been waiting to drop 3-4k on a new PC but I’d be just as happy ish to drop it on a Mac Studio
wolvAUS@reddit
Check out r/localllama and search for “Mac”.
sansisness_101@reddit
So maybe 5090 drops october or november?
Strazdas1@reddit
October/November would be what Nvidia usually does.
PainterRude1394@reddit
I'm hoping for it. Maybe a late September announcement? Been waiting for the new Intel CPUs and 5k series; hope both land this year!
thermalblac@reddit
Remember the leakers/rumors preceding the 40xx release claiming it will require a gigantic 4.5 slot cooler, massive power requirements and possible reintroduction of an external power adapter?
Never a shortage of dogshit info
mi7chy@reddit
Very likely already finalized. I've been tracking restocks, and 4080s are getting restocked but 4090s aren't, so very likely the 5090 is being released first, and soon, like next month. I'm running a 4080 Super with a power limit so total system draw is <300W at the wall, and I plan to power-limit the 5090 too.
iwasdropped3@reddit
Sorry if this is stupid but how do you track restocks?
mi7chy@reddit
https://www.nowinstock.net/computers/videocards/nvidia/rtx4090/
iwasdropped3@reddit
thank you! does this site show when something has been stocked or are you inferring by availability?
pain_ashenone@reddit
I was thinking that too. I can't find any FE stock in my country and AIB cards aren't going down in price at all, so there's probably not much stock left, I guess.
GODCRIEDAFTERAMDMSRP@reddit
also noticed that there is almost no 4090 stock anymore
SoTOP@reddit
That would very likely be suboptimal. The efficiency crossover point for the 4090 versus the 4080 is ~180W: above that, the larger 4090 is better; below it, the smaller 4080 pulls ahead. With a 300W limit at the wall you should already be pretty close to that number, and it's likely that the even bigger 5090 will have its optimal efficiency above 200W.
So getting the cheaper 80-class card might be better this time too, though you'd lose the ability to get full 5090 performance should you need it, and would most likely have to wait longer for the 5080. At the same time, I expect pricing will differ noticeably more between the 90 and 80 cards this time.
PineappleMaleficent6@reddit
Wonder how long Nvidia can keep pushing power consumption like this... in 10 years they'll just stop selling GPUs for the home market entirely and you'll have to pay for their GPU servers.
Spirited-Guidance-91@reddit
nvidia is almost certainly giving partners a very high TGP/TDP so that they don't under-design coolers.
A side effect is that the basic cooler will be huge, so as to discourage using consumer cards in parallel for inference/ML use.
Successful_Winner838@reddit
I don't think so. The 5090 is supposed to be 33% bigger and they are still on the same process node. There is nowhere for TDP to go but up by a substantial amount.
asdfzzz2@reddit
There's also only a very slight 2.3% drop in frame rates with an 80% power limit — measurable but not particularly meaningful. Even with a 70% power limit, performance is still just 5% slower than at stock, which drops to 10% slower at 60% and 23% slower at 50%. (c) https://www.tomshardware.com/news/improving-nvidia-rtx-4090-efficiency-through-power-limiting
Nvidia could just release a slightly power-starved card, like they did with the 3090. A hypothetical 33% bigger 5090 at 450W would behave like a 4090 at a 75% power limit: ~3-4% slower core-for-core, but with 33% more cores.
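A back-of-the-envelope version of that estimate, using the 4090 power-limit scaling figures from the Tom's Hardware article quoted above. The 33% core-count increase and 450W cap are rumors, and the linear interpolation between measured points (plus the perfect core-count scaling) is my own simplification:

```python
# Measured RTX 4090 scaling: power-limit fraction -> performance fraction
# (from the Tom's Hardware power-limiting article quoted above)
scaling = {1.00: 1.000, 0.80: 0.977, 0.70: 0.950, 0.60: 0.900, 0.50: 0.770}

cores_ratio = 1.33                   # rumored: 33% more cores than a 4090
per_core_power = 1.0 / cores_ratio   # same 450 W total -> each core gets ~75% power

# Linearly interpolate per-core performance between the 0.70 and 0.80 points.
lo, hi = 0.70, 0.80
t = (per_core_power - lo) / (hi - lo)
per_core_perf = scaling[lo] + t * (scaling[hi] - scaling[lo])

total_perf = per_core_perf * cores_ratio  # assumes perfect scaling with core count
print(f"per-core performance vs stock 4090: {per_core_perf:.3f}")  # ~0.964
print(f"hypothetical 450 W 5090 vs 4090:   {total_perf:.2f}x")     # ~1.28x
```

Which lines up with the comment's estimate: roughly 3-4% slower per core, but still a sizeable uplift overall from the extra cores.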
JensensJohnson@reddit
Oh no we'll get good coolers again! What a travesty! Nvidia so bad!
PainterRude1394@reddit
Couldn't any partner easily release a slim version after the fact and corner that market? I'm not sure how viable this strategy is.
Radulno@reddit
I imagine partners want to stay in Nvidia good graces for obvious reasons
PainterRude1394@reddit
Right, so what I'm suggesting is that it doesn't sound like Nvidia is tricking partners with TDP figures.
MeelyMee@reddit
I believe Nvidia is pretty damn strict about this stuff, and releasing blower cards, single-slot cards, etc. is likely to piss them off to the point of them no longer selling you GPUs.
It's why there's so little innovation. Nvidia has the AIBs locked down, and given that we've seen nothing more innovative than Palit and their small range of passive cards, I think it's safe to say the lawyers have made sure nobody releases anything like this.
haloimplant@reddit
i wonder if there's also an element of under-selling the next gen to avoid killing current gen sales
Buying a 300W 4080 Super feels better if you think the 5080 is going to be a 400W monster and the 5070 is far away.
theholylancer@reddit
That could have the opposite effect.
If those numbers are real, it likely means the 5080 is back on the top die, i.e. the 02 die rather than the 04 die it currently uses.
That would mean a smaller gap between it and the top dog, a return to what used to be normal, more or less.
The gap between the 3080 and the 3090 was something like 10%, and unless you needed the VRAM for work, the 3090 wasn't really a consideration for gaming at the time.
tukatu0@reddit
Honestly, it doesn't really matter. Last quarter, over the summer, they still made more money than in any other quarter that isn't recent or from 2021.
These new audiences aren't the same as pre-COVID. They don't seem to care about price and probably won't care about launches either. It's not like the parent buying on credit for their kid's "homework" machine is going to know enough to care. Plus, the adults probably think they can just sell their current card for a significant amount of money, so they don't care either.
I really have to wonder what credit behavior on GPUs looks like today versus 2017 or 2011. It's probably outright 10 times higher. And I specify GPUs and not PCs because they sure as sh*** aren't buying displays. 1440p 180Hz displays going for $150-180 USD. Just amazing.
haloimplant@reddit
my guilty youtube pleasure lately is Financial Audit (people come on and talk about their horrific finances), and yup people be using credit like crazy
tukatu0@reddit
I don't even want to call them people, in an insulting way; just whales, in relation to this topic anyway. These people make up 1% or less of any market but dominate its activity. Candy Crush makes up something like 0.5% of gaming revenue (a small billion a year), yet somehow I doubt more than 100k people are spending money on it. But eh, I might be wrong; the user numbers should be out there somewhere.
haloimplant@reddit
not gonna lie I'm guilty of this on the spending side. I was dialing the money burn on my gacha game way down but then I had a financial windfall and stopped caring since I can afford it
what's wild is the credit people who are whales for the banks: they get behind, and all their money goes to 10-30%+ interest debts forever
YeshYyyK@reddit
:(
https://www.reddit.com/r/sffpc/comments/12ne6d7/a_comparison_of_gpu_sizevolume_and_tdp/
pain_ashenone@reddit
So any chance this drops by November/December? I need a new GPU by then, and it feels bad to buy a 4080/90 knowing there will be a better one so soon.
GODCRIEDAFTERAMDMSRP@reddit
Does that mean we will see 5xxx soon? Or will it still be around CES 2025?
zenukeify@reddit
The real question is whether they're preparing to drop a Titan. I've always suspected the -90 class was just an excuse to push the top gaming card to Titan pricing. They could totally drop a Titan at $3,999 with 48GB of VRAM and call it the "Titan AI" or some shit.
3ebfan@reddit
I’m so ready to pour out my wallet for a 5090 with Arrow Lake.
It’s time to put this 8700k out to pasture.
randomkidlol@reddit
The designs and specs have probably been finalized as of early this year. What they're ironing out now is how to cut the dies down into SKUs, and designing a reference PCB.