NVIDIA, this is a joke right? - RTX 5080 [review by optimum tech]
Posted by mostrengo@reddit | hardware | View on Reddit | 216 comments
SoggyCerealExpert@reddit
and to think people thought these prices were somewhat good
lol
Far_Success_1896@reddit
I mean it's better than a 4080 Super and it comes in at the same price. People will buy it based on that alone.
If it was anything significant Nvidia would've priced it like the 4080 at launch. They didn't because it's not that and people probably wouldn't buy it because xx80 buyers won't go over $1000.
This will review poorly but will sell well regardless. There have been two cards in the last two gens that reviewed well, and both the 3080 and 4090 could not be had for a good year post-release.
This is what you're getting and there isn't much you can do besides vote with your wallet. If you're unhappy you can wait until next gen and if you hate things now it will likely be even worse in two years.
thefreshera@reddit
"vote with your wallet" is one of those things that fall on deaf ears. Like never preorder games, things like that. It has been said for decades and it still needs to be said, so what does that tell you about consumers?
varzaguy@reddit
Ok, so I have a 3080 and an Oculus Quest 3 that I play iRacing in and a super ultrawide monitor. The 3080 can barely keep up.
So what does "vote with my wallet" mean here? I need a new graphics card.
I think you guys forget there is a sizeable difference between a 3000 and a 5000 card. Not everyone has a 4000 series card.
Material-Question-88@reddit
I'm trying to sim race in VR with a 3070! It's now becoming a sick inducing experience, so I've just ordered a 5090 at below MSRP. I just missed the delivery today 😅
Suitable_Spell_9130@reddit
As someone who also has a 3080 my choice is simple. I'm going to buy a 5090. 2.5x my fps is a no-brainer and I have the money so what's stopping me.
Devastate89@reddit
Being an educated consumer would stop you. I mean if you don't mind throwing your money into a fire pit, that's on you. I'd wait a year or two personally.
Material-Question-88@reddit
How is it throwing his money into a firepit when he's getting a good product?
Suitable_Spell_9130@reddit
That's your amazing advice? Wait two years? Fucking Reddit man.
Yeah I'll wait two years and continue with a card that's already no longer fit for purpose just so the 4 trillion dollar company will lower the prices by 10%, don't make me laugh.
Objective_Sentence86@reddit
I know this thread is old, but it’s the same comments where people say price vs performance. No, I need the performance. Yes there is a price ceiling, however, I’m not buying a GPU because of the price vs performance alone. It still has to PERFORM in my games with an acceptable frame rate. I could say the 3080 was a good price to performance but if it won’t run the game I’m trying to play, there is no sense in buying it.
I agree, fucking Reddit lol
cincyeaglefan@reddit
Preach, brother.
sgtsavage4@reddit
Yeah, I'm on a 2080, so the difference to a 5080 is going to be huge too
FatPanda89@reddit
The AMD 7900 and the upcoming 9070 are also options that should prove a significant upgrade, and they have historically been better value on VRAM and raw rasterization performance.
f1rstx@reddit
They're not an option for anyone who isn't playing CoD or enjoying more than 4K/30fps since FSR is unusable, or who does any work on PC, or plays VR, or streams, or uses HDR… AMD cards can't offer anything good
RomanBMW335i@reddit
You are lost in your fantasy world.
f1rstx@reddit
sure
kuddlesworth9419@reddit
XeSS exists and looks pretty good at the Balanced setting and above.
chlamydia1@reddit
How many games support XeSS? I've seen it in maybe like one or two games.
FatPanda89@reddit
They are a perfectly fine option - it's not like their cards CAN'T do all the things you mention; in fact they do them all the same, or with so little difference it's only noticeable to very enthusiastic users. For the average gamer running 1440p, the majority of games will perform more or less the same, for less money. Saying AMD can't offer anything good is a gross oversimplification and simply untrue.
varzaguy@reddit
They were targeting the 5070 ti and the 5070 though?
Super-Handle7395@reddit
I put my 3080 in a spare rig 2 years ago and purchased the 4090. The 3080 just needs to hit 60FPS which normally it can do with DLSS.
ExtremeFreedom@reddit
Not buying a card and dropping quality.
varzaguy@reddit
Already running lows on iRacing. Can’t drop it anymore.
What a ridiculous sentiment.
I bet you probably play at 1080p or have a nice card that meets your needs already.
Really easy to talk shit if you don’t have to give anything up.
ExtremeFreedom@reddit
I only play LoL and CS, I can use a toaster. Games aren't worth the money being charged for this shit, might as well just buy a beater car and drive it around a real track, or get a go-kart and do the same. It's fucking ridiculous. If that's your thing then that's fine, but I'm just telling you not buying something is an option. Or try to get a 4090 someone is selling for maybe less than a 3080 as the prices on that potentially collapse. Could also hope for an AI bubble burst to flood the market.
tukatu0@reddit
Unfortunately 4090 pricing probably isn't going anywhere. If redditors are really right about muh AI professionals using them, then pros aren't necessarily going to switch over to the 570-watt card, since electricity is something that concerns those types.
varzaguy@reddit
I play iRacing because track days are too expensive. I sporadically do them because of the cost, but I wish I could do more.
$1k just in tires and brakes. Not to mention track fees and insurance.
Buying a $1k card for the next two years is the cheaper option in that regard lol.
tukatu0@reddit
Nothing like getting aggressively offended over something that does not matter.
And this is why we now have to pay $1000 just to get 60% of the fps of the top card. Smh. Because of people like you.
Anyways. Don't bother wasting your life arguing over the internet about something you are going to do anyways.
varzaguy@reddit
You sound mad that I’m in the market for a graphics card. Bizarre thing.
You want me to buy a 4080 instead?
LickIt69696969696969@reddit
Only works when people are not stupid
jaaval@reddit
I often vote with my wallet, but if I need a GPU this year my wallet goes to whatever is the best option for me. Even if it is a very disappointing product.
csgothrowaway@reddit
I mean...I think that's part of the problem, but the other part of the problem is ridiculous demand for a subpar product.
I'm still on a 1070. I needed to get a new platter drive for my NAS but I also decided to finally upgrade and get a 5080. So I went to my local Micro Center about 20 minutes before opening, thinking, well if its available, cool I'll just snag it. If its not, no biggy I still need to get a new HDD for my NAS.
I get to Micro Center and there's a line going around the entire store for these GPUs. The demand is just fucking insane. I talked to people in the line and apparently people have been camping out since Monday. It's wild to me that there is an overlap of SO MANY people that both have $2k to blow on a 5090 and nowhere to be from Monday-Thursday.
I don't know, maybe there's just a lot more rich kids I didn't know about that don't have a job to be at, or maybe people are just that irresponsible and putting it on credit. Either way, the demand for this subpar product is unbelievable. Turns out there were only five 5090s in stock for launch day and the rest were 5080s... and after we learned that information people STILL sat in line waiting for the 5080s, which, from what I hear, are hardly better than a 4090... which they could have just bought any number of months prior to today. So, they literally sat in line for 4 days, spent over $1000 for... DLSS 4? I really don't understand you people.
Spoonfeed_Me@reddit
Exactly. However, I don't think that people in your situation are the target audience for critical reviews like this. I've been watching all the YT reviewers, and they all mention the same thing: the 5080 is not a "bad" GPU, just not the improvement gamers wanted or expected (with a bit of shadiness from NVDA marketing). Who benefits from this info? The hobbyists who upgrade every generation because they really value the low double-digit uplift they expect every time.
So if your desire for a gpu is out of "necessity" to play a specific game at a specific resolution/quality because your old rig doesn't cut it, it's still on par with the previous generation and therefore worth getting, but if you're the kind of person who likes to upgrade each generation for the performance gain, there's nothing really here for those people on the hardware side.
Strazdas1@reddit
That's because there are no repercussions for not doing it. Heck, we aren't even allowed to shame people for preordering.
obp5599@reddit
Voting with your wallet does work. Redditors just think because they don't like something everyone doesn't.
This card is still good value if upgrading from an older gen. It's the same price as a 4080 Super with slightly better perf. Anyone looking to build a new high-end system in the next 2 years will look at this unless they already have a 4000 series.
Mean-Professiontruth@reddit
If you only get your GPU opinions from redditors you would think AMD dominates the market lmao
9897969594938281@reddit
How the fuck is this downvoted? Man’s telling the truth. Redditors can vote with their slim wallets and everyone else will snap these up.
AggravatingChest7838@reddit
5070/ti seems specifically targeted at all the people still rocking 10 series cards.
Mean-Professiontruth@reddit
Consumers do not want to buy a card with outdated tech and drivers that ban you from multiplayer games?
AShamAndALie@reddit
I'm buying a 5080 for $1550 this month. Between higher prices and tariffs, that $1000 limit you are talking about is a joke outside the US.
WalkureTrap@reddit
Whilst the price being the same might be true for the US, where I live (Australia) the 5080 is priced starting from 2,019 AUD MSRP, whilst I managed to grab a 4080S for 1,649 AUD, which feels like a good deal now.
The marginal improvement doesn’t warrant the ~20% difference in price for me.
BigGirthyBob@reddit
Yeah, I'm in NZ and regional NVIDIA pricing has gotten so bad here, I was able to buy 3 7900 XTXs for a little more than I would have paid for one (admittedly overpriced STRIX) 4090.
I won't be buying a 5090, but I am morbidly curious to see just how bad the pricing is here lol.
warspite2@reddit
Yeah, and those 7900 XTXs are more than enough for most all gamers. Can't see why anyone (or any game) would need more when the difference in performance at that level is hardly noticeable.
Brunohanham45@reddit
$2599-$2800 lol
BigGirthyBob@reddit
Yikes! That's $200-400 more than I paid for my XTX Aqua (and at 555W, that keeps up with a stock 4090 in raster).
Brunohanham45@reddit
$6000 for 4090 maybe
Brunohanham45@reddit
*5090
Anim8a@reddit
I went and checked mine, but it was for a 4070 which I got @ A$829 with a 7950X3D @ A$799 (May 2023, date of purchase).
Looks like the AUD has dropped a lot vs the USD since. Causing the current gen parts to be more costly than previous gen did in terms of cost per frame.
5 Year chart AUD vs USD:
https://u.cubeupload.com/Anim8/AUDvsUSD.png
WalkureTrap@reddit
Yeah, I agree FX plays a role in the inflated price for Aussie consumers, but it doesn't change the simple fact that the 5080 is not the same price as the 4080S and is likely not worth the extra cost for what it gives compared to the 4080S (at least that's what I think).
anapoe@reddit
I've been voting with my wallet for five years, which is how I've ended up with a 1660 (aside from an AMD 7600 recently for my HTPC). At some point you have to break down and upgrade.
plantsandramen@reddit
I have a 6900xt and am probably going to get a 7900xtx. 4k raster performance is great. Just no great RT or FSR.
f1rstx@reddit
If you play new games the 7900 XTX is not gonna cut it; it can't run 4K/60 natively in 9 out of 10 AAA games
plantsandramen@reddit
I don't really play AAA. My 6900xt does BG3 and Metaphor Refantazio at 4k/60 well enough for me
f1rstx@reddit
so whats the point upgrading then ;)
plantsandramen@reddit
The 7900xtx averages 20-30fps higher at 4k than my 6900xt. It's more about getting something with better 4k longevity before tariffs are enacted.
mechkbfan@reddit
Yeah, I went over the top and got the 7900 XTX Nitro. Seeing that AMD likely won't release a card that outperforms it for a while means I'll be holding on for a long time, it seems.
I run Linux, so CBF dealing with NVidia's shitty drivers
plantsandramen@reddit
Yeah I'm looking at the Sapphire Pulse myself.
Tbh though, if the 4080 Super was priced reasonably now, I'd consider that. The prices are ridiculous though.
moonknight_nexus@reddit
But still massively overpriced compared to the rest of the 80 class of previous generations. And this card is not even a true 5080, but a 5070ti
Far_Success_1896@reddit
gen on gen performance is something that you can expect but you're not entitled to. there are other things at play that created this situation. AI for one and gamers not tolerating a $1200 xx80 class gpu.
so they anchored on the $1000 price tag from the 4080 super and made the best card they could using the same node for that price. if you wanted them to make it much better you would be looking at 4090 prices and people would be more pissed.
if you were following all the chatter leading up to CES, everyone thought these cards would be much more expensive. they weren't, but you also weren't getting the performance increase you wanted.
you can't have everything. not in this day and age.
bubblesort33@reddit
I'm curious how many scalpers will go for this. I hope a lot of them, and they all get stuck with their cards.
Jeep-Eep@reddit
I've said before the GPU price war was coming sooner or later, but that nVidia was the one to blink first was in retrospect a major warning sign.
Visible_Witness_884@reddit
I mean... if you've got a much older card, you can get performance from 2 years ago today on a new card...
bubblesort33@reddit
I mean imagine they were as high as people thought, $1350+ or more, and then this was the performance.
mr_valensky@reddit
I have a 3080 10GB that I got pretty early on in their release.. this gen is killing me
Sufficient-Ear7938@reddit
Less than 1% uplift in pure path-tracing - Quake 2 RTX 5080 1440p
https://i.imgur.com/UkKCc92.png
Justaguyindayz@reddit
So is the 5080 worth buying in a new PC? The build I'm going for will be 4.2k to buy and that's the card that comes in it, or should I steer clear?
MemphisBass@reddit
Yes, of course it is.
EconomicsJazzlike@reddit
You play quake 2? I started with quake 1, and it was the best game in history
Framed-Photo@reddit
Another thread pointed out a regression in 3D mark ray tracing workloads, maybe by some miracle there's an issue here and Nvidia can fix it so these cards actually get a performance boost lol.
That's probably wishful thinking.
AttyFireWood@reddit
This has me curious if cards generally perform better in benchmarks after a year of driver updates. Or for something like a synthetic, it is what it is.
WHITESTAFRlCAN@reddit
We actually have seen this in the past with AMD drivers getting better and cards performing better but I don’t think we have for Nvidia
kwirky88@reddit
It’s nvidia saying “go buy an amd or intel, we dare you.”
JakeTappersCat@reddit
Something must be wrong. Maybe there is something going on with the chip design itself that is causing it not to clock as high as it should? It is totally bizarre that the majority of 50 series GPUs clock lower on a new process than the 40 series
Dangerman1337@reddit
That's crazy, Blackwell seemed to be the architecture to really push Path Tracing! This is pathetic.
sharkeymcsharkface@reddit
So my 3070 is still good right?
richardizard@reddit
Haha you and me both. I wanted to upgrade, but I don't see a huge benefit. We have cards that will last us years. The only reason I'd upgrade my 3070 is for 4K gaming. I kinda kicked myself for not getting a 3080, but scalping prices were abusive during the pandemic. I'm very happy with 1440p ultrawide though, and tbh these uber expensive cards are not a necessity.
PureSquash@reddit
I have a 4060ti and don’t know if that’s worth upgrading? I want to upgrade either my GPU, my CPU, or my monitor but I’m not sure where I’ll see the most benefit :/. I wish I were more knowledgeable.
richardizard@reddit
What cpu and monitor do you have right now?
If I were to upgrade your 4060ti, I would go AMD with the 9070xt if you can find one at MSRP. Avoid buying from scalpers and wait for the restock if they have run out already.
PureSquash@reddit
I currently have an i7 cpu and some mediocre basic 1440p monitor. My goal is to play newer games at ultra-high settings with a capped frame rate of 120 and for the picture to LOOK nice (I watch a lot of movies on my pc too).
Schlapatzjenc@reddit
With a 3070 you would see a big jump in performance switching to 5080, and not just in RT (though obviously that is a good selling point).
It's still a bad value proposition, but you'd see it.
richardizard@reddit
Yeah, for sure. I just don't see the point of spending so much atm for so little value when what I have is working for me. Perhaps when I get more into VR and 4K gaming, I'll consider upgrading, I just hope there will be something of better value when that happens. If the 4090 is at a much better price point in a couple of years, for example.
AffectionateFinish22@reddit
I just bought it. Coming from a 3090 it's a massive upgrade for me on native settings, even without DLSS and frame gen
EconomicsJazzlike@reddit
I have been using Nvidia cards for years now. I recently bought an AMD card and am more than happy. Granted, the idea of buying an AMD card provoked some anxiety (fear of the unknown), so I took a few days and researched everything AMD. There are so many YouTube videos explaining everything in depth; they made me feel more confident. AMD's software is much more polished, and the card (7900 GRE) runs everything great. Nvidia takes gamers for idiots. We should be growing tired of their lies, paper launches and the hype. It's time to put a stop to this.
SprinklesConscious41@reddit
I agree. I've been using Nvidia for 9 years now and been waiting for rtx 5080. Hearing that it is melting, I will definitely buy RX 9070.
tonma@reddit
Man, the 5070 is going to suck so bad
Very_Curious_Cat@reddit
I'd like to replace my 2070 because I just switched from 1080p monitor to a 1440p one (sales). If the 5070 is a bit better than 4070 super and not pricier, I'll still be interested BUT I'll wait for the RX9700 before any decision.
Engineer_DS@reddit
It's a "5070" for a 4080 price
mostrengo@reddit (OP)
Between the insane demand from AI and the total lack of competition from AMD (even on price/value) I'm not sure what other outcome would have been possible.
rabouilethefirst@reddit
Cheaper GDDR6X cards with slightly less performance should have been an option. 24GB VRAM GDDR6x 5080.
Everyone knows they’d prefer that
Strazdas1@reddit
I would not prefer that. I want GDDR7 in the next card I'm buying. And I'll wait for the 3GB chips. Supers should come with that.
rabouilethefirst@reddit
So you want GDDR7 for the 2% performance gain or because it “might” get 3GB module cards in the future that could have been switched over at a later time?
Strazdas1@reddit
GDDR7 offers 80% increase in bandwidth. Depending on your usecase, that may be a lot more than 2% performance gains.
rabouilethefirst@reddit
16K gaming as shown in benchmarks. Not a relevant thing today
Jeep-Eep@reddit
Blackwell but with 3 gig modules would have a chance of holding on for those vaunted features to have some chance of proliferating.
Far_Success_1896@reddit
No they wouldn't. They tried a $1200 xx80 card and it didn't sell well. If you have vram anxiety you can just go get a used 4090.
But guess what. That's not going to be anywhere close to $1200 either.
rabouilethefirst@reddit
If that $1200 card had 24GB VRAM and the performance of a 4090, it would sell well. People would still have options on the lower end. What would happen in reality is nobody would want the 5090.
Far_Success_1896@reddit
Why would they sell a 24gb vram card at $1200?
Gamers in general don't actually need that much in vram. That much vram is much more in demand for AI and pricing it at $1200 would ensure that gamers would see it as often as they did the 3080 cards at launch.
I also want a 24gb vram card for $500. It's not happening.
dampflokfreund@reddit
Because some people plan to keep their cards for a long time. The PS6 is going to have 32 GB unified memory and might release as soon as 2027. When it releases, 16 GB cards are dead in the water, at least for max settings. Besides, 16 GB is already on the edge in recent titles with path tracing. Just imagine the struggle in 2 years.
Far_Success_1896@reddit
Why are you entitled to keep a card for a long time and have it run like it did 5 years ago?
You realize the card that started off this gen was the 30 series and if you want an analog the 50 series would be the equivalent to the 20 series.
When the PS6 comes out you can probably expect 2080ti performance out of the 5080. Which is fine because that card is 6 years old.
Jeep-Eep@reddit
I managed that with a fucking Polaris 30, if it's asking so many times over that asking price it had better hold on for at least 4 fucking years.
Far_Success_1896@reddit
I mean a 2080ti is hanging on but just barely. You're not running cyberpunk ultra rt at 4k and getting 120 fps.
No_Sheepherder_1855@reddit
Honestly don't even see the point of GDDR7. Doesn't really seem to do much for performance.
ethanethereal@reddit
The memory allows the 5090 to be 40-50% faster at 4K native Wukong, 75% faster at 8k native GTA5, and 100% faster at 16K native GTA5. Problem is that nobody is going to run 4k natively now with the new transformer model Quality (1440p Native) looking better than native and Balanced (1200p)/Performance (1080p) looking comparable to native. Oh, and nobody's playing native 8K/16K period. Solutions in search of problems.
therewillbelateness@reddit
1200p, isn't that 16:10?
Drando_HS@reddit
Honest question here - who has to buy a high-refresh rate 4k gaming monitor but doesn't have the budget for a xx90-tier card?
1440p - hell even 1080p - is perfectly suitable and acceptable for gaming. 4k is diminishing returns and total overkill.
ARabbidCow@reddit
Different situation for me in the case of a racing sim. I use 3x 1440p 165Hz screens as the main display and then a 4th mounted above for telemetry, voice chat, etc. With racing I find it disorienting and distracting when frames drop below 90, and stutters usually result in crashes on track. My 3080 does the job in iRacing with only a few compromises, but in ACC I have the majority of my settings on low to try and maintain at or above 90fps. AC Evo, which just hit EA the week before last, will certainly test my 3080.
I could drop everything down to 1080p but making out finer details for brake markers or car numbers can already get difficult at 1440 mid race. Having more clarity or even maintaining more frames more often will benefit me massively.
WJMazepas@reddit
5090 costs 2k. It damn well should be good at playing even at 8k at this point.
And people can run with DLAA at 4k to get the best image possible
panix199@reddit
Minimum 2k... a friend of mine in the EU told me that 2-year-old used 4090s are getting sold at $1800 in his country... and a new one at 2.6k...
I assume with the difficult availability of the 5090, these would probably have a price of $3k (in some countries)
ARabbidCow@reddit
In Australia we're looking at between AUD$4500 to $5700 for a 5090, USD equivalent ~$2800-3500. As usual I'm expecting short supply driving prices up further anyway.
tukatu0@reddit
It's funny because DLSS transformer is literally just less aliasing than DLSS CNN, looking at the TechPowerUp comparison. Doesn't mean much when native is forced TAA
No_Sheepherder_1855@reddit
I was actually interested in the 8k/16k benchmarks since you need to run games at those kinds of resolutions for new VR displays and with the UEVR injector you can play most unreal games this way now too. From what I saw some games do perform better but most were about the same performance difference you saw at 4k.
Morningst4r@reddit
Which is funny because everyone was screaming that the 4000 series was hamstrung with small memory buses and must be bandwidth limited.
BFBooger@reddit
Not possible. There aren't 3GB GDDR6X chips.
GDDR7 does have 3GB chips, so a 24GB model is possible.
Otherwise, to get 24GB with GDDR6X would require a larger memory bus, which significantly increases the die size and the board traces / cost, so such a thing would be _more_ expensive, not less.
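A rough sketch of that bus-width arithmetic (illustrative Python, assuming the usual layout of one GDDR chip per 32-bit channel; the function is just for the example):

    def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
        """Total VRAM = number of 32-bit channels x capacity per memory chip."""
        channels = bus_width_bits // 32  # one GDDR package per 32-bit channel
        return channels * chip_gb

    print(vram_gb(256, 2))  # 16 GB -> the shipping 5080 (256-bit bus, 2GB GDDR7 chips)
    print(vram_gb(384, 2))  # 24 GB -> needs a 384-bit bus with 2GB chips (bigger die and board)
    print(vram_gb(256, 3))  # 24 GB -> doable on 256-bit only with 3GB chips (GDDR7, not GDDR6X)
    print(vram_gb(320, 2))  # 20 GB -> the 320-bit variant described below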
A more realistic change would be a 5080 with 20GB RAM, a 320bit bus, and 25% more cores. This would however use more power, and be perhaps 15% faster than the 5080, and yet cost 25% more to make, so probably a $1200 card if NVidia wants to keep similar margins, for just 15% more performance.
So yeah, if the performance is your annoyance, there isn't much to be done until there is a new manufacturing node used, but N3 and N2 will not be the same gains as historical node bumps.
If the 16GB RAM is what annoys you, then there will be a 24GB model at some point, using 3GB GDDR7, but most people guess that to be the 5080S at the same MSRP one year out.
Honestly though, 16GB is not an issue. 12GB is not an issue in ANY game today, as long as you are willing to accept turning down a couple settings from the max (which traditionally is acceptable on 70 series products; in the 'old days' the 80 series would run ultra, the 70 series high, and 60 medium/high mix; today everyone expects ultra for the whole stack, for some reason).
16GB is not a gaming bottleneck at this tier, except for those who do things like run specially modded games. AI hobbyist things -- sure, 16GB is a big issue.
rabouilethefirst@reddit
Still though, the 4090 is right between the 5080 and 5090 in die size and performance. I can’t help but think people would have rather had that card at a reduced rate over the 5080. Even if it was still $1499
Morningst4r@reddit
I think you underestimate how people would have reacted to a $1500 5080. Expensive products make people mad no matter how fast they are.
BrkoenEngilsh@reddit
It's not really between, it's like 67% bigger than a 5080. The 5090 is 20% bigger than a 4090. We also don't know how much nvidia is charging for a 5090, but given the AIB news about it being a "charity" we can probably assume they are making even more on the 5090. So nvidia really doesn't have a reason to keep producing the 4090.
Far_Success_1896@reddit
Why? The market spoke. No one's buying an xx80 card over $1000. People much prefer this.
BFBooger@reddit
Do we know if they are going to stop making the 4090?
It will probably remain available used for < $1400 as many 5090 buyers will be old 4090 owners who sell their old equipment. So I guess it fills that gap.
Price aside, its out of the power consumption range I'm willing to accept.
The one thing good about the 5080 is its power efficiency, quite a bit better than the 4090, 5090, and somewhat better than the 4080 line.
rabouilethefirst@reddit
We know they already stopped production. I just think it may have been a bad idea since the 5080 fails to fill the gap, and the 5090 is just out of reach and out of stock for many.
The efficiency can't be THAT much better. I've already seen people say it can use about 400w, and it's doing that while still slower than a 4090. My 4090 rarely goes above 400w
Fullyverified@reddit
12GB is an issue in many games today. Nice paragraph.
brentsg@reddit
And the lack of reasonably priced new process nodes for manufacturing. In the past, we've always gotten the big gains from combining architecture and process node advancements.
MrMPFR@reddit
Yes, the problem is nodes getting too expensive. It's extremely likely that GB203 dies cost more to produce rn than the almost 2x larger TU102 dies cost back in 2018. The node price creep and higher TDPs are the cause of the price creep, not NVIDIA being greedy. Pascal, Turing and Ampere were NVIDIA peak milking, not Ada and Blackwell, although they still enjoy quite healthy margins.
Frosty-Cell@reddit
It seems they have raised prices far beyond production cost. A GB203 should cost about $250, assuming a 4N wafer costs $20k and yield is around 50%.
auradragon1@reddit
What was your math?
BOM should cost around $300. Add in marketing, admin costs, warranty, 3rd party makers cost, shipping, etc and $750 seems about right.
Frosty-Cell@reddit
Not sure what's unclear. A 50% yield would result in about 80 good GB203 dies per wafer, but I suspect it's higher than that.
Nvidia's GPU profit margin is apparently 40-50%.
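A back-of-envelope sketch of that math (assuming a 300mm wafer, GB203 at roughly 378mm², and the $20k-per-wafer / 50% yield figures above; the dies-per-wafer formula is the usual rough approximation, ignoring scribe lines and defect modelling):

    import math

    def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
        """Classic gross dies-per-wafer approximation."""
        radius = wafer_diameter_mm / 2
        return int(math.pi * radius**2 / die_area_mm2
                   - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

    def cost_per_good_die(wafer_cost: float, die_area_mm2: float, yield_rate: float) -> float:
        good_dies = dies_per_wafer(die_area_mm2) * yield_rate
        return wafer_cost / good_dies

    print(dies_per_wafer(378))                  # ~152 gross dies, ~76 good at 50% yield
    print(cost_per_good_die(20_000, 378, 0.5))  # ~$263 per good die, in line with the ~$250 above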
MrMPFR@reddit
Try calculating the die cost for a 1080 Ti die with an $8,000-per-wafer 16FF node, a 2080 Ti die with a $6,000-per-wafer 12FFN node, and a 3090 die with $5,000-per-wafer TSMC 8N.
Not saying NVIDIA couldn't lower prices, but they clearly won't, and AMD isn't willing to disrupt pricing like with RDNA 2 (the only reason why the 3080 was GA102). If they were, they wouldn't hesitate. The wait smells like "Radeon = NVIDIA minus $50" again.
therewillbelateness@reddit
Wafers used to get cheaper with new nodes? When did that change?
MrMPFR@reddit
No, they've gotten more expensive over time, but very slowly, and PPA benefits used to be excellent with each new node. FinFET is when things started to go wrong. TSMC having a monopoly doesn't help either. I really hope Intel foundry can execute their roadmap and bother TSMC, because Samsung foundry is a joke.
Another problem is the chip design cost. Try googling it. Used to be sub 50 million. now it's approaching almost 1 billion on newest nodes.
therewillbelateness@reddit
For chip design costs are we comparing like for like, for example the latest Intel CPU 20 years ago vs latest now? Damn I didn’t realize new nodes made design more complex.
MrMPFR@reddit
It's probably for a SoC like Apple's M4, but I'm not sure. Yes this is why AMD can't afford to make 4-5 dies when competing against NVIDIA on bleeding edge and keep reusing older tech (Rebrandeon). Doing the same thing as NVIDIA would literally bankrupt AMD.
therewillbelateness@reddit
Wait are you saying every die Nvidia makes in one gen for their GPU lineup is a billion dollars? That’s nuts
MrMPFR@reddit
No not that much and there's also a lot of IP block reuse between dies (copy paste) + licensing of logic from third parties for stuff like memory and PCIE controllers.
This website states this: "$542 million. Between 2006 and 2020, the cost of designing a new chip on the latest manufacturing node has increased by a factor of more than 18. A 65nm chip in 2006 cost about $30 million to design, while a 5nm chip in 2020 cost over $540 million to design."
IDK how much money NVIDIA and AMD actually spends on each architecture but it's only going up rapidly with each new generation.
Adromedae@reddit
There have been plenty of times in the past where the process or the architecture did not provide gains.
Shit's been weird since 45nm.
therewillbelateness@reddit
What was it about 45nm? I thought it was when they moved from planar when it got messy.
Adromedae@reddit
90nm started to get fucky with leakage. 45nm is when cost dynamics started to be less predictable, and cost per transistor started to get wonky.
dstanton@reddit
AMD has only failed to compete at the halo level for a single generation (now two with the 5090). The 7900 XTX was neck and neck with the 4080S in raster.
And it's looking likely that the 9070xt will get within 15% of the 5080
Both cards at lower prices.
Where they have lacked is RT and upscaling tech, which yes remains a generation behind.
Still, if the 9070 XT is 5070 Ti level, with RT at or better than the 7900 XTX, and FSR4, that will absolutely be a competitor for all but the 5090 given Nvidia's absurd pricing and gen-on-gen gains
Adromedae@reddit
But the 4080 was not the Halo product for NVDA.
dstanton@reddit
Never said it was. AMD did not have a 4090 or now a 5090 competitor.
They did have a 3090 competitor
Adromedae@reddit
Got it, sorry I misread your post.
Nointies@reddit
Being a generation+ behind on those technologies is not a small deal.
Jeep-Eep@reddit
They'll be a gen behind until the models outgrow the 5070's cache.
Adromedae@reddit
Where are you getting that the model fits fully in the cache?
Nointies@reddit
They'll be a gen behind forever at this point.
Jeep-Eep@reddit
That is of less concern as a user than 'how long will this cache be able to fit modern RT models and run them?'
dstanton@reddit
When those technologies only apply to a handful of the game catalog I consider it a relatively moot point.
I still prioritize pure raster and vram over anything RT/PT/upscale related.
And the jumps made with the 9070 XT aren't insignificant. 7000-series RT and FSR were definitely subpar comparatively. 9000-series RT and FSR4 will put AMD purely competitive with RTX 4000, which is good when you look at how small of a gen-on-gen jump RTX 5000 is.
BFBooger@reddit
It looks like many future AAA titles will be heavy into RT. BM:W and AW and IJ are just the beginning.
It's 2025 now, it's 5 years since RT was introduced, it's no longer a toy.
Aggressive_Ask89144@reddit
Especially as DLSS4 gets released for all RTX cards. The settings actually look better on DLSS4 Performance than they did on DLSS3 Quality for many things. AMD cards are also bricks for Blender, and lack other fluff as well.
If they're all the same price, I would go with the 5080, but the 7900 XTX is really good with its 24 gigs of VRAM for demanding 4K games (texture-wise instead of RT) and with its common several-hundred-dollar discounts lol
BFBooger@reddit
It will be interesting to see how far AMD has closed the RT gap with RDNA4. It doesn't look like the 5000 series significantly moved the RT / PT performance relative to raster, unlike the prior two gens.
There is a chance to make the gap much smaller, which would make the AMD value proposition much better going forward.
Then, they just need FSR 4 to be good. Even if it is CNN based, that would demonstrate that in the future a transformer based one would be doable and on the way; the hardware on the GPU side to run both is essentially the same.
Close both the upscaling and RT gap significantly, and that would be huge. They don't need to match NVidia on these, but they do need to be a lot closer than they are now.
MrSauna@reddit
Customer or market demand for AI? I bought a 7900 XTX a year back, especially for AI/ML stuff, because the Nvidia offering wasn't competitively priced; I would have had to double my investment if I went with Nvidia, from 1k€ to 2.2k€. I was choked on VRAM, as most AI stuff you run at home is. For the same price, AMD was and is at least an order of magnitude faster than Nvidia would have been for AI.
For gaming and rasterization, I think it was clear winner in price/performance also, as the competition was 4080 and 4090.
Healthy_BrAd6254@reddit
what kind of AI applications do you use?
MrSauna@reddit
VRAM hungriest: running diffusion models or any open LLMs. Then also training my own models with PyTorch w/ mini-batching. Nvidia would probably win on PyTorch in some metrics, but then again if I dev anything I'm on Linux, and on Linux Nvidia is a pain, so AMD again ends up winning the race for me.
Hekel1989@reddit
Mind if I ask how do you do this on an AMD gpu? I've tried, and I've found both SD and local LLMs to be painfully slow on AMD, but, I might have been going about it the wrong way
Healthy_BrAd6254@reddit
Looks like it's slower than a 3090 still.
Still better than I remembered. AMD cards were trash for ML just a couple years ago and didn't even work unless you went to extreme lengths.
MrSauna@reddit
RDNA is missing quite a few highly parallelized uops for matrix/tensor stuff. On microarchitecture scope and on paper it should be even slower. However, I usually find that the larger the model, the more there is all kinds of overhead from suboptimal libraries, which gives AMD a chance to catch up on performance a bit. As long as it fits in VRAM it has been good enough for me. To reiterate, if Nvidia offerings have faster pipelines but fail to fit whatever in RAM, it becomes so much slower that a 3x perf difference doesn't matter comparatively.
Personally, I just wanted to run as much as possible stuff locally so it boils down to amount of vram.
RedTuesdayMusic@reddit
NGL, 5080 reviews have made me look at 7900XTX more than before. Even though I'm technically content with my 6950XT, I feel like since I got a good 3 years out of that card that I could get another 5 out of a 79XTX. Especially since modern AAA games are 95% garbage and the only thing on the horizon I'm interested in is Kingdom Come 2.
BFBooger@reddit
I don't think that is the fundamental problem at all. Competition from AMD could lower the price a bit, but it's not going to make NVidia's products faster gen on gen. Not when the same manufacturing node is used for the 4000 and 5000 series. Even if only competing against themselves, NVidia is incentivized to produce improvements to value gen over gen if possible, as that would lead to more sales and upgrades -- but only if they can do so without significantly increasing cost.
Apparently, they were unable to improve performance much at similar cost. The 4080S and 5080 are similar cost to manufacture. Of course they could have made a 30% larger die with more cores and a larger memory bus and maybe gotten another 15% performance (given the 5090 2x size but 50% faster scaling), but that would be perhaps 25% more cost and only 15% more performance, still a dud.
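A quick perf-per-dollar sketch of why that hypothetical bigger card would still be a dud (numbers taken from the percentages above; the $1,250 price is just $1,000 scaled by the assumed 25% cost increase, ignoring margin details):

    base_price, base_perf = 1000, 1.00  # 5080 at $1,000, performance normalized to 1.0
    big_price, big_perf = 1250, 1.15    # hypothetical larger die: +25% cost, +15% performance

    ratio = (big_perf / big_price) / (base_perf / base_price)
    print(f"{ratio:.0%}")  # 92% -> about 8% less performance per dollar than the 5080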
Its simply not going to be a big performance jump when you use the same manufacturing process.
The 3000 series had a big jump because they went from TSMC N16 to Samsung 8nm. The 4000 series had a huge jump because it went from Samsung 8nm to TSMC "4N" (TSMC N5, slightly improved).
The last time this happened was the 1000 to 2000 series, when there was some similar low performance improvement. The 2060 was similar to a 1070, but with less RAM. The 2000 series used the same TSMC N16 that the 1000 series did, but increased the die space significantly, mostly for the Tensor and RT cores, but a bit for raw core count. So there was a bit more improvement there. The flagship 2080Ti looked good but was a huge die size increase over the 1080Ti. This was do-able without a huge price increase because TSMC 16N had dropped in price dramatically compared to the 1000 series launch, so it was a 'cheap' node.
The 3000 series was also on a 'cheap' node, so its die sizes could be somewhat large at a given price point, compared to what would have been if NVidia used TSMC N7 instead. (at lower clocks and higher power, but probably similar performance / $ for TSMC N7, which was in super high demand at the time).
rebelSun25@reddit
Going by the charts on hardware unboxed review, 7900xtx is plenty good if found for msrp and as long as one doesn't want ray tracing
RedTuesdayMusic@reddit
Shut the hell up Jensen
nanonan@reddit
Nothing about the cards, but this has gone too far. The 5080 and 4080 Super are on the same node. "5N" does not exist. This whole "4N is really 5nm" is all bullshit. It's a 4nm node, a refinement of a 5nm process sure but that is irrelevant.
Sardaukar_Hades@reddit
Worst generation since Fermi....
Darkomax@reddit
Fermi, for all its faults, was vastly faster than its predecessor...
Sardaukar_Hades@reddit
What are we comparing against, the 480 vs. the 580?
Darkomax@reddit
Both are Fermi. The 500 series was a refresh/optimization of the 400 series. The 480 was 40-50% faster than the 285.
Sardaukar_Hades@reddit
Apologies, it has been a while. What I meant to say was the 500 series. I had such disdain in my mind for that generation that it is imprinted in my brain.
Klutzy-Residen@reddit
That makes me realise how little this can impact first time buyers.
My first GPU was a GTX 570 and I was happy, just bought whatever was the best deal in the price class.
Didn't care about any drama regarding improvements vs last gen as I do now.
PJ796@reddit
285 vs 480
MajorTankz@reddit
Yes, the GTX 480 was a beast despite how hot it was.
aLazyUsrname@reddit
And put out enough heat to cook your breakfast on
Darkomax@reddit
The power consumption was unheard of back then. But the coolers of the time had trouble taming that kind of power, which gave us the Fermi jokes. And now, even mid-tier GPUs consume more than a 480.
James_Jack_Hoffmann@reddit
Did Jensen really reveal a fake card that time? After looking it up again just now, it seemed that it was just a wind-up by none other than Charlie from SemiAccurate.
SMURGwastaken@reddit
You take that back, I loved my 480.
Raiden_Of_The_Sky@reddit
You're judging it for having no uplift while forgetting how good RTX 4000 was, and still is, in terms of architecture. There's always a limit for uplifts.
Sardaukar_Hades@reddit
Look at Paul's Hardware graph of previous generations. To be honest, I didn't expect much from this gen as they were still on the same node. Regardless, in more cases than not, the new 80 series normally overtook the 80 Ti / 90 series from the previous gen.
dopadelic@reddit
50 series is on the same process node as the 40 series so it's not surprising that there isn't much improvement in raw power. The benefits are in the 4x frame gen.
Exist50@reddit
We need another Maxwell.
chaosthebomb@reddit
Why are you being downvoted? Maxwell was a huge change in architecture that brought a massive performance increase to smaller core count GPUs on the same damn node. This is exactly what I was hoping for this gen, and I was unfortunately let the F down.
Archimedley@reddit
I'm not sure if maxwell was really that special so much as kepler was just that meh
like, kepler took a lot of work to get good performance out of because it was made for datacenters first and not gaming workloads
which, funnily enough, blackwell seems to be kepler like in that regard
it might just be on devs to get more out of blackwell than what we're getting right now, which isn't great, but maybe we'll see more of a difference in performance in 2026 or something
hopefully the next gen will be a maxwell-like + die shrink increase in performance
but yeah, node is like the bottleneck of an architecture, it's very rare that there's an architecture bad enough to leave room on the table the way kepler did
Nerina23@reddit
Fake Frames are not really a feature anyone should be proud of.
Capable-Silver-7436@reddit
I wonder if they'll put out a 5000 Super series too
jonydevidson@reddit
Gaming benchmarks are still mainly testing for native rendering, when gamers, especially with NVIDIA cards, are almost always using DLSS. Those who can push beyond 60fps and have high refresh rate monitors are also using framegen on games that offer it.
The whole point of this generation is better cooling and 4x framegen.
ab3e@reddit
We finally reached peak mental gymnastics... We are paying for features! Not performance! How dare you people demand performance?!?!? Give it a few months and Nvidia might lock these shiny new features behind a monthly subscription, seeing how their stock is doing.
jonydevidson@reddit
But it is performance, it's still running on dedicated hardware.
ZarephHD@reddit
"Those who can push beyond 60fps and have high refresh rate monitors are also using framegen on games that offer it."
No.
IronLordSamus@reddit
Sorry but we should be judging the card on its native rendering and not fake frames. Frame gen is a scam.
ArtisticGoose197@reddit
No fake frame gen for me
BFBooger@reddit
The irony is that the faster the base framerate, the more I'm ok with the fake frames.
If a game was 120FPS native, and I had a 240Hz monitor, then getting 220fps out of frame gen would be fine by me, the latency hit would be tiny and it would just make everything smoother.
But if the base FPS is 50 -- I'd rather use upscaling to reach ~75fps and not accept a latency penalty.
corok12@reddit
DLSS is pretty good these days, especially with the new model.
Maybe I'm just overly sensitive to it, but I tried dlss 3 framegen and found it borderline unusable in most cases. It absolutely shouldn't be used for benchmarking. From what's shown in reviews the new multi frame gen doesn't look that much better. I'm not sure making objectively worse image quality the "whole point" of a generation is a good thing.
Reflex 2 will probably be very cool though, I'm looking forward to that.
redimkira@reddit
If NVIDIA called it simply 4080 Super GTI or something that would make more sense, instead of making it look like next gen, but I guess at the end of the day you gotta-keep-pushing-those-stonks-up to keep shareholders happy.
Storm_treize@reddit
Even better 4070
ErektalTrauma@reddit
TPU got +16% so the only joke here is Optimum's testing methodology.
kikimaru024@reddit
The joke is you not understanding that you should never trust 1 single source.
ErektalTrauma@reddit
Yeah? Name another source that got 0%.
conquer69@reddit
Optimum didn't get 0%. You would know that if you at least watched the video before complaining about something that doesn't exist.
qywuwuquq@reddit
Maybe Optimum should not have made clickbait then?
iucatcher@reddit
u should watch the video instead of just looking at a thumbnail
Framed-Photo@reddit
Oh you just...looked at the thumbnail. You looked at the thumbnail and assumed that was the data. Got it.
Zednot123@reddit
TPU literally had 0.5% in DOOM vs the Super, which was what Optimum was comparing to when he said there was no gain in some games.
https://tpucdn.com/review/nvidia-geforce-rtx-5080-founders-edition/images/performance-matchup-rtx-4080-super.png
knighofire@reddit
Based on their fps numbers it was more like 14%. Still disappointing.
It is true that people need to update their testing methodologies to the latest games to properly bench these GPUs. TPU recently reevaluated their game selection extensively, and it shows. It doesn't make a huge difference, but it is the difference between the 5090 being 27% and 35% faster than the 4090 (8 and 15% for the 5080).
imaginary_num6er@reddit
Which test? Cyberpunk 2077?
ErektalTrauma@reddit
Average across 25 games.
RedTuesdayMusic@reddit
Without any sort of upscaling or framegen?
GaussToPractice@reddit
old cards have that too. so yes
gr8dizaster@reddit
Did you watch the video? Or did you judge it by the thumbnail because a 6-minute vid is overwhelming for you?
kpofasho1987@reddit
This generation of GPUs is so far shaping up to be an expensive disappointment.
Glad I am too broke to even entertain buying anything remotely close to these cards right now haha.
I'm hoping that by the time I can afford to build a gaming computer, something like a 4090 will be somewhat affordable, as I'll wait another 2-3 years and then be set for like a decade with that card.
vidati@reddit
Same here, 2080ti since launch still going strong!
mawhii@reddit
Nvidia is the size of Apple nowadays. Sorry, are you expecting prices to go DOWN?
Their gross profit margins are only going to get larger as their SG&A costs increase. We’re lucky they even still compete in the consumer market when b2b AI is so much more profitable for them.
Not glazing I’m just being real from a business and tech perspective.
therewillbelateness@reddit
Why is that? I thought their margins were increasing because AMD stopped being competitive.
3l_n00b@reddit
Why does the person in the thumbnail look like Jeff Bezos with hair?
therewillbelateness@reddit
Is this supposed to be a shorter gen? When is a die shrink coming?
Fallen_0n3@reddit
Nvidia could have done a reverse amd and not released any card other than the 5090
darklooshkin@reddit
Look, I might only have a 6600 XT and a 7600 XT, but I think I'm set for the next 10 years. Between most games having to hit 30-60fps on the AMD Z series if they're going to target Steam Deck-alike sales and AAA dropping into the trash, it's fair to say that there won't be a good reason to upgrade beyond that unless there's a big push on powering game NPCs with onboard AI à la DeepSeek. And even then it's likely to be on the CPU side.
And with Nvidia's AI gambit deflating like a soufflé thanks to Deepseek, we'll probably see a return to their roots in the near future.
So there, just buy a solid current gen card and wait for prices to drop. If Intel's next lineup of GPUs stacks up nicely in price-to-performance, then that's what's going to have to happen eventually.
AlphaPulsarRed@reddit
Obviously a joke for 4080 customers. Those people need a good knock in the head IMO
laselma@reddit
What a stupid way of losing 60-series samples for testing.
Sh1rvallah@reddit
What a stupid stance to be in favor of access journalism
airfryerfuntime@reddit
This guy probably didn't get one straight from Nvidia in the first place, he's probably sharing it with someone else.
Yodas_Ear@reddit
Is that Jeff Bezos’s son?
fibercrime@reddit
Yes