NVIDIA GeForce RTX 5090 appears in first Geekbench OpenCL & Vulkan leaks
Posted by M337ING@reddit | hardware | View on Reddit | 88 comments
CANT_BEAT_PINWHEEL@reddit
Woof. $1600 increasing to $2000 with a 30% increase in performance means performance per dollar basically didn’t increase this generation
Beawrtt@reddit
Performance per dollar, for the $2000 card... Sorry to break the news: people buy the best GPU for the performance, not the value
no6969el@reddit
150% truth. The only reason I know how many watts the 5090 uses is that I needed to make sure my power supply could handle it. Not that I was worried about electricity.
Acrobatic_Age6937@reddit
then it's clearly not a good buy for your use-case.
no6969el@reddit
You have no clue about my use case, nor do you know what I do with the card after I move it out of my gaming/hobby PC. Some of you need to take a step back and think about why you are so concerned with what other people are able to buy, and why.
How about this, the 5090 is the only card that MAY be able to run what I'm asking out of it. If it can then I'll be able to do that task way into the 6x and 7x series.
Not sure why you are thinking I'm playing Mario on this or something....
Acrobatic_Age6937@reddit
That 'may' has to imply you'd want to upgrade next gen as well, because the 5090 would still bottleneck you. But that's apparently not the case.
What magical use case works perfectly on a 5090 that didn't work on a 4090?
Besides, I wasn't insulting you. Most people, me included, don't really profit from buying the highest-end hardware. It's just a waste of money versus upgrading midrange gear slightly more frequently.
Decent-Reach-9831@reddit
7680x2160 240hz monitors
no6969el@reddit
Yeah I didn't think you were but I was just responding to what I saw as being snarky.
The use case for me is both high resolution and high frame rate: a sim desk using 4K 120 Hz panels, which is hard to drive with three of them.
Also, most importantly, it's for using a VR headset when I'm not using the 4K triple panels.
I currently have a Quest 3 and I don't think the resolution is good enough for sim racing, so I also have to add in the additional headroom I'm going to need to run the additional resolution.
If I can successfully run three 4K panels at 120 Hz, then I'm confident I'd be able to run two streams of it, one for each eye in VR.
We have to keep in mind that when rendering in VR, we're also supersampling past the native resolution, which adds sharpness and quality.
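The headroom argument above can be sanity-checked with rough pixel-throughput numbers. This is a back-of-envelope sketch: the Quest 3's ~2064x2208 default per-eye render target and the 1.4x supersampling factor are assumptions for illustration, not figures from the thread.

```python
# Rough pixel-throughput comparison: triple 4K @ 120 Hz vs supersampled VR.
# All figures are illustrative estimates, not benchmark results.

triple_4k = 3 * 3840 * 2160 * 120          # three 4K panels at 120 Hz

# Quest 3 default per-eye render target (~2064x2208), supersampled 1.4x
# per axis for extra sharpness in sim racing (assumed factor).
eye_w, eye_h, ss = 2064, 2208, 1.4
vr = 2 * int(eye_w * ss) * int(eye_h * ss) * 120   # two eyes at 120 Hz

print(f"Triple 4K @ 120 Hz: {triple_4k / 1e9:.1f} Gpx/s")
print(f"VR @ 120 Hz, 1.4x : {vr / 1e9:.1f} Gpx/s")
```

By this rough measure, driving three 4K panels at 120 Hz is actually the heavier load, which supports the commenter's reasoning that a card handling the triples should also handle the headset.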
shmed@reddit
The 4090 may have been $1,600 at launch, but it's been almost impossible to buy a new one for less than $1,900 for the last year. Most of them retail over $2K already. In any case, like most high-end products, there are diminishing returns once you get to the top of the line.
clingbat@reddit
Interesting point on the pricing. I got my 4090 FE from Best Buy for $1599, but that was in September 2023.
DYMAXIONman@reddit
I think it's fine to offer poor value with the top tier card anyway. I just think the 70 series card should always be at least 30% better than the prior gen (which this gen will not have).
vhailorx@reddit
That was fine when the flagship was a titan card just a bit faster than the 80 class. But now the flagship is almost exactly 2x the 80 class product. The gap is way too big.
Hendeith@reddit
Because 5090 is not supposed to offer better perf/$. It's pretty clear that 5090 was mainly supposed to:
- offer more VRAM for AI
- just be faster
Most people who will buy this card for $2000, just like ones that bought 4090 for $1600, won't look at perf/$ ratio.
Zaptruder@reddit
Price per frame will be decent. The only question is, what the hell do you need all those frames for.
(I need them to saturate my 5120 x 1440 240Hz monitor).
SagittaryX@reddit
(Same, but for the upcoming 5120x2160 monitors)
Zaptruder@reddit
Yeah, eyeing that bendy LG monitor... looks very good. Only question is HDR?
Vb_33@reddit
8k60 monitor jammed into my face.
Zednot123@reddit
Moving from 16:9 4K to 21:9 5K just so happens to increase pixel demand by about the same amount as the 4090-to-5090 upgrade provides.
Oh LG, why do you do this to me!
YNWA_1213@reddit
An increase to DLSS resolution during heavy RT/PT workloads, dabbling with 5K/8K displays, etc. It’s all way outside my budget, but I’ll be curious to see another revisit to > 4K gaming by creators as the 3090/4090 were well above 20GB VRAM allocations last time it was tested. Does the higher bandwidth, higher capacity, and larger bus width help keep the cards fed on those displays?
Impeesa_@reddit
I thought the 4090 did offer fairly competitive perf/$, far more so than most top-end halo products would. It was just far above the rest of the stack in both.
TheRealSeeThruHead@reddit
The 4090 was a value card because its perf per $ was better than the 4080's.
StonedProgrammuh@reddit
For AI workloads the 4090 offered absolutely insane perf/$ for inference
PeakBrave8235@reddit
They wouldn’t have reduced their margin lmao
Hendeith@reddit
Do you not understand how margins work? If they're making a bigger chip, adding more VRAM, and using more expensive GDDR7, how would they keep the same margin without increasing the price?
PeakBrave8235@reddit
I perfectly understand how it works. Just curious why people here can clearly think this for Nvidia but not Apple lol, who, by the way, hasn’t increased their Mac prices for Mac mini, MacBook Air, MacBook Pro, iMac, etc.
Hendeith@reddit
Sorry, I still don't understand your point and don't understand what Apple has to do with this. If you understand how margins work, then why do you think making more expensive product while keeping price same doesn't reduce margins?
PeakBrave8235@reddit
I don’t. Many people here have the wrong idea about how products’ profit margins work
Hendeith@reddit
Now I'm lost, because previously you said the opposite thing.
Peach-555@reddit
5090 is also likely to give substantially better performance per dollar in 3D and video.
4090 was ~104% faster than 3090 in Blender, as an example.
5090 supports 4:2:2 10-bit.
Zednot123@reddit
It's another "2080 Ti", even the die size is similar.
MrMPFR@reddit
Unchanged frontend and backend vs the 4090 despite a massive boost to cores is all the evidence we need. The 5090 is a compute and AI card, not a gaming card.
Hendeith@reddit
Pretty much. The fact that the US "had to" force Nvidia to limit 4090 performance because it was being bought in massive numbers for AI purposes in China should resolve anyone's doubts about which route the 5090 would go.
Technician47@reddit
I'd argue the price is more about the fact that a 5090, while a gaming product, has huge demand for general AI purposes, which drives the price sharply up.
Sopel97@reddit
1600*1.3==2080
Decent-Reach-9831@reddit
Inflation adds $200 as well. $1,600 then is $1,800 today.
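The perf-per-dollar argument running through this thread works out as follows, a quick sketch using the thread's own numbers: $1,600 to $2,000 MSRP, an assumed ~30% uplift, and ~$1,800 as the inflation-adjusted 4090 launch price.

```python
# Perf/$ comparison using the numbers quoted in the thread.
msrp_4090, msrp_5090 = 1600, 2000
uplift = 1.30                      # assumed ~30% performance increase

# Nominal: performance ratio divided by price ratio.
nominal_gain = uplift / (msrp_5090 / msrp_4090)   # 1.30 / 1.25

# Inflation-adjusted: $1,600 at the 4090's launch is ~$1,800 today.
real_gain = uplift / (msrp_5090 / 1800)           # 1.30 / ~1.11

print(f"Nominal perf/$ change: {nominal_gain:.2f}x")   # roughly flat
print(f"Real perf/$ change:    {real_gain:.2f}x")
```

So the "basically flat" claim holds at sticker price (~4% better); once inflation or real 4090 street prices near $1,900-2,000 are factored in, the 5090 comes out ahead on perf/$.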
MrByteMe@reddit
And this is the 5090. I expect reduced margins with the lower series cards.
conquer69@reddit
Performance per dollar might be higher in the lower brackets. It was for the 4000 cards.
MrByteMe@reddit
Well, it might be more than the 5090, but I suspect not as good as it was last generation.
Ain't no way the average gamer is going to be able to buy a 5070 for $549.
no6969el@reddit
Maybe in like 2 years.
FuzzyApe@reddit
Wasn't 4080 much worse performance per dollar than 4090?
conquer69@reddit
I was thinking about the 4070, 4070 Super and 4070 Ti Super. No idea why people rushed to buy the 4080 lol.
Asleeper135@reddit
I think it was comparable, but that's actually terrible. Halo cards have always been terrible values, so for the 4080 to even be comparable in terms of performance per dollar is bad.
Massive_Parsley_5000@reddit
Which is why no one bought it lol
There's a reason why out of all the cards the super got a price cut.
MrMPFR@reddit
NVIDIA massively overdesigned the x80 and x70 Ti coolers last time, which should somewhat offset the additional cost of slightly higher TDPs and GDDR7 (20-30% more expensive according to TrendForce).
The only card getting hit is the 5070, which despite a smaller die has a 25% higher TDP plus GDDR7.
i_max2k2@reddit
Did you take inflation into account?
/s
nailgardener@reddit
The more you spend, the more you save
imaginary_num6er@reddit
Why should it? People here were saying 50 series will be like Ampere without anything to suggest it besides Turing coming before Ampere
MrMPFR@reddit
Are the 5090 Geekbench scores held back by 12900K + DDR4 3600?
insanemal@reddit
Nah it's just not as good as it should be because they have bet the house on AI bullshit
DuranteA@reddit
How good "should" it be? The Vulkan numbers match up well with the increase in hardware capabilities.
It's not that much faster than a 4090, outside of memory bandwidth.
I wouldn't read too much into the OpenCL results, it's not exactly the highest-priority API for Nvidia (or anyone).
insanemal@reddit
I was kinda hoping for something befitting the current world leader in GPUs.
Not just another Intel-esque "make it 30% faster by using 40% more power" kind of affair.
trololololo2137@reddit
What did you expect without a node shrink? It's more or less a scaled-up Ada with better tensor cores.
insanemal@reddit
I think you're missing the fucking point here.
I don't think this generation is worth buying. It's a shit offering from a company that is stagnant
trololololo2137@reddit
it's the best GPU on the market but you can always buy AMD that is on the same node but 2x slower and even more stagnant
insanemal@reddit
Or not. There's literally no point to getting either.
The performance uplift is awful.
None of them can actually run any of the current games without some kind of AI bullshit.
I'll do what I did last time and wait like 10 years.
I only just upgraded from my 1080 last year and it was still doing pretty well, except the lack of VRAM was really starting to be an issue, especially when streaming to remote devices.
I've got a 7900XTX because VRAM and decent enough performance.
It's going to be fine for quite some time.
trololololo2137@reddit
so you accuse nvidia of being stagnant and you bought RDNA 3? what a joke
insanemal@reddit
Not really. I wanted around 4080 performance, but needed more than 8GB of VRAM.
What else were my choices?
It's got better raster than a 4080 super @1440p and IDGAF about RT as nothing can actually do it well enough without upscaling and fake frames.
Plus I got the 7900XTX for A LOT less than a 4080 super.
Seems like a good deal to me.
But ditching the 7900XTX for less VRAM and perhaps 10-15% more raster at almost 3X the price. Are you fucking kidding me?
I literally couldn't use an 8GB card as I do a lot of gaming via streaming, and games are eating pretty much all the VRAM these days, so I had to at least get something with 12GB. And with NVIDIA that meant paying 1.5x as much for what, half the VRAM?
And sure I would have got better RT performance, but most of the games I play don't use it. And honestly, there will be heaps of games not making it mandatory for quite some time. The hardware just isn't up to it. Unless you're happy for smear-O-vision.
Nah bro, NVIDIA have been on the Intel path for a while now. They can't actually get a real generational leap in RT performance with their current design, and they can't afford to spend the time required to develop one. All their eggs are in the AI basket, and AI scales linearly with tensor cores. So they're trying to make up for the lack of actual horsepower with imagined horsepower.
UDNA will be interesting, I use a lot of CDNA cards as part of my work. (I build supercomputers that end up on the top500). Their latest cards are faster and cheaper than NVIDIA at AI workloads. But that's just what the benchmarks say. So it will be interesting to have AMD not splitting their focus.
But hey, go off I guess.
NeroClaudius199907@reddit
They could've gone with 3nm, but it would have been expensive with little supply.
insanemal@reddit
There is already little supply.
And that wouldn't have really increased performance enough
NeroClaudius199907@reddit
Stop being delusional. Of course they would've improved perf/power. Look at their previous gens.
Beautiful_Ninja@reddit
You're complaints are with TSMC, not Nvidia. Nvidia's pushing the node as hard as it'll go to eek out whatever performance gains they can get.
insanemal@reddit
No they aren't. They are with NVIDIA.
NVIDIA shouldn't have offered this shit at all. They should have spent some fucking time and actually made something new. This is garbage
DILF_FEET_PICS@reddit
Your eke
PostExtreme7699@reddit
Yes sure, any pathetic excuse goes in order to defend this bullshit.
Definitely bottlenecking the incredible 5090. So incredible that the 4090 is still gonna be the better GPU, performance/power-draw wise.
Go on bots, downvote and say you're gonna buy it day one.
Judge_Bredd_UK@reddit
The 4090 is already a beefcake though, how much of an improvement did we really expect?
Strazdas1@reddit
If its not double performance for half the price like the (falsely construed) 90s they arent happy.
PainterRude1394@reddit
Why are people so upset about new gpus every launch?
MyDudeX@reddit
Because they can’t afford to upgrade so they try to delude themselves into thinking they’re not missing out on anything
fumar@reddit
It's fine not to have the shiniest newest thing. People need to not attach their happiness to material goods.
dztruthseek@reddit
Yeah, no.....this is all I have in life. These material things are the only (fading) happiness that I experience.
unknownohyeah@reddit
I have a 4090 and can easily purchase a 5090 but honestly it's not worth the hassle. If you could do a phone style trade-in and pay $400 to get a 5090 I would but having to sell my old card and then even worse trying to buy a perpetually out of stock 5090 FE takes way too much effort for a mere 30% increase in raster fps.
RiptideTV@reddit
If you're anywhere near a microcenter they actually do have a program just like that
PainterRude1394@reddit
I agree that's a big part of it.
As we all know dlss, frame gen, reflex, and ray tracing are all gimmicks. Or .. at least that was the narrative until competitors released similar but worse features and functionality.
markianw999@reddit
Whats the practical use for any of it.... except to slow raster down.
BinaryJay@reddit
Holding my breath for a MFG version of FSR to come out of the woodworks at some point and be generally hailed as a super awesome way to increase motion clarity.
PainterRude1394@reddit
Despite it being far worse than the DLSS frame gen they were squealing was useless for the last 2 years.
NeroClaudius199907@reddit
Its getting more expensive
PainterRude1394@reddit
Yeah, the cost of transistors is increasing. I don't see how screeching nonstop solves that.
A GPU existing at a price you don't like is why you get upset every launch? Does it existing suddenly make your GPU useless?
markianw999@reddit
Lol, they don't last longer; performance gains are just more incremental. Remember, it's about fucking you out of your money, not giving you performance.
no6969el@reddit
They absolutely last longer. We're at such high resolutions now that not only can you lower graphics settings as time goes on, you can lower resolution. And to stretch it even further, you can drop DLSS to Performance mode.
NeroClaudius199907@reddit
It's been happening since Kepler. There will always be people getting priced out, and the way some deal with it is to screech.
I'm so fortunate I'm a normie. I just buy an alternative if it's too expensive for me.
PainterRude1394@reddit
I mean yeah, competition exists lol. People are screeching because they want the best hardware but don't want to pay the price for it, not because they're priced out of gaming like they'd have you believe.
But people always want more for less. It's a bit bratty to whine nonstop all over Reddit just because you want some gaming hardware for less.
4514919@reddit
Except for the 5090, no other 5000-series card got more expensive.
NeroClaudius199907@reddit
You're only looking at gen over gen.
no6969el@reddit
It's what a spoiled generation looks like.
Zaptruder@reddit
Because they click on stupid ass clips on youtube and get led down a well of ignorance until they're frothing at the mouth over inconsequential things using terminology they barely understand, without appreciation of the factors that drove the decisions made.
So that they can be distracted from the actual oligarchs ruining their lives.
Fossil fuel tyrants ruining the world with climate change? Pharma fucks creating an epidemic of overdose killing millions? Healthcare insurance using people's desperation to fleece them and then fuck them over?
I slep.
Fake frames on a new video card? FUCKING OUTRAGEOUS.
PeakBrave8235@reddit
Good news for Apple. The M4U is probably going to be over 300K, probably over 325K.
Good news for customers: you’ll actually be able to buy a 5090 level GPU with more than 32 GB of memory lol