Screw your RTX 5090 – This $10,000 Card Is the New Gaming King (RTX 6000 Pro Blackwell review)
Posted by fotcorn@reddit | hardware | View on Reddit | 213 comments
GhostsinGlass@reddit
It's so beautiful, the clean lines, the black material, the sheer performance.
But enough about Roman.
Jokes aside, yeah, that's one pricey meatball of a GPU, but I don't think there's one among us who would kick it out of bed for eating crackers if we had the kind of budget to afford that unit. I would really want to see how this thing fares in some GPU-driven simulations for creative workflows in Nvidia Omniverse and others.
That glossy/piano-black finish on that aluminum is absolutely gorgeous, and as much as I'm all about watercooling everything no matter how needless, that design would give me second thoughts for sure. The coil whine is a bit on the brutal side, but that's what headphones are for.
On the creative front, aside from ML, I couldn't see myself buying this GPU though. At around $18,000 Canadian fun bucks I could get three or four 5090 FEs, a Threadripper CPU, and a workstation motherboard, fly right past it a few times over for rendering and such, and still be fine within the 32GB limitation for scenes.
Still want one though, bragging rights and all.
Melodic_Slip_3307@reddit
If the card were ever $3k max I'd consider it, but otherwise nah.
Zenith251@reddit
That's, of course, if you want to assume the risk of a 12V-2x6 connector burning at all. I wouldn't. Not for a $100 GPU, not for $10,000.
(Yes I know you need something near 600w to reliably burn them.)
evangelism2@reddit
I literally could buy this today. I've eyed it on a few sites for $8k. If I sell my 5090, it's a $5k upgrade. But the main things that stop me are:
1) $5-6k more for like 10% more performance, nah. At least with the 5090, crazy as it was, I get 40-50% more performance over the 80-class for about $900-1000 more.
2) Drivers. I ain't about to deal with the headache of whatever drivers might end up being for this thing.
NKG_and_Sons@reddit
Or, in short, your common sense.
evangelism2@reddit
I wouldn't classify anything I said under common sense. More like research and learned experience from decades in PC gaming.
Orolol@reddit
Not wanting a 10% performance uplift for 5k is even below common sense, it's just basic survival instinct.
evangelism2@reddit
Knowing that the two cards only have a 10% difference between them is not common sense. Also, there is more to these cards than just gaming, so maybe you don't know as much as you think you do.
Orolol@reddit
I use a 5090 and a 6000 Pro every day for AI/ML and gaming, so maybe you don't know what you're talking about.
evangelism2@reddit
Irrelevant and besides the point.
But even if that's true, how would I surmise it from your narrow take before?
Orolol@reddit
You can't, and that's why you don't make baseless assumptions about what other people know while thinking you're more knowledgeable than them. That's also common sense.
Weddedtoreddit2@reddit
The man clearly has more money than brains.
But if I had fuck you money to the extent where spending 10k would be the same as me spending a dollar or 10 cents, I'd buy this.
I love PC tech and would be thrilled to have the absolute best of the best.
kael13@reddit
Ah, so learned a man as thee. Can scry through research and experience that 10% extra frames isn’t worth $6000. Teach me your ways.
terraphantm@reddit
The drivers aren’t an issue. But yeah, not worth the $5k extra even with budget not being an issue. At some point enough is enough.
Now if they release a Super / Ti with this core, perhaps I can convince myself that building a second PC wouldn’t be a terrible idea.
evangelism2@reddit
Drivers aren't an issue right now. No guarantee it will stay that way.
terraphantm@reddit
Based on what? It's still ultimately a Blackwell GPU, and as long as Nvidia continues to support Blackwell, this card will receive driver support.
evangelism2@reddit
Yes, but there's no guarantee it will receive Game Ready driver support at the same pace, with any bug fixes that may need patching for the 6000-series card.
Even if you go to their site right now and look up drivers for the card, there are no Game Ready drivers listed.
shugthedug3@reddit
I don't think drivers will be a headache; Nvidia makes sure their Studio driver is solid... or they used to.
You won't get game optimizations of course though.
Mulkanir1975@reddit
I just got my Alienware Area-51 with an RTX 5090. I CAN TELL YOU IT'S WORTH IT. I play, as an example, Cyberpunk modded with Ultraviolence, with all ray tracing and psycho settings. Jaw-dropping, and above 105 fps, up to 120. With G-Sync on my CX48 OLED it's just jaw-dropping. All games I've tried are butter smooth. Coming from an Alienware Aurora R11 with an RTX 3080. Oh, also Ark Ascended runs at 120 fps most of the time. Trust me, I'm so glad I made the jump.
VRGIMP27@reddit
Would be sweet if you could flash a 5090 with this thing's BIOS and maybe unlock some more power.
Granny4TheWin7@reddit
You can't magically unlock CUDA cores with a BIOS flash.
Gippy_@reddit
Oil barons are already paying $8K for gold 5090s, what's another $2K to actually get a +10% performance uplift?
alelo@reddit
I think what's craziest is that you can actually manipulate the card's power target, stuff you can't do with the 5090 (which is capped at a 70% floor, while the 6000 isn't).
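For anyone who wants to poke at this themselves, here's a minimal sketch of how the power limit is usually queried and set through nvidia-smi (the floor and ceiling come from the vBIOS, so the exact range will differ per card):

```python
import subprocess

# Query the driver-enforced power-limit range in watts; these are standard
# nvidia-smi query fields, but the allowed range itself comes from the vBIOS.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.min_limit,power.limit,power.max_limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"min / current / max power limit: {out}")

# Request a 300 W limit (needs admin rights; shown Linux-style with sudo).
# The driver clamps the request to the vBIOS range, which is why a 5090
# can't be dropped as far down as this card can.
subprocess.run(["sudo", "nvidia-smi", "-pl", "300"], check=True)
```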
Gippy_@reddit
4090 performance for 300W is incredible.
Imagine if the 5080 was released like that along with 24GB VRAM. It would've been the most praised card of the generation. Instead it's still slower than a 4090 while only having 16GB VRAM and a 360W TDP.
KARMAAACS@reddit
Not really that incredible, because most 4090s can be power-limited to 320W with basically only a 10-15% performance loss.
unknownohyeah@reddit
My 4090 is limited to 350W with less than a 5% performance loss. Also, the memory is at +1000MHz, which is arguably more important for performance than clock speed.
terraphantm@reddit
If that was the case then I'd expect the 5090 and 6000 to absolutely destroy the 4090 even when power limited since they have nearly double the memory bandwidth.
unknownohyeah@reddit
You can't compare bandwidth across different chips or generations. Ada happens to be bandwidth-starved in some games because it uses GDDR6X, but the 5090 uses GDDR7. And there are other things that contribute, including bus width (384-bit vs 512-bit).
But yes, that's why the 5090 is ~35% faster at 4K than the 4090 despite being on the same process node. It got a huge memory bandwidth increase, and 4K needs all of it.
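Back-of-envelope, from the published bus widths and per-pin data rates (launch figures):

```python
# bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * per-pin rate (Gbps)
def bandwidth_gbps(bus_bits: int, pin_rate_gbps: float) -> float:
    return bus_bits / 8 * pin_rate_gbps

rtx_4090 = bandwidth_gbps(384, 21.0)  # GDDR6X @ 21 Gbps -> ~1008 GB/s
rtx_5090 = bandwidth_gbps(512, 28.0)  # GDDR7  @ 28 Gbps -> ~1792 GB/s
print(rtx_4090, rtx_5090, rtx_5090 / rtx_4090)  # ratio ~1.78x, "nearly double"
```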
terraphantm@reddit
Yes you can compare bandwidth across chip generations. It’s just a number that represents how much data can be transferred per unit time.
The 5090 is faster because it has more cores and clocks higher. Same reason the 6000 Pro is faster than the 5090 despite the two cards having identical memory bandwidth. If the bandwidth made as much of a difference as you think, the 5090 should be 80-100% faster, not just 35%.
unknownohyeah@reddit
You can, but you would be stupid to. Different architectures respond to different components uniquely. When you're making a GPU you will always have a bottleneck. Sometimes it's core count. Sometimes it's clock speed. Bus width, memory bandwidth, even PCIe. Various workloads stress different components: maybe you're running heavy RT ray counts and need more tensor cores; maybe you're doing memory-intensive compute like ray tracing at 4K and need bandwidth.
The 4090 happens to be memory-bandwidth starved sometimes because of the 384-bit bus, so overclocking the memory helps alleviate that bottleneck. The 5090 has more than enough bandwidth and is unconstrained by it; core count and clock speed matter most there. Just because bandwidth doesn't matter for the 5090 does not mean it doesn't matter for the 4090.
Karyo_Ten@reddit
I do agree with you but:
Rasterization and textures need memory bandwidth. Ray tracing needs equations, and that's all: no pixels, no textures. It's up there with the least memory-intensive parallel workloads, along with particles.
See the roofline model and arithmetic intensity.
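For the curious, a toy sketch of the roofline idea (the numbers are illustrative, not from any specific card):

```python
# Roofline model: attainable throughput is the lower of the compute roof and
# the bandwidth roof, as a function of arithmetic intensity (FLOPs per byte).
def attainable_tflops(peak_tflops: float, bw_tbs: float, ai: float) -> float:
    return min(peak_tflops, ai * bw_tbs)

peak_tflops, bw_tbs = 80.0, 1.0  # assumed: ~80 FP32 TFLOPs, ~1 TB/s
for ai in (4, 16, 80, 200):      # FLOPs per byte moved
    print(f"AI={ai:>3}: {attainable_tflops(peak_tflops, bw_tbs, ai):.0f} TFLOPs")
# Low-intensity work (texture-heavy raster) sits on the bandwidth roof;
# high-intensity work (pure intersection math) hits the compute roof.
```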
terraphantm@reddit
The 4090's performance relative to everything else scales pretty much perfectly with its core count and clock speed. Sure, one can probably design a benchmark to stress the memory bandwidth, but in real-world scenarios it's almost never the limiting factor when you're already at 1TB/s.
Even the Titan Ada prototype, despite having the same bandwidth as the 4090, outperforms the 4090 at the same power limit since it has more cores.
unknownohyeah@reddit
If you're just gonna make stuff up and not listen to what I have to say there's no reason to have a conversation.
Different workloads respond to different bottleneck alleviation. It's just that simple.
terraphantm@reddit
You’re the one making things up here. There’s basically no scenario, particularly with gaming, where the 4090 is bandwidth constrained. Multiple benchmarks prove this time and time again.
ResponsibleJudge3172@reddit
It has more cores but clocks lower. And every other 50-series GPU does more with its compute performance than the 40 series. E.g. 5070 vs 4070S: the 4070S's cores and clocks should have it 10% faster, rather than tied.
This is mostly because of the bandwidth gains.
cowoftheuniverse@reddit
Only on paper. In actual gaming they clock almost identically.
bctoy@reddit
The 4090's problem with bandwidth is that its L2 cache is heavily cut down: it only has 72MB vs. 96MB on the full AD102 chip. For comparison, the 4080 has 64MB of L2 cache.
Ada was the first generation to incorporate such a huge amount of cache, since the memory bandwidth was not able to keep up.
https://open.substack.com/pub/chipsandcheese/p/microbenchmarking-nvidias-rtx-4090?selection=5690b1ac-c36c-48b1-b801-039ea821a7a3&utm_campaign=post-share-selection&utm_medium=web
Gippy_@reddit
10-15% performance is the difference between a 5080 and a 5070 Ti, so that's significant. Also the MSRP.
Muppet1616@reddit
Of course a 5080 with the same compute as a 4090 with a 300W TDP would be amazing...
In the same vein, the 5080 would be an amazing card if it were priced at 600 dollars, and a Porsche Panamera would be a great deal at 70k instead of the 100k it starts at.
jungle_terrorist@reddit
In all honesty, IMO the 5080 is really the successful release of the 4080 12GB.
So the Panamera is still that car at 60 or 100k. But the 5080 is a 5070/Ti in disguise 🥸 at the 80-class price.
Muppet1616@reddit
What do you mean?
The 5080 is literally the same size as a 4080 16GB, and 30% bigger with 30% more transistors than a 4070 Ti.
I don't think it's particularly well positioned in the market; it needs 18GB at minimum to be a 4K card in 2025 IMO, and it's too expensive to be a baller 1440p card.
Still, though, all the claims that the 4080 should have been a 4070, or that the 5080 is really a 4080 12GB, kinda need to dig a bit deeper into what you're actually getting for the price in terms of compute. What we've seen over the past ~8 years is that we've stopped seeing exponential price drops across the board.
It's true in consoles, it's true in mobile phones, it's true with CPUs, and it's true with GPUs.
tukatu0@reddit
It's not true in phones at all. From 2020 to 2023 there was nearly a doubling in performance through top-end Snapdragon. Welp. Sh**. That was 5 years ago. Then there's the rise of the cheap $200 phone market in Asia, something something Mali GPUs.
We may get an RTX 7080 or so that's 60-100% better than a 4080, 4GHz clocks and all that. But that might take until 2029, or worst case 2032. So fine, the growth has slowed, but Nvidia still has practically pure profit margins. They have leeway in pricing.
As for a summary https://old.reddit.com/r/pcmasterrace/comments/12n0j87/the_4070_is_a_smaller_cut_of_the_full_ad102_than/
And later on an analysis https://old.reddit.com/r/hardware/comments/1b0mtef/historical_analysis_of_nvidia_gpus_relative/
Muppet1616@reddit
The best GPU you could buy early 2020 was a 2080TI for around 1000 dollars.
A 5080 for the same price destroys that card even harder than in your mobile phone example, so I really don't understand the point you're trying to make.
And even if you include the 3080 that released in late 2020, it's still a 70% performance uplift (and no, you couldn't buy a 3080 below 1000 dollars in 2020).
What happened during the mining boom and subsequent AI boom is that Nvidia decided to make bigger, more power-hungry, and more expensive chips. Likewise, the people who game on PCs nowadays include a lot of people working full time instead of being kids, and they can afford to chuck large sums of money at their systems. And they do.
Any simple comparison of the 10- or even 20-series cards with the latest generations that views the outcome as Nvidia screwing you over is kinda short-sighted. It's a corporation; you don't even need to go that far. It will screw you over as much as it can, because that's kinda what businesses do: charge the most they can for their products.
tukatu0@reddit
Ah. The point was OP's claim that the 4080 12GB turned out to be successful. (That was launched as a 4070 Ti for $100 less.) And yes, he is correct for the whole stack. The 5080 is a 5060 Ti being sold for $1400 in America. Take that for whatever you will.
tukatu0@reddit
I don't know why you skipped from 2023 to 2025.
The point was that Nvidia has massive headroom in pricing, even if they don't have any reason to use it. I didn't compare a 10 series anyway.
You are wrong about both things. 1. The reason they went power-hungry is simple: they know tech is slowing down. 2. The 6800 XT and 6900 XT existed, at $650 and $1,000 respectively. Plus bla bla data centers.
The second thing: 3080 pricing was the opposite. It was actually $700 at launch and mostly $800 in 2020 during the 4 months it was available. By December, people started realizing the pricing went hand in hand with this: https://coinmarketcap.com/currencies/ethereum/ (look at the all-time chart; 2020 to 2022 lines up exactly with GPU pricing, second-hand even: https://stockx.com/nvidia-geforce-rtx-3080-graphics-card-founders-edition scroll to the history).
The 4080 was a fluke. It was always going to be $700, maybe $800, if Ethereum hadn't existed, even with COVID and the rest being the same. In fact, you would have actually gotten the 4090D as a base $800 4080, since the 7900 XTX was a thing.
I don't remember the point of this post anymore. But the 5080 could be $650 if they really needed it to be. That reality does not exist, so whatever.
EmilMR@reddit
The 4090 can have close to stock 4090 performance at 300 watts too.
EmilMR@reddit
5090s are probably garbage bins, and they might be unstable compared with the god-tier silicon that is on this card.
Ilovekittens345@reddit
At this rate I'm convinced that in the end we won't be using Google's AI or Meta's AI or OpenAI's (lol), but that one day Nvidia will start coming out with their own models that will be so much better, because they had billions of dollars to throw at their development.
No AI company is making money right now; they are ALL subsidizing their compute because they want as many users as possible, because they are all training on the interactions users have with these models! That's what they are all after.
But the rate of development is so high right now that the way the giant tech companies train on their users might become obsolete tomorrow, and then they'd have to start over from scratch so as not to fall behind in the race.
So they are burning through billions of dollars buying hardware, or renting it, and securing the power they need. The bigger ones are all in the process of building their own power plants. These companies have money, but they are burning a lot of it!
Meanwhile, Nvidia is where most of that money goes. They are the shovel sellers in the gold rush that's currently happening.
So what happens when the gold dries up, the run ends, and all the creditors show up wanting their money back? Well, everything goes bankrupt, and now the shovel company buys up the companies of every remotely successful gold digger to turn it all into one giant gold-digging company with a monopoly.
And I think that will happen with AI as well. I think in the end it's all going to be Nvidia. They will have the world's first AGI, and everybody's dumber AI will be connecting to it.
jerryfrz@reddit
They should full send it, buy this card, take out the PCB then slap gold heatsink + backplate + bracket on it
n1nj4p0w3r@reddit
Those cards have 7 grams of gold via plating; you could just gold-plate this 6000 Pro.
mm0nst3rr@reddit
Can someone explain Cyberpunk 4K max RT+PT at 2:10? How can it possibly do +612%?
MumrikDK@reddit
Incredible that people miss both the joke and the asterisk note.
_reverse_noraa_@reddit
not everyone is chronically online.
fotcorn@reddit (OP)
It's a joke; he is using 4x frame generation on the RTX 6000 to inflate the number.
ResponsibleJudge3172@reddit
Makes even less sense than Nvidia's version, since he is comparing within the same generation but not using the same settings (in this case it's not a question of capability or support).
evangelism2@reddit
It was a joke. Not meant to be taken too seriously. Just poking fun at the misleading charts Nvidia's been releasing.
Acceptable_Bus_9649@reddit
How is this a "joke"? The 5090 supports MFG.
evangelism2@reddit
It's not that deep, bud. It's just a misleading chart parodying another misleading chart. Either understand that or move on.
Acceptable_Bus_9649@reddit
It is. The 5090 SUPPORTS MFG. So this youtuber is faking the chart.
JayBigGuy10@reddit
Yes, to make fun of Nvidia doing the same thing during the 50-series launch.
Acceptable_Bus_9649@reddit
So nVidia compared a 5090 with MFG to a 5080 with just FG? No? Gotcha.
So he is faking his charts. Great to know.
Mrseedr@reddit
/r/woosh
reddit_equals_censor@reddit
He is using 4x fake interpolation frame gen on the Pro 6000 but NOTHING on the 5090.
But that alone would only get you to a fake 3.9x graph on a good day, so he is probably also using upscaling, at Performance mode at least, to massively lower the resolution, and again NOTHING on the 5090.
Those two combined can get us to the magical fake Nvidia nonsense graph, included for funsies, of 7.12x performance (7.12x, because it includes the original 1x).
That is the Nvidia fake-graph spirit there.
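If you want to sanity-check that arithmetic (all multipliers assumed for illustration, none measured from the video):

```python
# How a ~7.1x bar can be assembled from stacked multipliers vs. a native 5090:
native    = 1.0
mfg_4x    = 3.9   # 4x frame generation rarely yields a clean 4.0x
upscaling = 1.8   # aggressive performance-mode upscaling at 4K
print(f"{native * mfg_4x * upscaling:.1f}x")  # ~7.0x, i.e. +600% territory
```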
iDontSeedMyTorrents@reddit
That's exactly what Roman is doing. Here's Nvidia's 5090 marketing for comparison.
Gippy_@reddit
Same reason why the 5060 becomes a potato at certain settings. The 5090 has 32GB VRAM, but the RTX Pro 6000 Blackwell has 96GB VRAM. der8auer selected extreme settings that required more than 32GB VRAM.
jerryfrz@reddit
Classic Reddit moment, skip watching and go straight to commenting
Gippy_@reddit
No, it did run out of VRAM, because MFG 4X alone doesn't get you 7X the FPS.
iDontSeedMyTorrents@reddit
No, it's called DLSS + MFG, just like how Nvidia gets ~8.5x improvement in their 5090 marketing.
mm0nst3rr@reddit
Makes sense. Didn't think you might need more than 32GB in a game though.
sh1boleth@reddit
There is no game right now that needs more than 32GB of VRAM at 4K without mods.
4514919@reddit
He just used MFG to throw a jab at Nvidia's marketing...
KekeBl@reddit
Nope, it's because he used MFG 4x for the RTX Pro 6000. It's a gag referring to how Nvidia wants reviewers to compare GPU performance with MFG factored in.
Cheerful_Champion@reddit
4K RT + PT without DLSS needs lots of VRAM. The RTX Pro 6000 has 96GB of it. Apparently the 5090's 32GB are not enough.
Virtual-Cobbler-9930@reddit
Bullshit. Cyberpunk does not use more than 16GB of VRAM at 4K native with max settings + PT.
The 600% shown in the video is a joke, literally. 10 seconds later he shows a slide with text about using "MFG 4x".
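(If you want to check on your own rig, a quick sketch for watching real allocation while a game runs, assuming a standard driver install; note this reports allocated memory, which can overstate what a game actually needs:)

```python
import subprocess, time

# Poll VRAM allocation once a second for ~10 seconds via nvidia-smi.
for _ in range(10):
    used = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(used)
    time.sleep(1)
```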
Cheerful_Champion@reddit
Well that makes more sense
Aggravating-Dot132@reddit
It does, but only after a couple of hours.
That said, the card that can push playable fps there is going to have more anyway
spacerays86@reddit
Watch a few more seconds.
Virtual-Cobbler-9930@reddit
That's a joke. Literally 10 seconds later there's a slide showing "*RTX Pro 6000 with MFG x4".
ColeXemi@reddit
Mfg 4x
MrMoussab@reddit
NVIDIA are out of their minds. 3x VRAM for 5x the price? What the heck?!
AveryLazyCovfefe@reddit
Man has never heard of enterprise-focused products or services. Suppliers always price gouge the hell out of them.
MrMoussab@reddit
Sure thing bro, it's not like NVIDIA is taking advantage of the AI hype to further increase their prices to unreasonable new highs. Keep defending the multi-trillion-dollar monopolistic company; it's the good way to go.
skizatch@reddit
This product is not for you
MrMoussab@reddit
How would you know? Do you even know me or what I do to decide if it's for me or not?
Granny4TheWin7@reddit
It's the kind of product where, if you are complaining about its price, then it's not for you. So it's not for you.
It's like complaining that flying private is expensive. Like, duh, it's not for you.
MrMoussab@reddit
Wrong analogy, and you're basically defending a multi-trillion-dollar company for just putting more VRAM on a 5090 and slapping a Pro tag on it.
Granny4TheWin7@reddit
Bro, that's literally every single company in existence. Businesses are not like you and me; they are willing to pay extra for more VRAM and pro drivers.
skizatch@reddit
Wow you need to relax
MrMoussab@reddit
Never been more relaxed in my life
DynamicStatic@reddit
Sure seems like it
iDontSeedMyTorrents@reddit
Bro never heard of professional cards.
Michal_F@reddit
I like Roman's videos, but not this kind: buying an expensive workstation GPU and making a video about using it for gaming and how expensive it is. This is for very specific workloads, and there is no alternative...
MrMoussab@reddit
Didn't watch the video yet but I'm pretty sure he doesn't make a buying recommendation.
AveryLazyCovfefe@reddit
Yeah, he concludes the video with something along the lines of: "idk why the hell I bought this, and damn, 5090 owners are getting the leftover scraps"
rebelSun25@reddit
Anyone who believed Nvidia's claim that it JUST wasn't possible to improve without DLSS 4 MFG magic has now been made a fool...
Jensen has become the biggest snake-oil salesman since the Springfield monorail guy.
KARMAAACS@reddit
I think anyone with half a brain knows NVIDIA's been holding back the good silicon and has moved everything a tier down: a 5080 die is more like a 5070, a 5070 die would've been a 5060 in any generation other than Ada Lovelace or Blackwell, and so on. The reason the value looks so poor is that NVIDIA's jacked up prices and moved stuff a tier down, so everything looks flat in terms of price to performance.
If the 5080 were $599 and called an 'RTX 5070' it would be a great card. If the 5090 were more accurately called a 5080 Ti and priced at $1599, people would've liked it a lot more. And if NVIDIA made a real 5080 with a cut-down GB202 die, 80% of the 5090's cores, and 24GB of VRAM for $899 or so, it would be actually amazing.
At the end of the day, NVIDIA knows it can milk the AI market by cutting VRAM from GeForce cards like the 5080, thereby pushing AI buyers towards the professional cards where the margins are much higher; they know that will make them far more profit. This is what gamers don't understand: NVIDIA doesn't really care about gaming anymore; it's their fallback for when the AI bubble pops. NVIDIA knows the AI bubble will pop eventually, and they're cashing in while they can. If and when the AI bubble pops, watch GeForce prices drop 20%. A 5080 SUPER would become $799. The only real AI products NVIDIA has are the Hopper GH200 and Blackwell GB200 stuff; the rest is gaming dies repurposed.
soggybiscuit93@reddit
I think it's more nuanced than that. It's simply not realistic, given the current fab/wafer market, for Nvidia to match the same die sizes at the same price points as in the past. The N4 used in Blackwell costs nearly 4x as much per die as the Samsung 8nm used in Ampere did. Even if Nvidia were altruistic and had tight competition, they couldn't match the old die-size price points.
The main issue really is VRAM. The 5060 should've been 12GB, the 5070 should've been 16GB, the 5080 should've been 24GB, etc.
KARMAAACS@reddit
Sure, but the node jump should still bring an advantage: either you get better performance at the same power from a smaller die, thanks to the clock speed and density improvements of the node (meaning a smaller die should be relatively better for the same price), or, if you make it small enough, it should match the previous generation's larger chips at a more affordable price.
I think the problem is, as you pointed out, the price increase of TSMC silicon; it's probably outpacing the scaling rate of the silicon, making things less affordable, or at best flat. But let's also be honest here: NVIDIA is being greedy too, to an extent, milking gamers and especially the AI market.
Well, they could if they switched their gaming silicon to Samsung again, or maybe even looked at Intel Foundry as the rumors suggest. They would probably get a cheaper wafer price versus TSMC, despite the silicon being less performant in clock speed, density, or power efficiency. But if Ampere is anything to go by, NVIDIA can make a stellar, good-value product on that type of node, and they can mix and match their products across different nodes. I don't know if you remember, but Ampere also had TSMC 7nm products for AI and datacenter, and I think that should be NVIDIA's approach in the future: gaming on a cheap node, with power pushed a bit more than usual to get the performance, and TSMC for the AI stuff because of the advanced packaging tech, power efficiency, etc.
Really, TSMC silicon is wasted by NVIDIA on gaming. You can usually undervolt or power limit your 40-series card from, say, 450W to 320W, or your 50-series card from 600W to 450W, and lose next to no performance. The power is being pushed for a pointless reason.
100%, but they want to push the AI guys to buy the 5090 or the RTX Pro 6000 Blackwell to get high VRAM. Otherwise, the AI guys would go out and buy 5080 24GB cards and not bother with the 5090, because the 5080 is far more efficient: you get more perf per watt and can scale workloads across more GPUs, since you can buy two 5080s for every one 5090. NVIDIA limited the 5080 to 16GB on purpose so it wouldn't intrude on their AI business, which I guess is smart. The 5080 SUPER might be a mistake for NVIDIA if they bring it out.
zacker150@reddit
I think it's also important to recognize how the demographics of gamers have changed.
Many of the high schoolers and college students who got into gaming 10 years ago (including myself) are all now highly paid professionals making six figures, and the upper-middle class spends a ton of money on their hobbies. Gaming with a 5090 is cheap by their standards.
rebelSun25@reddit
Your ChatGPT-length essay is years too late. Too many trolls here in denial, proving my point.
BarKnight@reddit
Watch it show up on the steam survey above the 9700XT
JesusTalksToMuch@reddit
What's sad is people get the name of AMD's GPU wrong. Really shows that people don't care about it.
Candle_Honest@reddit
How is it sad? Do people need to know the engineering code names as well for the GPUs?
Alarchy@reddit
Definitely not AMD's fault for changing their naming scheme to copy Nvidia's with the intent to cause that confusion.
Tanzious02@reddit
AMD intentionally follows Nvidia's lead no matter what, which is why they don't innovate. The gaming industry is screwed, as we have to follow Nvidia's vision of AI upscaling and frame gen rather than properly optimizing our games. AMD purposefully doesn't deviate from the norm and lead with their own vision on GPUs, as it's too "risky".
red286@reddit
I work in PC sales and, no joke, I've already sold more RTX 6000 Pro Blackwells (4) than RX 9700XTs (0).
Mind you, we primarily focus on workstations.
Arbiter02@reddit
That doesn't shock me. Radeon's support might as well be non-existent in the pro space, I've run into scenarios where the R9 M370X in my old ass mac has better support than my 6900XT, simply because the Mac can leverage MPS while the 6900XT is stuck with ROCm.
Rentta@reddit
I would hope you don't try to sell 23-year-old cards.
Rentta@reddit
I mean 9700 was launched in 2002 so it would make sense
Jeep-Eep@reddit
I mean, folks work on them? So if folks get wind of the fact that it's quite a competent dual-purpose card...
9897969594938281@reddit
Now, this gave me a sensible chuckle
mapletune@reddit
9700XT? xt version of 9700X cpu? back to normal naming version of the 9070XT?
SPAREHOBO@reddit
I’m not saying that I want this to happen, but it would be funny if it did.
hocheung20@reddit
This seems like Titan reborn, but wow the GPU price inflation.
Listed below are the last 3 Nvidia Titan cards:

Model | Release Date | MSRP (USD)
---------|----------|----------
Titan Xp (Pascal) | 2017-04-06 | $1199
Titan V (Volta) | 2017-12-07 | $2999
Titan RTX (Turing) | 2018-12-18 | $2499
ResponsibleJudge3172@reddit
It's not a Titan. It's a Quadro. Quadro has always cost similarly
Strazdas1@reddit
yeah, titans are called Ti now.
hocheung20@reddit
The xx70 Ti and xx80 Ti would never have qualified as Titans in any scenario.
Titans used to be gaming flagships and this $10k card is a flagship and uses Game Ready Drivers.
soggybiscuit93@reddit
That was a choice by the reviewer. Any PC you buy that comes with one of these isn't shipping from the factory with Game Ready Drivers.
Strazdas1@reddit
technically all Nvidia GPUs support "game ready" drivers. What's new with this card is that it allows the power limit settings to be changed.
timorous1234567890@reddit
Nah. The Titan line just doesn't exist anymore. It had a separate driver from the gaming cards that sat somewhere between the gaming parts and the pro parts.
Seems like NV wanted to push the people who used more than the basic gaming features towards Quadro instead of offering a cheaper prosumer product.
buildzoid@reddit
Titans were always just overpriced gaming GPUs meant to normalize higher prices. You wouldn't make a Star Wars edition of a professional card.
hocheung20@reddit
It uses Game Ready Drivers, not Quadro.
noerc@reddit
Apparently you can even use Game Ready drivers with this card. As der8auer said, this is the product people expected the 5090 to be, but instead they packed 96GB VRAM on it and sell it for a massive premium.
Moscato359@reddit
The 4090 was an 8/9ths cutdown of the RTX 6000 Ada.
So expectations should be tempered.
viperabyss@reddit
Same with RTX A6000 / 3090, or RTX Titan / 2080Ti, etc...
Not sure why people expected differently this time.
Own-Lemon8708@reddit
The Quadro RTX 8000 48GB was just an unlocked 2080 Ti too.
terraphantm@reddit
The A6000 and 3090 had a tiny difference in comparison to most of the others. And even then, the workstation cards generally performed worse than the GeForce cards in gaming, since they had slower RAM and lower power limits. This new one is an exception; it's closer to how the old Titans were conceptually.
viperabyss@reddit
Sure, but OP's point is that people somehow expect the perfect GB202 die for the 5090, when that expectation is wholly unrealistic and has never been true, at least going back a few generations.
And yes, workstation cards typically perform worse than consumer cards despite having more SMs; it's all down to power consumption. The RTX Pro 6000 Blackwell is the first workstation card that not only has the perfect die, but also the same TDP as the consumer card.
By the way, the Titans are now the xx90 series, which is catered to the prosumer segment of the market.
hocheung20@reddit
The RTX Pro 6000 Blackwell is not a perfect die. Only 188 of 192 SMs are enabled.
And the xx90s are not Titans. All Titans had full-fat dies.
viperabyss@reddit
The xx90s are all Titan-class GPUs. The Titan became the xx90 when AIB OEMs got upset at Nvidia for hogging that market, since Titan GPUs were manufactured by Nvidia / PNY.
noerc@reddit
I rather meant the performance uplift. There is a ~45% performance difference between the 4090 and the RTX 6000 Pro. This is what many wanted to see on the 5090, because that's what we got with the 4090 when comparing it to a 3090.
gAt0@reddit
Maybe because of the super-premium cost of a toy for playing games or chatbots.
Safety concerns aside, this generation has been meh at best.
viperabyss@reddit
...you mean similar to cards like the Titan RTX, which was $2,499 and 2.5x more than the 2080 Ti flagship?
People hate on Nvidia so much that they've forgotten this has been the norm for years.
Moscato359@reddit
Idk, people just make blind assumptions
Madeiran@reddit
Game Ready drivers have worked for every single Nvidia workstation GPU since they first started labeling them "Game Ready" drivers in 2014.
p4block@reddit
Back then they even swapped the name of your Quadro for its depressingly low-tier GeForce equivalent.
DepthHour1669@reddit
I mean, technically true lol but 0 people are gonna get a RTX6000 Blackwell to game lol
alelo@reddit
wasn't that said about Titan cards too?
DepthHour1669@reddit
You can buy a Titan card in a store. You can't even buy an RTX 6000 unless you have a corporate, non-personal email address to register with an enterprise supplier to purchase the enterprise card.
old_c5-6_quad@reddit
You could go to any computer retailer and get one. Just plop down the cash and they can order it. They're not going to keep anything like that in stock because the likelihood of someone walking in and buying it is pretty much 0.
terraphantm@reddit
You can buy them pretty easily without being a corporate / enterprise customer.
KARMAAACS@reddit
I can walk right into a retail store here in Australia and buy an Ada, or soon a Blackwell, professional card. I dunno what's going on in the USA, but it seems in other regions they are over-the-counter purchases.
alelo@reddit
you can buy RTX6000 over the counter https://www.proshop.at/?s=RTX+6000
DepthHour1669@reddit
Technically that's against the Nvidia USA distribution contract.
ragnanorok@reddit
With Proshop being European, I don't think they're under US contracts, nor would it be that easy to include such clauses in EU ones.
alelo@reddit
Maybe Nvidia Europe has different contracts; I doubt Proshop would risk their supply if they couldn't sell those.
Subway@reddit
Hold my credit card!
JackSpyder@reddit
Hold my dads credit card!
zacker150@reddit
Fixed that for you. Now it's a tax write off.
JackSpyder@reddit
Nice.
youreblockingmyshot@reddit
The number will be low but I’d bet my entire remaining life it won’t be 0.
jerryfrz@reddit
Roman literally said in the video that he's gonna game with it.
skizatch@reddit
Sure they will. Don’t underestimate the filthy rich. Might even be easier to get than an actual 5090 FE 😂
shugthedug3@reddit
Oh a few will for sure. Best of the best, unlimited budget types do exist.
beigemore@reddit
You are wrong.
Qweasdy@reddit
I'd be shocked if it was less than a thousand people who did exactly this.
There are a lot of people with a lot of money. There's a pretty big market for private yachts, private jets, hypercars etc. And those markets are all growing these days. You think those people give a shit how much it costs?
rbmorse@reddit
Some people will buy it simply because it costs $10K, without a real clue as to what it's really all about, just so they can say they have the fastest PC there is.
fixminer@reddit
It will certainly be more than 0, though not much more. For some people $10,000 is pocket change.
skizatch@reddit
Buy NVDA shares when they’re low, sell when higher, repeat until you have enough to buy one
Strazdas1@reddit
Error. Unable to buy any since it's never low.
skizatch@reddit
It dipped to $95 very recently due to all the tariff chaos
Strazdas1@reddit
whole market dipped. But only for a week and it was back up.
makistsa@reddit
They were not pushing it as much as I thought. It loses efficiency below 400W.
ResponsibleJudge3172@reddit
Every Blackwell GPU can overclock well. This means that power limiting will affect performance.
RZ_1911@reddit
That depends on GPU quality. One chip can run 2850MHz easily inside its limit; another will struggle at 2500MHz with the same limits.
ResponsibleJudge3172@reddit
By overclocking well, I mean that OC actually yields tangible, almost linear performance gains.
In such a scenario, then yes, underclocking can be expected to similarly drop performance quickly.
nero10578@reddit
These things will easily pull 1kw given the power limit allowance.
Self_Pure@reddit
I really feel like this should have been the 5090, and the 5090 should have been the 5080.
SirMaster@reddit
The 5090 should not have 96GB vram…
Self_Pure@reddit
Not referring to the VRAM, but the performance, obviously.
GenZia@reddit
Really like the way you can just drop the power target all the way down to 25%.
No idea why power limiters only go down to ~70-75% on current-gen flagships (9070XT/5090).
A bit unrelated, but it would be nice if GPU manufacturers allowed us to change their cards' TDP/TDC/EDC and voltage to anything we want.
At least AMD has given us full control of their CPUs via Ryzen Master. It wouldn't hurt to make a similar utility for their GPUs ("Radeon Master").
There used to be More Power Tool for their GPUs (and Red BIOS Editor for Polaris even before that), but it doesn't support RDNA 3 and 4 GPUs.
AMD deliberately locked down the BIOSes, for some reason, but I digress.
nero10578@reddit
Yea Radeon 6000 was awesome for overclocking. Got a 6900XT to 3GHz back when 3090s were still in 2GHz territory.
vegetable__lasagne@reddit
Couldn't you just set a manual clock speed if you wanted <70%?
Homerlncognito@reddit
I don't have experience with AMD, but you can undervolt Nvidia cards to pretty much whatever you want. The only issue is that you'll start losing quite a lot of performance.
HisDivineOrder@reddit
Not true with the 5090.
Jeep-Eep@reddit
Yeah, someone at Nvidia definitely has consumer versions of boards for the full-fat die, for when the AI bubble goes...
AHrubik@reddit
Welp, we have a new level of flex on the market. All those people claiming everyone was just jealous of their 5090s are now the ones with egg on their faces.
honkimon@reddit
I think the gap between each tier of Nvidia card is laughable, considering yield rates and, really, any other manufacturing factor. They are putting Apple to shame with this.
Vushivushi@reddit
Crazy thing is that the RTX 8000 Turing was $10k in 2018.
youreblockingmyshot@reddit
I would love for Roman to compare his $10k GPU against 2 FE 5090s running Lossless Scaling (ridiculously overkill), but if anyone has the hardware for a $10k vs $4k GPU setup, it's him now lol.
MumrikDK@reddit
That is some truly magnificent coil whine.
youreblockingmyshot@reddit
Yea, I've never really had coil whine on a GPU (maybe I've been lucky) until the 50 series. Even my friends' builds that I've put together didn't really have coil whine that didn't go away permanently within a few minutes.
VEGA3519@reddit
It's a choice, I guess, but if you want to pay $10k for a workstation GPU to play games, then fine. Though you can still just get a 4090 and have essentially a similar or the same card, but without the melting cables and 575W power draw, and with less VRAM (might sound stupid, but IMO you don't need 32GB of VRAM right now that much. It's not like you're going to launch multiple games, are you?).
fiah84@reddit
The 4090 melts cables just fine; you have to go back to the 3090 for non-melty cables.
hackenclaw@reddit
I'd imagine that IF AMD were able to pull a modern-day HD 5870/HD 6970, Nvidia would have to sell this at a consumer price.
Competition drives prices down, but AMD chose to ignore the entire Radeon lineup; they choose merely to exist. They're not even trying lol.
iDontSeedMyTorrents@reddit
Did you not know that AMD also has very expensive workstation cards?
i7-4790Que@reddit
They stopped trying because that whole series lost them money.
Loss leader type strategies don't work when you still can't get enough volume to make up for your aggressive price points.
Nvidia made record profits at the same time and has ever since. At some point you have to blame the consumer base for a lot of the predicament we are in now
Intelligent_Top_328@reddit
My 670 is better. Sentimental value.
jj4379@reddit
Wow, I'd feel bad if I bought a 5090 when this cards out there, I'd feel so silly.
Brisslayer333@reddit
It's five times the price for an 11% performance uplift, did you even watch the video?
MumrikDK@reddit
I think that comment fits you better. They're playing into Roman's mocking comments in the end about the 5090 being this card's trash.
jj4379@reddit
Of course, I love watching his videos; it's just sarcasm at how everything is incredibly expensive these days.
Warm_Iron_273@reddit
More garbage. These guys are paid to promote these cards.
MumrikDK@reddit
There's no chance you actually watched this video, where he repeatedly mocks Nvidia.
Raikaru@reddit
Paid to promote cards that your average person couldn't figure out how to buy even if they had the money to?
Warm_Iron_273@reddit
Irrelevant. Marketing is marketing. Everybody knows they get paid for these posts, they even admit it themselves openly all the time.
Raikaru@reddit
If they were getting paid for this youtube video, they would need to disclose it or they could be reported to the FTC
Warm_Iron_273@reddit
Lol. The naivety is cute.
Raikaru@reddit
Lol. Then report them to the FTC. You’ll get the satisfaction of them never doing it again.
Warm_Iron_273@reddit
Like they’ll give a fuck. If they did, all of these YouTubers wouldn’t be doing it. They do it because they know there are no repercussions.
Raikaru@reddit
The FTC has literally sued YouTube before, and they regularly sue companies for violations of their rules. But keep making excuses for why you won't follow up on something you feel so strongly about.
ghostdeath22@reddit
Sad that they price these so high; hope some Chinese companies create GPUs to dump prices.
cheese_dreams89@reddit
I haven't watched the video, but from the title alone, isn't that a workstation card? Why would you use it for gaming? I'm sure you could, but it doesn't seem very economical. Nvidia has always had workstation cards that are more powerful than the typical gaming cards. I don't see why this one is special.
ASuarezMascareno@reddit
Gaming on a 5090 is also not very economical. Isn't the usual argument that those who want the best don't care about value?
Leo1_ac@reddit
People who buy 5090s for $3K each don't care about money. They would certainly buy a GPU for $10K because money isn't a concern for them.
ResponsibleJudge3172@reddit
Being rich to the point where money is meaningless doesn't begin at $3K.
shadowtheimpure@reddit
There's a very big difference between $3k and $10k. Someone might splurge on a $3,000 card but have no way to pay for a $10,000 card. I've got a 3090 that I bought hoping not to have to upgrade for a while. So far, the card still rocks new titles at max settings.
fotcorn@reddit (OP)
I don't even know if that much VRAM is useful for any classic workstation task like CAD, video editing, or 3D modelling.
It's mostly an AI card; being able to run a 70B model at a moderate quantization (Q8, maybe Q6 for reasonable context length) on a single card is amazing.
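Rough napkin math on why 96GB matters there (bits-per-weight figures are approximate for common quant formats, and real usage adds KV cache and runtime overhead on top):

```python
# Approximate VRAM needed just for the weights of a dense 70B-parameter model.
def weights_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * bits_per_weight / 8  # 1B params at 8 bits = 1 GB

for name, bits in [("Q8", 8.5), ("Q6", 6.6), ("Q4", 4.8)]:
    print(f"{name}: ~{weights_gb(70, bits):.0f} GB for weights alone")
# Q8 -> ~74 GB, Q6 -> ~58 GB: fits in 96 GB with headroom for context,
# but far beyond a single 32 GB 5090 without splitting across cards.
```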
bphase@reddit
What are such models generally used for? Is it industry-specific stuff?
From my understanding, that's still much too little to run state-of-the-art LLMs, which require hundreds of gigabytes and thus many of these cards in parallel. But they would probably be useful for running, e.g., image recognition or object detection models in various industries?
Hamza9575@reddit
This card is useful for decentralized AI, or rather home-server AI. Sure, your local AI isn't as good as cloud AI, but with 96GB of VRAM you can run much better AI than with 32GB on a 5090. The cost of this card is very reasonable for what you get; people are forgetting this card is actually 13% faster than it shows due to onboard ECC, as the ECC calculations cause around a 13% performance loss. So in reality this card is around 30% faster than the 5090 if not VRAM-bottlenecked. The 5090 doesn't have ECC calculations.
shadowtheimpure@reddit
Not all models are THAT big; those are the behemoth 100B+ models that can only be run by renting GPU time in a datacenter.
Gippy_@reddit
This is the first time in a very long while (maybe ever) that a flagship workstation card is at least 10% faster than the flagship gaming card. The RTX 6000 Ada wasn't always faster than the 4090 because of slower clocks and VRAM.
trouthat@reddit
They can’t keep doing this