The wheels of communism must turn. Don't want to get consumers comfy with their perfectly functional cards, they must always want the one with the bigger model number.
The entirety of this launch could have been a driver and software update for FG -> MFG, but they chose to lock it behind hardware, and hopefully it backfires.
Next gen I'm going full AMD. Fck this. DLSS and FG aren't THAT much better than AMD's offerings at the moment to warrant the situation where we are all getting fcked by Nvidia.
Yeah, because the informed people, if we just go off the number of YouTube views, are a pretty small % compared to overall sales, so clearly most people are just "bigger number better" buyers.
In a way, this whole Blackwell shitshow isn't surprising at all and kinda makes sense when you realize that
1 - It's on the same node (though prior gens that stayed on the same nodes had better gains)
2 - this is the first generation of NVIDIA GPUs whose development cycle mostly took place following the AI Boom, which is why most of the significant performance gains are in AI applications and not much else
Yup. I replaced the GPU in my living room TV/gaming PC a little while after Black Friday. I was looking for Black Friday sales, but there just weren't any, so I picked up a used 4080 Super for $850 from r/hardwareswap and immediately regretted it. That's just like 10% off MSRP, not really a deal, and it's "used"... I just didn't want to wait for the 5xxx series, and everyone was saying "just wait" and we were all laughing at the MSRP-level prices people were posting their used 4080s and 4090s up for.
turns out that was one of the best decisions I've ever made.
I had a 3070 that I was pretty happy with but stumbled upon an open box 4070 for around $400 about a year ago, snagged that and sold my 3070 for $300. I wasn't really looking to upgrade, it just made sense, and now I'm in a great spot because of it.
Yeah I got one for $600 a month ago and had a couple people telling me it was stupid. Each week that has passed I'm more and more happy with my decision.
I wanted to wait for the 5xxx and 9xxx reviews, then I found an open box 4070 Ti Super for just 611€, so glad I don't have to fight bots and scalpers to buy a new GPU.
Yep - I am astounded at the stock issues. We don't seem to have anything like the cryptocurrency boom, and the 4nm node is mature. I wonder what is causing the issue this time.
Then again, the GPU is not really "worth it". It's a more costly 4070 Super at this point.
They want to maintain their mindshare just in case AI ends up crashing.
If they completely exit the market, they essentially cede it to AMD and Intel, and if they want to ever re-enter the market, it will be pretty hard to do so.
Because R&D still costs the same even if you don't sell a single new card, and you're looking at a short-term stock problem that won't matter in 6 months. Also, the segmentation is purely artificial: they can release Supers of any card this generation at any time to boost sales when capacity frees up.
A scam would be Nvidia selling you a box with 5070 written on it, but inside there is a 1050ti GPU.
If you have all the information about the product and do not like its performance gains after buying, you DID NOT get scammed, you were just a stupid consumer.
If a car manufacturer announced their new GTR model with the expectation that it would have the same performance as a supercar from a few years back and it turns out it didn't actually have that performance, and they even put in a worse engine than they normally do for their GTR models, wouldn't people call that a scam?
Bad analogy with cars, because car manufacturers literally do this all the time and nobody cares LOL..
And no I wouldn't call it a scam because I wouldn't buy a car based off marketing material, neither do 99% of the population, and neither would a salesman at the dealership sell it to you if it was an actual scam..
They can say it's 50x faster than the predecessor flagship in X scenario, it doesn't matter, the specs clearly say what to expect. If you buy in to marketing, you're a schmuck. You get the same specs as on the box, which in the most literal sense - NOT a scam..
The product is for somebody, obviously not for you.
What a disappointing generation. I'm still using my 3080 FE from 2020, and I was really hoping to upgrade this year. I'm not spending $2k on a 5090, and it sounds like the 5080 is similarly limited on VRAM and will also likely suffer from issues in the future.
I'm also on the 3080. Would be happy to sit on it longer if it wasn't for the 10 GB of VRAM.
Hell, not sure when I'll ever upgrade at this rate.
I just want something OK for an OK price, you know?
The worst part of this generation is that it's so shit that I won't even consider getting it used when the next generation arrives. At least with the 4000-series, you could get lucky and get something used for a price that was less ridiculous.
Honestly not really sure what to do. Monster Hunter Wilds runs super inconsistently on my PC, which is disappointing, but I haven't really had any issues running most games on higher settings.
Part of me hopes that they announce a 5080ti at some point this fall and that it's reasonably priced.
Part of me is considering giving AMD a shot. The 9070XT is getting pretty great reviews. It might be worth the upgrade if I can get it for MSRP ($599). There have been a lot of comparisons to the 5070 ti. It also has 16gb of VRAM, which isn't great, but better than my 3080 at 10GB.
Similar situation. I thought about grabbing a 7900 XTX for the raw performance and tons of VRAM but the increased usage of ray tracing in games had me pause on that one, and it's still hovering around $1000 USD if you find any in stock for a two year old card which feels wrong.
Depending on the actual performance, I am thinking of grabbing a 9070 XT to use for a couple years at the least in hopes of GPUs getting better next generation. Bumping from 10GB to 16GB doesn't feel significant but it's at least a short term solution for (hopefully) good price/performance rather than dropping a grand on Nvidia's current disaster.
Which does make it more depressing, because all NVIDIA had to do was give the 5070 the same amount of cores as the 4070 Super. That would have given us a semi-decent +20% gen-on-gen uplift.
People here really expecting an 80% gen-over-gen uplift in value is quite insane. But no surprise considering the delusional brainwash they get from channels like HUB.
We had a less than 40% perf increase between the 4060 and the 2060, which came out 6 years ago. Over the same span, the 70 series got an 87% increase, while the 80 series got a 143% increase.
Nvidia has been sandbagging all the affordable cards with garbage upgrades each generation. It's about time we finally got back to actual upgrades. If the 60 series were upgraded at the same pace as the 80 series, then the 5060 would be 4% faster than the RTX 5070. Funny how that works out, huh?
Used to be that the xx60 class card matched the performance of the previous gen xx80 class card, at minimum. Now the xx70 class card doesn't even beat the xx70-Ti card of previous gen. That's zero uplift.
Even looking back historically, that's only seldom been true.
The GTX 460 was a bit ahead of the GTX 280/285 (source: TechPowerUp)
The GTX 560 usually very soundly lost to the GTX 480 (source: TechPowerUp)
The GTX 660 usually matched or beat the GTX 580 (source: TechPowerUp)
The GTX 760 slightly lost to the GTX 670 (source: TechPowerUp)
The GTX 960 is weird because Kepler has aged so bizarrely, at launch it slightly lost to the GTX 770, in newer titles (~2016 or so onwards, I want to say?) it's about the same or faster. (Source for launch: TechPowerUp)
The GTX 1060 matched the GTX 980 (source: TechPowerUp)
The RTX 2060's roughly a GTX 1080 (source: TechPowerUp)
The RTX 3060 is roughly an RTX 2070 (source: TechPowerUp)
The RTX 4060 is roughly an RTX 2080 (source: TechPowerUp)
All reviews were chosen at the time of release.
Even if we arbitrarily do a cutoff at Turing, 2/7 of them lose to the previous -70 and 1/7 of them lose to the -80. Only 4/7 manage to match the -80, that's not really a common enough trend to say "used to."
That doesn't mean these cards aren't disappointing, but it was functionally a coin-flip if the -60 matched the previous -80 or previous -70 (and that's with a generous cutoff period)
In addition, the price for the 80-class cards went from $550 for the 980 to over $1,000 USD, so it nearly doubled, while the 60-class card went from $200 for the 960 to $300, only a 50% increase.
Yeah, it's not always 100% true, but it's generally the case that there's a substantial gen-on-gen improvement; this gen there is effectively no improvement.
Yeah, for sure. Nvidia really dropped the ball here and it's especially disappointing when Lovelace was so mid to start. There's not really a GPU here other than the 5090 that you couldn't have bought 2-3 years ago.
When the 5070 was announced I was honestly kind of eyeing it (I knew it wasn't going to be a 4090, obviously, but I figured it'd sit closer to a 4070 Ti Super), now I'm not giving really any of Blackwell a second thought lol. The 9070's kind of interesting but I think I'm gonna sit this gen out entirely.
Side note, apologies if I sounded hostile in any way, wasn't my intent at all; I re-read my post and realized I sounded really blunt, I don't know how to correct it to sound less blunt
+33% bandwidth helps in some games, less in others, which is why in some titles the 4070S wins by a small margin; these GPUs trade blows because different engines scale differently with bandwidth.
It has a much higher memory bandwidth and a decently-sized TDP increase.
Nvidia knew exactly what they were doing with this launch. They basically gave the Nvidia pay pigs the same card for a $50 lower MSRP while probably keeping or even increasing their margins due to the smaller die.
Because this is rightfully compared against the 4070 Super, not the original 4070.
When you think about it, the Super refreshes shouldn't even happen in the first place. The only reason they do is because the originals have a lukewarm reputation. And that's why RTX 20 and RTX 40 had Super cards, but not RTX 30 or even GTX 16.
The 4070 itself was poorly received with it only matching the RTX 3080 10GB and 70-series cards went from having a 256-bit bus on the 970, 1070, 2070 and 3070 down to 192-bit. All of those cards also offered much more substantial performance upgrades over their predecessors. The 4070 had 50% more VRAM, and it was only barely adequate considering the 3070 should have had 12GB itself. It was also only 25% faster than the 3070. Now we get a card with the same (and now very inadequate) amount of VRAM that's also only 25% faster than its predecessor, only this time also using 20% more power.
It's only 25% faster 2 years later, it has an inadequate amount of VRAM and is barely any more efficient than its predecessor. How can people not be mad?
Yeah, because it was cut down substantially. The 3090 and 3090 Ti were also 628mm2. The 5090 is also cut down, but only by 11%, and the 3090 Ti was the full die. This means the 5090 has more active die area than the 3090 Ti and 3090, and a lot more than the 3080.
Of course it was an outlier that was basically my point. The whole comment chain started with calling a 5090 a 5080 which is absurd. Then another guy brought up the 3080 die size to act like it was a reasonable thing to call a 5090 a 5080
I was simply pointing out that the 5090 deserves to be called a 90 series card more than any other 90 series card if anything. The die size is huge. It has more area than the 4090 and 3090 by a large margin. And comparing it to a 3080 that was cut down more than 20% still makes the 5090 look like a way higher tier card than the 3080 even though the 3080 is a significant outlier as the biggest 80 class gpu nvidia has released.
Yep, no reviewers really recommended the 4060 last gen either that I remember, but it's one of the most popular cards on the Steam hardware survey and the like.
What were people supposed to do? Buy the slower 7600 that had the same VRAM as 4060 and often cost similar in many parts of the world? Or trust people not to scam them as they sell second hand parts at the same price as new?
Actually no I don't, I want there to be three competitors out for blood across all market price points rather than the one-and-a-half-and-what-the-fuck-Intel that we've had for two decades now.
The problem is not that an expensive halo card exists. The problem is that there's no runner up top of the line card anymore. Pretty much every historical generation had a high value high end GPU that was 10-20% off the top.
yep, agreed. I just checked and the 6800 Ultra was $500 and was 2004. So that really appears to be the breaking point when things started to jump above that marker.
As someone finally looking to upgrade from their 1660, I really wish Intel got their game up. The Battlemage stuff looks like a fantastic upgrade for the price, but there's no inventory and I'm frankly still skeptical about long term performance
I always find it hilarious that the 3070 constantly gets excluded from benchmark comparisons. It's like everyone forgot it existed and only compares against the 3080 or 3060, and I have no clue why.
Yes, Nvidia has lost their mind and wants everyone to buy the highly successful (1.1% market share across all 90-class models: 3090/3090 Ti/3090D/4090) product.
I usually look at the 4060 and see that even though it launched 4 years after my 5700 XT and cost more at launch than the 5700 XT, its raster performance is barely 10% higher!
The 4060 Ti 16GB doesn't have more shaders than the 8GB 4060 Ti, but it has an MSRP of a whopping $500!!! You could get OG 4070s for roughly the same price when it launched, and those were demolishing the 4060 Ti.
Maybe it was once, but the spec of that card doesn't make any sense for gaming. It does make sense for AI, where VRAM is more important than speed for many workloads, and I remember it was advertised, at least by MSI, specifically for running Stable Diffusion.
Can't find a copy of the ad, but here's a page about it that's still on their site:
RTX 3070 beat RTX 2080 Ti as well. NVIDIA keeps their gaming division on life support since all the $$$ can be made on AI; in case it fails they can go back to making gaming GPUs somewhat affordable.
90W in idle with multi monitor and 340W while gaming
It did not draw 340W in gaming as it has a 275W TDP. Idle multi-monitor draw, yeah, that apparently was hot trash, even worse than a 290 somehow, from the one review I quickly pulled up.
I remember it being more expensive, but regardless, the power draw difference was so huge that I remember going for the Nvidia 970 because I didn't want such a hot and loud card for a 10% performance difference.
I believe the R9 390 was less money than a 970 in North America. It also came with 8gb of VRAM vs the 3.5 + .5 that the 970 did. Electricity cost was also negligible and worked out to a few bucks a year (again, using NA electricity pricing).
Both cards were great, the 390 stayed "relevant" longer due to its VRAM and GCN architecture (AMD fine wine and all that jazz). For their generation, neither was the wrong answer.
A 12GB card simply won't be able to play those games, even at lower settings
While I agree that people really shouldn't be buying a 12GB card for $550 in 2025, this is just fear mongering. There is no way that games in 2027, assuming that's the PS6 launch date, would be unable to run on 12GB of VRAM.
Alan Wake 2 had mesh shaders that ran awful on like 7/8 year old cards and everyone kicked up a huge fuss. If a game is unable to launch without 16GB of VRAM, the devs would be criticised left right and centre.
People massively exaggerate how terrible it must be to turn textures to medium. The games which have extreme texture streaming issues or stuttering are in the minority. Cards like the 5700XT with 8GB of VRAM 6 years since launch can still give an ok gaming experience at 1080p, but people here will pretend it's some horrific trash.
The big issue with the 5070 is the price point at which it's offering 12GB of VRAM. If it were $350 it'd be a stellar piece of tech that would be recommended left, right and centre.
It's not just textures, though. RT and FG have added considerably to VRAM requirements and Nvidia has been incredibly stingy for the past decade in that regard.
The biggest factor in determining a card's longevity is its performance tier after clearing a minimum VRAM requirement. This isn't an R9 Fury X situation where the card was already choking on its VRAM at launch.
You're right about planned obsolescence. The question is always whether the performance of the card is going to make you turn down the settings before its VRAM issue rears its head. The 3080 clearly ran into VRAM issues long before it should have, whereas a 3060 would have to turn off settings long before its 12GB of VRAM would ever be an issue. The 5070 isn't as egregious as the 3080, but again it's ridiculous to suggest it'll be unusable in 2/3 years.
Kingdom Come 2 defaulted to 1080p and I played for like 60 hours without realizing I wasn't on my screen's 1440p resolution.
Considering my almost 4 year old 3060 Ti is still running all the good releases just fine at 1440p, I think even at $550 the majority of buyers would be happy with the purchase. The problem is that that price and stock probably won't show up for a long while.
One can only hope that we get a Super refresh of the 5000 line up soon after AMD's offerings.
Yeah, while I agree that 12GB for any card above $400 at this point is a kick in the nuts, for 1440p and lower, 12GB with some settings tweaks will definitely still be quite usable for a while yet without too much compromise.
Who are these idiots? I only visit this sub for hardware info and a few YouTube channels. Most people have asserted that the only way the RTX 5070 could achieve RTX 4090 performance is through "fake frames".
Going to this sub is highly discouraged for me. It makes me angry and sad. We are living in an age where Moore's law is dead, so one should not expect dramatic improvements in CPUs and GPUs. I expect nothing spectacular from AMD in the next few weeks. I thought their launch was today, which is why I am here right now.
There will be few self-driving cars and robot butlers because hardware improvement has stalled. All that stuff is just futuristic crap in the desolate environment of stagnant hardware.
Hardware is not holding back self-driving or robotics. We have sufficiently powerful and efficient compute solutions for those tasks already. Sensors, algorithms and regulations (safety concerns and public sentiment) are the main factors that slow development in these fields.
It seems that the cost of compute was a major issue for the now defunct autonomous driving company, Cruise.
Technology issues
But the biggest cash burn to hit GM was Cruise's pursuit of processing power that autonomous technology requires, Murphy said.
"The capital investment is coming not necessarily from a fleet but the system itself," Murphy said. "If you look at what Tesla is doing with their 50,000 H-100s from Nvidia ... that's something that will be difficult for other companies, particularly companies like GM, to get an adequate return on investment in the short run, which is what GM investors traditionally are looking for."
The H-100 graphics processing unit is a computer chip used to train machine learning models like OpenAI. To train artificial intelligence models, a lot of processing power in a data center is critical, according to Sam Abuelsamid, vice president of market research at Telemetry Insights. Still, it's only useful if the system is truly independent of human programming intervention. The next generation produced by artificial intelligence software company Nvidia is the B-200, which is more adept than its earlier counterpart, but even more expensive.
"If you are looking to do any sort of AI training, the H-100 is the card to have. It's very powerful ... but it also consumes a lot of power," Abuelsamid said.
flashbacks to the few days you could find heavily discounted second hand 2080 and 2080Ti before everyone realised 3000 series would not be in stock anywhere for months.
actually no. the 1060 3gb has 32% of the cuda cores of the 1080ti. the 5070 only has 28% of the cuda cores of the 5090. So it's worse than the bottom 60 tier card from the pascal era in comparison. Oof
then they would probably match closer. the titan xp was only very slightly faster than the 1080ti. I am a bit too lazy to put the numbers in the calculator. It's only 250 cuda cores of a difference
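For reference, a quick back-of-the-envelope check of those ratios. This is only a sketch using the commonly cited CUDA core counts for these cards (the counts themselves are my assumption, not something quoted in the thread):

```python
# Commonly cited CUDA core counts for each card (assumed, not from the thread)
cards = {
    "GTX 1060 3GB": 1152, "GTX 1080 Ti": 3584, "Titan Xp": 3840,
    "RTX 5070": 6144,     "RTX 5090": 21760,
}

# Share of the era's top consumer card that the "mainstream" card gets
print(cards["GTX 1060 3GB"] / cards["GTX 1080 Ti"])  # ~0.32 vs the 1080 Ti
print(cards["GTX 1060 3GB"] / cards["Titan Xp"])     # ~0.30 vs the full Titan Xp die
print(cards["RTX 5070"] / cards["RTX 5090"])         # ~0.28 vs the 5090
```

Under those assumed counts, the 5070 does indeed end up with a smaller slice of the flagship than even the 1060 3GB had.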
Sure. Die size is what actually matters when it comes to cost though. When people are saying a card is a 60 class card, they're saying they want it sold for 60 class prices.
We're ~2 years away from VRAM demands skyrocketing due to next gen consoles, anyone with more than two brain cells should avoid 12G or less like the plague at this price
Microsoft's leaked plans have their next console listed for 2026, and those plans have been correct for the most part so far. Further leaks also have games like the 2026 Call of Duty being tested on new Xbox hardware.
Sony would likely follow suit, although a year behind, similar to the 360 and PS3. At worst, with a delay on Microsoft's part, new-gen consoles will likely be here by late 2027.
But I suppose it's possible that Microsoft will be eager to hit the reset button. I had sorta expected them to wait for technology to become available to miniaturize the Series S into a handheld for an extended cross-gen with a new machine, but I wouldn't be surprised if they just moved on.
What makes you think new consoles will have more VRAM? They will be using the exact same 2GB modules they used before, because newer modules don't exist (3GB modules only started being manufactured this year).
Bro, you're living in opposite land. UE5 is literally the one engine that doesn't use insane amounts of VRAM for the visuals. Look at gameplay of any UE5 game or benchmarks and you'll see that they barely use 8GB even at 4K max settings.
Not all games are crossgen though, and even so, on PC even crossgen games are more demanding despite having to be optimised for old hardware. Look at that new god of war game
At $549 it should have been 16GB to have some chance against the 9070 series, even if slower. Aside from DLSS and brand loyalty, low-segment users on a 2060/3060/4060 who want to up the ante should go for the 9070/9070 XT.
The 5060 Ti seems like the sweet spot for me personally, and has been for all my previous GPU purchases every 4-5 years; the xx70 models have always had gains over the xx60 that were too small for the price to be worth it. The other benefit is that I've had my Corsair 550W PSU for almost 13 years and hopefully can run it for 2 more without issue.
I remember when the specs were announced and every comment speculating that this thing wouldn't be faster than a 4070 Super was met with an "it's a different architecture" reply, as if a different architecture has mattered in the last 10 years of GPUs lol; more cores, more clocks, more bandwidth is just more better in GPUs, and architecture does very little to alleviate deficits in these areas.
For what it's worth, a different architecture seems to be making a big difference in RDNA4, but once they finish adding instructions to use on the second SIMD, it's not going to happen again, because VLIW-3 almost certainly won't be used enough to be worth having.
RDNA 3 was a special case with its chiplet design: a chiplet GPU is worse than the exact same GPU on a monolithic die, but cheaper to make. RDNA 4 is going back to monolithic, and the 9070 XT has 32% faster clocks and 12% more bandwidth than the 7900 GRE, so with that in mind it makes sense they're achieving 42% better performance than the 7900 GRE with 20% fewer cores.
There's most definitely an architectural improvement, it's just tiny most of the time and quickly demolished the moment you start cutting physical specs without compensating in other physical specs.
I don't think we are talking about the same thing.
The 9070 XT has only 22% higher base clocks with 20% fewer SPs than the 7900 GRE. This shows up in AMD's rating of 48.7 TFLOPS FP32 for the 9070 XT vs 46 TFLOPS FP32 for the 7900 GRE.
No matter how you look at it, there has been a MASSIVE architectural improvement to get 42% better real-world performance with just a 6% increase in theoretical performance.
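To spell out where those TFLOPS figures come from, here's a rough sketch assuming AMD's published shader counts and dual-issue FP32 (2 ops per FMA × 2-wide issue); the boost clocks plugged in below are my assumption, back-solved to match the quoted ratings, not numbers from the thread:

```python
def peak_fp32_tflops(shaders, boost_ghz):
    # 2 ops per FMA x 2-wide dual-issue FP32 per shader per clock
    return 2 * 2 * shaders * boost_ghz / 1000

tflops_9070xt  = peak_fp32_tflops(4096, 2.97)   # ~48.7 TFLOPS (RX 9070 XT, assumed boost)
tflops_7900gre = peak_fp32_tflops(5120, 2.245)  # ~46.0 TFLOPS (RX 7900 GRE, assumed boost)
print(tflops_9070xt, tflops_7900gre)
print(f"paper gain: ~{(tflops_9070xt / tflops_7900gre - 1) * 100:.0f}%")  # ~6% vs ~42% measured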
This review doesn't even have a 6950 XT, so where is that info from? TPU has a 6900 XT in their chart with the 5070 beating it by ~17% or so, and their main chart has the 6950 XT 8% ahead of the 6900 XT, so it still doesn't seem like it beats it.
If only TPU had a separate RT chart in their reviews... Oh wait, they do, and the relative performance chart isn't RT: "This page shows a combined performance summary of all the game tests on previous pages", as you can clearly see by looking at, say, the 3080 vs the 6800 XT. Even if you think some RT snuck into that graph, you can just go check every individual raster game yourself to see that RDNA2 aged more like vinegar than fine wine.
Seems like it aged fine, considering the gap between the 6800 XT and 3080 is still like 3%. Something's weird with the 6900 XT in particular; I remember at launch it was 10% faster than the 6800 XT, and on TPU it's only like 5% now. Also, on GN's raster charts the 6950 XT is within 3% of the 5070.
Completely wrong, TPU's relative performance includes ALL tests. If you even bothered to open the page you'd see a big disclaimer at the top "This page shows a combined performance summary of all the game tests on previous pages, broken down by resolution." That includes raytracing. The raytracing page breaks out RT specifically.
Also, hilariously enough, in the linked GN charts the 6950 XT is within 3% of the 5070 FE in every raster benchmark, so his claim that his own card beats the 5070 is easily plausible when you consider factory OC models.
There is a fair bit of variance between the reviews for sure, even in the same games; for example, TPU has the 5070 kinda high vs the 4070S compared to what HUB has. I guess the meta-review post, once it's made, will average out all the main reviews to see how it stacks up and what the "real" average is.
Also, game-by-game variance between different architectures can be quite large, so I'm sure there are many games where a 6950 XT will beat it, like CoD, as it's very AMD-favoured in general.
I wound up selling my buddy my OC formula 6900xt ahead of this launch cycle, but honestly was pretty happy with it, I was probably planning on the 9070xt anyway which should be a decent upgrade.
But yeah, both impressive and sad that it's aged as well as it has honestly
What is even going on in those power tests? A 5080 drawing less than a 4080 Super is a bit weird, but sure, I can believe that, as the 5080 isn't in Space Marine (for some reason...). However, how tf does a 3080 draw 100W more than a 5080 in Starfield? It's only a 320W card stock...
I get that Starfield is not the highest power draw, but ~220W (estimating, since a 3080 riding its power limit stock makes sense in most games) for a stock 5080 sounds really low. Also the 5070 Ti is the same, and somehow draws more in Outlaws than a 5080, and the 4080/S also draws more than a 5080; like, wtf is going on there.
Most games don't use the entire power budget of the GPU, to be fair. I usually only see barely 300W on my 5080, and that's overclocked. And the 5080 does have the best performance per watt of any GPU, beating out the 4080S by 10%.
But the graph isn't performance per watt, it's just power draw, and it seemingly isn't CPU bound looking at the performance graphs. TPU's gaming average is 325W, and GN has the 5080 consistently drawing more than a 4080 and a 5070 Ti in their games.
Yes, Starfield is a lower power draw game, so a slightly bigger delta than Outlaws makes sense, but 100W is a lot, as it includes the CPU pushing 30 more fps as well; even Outlaws has a 60W difference including the CPU, which again is probably drawing more, but we have no idea because there is just a combined number. One possibility is that the 3080 isn't a stock/FE card and is something like a Suprim X or Strix with a 370W default limit, but that kinda defeats the point of using a "stock" card, doesn't it, and it isn't communicated.
Also, this isn't the first time HUB has had weird power graphs. Back when they used to cover full system draw, some of their CPU reviews produced very weird power results where there was clearly something going on with GPU draw/power states rather than CPU draw differences, as a 7600X doesn't just randomly increase power draw by 50-100W while an Intel CPU system doesn't increase at all, or only by like 20W.
Or, in the other video, a 13700K drawing nearly 200W more in a game than a 7800X3D, yet somehow only a 34W difference in another one; there's no way either of those is just CPU power draw making the difference. Even their own Blender test shows a max 184W total system draw difference, and only 89W for the 13600K vs 7600X, so a full CPU render test is somehow less demanding than games? Yeah, no.
Which, while very interesting info (the AMD system somehow managed to keep the GPU in a lower power state when it's not fully utilized and the Intel system couldn't), isn't communicated at all, and again would've been solved by just including some software power readings for the GPU and CPU alongside, to make it easier to understand the figures and differences.
Could be marketing. Since they claimed they weren't aiming for high-end this gen, making it a X070 instead of X080 keeps expectations lower. Also lets them go up against Nvidia's lower end with more like-for-like name schemes.
Worth remembering it's total system power in those charts. A higher performance card will also have the CPU working harder, so all of that extra power comes from multiple sources.
True, but still, I wouldn't consider a 220W TBP card using more power than a 360W TBP 5080, for probably less performance, good by any stretch of the imagination. Unless Starfield is some weird outlier where the 9070 non-XT will outperform the 5080. I do think Starfield is generally nice to AMD cards though. But in Outlaws the difference between the non-XT 9070 and the 5080 also seems to be only 40W, and in Space Marine it again uses more power than the 5080. Somewhere I do hope these power numbers are just plain wrong.
I am personally very curious how many watts that last 2-3% of performance costs, as you can usually drop the power draw by quite a lot while losing under 5% performance. I value the PC being silent and cool more than 2-3 extra fps.
I mean yeah there's SOME cases where I can get only small performance losses like 3% but it's certainly not the average haha. I'm fairly certain that would be the case for your undervolt too.
Mind telling me why your cyberpunk benchmarks are saying "NVIDIA GeForce RTX 4090" with an unknown driver version?
And besides the inconsistencies in the pictures you've tried to show me, like I said before, there are definitely cases where you can see a small drop in performance. I myself see a small drop like that in the cyberpunk pre-made benchmark. But no, it's not consistent across the board for me, and it wouldn't be for you either.
You're dropping your clocks over 250mhz, that's not free, some games just show that harder than others will.
It needs to spoof as an nvidia card for the in game fg to work
As for your comment regarding some games taking lower clocks harder than others, Cyberpunk is pretty much the hardest hit when lowering clocks out of all the games I have.
I'm asking you questions about this because you're the first person I've seen claiming the things you are. You're giving yourself a huge clockspeed drop while claiming it hardly affects performance across the board; that's just not true.
You're just lying, and you don't want to admit it, that's fine.
Since repasting my card with PTM7950 I haven't had to use the insane undervolt, my Pulse model stays really cool even at 25% fan speeds with an overclock, but here's some of those settings:
1900mhz, 980mv, -26% Power Limit to top out around 130w under load and losing 10% performance at most, sometime less depending on the game.
2100mhz, 1120mv, +10% Power Limit is what I've been running lately just to eek out that extra bit of performance, but honestly I switch between the two depending on if I want the card to run super cool or not.
And of course, your mileage may vary depending on your card.
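As a very rough sanity check on why an undervolt like that saves so much power for only a ~10% clock cut, here's a first-order CMOS dynamic-power estimate (P roughly scales with frequency × voltage squared). It ignores static power and real boost behaviour, so treat it as a ballpark sketch, not a measurement:

```python
def relative_dynamic_power(freq_mhz, volt_mv, ref_freq_mhz, ref_volt_mv):
    # First-order CMOS model: dynamic power scales with frequency x voltage^2
    return (freq_mhz / ref_freq_mhz) * (volt_mv / ref_volt_mv) ** 2

# Undervolt profile (1900 MHz @ 980 mV) vs the 2100 MHz @ 1120 mV profile above
ratio = relative_dynamic_power(1900, 980, 2100, 1120)
print(f"~{(1 - ratio) * 100:.0f}% lower dynamic power for ~10% lower clocks")  # ~31%
```

Which lines up loosely with the ~130W-under-load figure quoted above, at least for the dynamic part of the power draw.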
Higher than I expected, but being about equal to a 3080, it's at least not too crazy high. I wouldn't need to buy a new PSU, whereas for the 7900 XTX or any 90-class card I probably would. Not ideal, but not a deal breaker.
I'll admit I was a little surprised at how high the power draw was, but I don't watch enough reviews or videos to know if that included the full PC draw or not.
It includes the CPU too (the title of the slide is PCIE + EPS). PCIE is for the GPU, EPS is for the CPU. At least from the total power draw, seems like they're using a 145-ish watt CPU.
I remember when the xx70 was something worth saving up for, and while not enthusiast level, it was at least a very respectable card that would last you years... Now it seems that the 5070 is the poor man's entry-level card and everything below that is utter garbage.
Man, I'm not sure if I survived covid and this is actually gamers' hell...
This has got to be the worst Nvidia generation I have seen in my history of PC gaming. We went from 3070 = 2080 Ti to 5070 = 4070 Super.
Just imagine if the 3070 had only been equal to the 2070S back in 2020?! It would have been shredded in reviews...
To me this GPU is so disappointing that if I were Steve from HUB I would have gone to the top of the Burj Khalifa itself to express how disappointed I am with this GPU.
I really hope AMD RDNA 4 knocks some sense into Nvidia this generation, that is, if AMD is able to sell the RX 9070 / 9070 XT at MSRP of course.
Turing was ass. But even Turing's 2070 (not Super) was around 12% faster than the 1080. The fact it was $100 over the 1070 and didn't touch the 1080 Ti (the Super addressed this) disappointed many people.
Now compare it to Blackwell. It is so much worse. If the 5070 Ti was named 5070 and sold for the 5070's price, we still wouldn't get the price/perf of the infamous Turing. And at least Turing had the excuse of bolting on Tensor and RT cores.
Oh for sure. Turing was the most disappointing product launch in a while from Nvidia (and Kepler), but Blackwell is leagues worse than both of them. It's Nvidia straight up sticking their thumb in the consumer's eye.
As dumb as consumers are, I don't think anyone is going up to the GPU stand looking at the price of each item side by side and thinking that the 3x lower price product is going to be on par or better.
Hot take, but the leadership behind the marketing teams don't know what they are selling. They aren't gamers, so they don't see an issue with putting FG metrics up against non-FG ones.
They see big number and say "wow, everyone will love this" but they aren't gamers so while yes, FG does make the number bigger, it's not always better.
At this point, I just want all reviewers to stop calling them the advertised names, and refer to them as 1 full tier lower.
Recommendation stays the same, but now based on price increase over previous gen.
Instead of the worst 70-series ever, it's now "A great 60-series, but at the worst price ever: GeForce RTX 5060 Review."
5070ti becomes 5060ti.
5080 becomes 5070ti.
5090 becomes 5080.
Then reviewers can call this the worst price-to-performance generation, but still truthfully claim that Nvidia maintained its generational performance gains.
This is why I don't upgrade unless there is a node shrink. A lot of the performance upgrade we get between GPU generations comes from improvements in process nodes; both the 4000/5000 series use TSMC's N4P, so I'm expecting a decent increase once they move to N3 or N2.
I felt like a sucker buying a 4070 Super ~4 weeks ago for £465, as I could have splashed a little more for a +20% uplift in performance and better efficiency on a 5070.
They are now ~£600 and rising. I'll almost certainly be buying a 9070 XT if the reviews hold up.
The RX 400 and 500 refresh were also money generations for AMD. They were a very good option for the midrange and cost AMD basically nothing to make. I remember them being everywhere.
At the very least, Nvidia should've increased die sizes across the board, yet they did the opposite. The GB205 is actually slightly smaller than AD104 (263mm² vs. 294mm²) with 10 fewer SMs (50 vs. 60) and roughly ~5Bn fewer transistors.
The sad thing is, people will be buying this card for ~$800 in the coming days.
Only thing for me is I expected it to be worse than the 4070 Super, but it isn't, so yeah. It does draw more power than the 4070 Super though... using TPU data.
Nvidia is now making the xx70 series very cut down, just like the bad xx60 series: no performance increases, just matching the previous generation's xx70/xx60.
Ishamaelr@reddit
Upgrading from a 1070 (I know....), is it worth getting a 5070 or should I just get the 4070? I live in Canada so the 4070 is $800 or so; I don't want to spend too much more than that if I can help it. Do you think it's worth waiting to see how the 5060 will be? (I originally was going to grab the 4060, but with it only having 8GB of VRAM I decided to spend some more for longevity.) Or what about the Radeon RX 9070?
Riconas@reddit
Don't feel bad; I'm also rocking a 1070 GTX (it was a hand-me-down after my sis and bro-in-law built new rigs), so really ANYTHING is an upgrade for me; but I'm basically in the same boat: 40 series is over $1K here, I want a ROG 5070, but they haven't dropped yet, so am thinking about the 5060. I almost got a Prime 5070, but they sold out right before I got paid, which I took as a sign to wait.
Swimming_Ad_3664@reddit
I got an MSI RTX 5070 from Best Buy for MSRP; it's an awesome upgrade from an RTX 3050 and I'm really enjoying it.
AMD would have had me if I had been able to get a 9070 XT card for the original MSRP.
tmchn@reddit
A 4070 super that costs more than a 4070 super and isn't available. Nice
NilRecurring@reddit
Well, I just checked and in Germany the 4070 Super's price went up from ~630€ to 830€ today, so it has that going for it.
Daki399@reddit
Well, if people are not complete dummies they will just go for the 9070 / XT instead of keeping on buying NVIDIA in this budget lol.
NilRecurring@reddit
How are y'all so sure vendors won't upsell the 9070 (XT) to Nvidia minus 50, or maybe 75 if they want to throw us a bone?
I think I'll just look for a new hobby. I can get pretty nice road bikes for 900€ second hand and spring is coming.
IguassuIronman@reddit
It's not like a brand new late model GPU is a necessity to play games
Sad_Animal_134@reddit
Have you seen Monster Hunter Wilds? New games are getting worse and worse in optimization. Somehow we've gone backwards in performance in recent years without even improving graphics.
SenorPoptarts@reddit
LOL. As a cyclist, I can promise you that it will not be a cheaper hobby when all's said and done.
boringestnickname@reddit
Man, the prices for bikes in general where I live have gone up at an absolutely ridiculous rate.
Ten years ago you could get gold second hand for next to nothing. I bought one that had been a spare in the TdF (like a spare of a spare, mind you) for $600. The rims alone, new, were easily $1k.
My aging electric was $400 when it last went out of stock, in 2014. Getting a similar bike today is close to $2k.
Doing much of anything in 2025 is expensive.
jhoosi@reddit
Yeah, and arguably cycling has an even worse perf-to-dollar diminishing returns curve than PC gaming lol
Strazdas1@reddit
The majority of hobbies have worse perf-to-dollar returns compared to PC gaming. Even at those prices, PC gaming is cheap in comparison.
jhoosi@reddit
Yeah, golf is another good example. It's not the clubs, it's the golfer lol
Strazdas1@reddit
Yeah. And even then, you won't be playing golf 4 hours per day every day.
JelloNo4699@reddit
I bought my mountain bike in 2009 for $600 and it still works great. My $600 laptop from 2009 is not doing so great performance-wise.
Strazdas1@reddit
Unless you missed a zero there, no you didn't.
jhoosi@reddit
Right, what I meant is that after a certain price point, spending money on more expensive bikes doesn't do much besides give you bragging rights. For casual use, we're way past the point of diminishing returns and we hit that point many years ago, if not over a decade ago.
PaulTheMerc@reddit
Great framerates though.
Jeep-Eep@reddit
Yeah, and a well-cached GPU will probably last longer before needing significant money spent on keeping it working.
Less dangerous too, if you avoid Big Blackwell.
Berengal@reddit
HUB said on their podcast episode about the pricing that when they talked with retailers they said AMD cards didn't start to sell until they were 20% below the equivalent NVidia card, and didn't really start to move until it was 30% below. They're not at 30% right out the gate so I guess they're hoping that projecting more confidence in their features and drivers, (and reviewers agreeing with them on those points) will make up some of the difference, but they still need a decent discount relative to NVidia to get the ball rolling.
only_r3ad_the_titl3@reddit
"AMD cards didn't start to sell until they were 20% below the equivalent NVidia card, and didn'tĀ reallyĀ start to move until it was 30% below" - yeaj but hub only focuses on rt and vram. This narrative does not work if you factor in RT and DLSS
Berengal@reddit
Is HUB a trigger word for you or something? Because that objection doesn't make any sense.
bubblesort33@reddit
Do that... but let's be honest. You won't.
Aerroon@reddit
The irony here is that bikes are in the exact same situation. They are way more expensive than they ought to be, but because it's an enthusiast market customers will pay premium prices.
Ilktye@reddit
Uh huh.
And then you can add shoes, clothes, a helmet and all the other stuff on top of it.
popop143@reddit
Because historically at least, AMD hasn't upsold much because nobody except hardcore gamers consider AMD. Nvidia can do this because everybody and their mothers buy their cards even with all the bad reviews.
BaconatedGrapefruit@reddit
We said the same thing about the 4060. People bought that shit in droves. Even with the 7700xt RIGHT there.
PC price/performance literacy went out the window years ago. It's just console war, brand loyalty bullshit now. Nvidia knows this and is cashing in, while calling us pay piggies, and telling us to eat our slop.
tmchn@reddit
The 7700 XT (at least in Europe) was much more expensive than the 4060. And the 7700 XT was basically a rebranded 6700 XT. AMD is no saint either.
pmth@reddit
I wouldn't call a 20-25% uplift a "rebrand" but you do you.
amazingspiderlesbian@reddit
They picked the only one that wasn't a rebrand to call a rebrand. The 7700xt was about 15% faster than the 6750xt.
The 7800 XT and 7600 XT were the rebrands, being about 2% and 4% faster on average than the 6800 XT and 6650 XT respectively. Like this 5070.
Maroonboy1@reddit
7800xt was the 6800 successor not the 6800xt. Many are making this mistake.
amazingspiderlesbian@reddit
Blame AMD for naming it that way then. They decided that, in name, it was the 6800 XT's successor, not the 6800's. Even if they dropped the price a bit, the consumer now just thinks that's where the 800 XT line is.
Maroonboy1@reddit
Yes 100%. It was confusing, but hopefully they have learnt.
only_r3ad_the_titl3@reddit
Yeah no, you can believe AMD's marketing BS but I won't.
Maroonboy1@reddit
Or you could just look at the specs...you'll see the natural progression is from the 6800 not 6800xt.
BaconatedGrapefruit@reddit
Can't speak for the EU, but in NA, I remember there being a $50-150 price gap depending on what model you were able to snag. More if you found a 6700xt. It was such a trap card over here.
only_r3ad_the_titl3@reddit
Well, the 7700 XT is 37% faster according to TPU, so 1.37 × $300 ≈ $410, meaning $450 was far too high; and that is just in raster and doesn't consider that DLSS is usable, unlike FSR.
AMD fans really love to misrepresent stuff so they can satisfy their superiority complex.
deefop@reddit
What? The 7700xt was about as fast as the 6800, with better rt. It was significantly faster than the 6700xt.
Squery7@reddit
To compete, AMD needed to catch up on ML upscaling and ray tracing (since it's becoming more of a baseline). If they deliver, then I think the market will consider them more, if the price is competitive this time.
anaemic@reddit
I would love to jump ship and buy a 7900 XTX because it's the only "affordable" card offering more than 16GB of VRAM, but then I would shoot myself in the foot by not having CUDA / proper machine learning support.
Aerroon@reddit
The problem is that Nvidia cards have more features. It's a lot easier to justify buying something that has more features than something that has less.
I don't do a ton of AI stuff, but when I feel like it I want to be able to do that. With AMD it seems like a coin flip whether it will work or not.
signed7@reddit
Yep. The 9070 (XT) will have no DLSS4, worse RT, no Reflex, no RTX Video, no RTX HDR, no Nvidia Broadcast, no ComfyUI Windows support (and iffy support for most AI stuff in general).
Whether that's worth it given the ~£200 discount (let's hope for good street prices on Thursday...) vs the similar-perf 5070 Ti depends on your use case.
MadBullBen@reddit
FSR4, we can't say yet how good it is compared to DLSS4. Unlikely to match it, but we shall see; personally I know I won't be able to tell the difference. DLAA, they mentioned in the video that they are supporting it. True about RT, it's most likely still not quite there yet, especially in path tracing, but we shall see.
ComfyUI is just an AI digital art generator, isn't it? Having a quick look at it, it appears that it does work with AMD, not sure how good it is though. Running LLMs is easy on AMD and works just as well; I'm currently running DeepSeek 14B on my RX 6900.
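For what it's worth, here's a minimal sketch of the kind of local LLM setup that runs fine on AMD cards, assuming you're using Ollama's ROCm build and its Python client; the `deepseek-r1:14b` model tag is Ollama's 14B DeepSeek-R1 distill and may not be the exact build the commenter is running:

```python
# pip install ollama  -- talks to a locally running Ollama server (ROCm build for AMD GPUs)
import ollama

# Ask the locally hosted 14B model a question; swap in whatever model tag you actually pulled
response = ollama.chat(
    model="deepseek-r1:14b",
    messages=[{"role": "user", "content": "Why does VRAM capacity matter for local LLM inference?"}],
)
print(response["message"]["content"])
```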
on1zukka@reddit
4060 is significantly cheaper than 7600xt 16gb here, 7600 8gb is cheapest. 7700xt is there to make you buy 4060ti 16gb, AMD basically works like a decoy product with a few exceptions.
Strazdas1@reddit
The 7700 XT was an objectively worse product at a higher price. Why would you buy it?
Cheap-Plane2796@reddit
4060 was in all the prebuilt pcs. The low power req also meant it was easy to pair with a low end psu in prebuilt and older pcs. 4070 was the easy choice over it instead of any amd equivalent.
Also 4060s were actually in stock
Then amd had no dlss alternative, no framegen, zero rt performance.
Even with worse value for raster it was hard to pick amd.
With blackwell things are very different. The power efficiency gap is smaller, prices are way higher, availability is zero, 4000 series cards have become hard to find and are being scalped too, amd seems to actually have some features now (hardware accelerated upscaling) and usable rt performance.
The reason amd didnt sell for the past years is because they were terrible, and a bad value nvidia card like the 4060 doesnt change that
puffz0r@reddit
Imagine thinking you can do good RT on a 4060 lol
only_r3ad_the_titl3@reddit
Yeah, let's compare a 300 USD product to a 450 USD product, that's not misleading at all.
DigitalShrapnel@reddit
True, but AMD overpriced the 7700xt at launch (classic move). If it was 399 from the get go, who knows how well it would have sold.
LowerLavishness4674@reddit
The 7700XT is literally more expensive than the 4060Ti 8GB. That is not a valid comparison at all.
BaconatedGrapefruit@reddit
This was during the launch year. A lot of it had to do with how jacked gpu prices were. There have since been official price cuts and things started selling at MSRP when the super series launched.
For the record, just doing a cursory look, it's next to impossible to find a decently priced 4060 Ti in stock (at least in Canada).
Strazdas1@reddit
There is currently no evidence that 9070 is any better.
tmchn@reddit
Problem is, i'm not sure that there will be enough RX9000 to satisfy the demand. Scalpers and bots will have a field day
Bad time for GPU buyers
bubblesort33@reddit
I don't think scalpers are that interested in AMD, because they aren't as easy to sell as Nvidia. some are already regretting buying an RTX 5000 series to resell.
Working-Confusion445@reddit
I truly hope you are very wrong
TophxSmash@reddit
they can't. 80% of the market doesnt exist. AMD cant fill that void on a whim.
Working-Confusion445@reddit
You are probably right. nGreedia effed us all up. We have to hope AMD tripled their production from what sales were expected to be.
Jeep-Eep@reddit
RDNA 4 is strongly optimized and makes significant technical sacrifices to make it manufacturable en masse, so there's that for it.
FrewdWoad@reddit
Not at launch.
They could have 100 times the 5000 series stock and still sell out.
The question is whether they will keep restocking long enough, and in sufficient volume, that the MSRP eventually becomes a real price people can buy at...
Acrobatic_Age6937@reddit
hint: it won't be available at msrp.
jocnews@reddit
challenge accepted, probably
willyolio@reddit
That's tough to bet against
SituationSoap@reddit
If the 90 series is good, it won't be available for anything like MSRP. If it's available at MSRP, it's not going to be good. This isn't a reason to buy the 5070, but it's worth noting that everyone is saying "Just go to the 9070" but that's not suddenly going to offer any better value prop.
snowflakepatrol99@reddit
If people are not complete dumbasses they'd buy 30 or 40 series used. Unless you are in the US and have normal pricing then you're an idiot for buying any card from the new generation. 9070xt is listed for 1k euro in EU. How is that ever the right purchase when you can get 4070 ti for 600 and you'd even have some warranty left.
Still hoping to see good stuff from the reviewers but unless prices drop to what they are supposed to be then neither nvidia or AMD deserves to be bought.
ThatBusch@reddit
And we can stop here
shangriLaaaaaaa@reddit
The 9070 XT is also going for 30-40% more than the release price in India, and that's from official resellers; they are all looting customers in the name of FOMO.
S1egwardZwiebelbrudi@reddit
It's like you learned nothing from this video: wait for the fucking benchmarks. You make this sound like it is the better card at a better price; dude, it's not being sold yet and nobody has released reliable third party benchmarks... let's preorder, right?
Olde94@reddit
It was 630€?? What? Best I could find in Denmark was 680€.
AlkaKr@reddit
Oh, wow. You weren't joking. I just checked mine as well here in Greece. Bought it for less than 600 in July and now they all cost upwards of ~800, lol.
FrewdWoad@reddit
All 40-series models above the 4060 had their cheaper variants sell out when the 5000 series reviews lifted and it became undeniable the new cards were literally worse.
Bad_Demon@reddit
Old hardware always goes up in price; it's so you buy the new stuff. I don't like it but that isn't new.
AlkaKr@reddit
The new stuff literally does not exist in stores though. Even if you wanted to, you couldn't buy it ESPECIALLY on the advertised price.
Bad_Demon@reddit
Does that somehow change what I'm saying?
AlkaKr@reddit
No, but it was irrelevant in the first place.
skinlo@reddit
That's not true at all, usually old hardware went down in price and vendors tried to clear old stock.
Bad_Demon@reddit
Except that there are threads about this exact thing going back years. And last gen is still as good as this gen so even more reason to go up.
Vb_33@reddit
That's because there's no stock of non-60-class 40 series cards.
Regular_Tomorrow6192@reddit
NVIDIA could have just kept producing the 4000 series and it would be better than this launch.
tmchn@reddit
Yeah, I don't get why they'd launch a new series with no gains and no stock.
Just keep producing the old ones, they were still selling like hot cakes
bogglingsnog@reddit
The wheels of communism must turn. Don't want to get consumers comfy with their perfectly functional cards, they must always want the one with the bigger model number.
Also, Nvidia employees need job security.
AlkaKr@reddit
The entirety of this launch could have been a driver and software update going from FG to MFG, but they chose to lock it behind hardware, and hopefully it backfires.
Next gen I'm going full AMD. Fck this. DLSS and FG aren't THAT much better than AMD's at the moment to warrant the situation where we are all getting fcked by Nvidia.
SolaceInScrutiny@reddit
Actually DLSS is so much better that I won't even consider AMD until they improve their version drastically and it gets better game support.
nanonan@reddit
Is it worth burning for?
Acrobatic_Age6937@reddit
Well it does have DP2.1, which means 4k without DSC
tuvok86@reddit
Because they can make this cheaper in the long run, and get a sales spike from the uninformed people.
SicnarfRaxifras@reddit
Yeah, because the informed people, if we just go off the number of YouTube views, are a pretty small % compared to overall sales, so clearly most people are just "bigger number better" buyers.
Not_Yet_Italian_1990@reddit
Yup. Same performance + smaller die= larger margins.
It should also mean more availability... but were the 4070 Supers really that scarce when they were making them?
BenFoldsFourLoko@reddit
4070 Supers were highly available almost the whole time they existed
Sometimes 10-20 dollars above MSRP, sometimes 10-20 dollars below MSRP
Vb_33@reddit
There was an article recently saying Nvidia lost tons of wafers due to the Taiwan earthquake.
Ilktye@reddit
Because its more lucrative to sell the chips for AI data centers.
TophxSmash@reddit
no point making 40 series, its on the same node.
05032-MendicantBias@reddit
All wafers were used to make enterprise cards until recently.
noiserr@reddit
This is genuinely a first, I think, in the history of GPUs. It's basically Nvidia's Rocket Lake.
b0wz3rM41n@reddit
In a way, this whole Blackwell shitshow isn't surprising at all and kinda makes sense when you realize that:
1 - It's on the same node (though prior gens that stayed on the same nodes had better gains)
2 - this is the first generation of NVIDIA GPUs whose development cycle mostly took place following the AI Boom, which is why most of the significant performance gains are in AI applications and not much else
TophxSmash@reddit
It's rumored it's backported too. It was supposed to be on 3nm.
YourUrNan@reddit
Glad I got a 4070 super 4 months ago and didn't wait for this mess.
sitefall@reddit
Yup. I replaced the GPU in my livingroom tv/gaming pc a little while after black Friday. Was looking for Black Friday sales, but there just weren't any, so I picked up a used 4080 super for $850 from r/hardware swap and immediately regretted it. That's just like 10% off MSRP, not really a deal, and it's "used".... I just didn't want to wait for the 5xxx series and everyone was saying "just wait" and we were all laughing at the msrp prices people were posting up their used 4080s and 4090s for.
turns out that was one of the best decisions I've ever made.
pmth@reddit
I had a 3070 that I was pretty happy with but stumbled upon an open box 4070 for around $400 about a year ago, snagged that and sold my 3070 for $300. I wasn't really looking to upgrade, it just made sense, and now I'm in a great spot because of it.
ITGuy420@reddit
Same. I grew tired of waiting in January and upgraded.
jangoagogo@reddit
Yeah I got one for $600 a month ago and had a couple people telling me it was stupid. Each week that has passed I'm more and more happy with my decision.
tmchn@reddit
I wanted to wait for the 5xxx and 9xxx reviews, then I found an open box 4070 Ti Super for just 611€. So glad I don't have to fight bots and scalpers to buy a new GPU.
Olde94@reddit
Yeah, I'm glad I picked up a 4070 Super after the 5080 dropped.
bloodem@reddit
Don't be an idiot! If you activate the multiframe feature, you'll be able to see the stock!
Working-Confusion445@reddit
When you say it like THAT... Now we know why nGreedia is so bad on the stock market. LoL
Capable-Silver-7436@reddit
bruh I didn't expect much, but for it to basically just be a 4070 Super, not even super duper, is just pathetic
Two_Shekels@reddit
PC community: "this is the worst 70 Series ever!"
Nvidia: "the worst 70 Series so far"
PhoBoChai@reddit
How much worse could it get?!
steve09089@reddit
6070 loses in every test case against the 5070 unless you use their new DLSS Super Generation which extrapolates frames.
ledfrisby@reddit
RTX 6070 is just a rebadged 3060 with 8x MFG, and you have to win in the Thunderdome for a chance to buy one of the only 10 made.
Aggrokid@reddit
Steve also mentioned that 5070 stock is nearly non-existent. Surely AMD won't miss this uncontested layup.
RandomCollection@reddit
Yep - I am astounded at the stock issues. We don't seem to have anything like the cryptocurrency boom and the 4nm is mature. I wonder what is causing the issue this time.
Then again, the GPU is not really "worth it". It's a more costly 4070 Super at this point.
chlamydia1@reddit
All their capacity is going towards producing data center cards. That is their primary money-maker.
Turtvaiz@reddit
Still doesn't explain why they would release this garbage at all if you can't even buy it
steve09089@reddit
They want to maintain their mindshare just in case AI ends up crashing.
If they completely exit the market, they essentially cede it to AMD and Intel, and if they want to ever re-enter the market, it will be pretty hard to do so.
unknownohyeah@reddit
Because R&D still costs the same even if you don't sell a single new card, and you're looking short term at stock when in 6 months it won't matter. Also the segmentation is purely artificial. They can release Supers of any card this generation at any time to boost sales when capacity frees up.
Puzzleheaded-Stand79@reddit
AI boom, LLMs and all that
DickInZipper69@reddit
Arguably biggest pc hardware store in Sweden basically said they don't think they can sell any on release or near future due to no stock.
Strazdas1@reddit
Well we seem to have plenty here in the baltics so its time for your viking raid.
STD209E@reddit
Fret not! Proshop, which serves all of Nordic, has five Asus Prime cards ready for launch.
szczszqweqwe@reddit
Sounds like 5090 situation, maybe that's what Jensen meant? "5070 will have 90 series availability"
TheNiebuhr@reddit
That's it. It'll have 4090 performance... selling performance.
szczszqweqwe@reddit
Perfect :)
Strazdas1@reddit
Is it going to be like the nonexistent 5070 Ti stock that's been selling at close to MSRP and not going out of stock here?
TophxSmash@reddit
They don't even have to try when 80% of the market doesn't exist. These things would sell out at $1000.
jspeed04@reddit
AMD.
SRVisGod24@reddit
The FE just got delayed too lmao
lokithetarnished@reddit
Where'd you see that?
SRVisGod24@reddit
NVIDIA says GeForce RTX 5070 Founders Edition won't be available at launch
lokithetarnished@reddit
Welp at least I get to see how the AMD launch goes before purchasing that now
CrzyJek@reddit
You should have waited regardless. Wtf.
lokithetarnished@reddit
Well with how the previous cards sold out it's not like waiting is an easy option. Especially with tariffs starting soon.
SRVisGod24@reddit
Let's be real, even if it wasn't delayed, stock was still gonna be horrid. So chances of getting one from BB would've been slim anyway
lokithetarnished@reddit
Doesn't Nvidia sell FE cards through their website as well?
SRVisGod24@reddit
If you're in the USA, no. Not yet anyway
Suspicious-Lunch-734@reddit
Are you talking about inet or proshop?
NeverForgetNGage@reddit
Narrator: AMD fell flat on its face 10 feet in front of the basket
Pugs-r-cool@reddit
Blah blah, opportunity, you know the rest
Aggravating-Dot132@reddit
You should check another Steve. He definitely didn't produce any hints.
Reggitor360@reddit
Unless it costs less than 349, DOA.
LuminanceGayming@reddit
$550 lmao
Reggitor360@reddit
Scam it is.
LuminanceGayming@reddit
please look up the definition of scam
mulletarian@reddit
A scam, or a confidence trick, is an attempt to defraud a person or group after first gaining their trust.
If the shoe fits...
leandoer2k3@reddit
A scam would be Nvidia selling you a box with 5070 written on it, but inside there is a 1050ti GPU.
If you have all the information about the product and do not like the performance gains of it after buying, you DID NOT get scammed, you were just a stupid consumer.
mulletarian@reddit
What if there's actually a 5060 inside that box
leandoer2k3@reddit
Reminds me of children crying when the other kid received a bigger present.
A what if won't change the stack. Buy it if you need the upgrade, don't if you don't, it's as simple as that.
Paper launches happen, it's nothing new. If you buy in to it expecting magic, you're the retard, not Nvidia being scammers lol..
mulletarian@reddit
If a car manufacturer announced their new GTR model with the expectation that it would have the same performance as a supercar from a few years back and it turns out it didn't actually have that performance, and they even put in a worse engine than they normally do for their GTR models, wouldn't people call that a scam?
leandoer2k3@reddit
Bad analogy with cars, because car manufacturers literally do this all the time and nobody cares LOL..
And no I wouldn't call it a scam because I wouldn't buy a car based off marketing material, neither do 99% of the population, and neither would a salesman at the dealership sell it to you if it was an actual scam..
They can say it's 50x faster than the predecessor flagship in X scenario, it doesn't matter, the specs clearly say what to expect. If you buy in to marketing, you're a schmuck. You get the same specs as on the box, which in the most literal sense - NOT a scam..
The product is for somebody, obviously not for you.
takuriku@reddit
4090 = 5070 is undeniably a scam
LuminanceGayming@reddit
yes. the price being $550 is just a bad deal though, not a scam.
puffz0r@reddit
*650 street price
nmkd@reddit
"DOA" as in "sold out at all times"?
Popular_Research6084@reddit
What a disappointing generation. I'm still using my 3080 FE from 2020, and I was really hoping to upgrade this year. I'm not spending $2k on a 5090, and sounds like the 5080 is similarly limited on VRAM and will also likely suffer from issues in the future.
boringestnickname@reddit
I'm also on the 3080. Would be happy to sit on it longer if it wasn't for the 10 GB of VRAM.
Hell, not sure when I'll ever upgrade at this rate.
I just want something OK for an OK price, you know?
The worst part of this generation is that it's so shit that I won't even consider getting it used when the next generation arrives. At least with the 4000-series, you could get lucky and get something used for a price that was less ridiculous.
Popular_Research6084@reddit
Honestly not really sure what to do. Monster Hunter Wilds honestly runs super inconsistently on my PC which is disappointing, but I haven't really had any issues running most games on higher settings.
Part of me hopes that they announce a 5080ti at some point this fall and that it's reasonably priced.
Part of me is considering giving AMD a shot. The 9070XT is getting pretty great reviews. It might be worth the upgrade if I can get it for MSRP ($599). There have been a lot of comparisons to the 5070 ti. It also has 16gb of VRAM, which isn't great, but better than my 3080 at 10GB.
boringestnickname@reddit
Yeah, the 9070XT looks like a killer buy at MSRP.
Latter_Gold_8873@reddit
Getting the 9070 XT seems more and more viable if stock isn't ass
GeneralChaz9@reddit
Similar situation. I thought about grabbing a 7900 XTX for the raw performance and tons of VRAM but the increased usage of ray tracing in games had me pause on that one, and it's still hovering around $1000 USD if you find any in stock for a two year old card which feels wrong.
Depending on the actual performance, I am thinking of grabbing a 9070 XT to use for a couple years at the least in hopes of GPUs getting better next generation. Bumping from 10GB to 16GB doesn't feel significant but it's at least a short term solution for (hopefully) good price/performance rather than dropping a grand on Nvidia's current disaster.
Not_Yet_Italian_1990@reddit
Eh... 12GB can be a problem. It remains to be seen that 16GB is.
And I say this, by the way, as someone who is firmly in the "give us more VRAM" camp.
26295@reddit
At this point I'm just surprised that it managed to match the 4070S with considerably less cores.
DktheDarkKnight@reddit
Which makes it more depressing, because all NVIDIA had to do was give the 5070 the same number of cores as the 4070 Super. That would have given us a semi-decent +20% gen-on-gen uplift.
Zerasad@reddit
Or, more appropriately call it what it is, a 5060 and sell it for $350.
only_r3ad_the_titl3@reddit
People here really expecting an 80% gen over gen uplift in value is quite insane. But no surprise considering the delusional brainwash they get from channels like HUB.
Zerasad@reddit
We had a less than 40% perf increase between the 2060, which came out 6 years ago, and the 4060. At the same time the 70 series got an 87% increase, while the 80 series got a 143% increase.
Nvidia has been sandbagging all the affordable cards with garbage upgrades each generation. It's about time we finally got back to actual upgrades. If the 60 series were upgraded at the same pace the 80 series got updated then the 5060 would be 4% faster than the RTX 5070. Funny how that works out huhh?
only_r3ad_the_titl3@reddit
If nvidia called the 5090 a 5060 and left the price at 2000 usd you would be happy because that would be a 315% improvement in 1 gen right?
2060 (350 usd) to 4060 (300 usd) -> 64% value improvement.
2070 (500) -> 4070 (600) -> 56% value improvement.
2080 to 4080 -> 42% value improvement
So the 60 series improved value more than the 70 and 80 series.
So the FPS/USD increased faster for the 60 series than the other series.
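For readers unfamiliar with how those percentages are derived, here's a small sketch of the FPS-per-dollar calculation; the performance ratios and MSRPs are the commenter's own numbers (the 2080/4080 prices are assumed at $700/$1200), so treat the outputs as illustrative rather than verified:

```python
# "Value improvement" = relative gain in frames per dollar between two cards.
# perf_ratio is new-card performance divided by old-card performance.

def value_gain(perf_ratio: float, old_price: float, new_price: float) -> float:
    """Relative FPS/USD improvement, e.g. 0.64 means +64%."""
    return perf_ratio * (old_price / new_price) - 1

print(f"2060 -> 4060: {value_gain(1.40, 350, 300):+.0%}")   # ~+63%, assuming ~40% faster
print(f"2070 -> 4070: {value_gain(1.87, 500, 600):+.0%}")   # ~+56%, assuming ~87% faster
print(f"2080 -> 4080: {value_gain(2.43, 700, 1200):+.0%}")  # ~+42%, assuming ~143% faster
```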
demonarc@reddit
Used to be that the xx60 class card matched the performance of the previous gen xx80 class card, at minimum. Now the xx70 class card doesn't even beat the xx70-Ti card of previous gen. That's zero uplift.
Hitokage_Tamashi@reddit
Even looking back historically, that's only seldom been true.
The GTX 460 was a bit ahead of the GTX 280/285 (source: TechPowerUp)
The GTX 560 usually very soundly lost to the GTX 480 (source: TechPowerUp)
The GTX 660 usually matched or beat the GTX 580 (source: TechPowerUp)
The GTX 760 slightly lost to the GTX 670 (source: TechPowerUp)
The GTX 960 is weird because Kepler has aged so bizarrely, at launch it slightly lost to the GTX 770, in newer titles (~2016 or so onwards, I want to say?) it's about the same or faster. (Source for launch: TechPowerUp)
The GTX 1060 matched the GTX 980 (source: TechPowerUp)
The RTX 2060's roughly a GTX 1080 (source: TechPowerUp)
The RTX 3060 is roughly an RTX 2070 (source: TechPowerUp)
The RTX 4060 is roughly an RTX 2080 (source: TechPowerUp)
All reviews were chosen at the time of release.
Even if we arbitrarily do a cutoff at Turing, 2/7 of them lose to the previous -70 and 1/7 of them lose to the -80. Only 4/7 manage to match the -80, that's not really a common enough trend to say "used to."
That doesn't mean these cards aren't disappointing, but it was functionally a coin-flip if the -60 matched the previous -80 or previous -70 (and that's with a generous cutoff period)
only_r3ad_the_titl3@reddit
in addition the price for the 80 class cards went from 550 for the 980 to over 1000 usd. So nearly doubled while the 60 class card went from the 960 at 200 to 300, so only 50% increase.
demonarc@reddit
Yeah, it's not always 100% true, but it's generally the case that there's a substantial gen on gen improvement, this gen there is effectively no improvement.
Hitokage_Tamashi@reddit
Yeah, for sure. Nvidia really dropped the ball here and it's especially disappointing when Lovelace was so mid to start. There's not really a GPU here other than the 5090 that you couldn't have bought 2-3 years ago.
When the 5070 was announced I was honestly kind of eyeing it (I knew it wasn't going to be a 4090, obviously, but I figured it'd sit closer to a 4070 Ti Super), now I'm not giving really any of Blackwell a second thought lol. The 9070's kind of interesting but I think I'm gonna sit this gen out entirely.
Side note, apologies if I sounded hostile in any way, wasn't my intent at all; I re-read my post and realized I sounded really blunt, I don't know how to correct it to sound less blunt
demonarc@reddit
Don't worry about the tone, I appreciate the fact check, never hurts to be better informed.
only_r3ad_the_titl3@reddit
yeah and the 80 class used to be like 500 usd not 1000 usd. While the 60 class stayed pretty much the same in terms of pricing.
tupseh@reddit
They only made 12 of them anyway.
Unlikely-Housing8223@reddit
Ten were sent out to reviewers.
only_r3ad_the_titl3@reddit
Do you consider the Super release to be its own generation?
Vb_33@reddit
But that would fuck over the 5070 Super's chance of bringing a noteworthy uplift.
80avtechfan@reddit
Yes but even that would be a completely boring product with way too little VRAM for the pricepoint.
loppyjilopy@reddit
smaller architecture (nm) combined with higher clock frequency (i think)
BigBlackChocobo@reddit
They wouldn't have labeled it a 5070 if it couldn't match the 4070S.
Given the previous core counts and bandwidth, this is where everyone should have known it to be.
king_of_the_potato_p@reddit
The 4060 100% does, and I'm pretty sure the 4060 Ti loses in some workloads to the previous gen's same model tier.
It's not unheard of.
BigBlackChocobo@reddit
60 series has been horrific for the last few generations.
At this point it's just whatever die they have left over with no care given at all.
PhoBoChai@reddit
+33% bandwidth helps in some games, less in others, which is why in some titles the 4070S wins by a small margin; these GPUs trade blows due to different engines scaling differently with bandwidth.
-Purrfection-@reddit
By far the most interesting thing about this otherwise completely uninteresting product
noiserr@reddit
If I had to guess, 4070S may have been memory starved, and the GDDR7 is helping on a GPU with only a 192-bit memory bus.
Qweasdy@reddit
If it was the 5060 (and priced accordingly) it'd be a great product...
Not_Yet_Italian_1990@reddit
It has a much higher memory bandwidth and a decently-sized TDP increase.
Nvidia knew exactly what they were doing with this launch. They basically gave the Nvidia pay pigs the same card for a $50 lower MSRP while probably keeping or even increasing their margins due to the smaller die.
Uh... a "win-win," I... guess?
Vb_33@reddit
Based. That means AMD is about to sweep the board against Nvidia since Nvidia isn't even trying.
NoStructure5034@reddit
Something something missing opportunity...
szczszqweqwe@reddit
Yeah, me too, I thought that it might be a bit slower.
tmchn@reddit
The performance uplift is in line with the rest of the 5000 series. At the same CUDA core count, a ~12% uplift is to be expected.
The 5070 has ~12% fewer CUDA cores, so it matching the 4070 Super almost exactly makes sense.
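A rough sanity check of that reasoning, using the published CUDA core counts (7168 for the 4070 Super, 6144 for the 5070, which works out to about 14% fewer rather than 12%) and treating the per-core uplift as the commenter's assumption:

```python
# Estimate 5070 performance relative to the 4070 Super from core count and an
# assumed per-core generational uplift. Core counts are published specs; the
# ~12% per-core gain is the commenter's estimate, not a measured figure.

cores_4070_super = 7168
cores_5070 = 6144
per_core_uplift = 1.12   # assumed Blackwell-vs-Ada gain per CUDA core

relative_perf = (cores_5070 / cores_4070_super) * per_core_uplift
print(f"Estimated 5070 vs 4070 Super: {relative_perf:.0%}")  # ~96%, i.e. roughly a wash
```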
only_r3ad_the_titl3@reddit
Idk, it is a 35% gen over gen uplift in value. Why are people this mad?
Gippy_@reddit
Because this is rightfully compared against the 4070 Super, not the original 4070.
When you think about it, the Super refreshes shouldn't even happen in the first place. The only reason they do is because the originals have a lukewarm reputation. And that's why RTX 20 and RTX 40 had Super cards, but not RTX 30 or even GTX 16.
only_r3ad_the_titl3@reddit
"Because this is rightfully compared against the 4070 Super" - but then you dont look at the generational uplift, that is half a generation
Dey_EatDaPooPoo@reddit
The 4070 itself was poorly received with it only matching the RTX 3080 10GB and 70-series cards went from having a 256-bit bus on the 970, 1070, 2070 and 3070 down to 192-bit. All of those cards also offered much more substantial performance upgrades over their predecessors. The 4070 had 50% more VRAM, and it was only barely adequate considering the 3070 should have had 12GB itself. It was also only 25% faster than the 3070. Now we get a card with the same (and now very inadequate) amount of VRAM that's also only 25% faster than its predecessor, only this time also using 20% more power.
It's only 25% faster 2 years later, it has an inadequate amount of VRAM and is barely any more efficient than its predecessor. How can people not be mad?
Floturcocantsee@reddit
It doesn't exist, stock is so paper-thin it's basically air.
The stuff that will sell will be way over MSRP.
It's 35% better than the 4070 which isn't what it's competing against, it's competing against the 4070 super which it's basically the same as.
EdzyFPS@reddit
5090 = 5080, 5080 = 5070 Ti, 5070 Ti = 5070, 5070 = 5060
GetOffMyBackLoser@reddit
The 21760 CUDA core, 750 mm² die 5080...
You could say that for the GB203 and below, but being this delusional gotta hurt lol..
noiserr@reddit
3080 was 628 mm²
Swaggerlilyjohnson@reddit
Yeah, because it was cut down substantially. The 3090 and 3090 Ti were also 628 mm². The 5090 is also cut down, but it's only 11% cut down, and the 3090 Ti was the full die. This means the 5090 has more active die area than the 3090 Ti and 3090, and a lot more than the 3080.
ResponsibleJudge3172@reddit
Bro, it was 15% slower than the 3090 and was an outlier in how close it was to the flagship.
Swaggerlilyjohnson@reddit
Of course it was an outlier that was basically my point. The whole comment chain started with calling a 5090 a 5080 which is absurd. Then another guy brought up the 3080 die size to act like it was a reasonable thing to call a 5090 a 5080
I was simply pointing out that the 5090 deserves to be called a 90 series card more than any other 90 series card if anything. The die size is huge. It has more area than the 4090 and 3090 by a large margin. And comparing it to a 3080 that was cut down more than 20% still makes the 5090 look like a way higher tier card than the 3080 even though the 3080 is a significant outlier as the biggest 80 class gpu nvidia has released.
Darkknight1939@reddit
On a dirt cheap Samsung 8nm node...
HatchetHand@reddit
But you can't use those cuda cores for PhysX anymore.
Hifihedgehog@reddit
5xxx = NVIDIA: āWe did it, Patrick! We saved PC gaming!ā
AyeItsEazy@reddit
Even though it's this bad, dumbasses that have seen this video and read these comments will still buy it. NVIDIA guys might just be dumber than grass.
Pugs-r-cool@reddit
I don't think many people here will be buying one, these will get sold in huge amounts to pre built buyers though.
resetallthethings@reddit
yep, no reviewers really recommended the 4060 last gen that I remember either, but it's one of the most popular cards on the Steam hardware survey and the like
ResponsibleJudge3172@reddit
What were people supposed to do? Buy the slower 7600 that had the same VRAM as 4060 and often cost similar in many parts of the world? Or trust people not to scam them as they sell second hand parts at the same price as new?
resetallthethings@reddit
buy something cheaper or save a bit more and get something more expensive
not a difficult problem to solve
ShiiftyShift@reddit
Every new Nvidia announcement is forcing my hand more and more into buying a 7900 XT at this point.
NGGKroze@reddit (OP)
Based on TPU Review (copy from r/nvidia)
All_Work_All_Play@reddit
I want this segment to still be top-of-the-line.
Actually no I don't, I want there to be three competitors out for blood across all market price points rather than the one-and-a-half-and-what-the-fuck-Intel that we've had for two decades now.
Strazdas1@reddit
you can have it as soon as you bring me my unicorn.
InconspicuousRadish@reddit
500 wasn't top of the line even a decade ago. Expecting this in 2025 is simply unrealistic.
SituationSoap@reddit
$500 wasn't top of the line two decades ago.
Janus67@reddit
You're not wrong, not sure why you're being down voted. The 8800GTX MSRP was 600-650 and the flagship/halo 8800 Ultra was $830.
https://www.anandtech.com/show/2222
king_of_the_potato_p@reddit
If I remember correctly you gotta go back to 2003 for gpu prices for the top tier to be $500 or less.
bryf50@reddit
The problem is not that an expensive halo card exists. The problem is that there's no runner up top of the line card anymore. Pretty much every historical generation had a high value high end GPU that was 10-20% off the top.
SituationSoap@reddit
A bunch of the people on this sub weren't conscious 20 years ago, I'm not worried about them not understanding the history.
Janus67@reddit
yep, agreed. I just checked and the 6800 Ultra was $500 and was 2004. So that really appears to be the breaking point when things started to jump above that marker.
LowerLavishness4674@reddit
But $600 was, and fits in the price bracket provided by the guy you answered. The 980Ti and 1080 both cost $599.
Obviously the Titans (and 1080Ti that came out nearly a year later) were more expensive, but the Titans were never intended to be consumer cards.
InconspicuousRadish@reddit
The 980 Ti was launched in 2015. Adjusted for inflation, that is $799 today. Just saying.
LowerLavishness4674@reddit
Totally fair. I forgot about that part.
$799 still doesn't even get you an xx80 these days, let alone an xx80Ti or xx90.
king_of_the_potato_p@reddit
To be fair, the 90 is basically the titan.
InconspicuousRadish@reddit
I know, and that's perfectly valid. I was just pointing out that $500 and top of the line were never really a thing.
So that's even less likely to be a thing in today's GPU climate.
king_of_the_potato_p@reddit
No it wasn't.....
$750-$800 back in 2014.
All_Work_All_Play@reddit
Which is why I said that's not what I want.
stav_and_nick@reddit
As someone finally looking to upgrade from their 1660, I really wish Intel got their game up. The Battlemage stuff looks like a fantastic upgrade for the price, but there's no inventory and I'm frankly still skeptical about long term performance
tmchn@reddit
The worst thing is that 8-10 months ago in Europe, 4070 Supers were available for 550-600€. This thing will cost 700-800€, if you can find one.
Strazdas1@reddit
This isn't going to cost as much as a 5070 Ti...
Smagjus@reddit
I managed to get a 4070 Ti Super with cashback for 750€ a few months ago and thought I might have pulled the trigger too early. Boy was I wrong.
cgaWolf@reddit
Buyers ~~remorse~~ vindication
TopdeckIsSkill@reddit
even at msrp it's still terrible imho
Chimbondaowns@reddit
It's not an opinion it's a fact.
UnexpectedFisting@reddit
I always find it hilarious that the 3070 constantly gets excluded from benchmark comparisons. It's like everyone forgot it existed and they only compare against the 3080 or 3060, and I have no clue why.
IcePopsicleDragon@reddit
They want to force people to buy the 5090, there's no other explanation for this.
GetOffMyBackLoser@reddit
Yes, Nvidia has lost their mind and wants everyone to buy the highly successful (1.1% market share across all 90-class models: 3090/3090 Ti/3090D/4090) product.
nmkd@reddit
"as fast", not "faster".
2x as fast would be 300% performance.
NoStructure5034@reddit
It's the opposite, isn't it? 2X faster means 300% perf?
nmkd@reddit
yeah got a typo in the 2nd line, fixed
MeVe90@reddit
I'm keeping my 3070 and I'll play games from my endless backlog
TopdeckIsSkill@reddit
We went from the 970 being the best card for price/performance ratio to the 5070 being the worst card besides the 4060.
Guardian_of_theBlind@reddit
the 4060ti 16gb has a much worse price/performance ratio than the 4060. It's the worst card you can buy in that regard
TopdeckIsSkill@reddit
didn't even know it existed.
I usually look at the 4060 and see that even though it launched 4 years after my 5700 XT and cost more at launch than the 5700 XT, the raster performance is barely 10% higher!
Guardian_of_theBlind@reddit
the 4060ti 16gb doesn't have more shaders than the 8gb 4060ti, but it has a msrp of a whopping $500!!! You could get og 4070s when it launched for roughly the same price point and those were demolishing the 4060ti
retro83@reddit
Isn't the 16gb ti aimed at AI though? The 4070 Super only has 12GB VRAM which is borderline not enough for many workloads.
Guardian_of_theBlind@reddit
It's a "RTX" card so the main target audience is gamers.
retro83@reddit
Maybe it was once, but the spec of that card doesn't make any sense for gaming. It does make sense for AI, where VRAM is more important than speed for many workloads, and I remember it was advertised, at least by MSI, specifically for running Stable Diffusion.
Can't find a copy of the ad but here a page about it that's still on their site:
https://uk.msi.com/blog/stable-diffusion-xl-best-value-rtx-graphics-card
https://i.postimg.cc/xTfRdcgx/image.png
shawarmagician@reddit
970 has the GPU chip GM204, 980 still had the GM204, 5070 has the GB205
adaminc@reddit
The 970 was a scam though, sold as a 4GB card, but only 3.5GB of it was usable at full speed.
tmchn@reddit
The 1070 beat the 980ti. Good times
Macieyerk@reddit
The RTX 3070 beat the RTX 2080 Ti as well. NVIDIA keeps their gaming division on life support since all the $$$ can be made on AI; in case it fails they can go back to making gaming GPUs somewhat affordable.
tmchn@reddit
I'm sure that Nvidia is still able to develop a xx70 that beats the previous flagship
They simply have no need to do it cause they are making bank on the AI boom
PastaPandaSimon@reddit
*They are making bank on people spending $600 on small xx70 dies that don't outperform even prior gen xx80 cards.
Macieyerk@reddit
Exactly this.
095179005@reddit
980Ti and 1080Ti were the GOATs
DYMAXIONman@reddit
970 sucked ass. The R9 390 was better.
TopdeckIsSkill@reddit
It was also more expensive. The GTX 970 was 350€, the R9 390 was above 400€.
Not to mention insane power draw: 90W in idle with multi monitor and 340W while gaming compared to 10W and 170W of the gtx970.
Keulapaska@reddit
It did not draw 340W in gaming, as it has a 275W TDP. Idle multi-monitor draw, yeah, that apparently was hot trash, even worse than a 290 somehow, going by the one review I quickly found.
DYMAXIONman@reddit
You could get both for the same price for the majority of the life of the product.
TopdeckIsSkill@reddit
I remember it being more expensive, but regardless, the power draw difference was so huge that I went for the Nvidia 970 because I didn't want such a hot and loud card for a 10% performance difference.
BaconatedGrapefruit@reddit
I believe the R9 390 was less money than a 970 in North America. It also came with 8gb of VRAM vs the 3.5 + .5 that the 970 did. Electricity cost was also negligible and worked out to a few bucks a year (again, using NA electricity pricing).
Both cards were great, the 390 stayed "relevant" longer due to its VRAM and GCN architecture (AMD fine wine and all that jazz). For their generation, neither was the wrong answer.
Lemon_1165@reddit
Nvidia is a monopoly and it's behaving like a monopoly. So much for the innovation of capitalism...
DYMAXIONman@reddit
Are the reviews a day early?
SituationSoap@reddit
lol what
ThatOnePerson@reddit
And even when they do come out, no one's releasing games that aren't crossplatform for at least 3 years.
Strazdas1@reddit
He knocked but no one answered.
DYMAXIONman@reddit
Rumored 2026 or 2027 release
SituationSoap@reddit
So "knocking on the door" is equivalent to "literally in the entirely next GPU release cycle?"
Jellyfish_McSaveloy@reddit
While I agree that people really shouldn't be buying a 12GB card for $550 in 2025, this is just fear mongering. There is no way that games in 2027, assuming that's the PS6 launch date, would be unable to run on 12GB of VRAM.
Alan Wake 2 had mesh shaders that ran awful on like 7/8 year old cards and everyone kicked up a huge fuss. If a game is unable to launch without 16GB of VRAM, the devs would be criticised left right and centre.
DYMAXIONman@reddit
It might run but it will look like ass.
Strazdas1@reddit
it will look like it looks right now. it just wont look better.
Jellyfish_McSaveloy@reddit
People massively exaggerate how terrible it must be to turn textures to medium. The games which have extreme texture streaming issues or stuttering are in the minority. Cards like the 5700XT with 8GB of VRAM 6 years since launch can still give an ok gaming experience at 1080p, but people here will pretend it's some horrific trash.
The big issue with the 5070 is the price point that's offering 12GB of VRAM. If it was $350 it'll be a stellar piece of tech that would be recommended left right and centre.
Not_Yet_Italian_1990@reddit
It's not just textures, though. RT and FG have added considerably to VRAM requirements and Nvidia has been incredibly stingy for the past decade in that regard.
Jellyfish_McSaveloy@reddit
The biggest factor in determining a card's longevity is its performance tier after clearing a minimum VRAM requirement. This isn't an R9 Fury X situation where the card was already choking on its VRAM at launch.
You're right about planned obsolescence. The question is always whether the performance of the card will make you turn down settings before its VRAM issue rears its head. The 3080 clearly ran into VRAM issues long before it should have, whereas a 3060 would have to turn off settings long before its 12GB of VRAM would ever be an issue. The 5070 isn't as egregious as the 3080, but again, it's ridiculous to suggest it'll be unusable in 2/3 years.
leandoer2k3@reddit
Kingdom Come 2 defaulted to 1080p and I played for like 60 hours without realizing I wasn't at my screen's 1440p resolution.
Considering my almost 4 year old 3060 Ti is still running all the good releases just fine @ 1440p, I think even at $550 the majority of buyers would be happy with the purchase. The problem is that price, and stock probably won't show up for a long while.
One can only hope that we get a Super refresh of the 5000 line up soon after AMD's offerings.
lordofthedrones@reddit
My ancient RX480 8GB at work still plays most of the stuff really nicely. And it is Wattage limited because of dead heatpipes.
resetallthethings@reddit
yeah, while I agree that 12gb for any card above $400 at this point is a kick in the nuts, for 1440p and lower, 12gb with some settings tweaks will definitely still be quite usable for a while yet without too much compromise.
Strazdas1@reddit
the fuss was big enough that AW2 made a workaround to speed it up for old cards.
Aldraku@reddit
Also, your argument is strengthened by the fact that 75% of GPUs on the Steam survey, not including integrated GPUs, have 8GB of VRAM or less.
imdrzoidberg@reddit
12gb is absolutely fine at the budget/midrange price point like the Intel B580.
At Nvidia prices it's pure madness.
Merdiso@reddit
Remember all the idiots who really thought this could match a 4090 with just 6K cores, and said that the 9070 XT needs to cost no more than $399 to make sense?
AdmiralKurita@reddit
Who are these idiots? I only visit this sub for hardware info and a few YouTube channels. Most people have asserted that the only way the RTX 5070 could achieve RTX 4090 performance is through "fake frames".
For me, going to this sub is highly discouraging. It makes me angry and sad. We are living in an age where Moore's law is dead, so one should not expect dramatic improvements in CPUs and GPUs. I expect nothing spectacular from AMD in the next few weeks. I thought their launch was today, which is why I am here right now.
There will be few self-driving cars and robot butlers because hardware improvement has stalled. All that stuff is just futuristic crap in the desolate environment of stagnant hardware.
CatsAndCapybaras@reddit
Hardware is not holding back self-driving or robotics. we have sufficiently powerful and efficient compute solutions for those tasks already. sensors, algorithms and regulations (safety concerns and public sentiment) are the main factors that slow the development in these fields.
Strazdas1@reddit
The main factor is energy. Batteries are bulky and heavy if you want any real movement in robotics.
AdmiralKurita@reddit
Sore wa dou kana? (Is that so?)
It seems that the cost of compute was a major issue for the now defunct autonomous driving company, Cruise.
The H-100 graphics processing unit is a computer chip used to train machine learning models like OpenAI. To train artificial intelligence models, a lot of processing power in a data center is critical, according to Sam Abuelsamid, vice president of market research at Telemetry Insights. Still, it's only useful if the system is truly independent of human programming intervention. The next generation produced by artificial intelligence software company Nvidia is the B-200, which is more adept than its earlier counterpart, but even more expensive.
"If you are looking to do any sort of AI training, the H-100 is the card to have. It's very powerful ... but it also consumes a lot of power," Abuelsamid said.
https://www.freep.com/story/money/cars/general-motors/2025/02/08/why-gm-let-cruise-robotaxi-business-go/78246281007/
conquer69@reddit
No. I don't think I saw a single comment claiming that. Everyone seemed to understand it was the classic nvidia bullshit marketing.
jerryfrz@reddit
My man's getting really worked up with these imaginary idiots lmao
ray_fucking_purchase@reddit
I'll def be remembering an idiot now.
SireEvalish@reddit
This dude is out here arguing with ghosts
snollygoster1@reddit
No. I saw a lot of people who knew Nvidia's claim was impossible, but no one who thought they were right.
mtbhatch@reddit
I also remember that my local FB marketplace was flooded with heavily discounted 4090s after the 5000 series announcement.
SomniumOv@reddit
flashbacks to the few days you could find heavily discounted second hand 2080 and 2080Ti before everyone realised 3000 series would not be in stock anywhere for months.
free2game@reddit
The 9070 does need to cost $400 to make sense. AMD needs to market their cards cheaper to actually build up marketshare.
fatso486@reddit
At half the price, itād be an amazing 1080p card. At full price, itās a time bomb even at 1440P with low VRAM.
tmchn@reddit
In 2016 this would have been a 50-class card.
Strazdas1@reddit
In 2015, you would have been paying 10,000 for such performance and loved it.
996forever@reddit
Maybe the 1060 3GB relative to the 1080Ti
Guardian_of_theBlind@reddit
actually no. the 1060 3gb has 32% of the cuda cores of the 1080ti. the 5070 only has 28% of the cuda cores of the 5090. So it's worse than the bottom 60 tier card from the pascal era in comparison. Oof
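The core-count ratios behind that comparison, using the published CUDA core counts for each card:

```python
# Share of the flagship's CUDA cores that each cut-down card gets.
cuda_cores = {
    "GTX 1060 3GB": 1152,
    "GTX 1080 Ti": 3584,
    "RTX 5070": 6144,
    "RTX 5090": 21760,
}

print(f"1060 3GB vs 1080 Ti: {cuda_cores['GTX 1060 3GB'] / cuda_cores['GTX 1080 Ti']:.0%}")  # ~32%
print(f"5070 vs 5090:        {cuda_cores['RTX 5070'] / cuda_cores['RTX 5090']:.0%}")          # ~28%
```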
nmkd@reddit
Would Titan Xp not be a more accurate comparison since that was the "1090"?
996forever@reddit
No. The 5090 is STILL not the fully enabled die, whereas the Titan Xp was.
Both 1080Ti and 5090 are slightly cut down versions of the -102 die.
Guardian_of_theBlind@reddit
then they would probably match closer. the titan xp was only very slightly faster than the 1080ti. I am a bit too lazy to put the numbers in the calculator. It's only 250 cuda cores of a difference
Raikaru@reddit
Why would the amount of cuda cores matter vs the die size?
egan777@reddit
Cut down cards have the same die size.
1070, 1070ti and 1080 have the same size.
5070ti and 5080 as well.
Raikaru@reddit
Sure. Die size is what actually matters when it comes to cost though. When people are saying a card is a 60-class card, they're saying they want it sold for 60-class prices.
egan777@reddit
They have been cutting down the lower tier cards significantly though.
When the 4090 got ~70% gains over the 3090, the 4070 only got like 20-25% over the 3070.
4060ti was even worse at around 10%.
leandoer2k3@reddit
Boy maths
Yearlaren@reddit
No it wouldn't. In 2016 the 50 class cards were the 1050 Ti and the 1050. The TDP of those cards was 75 W. The 5070's is 250 W.
pewpew62@reddit
We're ~2 years away from VRAM demands skyrocketing due to next gen consoles, anyone with more than two brain cells should avoid 12G or less like the plague at this price
Not_Yet_Italian_1990@reddit
The next gen consoles will be 2028, at the earliest. PS4 to PS5 was a 7 year gap, and the rate of improvements is slowing down drastically.
Strazdas1@reddit
anything would be a drastic improvement now due to how horrible ray tracing on current consoles is.
Burden_Of_Atlas@reddit
Microsoft's leaked plans have their next console listed for 2026, and those plans have been correct for the most part so far. Further leaks also have games like the 2026 Call of Duty being tested on new Xbox hardware.
Sony would likely follow suit, although a year behind, similar to the 360 and PS3. At worst, with a delay on Microsoft's part, new-gen consoles will likely be here by late 2027.
Not_Yet_Italian_1990@reddit
I'll believe it when I see it.
But I suppose it's possible that Microsoft will be eager to hit the reset button. I had sorta expected them to wait for technology to become available to miniaturize the Series S into a handheld for an extended cross-gen with a new machine, but I wouldn't be surprised if they just moved on.
Strazdas1@reddit
what makes you think new consoles will have more VRAM? they will be using exact same 2GB modules they used before because newer modules dont exist (3GB modules started being manufactured only this year).
Panslave@reddit
Don't know how true that is but I agree
Vb_33@reddit
It's not true, VRAM doesn't go up day 1 of a console launch. Usually it takes a couple years.
Olangotang@reddit
It does go up quickly. Optimization is garbage. And Unreal 5 has been shit in terms of VRAM use.
amazingspiderlesbian@reddit
Bro you're living in opposite land. Ue5 is literally the one engine that doesn't use insane amounts of vram for the visuals. Look at gameplay of any ue5 game or benchmarks and you'll see that they barely use 8gb even at 4k max settings.
RobotWantsKitty@reddit
Nah, first few years will be crossgen, so VRAM requirements won't be as steep
pewpew62@reddit
Not all games are crossgen though, and even so, on PC even crossgen games are more demanding despite having to be optimised for old hardware. Look at that new god of war game
NGGKroze@reddit (OP)
At $549 it should have been 16GB to have some chance against the 9070 series, even if slower. Aside from DLSS and brand loyalty, low-segment users on a 2060/3060/4060 should go to the 9070/9070 XT if they want to up the ante.
PastaPandaSimon@reddit
It hurts to see the xx60 series now labeled as "low segment"
GetOffMyBackLoser@reddit
The 5060ti seems like the sweet spot for me personally and has been for all my previous GPU purchases every 4-5 years, xx70 models always have had small gains over xx60 compared to price for it to be worth it, and the other benefit being that I've had my corsair 550w psu for almost 13 years and hopefully can run it for 2 more without issue.
wankthisway@reddit
I wanted to upgrade my 2070 Super but Nvidia isn't where I'll be going this time
Guardian_of_theBlind@reddit
wait for tomorrow. the 9070xt will absolutely demolish the 5070. Probably even in RT, because of the VRAM
IcePopsicleDragon@reddit
Fake Frames, Fake Price, Fake Performance, Fake Cooling and Fake Promises
ThePreciseClimber@reddit
Plus gimped 32bit PhysX.
nmkd@reddit
"Gimped" as in "literally missing".
ThePreciseClimber@reddit
I mean, you can TECHNICALLY turn it on. But it will run like shit.
nmkd@reddit
No. It will run on your CPU.
It is literally missing from the GPU.
conquer69@reddit
DF said installing a second card will render the physx on the primary card anyway. So it's not missing, just disabled in software.
Strazdas1@reddit
Then DF was wrong about this.
PhoBoChai@reddit
Fake launch. Zero availability. I rang up some mates in retail and they said only got a handful of 5070 for the launch.
NVIDIA too busy milking the AI market to give a damn about wafers for gamers.
FinalBase7@reddit
I remember when the specs were announced and every comment speculating that this thing won't be faster than a 4070 super was met with "it's a different architecture" reply, as if different architecture mattered in the last 10 years of GPUs lol, more cores, more clocks, more bandwidth is just more better in GPUs, architecture does very little to alleviate deficits in these areas.
theQuandary@reddit
For what it's worth, a different architecture does seem to be making a big difference in RDNA4, but once they finish adding instructions to use the second SIMD, it's not going to happen again, because VLIW-3 almost certainly won't be used enough to be worth having.
FinalBase7@reddit
RDNA 3 was a special case with chiplet design, a chiplet GPU is worse than the same exact GPU but on a monolithic die but cheaper to make, RDNA 4 is going back to monolithic and 9070XT has 32% faster clocks and 12% more bandwidth than 7900GRE so with that in mind it makes sense they're achieving 42% better performance than 7900GRE with 20% fewer cores.
There's most definitely an architectural improvement, it's just tiny most of the time and quickly demolished the moment you start cutting physical specs without compensating in other physical specs.
theQuandary@reddit
I don't think we are talking about the same thing.
9070xt has only 22% higher base clocks with 20% fewer SPUs than 7900gre. This shows up in AMD's rating of 48.7TFLOPS f32 for 9070xt vs 46TFLOPS f32 for 7900gre.
No matter how you look at it, there has been a MASSIVE architectural improvement to get 42% better real-world performance with just 6% increase in theoretical performance.
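For context on where those TFLOPS figures come from: RDNA3 and RDNA4 count dual-issue FP32, so the marketing number is shaders × boost clock × 2 (FMA) × 2 (dual-issue). A quick sketch with the published shader counts and boost clocks (treat the inputs as assumptions):

```python
# Reproduce the quoted FP32 TFLOPS ratings from shader count and boost clock.
# Factor of 4 = 2 ops per FMA x 2-wide dual-issue FP32 on RDNA3/RDNA4.

def fp32_tflops(shaders: int, boost_clock_ghz: float) -> float:
    return shaders * boost_clock_ghz * 2 * 2 / 1000

print(f"7900 GRE: {fp32_tflops(5120, 2.245):.1f} TFLOPS")  # ~46.0, matches the quoted figure
print(f"9070 XT:  {fp32_tflops(4096, 2.97):.1f} TFLOPS")   # ~48.7, matches the quoted figure
```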
Chimbondaowns@reddit
Costs real money, Shockingly.
Dudi4PoLFr@reddit
Fake ROPs probably too!
Kiriima@reddit
Fake Availability also!
Hifihedgehog@reddit
Meanwhile, real leather jackets for the faker CEO.
tilthenmywindowsache@reddit
My $700 7900xt looking like a better purchase every day.
12318532110@reddit
Fake melting issue fixed
blackflagnirvana@reddit
My 6950XT beats a 5070 in straight raster and has 16 GB of VRAM, what a joke
Keulapaska@reddit
Does it?
This review doesn't even have a 6950 XT, so where is that info from? TPU has a 6900 XT in their chart, with the 5070 beating it by ~17% or so, and their main chart has the 6950 XT 8% ahead of the 6900 XT, so it still doesn't seem like it beats it.
GN also has a 6950xt on their charts, doesn't look like it beats 5070 there either
puffz0r@reddit
TPU includes RT in many of their benchmark suite games, and rdna2 is absolutely awful. The dude did specify raster only
Keulapaska@reddit
If only TPU had a separate RT chart in their reviews... Oh wait, they do, and the relative performance chart isn't RT ("This page shows a combined performance summary of all the game tests on previous pages"), as you can clearly see by looking at, say, the 3080 vs the 6800 XT. Even if you think some RT sneaked into that graph, you can just go check every individual raster game yourself to see that RDNA2 aged more like vinegar than fine wine.
puffz0r@reddit
Seems like it aged fine considering the gap between the 6800XT and 3080 is still like 3%. Something's weird with the 6900XT in particular, I remember on launch it was 10% faster than the 6800XT and on TPU it's only like 5%. Also on GN's raster charts the 6950XT is within 3% of the 5070
puffz0r@reddit
Completely wrong, TPU's relative performance includes ALL tests. If you even bothered to open the page you'd see a big disclaimer at the top "This page shows a combined performance summary of all the game tests on previous pages, broken down by resolution." That includes raytracing. The raytracing page breaks out RT specifically.
Also, hilariously enough, in the linked GN charts the 6950 XT is within 3% of the 5070 FE in every raster benchmark, so his claim that his own card beats the 5070 is easily plausible when considering factory OC models.
blackflagnirvana@reddit
I just heard the 5070 is roughly equal to a 4070S, which is roughly equivalent in FPS to 6950XT. Maybe things have changed with new driver updates
conquer69@reddit
So you didn't watch the video or other reviews and yet confidently claimed it was slower...
blackflagnirvana@reddit
Go to techpowerup GPU hierarchy, it's 3% faster than a 4070S according to their chart. So if 5070 = 4070S that would be correct
Keulapaska@reddit
There is a fair bit of variance in the reviews for sure, even in the same games; TPU has the 5070 kinda high vs the 4070S compared to what HUB has. I guess the meta-review post, once it's made, will average out all the main reviews to see how it stacks up and what the "real" average is.
Also game by game variance of different architectures can be quite large so I'm sure there are many games where a 6950xt will beat it, like cod as it's very amd favoured in general.
resetallthethings@reddit
I wound up selling my buddy my OC formula 6900xt ahead of this launch cycle, but honestly was pretty happy with it, I was probably planning on the 9070xt anyway which should be a decent upgrade.
But yeah, both impressive and sad that it's aged as well as it has honestly
blackflagnirvana@reddit
I want to switch to Nvidia when I upgrade next, but it won't be this generation
king_of_the_potato_p@reddit
I've had Nvidia most of the last 20 years, tried a 6800 XT two years ago and it's been great.
I won't even consider Nvidia again until they fix the VRM contacts and shunts to not be a fire hazard.
NoStructure5034@reddit
RX 6000 was amazing, great raster performance at a much better price point. The reference designs looked beautiful too.
Aldraku@reddit
No one noticed the rx 9070 in the power draw stats ? :D
Keulapaska@reddit
What is even going on in those power tests? A 5080 drawing less than a 4080 Super is a bit weird, but sure, I can believe that, as the 5080 isn't in Space Marine (for some reason...). However, how tf does a 3080 draw 100W more than a 5080 in Starfield? It's only a 320W card stock...
I get that Starfield is not the highest power draw, but ~220W (estimating, as a 3080 riding its power limit stock makes sense in most games) for a stock 5080 sounds really low. Also the 5070 Ti is the same, and somehow draws more in Outlaws than a 5080, and the 4080/4080S also draw more than a 5080; like, wtf is going on there.
amazingspiderlesbian@reddit
Most games don't use the entire power budget of the GPU, to be fair. I usually only see barely 300W on my 5080, and that's overclocked. And the 5080 does have the best performance per watt of any GPU, beating out the 4080S by 10%.
Keulapaska@reddit
But the graph isn't performance per watt, it's just power draw, and it seemingly isn't CPU bound looking at the performance graphs. TPU's gaming average is 325W, and GN has the 5080 consistently drawing more than a 4080 and a 5070 Ti in their games.
Yes, Starfield is a lower power draw game, so a slightly bigger delta than Outlaws makes sense, but 100W is a lot, as it includes the CPU doing 30 more fps as well. Even Outlaws has a 60W difference including the CPU, which again is probably drawing more, but we have no idea because there is just a combined number. One explanation could be that the 3080 isn't a stock/FE one and is a Suprim X, Strix or something with a 370W default limit, but that kinda defeats the point of using a "stock" card, doesn't it, and it isn't communicated.
Also, this isn't the first time HUB has had weird power graphs. Back when they used to report full system draw, some of their CPU reviews produced very weird power results where there was clearly something going on with GPU draw/power states rather than a CPU draw difference, as a 7600X doesn't just randomly increase 50-100W in power draw while an Intel CPU system doesn't increase at all, or only by like 20W.
Or in the other video, a 13700K drawing nearly 200W more in game than a 7800X3D yet somehow only a 34W difference in another one; there's no way either of those is just CPU power draw making the difference. Even their own Blender test shows at most a 184W total system draw difference, and only 89W for the 13600K vs the 7600X, so a full CPU render test is somehow less demanding than games? Yeah, no.
Which, while very interesting info (that the AMD system somehow managed to keep the GPU in a lower power state when it's not fully used and the Intel system can't), isn't communicated at all, and again would've been solved by just including some software readings of the GPU and CPU alongside, to make it easier to understand the figures and differences.
Question-master3@reddit
Pretty power hungry, hopefully that translates to performance.
Swaggerlilyjohnson@reddit
At first I was like, damn, that's kind of high, but then I realized it's about equal to the 3080 in total system usage.
The 3080 is a 320W TDP card and the 9070 XT is 304W TBP.
Considering the 9070 XT is much faster, the CPU should be using more power as well.
So it's actually not unreasonable; the card itself probably uses slightly less than a 3080, which is exactly what you would expect.
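Rough math on that, as a minimal sketch (the GPU figures are the published TDP/TBP specs, but the CPU wattages are just assumptions borrowing the ~145W guess floated elsewhere in this thread):

```python
# Back-of-the-envelope total system draw (PCIe + EPS), ignoring PSU losses.
# GPU numbers are published TDP/TBP specs; CPU numbers are guesses, not measurements.
def total_draw(gpu_watts: float, cpu_watts: float) -> float:
    return gpu_watts + cpu_watts

rtx_3080 = total_draw(320, 120)   # 320W TDP, CPU a bit less loaded at lower fps
rx_9070xt = total_draw(304, 140)  # 304W TBP, CPU pushed harder by higher fps

print(rtx_3080, rx_9070xt)  # ~440W vs ~444W, i.e. roughly a wash
```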
Sarin10@reddit
I mean, 3080s were quasi-notorious for how much power they used.
Ampere was known for being relatively inefficient.
Spa_5_Fitness_Camp@reddit
Good thing power consumption is one of the least important metrics for consumers, if not the last thing they look at.
AlexisFR@reddit
Not really, the RX 9070 is 220W TBP max
996forever@reddit
According to the graphs the 9070XT seems to draw more than 4080 Super
AlexisFR@reddit
Yeah, the XT is rated for 330W TBP
It's just a "9080" basically, I don't know why they called it 9070 XT
996forever@reddit
Actually, the stock spec should be 304W.
Because they can't match the performance of the 5080. They said it themselves: the naming scheme is meant to match their competitor's.
JakeTappersCat@reddit
The 5080 uses much more power-efficient memory with almost 2x the memory bandwidth.
It's very likely the 9070 XT would compete well with the 5080 if it were equipped with the same GDDR7.
FuturePastNow@reddit
Yeah the name tells us what it's meant to compete with. I suspect AMD, like all of us, expected the 5080 to be... better.
MarxistMan13@reddit
Could be marketing. Since they claimed they weren't aiming for high-end this gen, making it a X070 instead of X080 keeps expectations lower. Also lets them go up against Nvidia's lower end with more like-for-like name schemes.
3G6A5W338E@reddit
Likely because it's the same chip + tradition in naming cards.
uzzi38@reddit
Worth remembering it's total system power in those charts. A higher-performance card will also have the CPU working harder, so all of that extra power comes from multiple sources.
Chronia82@reddit
True, but I still wouldn't consider a 220W TBP card using more power than a 360W TBP 5080, for probably less performance, good by any stretch of the imagination. Unless Starfield is some weird outlier where the 9070 non-XT will outperform the 5080; I do think Starfield is generally kind to AMD cards, though. But in Outlaws the difference between the non-XT 9070 and the 5080 also seems to be only 40W, and in Space Marine it again uses more power than the 5080. Somewhere inside I do hope these power numbers are just plain wrong.
Question-master3@reddit
Oh yeah, forgot about that, thank you.
Aldraku@reddit
I'm personally very curious how many watts that last 2-3% of performance costs, since you can usually drop the power draw by quite a lot while losing under 5% performance. I value the PC being silent and cool more than 2-3 extra fps.
Framed-Photo@reddit
My 190W 5700 XT gets 90% of its performance at... 130W.
So if AMD has kept up that pattern, this could undervolt really well.
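For anyone curious what that works out to in perf per watt, here's the arithmetic (just a sketch using the 90%-at-130W figure quoted above, not fresh measurements):

```python
# Perf-per-watt gain from the undervolt, using the numbers quoted above.
stock = {"relative_perf": 1.00, "watts": 190.0}      # 5700 XT at stock
undervolt = {"relative_perf": 0.90, "watts": 130.0}  # claimed undervolt result

gain = (undervolt["relative_perf"] / undervolt["watts"]) / (
    stock["relative_perf"] / stock["watts"]
)
print(f"{(gain - 1) * 100:.0f}% better perf/W")  # roughly +32%
```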
PAcMAcDO99@reddit
my 186W (Rev. C5) 6700 XT gets 97% at 130W lol
insane undervolt
Framed-Photo@reddit
I mean, yeah, there are SOME cases where I only see small performance losses, like 3%, but it's certainly not the average, haha. I'm fairly certain that would be the case for your undervolt too.
PAcMAcDO99@reddit
the crazy part is the 97% is constant
Framed-Photo@reddit
I would love to see proof of that, because that's not how GPUs normally work lol
PAcMAcDO99@reddit
https://imgur.com/a/nYRsMbo
95% performance at 64.5% power consumption. Welp, it wasn't 97%, but 95% is sure as hell close enough.
Framed-Photo@reddit
Mind telling me why your cyberpunk benchmarks are saying "NVIDIA GeForce RTX 4090" with an unknown driver version?
And besides the inconsistencies in the pictures you've tried to show me, like I said before, there are definitely cases where you only see a small drop in performance. I myself see a small drop like that in the Cyberpunk pre-made benchmark. But no, it's not consistent across the board for me, and it wouldn't be for you either.
You're dropping your clocks by over 250MHz; that's not free, some games just show it harder than others.
PAcMAcDO99@reddit
That's because I installed the FG mod for Cyberpunk.
It needs to spoof an Nvidia card for the in-game FG to work.
As for your comment about some games taking lower clocks harder than others: Cyberpunk is pretty much the hardest hit when lowering clocks out of all the games I have.
Any other questions, lmk.
Framed-Photo@reddit
I'm asking you questions about this because you're the first person I've seen claiming the things you are. You're giving yourself a huge clockspeed drop while claiming it hardly affects performance across the board; that's just not true.
You're just lying, and you don't want to admit it, that's fine.
No follow up necessary, have a good one!
Excsekutioner@reddit
Wow, any advice on how to tune a 5700 XT in Adrenalin to do exactly what yours does? I'd love to run mine at 130W max too.
Framed-Photo@reddit
Since repasting my card with PTM7950 I haven't had to use the insane undervolt; my Pulse model stays really cool even at 25% fan speed with an overclock, but here are some of those settings:
1900MHz, 980mV, -26% power limit tops out around 130W under load, losing 10% performance at most, sometimes less depending on the game.
2100MHz, 1120mV, +10% power limit is what I've been running lately just to eke out that extra bit of performance, but honestly I switch between the two depending on whether I want the card to run super cool or not.
And of course, your mileage may vary depending on your card.
Aldraku@reddit
I had a Vega 56 before my current card, and it was a joy to tinker with.
My current RTX 3060 Ti went from 240W to 167W while losing 5-7% performance.
Unusual_Mess_7962@reddit
Considering we know the GPU is supposed to use 300 watts tops (similar to the 6800 XT/7800 XT), I imagine that's a typo.
ledfrisby@reddit
Higher than I expected, but being about equal to a 3080, it's at least not crazy high. I wouldn't need to buy a new PSU, whereas for the 7900 XTX or any 90-class card I probably would. Not ideal, but not a deal breaker.
DktheDarkKnight@reddit
Well, well, well. Does that count as breaking embargo? 🤔 I doubt you can release power consumption benchmarks before the embargo.
Low_Doubt_3556@reddit
I'm sure amd won't mind a little "mistake" or 7
ray_fucking_purchase@reddit
9070 @ 1440p 361 watts
9070XT @ 1440p 423 watts
oof
noiserr@reddit
It's CPU + GPU.
3G6A5W338E@reddit
Placeholders, included by mistake. Or maybe that's with power limits disabled and maximum OC applied.
Unusual_Mess_7962@reddit
I imagine it's a mistake. The 7800 XT had a similar power usage listed as the 7900 XT. It was probably supposed to be a 7900 variant or something.
popop143@reddit
That's kinda confusing, since it's equal to the 4080 Super when the 4070 Ti has 60W more TDP. Might the lack of WHQL drivers be affecting that?
AreYouAWiiizard@reddit
There are no reference cards, so he's probably testing a model that has its TDP set much higher.
Aldraku@reddit
I'll admit I was a little surprised at how high the power draw was, but I don't watch enough reviews or videos to know whether that included the full PC draw or not.
Pugs-r-cool@reddit
The number on the graph is the power draw of the GPU alone; it doesn't include any other components.
popop143@reddit
It includes the CPU too (the title of the slide is PCIe + EPS). PCIe is for the GPU, EPS is for the CPU. Judging from the total power draw, it seems like they're using a 145-ish watt CPU.
pewpew62@reddit
I definitely did. Massive numbers if that isn't an editing error
S1egwardZwiebelbrudi@reddit
I remember when an xx70 was something worth saving up for, and while not enthusiast level, it was at least a very respectable card that would last you years... now it seems the 5070 is the poor man's entry-level card and everything below that is utter garbage.
Man, I'm not sure if I actually survived COVID and this is gamer hell...
EdzyFPS@reddit
That's because it's no longer an xx70-series card; it's an xx60 pretending to be an xx70.
cancergiver@reddit
More like a 50 Ti card.
Chimbondaowns@reddit
Calling it 60 is generous.
Dat_Boi_John@reddit
The 5070 has a 50-tier percentage of the 5090's CUDA cores, that's why.
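Rough numbers behind that, assuming the publicly listed CUDA core counts are accurate (ballpark, not gospel):

```python
# Share of the flagship's CUDA cores that each xx70 card gets, per published specs.
cores = {
    "3070 vs 3090": (5888, 10496),
    "4070 vs 4090": (5888, 16384),
    "5070 vs 5090": (6144, 21760),
}
for pair, (seventy, flagship) in cores.items():
    print(f"{pair}: {seventy / flagship:.0%} of the flagship")
# 3070: ~56%, 4070: ~36%, 5070: ~28% -- closer to where a 60- or 50-class card used to sit
```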
AdministrativeFun702@reddit
He'll need to be in space when he reviews the $400 RTX 5060 Ti 8GB.
puffz0r@reddit
Brb waiting 100,000 years for HUB to travel to the lesser magellanic cloud for the 5060 review
ShadowRomeo@reddit
This has got to be the worst Nvidia generation I have seen in my history of PC gaming. We went from 3070 = 2080 Ti to 5070 = 4070 Super.
Just imagine if the 3070 had only been equal to a 2070 Super back in 2020?! It would have been shredded in reviews...
To me this GPU is so disappointing that if I were Steve from HUB, I would have gone to the top of the Burj Khalifa itself to express how disappointed I am with it.
I really hope AMD RDNA 4 knocks some sense into Nvidia this generation, that is, if AMD is able to sell the RX 9070 and 9070 XT at MSRP, of course.
EdzyFPS@reddit
Watch the tards still buy it in droves.
-ShutterPunk-@reddit
Steam surveys have the RTX 3050 outnumbering all AMD cards right now.
king_of_the_potato_p@reddit
Laptops and prebuilts.
Very few people buy them on their own; the 60 and 60 Ti are similar.
ShadowRomeo@reddit
In this case not even they will be able to buy it, because it's nearly nonexistent.
Diplomatic-Immunity2@reddit
Casual people are still saying it has the performance of a 4090 for $549.99 lol
tilthenmywindowsache@reddit
In fairness, the 2xxx series from Nvidia was underwhelming, so the 3xxx series was always going to look much better by default.
MonoShadow@reddit
Turing was ass. But even the Turing 2070 (not the Super) was around 12% faster than a 1080. The fact that it was $100 over the 1070 and didn't touch the 1080 Ti (the Super addressed this) disappointed many people.
Now compare it to Blackwell. It is so much worse. If the 5070 Ti were named 5070 and sold for the 5070 price, we still wouldn't get the price/perf of the infamous Turing generation. And at least Turing had the excuse of bolting on Tensor and RT cores.
tilthenmywindowsache@reddit
Oh for sure. Turing (along with Kepler) was the most disappointing product launch from Nvidia in a while, but Blackwell is leagues worse than both of them. It's Nvidia straight up sticking their thumb in the consumer's eye.
J4BR0NI@reddit
Must be a noob
tupseh@reddit
I dunno, the GeForce FX 5000 series was pretty bad. If it has FX in the name, you know it's bad.
rebelSun25@reddit
They're gonna get cooked for lying about its performance, and I'm so here for it. So many people will fall for that 4090 comparison.
Not_Yet_Italian_1990@reddit
I don't think I've seen, heard about, or read about anyone believing that claim...
rebelSun25@reddit
We won't, but Reddit is an echo chamber. Nvidia marketing is everywhere. Don't underestimate marketing
GetOffMyBackLoser@reddit
As dumb as consumers are, I don't think anyone is going up to the GPU stand looking at the price of each item side by side and thinking that the 3x lower price product is going to be on par or better.
kaisersolo@reddit
Greedy Nvidia. That card should be $450-499.
GetOffMyBackLoser@reddit
It probably will be after the Super refresh.
kuddlesworth9419@reddit
I wasn't expecting much but I didn't expect it to be this bad.
conquer69@reddit
Why is it slower than the 4070 super???
killer_corg@reddit
Hot take, but the leadership behind the marketing teams doesn't know what it's selling. They aren't gamers, so they don't see an issue with showing up with FG metrics pitted against non-FG numbers.
They see a big number and say "wow, everyone will love this," but they aren't gamers, so while yes, FG does make the number bigger, bigger isn't always better.
thoughtcriminaaaal@reddit
They know what they're doing, lol. If they have to misrepresent things to sell cards, they will; it's their job.
tillidelete@reddit
To put it in perspective, even the mediocre 2070 was substantially faster than the 980 Ti, so losses to the 3090 are honestly insane.
Qsand0@reddit
At least it has 12GB of VRAM.
I'm just thinking of the mobile version.
Damn...
Not_Yet_Italian_1990@reddit
The mobile 5070 will probably be a 5060 or a 5060 Ti with 8GB.
Last gen, the mobile chips were one tier below the desktop ones.
tilthenmywindowsache@reddit
But it'll be power restricted, so performance will be closer to a 50-class card.
FrozenMongoose@reddit
Stay optimistic: This is the worst 70 series so far
Pedro80R@reddit
Jesus, I got myself a 5070 Super and didn't even know it... Way to go, Nvidia :)
dustarma@reddit
Well I guess the "5" number really is cursed for Nvidia.
Deeppurp@reddit
At this point, I just want all reviewers to stop calling them by their advertised names and refer to them as one full tier lower.
The recommendation stays the same, but is now based on the price increase over the previous gen. Instead of "worst 70 series," it's now "A great 60 series, but at the worst price ever: GeForce RTX 5060 Review."
The 5070 Ti becomes the 5060 Ti.
The 5080 becomes the 5070 Ti.
The 5090 becomes the 5080.
Then reviewers can call this the worst price-to-performance generation, but still truthfully claim that Nvidia maintains generational performance gains.
ancientemblem@reddit
This is why I don't upgrade unless there is a node shrink. A lot of the performance uplift we get between GPU generations comes from improvements in process nodes; both the 4000 and 5000 series use TSMC's N4P. I'm expecting a decent increase once they move to N3 or N2.
Gippy_@reddit
Yup. Went from an i5-6600K to an i9-12900K. Intel was stuck in 14nm+++++++++++ until they finally got 10nm working.
szczszqweqwe@reddit
A roof, what a madman.
FasthandJoe@reddit
Glad I got my 4070 Super for $589 back in October...
Valkyranna@reddit
Got mine for £499! Great card, having zero issues playing everything at 4K.
maiiiiixo@reddit
I felt like a sucker buying a 4070 Super ~4 weeks ago for £465, since I could have splashed a little more for a +20% uplift in performance and better efficiency with a 5070.
They are now ~£600 and rising. I'll almost certainly be buying a 9070 XT if the reviews hold up.
lucavigno@reddit
I've just finished watching the review from one of the big tech YouTubers in Italy, and they sounded very offended by the 5070.
I'm really hoping for the 9070s to be good.
HLumin@reddit
LMAOOOO NO WAY HE WENT TO THE ROOF
Lukeforce123@reddit
He's gonna have to launch to the moon for the 5060
III-V@reddit
Wasn't the slow RAM debacle a 70 series? Is it worse than that?
i7-4790Que@reddit
That card still put up numbers against the prior flagship 80-series. And it was quite literally ~half the MSRP of the 780 when it came out.
And it had more VRAM than the 780.
Jellyfish_McSaveloy@reddit
The R9 200/300 series really was a great AMD generation/refresh before being dismantled by Pascal.
BaconatedGrapefruit@reddit
The RX 400 and 500 refresh were also money generations for AMD. They were a very good option for the midrange and cost AMD basically nothing to make. I remember them being everywhere.
It was the Fury series that did them in.
GenZia@reddit
Goes to show where Nvidia's priorities now lie.
Blackwell had no business being on N4.
At the very least, Nvidia should've increased die sizes across the board, yet they did the opposite. The GB205 is actually slightly smaller than AD104 (263mm² vs. 294mm²), with 10 fewer SMs (50 vs. 60) and roughly ~5Bn fewer transistors.
The sad thing is, people will be buying this card for ~$800 in the coming days.
FS_ZENO@reddit
The only thing for me is that I expected it to be worse than the 4070 Super, but it isn't, so yeah. It does draw more power than the 4070 Super though... using TPU data.
Nvidia is now making the xx70 series very cut down, just like the bad xx60 series: no performance increases, just matching the previous generation's xx70/xx60.
Flimsy_Swordfish_415@reddit
I must say: HAHAHAHAHA
Psyclist80@reddit
Gross... Nvidia doesn't care about gamers anymore. The "game" has changed, so to speak.
IcePopsicleDragon@reddit
Worse than the 4070 Ti for more money. This generation is baffling; good thing we have AMD.
VanceIX@reddit
But I was promised 4090 performance?????
DeathDexoys@reddit
Lmaaoooo he really went to the roof, the 5060 series is gonna have him jumping off a building.
fatso486@reddit
It'd be an excellent 1080p card if it were half the price.
skinlo@reddit
He's literally on the roof, you know it's going to be a good one.