Lisa Su says Radeon RX 9000 series is AMD's most successful GPU launch ever
Posted by xenocea@reddit | hardware | View on Reddit | 233 comments
abbzug@reddit
I feel like they could clean up if they came out with a good 9060xt. Market is dire below $400.
Zoratsu@reddit
The second-hand market eats anything under $400 alive.
Because why would I buy a new $350 GPU when I can get an older-gen card for $350 that is better in all aspects?
Maybe efficiency is better, but not many people will care about that.
Strazdas1@reddit
But it's not better in all aspects, because it's running on old tech. For AMD especially this is true, as older gens do not support the AI upscaler, which is one of the biggest selling points of the 9000 gen.
Zoratsu@reddit
So tell me, will this new $350 GPU be better than a second-hand $150 3070S if we do $/FPS?
Maybe if we do 320p upscaled to 1080p with path tracing for both, but none of the games I play even have RT, so...
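To make the $/FPS framing concrete, here's a minimal sketch; every price and frame rate in it is a hypothetical placeholder, not a benchmark result:

```python
# Minimal $/FPS sketch. All prices and frame rates are hypothetical
# placeholders, not benchmark results.
def dollars_per_fps(price_usd: float, avg_fps: float) -> float:
    """Cost per frame of average performance; lower is better."""
    return price_usd / avg_fps

used = dollars_per_fps(150, 60)   # a used card: $2.50 per FPS
new = dollars_per_fps(350, 90)    # a new card: ~$3.89 per FPS
print(f"used: ${used:.2f}/FPS, new: ${new:.2f}/FPS")
# On raw $/FPS the used card wins here; new-only features (upscaling,
# mesh shaders) only flip the result if they change the FPS side.
```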
Nervous_Border_4803@reddit
Used 3060s aren't 150, let alone a 3070. A 3070 is around 300, and you aren't factoring in that only a minority of people would even consider a used GPU.
150 right now on the used market will get you a 3050 or a 2060. Even Radeon GPUs aren't going that cheap; a 6700 XT is 300 dollars.
Strazdas1@reddit
It depends on what you are trying to run on it. Let's take an example: Alan Wake 2 requires mesh shaders. If your used GPU for 150 does not support mesh shaders but your new 350 GPU does, then the performance on the old one will be so bad the new one will be doing laps around it in terms of dollars/FPS.
Zoratsu@reddit
If we are going to use specific tech to gatekeep then let's put a PhysX game in the competition too.
Strazdas1@reddit
The tech was an example to make the point that sometimes new tech does indeed matter a lot.
no_salty_no_jealousy@reddit
Intel is the only savior. People somehow seeing this BS AMD news as "positive" is just stupid. We shouldn't praise a mid-range GPU which is overpriced like the 9070 XT, which sold at $700, like wtf? That's not mid-range pricing.
I hope Intel keeps kicking AMD's ass with Battlemage and soon with Celestial; if they sell a mid-range GPU for around $400 they're already winning!!!
Kozhany@reddit
The 9000 series launch was arguably ATI's best, too.
Farfolomew@reddit
Agreed! That Radeon 9700 Pro might have been the last time ATI/AMD was ahead of Nvidia across the board. The subsequent GeForce 6000 cards were impressive when released, and even though the X8xx series Radeons were good, they weren't as good as Nvidia's. Those were the very last of the AGP-generation cards.
_zenith@reddit
I liked the 5000 series too. I had 2 of the 5970 cards - they used an internal Crossfire bridge, as that card was basically 2 of the 5870 GPUs combined with this bridge. As such, having 2 of these cards in Crossfire made for a quad-Crossfire configuration. It was absurd, and awesome haha, even if it often didn't get the performance it should have. On the games that did properly utilise it, it absolutely annihilated everything else.
IIRC, I had this paired with the first of Intel's integrated-memory-controller architectures, Nehalem, the Core i7 920. This clocked at 2.66GHz, but I could run mine comfortably at 4.2GHz, on air even. Ah, the glory days of overclocking... anyway, having this rather fast (for the time) CPU was very much necessary to drive this quad-Crossfire setup. I think Nehalem was intended as a prosumer product, being derived from their newest server architecture, but it quickly became popular among performance enthusiasts, and for very good reason... it totally blew away previous generations of CPUs with their comparatively primitive front-side-bus method of memory access (other than AMD's Opteron, with its HyperTransport, by which Nehalem was very obviously inspired. BTW, today's Infinity Fabric is closely related to HyperTransport 🙃)
biciklanto@reddit
With how the Nvidia 5000 series launch has gone, I anticipate buying the 9070 XT from Sapphire.
I figure any decent card is going to be a huge upgrade from my GTX 1080 😂 and will pair much better with my 9950X3D, as that + a 1080 is just a ridiculous combo at this point.
akdjr@reddit
Using my 9950x3D with my old 2080ti :p. The sad reality is that I need the vram of the 5090 for work :(
biciklanto@reddit
Can you tell me about your use case? I'm curious because I partially just think, well, I've spent what I have, might as well top it out (and have 128gb of total [V]RAM in my system).
akdjr@reddit
Yep! Working on a non-gaming application of Unreal Engine with multiple displays - we end up rendering multiple worlds simultaneously, with our current version requiring a large amount of VRAM, mainly for multiple frame buffers. We're working to optimize, but some of the scenes we're using need more than 16GB.
RealOxygen@reddit
9950x3d + 1080 is a diabolical combo
INITMalcanis@reddit
I mean at least they're 100% certain they're getting the absolute most out of their GPU in all circumstances...
PT10@reddit
That poor card just needs a break
djseifer@reddit
1080: I'm tired, boss.
arryporter@reddit
Gf4 ti 4200.. im dyin baws.
INITMalcanis@reddit
He's worked a long day :(
Infiniteybusboy@reddit
In most 4k games is there even a real chance of getting bottlenecked with any half modern cpu?
Strazdas1@reddit
Yes, depending on what you play. I can give you examples where you'd be CPU bottlenecked in 4K even with a 3050.
Sasja_Friendly@reddit
This might answer your question: https://youtu.be/m4HbjvR8T0Q
Infiniteybusboy@reddit
While I may have missed it, he didn't really measure ray tracing in any of these titles.
grumble11@reddit
There are certain CPU heavy titles where it matters, like some sim games and so on. A powerful CPU also helps with 1% lows, which improves the smoothness of the experience by reducing that 'jitter' feeling.
Strazdas1@reddit
Not even. Plenty of games that will CPU bottleneck in this setup. I had 7800x3D and a 1070 combo bottleneck on CPU before :)
xenocea@reddit (OP)
It'll definitely be a monumental upgrade for sure going to the 9070 XT. I previously went from the good old 1080 Ti to a 4070 Super. My frame rates have literally doubled in raw rasterization. This is without using DLSS or frame gen.
You, going from a non-Ti to the 9070 XT, which is faster than my 4070 Super, will see an even bigger gain than I did.
marxr87@reddit
kinda crazy it only doubled over 10 years and 4 gens. really goes to show how slowly upgrades are coming these days, and that most people don't need to upgrade even every other gen... maybe every 3rd gen.
Matthijsvdweerd@reddit
It was always comparing flagship to flagship, so I don't really think this is a fair comparison. Take 1080ti vs 4090 and it's a whole different story.
marxr87@reddit
i mean, that's too generous in the other direction. xx90 is much more similar to titan class, although it isn't a 1 to 1 and now there are super and ti super, etc. just do 1080 ti vs 4080 ti. and it's not much more than double
Matthijsvdweerd@reddit
Keep in mind that the 4080 should have been more like a 4070 Ti/4070 at most, because of the Nvidia 'naming scandal'. There is no 4080 Ti, only a 4080 Super. So I think, even though it seems unfair because it sits a tier above and is more than triple the price, MSRP vs MSRP, comparing against the 4090 makes sense, to me at least.
Drict@reddit
Depends on your use case. I noticed a decent bump and smoothing out of my experience WITH max settings from a 3080 to a 4080 Super. I also was just shy of my target with the games etc. and I needed just a touch more power. Had I been on a 3090, I wouldn't be upgrading until the 7000 series.
Infiniteybusboy@reddit
Stuff is a bit tighter with 4K, on account of even top-of-the-line cards struggling with it, but short of a big breakthrough, GPUs have basically flatlined.
king_of_the_potato_p@reddit
I was hesitant myself a few years ago; I'd had Nvidia for the better part of the last 20+ years. I picked up an XFX 6800 XT Merc back in 2022 for a fairly low price and it's been great.
At the moment it's beasting, undervolted to 1015mV and clocked at 2400MHz on the core.
alpharowe3@reddit
My favorite thing about switching from Nvidia to AMD was the Radeon software, but then I like constantly tinkering with settings.
king_of_the_potato_p@reddit
Oh for sure, AMD's Adrenalin was way ahead of Nvidia. Even with Nvidia finally moving away from the old control panel and GeForce Experience, it's still lacking.
I was able to undervolt down to 1015mV with a 2400MHz clock on the core.
BioshockEnthusiast@reddit
GeForce Experience will remain the lesser of the two software suites until they stop with the account shit.
_Fibbles_@reddit
The nvidia app doesn't require a login
michiganbears@reddit
I'm in the same boat. I have a 1050 right now and just got the 5070 to go along with a 9800X3D. Even with the 5070 not being a huge upgrade from the 4070, it will be a huge upgrade for me. I also went with Nvidia rather than AMD since I know it will outperform in Adobe programs.
biciklanto@reddit
Adobe is the single biggest point that could hold me back from the 9070 XT. There's part of me that thinks that just biting the bullet once and buying a 5090 might be the right move just to know I'm covered for a good while.
michiganbears@reddit
Hoping the next gen or two we start to see them close that gap some more
Zenith251@reddit
Fucking loving my ASRock 9070 XT. Steel Legend.
slighted@reddit
i just moved up to a (sapphire) 9070 xt + 9950x3d from a 1080 + 6700k
4k ultra on everything. my large format files are flying in photoshop. this is really stupid considering the components, but even web browsing with 100s of tabs is extremely fast and responsive now lmao.
SEI_JAKU@reddit
I still want one, my Micro Center just got some in stock, but I just don't have the cash. Gonna wait and see what the 9060 XTs look like.
TerriersAreAdorable@reddit
Months of stockpiling was great for the first week, but those cards are all sold now. The real test is the first quarter.
pewpew62@reddit
Is it? The problem is that scaling up supply to meet demand takes a long time. They can't just do it in 2 weeks; it will take months and months, probably more like half a year, until we see the effects of the increased supply.
Kionera@reddit
In the Chinese market there's actually no shortage of supply; you can readily buy 9070 XTs online, albeit at 20% inflated prices, and the 9070 at only 5% inflated prices. Prices have been steadily dropping over time too.
It seems like the Chinese market is a major focus for AMD, which explains the positive reception there.
danny12beje@reddit
Can confirm.
My country has them too. I wouldn't call them inflated prices, since VAT, import tax and retailer profit are what keep the GPUs around 800 euros (the 5070 Ti is around 1100), and a few retailers still have GPUs in stock.
sdkgierjgioperjki0@reddit
MSRP means street price as in the actual price you are supposed to pay and it includes taxes and retailer profits and everything else. They are absolutely inflated.
fatso486@reddit
Hehe, WT*, why are you downvoted to oblivion? What am I missing? MSRP + 20% VAT is $720. 800 euros is $862.
TwilightOmen@reddit
The person is being downvoted because he or she said something that is completely wrong. Both statements in the post are false. MSRP is not street price, and MSRP does not include taxes.
Strazdas1@reddit
MSRP is not street price. MSRP DOES include taxes outside of the US. For example, European MSRP is with VAT included.
TwilightOmen@reddit
So, can you tell me the "european" MSRP then? And what is the %tax in "europe" ? Oh wait, IT VARIES FROM COUNTRY TO COUNTRY? Nooooo... You don't say!
Strazdas1@reddit
Yes, it varies from country to country. For example, the 5070 Ti MSRP in Germany is 899 euros. Note that since MSRP is only a suggested price, they don't really need to account for each minute variation in taxation. The retailers will do that on their own.
TwilightOmen@reddit
https://old.reddit.com/r/hardware/comments/1jjggqp/amd_ceo_radeon_rx_9070_xt_first_week_sales_10x/mkngd01/
TwilightOmen@reddit
MSRP does not mean street price. I mean, the acronym literally means Manufacturer's Suggested Retail Price. Aka, a price suggested by the producer. It has nothing to do with what cards sell for, and does not contain taxes.
Strazdas1@reddit
MSRP does in fact contain taxes outside of US.
TwilightOmen@reddit
No, it does not.
Strazdas1@reddit
Yes, it does.
TwilightOmen@reddit
https://old.reddit.com/r/hardware/comments/1jjggqp/amd_ceo_radeon_rx_9070_xt_first_week_sales_10x/mkngd01/
danny12beje@reddit
Not all countries have the same taxes, wtf do you mean?
Any-Ingenuity2770@reddit
MSRP is set per-country to account for that
danny12beje@reddit
Can you show me where that msrp per country list is? Officially posted by AMD.
Thanks
apmspammer@reddit
It's not necessarily public but is told to retailers. I am not defending the system just explaining it.
danny12beje@reddit
And what's your basis for this? I know people who do procurement for one of the biggest retailers in my country, and guess what.
They don't even buy from AMD. Because that's how this works. AMD doesn't go to each retailer on the planet and tell them, personally, how much they should ask for a product, my guy.
Matthijsvdweerd@reddit
Each country has different VAT/tax/etc. So naturally, it's going to change the price. It's probably along the lines of: 600 USD × currency exchange rate + tax percentage of the total, in local currency = local MSRP.
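As a minimal sketch of that formula (the exchange rate and VAT below are illustrative assumptions, not official figures):

```python
# Sketch of: local MSRP = USD MSRP * exchange rate, plus local tax.
# The exchange rate and VAT here are illustrative assumptions.
def local_msrp(usd_msrp: float, usd_to_eur: float, vat_rate: float) -> float:
    """Convert a pre-tax US MSRP into a VAT-inclusive local price."""
    return usd_msrp * usd_to_eur * (1 + vat_rate)

# $600 card, assumed 0.92 USD->EUR rate, 19% German VAT -> ~657 EUR
print(round(local_msrp(600, 0.92, 0.19)))
```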
tukatu0@reddit
VAT is an import tax. The retailer's cut is what inflates the price. They don't pay $600 and sell them to you for $600.
Even assuming they did pay $750 for that pre-launch stock, in € that's €675. And if you assume they will be paying $600 moving forward, that's €550, or €650 after VAT.
At €800 it is as if you are paying more than double the VAT. How is that not inflated?
GrumpySummoner@reddit
Same here. Decent amount of 9070XTs in stock now if you're willing to pay €800-850.
PT10@reddit
Same for Nvidia though. There's plenty of 5090s in Asia.
It seems North America and Europe are no longer the primary market. At least AMD did allocate a launch here.
Acrobatic_Age6937@reddit
You won't have an issue getting 5000 cards in the EU. You just won't like the price.
Strazdas1@reddit
Even that has gone down now. Not to MSRP, but it's not crazy scalping either: a 5070 Ti for 900 euros post-tax.
gahlo@reddit
Are there plenty, or are the ones that are there not selling?
LavenderDay3544@reddit
Is America not a major market anymore for an American company?
Killmeplsok@reddit
True, there are no shortages in Malaysia either, but then we don't really have stock issues most of the time, except for the most extreme periods, NV cards included, so I'm not surprised.
li_shi@reddit
It's not that much once you consider China has a 13% sales tax baked into the price.
VampyrByte@reddit
Doesn't seem like excellent stock here in the UK, but they are certainly available. Still some 7900 XTXs available in places too.
I don't like the prices, but they are not massively inflated.
Strazdas1@reddit
There's no shortage of supply. Everything is in stock now. The window where you would sell anything you release is over.
Anfros@reddit
They can't really scale up production. There is a fixed amount of fab time available, and consumer GPUs have margins too low to be worth dedicating more fab capacity to.
996forever@reddit
And for that reason, the first week would be just as useless as the first quarter
SJGucky@reddit
In Germany 9070 XTs are readily available at 800€; MSRP is 689€ for comparison.
The 5070 Ti is still barely available below 1000€; retailers are keeping prices high.
I'd say everyone here who wanted an AMD card above 700€ has bought theirs.
Strazdas1@reddit
5070 Ti MSRP in Europe is 899 euros. I can find plenty in the 900-1000 range here.
SJGucky@reddit
5070Ti MSRP in germany (19% tax) is around 829€, but the cheapest card is 949€ currently.
Strazdas1@reddit
According to Nvidia, MSRP for Europe is 899 euros. Where are you getting the 829 from? Are you just taking American MSRP and adding tax? Because that's not right.
SJGucky@reddit
Yes and no. I guess the price is more like 849€, but not 899€.
Nvidia already made an adjustment to the price because of currency exchange. The 5070 FE was lowered to 619€.
But since there is no FE model for the 5070 Ti, there is no "official" currency adjustment.
That is the problem of not having an actual MSRP.
Strazdas1@reddit
Nvidia does seem to make adjustments after the fact, so the price may change, yes, but the original MSRP was 899 euros. You don't need an FE to have an MSRP.
Firefox72@reddit
Cards have been available in the EU for a while now.
IIlIIlIIlIlIIlIIlIIl@reddit
Also the initial stock was enough so that anyone that wanted to buy a card in the first couple of days at MSRP could, at least in the UK.
I've never seen NVIDIA cards at MSRP actually be available, I imagine scalpers and bots lap them up instantly, but I did curiously check AMD the day after release and there was MSRP stock.
chefchef97@reddit
Blatantly false, no cards have been sold at MSRP in the UK since the first 35 minutes of release. 2 hours if you count people trying and failing to buy from OCUK as it was broken the whole time.
In no universe were there cards for hours, let alone days.
Strazdas1@reddit
I can't speak for the UK, but in my Eastern European country prices weren't that far from MSRP.
IIlIIlIIlIlIIlIIlIIl@reddit
Eh, I checked both at 6PM and around 2PM the day after, and it was available. Even shared it with my friends, and 2 of them bought one, as they hadn't even bothered checking because they assumed it'd be gone within minutes like with Nvidia.
incognataa@reddit
Did you try to get a card when it released? In the first few hours, if you were lucky enough to get through to checkout, then yes, it was MSRP. But after a few hours the price increased.
LavenderDay3544@reddit
Meanwhile Nvidia barely allocates a single wafer a quarter to GB202 dies.
Hetstaine@reddit
Yep. Still trying to get a 9070 XT Pulse. No joy.
ReplacementLivid8738@reddit
Should look for a Pure instead.
Yasuchika@reddit
Cards aren't sold out in Europe at all, they're just price inflated by a massive margin and not moving because of it.
FriendlyBlanket@reddit
There was a restock at Best Buy for Gigabyte and XFX. I missed the order online, went into the store, and they were able to order a card to my house. Expected delivery is around a week and a half.
Rentta@reddit
A week? A couple of days max here in the EU.
shugthedug3@reddit
They sold out in a few minutes here at the fake price, nobody seems very interested in the real MSRP though which is understandable.
TheCatOfWar@reddit
Was it? I couldn't buy one on day 1 despite trying for hours
iBoMbY@reddit
There are many available in Germany at least, only the vendors still try to keep the prices up.
chipface@reddit
An anecdote of mine. I saw a few when I went to Canada Computers the other day. And this wasn't first thing in the morning. This was around 4:30PM.
surf_greatriver_v4@reddit
*In the USA
LettuceElectronic995@reddit
according to whom?
zimbabwatron9000@reddit
She talked specifically about the 9070 XT (not the "9000 series") outselling their previous cards.
It's a little misleading measuring such a short period after the card was stockpiled; let's see the next 3 months.
Nevertheless, it's good for everyone if they really do well; then Nvidia will have to put at least a bare minimum of effort into their next cards again.
no_salty_no_jealousy@reddit
This post feels like BS to drive AMD's stock price, but hey, it's Lisa Su and r/hardware will "forgive her" for her lies.
Strazdas1@reddit
The 9070 XT IS the 9000 series. The 9070 is just defective XT dies, and the 9060 XT isn't released yet.
cagefgt@reddit
I mean, the bar isn't really that high.
BlueGoliath@reddit
It really wasn't.
cagefgt@reddit
100 GPUs sold in a week is, indeed, 10 times more units than their predecessors.
BlueGoliath@reddit
When AIBs started dropping out, you know things were bad.
Joezev98@reddit
Yeah, we really should have seen the awful GPUs coming when EVGA exited the GPU market.
cagefgt@reddit
[Indeed](https://www.tomshardware.com/pc-components/gpus/msi-skips-rdna-4-and-will-not-manufacture-amd-radeon-9000-series-gpus)
MumrikDK@reddit
You say that like AMD wasn't the big dog for some past generations.
The GPU market used to have proper competition between the two.
no_salty_no_jealousy@reddit
"Most successful" isn't really success when you sell overpriced garbage GPU 3x more than what it should be priced. This trash BS post is just exists to drive Amd stock market BS.
Amd can't sell mid end GPU at $400, i hope Intel would kick Amd shit ass with Battlamage and Celestial.
ResponsibleJudge3172@reddit
Blah blah blah mindless Nvidia drones and Nvidia mindshare, etc etc rubbish excuses start wavering
Flynny123@reddit
Can these really be selling better than the entire 6000 series, which are actually pretty great and went properly toe-to-toe with nVidia for the first time in years?
Roph@reddit
They were overpriced relative to the competition, the 9070/XT isn't (as much).
ykoech@reddit
NVIDIA handed them this.
Kashinoda@reddit
Intel did the same on CPUs too. If you stand still or release shit, eventually competition appears in the rearview mirror. You still have to grab the opportunity, which AMD has.
NuclearReactions@reddit
Yep, and not just due to low stock as many think, but also because of the connector fire situation, low uplift and high pricing. I always get Nvidia; I've only had one ATI and one AMD in my life. I could have waited for a 5080 or 5090 to show up, but I prefer waiting for the next wave of 9070 XTs: take the compromise in performance but have a card with a somewhat decent price/performance ratio.
conquer69@reddit
Why does Nvidia have low supply? Are they using all the chips for AI and the prosumer market?
ModernRonin@reddit
NVidia isn't publishing much in the way of numbers, and TSMC isn't talking at all as far as I know. So those of us out here in the real world trying to buy GPUs can't be certain of anything.
That said, Paul's Hardware recently said, based on a number that NVidia gave at GDC in Taiwan about two weeks ago, that approximately 5-6% of NVidia's share of the chips TSMC can produce went to consumer GPUs. The math isn't hard... 100% minus 5-6% equals 94-95% of NVidia's chips going to AI datacenters and other corporate customers. Not to gamers. See: https://www.youtube.com/watch?v=EgZnpN-xFaY&t=107s
If you want to get some idea of how much of an insanity-level cash cow AI datacenters are for NVidia, skip to 8m25s in this video: https://www.youtube.com/watch?v=8VGJ3UGDdhM . Basically, NVidia earns about 21 times more money per chip die from an NVL72 AI accelerator card than from a consumer RTX 5090. So that's where your 5090 went - to some dumbshit Tech company executive, currently blowing 10 billion dollars on a data center based on the stupid AI/LLM fad.
IOW to train LLMs that, no matter what ridiculous lies Sam Altman may spew, 1) do not "think" in any meaningful sense of the word, 2) do not "understand" anything in any meaningful sense of the word.
AMD is actually jealous of how insanely NVidia is soaking these low-IQ CEOs, and they recently signed their own deal to deliver 30,000 AI accelerator cards with AMD chips, to Oracle. See: https://www.techradar.com/pro/amd-just-signed-a-huge-multi-billion-dollar-deal-with-oracle-to-build-a-cluster-of-30-000-mi355x-ai-accelerators
So if you're wondering why there don't seem to be enough 9070s (/XTs) for all the people who desperately want them, and why AMD's claims about "more supply coming in April" don't pan out... well, now you know where all of AMD's TSMC GPU chip output went.
Strazdas1@reddit
I don't want LLMs to think. LLMs are tools and should be used as such.
ModernRonin@reddit
You're very much smarter than most of the Tech Company Execs throwing billions at AI datacenters.
Strazdas1@reddit
Well, I do want a singularity event at some point, but LLMs ain't it.
ModernRonin@reddit
Likewise. Nothing wrong with LLMs, but we aren't gonna get AGI (much less anything past that) out of them.
ModernRonin@reddit
I don't blame TSMC, BTW. They are making chips as fast as they can. And nobody else can make chips with the insanely tiny 5nm type features that TSMC can.
It's NVidia who orders the chips from TSMC, and NVidia's choice who to sell those chips to. NVidia are the ones to blame for the shortage. And NVidia are the ones who continue to lie about it - blatantly. See: https://www.youtube.com/watch?v=UlZWiLc0p80
vHAL_9000@reddit
Nvidia is a publicly traded company with a fiduciary responsibility to its shareholders. What you're proposing is illegal.
Imagine investing in a company, which then promptly decides to sell its product at 5% of the market price to salty video game players for no good reason. Your investment would go up in smoke!
Strazdas1@reddit
This is not true, and a gross misinterpretation of the law. The fiduciary responsibility is much broader than quick cash-out schemes. Nvidia has an excellent argument that gaming products have been the test bed and market creators for the AI environment ever since CUDA launched in 2006. Long-term stability and profit is a much higher priority than short-term gains under the fiduciary responsibility.
ModernRonin@reddit
The only thing I'm proposing is that NVidia quit with the bullshit, and just straight up tell us what we already know: That ~95% of the chips they get from TSMC are being sold to AI datacenters, and that this is (obviously) starving the consumer GPU market.
How will NVidia's profits go down from just saying the plain facts that we all already know? How are they abdicating their fiscal responsibility by stating the bleedingly obvious? Are the dipshit Tech Company CEOs who are dumping billions into AI datacenters going to stop buying? Not a damn chance!
Continuing to pretend like the consumer GPUs market isn't drastically underserved just makes people hate NVidia, and drives them toward AMD and Intel GPUs. It isn't in NVidia's long-term best interest. And the most annoying thing is... acknowledging the plain reality of the situation... is free! It literally costs them zero dollars!
Continuing to lie about the current situation is more work, and all it accomplishes is to make average people hate them. Why expend extra effort, just to piss people off? It makes no sense.
vHAL_9000@reddit
What for? Everyone knows. Start paying 50k per die and they'll take you seriously gamerboy.
advester@reddit
Because gaslighting is offensive
ModernRonin@reddit
If honesty isn't something you value, then I see no point in attempting to explain to you why it's important. "Don't teach a pig to sing" and all that.
I don't want NVidia's respect any more than I want one of their insanely overpriced 5000 series GPUs.
And so I'm not gonna play NVidia's stupid games, with NVidia's stupid rules.
"Play stupid games, win stupid prizes." I'm not stupid enough to give NVidia my money.
NVidia can suck the rotten shit from my zitty gamer asshole.
grumble11@reddit
It makes me wonder if INTC really would have had a win on the foundry side, since TSMC can't keep up with AI demand. It got scaled back so now who knows, but it could have been quite the thing.
_zenith@reddit
I doubt it, simply because there is a severe conflict of interest: most of what customers would wish to have fabbed there, Intel also makes (as in, the type/category, not the exact chip) themselves as products. As such, there is very understandably a fear that Intel will take their IP and repackage it. It would be very, very difficult to prove they did it.
TSMC doesn’t have this issue.
ModernRonin@reddit
I heard that the Arc GPUs were Gelsinger's idea. If that's true, I commend him for being very forward-thinking. The NVidia/AMD pseudo-duopoly isn't great, and I'm happy to see another player in the market. If AMD pisses me off the way NVidia has, I will be turning to Intel for a GPU. It may even happen later this year, depending on how many 9070 XTs actually end up for sale at MSRP in the USA.
Zarmazarma@reddit
I mean... does anyone really care about that? I want LLMs to be able to interface with computers using human language: ask them questions in natural language and get good answers. I don't really care if they think or understand what I'm asking, lol. That basically has nothing to do with the value proposition of LLMs.
ModernRonin@reddit
Everyone is going to have to decide that for themselves. If a "stochastic parrot" that basically spits back an encyclopedia entry when asked about a certain topic is good enough for you, then go nuts with LLMs.
I'm not actually an LLM hater. I think LLMs are neat, and I absolutely think they are a good form of AI.
I just think that some of the things that Altman and other people with billion-dollar investments in bullshit AI hype are saying, are utterly stupid and completely wrong. In other words, what I hate are stupid rich human beings... not artificial neural networks.
tukatu0@reddit
Being an encyclopedia searcher is nice and all (though it's been pretty sh*** for me since they silenced 3.0, so I don't really agree).
But have you seen this? https://old.reddit.com/r/ChatGPT/comments/1jjyn5q/openais_new_4o_image_generation_is_insane/
Strazdas1@reddit
It's amazing at generating situations that are expected, not so much at more niche situations. I use an image generator for a TTRPG I run on weekends. It includes characters that aren't human, and boy does the generator not like painting that. It takes a lot of tinkering to get it to do anything good; 'hybrid' is apparently the most useful keyword for this.
ModernRonin@reddit
Fun stuff! This kind of thing is a big part of the reason I don't hate generative AI.
The Van Gogh style Roll Safe, in particular, had me lol'ing. I love that meme!
Strazdas1@reddit
Altman was spewing bullshit even before he veered off into LLMs. Look up some of his old panel discussions, he was always full of himself and made ridiculous statements.
Baggynuts@reddit
Honestly, the lies are mostly not for us though, they're for the people with more money than brains. Altman's doing the same thing Musk did: create a hype train to relieve dipshits of their money. He's a hype-man. That is all. 🤷♂️
ModernRonin@reddit
Agreed!
INITMalcanis@reddit
"and get good answers" is kind of the issue. LLMs can get really good at tasks of the kind 'go look up this information I could get for myself but don't want to' but they're dangerously useless for tasks of the kind 'I need you to actually understand the subject matter and intuit what I'm doing with it' because they'll give you answers that seem like they do, but they really don't. A
nd the AI hypists are absolutely conflating the one with the other.
PMARC14@reddit
I am pretty sure there aren't enough 9070s because AMD planned for expected demand based on previous sales, so when Nvidia emptied crumbs from their pockets for consumers, AMD was not prepared for the demand. The MI355X actually uses TSMC N3, so it doesn't steal capacity from consumer products the way Nvidia's does. Radeon's main production competition has always been Ryzen CPUs, so if you are out buying AMD laptops, that is one less graphics card.
ModernRonin@reddit
I don't understand why laptops are relevant. Any Ryzen CPU die, laptop or desktop, is in competition for TSMC's manufacturing capability with RDNA4 dies. Do I understand correctly?
PMARC14@reddit
Ryzen desktop is on TSMC 5nm + 6nm, so it doesn't compete as closely. But all of Ryzen Mobile, which is a very hot commodity compared to Radeon desktop GPUs, uses TSMC 4nm just like the 9070 and 9070 XT. That overlap has been the case in previous gens as well, so the Ryzen division typically gets priority for sourcing wafers, especially if the number of TSMC orders is limited by their demand; no different than Nvidia using all of their TSMC wafer allocation on enterprise.
ModernRonin@reddit
Thanks for the clarification! I understand now.
PMARC14@reddit
It is kind of funny, because part of the popularity of Ryzen Mobile is its very powerful Radeon iGPUs (especially with the launch of handhelds), but they always suck all the air from the Radeon discrete products. Also, I forgot to mention consoles as well.
WarOnFlesh@reddit
AI is getting real good, real fast.
n19htmare@reddit
Yah pretty much. They have finite resources and capacity at TSMC....you either use it on GPUs or something that will bring in 15x the revenue. For any corporation, the answer is pretty clear.
Acrobatic_Age6937@reddit
Which begs the question: how is AMD pulling it off? Their current B2B cards are pretty solid, so the demand should be there. Are they intentionally bleeding money on the B2B market to effectively 'buy' gaming marketshare at the moment?
Strazdas1@reddit
They aren't. First, they have been stockpiling GPUs for months at retailers. Secondly, they don't really have any B2B demand, because their cards aren't in fact solid.
Acrobatic_Age6937@reddit
Look up benchmarks. The cards are benching faster than Nvidia's. They do lack on the software side.
Strazdas1@reddit
Benchmarks don't matter if you can't back them up with real-life use. And sadly, in real life they just don't stand up to what the current demands are, except in the currently much lower-demand cases like weather-pattern prediction.
n19htmare@reddit
AMD doesn't have the same demand.
There was an article the other day saying Nvidia shipped 3.6 million Blackwell GPUs to just 4 cloud service providers alone... that kind of demand doesn't exist for AMD.
Those kinds of figures are also indicative of where the majority of their supply is going, and it's not towards filling consumer GPU demand.
PMARC14@reddit
They don't have enterprise AI demand nearly as much as Nvidia.
acc_agg@reddit
Yeah, even the flagship consumer grade GPUs make them a fraction of the revenue that putting that silicon in AI cards does.
falcongsr@reddit
like more than 10x the revenue per chip.
acc_agg@reddit
1/10 is a fraction.
falcongsr@reddit
big if true
AbhishMuk@reddit
No no 1/10 is rather small
Ok-Board4893@reddit
I wrote this comment a while ago and got downvoted to hell because people said the AI chips are limited on something else (was it packaging? idk).
f3n2x@reddit
They are, and the comment is wrong.
joe1134206@reddit
Refusing to elaborate is always a good sign.
WarOnFlesh@reddit
They are different chips, but it all comes down to how many wafers they can order from TSMC. Let's just throw out round numbers and say they can order 100 wafers per month from TSMC.
For every wafer they buy, they have to decide if it is going to be an industrial/datacenter/AI wafer or a gaming GPU wafer.
Every AI wafer makes them $100. Every gaming GPU wafer makes them $10.
So every time they choose the gaming wafer, they are leaving $90 of profit on the table.
Again, these are fictional numbers, but hopefully you see the point. If they know for a fact that they are going to sell out of both types, no matter how many they make, why would they choose to make ANY gamer GPUs?
Well, the short-sighted business answer is they shouldn't. They should completely pivot to 100% AI/datacenter chips.
But the long-term business answer is they don't want to put all of their eggs in the AI basket, and they have brand recognition and a reputation. Those things aren't free and they are definitely worth something. So even though they are losing potential profit, they still think it's worth it.
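A quick sketch with those same fictional numbers, showing how the forgone profit scales with the gaming allocation:

```python
# Opportunity-cost sketch using the fictional numbers above:
# 100 wafers/month, $100 profit per AI wafer, $10 per gaming wafer.
WAFERS_PER_MONTH = 100
AI_PROFIT, GAMING_PROFIT = 100, 10

def monthly_profit(gaming_wafers: int) -> int:
    """Total profit when `gaming_wafers` of the allocation go to gaming."""
    ai_wafers = WAFERS_PER_MONTH - gaming_wafers
    return ai_wafers * AI_PROFIT + gaming_wafers * GAMING_PROFIT

for gaming in (0, 20, 100):
    forgone = gaming * (AI_PROFIT - GAMING_PROFIT)  # profit left on the table
    print(f"{gaming} gaming wafers: ${monthly_profit(gaming)}, ${forgone} forgone")
```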
WarOnFlesh@reddit
Yes. For every gaming GPU they sell, they are losing out on profit from selling that same silicon to a datacenter doing AI or some other type of computation.
The profits on business GPUs are much higher than on consumer GPUs. They are only staying in the gamer business for the street cred. If they could abandon it tomorrow without the reputation hit, they would.
teh_drewski@reddit
It's not for "street cred", it's for strategic diversity. It's basically an insurance policy for if the AI bubble pops - they don't want to have to rebuild all their corporate knowledge of the market if they can't make windfall profits from LLM creators any more.
Crusty_Magic@reddit
Yes, production is being prioritized for that market segment.
Quatro_Leches@reddit
Yes, last quarter less than 10% of their revenue was from gaming; the rest is all datacenter cards.
Jensen2075@reddit
There's a long wait list for AI chips. AI chips have bigger profit margins. Connect the dots.
CodeMonkeyX@reddit
If they want to maintain any good will, they need to get the prices down to MSRP consistently.
INITMalcanis@reddit
This implicitly means, in current conditions, AMD ramping from supplying 10-12% of the GPU market to 50 or 60% or more. A big ask, considering that plans like this are usually made several months in advance.
If AMD have a particle of sense they'll be seizing this once-in-a-decade opportunity to reclaim some marketshare and mindshare, but even if they go high-priority on it, it'll take months to stabilise prices.
Strazdas1@reddit
Purchases have increased to about 30-40% per reports, but not to 50-60%.
INITMalcanis@reddit
Indeed, and that's "30-40%" of cards that people have been able to buy at a price that they can stomach, not 30-40% of 'true' demand (i.e. the demand that would apply under what is laughably called "normal conditions": the number of GPUs that retail customers would buy at ~MSRP with widespread availability).
AMD might be supplying as much as 15 or even 20% of the 'true' or 'normal' or 'real' or whatever you want to call it market, but they're nowhere close to saturating it. There are still a hell of a lot of people who would like to buy a 9070 XT at £569 or $600 but can't.
jaxspider@reddit
THEN MAKE MORE OF THEM SO WE CAN BUY THEM.
surf_greatriver_v4@reddit
Yep, would love to grab one, but availability in the UK seems dire right now, only a few models available for preorder at the regular shops, the rest you can't even preorder
Strazdas1@reddit
Can you import from the EU? Availability in the EU seems fine.
Mexiplexi@reddit
now time for a 9900xtx
Ultravis66@reddit
Won't happen; AMD has their eyes set on UDNA. It's "supposed" to be really good, but we will see... I'm rooting for AMD! Nvidia needs to be knocked down a notch.
Strazdas1@reddit
Even in victory, AMD never fails to promise "we will fix it next gen".
PhoBoChai@reddit
Imagine making a decent GPU uarch and having stock for launch at decent prices. That's all AMD had to do!
Strazdas1@reddit
Imagine a company that got lucky with CPUs because the competition ate glue, and all they had to do to get the exact same scenario with GPUs was be competent.
littleSquidwardLover@reddit
Yeah, but even if they did, retailers wouldn't pick it up. AMD, year after year, has been the lesser of the two. Hence the joke that they always fuck up on launch, this being the first time they didn't.
Retailers have been burned countless times by buying AMD cards promised to sell, only for them to be severely outperformed and outpriced by Nvidia, leaving retailers with countless AMD cards. So why should they have believed that the 9070 would be any different? Hopefully next time they will order more, seeing that AMD finally holds a candle to Nvidia.
Wrong-Quail-8303@reddit
Don't worry, they will fuck up their goodwill next launch with underwhelming performance and "Nvidia minus $50" pricing.
One would think they would learn from their mistakes. Spoiler: they won't.
puffz0r@reddit
I'm a little more optimistic, since they got a new head of the Radeon division and this is his first product launch; the guy in charge of all the previous launches is gone now. So they're probably learning that, just like with the X3D, gamers will buy the best available products if they're priced decently.
pc0999@reddit
Yet it still costs 2-3x as much as my 6700.
LavenderDay3544@reddit
There goes any hope of AMD making anything to compete with an Nvidia flagship ever again.
metahipster1984@reddit
Good. Now I hope they bring the heat at the high end too!
BlueGoliath@reddit
You know things are bad when people buy AMD GPUs. Will Nvidia change anything? Probably not.
BlueGoliath@reddit
What rule did this even violate. JFC mods.
996forever@reddit
They really, really don’t wanna sell more gaming GPUs than absolutely necessary to keep their software stronghold in consumer graphics.
BlueGoliath@reddit
I think you meant in AI at the end?
"gaming" GPUs is partially why Nvidia has the marketing dominance that they do. CUDA's ecosystem would not exist without mainstream GPU support.
DeeJayDelicious@reddit
One of the few cases where gamers actually did what they said they'd do.
I.e. "give us reasonbly good GPUs at reasonably good prices and we will buy".
TheGreatGamer1389@reddit
Now just keep them in stock.
SubtleAesthetics@reddit
I hope AMD keeps making progress; this 4080 might be my last Nvidia GPU. I mean, I won't be able to buy a 6090 or 6080 regardless, due to no stock or scalpers. I'll gladly buy a 10070 XT or whatever if the raster performance is good. And like I said, even if the 6000 series is good, will you even be able to buy one? Blackwell was boring and is still impossible to find. The new cards will be a new node and a performance leap! Good luck getting a 6080!
chafey@reddit
ATI is going to gain a lot of marketshare, as they will be able to keep their prices lower than nVidia's due to using cheaper GDDR6 RAM and a smaller die size. I don't think the 5000 series is salvageable for nVidia, especially with the incoming recession and trade wars.
TenshiBR@reddit
AMD did the bare minimum while its competitors made every possible mistake, allowing AMD to capitalize on the situation. Although their products don't boast top-tier performance, cutting-edge technology, or the best features, they were priced perfectly, even if the MSRP is fake.
They might have captured even more market share with a lower price, but given that they are sold out in most places, the current pricing was ideal for the moment. There's always the possibility of lowering the price later.
This situation is reminiscent of the PS4 versus Xbox One era. In early 2013, Sony unveiled the PS4, maintaining a straightforward strategy with no radical changes, while Microsoft stumbled with several missteps during its controversial Xbox One announcement. Sony's consistency ultimately helped them dominate that generation, marking a significant misstep for Microsoft at the time. If I remember correctly, Sony's announcement was akin to "we are not doing what they are doing!" and that was it lol
Astigi@reddit
Nvidia is gifting this generation to AMD
ProfessionalWheel2@reddit
I'm so tired of her lies. I've held AMD for three years, and I'm down almost 8% despite her lies and trying to hype the stock. I'm still holding because I know it will pop when she is fired.
XiMaoJingPing@reddit
Sucks that the launch discount is gone, these cards are going for 750+ now
Ok-Arm-3100@reddit
The RTX 5000 series is the most successful launch for AMD. 🤣
littleSquidwardLover@reddit
I'm so tired of this. I'm looking to upgrade, but it's such a pain: the 40 series is hard to find and expensive, and the 7000 series isn't quite as good as the 40 series in RT. NVIDIA just doesn't care anymore, I feel like; the past two generations they've just shit in their hands and served it up. I'm glad to see that this generation is the first time people haven't eaten it up as much, though.
Ok-Arm-3100@reddit
Same boat here tbh. If it wasn't for CUDA cores, I wouldn't be buying Nvidia. I am using my 3080 Ti for gaming and local GenAI LLMs.
littleSquidwardLover@reddit
The 6700 XT has honestly held up pretty well. The drivers have been very good to it, bringing it to about the performance of a 3070 nowadays.
Nourdon@reddit
How is this statement not just as misleading as Nvidia's? Lisa compared a $1000 last-gen GPU to a current $600 (MSRP) GPU. Also, didn't the last-gen GPUs sell so badly that AMD lost marketshare to Nvidia?
Swaggerlilyjohnson@reddit
That statement is ambiguous. We don't know what she means by "predecessors"; it could be the 7800 XT, so it may or may not be misleading, and it probably is (meaning it probably refers to the high-end RDNA 3 launch).
However, she made a different, stronger claim in the same interview: that the 9070 XT is the most successful AMD GPU launch of all time.
That must include launches like the 5700 XT and 4870/5870, which were midrange cards without a high-end option, and it should include any midrange GPU launch where the launch was at a separate time from the high end (like the 7800 XT).
So basically the ten-times number is almost certainly playing it up, but when she says it was the biggest GPU launch ever, she would essentially have to be directly lying, instead of being misleading like Nvidia was.
This is an important distinction, because it is illegal for the CEO of a publicly traded company to make outright false statements about how successful their products are. They can be, and often have been, sued for that by investors.
So basically the supply and sales of the 9070 series must actually be genuinely very high by AMD's historical standards. How much better than previous generations, we really won't know until their next earnings, or unless they make more specific, unambiguous statements.
Temporala@reddit
You don't need to do that. You don't need to "ask" suggestive questions, nor do you need to narrate.
Tom's Hardware was even kind enough to make you an article about GPU market shares and how there's a fair bit of noise in the data. You have to build a trendline over multiple years to make some sense of this stuff:
https://www.tomshardware.com/tech-industry/amd-grabs-a-share-of-the-gpu-market-from-nvidia-as-gpu-shipments-rise-slightly-in-q4
I would expect the next data blip to still trend upwards, given how nice 9000-series sales have been so far.
Nourdon@reddit
English isn't my first language. I'm just pointing out that people seem able to spot the misleading statements Jensen makes about RTX 5000 sales, while taking it at face value when (imo) Lisa does a similar thing.
Sure, let's look at the market share chart from your source:
For RTX 3000 vs RX 6000 (Q4 2020 - Q3 2022), AMD had an average 19.5% market share.
For RTX 4000 vs RX 7000 (Q4 2022 - Q4 2024), AMD had an average 14.3% market share.
If that isn't losing market share, I don't know what is.
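The averaging itself is simple; here's a sketch where the per-quarter values are placeholders chosen to be consistent with the stated averages (the actual figures are in the linked article):

```python
# Average AMD market share over a span of quarters. The per-quarter
# values below are placeholders consistent with the averages quoted
# above; substitute the real numbers from the Tom's Hardware article.
def avg_share(quarterly_shares: list[float]) -> float:
    return sum(quarterly_shares) / len(quarterly_shares)

rx6000_era = [18.0, 20.0, 19.5, 20.5, 19.0, 20.0, 19.5, 19.5]        # Q4'20-Q3'22
rx7000_era = [15.0, 14.0, 14.5, 14.0, 14.5, 14.0, 14.5, 14.0, 14.2]  # Q4'22-Q4'24

print(f"RX 6000 era: {avg_share(rx6000_era):.1f}%")  # 19.5%
print(f"RX 7000 era: {avg_share(rx7000_era):.1f}%")  # 14.3%
```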
Photog_DK@reddit
Nvidia screwed up so badly that they made Radeon come back from near death.
Photog_DK@reddit
Nvidia is the best advertisement for Radeon.
Capable-Silver-7436@reddit
Having supply, decent RT, good upscaling, and a decent price. Crazy how it does that.
Present_Bill5971@reddit
It's competitive in performance and pricing, at least at MSRP. The vast majority of us don't need a $1000 card; most don't care about anything over $400. So for AMD it's a question of how much production they want to put towards lower-priced, lower-margin cards. If UDNA comes out good, then no duh, momentum will continue. If they ever get day-one ROCm support for all their cards, with years of support, momentum builds to no one's surprise.
One-End1795@reddit
AMD is probably the only shot at getting more gaming GPUs out in the market, as it doesn't have nearly as much wrapped up in AI as Nvidia does. Therefore logic would dictate they could dedicate more fabrication capacity there. Yes, their data center AI accelerators are selling more than before, but it isn't even in the vicinity of Nvidia's scale.
w142236@reddit
And it’s gonna need to continue that success for the next 2 years or nvidia will catch up and nullify any gains they had on launch
BarKnight@reddit
NVIDIA has been sold out from the start. They are still very far ahead.
iwonttolerateyou2@reddit
The thing is, another member here just unlocked big gains on the 50 series cards that Nvidia had locked. If this can be unlocked via a patch, I don't think AMD will sell well.
surf_greatriver_v4@reddit
Can you elaborate?
Kougar@reddit
Considering what a clusterfuck/unobtainium mess the 5000 launch has been, is this really saying much of anything?
__________________99@reddit
If it's so successful, AMD, then make something to compete with Nvidia's flagships. You guys had the perfect opportunity to take even more this time around. But then you went ahead and announced you would only be competing in the mid-range section of the market.
Only AMD can squander opportunities the way they do.
ActuallyTiberSeptim@reddit
The vast majority of people don't buy "flagship" GPUs. They want decent performance for a decent price.
opaali92@reddit
Why though? The market for them is extremely tiny and not worth pursuing
RealThanny@reddit
AMD doesn't do reticle-limit monolithic dies. Their original plan for high-end RDNA 4 was an MCM monster, which would have competed for packaging with the MI300 chips.
Given the ML bubble, AMD chose to skip that this generation and get more profit from manufacturing more MI300 cards.
__________________99@reddit
If it ended up being just faster than a 5080, I still think AMD could've been profitable with it. AMD could've completely capitalized on Nvidia's poor availability this time around. The one time they could've stuck it to Nvidia is when AMD decided not to go big, which was surprising to me, because AMD was doing quite well in raw rasterization performance the last couple of generations.
Q__________________O@reddit
It's never about anything other than:
Availability
Price
Usually they set their prices too high.
They didn't this time. And so, success!