CEO Lisa Su says AMD is a data center-first company — DC revenue topped $2.8 billion last quarter, over 4X higher than its gaming business sales
Posted by imaginary_num6er@reddit | hardware | View on Reddit | 162 comments
throwaway044512@reddit
Who would think AMD, Intel, or NVIDIA are gaming oriented? Everyone knows you milk enterprises since they’re cash cows and are happy to pay premiums at high volumes whereas consumers are way more price sensitive.
0x080@reddit
If this was 2010 then yea, nvidia would be gaming orientated lol
fratopotamus1@reddit
They’ve been very clear for years that they thought the data center was the future for them and largely got laughed at or written off.
SimpleNovelty@reddit
Wasn't there a 2009 quote with something to the effect of Nvidia being a software company? They always had the long term game plan of a software stack that used hardware (and that's not something a gaming company would really focus on). It's just most people really don't understand compute needs and the eventual things it would enable.
Strazdas1@reddit
They started working on CUDA in 2006. But that quote is from 2015 or something like that if i recall correctly.
SimpleNovelty@reddit
Google shows the quote starting from 2009.
tecedu@reddit
Nvidia was the de facto AI chip company even in 2017, it's just that they are more popular now. AMD has had close to a decade to chip away at Nvidia's dominance in that sector. Even right now AMD will only target LLMs, whereas Nvidia is already focusing on the next challenges their GPUs can solve. One of the ones which I see making a big change is their GPU-accelerated pandas and also cuDF, which can just straight up replace Spark at some point.
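For anyone wondering what "GPU-accelerated pandas" means in practice: cuDF mirrors the pandas API, so the same dataframe code can run on a GPU. A minimal sketch (plain pandas shown here; assuming a CUDA GPU with cudf installed, the same code would run on-GPU via `import cudf` or the `cudf.pandas` accelerator):

```python
# Plain pandas version of a typical groupby-aggregate job.
# cuDF mirrors this API, so (assuming a CUDA GPU and cudf installed)
# the same call chain runs on the GPU -- either by importing cudf
# instead of pandas, or by running under `python -m cudf.pandas`.
import pandas as pd

df = pd.DataFrame({
    "store": ["a", "b", "a", "b", "a"],
    "sales": [10, 20, 30, 40, 50],
})

# This groupby/sum works unchanged in cuDF.
totals = df.groupby("store")["sales"].sum().sort_index()
print(totals.to_dict())  # {'a': 90, 'b': 60}
```

Because the API matches, existing pandas workloads can move to the GPU with essentially no rewrite, which is what makes it a potential Spark replacement for single-node jobs.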
PointSpecialist1863@reddit
Wtf is accelerated panda???
LetsNotBuddy@reddit
A panda that can run fast
Strazdas1@reddit
Thats just fast panda. Accelerated panda would be a panda thats being pushed at increasing velocity by external forces.
Strazdas1@reddit
Even as far back as 2019 nvidia was making more money from gaming than from datacenters.
UpsetKoalaBear@reddit
This was because DanNet and AlexNet came out and people saw the potential of GPUs for large parallel workloads, but more specifically they saw the benefits of using CUDA for these workloads.
SkylessRocket@reddit
Literally go into any one of the numerous posts on this subreddit and all you’ll read is about how AMD drastically needs to reduce the prices of their gaming GPUs to compete with Nvidia or how Zen 5 is a massive flop and that they’re not going to upgrade.
What those idiots fail to understand is that they’re not the number 1 priority. Nvidia and AMD are making so much money from the data-center sector that consumers are virtually irrelevant.
PastaPandaSimon@reddit
It's neither of the extremes. Of course data center is the bigger business, but they don't just want to give up gaming. There are still billions of dollars there. That's still a huuuge market, regardless of whether they've got an even bigger one.
SkylessRocket@reddit
Someone posted figures for Nvidia last quarter further up. Gaming revenue was $2.9b and data center was $26.3b. And that’s with Nvidia’s AI products being extremely supply limited, if TSMC could alleviate CoWoS constraints Nvidia’s data center revenue would be substantially higher.
Gaming is not a huge market, it is for all intents and purposes irrelevant for Nvidia. Nvidia could not release another generation of gaming GPUs for the next couple of years and it wouldn’t matter to them.
viperabyss@reddit
Must be why they're still working on releasing more features for DLSS and frame generation...
ProfessionalPrincipa@reddit
Those features aren't made to improve gaming experiences. They exist to reduce the amount of silicon being wasted on gaming plebs. It's the same reason they're pushing their cloud gaming, which also has the nice side effect of getting rid of secondary markets. It's all about silicon rationing.
Strazdas1@reddit
It's mostly because DLSS is actually used in business environments as well.
viperabyss@reddit
....it's not. It's because, with die shrinks getting exponentially more expensive and yielding smaller returns, using AI is a great way to maintain the performance improvement between successive generations without dedicating a significant number of transistors to that specific purpose.
DLSS and frame generation have very little use outside of gaming.
PastaPandaSimon@reddit
A $3 billion market is definitely not irrelevant just because you've got a bigger one. It is absolutely a huge market, regardless of whether there's a bigger one.
They can't "turn their back" on gaming GPUs because they're obliged in front of their stakeholders to maximize profits, among other reasons (like how stupid it'd be to leave $3 billion dollars on the table). Nobody is ignoring gamers. Quite the opposite, it's their mission to milk them as much as they can. They want gamers to upgrade GPUs as often as possible, and they want to continue releasing new generations as soon as they've got a reason to upgrade to sell.
surg3on@reddit
This is what you would do if not silicon constrained. They are though so expect some weirdness
Strazdas1@reddit
They are not silicon constrained. They are HBM and packaging constrained. And you know what does not use HBM or advanced packaging? Consumer cards.
Edgaras1103@reddit
Turned their back against gamers lol. It's a business like anything else and they are supporting devs, adding new features because there's revenue in that segment . Some of you are so emotional about this.
LeotardoDeCrapio@reddit
FWIW the graphics division @ NVIDIA is not just gaming. Quadros have much higher margins and are basically the same core, so vanilla geforces are not the priority for that group either.
Gaming revenue is a nice side effect.
shimszy@reddit
On the contrary, Nvidia quite likely kneecaps themselves with gaming GPUs, as their price/performance is way too good for enterprise/research use when applicable. A 4090 performs similarly to enterprise GPUs that cost an order of magnitude more. IMO gaming GPUs are just a way to salvage parts that fail validation and to build mindshare, rather than to maximize profits by making more DC cards.
PastaPandaSimon@reddit
If this wasn't a way to maximize profits, they wouldn't have done it. They must have calculated that selling some extra GPUs at a higher price to businesses is not worth losing sales of hundreds of thousands of 4090s at north of a thousand dollars each. These are all very well calculated decisions, and they aren't doing anyone any favors.
svenge@reddit
Not to mention that most of NVIDIA's software stack is just as usable on GeForce cards as their datacenter products, which in turn organically increases the buy-in for their ecosystem.
SeraphicalChaos@reddit
u/SkylessRocket has a very valid point about the uninformed criticism they're calling out.
Of course a business is going to prioritize the money maker when they only have so many wafers to allocate across multiple markets and one of those markets earns them orders of magnitude more money. Now that business has a shortage of supply in the other market (which they created themselves), and keeping prices high due to the scarcity they just introduced makes sense... as long as the products in that market still sell. This doesn't even consider the other markets they've dipped their toes in (custom chips for very large clients) that bind them to ironclad contracts.
There's no way a company as big as AMD didn't crunch the numbers on the prices they set and forecast this out. But I guess a bunch of random gamer armchair economists know better than a very well paid team with decades of collective experience...
psydroid@reddit
This became clear to me years ago, so I haven't been an AMD customer since 2009. I did build some AMD systems for others until 2013. I am not an avid gamer, so that was never a reason for me to buy their hardware.
But now all my new low-end hardware is ARM-based. I'm not excluding AMD indefinitely; I just haven't found a use case for their newer hardware so far. And if Nvidia and/or Mediatek join the client market, so I can have a chip with ARM CPU cores and Nvidia GPU cores, I may not even have a use case for AMD (or Intel) hardware at all.
Zerasad@reddit
So? What do I care? I don't work in data center procurement, data center parts are literally irrelevant to me. My best interests dictate that I care about Zen 5 and AMD GPUs, and that's what I'm gonna talk about. Unlike a $30,000 H100 I might actually buy a consumer end product.
ProfessionalPrincipa@reddit
You won't be able to outbid Sam Altman or the VC's with dollar signs in their eyes for silicon.
LeotardoDeCrapio@reddit
Wait wait, you mean to tell me that grown ass gamers, with little disposable income, are not the main focus of the entire tech industry?!?!?
Nonsense!
chefanubis@reddit
The consumer sector is just a marketing expense now. They need to keep their names on the mouths of the IT folks.
gunfell@reddit
Over the long term, ignoring "the little guy" may bite them in the ass. Putting all your eggs in one basket for years can lead to undesirable outcomes.
psydroid@reddit
I don't expect them to actually ignore the little guy but to give "the little guy" a product more suited to what he needs, e.g. the SoCs Mediatek is said to be developing based on ARM cores with Nvidia GPU cores, hopefully at a lower price than what Intel and AMD sell them at.
yacineKCL@reddit
enterprise isn't 'one basket' :)
ET3D@reddit
AMD certainly earned more from client/gaming than from data centre until recently. In Q4 2021, Computing and Graphics revenue was $2.6B while Enterprise, Embedded and Semi-Custom was $2.2B. By Q4 2022 AMD had shifted to different reporting groups, with Data Centre at $1.7B, Client at $903M and Gaming at $1.6B.
While Gaming has been going down, $648M is certainly a new low.
In any case, AMD definitely was a gaming-oriented company. Its data centre revenue has gone up considerably, so it's now more data-centre-oriented, but I don't think this means that AMD plans to leave the gaming segment. In fact AMD has said that it wants to increase its GPU market share considerably, and the release of the PS5 Pro might also boost this segment.
fastheadcrab@reddit
Even before the rise of "AI" the money was always in data centers for CPUs. AMD would've been foolish not to chase it. It's just that Intel had a hold over CPU installations for data centers for a long time.
And Nvidia has been chasing data centers and GPGPU since 2006 or whenever with the CUDA release. They benefitted greatly from the crypto frenzy, which probably counted as "gaming" but we all know those cards were not primarily being purchased for games.
ryzenat0r@reddit
only narrow vision gamers think that
inflamesburn@reddit
There isn't much more to orient for NVIDIA there anyway, they dominate the pc gpu market already.
siazdghw@reddit
People that push the Mindfactory sales data to spin their narrative certainly seem to think that way.
Gaming/DIY is peanuts compared to laptop, prebuilt/OEM and DC sales. Like Nvidia could say they are shutting down their gaming division to focus all efforts on DC and the stock would likely go up.
foo-bar-nlogn-100@reddit
Listen to the Acquired podcast interview with Huang.
He argues that we are in a post-von Neumann era, where the bottleneck was transferring data from registers to the CPU and back.
He argues that compute now revolves around the parallelism that GPU cores provide. Furthermore, datacenters will become hyperscaled parallel compute systems.
GPU cores, VRAM, infiniband network connect.
OS is CUDA.
That's the new compute stack.
Instead of an Excel presentation layer going through the Windows OS, it goes through an AI datacenter for state transitions.
Radium@reddit
Gaming business sales are only what you make them AMD. Get on it!
FumblingBool@reddit
You shouldn’t put hope in corporations… AMD marketing for each of their product segments at best spins their products to appeal their markets. But it doesn’t mean that the actual engineering is working to make products specifically for those markets.
Radium@reddit
The last time I witnessed a company spin up in the GPU market was 3DFX. It'll suck for gamers if they don't see another one spin up for 30 years. Things change when the market matures like this.
I hope Intel has something up their sleeve, but if not we're going to need some small AI startup to pop some GPUs out.
Many corporations have released some amazing surprise products so there is always hope in them.
FumblingBool@reddit
I work there… No one cares about gaming here.
Radium@reddit
I know, we need a new startup 3DFX to happen again. Join the revolt
FumblingBool@reddit
You have no idea what it takes to even produce a nonviable GPU. The reason why Nvidia is so dominant is a combination of hardware and software. That’s a huge capital investment against a dominant player.
Your initial strategy will be competing on cost which already puts you at a massive disadvantage in terms of hiring talent.
You are better off starting an AI chip startup.
Strazdas1@reddit
Which is why i can only see someone like Intel or Apple having the talent and pockets deep enough to carve a marketshare in such a market. Intel is trying, Apple isnt.
Radium@reddit
I don't produce GPUs, but if Jim Keller and his smaller team can produce and sell AI chips and cards the way they are, then it's definitely prime time to do a GPU business too.
iDontSeedMyTorrents@reddit
Gamers really are the most oppressed minority. ^^^/s
AMD chases the money, and the money is in datacenters. Like any massive corporation, they don't really give a shit about you, so stop acting like they're some benevolent entity.
Radium@reddit
The money is there in gaming cards. It's a highly profitable business, just not as cracked out as AI data centers at the temporary moment.
iDontSeedMyTorrents@reddit
Not for AMD. They've been struggling for years against Nvidia and making no progress.
Right, they're going where the money is. Margins in server/datacenter are way higher. Problem was, AMD got pushed out of that market for many years. Now they're able to compete. It's why Zen is designed around enterprise first. This whole AI craze now means there's a lot of demand for AMD's datacenter GPUs, too.
Strazdas1@reddit
They are undisputed leaders of gaming CPUs though.
Radium@reddit
Yes for AMD, they haven't done anything but make a profit on the cards they've been selling. Think of it as an entirely separate product/division. There are enough humans to work on CPUs, GPUs, and AI tensor chips *simultaneously*. I know, it's mind blowing.
iDontSeedMyTorrents@reddit
And yet it's been years and they've never been able to do that. Mind blowing.
You can look at their financials yourself. Gaming only had an operating profit of $77 million from $648 million in revenue this quarter. Datacenter made nearly 10 times as much profit from less than 5 times the revenue in the same quarter. That's not even mentioning that gaming includes consoles, which are a large part of those numbers.
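To make the margin gap concrete, here's a quick back-of-the-envelope using the figures above ($77M operating profit on $648M gaming revenue; the datacenter numbers are approximations derived from "nearly 10 times as much profit" and the ~$2.8B revenue in the headline):

```python
# Margin comparison from the figures in the thread:
# gaming: $77M operating profit on $648M revenue;
# datacenter: ~10x the profit (~$740M) on the ~$2.8B headline revenue.
# The datacenter figures are approximations, not exact filings numbers.
gaming_profit, gaming_rev = 77, 648    # $M
dc_profit, dc_rev = 740, 2800          # $M, approximate

gaming_margin = gaming_profit / gaming_rev   # ~11.9%
dc_margin = dc_profit / dc_rev               # ~26.4%

print(f"gaming: {gaming_margin:.1%}, datacenter: {dc_margin:.1%}")
print(f"datacenter margin is {dc_margin / gaming_margin:.1f}x gaming's")
```

So even on these rough numbers, each datacenter dollar earns a bit over twice what a gaming dollar does.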
Radium@reddit
Your definition of "struggled" is weird, they literally haven't even been sweating it or trying anything. Lol
iDontSeedMyTorrents@reddit
Vega wasn't them trying? "Poor Volta" was them DGAF'ing about having the fastest GPU? RDNA 2 wasn't them trying? RDNA 3 w/MCM wasn't them trying? And their market share keeps going down. They didn't stop aiming for the fastest with RDNA 4 because they don't care. They stopped because they can't do it for the time being.
And I don't really know what your fab talk has to do with this. Yes, demand is making fab time more expensive. All the more reason for AMD to focus on high margin datacenter products. Atomic Semi isn't targeted at this market at all from what I understand. Intel has significantly delayed many of its fabs, and we have no concrete evidence that 18A or beyond is actually competitive. Samsung's GAAFET nodes are by all accounts still yielding horribly. GlobalFoundies most advanced node is 12nm. Texas Instruments doesn't come close to fabbing these sorts of chips.
Radium@reddit
The fab costs are precisely why GPUs aren't being worked on anymore.
iDontSeedMyTorrents@reddit
That will never not be the case.
Radium@reddit
Have you seen the price of an nvidia AI card and compared it to a cheap $2000 4090?
iDontSeedMyTorrents@reddit
Exactly. Which is why AMD is focusing on datacenter now. You're repeating what I've been saying this whole time.
Radium@reddit
I literally said that in my first comment ? lol
iDontSeedMyTorrents@reddit
And it shows.
Strazdas1@reddit
For hardware, no. For software, the kind of rug pulls gamers accept would often lead to very expensive lawsuits in B2B.
toasters_are_great@reddit
The thing is that AMD have already done that on a couple of occasions over the years and provided better performance at lower cost and the gaming market collectively said "meh, I'm going to buy nVidia anyway".
In order to change that they need to change the minds of millions of people with some staggeringly amazing marketing campaign as well as the silicon & software to back up marketing claims, because the latter clearly isn't enough.
For the DC sector they have to market to about ten entities who only care about the performance:TCO ratio.
The gaming market yawned at AMD, now AMD yawns back only because of the lessons it has been historically taught by that sector.
Radium@reddit
I've been buying and enjoying gaming with AMD GPUs since the RX480. Haven't gone back to Nvidia and the god awful driver control panel since except on a laptop I have. Just because they aren't the top seller doesn't mean they're dead. That's not how business works. There can be two companies simultaneously succeeding in their own customer base and those in the other customer group are oblivious to the alternative community.
ResponsibleJudge3172@reddit
Would you look at that. All those "AMD catches Nvidia, who drops gaming for data center" takes have aged well.
Strazdas1@reddit
Turns out it was AMD who dropped gaming (gaming revenue down according to quarterly) while Nvidia vacuumed their market share.
exquisitelytorture@reddit
I was buying AMD as part of our Sun systems in 2006-2009, as they had better floating point than Intel. We bought tens of thousands of them.
darthkers@reddit
They're riding the AI wave now. Will the data center revenue still be 4X when the AI bubble crashes?
siazdghw@reddit
The AI bubble will crash, but it will almost certainly rebound and trend up again. AI isn't going away and will only become more and more prevalent in our lives, similar to the situation with the internet and the dotcom bubble.
lusuroculadestec@reddit
Cisco was the "selling shovels to the gold miners" company of its day; people were convinced they'd rebound and be immune to the long-term effects of the dot-com bubble bursting. Yet it's been more than 20 years since its stock price collapsed, and the stock has never reached where it was at its peak in 2000.
Strazdas1@reddit
No. Cisco was reselling the worst, wobbling shovels they could find.
No-Relationship8261@reddit
Fun fact, same is true for Intel.
mayorolivia@reddit
This is such a lazy brain dead cnbc analysis. Cisco didn’t have the revenues and profits of Nvidia and they didn’t have much of a competitive advantage. Nvidia doing $85b+ annualized in data centre sales with 75% gross margins. #2 AMD is gonna do $5b this year and #3 Intel is gonna do $500m. AMD could walk on water the next 5 years and they still won’t catch up. Anytime I hear the Cisco comparison on cnbc I know the analyst is just a sheep.
mayorolivia@reddit
Why the Cisco comparison is stupid https://x.com/beth_kindig/status/1834608964811502040?s=46&t=u-BFt7hD7_thQQbgD8tq0A
auradragon1@reddit
You should short Nvidia stock over the long-term then if you think you're right.
Cisco had a peak PE ratio of 200. Nvidia's PE ratio, by comparison, is a much more reasonable 55.
The fundamental difference between Nvidia and Cisco is that AI gets smarter with more compute. The smarter it gets, the more demand it will have - creating a compute demand self fulfilling cycle. Networking equipment does not get faster with more equipment since the speed of light is a hard limit. You can increase the bandwidth, but not latency. Once you saturate the bandwidth, there's not much need for more equipment.
Demand for compute will always increase.
XenonJFt@reddit
From their product stack, the only thing riding on AI is the new MI300 and the upcoming MI400 series. AMD Radeon GPU availability isn't enough to satisfy an upcoming AI boom anyway. The rest is server CPUs and workstation lineups like Threadripper, which would be hurt a bit by an AI crash, but they've anchored themselves to long-term success either way.
masterfultechgeek@reddit
They basically said this a year ago as well.
And Zen 5 appears to be a data center-first design. When Zen 5 (seemingly) started getting delayed, AMD almost certainly cut things that helped consumer workloads to make sure the design would get out more or less on time. (I'm imagining Zen 6 as being quite nice).
Geddagod@reddit
What consumer centric things do you think got cut from Zen 5, and how did that speed up the development timeline?
Strazdas1@reddit
Latency, especially the interconnect latency. Even the in-bus latency between cores is almost as bad as Intel's between buses. This is fine if you run NUMA-aware software in a datacenter. This is terrible if you run stuff like videogames.
PointSpecialist1863@reddit
They did not overhaul the cache hierarchy. Zen5 is a very powerful engine with a tiny intake.
Geddagod@reddit
The cache hierarchy saw pretty big changes. That alone contributed to 4% of Zen 5's IPC uplift according to AMD, which is more than Zen 4 got despite doubling the L2 cache capacity.
Zen 5 seems to be similarly back end memory bound as Zen 4 was.
PointSpecialist1863@reddit
They revamped the cache bandwidth but kept the cache sizes. If the engine is memory bound, increasing the cache sizes would alleviate the problem.
Geddagod@reddit
The L1 cache straight up saw an increase in size, and the L2 saw an increase in associativity, improving hit rates. If you look at the Zen 4 vs Zen 5 top down analysis in the article I linked, it shows the pipeline slots % having a similar % of being backend memory bound.
masterfultechgeek@reddit
At a very broad level, optimizing various parts of the CPU for latency.
The only major use case for that is gaming. It has minimal impact on most other use cases.
Also general fine tuning.
Zen 5 is a new architecture with a lot of new stuff in it and kinks to work out.
Vushivushi@reddit
They said it 7 years ago at their 2017 Financial Analyst Day presentation, the same year they launched Zen and Vega. #1 priority for AMD was to re-establish datacenter leadership.
https://www.youtube.com/live/590h3XIUfHg?t=6058s
The company wanted to target high margin, high growth markets.
Datacenter is the highest margin, highest growth market.
Their R&D has always been weighted towards the datacenter. On the question of R&D spending at their Q2 2017 earnings call, Lisa had this to say:
masterfultechgeek@reddit
It's kind of an open secret that just about EVERY design for a long time has been targeted at high growth/margin groups.
Heck on paper bulldozer was an AMAZING datacenter product. Imagine a CPU design that's VERY efficient with transistor count/area, gets awesome MT INT performance, clocks high and can merge resources together to get high ST performance and the only sacrifice is FP performance, which on paper is going to end up on GPUs anyway... (in the real world it didn't save much die space, clocking was worse, efficiency was worse and overall latency was a mess)
piggybank21@reddit
Gamers in shambles that neither AMD or Nvidia cares about them.
Strazdas1@reddit
I dont know. I think 6-7 billion of revenue from the market shows they still care about said market.
hackenclaw@reddit
Maybe gamers should move their asses to PlayStation, since PlayStation is Sony's biggest profit department.
Maybe Sony would care about them more than AMD/Nvidia do.
Strazdas1@reddit
It's their biggest department by elimination. Sony failed at all their other businesses, so they are left with gaming.
Legal-Insurance-8291@reddit
Intel will save us though. 🤡
No-Relationship8261@reddit
Well Intel is the underdog. So they might care enough.
yacineKCL@reddit
seems likely on Laptop gaming side of things
XenonJFt@reddit
Even if they make an efficient arc laptop gpu. Nvidia will try to crush them as much as they can. Laptop market share is nvidia's forgotten golden goose
Lalaland94292425@reddit
It's hilarious, if it weren't so sad. Gamers: no corporation gives 2 cents about ye.
bobbie434343@reddit
HUB and GN on suicide watch.
LAwLzaWU1A@reddit
I am still surprised that people view AMD as some kind of "gaming first" company, as the opposite of Nvidia which seems to be viewed as a "just chase easy money wherever it is".
During the cryptocurrency boom, AMD were marketing their CPUs as being good for mining, and said that gamers should mine on their AMD GPUs.
A few months ago AMD said they would shift their focus from hardware development towards software, API and AI experiences, with quotes like "we can't think of AI as a checkbox/gimmick feature like USB—AI could become the hero".
Hell, their new mobile chips even have "AI" in the product SKU names, like the Ryzen AI 9 HX 170.
Gaming (and the client segment in general) is kind of a side business for all the big CPU and GPU makers these days. Luckily for us, a lot of the stuff developed for data centers tends to trickle down to consumer hardware as time goes on. A big focus on other things does not necessarily mean the consumer products suffer either. It's not a zero-sum game.
ashyjay@reddit
consumer markets are places for AMD, Intel, and Nvidia to dump products which don't pass QC for data centre and enterprise SKUs.
toasters_are_great@reddit
nVidia uses the AD102 die for the L40 and also the RTX 4090 (with a couple fewer CUDA cores active) so you have something there.
AMD uses the Aqua Vanjaram die for their Instinct MI300X but it serves no consumer SKUs whether cut down or not. Zen CCDs that hit good clocks (which generally won't hit as good power efficiencies) are generally used for Ryzens and those that hit good power efficiencies (which generally won't hit as good clocks) are generally used for Epycs, but that's no more dumping DC QC fail dies into consumer SKUs than it is the other way around - it's just a smart business plan to discard as few dies as possible.
Which dies do Intel use in both consumer and enterprise/DC spaces? I wouldn't count the likes of the i7-7740X as anything more than a will-the-motherboard-POST test device.
Strazdas1@reddit
AMD uses same dies for EPYC and consumer CPUs.
Kyanche@reddit
I mean they're both publicly traded american businesses. They'll happily shit all over themselves and their customers if it means continued quarterly growth lol.
BarKnight@reddit
For comparison NVIDIA's gaming revenue was $2.9B and data center was $26.3B
AuspiciousApple@reddit
That's actually less extreme than I thought with all of the AI hype. Though DC margins might be better too
Arbiter02@reddit
They're making a lot of noise now but they still NEED that long play to pay off. R&D wise they've been investing heavily in this for well over a decade now and realistically it's only just recently blown up to the point that everyone's talking about it.
Strazdas1@reddit
well, last quarter their revenue was 5x the RnD budget for entire year...
3ebfan@reddit
That’s just from one quarter and Nvidia is still raising guidance and beating expectations every earnings.
ryzenat0r@reddit
they also lost $200 billion+ in valuation
Strazdas1@reddit
7% market fluctuation isnt that strange for fast growing stock.
masterfultechgeek@reddit
As per their financial statements, AMD is often losing money on consumer parts.
I'm not sure how they're amortizing R&D and manufacturing but if it's per die, I wouldn't be surprised.
The consumer stuff is almost certainly in place so that they can improve economies of scale (so offsetting R&D over a bigger sales base) moreso than make direct profit.
BatteryPoweredFriend@reddit
The Zen compute die being a multi-purpose component that's now used in 3 distinct product categories (Epyc, Ryzen & Instinct) means they have a lot of room to fudge the numbers.
Practically every single company at these sort of sizes will be playing fast and loose with their publicised financial results to a certain degree.
TwoCylToilet@reddit
TIL about MI300A.
Wyzrobe@reddit
There's also a rumored MI300C which has all Zen 4 compute dies, but it might not ever be released by AMD, given the high demand for AI right now.
tecedu@reddit
Man I really wish for that one but I think 3d vcache kinda fits that niche right now
lightmatter501@reddit
Consumer and datacenter use the same chiplets for CPU, and they are moving to that with GPU.
panthereal@reddit
Does a PS5 count as a consumer part?
Because that surely counts as gaming revenue.
masterfultechgeek@reddit
https://ir.amd.com/sec-filings/filter/annual-filings
I'm using consumer to mean "client"
CPUs are NOT part of gaming as per AMD because CPUs don't have much to do with gaming.
Elegant_Hearing3003@reddit
The consumer business is in place to bash Intel until such time as AMD can make money off consumers rather than Intel doing it.
Not that they're doing the best job at that obviously, but that's the plan.
AuspiciousApple@reddit
Though AMD has maybe more CPU on the consumer side so it might be different.
For Nvidia, they have board partners involved, too. I guess they mainly make money from the flagship models
lusuroculadestec@reddit
It's more extreme when you compare it to some historical data. For the FY2020 10K filing, data center revenue was $3B and gaming was $5.5B. For FY2024 10K data center was $47.5B and gaming was $10.4B.
Last quarter's 10Q was $26.2B for data center and $2.9B for gaming. They've gone from data center being less than gaming to it being almost 10x.
Looking at compute on its own, it was $22.6B last quarter and $8.6B for the same period the year before. That's a 2.6x year-over-year increase.
jaaval@reddit
The LLM hype does that because the complexity grows exponentially if you want to improve the model. You can have a basic model with one processor but if you want a better one you might need ten times as many.
But there is a problem in that at some point in relatively near future somebody has to figure out how to make money with LLMs instead of just pushing $100B a year into them.
mach8mc@reddit
microsoft is in the best position to monetize LLMs with their m365 subscription
they can also bundle it with visual studio and github
auradragon1@reddit
It's very extreme. Nvidia's gaming revenue historically dwarfed its datacenter business up until 2023. Yes, only about 1.5 years ago did Nvidia's datacenter business surpass graphics. It took 1.5 years for datacenter to 9x graphics.
Strazdas1@reddit
Nvidia made more from gaming than AMD made from all sources. And they say gaming isnt profitable :)
omgpop@reddit
I read total revenue was $18bn last quarter, where do you get those figures?
BarKnight@reddit
$32.5B in total.
https://nvidianews.nvidia.com/news/nvidia-announces-financial-results-for-second-quarter-fiscal-2025
omgpop@reddit
Thanks
autogyrophilia@reddit
But last year was extraordinary for Nvidia and not representative of future trends. Either it's going to be much lower in the near future or much higher.
fumar@reddit
As long as AI demand stays where it is, they will continue to print money. The limiting factors affect their competitors as well: HBM availability and TSMC capacity.
Nointies@reddit
its unlikely that AI demand stays where it is, there's a lot of excess demand right now.
virtualmnemonic@reddit
I disagree. OpenAI's new model shows that there's still room for LLMs to grow in performance, but it's way more demanding. As in, it can take minutes to respond.
Nointies@reddit
That doesn't mean there isn't excess demand
And it being more demanding is going to be a huge problem once the VC money starts to dry up.
BarKnight@reddit
That was just last quarter
SkylessRocket@reddit
Factor in profit margins and the difference is even more extreme.
PainterRude1394@reddit
Lol
PainterRude1394@reddit
AMD is riding the same wave, just well behind Nvidia.
Top_Independence5434@reddit
Couldn't the same be said for AMD too?
Successful_Cup_1882@reddit
Ngl gamers are cheap as shit, capricious too. Building your business around them is stupid if you have better options.
uzuziy@reddit
Well, I don't think anyone has considered Intel, AMD, or Nvidia gaming-focused for a long time now. They'll just toss gamers a piece of what the big fish are eating.
Evabluemishima@reddit
Gamers think a 70 dollar video game is an expensive example of corporate greed and avarice. Who would build a business around them?
Entropy_Bug@reddit
In AMD's case, they have enough resources and IP to deliver the best chips/technologies for video consoles, but not for laptops. :)
WarOnFlesh@reddit
But what are the margins? If you're barely making a profit on each server chip but making a killing on each gaming chip, that matters.
mayorolivia@reddit
Data centre margins are nearly 55%. They are cooking.
Fortune_Fus1on@reddit
50 percent margins on any product is mad
noiserr@reddit
Nvidia has 75% margins.
Bulky-Hearing5706@reddit
Gaming chips, at least GPUs, have abysmal margins compared to server parts. And GPU margins are laughable compared to CPUs'. If you want to see how insanely marked-up server chips are, just compare a 16-core Ryzen to the comparable Threadripper products: the only real difference is the IO die, yet the TR costs several times more. And TR isn't even enterprise level yet.
jumper55@reddit
Well, I guess that just means that in the consumer graphics card market they'll be like they were in the 2000s with their CPUs: unable to compete against Nvidia, just as they couldn't against Intel for so long.
MechaStarmer@reddit
Would someone mind ELI5 how Nvidia/AMD make money from data centres? I googled it but didn’t really understand the articles. I know a data centre is a building with a bunch of computers and servers in it, that’s about all I know.
One_Wolverine1323@reddit
Slowly everyone is letting go of the gamers. Nvidia - AI, AMD - data center, Intel - thanks Steve.
meiself@reddit
Soon they will be an AI-first company
AHrubik@reddit
CEO who stopped focusing on gaming revenue is surprised when gaming revenue no longer makes them lots of money. In other news water feels wet. News at 11.
gunfell@reddit
I really don't think AMD is in a position to say this. But ok, Lisa Su, we hear you loud and clear.
ky56@reddit
Well this is probably the worst news for those of us looking forward to HEDT Threadripper being priced reasonably.
Lalaland94292425@reddit
It’s hardly surprising that corporations prioritize profit above all else.
They're not your friend, never were, never will be.
ryzenat0r@reddit
She has no choice; this is a publicly traded company.
Seref15@reddit
AMD wasn't really a name in the data center space until the last 5ish years. They've made really impressive inroads in datacenters vs Intel in that time.
bubblesort33@reddit
It is now.
UDNA for desktop GPUs can't come soon enough. Hopefully gamers will benefit from them merging CDNA and RDNA back together.
svenge@reddit
I expect that it will also end up hurting existing Radeon owners due to AMD discontinuing driver support for RDNA cards prematurely a la Fury X / GCN.
Entropy_Bug@reddit
This is why I'm not rushing into new purchases. I started upgrading smartphones only every 4 years or so, and I'll do the same with laptops when I have the money; until then, hopefully I won't become homeless.
seigemode1@reddit
Well yeah, most of AMD's gaming advances are just spinoffs of data center technology. Even X3D was originally designed for DC products but just happened to be really good for gaming.