AMD now controls 38.1% of all x86 CPU market value and 46.2% of all x86 server CPU revenue share
Posted by sr_local@reddit | hardware | 60 comments
GreenPlankton309@reddit
intel is hardly concerned about taking over the cpu market when they can become the tsmc of the west
Geddagod@reddit
Considering the CPU market is the only place they are currently making any money I think they are very concerned about the CPU market lol.
GreenPlankton309@reddit
i dont think you understand what game they are trying to play. their cpu business will fix itself over the next two or three gens. their goal now is to make their foundry a good enough, viable option in the west. after all, arm and even risc-v will compete for the same market while intel would be the one providing production to all of them
Geddagod@reddit
Very much TBD. Their next 2 generations, at least in the high margin server segment, looks kinda cooked. DMR, which Intel themselves say won't be competitive, and then Coral Rapids, which is likely going to be at a node disadvantage and have uncompetitive core IP.
Hopefully Unified Core fixes their problems, but that's also not a given, and is rumored not to be seen till '28-'29.
That certainly is one of Intel's goals. In fact, I would agree with you if you said that's where most of Intel's efforts are going towards recently.
However that doesn't mean they are "hardly concerned" about the CPU market. They have to put continuous effort into the CPU market, as again, that's the place they are getting most of their funding to continue advanced node development.
Intel's roadmap changes, both in DT and DC (something which Intel execs have commented on recently) as well as their large reorganization in their cpu architecture teams (canning royal, merging the P and E-core teams), are both evidence that they very much are putting a lot of focus on the product side as well.
GreenPlankton309@reddit
that is what I was trying to say myself, you just elaborated on it. They dont need the market share when they are literally able to sell everything they can produce on their foundry because of the high demand from the AI hyperscalers.
MewKazami@reddit
Kind of sad that even with total dominance for almost a decade they couldn't go above 50%
They never really dethroned Intel. They just captured the top end of the gaming market with the X3D chips.
luuuuuku@reddit
It’s because they never had total dominance.
hackenclaw@reddit
basically they prefer to sell 38 chips for $100 profit each rather than 80 chips for $50 each, despite the latter earning them more money.
They are not constrained by TSMC; TSMC also makes a dozen other chips for other chip designers. It's simply that AMD didn't bother to book more capacity. (there is no way they underestimated their demand for so long)
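A quick sanity check of the arithmetic in the comment above, as a minimal sketch (the unit counts and per-chip profits are the commenter's hypothetical numbers, not real AMD figures):

```python
# Hypothetical numbers from the comment above, not real AMD figures.
def total_profit(units: int, profit_per_unit: int) -> int:
    return units * profit_per_unit

high_margin = total_profit(38, 100)  # 38 chips at $100 profit each
high_volume = total_profit(80, 50)   # 80 chips at $50 profit each

print(high_margin, high_volume)  # 3800 4000: the volume play earns more
```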
RHINO_Mk_II@reddit
wHy dIDn'T AmD nOT bOoK mOrE cAPaCiTy?
Every advanced node is sold to capacity. AMD would be competing with Apple for 2nm wafers, and they can't afford to throw infinite money to steal Apple's slice of the pie, nor would TSMC allow them to cut Apple off entirely at any price.
Geddagod@reddit
I mean since the AI boom, this makes a lot of sense, but what excuse does AMD have for the past few generations then?
FlyOk6103@reddit
Apple has way higher margins, you cannot outbid them.
Geddagod@reddit
The AI boom is what resulted in the extreme wafer demand. There was also a lesser scale, shorter term shortage during Covid IIRC. But there was almost certainly no need to outbid Apple for the past couple of generations for wafer output.
Even better, for the past couple of generations AMD hasn't, for the most part, used the leading edge node as soon as it's ready the way Apple does. Zen 6 is where it looks like AMD is getting aggressive, but in the past this was not the case.
NeroClaudius199907@reddit
amd went from penny stocks to $700B+ in less than a decade of mcm comeback
SwanManThe4th@reddit
Man I have made some poor life choices... I guess you could say yeeting my entire life savings (high 4 figures at the time) on AMD when it was at $72 was one... Thankfully it has worked out.
Brapplezz@reddit
Could have been buying intel the last 12 months
SwanManThe4th@reddit
Am cash poor at the moment due to the aforementioned poor life choices. I was sure the US wouldn't let their most important company die.
Brapplezz@reddit
Same tbh. I just chose to not eat to buy 5 shares at 35. Granted I bought it over a few months like $10 at a time.
If you're poor like me you gotta take the risk but take it slowly. I'm on welfare lmao but refuse to let that guide my future
nope586@reddit
It's hard to displace mind share. I work in enterprise IT and many of my (especially older) colleagues wouldn't ever consider AMD as a CPU option for anything. I'm starting to see a little bit of a shift as some of the vendors are coming around. One of my coworkers was deploying a new Azure VM last year and noticed that the CPU was listed as an AMD EPYC, he was super confused "how could Azure have AMD CPUs? This must be a mistake".
MewKazami@reddit
Yeah not surprising honestly.
ps5cfw@reddit
I remember when they barely had any market share back when their server offerings were Opterons!
I am glad they managed to dig out of that hole, sadly can't say the same about their GPU department, that one could still do with some improvements
nokeldin42@reddit
Radeon group needs its own Ryzen moment, but this time it has to be software driven. But that is very difficult given the leadership and the culture at the company. Nvidia, for a long time, has taken pride in its software work. AMD hasn't, and now it's in a position where it is very hard for AMD to attract top tier developers and software leadership. It is still very much a small company compared to nvidia and intel.
Like, just take a look at speaker panels for open software conferences - you'll see speakers from intel and nvidia alongside faang, but AMD has a much smaller presence.
I won't claim to have direct visibility into the Radeon group, but I have it on good authority that software engineering is somewhat of a second priority for AMD, they don't get to make decisions on product direction. Software engineering at AMD exists first and foremost as a support group for the hardware. That philosophical change is always going to be hard to bring about and Radeon is getting the worst of it since nvidia has sort of called them out on it. That said, things do seem to be steadily progressing but some sort of a jump is needed I think.
keremdev@reddit
I know it may not look like it but I think 9000 series is the Ryzen moment for Radeon. The Ryzen 1000 moment that is. They managed to make a cheaper product with similar performance to the competition and in a way that has clearly disrupted the market.
I think we will soon see the Zen 2 moment, at least I'm hopeful that it will be the case.
Noreng@reddit
An RX 9070 is significantly more expensive to produce than Nvidia's RTX 5070. Even the 9070 XT is likely more expensive than Nvidia's RTX 5070 Ti, thanks to the increased PCB and VRM costs.
kuddlesworth9419@reddit
You say that but AMD's Adrenaline software has been better than the Nvidia control panel for a long time. Nvidia just neglected it for years until fairly recently.
Gatortribe@reddit
Adrenaline/control panel (and whatever the new Nvidia one is) aren't what people mean when they say software.
That said, I love how simple and lightweight NV control panel is. I still use it over their mobile app answer to AMD's mobile app control software.
Z3r0sama2017@reddit
Yep. Control Panel is still my GOAT for basic tinkering, with Profile Inspector existing for those that want to have the power to tinker with everything.
TwoCylToilet@reddit
Windows update just broke my Radeon drivers this afternoon. Arguably it's as much Microsoft as it is AMD's fault, but I shouldn't have to use registry or group policy to prevent Windows Update from updating and breaking my GPU drivers. This has never happened on my machines with Nvidia GPUs.
soggybiscuit93@reddit
I can't remember the last time I had Windows Update break any of my drivers or software - it's been that many years. Seems to me that if this is a uniquely Radeon problem, then it's probably their fault
TwoCylToilet@reddit
Well I've had it happen to Intel iGPUs as well. Windows update installed a more recently signed but lower version driver. It also did the same thing with some of my NIC drivers from a bunch of different vendors. From my POV it's a systemic Windows Update problem that Nvidia has made a significant effort to avoid.
soggybiscuit93@reddit
Windows update is going to push out the latest WHQL certified driver. If the vendor posts a driver on their website (or through their app) before that driver has been WHQL certified, it's possible that Windows Update is going to override it. I'd still blame the hardware vendor in this case.
Nvidia likely avoids this by making sure their WHQL certification aligns with their public release of their drivers through their app and website.
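A toy sketch of the failure mode described above, assuming a ranking that compares certification dates before version numbers (a simplification, not Windows' actual driver-ranking algorithm; the version numbers and dates below are made up for illustration):

```python
from dataclasses import dataclass
from datetime import date

# Toy model only: Windows' real driver-ranking rules are more involved,
# but the symptom from the thread (a lower-version driver replacing a
# newer one) falls out of any policy that compares signing date first.
@dataclass
class Driver:
    version: tuple  # e.g. (31, 0, 101, 5186)
    signed: date    # WHQL signing date

def pick_driver(candidates):
    # Prefer the most recently signed driver; version only breaks ties.
    return max(candidates, key=lambda d: (d.signed, d.version))

vendor = Driver(version=(32, 0, 0, 1), signed=date(2024, 3, 1))  # hypothetical
whql = Driver(version=(31, 9, 9, 9), signed=date(2024, 6, 1))    # hypothetical

print(pick_driver([vendor, whql]).version)  # (31, 9, 9, 9): older version wins
```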
Laggiter97@reddit
Please don't abbreviate "control panel"
996forever@reddit
You know you’re not in the conversation when you come up with…the UI of a piece of software hardly anybody outside of forum enthusiasts will see on consumer hardware.
kuddlesworth9419@reddit
I would imagine most people have used the Nvidia control panel or AMD Adrenaline a few times at least even the general consumer if they have either an Nvidia or AMD GPU. It's very easy to access just right clicking on the desktop so it's not hidden away.
HuntKey2603@reddit
No, what people care about is older, non-mainstream games crashing due to driver issues. Stability and compatibility are what people mean by "software"; no one cares about the UI if the GPU can't GPU.
kuddlesworth9419@reddit
Well if we are talking drivers, Nvidia has been plagued with well-publicized driver problems since the 5000 series. Linux drivers are also an issue and are behind AMD's drivers on that OS.
996forever@reddit
Amd has been plagued with driver problems since ATI era.
BoringElection5652@reddit
Control Panels are irrelevant. It's the whole developer software stack that matters, and AMD's is abysmal. Nvidia always made GPU development easier, while developing for AMD is a nightmare.
CataclysmZA@reddit
The one area where Raja Koduri was effective was in leading software teams inside Radeon towards a better base to build from, and it's enabled them to keep up with advancements like upscaling, frame generation, and per-game enhancements and profiles. Adrenaline is generally a good user experience.
waitmarks@reddit
I don't necessarily agree that it's a leadership and cultural problem. Their Linux drivers are really quite good now, but only because they were essentially forced to start from mostly scratch to get them included in the linux kernel. The reason behind that re-write was they tried to merge it in pretty much as it was and the kernel devs essentially told them to go back to the drawing board and use standard linux functions instead of their custom ones. That forced them to totally re-write almost all of it.
I think they just need a reason to do the same for windows (and also ROCM, but that's a whole other story). I doubt that they will be able to justify getting the budget to do that though.
BatteryPoweredFriend@reddit
That was only about 10-12 yrs ago.
RealThanny@reddit
The first Opteron was released 23 years ago.
AMD's server market share was very slow to grow due to Intel bribing big server vendors to not buy AMD. They paid Dell about a billion dollars a year in rebates to use only Intel processors in their servers. In the end, Intel had to pay AMD over $1 billion in a legal settlement on the issue in 2009.
With EPYC, I doubt anything quite as blatant was going on at Intel, but they were almost certainly still doing shady things to keep people on Intel only. But EPYC kept improving so drastically over time that there was no stopping the momentum.
BatteryPoweredFriend@reddit
What I was saying was that it was still only 2014-2015 when AMD's marketshare in the server space was often reported as low as <1%, or even just rounded down to "0%" in reports, because it really was that insignificant.
siazdghw@reddit
Those numbers will recede in the next few years as Intel claws it back.
AMD's success was based on:
- TSMC being far ahead of Intel fabs. This is now pretty much gone.
- AMD moving to chiplets while Intel was monolithic. This is gone and Intel's tiles are superior.
- AMD using TSMC to stack cache. Intel will have large cache chips in customers' hands this year.
- AMD having mobile chips with good iGPUs for gaming. Panther Lake already surpassed most of AMD's lineup.
What advantage does AMD actually have anymore? Nothing really. Meanwhile Intel has its own fabs trading blows with TSMC, making it significantly cheaper for Intel to produce its chips than for AMD, which has to fight a bidding war for TSMC wafers.
Geddagod@reddit
Intel is still a node behind. This isn't any different than Intel being on Intel 7 when AMD was on N5, or Intel being on Intel 3 when AMD are on TSMC N3 (turin dense at least).
How?
Not rumored to be 3D stacked in client though. Just monolithic. We will see how they fare, but on paper at least one would assume they are worse, since they don't have the advantage of better latency via 3D stacking rather than just widening the compute tile and adding more cache there.
As for server, AMD didn't even bother releasing Zen 5 X3D Turin skus. So their advantage in server isn't due to V-cache.
Given AMD's market share in mobile, they never really had that much success there.
Better core IP. Idk much about graphics, but I'm willing to wager they still have better graphics IP too in perf/mm2, even if they haven't bothered to use their latest graphics IP in their latest generation stuff.
Depends on how wafer cost, and also yields, end up looking.
For example, Intel 7 is very mature and high volume atp, but the node itself is extremely expensive relative to TSMC N7/N6 stuff.
Meanwhile Intel 18A is wafer cost competitive with TSMC according to Intel; however, poor yields and volume are dragging PTL margins below the corporate average.
Intel, due to being unable to compete with TSMC at the leading edge, is having to end up going externally too.
SmashStrider@reddit
It's actually insane how high the margins on the EPYC server CPUs are. Zen really does continue to be a benchmark for scalability.
Nicholas-Steel@reddit
The power of separating CPU core dies from everything else! One or more cores don't meet spec? Move that die to a lower-end CPU model and replace it with a die whose cores all meet spec.
Very little waste compared to Intel's monolithic approach.
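The yield argument above can be roughly quantified with a textbook Poisson yield model (a standard approximation; the defect density and die areas below are illustrative guesses, not AMD or Intel data):

```python
import math

# yield ≈ exp(-D0 * A), with defect density D0 in defects/mm^2
# and die area A in mm^2. Assumed D0 is a guess, not foundry data.
def die_yield(area_mm2: float, d0: float = 0.002) -> float:
    return math.exp(-d0 * area_mm2)

monolithic = die_yield(600)  # one large ~600 mm^2 monolithic server die
chiplet = die_yield(75)      # one small ~75 mm^2 CCD

print(f"monolithic: {monolithic:.2f}, chiplet: {chiplet:.2f}")
```

On these assumed numbers only about 30% of monolithic dies come out defect-free versus about 86% of chiplets, and a chiplet with one bad core can still be binned into a lower-end SKU rather than scrapped.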
Geddagod@reddit
Intel has chiplets now, and had them since Sapphire Rapids in DC CPUs, which allows them to scale up core counts to compete with AMD (iso node at least).
However their chiplets are still massive, like >400mm2.
Interestingly enough though, Clearwater Forest has compute tiles that are even smaller than current Zen CCDs, while AMD is supposedly increasing their compute chiplet size in Venice Dense. And while Diamond Rapids is also rumored to have large 18A compute tiles, Intel also claims that the largest compute tiles they can have using hybrid bonding (which DMR is also rumored to use) is ~90mm2 in 2026 and 2027.
jenny_905@reddit
That's impressive given AMD's tiny mobile presence.
DNosnibor@reddit
I have one with an HX 370. Had it for almost 2 years now. It's been nice.
ComprehensiveLuck125@reddit
I have thrown one to the trash bin ;)
Within the last 20 years, none of the Intel notebooks I owned died. In the last 5 years, 1 out of 3 AMD notebooks died. Stats are very bad for AMD ;)
bitNine@reddit
Awesome! After more than 27 years of PC building, I bought my first AMD processor last year (9950X), and I love the shit out of it. I love it so much that when I went to upgrade my graphics card, I bought a 9070XT instead of a GeForce. It's an excellent combo, and the 9950X blows away every Intel offering that was out at the time.
Johnny_Oro@reddit
Outside Cities Skylines 2 and a few productivity apps, Raptor Lake was more powerful. Even the 14600K was more powerful in gaming. But it degrades, yeah.
Main-Carry-3607@reddit
EPYC has been a monster for AMD. The margins on those server chips are wild. But yeah, mind share in enterprise IT is still stuck on Intel for a lot of older decision makers. GPUs are a whole different battle though. They need a software moment, not just hardware. Radeon has the specs but the experience still lags.
webjunk1e@reddit
Wording is everything. Neither category includes Nvidia.
996forever@reddit
What would Nvidia have to do in a chart about x86 CPUs…?
webjunk1e@reddit
woosh
bhop_monsterjam@reddit
is that the sound of the empty space in your skull?
railagent69@reddit
you'd be surprised when you see the number of functional illiterates.
AsheAsheBaby@reddit
“Wording is everything”
What are you on about lol. Nvidia has nothing to do with x86, why would they be mentioned?