Intel spends more on R&D than Nvidia and AMD combined, yet continues to lag in market cap — Nvidia spends almost 2X more than AMD
Posted by TwelveSilverSwords@reddit | hardware | 208 comments
octagonaldrop6@reddit
This is a bit misleading because Intel’s R&D needs to cover both design and manufacturing
only_r3ad_the_titl3@reddit
what is interesting is AMD spending half as much as Nvidia despite also designing CPUs
TwelveSilverSwords@reddit (OP)
It's also interesting to look at employee counts:
Intel = 110,000
Nvidia = 30,000
AMD = 26,000
TSMC = 77,000
Qualcomm = 50,000
ARM = 7,000
EnoughDatabase5382@reddit
Intel has systemic inefficiencies.
https://www.reuters.com/technology/intel-board-member-quit-after-differences-over-chipmakers-revival-plan-2024-08-27/
Capable-Comment-6446@reddit
Thanks for this link!
Forsaken_Arm5698@reddit
If Intel having 110k employees is considered "bloated", then so is Qualcomm having 50k employees. That's almost as many as Nvidia and AMD put together. Why does Qualcomm have so many employees? Or perhaps we should be asking the inverse question: how do AMD/Nvidia manage with so few employees?
Also ARM having only 7k employees is funny.
Balance-@reddit
Do not underestimate how many products Qualcomm makes. Especially in the networking category, so much stuff runs on Qualcomm.
They could still be a bit bloated though, I have no idea.
PMARC14@reddit
Arm is putting out bangers for their employee count, but Qualcomm does a lot more than CPU and GPU chips. Don't forget modems, ISPs, DSPs, NPUs, WiFi & Bluetooth, cellular networks, and all the associated patents they publish. Not to say Nvidia and AMD don't have a lot of tech for data centers under their belt, but I don't think Qualcomm is particularly bloated; it may have some extra. I think it's pretty demonstrable that they get a lot of value from it, as Apple has a ton of revenue to spend on developing their own modem, yet still struggles to put out something that competes acceptably against even old 5G modems from them.
KyuubiWindscar@reddit
Bangers was not invented to be used this way 🤣🤣🤣
so_fucking_jaded@reddit
I'll allow it
ComeGateMeBro@reddit
ARM only having 7000 people is amazingly efficient
jaaval@reddit
Arm only makes their core architecture designs. Somebody else makes the actual products.
edo-26@reddit
They don't manufacture anything though do they?
ComeGateMeBro@reddit
Neither does Qualcomm, would guess qualcomm has like 10k managers and 5k lawyers
Exist50@reddit
Qualcomm at least needs much bigger SoC teams. ARM needs very minimal physical design, for example.
Ok-Acanthisitta3572@reddit
The comparison to TSMC is the most damning considering the volume of chips each can produce. Intel having more people than TSMC and Nvidia combined is insane.
TwelveSilverSwords@reddit (OP)
Before the layoffs, Intel had 125k.
That was almost as many as AMD + Nvidia + TSMC combined.
only_r3ad_the_titl3@reddit
pretty sure they didn't lay off 15k by now
Dealric@reddit
https://apnews.com/article/intel-chip-ai-job-cuts-layoffs-loss-e61781e9364b69af63481c34ca5dcd67
They might have
ExeusV@reddit
by 3%? how is this insane?
anival024@reddit
Intel is doing a tiny fraction of what Nvidia and TSMC do, with way more expenditure. It's why they're culling staff and will continue to do so. They likely need to get down to around 80,000 employees very quickly.
based_and_upvoted@reddit
What assumptions did you base that 80 thousand number on?
Why does intel do a tiny fraction of what Nvidia and TSMC do? Intel designs and builds their own CPU and GPU chips even if they are also using TSMC now.
God I swear redditors can be so hilarious sometimes
Professional_Gate677@reddit
He divided the diameter of his hand by the diameter of his anus. Being that it equaled 1, he stuck it up there and pulled the number out.
ExeusV@reddit
While they definitely do less, by what logic do you think it's a "tiny fraction"?
Doesn't seem like that's true, since they aren't showing a desire to do so
rhayex@reddit
People just say wild things lmao.
I'm curious what research the guy you're replying to has done to be sure that they need to cut 30k+ jobs. Which sectors? What positions? What are their responsibilities, product lines, etc? What are you losing and what is the knowledge gap that will be created?
It's irritating seeing people talk about cutting jobs as the only possible solution to bad fiscal years (or even just bad news regarding a particular product stack). Short-term massive slashes to the workforce can only "save" so much, and long-term you lose all the knowledge and training those individuals had.
If Intel winds up laying off employees at that massive a scale, I'll be significantly more worried about their long-term future than I currently am.
TwelveSilverSwords@reddit (OP)
TSMC + Nvidia make more revenue/profit than Intel.
ReplacementLivid8738@reddit
The funny thing is that part of TSMC's revenue comes from Nvidia buying chips and then selling them, adding to their own revenue, so part of the same money is counted twice.
ExeusV@reddit
Nvidia does 4-5x the revenue and 50x the profit of AMD with about the same employee count
DR_van_N0strand@reddit
Yeah, but that is probably mostly workers in manufacturing on the factory floor making the chips for everyone else.
Ok-Acanthisitta3572@reddit
Intel doesn't make chips for anyone else..
DR_van_N0strand@reddit
When did I say they did? I was speaking of TSMC.
Intel has a bloated workforce because they have in house manufacturing and they have a ton of sales people and management and R&D and just a massive workforce that has never been streamlined.
Nvidia doesn’t really need armies of people handling accounts for a bajillion different clients.
Nvidia has a much smaller pool of corporate clients each spending a ton of money with them whereas intel has a much larger pool of clients each spending less.
Intel also sells their stuff first party so they need the distribution and all that whereas Nvidia and AMD have partners who make their graphics boards that are sold to consumers and their partners take on the employees who handle the distribution and sales to end users. Nvidia and AMD sell way less of their boards made by them like the Founder’s Editions than they do boards made by third parties.
Intel owns a 65/35 split of the CPU market and dominates in premade PCs, whereas AMD has a healthier share among people building their own systems.
AMD has to devote way less staff to handling sales than Intel. And Intel is old school and just has a bloated workforce in the first place.
octagonaldrop6@reddit
I agree, it’s very interesting. In terms of employee numbers, Nvidia + TSMC = Intel. Which would make sense. But Intel is FAR behind them in both design and manufacturing. It’s pretty good evidence in favor of the fabless model.
Though Taiwan government support plays a much bigger role for TSMC than the peanuts Intel got from the CHIPS Act.
CarbonTail@reddit
The Taiwanese economy is essentially making semiconductors at this point. Every other industry exists to support semiconductor fabrication, for the most part.
powerbronx@reddit
This^^^ A rough comparison: per population, it's the size of all U.S. Armed Forces
UnicornJoe42@reddit
Intel has not only consumer and commercial orders, but also orders from the military. Besides, Intel has its own factories.
RonLazer@reddit
Anyone who has worked in tech can tell you that if left unchecked a company will add 1000s of new employees every year and maybe 10-100 of them will be useful. If you're lucky the rest are unimpactful, but often they're an active hindrance.
Intel probably needs to make deeper and heavier layoffs to return to market dominance.
WeWantRain@reddit
Chip-making factories employ more people. Thus TSMC's and Intel's size.
Past-Inside4775@reddit
Intel is closer to 85k today.
reps_up@reddit
Don't Google IBM employee count
hamatehllama@reddit
Nvidia does a lot of computer science research in graphics, simulation, AI etc. They are one of the most published institutions on par with Stanford and Google.
octagonaldrop6@reddit
Nvidia is also designing CPUs nowadays (though not for consumers). They have a lot of catching up to do in that space so could account for some extra spending.
They can also generally just afford to spend more based on revenue.
TwelveSilverSwords@reddit (OP)
They design CPUs yes, but not the CPU cores themselves (which is much harder to do). The Grace CPU uses Neoverse cores licensed from ARM.
ResponsibleJudge3172@reddit
Who said they are not spending on CPU cores? R&D is naturally focused on the future, and the architectures of future CPU products are not yet revealed
Exist50@reddit
They're working on that too.
TwelveSilverSwords@reddit (OP)
Nvidia working on custom ARM CPU cores?
Exist50@reddit
Yes. They've been hiring in that area recently.
TwelveSilverSwords@reddit (OP)
There are indications that Google is also designing custom CPU cores.
There were many custom ARM core projects in the last decade; Qualcomm: Krait/Kryo/Falkor, Nvidia: Denver/Carmel, Samsung: Mongoose, etc... But they all died out.
Now Qualcomm is making custom ARM cores again. There are indications that Google and Nvidia are doing so too, and there are even hazy rumours that Samsung is resurrecting its custom ARM core project.
quildtide@reddit
All the old custom cores were competing with the stock cores in the same niche: mobile.
ARM escaping outside that niche is creating opportunities for significant levels of diversification, I think.
monocasa@reddit
Also, ARM dragging its ass on creating Apple-style CPU cores that can compete on the high end when they're actually released to end users.
VastTension6022@reddit
Arm's design team has been great, what do you mean? They may be second to Apple, but AMD/Intel are such distant 3rd/4ths they're not in the conversation. It looks like even Qualcomm's (ex-Apple) Nuvia acquisition is probably a bust, with cores that may have trouble competing with Arm's X9s
monocasa@reddit
Perf per watt, AMD still beats even Apple. Single thread perf, Intel beats Apple.
And AFAICT, there are no X925s you can buy. By the time you can (two years out probably), other cores from all of the other manufacturers will already be out. There'll probably be an Apple M6 on N2 in about that time frame.
And that's the core of what I mean by "when they're actually released to end users". Cores designed by ARM have so much additional SoC integration work that needs to be done that they end up perpetually falling behind cores that are codesigned with the SoCs they're going to end up in. That's sort of the state of the mobile ecosystem with the exception of Apple, but for servers and laptops, that's a recipe for always being a couple cycles behind.
TwelveSilverSwords@reddit (OP)
The Dimensity 9400 has been announced with X925 cores, and soon you will be able to buy phones with it.
Geddagod@reddit
Looking at core power in SPECint, Zen 4 on desktop is slightly higher than the M1 Max. Zen 5 has pretty much the same perf/watt as Zen 4 at those power ranges.
Apple is now on the M4. I don't think AMD is even coming close to beating out Apple in perf/watt.
Looks like it will be just barely, while consuming a shit ton more power.
Isn't that Mediatek and Nvidia PC chip rumored to use the X925? I think that's rumored to come out in late 2025. I don't follow ARM cores that much though, so I could be completely off base there, I'll admit.
monocasa@reddit
AFAIK, their custom core division that created the Denver cores has all been laid off or redirected to other projects.
Exist50@reddit
Maybe, but they seem to be hiring, perhaps for something new.
Forsaken_Arm5698@reddit
For the Nvidia-Mediatek Windows-on-ARM SoC project perhaps?
Exist50@reddit
Probably longer term than that.
octagonaldrop6@reddit
That’s true, but they are also putting a lot of work into CPU/GPU interconnect both inside the server and between servers. I’d say their spending is justified, especially considering scale.
Jonny_H@reddit
AMD also has directly equivalent interconnects, with their Infinity Fabric and similar.
They may not be as well refined, but they're "solving the same problem"
loozerr@reddit
Isn't AMD IF an interconnect for their multi-die CPUs? Not between very different processors.
Jonny_H@reddit
Also between their GPUs, and GPU<->CPUs.
I think there was also talk about future Xilinx and Pensando devices using IF (FPGAs and networking respectively), though not sure if anything has actually been released there yet.
And there was also talk on them moving away from IF to an Ethernet based solution in co-operation with some "Rivals" (I think Intel was one?), but again not sure if that has actually made it to product yet.
siraolo@reddit
I hear they pay their employees pretty generously.
R3xz@reddit
I read that they were having trouble figuring out where to spend their suddenly ultra inflated treasury.
only_r3ad_the_titl3@reddit
okay didnt know that thanks
jmlinden7@reddit
The Nintendo Switch uses an Nvidia CPU
monocasa@reddit
It uses an Nvidia SoC. The CPU is a Cortex A57 designed by ARM.
jmlinden7@reddit
The A57 is just a core, a CPU requires more than just a core.
monocasa@reddit
What do you think a CPU requires that isn't in a Cortex A57 hard macro, but is generally present in other CPUs?
Exist50@reddit
They're using "CPU" to refer to the SoC as a whole.
monocasa@reddit
I know they are, I'm trying to teach the difference.
Adromedae@reddit
NVIDIA has designed CPU cores many times before, BTW.
Elegant_Hearing3003@reddit
I've heard their recent success has cost Nvidia a lot in terms of employee motivation: a lot of experienced people there today have seven-figure salaries and no particular urgency to do their jobs (we're doing great, why should I hurry?).
The price of success, one might call it.
TwelveSilverSwords@reddit (OP)
Suffering from success
theQuandary@reddit
The only significant CPUs that Nvidia designed were the Transmeta-based ones that crashed and burned. For everything else, they are using bog-standard ARM stuff.
joltdig@reddit
Beat me to the punch. The Grace GB200 is not a GPU.
Physical-King-5432@reddit
Probably because TSMC does the leg work for both of them
ecktt@reddit
It shows. Not throwing AMD under the bus but they are not innovating while everyone else is taking risks on new tech. Cannot blame them either. They are maximizing their ROI.
sunjay140@reddit
Nvidia designs CPUs
Jack071@reddit
Because Nvidia is great at developing and selling the current market fad. They made bank with mining and now they're dominating AI datacenters.
AMD kinda lags behind; they got into the mining market late and ended up with a surplus of GPUs they had to sell discounted as gaming GPUs, and now with AI chips they are behind Nvidia by any metric.
clampzyness@reddit
it just means that Intel is going the wrong route by doing this. The title is not really misleading imho.
octagonaldrop6@reddit
A lot of people concerned with geopolitics would say it’s the necessary route.
Ok-Acanthisitta3572@reddit
Intel is a private company, not part of the US government. It isn't their job to worry about politics. If the US government wants Intel to make bad business moves for political reasons, then they need to pay for that.
octagonaldrop6@reddit
The US government HAS been paying for it with billions of dollars in funding. It’s also naive to say that investors don’t care about politics/geopolitics. Domestic manufacturing is what differentiates Intel from their competitors.
It may be their downfall but if anything bad happens to Taiwan they’ll be the only game in town.
Exist50@reddit
Drop in the bucket compared to the money required.
Empirically, they don't. The stock surges any time someone hints about Intel cutting the fabs loose.
That "if" is precisely the problem. If they spent all their earnings on lottery tickets, no one would call that a sound bet, right?
octagonaldrop6@reddit
It’s not a lottery ticket it’s a hedge.
Exist50@reddit
If the only scenario that bet pays off is for a remote possibility, then yes, it effectively is a lottery ticket. And just like the lottery, if you pour all your money into it, you're probably just going to end up bankrupt.
octagonaldrop6@reddit
A hedge is the opposite of a lottery ticket, it protects your investment. Let’s say an investor believes in the exponential growth of the AI sector. They want to invest in a portfolio of semiconductor stocks, but they realize that they are now making a huge bet against China fucking with Taiwan. Thus companies like ASML and Intel become an effective hedge against a worst case scenario.
Exist50@reddit
By that definition, Intel's bet isn't a hedge at all. They're sacrificing pretty much the entire rest of the business to do foundry. If it fails, they have no alternative or backup plan. A 3rd party using Intel Foundry may count as a hedge for them... but as we can see, pretty much no one is.
octagonaldrop6@reddit
Intel’s bet isn’t a hedge, the investors’ bet on Intel is the hedge. This is why things like the CHIPS Act exist and why the Intel stock is slowly recovering despite such a bleak outlook.
There is value in what Intel is doing, even if it’s not the best way to maximize profits in the short term. Competition is good for everyone. If AI advancement becomes akin to an “Industrial Revolution” as some people are suggesting, then TSMC would have the entire world’s balls in a vice grip without foundries like Intel and Samsung.
Exist50@reddit
And investors are not betting on Intel. Literally the opposite. The stock jumped 10% on the mere rumor they would ditch the fabs. Investors hate the waste of money.
That gives money to TSMC and Samsung as well. Plus, doesn't remotely cover the cost of Intel Foundry.
People were saying that at $30, lol. It's literally down >50% YTD.
If there's profit to be had, then when is it supposed to materialize? Because the milestones keep getting pushed further and further away.
And why is manufacturing more of a long term bet than design? Designs are what people ultimately buy. Nvidia is worth several times TSMC.
What about Nvidia? Intel's de facto sacrificing their chance to compete in the GPU space.
octagonaldrop6@reddit
Haha that’s a lot of claims I disagree with. We’ll have to see how it plays out, I would be absolutely shocked if the Intel fab business was allowed to fail. I think Intel’s value will become very apparent and everyone that bought at $20 is going to be quite happy.
Nvidia can’t survive without TSMC, but TSMC can survive without Nvidia. Nobody else has the capacity nor the advanced processes. I’m of the opinion that once the chips arms race really gets going, things like this are going to matter.
RemindMe! 3 years
Exist50@reddit
Nvidia can and has used Samsung before. Their value isn't in the incremental benefit from TSMC. Hell, they're not even using N3.
I do not believe the political will exists to continue spending so much money on a failing private company, regardless of long term interests.
octagonaldrop6@reddit
The newest nodes will always go to smartphones or similar because smaller dies are better suited for a lower yield process. Nvidia will use N3 when it’s more mature.
At this point there’s just no way Samsung could meet Nvidia’s needs. GPU demand already far exceeds supply, even with TSMC’s capacity. Hell, the Saudis just floated the idea of a trillion dollar compute investment to Jensen on stage. There would also likely be no generational performance improvement (or even a regression) due to Samsung’s inferior nodes.
If TSMC all of a sudden only produced chips for AMD there would be a great shift in market share. At some point availability would become more important than CUDA.
As for whether the political will to save Intel exists, that will have to be seen. Politicians can’t be counted on to understand the gravity of the situation, but there could definitely be advancements in computing that could cause a rude awakening (AGI, military applications, encryption breaks, cyber attacks, etc.). There is certainly significant investment in TSMC from Taiwan.
Also to some degree I’m just arguing for the sake of arguing at this point, neither one of us can see the future. Maybe we’ll both be wrong and ASICs will be the path forward.
Exist50@reddit
Apple has been making huge N3 dies since last year. Clearly the node is very much capable of it.
Currently more of a packaging bottleneck than logic dies. Also, that assumes the current levels of demand will hold indefinitely.
You talking about Altman's nonsense? Think you're reading too much into it. $1T is a ludicrous number for headlines and hype. There's no basis in reality.
Nvidia's design gap is currently worth more than a process node. Which is why they can get away without using the bleeding edge. And Samsung might very well remain more competitive than Intel is.
But that's not going to happen, so why theorize?
Same, but it makes for fun discussion.
octagonaldrop6@reddit
I could be wrong about this because I don’t know how big Apple’s Ultra/Max/whatever dies are nowadays, but I was under the impression that they are still nowhere near the size of a GPU like Blackwell. Even the Grace CPUs seem pretty massive. Even if they are, it’s a low volume product compared to the iPhone or base model MacBook chips.
You say it “assumes this level of demand will hold indefinitely”. But you are discounting the fact that the demand could just as easily increase as it could settle down.
I wasn’t talking about Altman’s nonsense, it was a talk between a Saudi prince and Jensen on stage at some recent conference. He was half joking and it could have been in reference to Altman’s comments, but it was clear that the Saudis were prepared to make massive investments in AI (their own models, not OpenAI). Things are going to get crazy when players like that get involved, along with governments. Am I bought into the hype? Maybe but I also work in the industry so my livelihood kind of depends on it.
I’m not sure exactly what to make of Nvidia’s design gap, but I definitely agree that it’s not just the process node. Probably more to do with CUDA and software support than anything though. Nvidia played MUCH nicer with PyTorch and TensorFlow in the early days and now they are just the standard for AI researchers. Would that still hold if there was a competitor on a more bleeding edge node? Who knows. Depends on the performance gap.
“so why theorize?” All we’ve been doing is theorizing! The chances of Nvidia going back to Samsung is about as likely as China invading Taiwan. Because that’s the only reason they would do it lol.
I’m standing by the opinion that Intel foundry will stay afloat, and possibly even thrive. I would put money on it (in fact I have). All it takes is for the US government to get serious about the value of compute.
Sorry I haven’t been doing inline quotes, I’m on mobile.
Exist50@reddit
I can't find exact numbers, but it seems like the M3 Max is likely 600+mm2. Might not be quite as big as Blackwell, but close enough.
I don't think the money exists for that level of spending. The market is white hot right now, but corporate budgets aren't infinite. Going to need to start seeing ROI for it to keep growing.
Frankly, doesn't matter who or what context. Anyone throwing out a $1T investment in AI should not be taken seriously. The entire Saudi sovereign wealth fund is ~$1T, for reference.
Quite frankly, I'm discounting the government entirely. The CHIPS Act was a one-off. Intel can't rely on that every couple of years. Even the committed spending is falling under scrutiny.
So the rest of Intel has to bail out the fabs long enough to be profitable, but the nodes aren't even good enough for Intel's own 2026+ AI chips. Don't see why any 3rd party would be interested.
octagonaldrop6@reddit
The money doesn’t exist yet, but I can see a world where it is more valuable to have more compute than the other guy instead of 10x more F-35s than the other guy. You are discounting the government but I think that’s where the money will come from.
Not much is going to change in the next 2 years, but in 10-15 things could be very different. If we end up in that world then the money will be there. And so will the need for domestic manufacturing (which will be just as important as ROI when it comes to gov spending).
Who knows if it will be a steady build to that point or if the bubble will burst first, but I don’t think it’s an unrealistic future.
The $1T investment was definitely a joke, I misremembered. I rewatched it and the Saudi Minister of AI brought up $7T (and Altman) jokingly but then went into a serious discussion about building up compute, mentioning “Saudi ambition”. Jensen even talked him down a bit, saying you don’t have to invest so much now because compute is increasing at such an alarming rate that any investment will be soon obsoleted. It was a surprising thing to hear from a GPU salesman but he must have more demand than he knows what to do with. Not sure if it proves any of my points but the talk is worth a watch. Jensen did say that he projects that there will be a $1T investment into datacenters (as a whole) over the next 4-5 years which is still staggering.
It’s definitely possible that you’ll be right and Intel will fail before governments start spending big money. There won’t be any more CHIPS money in the short term. But I could see the US upping their spending in the medium-long term and I hope they don’t end up kicking themselves for letting Intel fail.
Splitting off the foundry could be a good move as long as it continues to be a foundry and isn’t just scrapped for parts.
I thought AMD was finished in the Bulldozer days but look at them now. I still think Intel will recover, more likely than not. You may lean the other way and we may have to agree to disagree on that.
Ok-Acanthisitta3572@reddit
The US government hasn't provided Intel with any special funding. They created a blanket subsidy that applies to foreign companies too. Indeed TSMC is the only company to have actually successfully built a fab with CHIPS Act subsidies.
Exist50@reddit
And if none of them are willing to put money behind it, how much do you think they believe it?
HylianSavior@reddit
A lot of things get rolled up into R&D spend, including software development. Being the GPU market leader for so long, I imagine Nvidia has their fingers in a lot of pies. They developed and pushed CUDA, raytracing, and nowadays they're training their own AI models.
Not to say that AMD hasn't been killing it with a scrappier team; pushing for future innovations as the market leader just requires more spend in general.
acc_agg@reddit
Given the quality of their GPUs it's not at all surprising.
cloudone@reddit
Nvidia does a lot more than just GPUs
Just go watch Jensen’s GTC keynotes
only_r3ad_the_titl3@reddit
i just assumed that everything they do is basically the same type of tech, just in different use cases.
TheAgentOfTheNine@reddit
Nvidia also spends on software, unlike AMD as you can see in their drivers, ROCm, etc*
*Is joke
quildtide@reddit
The only joke I see here is ROCm support.
aphosphor@reddit
Intel is spending a lot more researching many more technologies.
akluin@reddit
And Intel covers a wide area of technology, not only CPUs and GPUs
Ok-Acanthisitta3572@reddit
That's not really misleading, because ultimately these are companies trying to earn a profit. The fact that Intel invests so much into an unprofitable foundry business is the whole problem. They could leverage TSMC's innovation too (and indeed are to some extent).
Professional_Gate677@reddit
The foundry business hasn’t started running customer wafers yet so how would you expect them to be profitable?
Ok-Acanthisitta3572@reddit
Intel has lots of fabs losing money currently.
Professional_Gate677@reddit
What does that have to do with your statement being completely false?
NeroClaudius199907@reddit
The title is making it sound like the R&D expenses are apples to apples.
Ok-Acanthisitta3572@reddit
They ARE apples to apples, you're just missing the point. Ultimately, these companies don't exist to make CPUs or GPUs; they exist to make money. Nvidia isn't investing R&D in fabs or x86 because it's much better spent on AI.
NeroClaudius199907@reddit
Nvidia and AMD wouldn't be competitive if they had fabs. They do exist to make money, but there's nuance to expenses. Alphabet spends more on R&D than Nvidia but their market cap is lower
Ok-Acanthisitta3572@reddit
That's literally the whole point. Intel isn't competitive because they're making horrible investment decisions. Cutting staff developing CPUs and GPUs while wasting tens of billions on money-losing fabs.
NeroClaudius199907@reddit
Don't look at the short term... fabs are a long-term play. Plus they're getting subsidies, and the West (US) needs those fabs for insurance
Exist50@reddit
This "long term" seems to be pushed out every year. Meanwhile, are CPUs and GPUs not also long-term bets? They're not going away.
The subsidies don't cover the cost to keep the business going, and Intel is a company, not a government. If the government "needs" it, they should be paying for it.
soggybiscuit93@reddit
x86 CPUs are a shrinking market relative to the overall compute TAM. Maintaining competitiveness there while focusing on GPUs or other competitive advantages long term isn't a terrible strategy.
Exist50@reddit
Well they recently did some more layoffs on GPUs as well, so it's not clear what their long term strategy even is. Seems to be a tug of war with management, and each time the rope goes one way, another team gets laid off.
NeroClaudius199907@reddit
When your opponent has 70% of the industry and will always have customers even at not-great yields, your long-term payoff is not immediate.
People don't remember that in the 80s the government bailed out Intel by crippling the Japanese. Plus Taiwan just brings insecurity by itself; they're going to keep pumping money in.
Exist50@reddit
No, TSMC has customers because they offer a reliable schedule, including yields and performance, as well as absolute node leadership. Intel has none of that, so what's the business supposed to be?
They're also crippling their standing in AI, a market where Nvidia probably has >>70%, and is bigger than manufacturing.
Who's "they"? Because the government hasn't been.
NeroClaudius199907@reddit
To them, their fabs are their most important assets. Intel is in an inferior position in design & hardware vs AMD/Apple/Nvidia. If they sacrifice that and go completely fabless, they'll get beaten.
They're willing to cripple AI because they stand to gain from the potential benefits of their fabs.
They'll get subsidies & tax breaks.
Exist50@reddit
Their fabs are in a way worse shape than their design business. The design business could survive completely at TSMC, as you see with LNL/ARL. The manufacturing business could not survive standalone.
And clearly no one but Gelsinger believes their fabs are an asset, much less their most important one.
They haven't been getting anywhere close to the amount needed, and there's no indication that more is forthcoming. Even the government expects results.
NeroClaudius199907@reddit
LNL & ARL are not impressive. ARL is losing to Intel's 7nm & AMD's 5nm products, and LNL doesn't soundly beat AMD's cheaper 4nm product & is still worse than Qualcomm and Apple.
Good that the man in charge and making the decisions believes in your most important assets. They're still employing external fabs for their designs.
If the government expected results they wouldn't have given them the subsidy & tax break, nor would Apollo have signed up.
Exist50@reddit
No, but the situation is still far worse on the manufacturing side. They can sell LNL/ARL for a profit. Meanwhile, the fabs lose $7B/yr. They can't even clearly beat TSMC's '23 node in '26.
And LNL is actually pretty decent, while ARL was crippled in part because MTL was forced to accommodate the failures of Intel Foundry.
Or he just continues to double down on a failed bet. The board will fire him if this keeps up.
The government subsidies cover about 1.5yr of losses, nevermind capex.
NeroClaudius199907@reddit
Intel already saw that if they rely on external fabs for their designs they'll still lose and barely make profits.
They'll beat TSMC's '23 node in '26
ARL was crippled because the design is terrible
Yes they should continue; if AMD had given up at $2 they wouldn't be where they are now.
It will get extended, there's nothing politicking can't fix
Exist50@reddit
Barely profitable is still miles better than the fabs.
Yet they're using N3 for products then, not just 18A.
And the design was terrible in large part because of foundry demands.
AMD literally did the exact opposite. They cut manufacturing and doubled down on design. Intel cancelled their Zen equivalent.
Ok-Acanthisitta3572@reddit
The problem here is that you need both a fab and a process. Intel doesn't have a good process and no government money to develop one.
ghenriks@reddit
Nvidia doesn't have/couldn't get an x86 license, hence why they've gone ARM
And their AI spending includes not just the GPU but also CPU and networking and a lot of software
And their AI spending includes not just the GPU but also CPU and networking and a lot of software
clampzyness@reddit
Actually the opposite: if the title were trying to make it apples to apples, it should have said "Intel spends more R&D in its CPU dept vs AMD's CPU dept". The title is more generalized, not apples to apples.
Ok-Acanthisitta3572@reddit
This forum is so nuts. People wanna spin everything to be pro-Intel even when the facts are plain as day that Intel is doing poorly. People here literally praising Intel for pissing money away. 🤣
clampzyness@reddit
that's the internet for ya, i too sometimes have reading comprehension problems but it's all good.
nukem996@reddit
Intel also does more than CPUs. They make network cards and IO controllers, create mainboard reference designs, and more.
half-baked_axx@reddit
Yep. The reality is that once China invades Taiwan, we're all switching to Team Blue without a choice.
Over-dependence on TSMC is worrisome.
h1zchan@reddit
That explains why both Nvidia and AMD have Taiwanese CEOs
tissboom@reddit
And they will continue to lag behind until they put out a GPU that is on par with what Nvidia is putting out. But they have to start somewhere and we'll see where it goes.
k2ui@reddit
Relating R&D to market cap is ridiculous. What happens when tech bloggers write about finance
anival024@reddit
Why? It's perfectly valid to look at those metrics to judge whether or not a company's expenditures are proving fruitful.
auradragon1@reddit
I think the title of the article implied that a large R&D should lead to a large market cap, which is ridiculous.
soggybiscuit93@reddit
R&D to Revenue or profit would be a much more useful metric to determine current success objectively.
R&D to Market Cap ratio is a measure of the market's confidence in whether or not that R&D will pay off.
The market is voting "value trap". That's the statistically most likely outcome. But INTC's pricing reflects that risk, with a potential upside if R&D efforts pay off by the end of the decade.
phire@reddit
Market cap doesn't measure fruitfulness.
It only measures the "finance experts" opinions of fruitfulness. Their opinions are often distorted by external factors and buzzwords like "AI"
k2ui@reddit
I mean, feel free to compare them, but you won’t get any helpful or actionable information from it.
Intel has a much broader product portfolio and competes in many more markets than Nvidia, which impacts not only Intel's research priorities, but also the market's view of its valuation. One simple example: Intel manufactures chips, Nvidia doesn't. Intel is spending on manufacturing technologies.
Exist50@reddit
And if all those areas aren't making much money? Sounds like this metric makes sense to highlight inefficient investment.
k2ui@reddit
You realize that profit from R&D takes years, right?
“Inefficient investment” today is what turns into actual breakthroughs.
Exist50@reddit
Intel's not some startup. It's been many years for plenty of investments that just continue to drain money. Their foundry, for example, has been a loss for a decade or so by current accounting.
Or it's just money down the drain. How many AI companies has Intel acquired and discarded? Think we're up to 3 or 4 now. Or look at them spending years and hundreds of engineers on a new CPU core to throw it all out because management started chasing a new squirrel.
masterfultechgeek@reddit
Tech bloggers might not be the best financiers but looking at financial ratios IS valid.
https://www.investopedia.com/terms/p/pricetoresearchratio.asp
It's one aspect of looking at how much potential the company has for future revenue streams.
It's just that if there's been a history of high R&D and low market cap growth at a company, it hints that there are some major inefficiencies.
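For reference, the ratio in that link is simple to compute; here's a minimal sketch with made-up placeholder numbers (not any company's actual figures):

```python
# Price-to-research ratio (PRR), per the Investopedia definition above:
# market capitalization divided by trailing-12-month R&D expenditure.
def price_to_research(market_cap: float, rd_expense_ttm: float) -> float:
    return market_cap / rd_expense_ttm

# Hypothetical example: a $100B company spending $20B/yr on R&D -> PRR of 5.
# Sustained heavy R&D without market cap growth drags this ratio down,
# which is the "inefficiency" signal described above.
print(price_to_research(100e9, 20e9))  # -> 5.0
```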
soggybiscuit93@reddit
High R&D and low market cap can also just as easily hint at product lines the R&D is being spent on not yet hitting the market.
The subjective interpretation of that is whether you believe INTC to be a value trap or an undervalued opportunity.
On the whole, the market believes INTC to be a value trap, hence its market cap. But it's a matter of guess work for both sides, and that risk is present in the potential upside (or slow burn). The future is uncertain.
masterfultechgeek@reddit
Intel has had a big R&D cost structure for years. A huge chunk of things they developed didn't really succeed.
ARC GPUs
Optane
10nm process
A half dozen CPUs designed for the 10nm process
A bunch of NAND SSD stuff
If you compare AMD's R&D to Intel's for the past decade... AMD spent A LOT LESS on R&D in say 2016 than Intel did. AMD came out with an awesome product in 2017 though... and Intel not so much... and you can say the same thing for the next few years.
auradragon1@reddit
Not sure why you're being downvoted.
Spending a lot on R&D doesn't mean they have a lot of great competitive products coming up. Intel has failed a ton in basically all markets.
To me, it's more inefficient R&D right now than some game changing leadership product coming in the pipeline.
soggybiscuit93@reddit
The failure of 10nm is well known at this point, and it accounts for two of your 5 points. And it's an outsized portion of that list.
ARC GPUs were never going to be profitable in a 1st gen. A single gen just isn't enough to recoup the NRE, not to mention the need for market penetration pricing. ARC also synergizes with other product lines. It was never about just desktop dGPUs.
A lot of Intel's R&D goes towards their manufacturing. Intel products alone no longer provides the volume to amortize that NRE. Hence, the key metric to determine Intel's future over the next 3 years is how many external fab clients they can secure between 2025 - 2027.
That uncertainty drives their share price, and the risk is baked in.
FascinatingGarden@reddit
I don't know. You tell me.
riklaunim@reddit
R&D is future potential, not current profits. And as mentioned, Intel is a very wide company, from fabs and their nodes to final products.
Capt_Picard1@reddit
When has spending necessarily equaled innovation?
Kresche@reddit
Oh my god!? That's like... 3 AMDs!
shimszy@reddit
Intel seems to spend a lot of money on things with dubious returns, like tech standards: Thunderbolt, the Ultrabook standard, Optane. They need to focus on core lines of business; this sprawl does no good.
nekogami87@reddit
I highly doubt Thunderbolt is dubious, especially compared against the MANY MANY various USB 3.x, USB 4.x, and whatever they call their variants now.
shimszy@reddit
Nothing wrong with the tech... Thunderbolt, Optane, and the Ultrabook form factor are all great. It's just that surely they haven't made their money back from developing these technologies... and hell, I bet most people have no idea that Intel even developed Thunderbolt and Ultrabooks
nekogami87@reddit
That's where I'd disagree on Thunderbolt; I'm pretty sure it helped sell a shit ton of laptops imo, especially after Apple showed what could be done (daisy chaining, etc...). I really think it was worth it. Now, Optane, maybe not indeed.
nekogami87@reddit
I am more surprised by the efficiency of what AMD is able to do when competing in GPU/CPU/DC at the same time, with a much tighter budget, larger product base and fewer engineers.
Yes, Nvidia is still dominating the high-perf GPU side, but again, with how AMD is placed, it's still a miracle they can do so much (OK, the miracle might be named "Intel doing jack shit with their advantage for the past 8 years, not counting Lunar Lake")
puffz0r@reddit
I wonder how much of that R&D budget is actually marketing funds; Intel is infamous for paying vendors off to preferentially use their products.
cjj19970505@reddit
Sigh... Can't believe ppl actually believe this shit.
Intel devotes more resources to collaborating with vendors, while AMD doesn't. Thank god there is a Linux example where you can see what is going on, since it's open source, instead of just going full conspiracy theory because you prefer AMD and believe that AMD's suboptimal software ecosystem is due to the sabotage of its rival.
https://www.reddit.com/r/hardware/comments/1g2kp51/analyzing_issues_regarding_preferred_core/
Ryan-Jackman-Reynold@reddit
I just came here to say Arrow Lake better be good esp since I’m upgrading from 5800X3D
puffz0r@reddit
For gaming? No, it's worse than 14th gen according to intel.
theQuandary@reddit
This is really interesting when you realize that AMD spent nearly $6B in R&D last year, but ARM spent just $1.1B.
ARM makes interconnects, memory controllers, all kinds of IO, chipsets, etc. They make NPU designs. They make GPU designs. Instead of one CPU design every other year, ARM makes multiple CPU designs every single year (MCUs, DSPs, 5xx, 7xx, 9xx, server cores, etc). ARM's top-end designs have beaten AMD/Intel in IPC for a while now as well. This also excludes all the software they develop and maintain for all this stuff.
Even if you are convinced that x86 can be just as fast as ARM, it should seem obvious that it costs WAY more money to get x86 anywhere near competitive.
TwelveSilverSwords@reddit (OP)
It's truly incredible what ARM is accomplishing with the small amount of money they have.
masterfultechgeek@reddit
ARM's done some awesome stuff...
At the same time let's look at AMD
Zen 2 -> Zen 3(1.19x) -> Zen 4(1.13x) -> Zen 5(1.16x)
Going by AMD's numbers, they're up around 56% on IPC. If you want to jump on the hate bandwagon and knock Zen 5 down a bit, they're still roughly at parity.
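That 56% is just the generational uplifts compounded; a quick sketch using AMD's claimed numbers from above:

```python
# Compounding AMD's claimed generational IPC uplifts from a Zen 2 baseline.
uplifts = [1.19, 1.13, 1.16]  # Zen 3, Zen 4, Zen 5, each vs. the prior gen

total = 1.0
for u in uplifts:
    total *= u  # gains multiply, they don't add

print(f"{(total - 1) * 100:.0f}% cumulative IPC gain")  # -> 56%
```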
They got clock speed increases too...
AMD had similar uplift.
ARM parts are still winning at ultra low power draw (but AMD is catching up with Strix Point) and AMD seems to be winning on perf/core in server setups (a 128-core AMD server wins on performance AND perf/watt)
TwelveSilverSwords@reddit (OP)
The timeline is important. Since we are talking about Cortex X1 -> Cortex X925, which is a 4 year timespan. The appropriate comparison would be Zen3 -> Zen5.
masterfultechgeek@reddit
Zen 2 was the CPU at the start of 2020.
Zen 5 is the current CPU in mid 2024.
The 5 years before that were PHENOMENAL. 50% or so since 2020 is solid from ARM, but it was something like 300% in the years leading up to that. It's such a HUGE JUMP that I now think of the BEST ARM CPUs (M4) as having higher IPC than the best x86 CPUs... which is insane given that 10 years ago I thought of ARM as low-power junk that... well, it ran. ARM got WAY WAY better.
VastTension6022@reddit
funny you say that considering even back in 2017, Anandtech's Andrei Frumusanu said:
"x86 lost the IPC lead a long time ago"
masterfultechgeek@reddit
x86 is still winning on:
clock speed
manufacturing costs
peak performance in a 256+ core configuration
Apple has basically 0 products on the market that compete in the data center, which is where x86 designs seem targeted.
So yeah... Apple has a good CPU for relatively high margin mobile devices.
Find me something you can plug 20 accelerator cards and/or 50 SSDs into....
The usual argument is "yeah but idle power draw is 2W lower" which is true... but you'd need twice as many servers and connecting them would require a NIC that consumes MUCH more than 2W.
Geddagod@reddit
Doesn't matter much when you also aren't winning by meaningful margins in 1T perf; if anything it would be worse considering how much extra power you need to hit higher clock speeds (though I guess everything is relative; hitting 6 GHz on a super-wide design will prob require even more power than on a narrower core).
Qualcomm's and Apple's cores are quite competitive in area vs AMD's and Intel's designs. If you look at it from a core complex perspective, as in core+L2+L3+SLC, I'm pretty sure the recent ARM designs are even better there in comparison to AMD and especially Intel.
This doesn't seem to be anything inherent to the core design itself though.
Apple doesn't seem like the company to ever make server products, except maybe for internal use? IIRC there were rumors they were planning to do so a couple years ago, though I don't know how credible they are.
I'm pretty sure Qualcomm claimed they will be pushing Oryon cores into server products though.
The problem here is that these ARM cores are prob suited for server designs even better than AMD's and Intel's cores, considering how much better they are at ultra low power. The cores in Intel's and AMD's server SKUs are actually only being fed a couple watts each, due to how power-hungry the chip-level interconnect is and just from the sheer number of cores sharing a relatively small TDP budget.
monocasa@reddit
You can't compare IPC apples to apples between ARM and x86. x86's complex instructions mean that it executes fewer instructions to perform the same task.
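Here's a toy model of why raw IPC misleads across ISAs (all figures invented for illustration, not measurements of any real core):

```python
# A core can show lower IPC yet finish the same task sooner if the task
# compiles to fewer (more complex) instructions on its ISA.
def seconds_per_task(instructions: float, ipc: float, clock_ghz: float) -> float:
    # time = instruction count / (instructions per cycle * cycles per second)
    return instructions / (ipc * clock_ghz * 1e9)

x86 = seconds_per_task(instructions=1.0e9, ipc=4.0, clock_ghz=5.5)  # ~0.045 s
arm = seconds_per_task(instructions=1.2e9, ipc=5.0, clock_ghz=4.4)  # ~0.055 s

# The hypothetical ARM core "wins" on IPC (5 vs 4) but loses on wall-clock
# time, because the same task needs ~20% more instructions at a lower clock.
print(f"x86: {x86:.3f}s, ARM: {arm:.3f}s")
```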
mb194dc@reddit
Yet no real innovation since Core in 2006, and they've fallen miles behind in manufacturing...
Intellectually bankrupt?
rsta223@reddit
Lol, this shows you definitely weren't paying attention to what Intel did in that period.
Process wise, they did the first high-K metal gate and the first finfet, both of which were huge innovations, and architecture wise, Nehalem/Westmere was a substantial step over original Core 2, and Sandy Bridge was another big jump over that. Intel's core design has advanced substantially, to the point that at iso frequency, a single modern Intel core is around twice as fast as a single Conroe core.
(And that's a single core vs a single core at the same frequency, so it ignores that the modern cores can run faster and you can fit far more on a single chip now)
mb194dc@reddit
It's all just incremental on top of core.
What Intel needs, is another breakthrough like that.
At that time, AMD had also overtaken them. They got really complacent.
Exist50@reddit
They had that. Then they decided CPUs don't matter, so they killed it.
rsta223@reddit
Huh? What are you talking about?
Exist50@reddit
Royal.
rsta223@reddit
The process advancements I mentioned were absolutely not incremental, and Sandy Bridge was as big a jump as Core was from PIII.
You can complain about "increments" all you want, but the fact is that increments stack. As a result, we now have cores running twice as high a clock, while doing twice the work per clock cycle, all while having 4-8x the cores per chip compared to the first Conroe era stuff. That's a massive uplift in performance, and can't be dismissed as "no real innovation since 2006".
3Dchaos777@reddit
I’m sure you could do better
Exist50@reddit
Anyone who could at Intel gets laid off.
3Dchaos777@reddit
DEI hires who worked 2 hours a day weren’t going to make a difference
Exist50@reddit
Lmao, you think that's who Intel's been firing?
TwelveSilverSwords@reddit (OP)
Intel's management is a dragon that's eating the company from the inside.
3Dchaos777@reddit
Who else?
Exist50@reddit
Their core teams. Anyone not on the right side of management/corporate politics that particular day.
3Dchaos777@reddit
Seems like it’s the people who are an unnecessary cost burden to the company. As in, if their work doesn’t create a direct positive ROI more than their salary, then they are at risk. Which is how it should be under hard financial times.
Exist50@reddit
Lmao, they literally offered voluntary separation. If you think they're doing layoffs based on merit, you don't know Intel history.
ExeusV@reddit
Not to everyone
masterfultechgeek@reddit
Jim Keller said in an interview that Intel was dysfunctional.
How much someone spends on something isn't necessarily how much value they get out of it.
Also what someone THINKS how good something is isn't necessarily how good it is.
Intel has been dysfunctional for years. Very inefficient.
Blueberryburntpie@reddit
Didn’t Jim Keller quit working with Intel because he felt he was constantly being stonewalled trying to push through reforms?
PotentialAstronaut39@reddit
I think the story was along the lines of internal conflicts / corruption.
Basically, instead of cooperating, people/departments would sabotage each other for personal gain within the company with a lot of internal conflict bullshit happening.
Berengal@reddit
That's what happens when you don't have competition for a long while. It doesn't make companies lazy (companies aren't people), but without competition a company can't measure the competitiveness of its output, meaning the people working there are rewarded for their ability to play office politics rather than for their actual results.
III-V@reddit
I don't think this is true, or else we wouldn't see this dynamic absent from organizations that don't have a profit motive.
It depends on other factors, mostly leadership, but also culture (like, culture on a societal level, outside the organization). And frankly, a lot of organizations still run into this where there is plenty of competition.
Berengal@reddit
Of course there are other factors too and several ways this can play out, but the key point is that without competition the company doesn't get good feedback on its output. It removes a powerful factor keeping the incentives of the decision makers aligned with the purpose of the company, which in a typical company leads to the typical internal office power struggles taking over, but in any given organization there could be other factors playing a larger role and it could play out very differently. There could even be other factors keeping the organization on task even in a monopoly, e.g. public oversight, like what government organizations have.
SpaceBoJangles@reddit
Never clicked until you put it together like this. It’s so obvious now.
ComeGateMeBro@reddit
With 110k people, having gone through many really arbitrary crappy layoffs... is it any surprise backstabbing and sabotaging is the norm?
Affectionate-Memory4@reddit
That's been my experience after 10 years here as well. Great engineers and well managed small teams, but there's clear bloat and red tape where there doesn't need to be.
masterfultechgeek@reddit
Great managers can allow their teams to make great impact.
Too many great managers cancel each other out and slow things down.
Exist50@reddit
Poor management can waste any amount of money or talent.
ThatGamerMoshpit@reddit
Well they did make a brand new product line…
gunfell@reddit
The reason is that they have too many employees. This might be temporary, and once they release 18a they might go back to normal
Exist50@reddit
Yes, Intel consistently makes the wrong investment decisions. The big one now being doubling down on failed fabs instead of focusing on the much more lucrative design business. Which is ironically required to keep those fabs running as well.
Beautiful-Active2727@reddit
Isn't Intel the one that pays so AMD can't have a motherboard with the same color?
Quintus_Cicero@reddit
That’s perfectly normal for an outsider trying to catch up to the market leaders. There literally isn’t a story there. If you enter a new market with limited experience in it, you’ll have to spend easily twice as much as the others to catch up.
octagonaldrop6@reddit
Intel trying to catch up isn’t relevant because the article shows similar numbers over the past 10 years when Intel was very much in the game.
It’s more to do with the foundry vs fabless business model.
So it’s actually even less of a story.
Quintus_Cicero@reddit
My fault for not reading the article, for once.
I thought it was referencing GPU R&D but the article is just taking all R&D despite acknowledging that Intel has a lot more products to do R&D on. And Nvidia’s market cap has more to do with the AI bubble than the actual valuation of the company at this point.