Intel’s potential exit from advanced manufacturing puts its Oregon future in doubt
Posted by self-fix@reddit | hardware | 111 comments
FieldOfFox@reddit
Anyone else see a serious problem with… essentially only 50-100 people in the world having the knowledge to actually make this shit work at scale?
MR_-_501@reddit
The supply chain goes wayyyy deeper than that: you have layers and layers of companies outsourcing parts of the production/R&D for semicon equipment.
zerinho6@reddit
For a long time I've pondered this issue: we already have the idea of patents, but for some reason knowledge about tech and other stuff that could impact humanity's progress and evolution is kept to a few people or companies, never released and potentially lost.
Case in point: the recent security drama with Asus and Gigabyte. Those companies are supposedly smart enough to work with Nvidia, graphics, and BIOS drivers, but it looks like the game-dev situation, where they know how to make a game but their programming skills are worse than some teenager at school who has actually studied a language for 2 years or so. How many advancements, competitions, and creations could we have if such processes were actually documented and there were a known path for you to learn what you need to know to become an expert?
Imagine if learning to code were such a secret, hard niche and you couldn't learn it from a few YouTube videos, a bootcamp, or personal projects.
conquer4@reddit
"Just use AI bro" <- I hate it
aurantiafeles@reddit
It’s worse than aerospace and building commercial planes, because even with adequate resources, if all those people decided to cash out and retire, we’d be manufacturing stuff from 10 years ago.
conquer4@reddit
I'd like to go back 10 years ago, seems like a better time.
ElementII5@reddit
You mean the people at imec in Belgium?
https://www.youtube.com/watch?v=hDo5P578wJI
Moist-Ideal1263@reddit
I think there is a big difference between research and manufacturing.
expertonliner@reddit
I don't think this is the case for advanced nodes. It's not a contest of who has the best theory, which can be done by small teams; it's iterative learning and 'empirical' research, collaborative problem solving under time pressure. Intel in particular seems to be fucking up majorly in DTCO, EDA, etc. despite having enough 'theory' to convince Pat that the project and schedule were feasible.
surf_greatriver_v4@reddit
Moreover, we're almost entirely reliant on a single country (two if you count the machine manufacturer) for cutting-edge fabs.
these new processes just demand so much, and an extremely high startup price is just the nature of the game at this point
No_Sheepherder_1855@reddit
Many parts of the supply chain have single suppliers that no one else can replace, too.
Gwennifer@reddit
Like how there are only, what, 2 suppliers of Ajinomoto build-up film substrates at that performance level?
vexargames@reddit
Maybe Google and Facebook can move to Oregon so the prices of homes in the Bay Area go down, and new companies can move in.
mustafar0111@reddit
Here let me solve Intel's lack of customers for 14A.
Take one of the GPUs currently in your pipe and make 24/32/48GB VRAM versions of it using 14A, provide software support, and price below the other players. Make sure its inference speed is at least equivalent to an RTX 3090 or better.
They'll sell out so fast you won't be able to keep them in stock.
OutrageousAccess7@reddit
It would take at least three years, while TSMC proceeds toward its 1-nanometer process.
mustafar0111@reddit
I don't doubt TSMC and its customers are going to kick the ass of anything coming off 14A in terms of performance and power efficiency.
But the GPU market has absolutely absurd markups going on right now, and there is definitely a gap at the lower end of the market where there is just nothing to even buy. Especially for cheaper inference cards with a decent amount of VRAM packed on.
Nvidia has all the higher VRAM cards locked up behind a massive paywall right now. AMD seems content to follow along.
nanonan@reddit
Strix Halo.
xternocleidomastoide@reddit
Gamers really think they are the center of the tech universe, don't they?
ResponsibleJudge3172@reddit
To be fair, would we see so much doom and gloom were it not for gaming? People are convinced Intel is shit simply because they don't beat the current X3D in gaming
dabocx@reddit
This sub has become more and more gaming focused over the past few years.
imaginary_num6er@reddit
You meant AI focused
mustafar0111@reddit
To be fair, that is what the majority of people on this subreddit probably use GPUs for, so it's the first place their minds go when you say GPU. The conversation would obviously be different on localllama or something.
SERIVUBSEV@reddit
Gamers are the worst consumer base of any industry.
Most are kids and man-children; many in their 30s spend hours every day on Twitter fighting about how the PS5 is better or defending Xbox's cloud and multi-platform strategy.
If people had more than 2 brain cells they would support competition for its own sake, and we could keep getting massive performance jumps year on year that could lead to native 4K 120fps on mid-range cards within 2 generations.
Instead we have brand warriors who buy the most expensive consumer electronic device and don't even question why it's still on a 5-year-old node, because they can't stop frothing at DLSS and “neural rendering”, which we wouldn't need if we had competition that got us 4K 120fps performance at mid-range anyway.
Strazdas1@reddit
I must have fewer than 2 brain cells then, because I think buying a worse product for more money just because it's the competition is a bad idea.
conquer69@reddit
But DLSS looks better than TAA. If your concern is image quality (assuming that's why you want 4K), then you would want DLSS, because it is objectively superior to the TAA we had before.
The obsession with resolution comes from the pre-PBR era where resolutions and SSAA were the only way to deal with specular shimmering. That was over a decade ago.
LingonberryGreen8881@reddit
Plus a good chunk of the die area is used for ray tracing, which most gamers don't even use and which has no other use case.
ResponsibleJudge3172@reddit
Man-children froth at the mouth over better scaling and more efficient rendering because it has AI in the name.
kuddlesworth9419@reddit
I just play indie games where I can run them at native 4k 60+ fps on a 1070. Got back into photography as well.
mustafar0111@reddit
That is why I focused on inference speed and VRAM.
Professional-Tear996@reddit
48 GB GDDR6 and a GPU of let's say 300 mm² die size, fabbed on 14A, sold for $500.
Meanwhile Nvidia's ASP for gaming GPUs is $400. And they don't use anything more advanced than 5nm class nodes.
Yeah - you don't know what you are talking about.
Strazdas1@reddit
That's some weak GPU you've got there once you lose all that die area to memory controllers.
mustafar0111@reddit
Whoosh.
I wasn't talking about gaming. The big hint was 'inference speed'.
Professional-Tear996@reddit
Learn reading comprehension before giving these hot takes.
mustafar0111@reddit
Here's an idea: don't reply to comments if you don't understand what the other person is talking about.
This was never about gaming GPUs.
Professional-Tear996@reddit
Read the whole comment slowly. Especially what the comparison to Nvidia's gaming GPUs are meant to convey.
You understand diddly squat, that much is clear.
mustafar0111@reddit
I did read it. It had nothing to do with my comment.
You are talking about gaming. I am not.
I don't think you even know what my comment was about. Tell me, what do you think I'm referring to? Because I guarantee most other people here know.
Professional-Tear996@reddit
Why would Intel sell your hypothetical AI inference device for $500 with 48 GB memory and fabbed on 14A when Nvidia's ASP is just $100 less selling gaming GPUs fabbed on 5nm?
mustafar0111@reddit
Intel is already trying to do it with B60.
There is a real need and demand for local inference. As evidenced by the used market right now. The cards can be produced because they are being produced.
Nvidia is not going to produce any AI accelerators at $500 or below, ever. They have each tier of VRAM locked behind a particular price point. AMD is a bit cheaper at every tier but doing exactly the same thing.
Professional-Tear996@reddit
B60 is also on 5nm. Just like Nvidia.
mustafar0111@reddit
And by the time Intel actually tapes out a new die for 14A, Nvidia and AMD will be on to other TSMC nodes.
Professional-Tear996@reddit
And none of them would be selling what you described for $500.
But Intel should, according to you, based on what exactly?
crshbndct@reddit
Intel should, for the same reason that AMD gave us mainstream 8-core/16-thread processors for the same price as Intel's 4/8.
There are no bad products, only bad prices.
mustafar0111@reddit
Correct. AMD and Nvidia will not sell high-VRAM GPUs at a sub-$500 price even if it is profitable, because it would eat into their even larger profits in the higher-end GPU and data center accelerator markets.
Intel doesn't get a choice. It's not competitive in the GPU market or the AI accelerator market. That is the reason they gave up on model training completely and instead focused on inference cards. They know they're already fucked and missed the boat.
So rather than just sitting in the corner and suiciding themselves out of the market, they could sell into the one part of the market left open to them that AMD and Nvidia won't touch: the budget market.
Strazdas1@reddit
Well, you just redesigned the entire chip architecture to work with a new, large bus that takes up so much space that your compute die is half the size now.
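A quick sanity check on the bus width, assuming 16Gb (2GB) GDDR6 devices run in clamshell mode — illustrative assumptions, not a known board design:

```python
# How wide a GDDR6 bus does a 48GB card need?
GB_PER_CHIP = 2        # one 16Gb GDDR6 die (assumed device capacity)
BITS_PER_CHIP = 32     # per-device interface width
VRAM_GB = 48

chips = VRAM_GB // GB_PER_CHIP              # 24 memory devices
channels = chips // 2                       # clamshell: two devices share one channel
bus_bits = channels * BITS_PER_CHIP         # 12 * 32 = 384

print(bus_bits)  # 384-bit — flagship-class bus
```

A 384-bit PHY plus controllers is flagship-class floorplan territory, which is the point above.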
theholylancer@reddit
The problem with 14A is that it needs a profitable external customer to spread risk out.
And you're suggesting they sell things cut to the bone, trying to find competition in a market that is heavily cornered by CUDA, hoping to find users who will likely custom-code for it because of how cheap it is.
This, as you said, WILL fill the production capacity, but the problem is Intel is seeking external investment to make it happen rather than just filling capacity.
This is no longer a marketshare play, where they are willing to eat margin to gain marketshare like the B580 and A770 were willing to do because their core design sucked and the perf/mm² was shit. And what you are suggesting is more or less a marketshare play rather than a profitability/funding play.
Henrarzz@reddit
That’s a good way to make Intel dead in record time lol
ElementII5@reddit
A modern leading edge node supported by a single product that is "priced well below other players"?
You clearly have no idea what it costs to develop a leading-edge node... Even if Intel could manufacture everything else they sell on 14A, they would still need external customers to recoup the cost.
mustafar0111@reddit
I didn't say manufacture everything else they sell. I said an affordable GPU with good inference speed and a decent VRAM loadout.
I think a lot of people replying do not even understand what I'm talking about and seem to think my comment is about gaming.
Professional-Tear996@reddit
What you call affordable would be termed loss-making if we are talking about leading edge nodes in 2028.
You don't understand fab costs, pricing and margins.
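For a rough sense of why, here's a hypothetical back-of-envelope using the standard edge-corrected die-per-wafer approximation; every dollar figure (wafer price, yield, memory and board cost) is an illustrative assumption, not a known 14A or market number:

```python
import math

WAFER_DIAMETER_MM = 300      # standard wafer
DIE_AREA_MM2 = 300           # die size from the example above
WAFER_COST = 25_000          # assumed leading-edge wafer price, USD
YIELD_RATE = 0.70            # assumed defect-limited yield
GDDR6_USD_PER_GB = 5.0       # assumed memory cost
VRAM_GB = 48
BOARD_AND_ASSEMBLY = 90      # assumed PCB/VRM/cooler/assembly cost

def dies_per_wafer(diameter_mm, die_area_mm2):
    """Classic die-per-wafer approximation with edge loss."""
    r = diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

gross = dies_per_wafer(WAFER_DIAMETER_MM, DIE_AREA_MM2)   # ~197
good = int(gross * YIELD_RATE)                            # ~137
die_cost = WAFER_COST / good                              # ~$182
bom = die_cost + GDDR6_USD_PER_GB * VRAM_GB + BOARD_AND_ASSEMBLY

print(f"rough BOM: ${bom:,.0f}")  # ~$512 before margins, channel, R&D
```

Even with generous assumptions, the bill of materials alone lands around the $500 sticker before Intel, the board partner, or the retailer make a cent.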
mustafar0111@reddit
If they can't produce something on 14A that is cheaper than TSMC in 2028, Intel is going to be bankrupt.
Professional-Tear996@reddit
If TSMC produced what you described for Nvidia in 2028 at the projected wafer costs, it would bankrupt Nvidia as well.
mustafar0111@reddit
Only on a leading edge node. That is not required for what I am talking about.
Professional-Tear996@reddit
You are literally talking about a loss-making item fabbed on 14A sold at the price of gaming consoles, with Intel hoping that people buy it over the alternatives.
dabocx@reddit
The margins for that would be nonexistent, or negative if anything. It's a good long-term investment, but I don't know if the company can do something like that while also slowly losing market share in everything else, including DC CPUs.
flat6croc@reddit
If it was that easy...
imaginary_num6er@reddit
So 14A+++ till 2030
Helpdesk_Guy@reddit
Imagine holding onto a process (performant as it is) for as long as possible, yet refusing for more than a decade straight to develop a PDK for external customers (so they could capitalize on it, and you could make bank), to actually make a living off such a Forever-Node™ like their 14nm± for once, or their golden 22nm.
… then complaining about vacant fabs on said nodes while being short on money! Peak comedy.
It's truly incredible how Intel constantly ignores reality.
imaginary_num6er@reddit
The irony is that Intel is actually maxing out capacity of their 7nm node while other nodes sit idle.
Helpdesk_Guy@reddit
… while Intel does basically nothing about any of it, with no actual PDK at hand for the processes in question.
Only to lament heavy foundry-related losses every other quarter on their earnings calls!
Weren't they trying to milk their 10nm/Intel 7 for quite a while longer? Seems the market asks for newer stuff.
Intel should NEVER have been granted even a single cent of subsidies without the subsidy package being tied to a mandatory requirement to develop PDKs for at least their older 14nm/22nm processes and open those up to industrial foundry customers! Then 20A/18A later.
Who cares about anything leading edge when Intel can't even get a PDK in place for trailing edge, or even lagging edge, and get their age-old processes from a decade ago up and running?!
RazingsIsNotHomeNow@reddit
I don't think they have actually booked any of the subsidies yet. They could still end up not receiving a cent.
Helpdesk_Guy@reddit
Yes, they have. Intel already received $2.2Bn last December and January.
ResponsibleJudge3172@reddit
Absolute peanuts
nanonan@reddit
Matching the effort they've made towards the required milestones.
Helpdesk_Guy@reddit
Pft, peanuts! Wouldn't YOU like to own such sums?! I think many would love to have these 'peanuts'!
The point still stands, Intel already received BILLIONS in funds from the CHIPS and Science Act. Period.
In your defense though, Intel deliberately refused to disclose having received it for literal months, and also withheld that it had already received over $500M USD from the EU last October.
So all of it was only disclosed afterwards in January/February, likely to uphold and support the very narrative Intel constantly pushes, of "mean government trying to starve poor Intel to death intentionally!!"
ResponsibleJudge3172@reddit
How many billions, and how does it compare to the tens of billions that setting up fabs has cost TSMC and Intel?
jmlinden7@reddit
They had a PDK for external customers, but it was not very straightforward to use and Intel didn't provide a lot of customer support to help customers use it. They signed on Altera as a major customer.
Helpdesk_Guy@reddit
I don't know … Did they ever actually have one?
Public opinion (which Intel basically confirmed) is that Intel hasn't had an actual PDK for any of their processes for external clients, and wanted to address that with 20A (which was conveniently knifed before it was ready?).
Only to repeat that with 18A, whose PDK v0.8 or so eventually became ready. Then came v0.9 and v1.0.
'Sign on' is a bit of a stretch here, I think. Intel needed to shower Altera with cash to get them to come over.
Not to use the term 'bribe' here – only for Altera to immediately pay for it with their independence afterwards, and they have been left to rot by the wayside by Intel ever since.
Helpdesk_Guy@reddit
I recently stumbled across an article I *have* to share here! Mind the similarities!
Now for the fun part: When do you think this article was written?! xD
The article could readily be from last week, right?
Geddagod@reddit
18A and iterations till 2030.
14A is coming out near the end of 2030. LBT claims 2028-2029.
iwannasilencedpistol@reddit
Why are comments on Intel threads so unbelievably insufferable?
rustyhalo93@reddit
2-3 people are starting most of the fights; maybe they're rage-bait AI.
iwannasilencedpistol@reddit
They don't like it being pointed out but they're all active on the Intel_stock sub too
ResponsibleJudge3172@reddit
You mean the guys doing victory laps at the thought of only AMD doing CPUs?
SkillYourself@reddit
I'm noticing how when one of them copped a ban, the other started turbo posting...
logosuwu@reddit
Half the people here have a hate boner for Intel lol
iwannasilencedpistol@reddit
A reddit hallmark
Numerlor@reddit
Eh, there are a few regulars who bring down Intel on less popular posts, but higher-traffic posts have plenty of misinformed people shitting on Intel for things they didn't do.
It's the same across many tech communities that just have a rage boner against Intel, probably from YouTube creators baiting drama.
RazingsIsNotHomeNow@reddit
Nah, that's just this sub in general. It attracts very argumentative people. Like a Linux forum lol
Strazdas1@reddit
Because there are some people in this sub who are hellbent on intel failing.
No-Relationship8261@reddit
What do you mean? The fact that Intel is dead and these people in Oregon will be jobless soon should not be a surprise to anyone, including the people who will get laid off.
Even as a bystander, the immense amount of rot in Intel is clear; as an insider it should be even more obvious.
Arrow Lake was the first time in my whole life that I saw performance regression in a next generation chip.
First time in my whole f***ing life.
fritosdoritos@reddit
AMD's Bulldozer CPUs had more cores, higher clock speeds, and used more power than the preceding Phenoms, but often performed worse.
CapsicumIsWoeful@reddit
Regression only in gaming benchmarks. Productivity was a net gain compared to the previous generation.
This subreddit seems to forget that people use CPUs for more than just gaming.
Intel isn’t going anywhere. They’re too important from a national security perspective, and they still dominate the OEM space for corporate customers (Lenovo, Dell, HP etc).
Their new laptop CPUs are genuinely good from a performance vs power consumption perspective.
I don’t even own anything Intel (I use a 9800X3D at home), but I can at least see that Intel isn’t going the way of Blockbuster or Kodak just yet.
wintrmt3@reddit
DEC surely won't die.
steve09089@reddit
Must’ve been a short life then, if Rocket Lake isn't on the radar.
And "regression" is an exaggeration; the only regression is its atrocious memory latency, which affects gaming.
Its multi-core and single-core performance don't suffer from that flaw in other workloads, and are still uplifts over the previous generation.
The main problem with Arrow Lake is that it's made on a very expensive node with not much to show for it in terms of efficiency or performance uplift over its predecessor, save for maybe laptop users.
NewMachineMan@reddit
Probably one of many reasons why Intel and their leadership got lazy
Reminds me of Xbox and their fans, and look what happened lol.
meshreplacer@reddit
Did Intel get all that CHIPS money? Curious, since it was for building factories and creating jobs. Or is it "all gone, sorry"?
Executor_115@reddit
They've received $2.2 billion out of the $7.9 billion award.
Vushivushi@reddit
Intel receives the money as they meet milestones, so it's more like matching their investments rather than a blank check.
Strazdas1@reddit
No. They received a small part of it so far.
costafilh0@reddit
A bunch of BS!
Every chip designer and manufacturer has their ups and downs.
Just because Intel isn't capitalizing on the AI boom and its stock price reflects it, we get all this BS media coverage about it, like it's the end of the world.
This is likely a signal to buy the stock in the next market correction and hold it for the next decade.
Helpdesk_Guy@reddit
All this factory-stuff with Intel laying people off, really got me thinking …
Intel really needs customer-contracts for their foundry, to get things going, I suppose?
It's as if it would *tremendously* help Intel, when they'd get, I don't know …
Like contracts for a shipload of tiny stuff, to quickly get up the yields and make their processes actually viable!
Imagine someone came over to Santa Clara, just to offer them such a contract, like for millions of tiny little chips!!!
iguessthiswasunique@reddit
Honestly Switch 2 would have been a great opportunity for Intel.
Samsung 8N in 2025 is incredibly outdated, especially for a mobile device where efficiency goes a long way.
I can’t imagine Intel couldn’t have offered to produce T239 on something like Intel 3 and made it worth their while. Not to mention, if it went well enough Nvidia would be more likely to use their foundry for other products as well.
NoRecommendation2761@reddit
>Honestly Switch 2 would have been a great opportunity for Intel.
Intel never had a chance. Tegra is specifically designed on a Samsung node with Samsung IP. Had Nvidia thought an investment in a design change was warranted, they would have gone to TSMC, not Intel.
Intel's PDK is confusing at best and outright hostile at worst. It isn't even a finished product, and it doesn't guarantee that you'll get chips that meet spec.
At least you have something with 18A. You don't even have a half-baked PDK with 14A. How would anyone be expected to design a chip for an IFS node? They aren't even the cheapest in town.
There is a reason there is a negligible number of external customers for IFS: too expensive, too unfriendly, and too complicated to design for.
It was always either TSMC, which offers the best node and will babysit customers from start to finish, or Samsung, which offers a second-rate node but also the cheapest prices.
steve09089@reddit
8N was an existing node NVIDIA already designed their chips for, and Nintendo was looking to save money and go safe, not go cutting edge.
RazingsIsNotHomeNow@reddit
Yeah, a Switch 2 contract doesn't make sense unless Intel heavily subsidized it. At which point it's a desperation play to get revenue to help scale customer relations.
Helpdesk_Guy@reddit
So? Who cares?! Are we going to pretend now that Intel never subsidized things?
Intel has ALWAYS subsidized the living pencil out of lousy dead-end products to combat superior offerings from others, only to keep their uncompetitive sh!t alive with billions of dollars …
Yet now, when it's basically do or die and their very survival as a company is on the line, NOW there are concerns over subsidizing things!? Are you kidding?
So what?! Intel has always made such desperation plays; nothing new. Only this time it would be official.
If Intel could blow through $5.7–$7.5Bn USD subsidizing the sh!t out of Optane, or $12–$15Bn trying to overthrow the mobile market with their inferior Atom, or spend $4.5–$5.3Bn to pressure their utterly outclassed 1st-gen Arc graphics into the market at OEMs, and whatnot other blunders …
Then Intel *ought* to have a few billion lying around to jump-start the foundry, no?!
scytheavatar@reddit
Customers don't want to pick Intel because Intel has a track record of overpromising and underdelivering. Even Intel themselves don't trust their own foundries. Until the fundamental issues of Intel's foundries are solved, there's no point in Intel "jump-starting" them.
I keep comparing Intel's foundries to the situation AMD is in with their gaming GPUs: people keep wanting AMD to drop their prices and undercut Nvidia. But does that actually help AMD and get people to buy AMD GPUs? All it does is make Nvidia drop their prices too. What AMD has to do is close the gap with Nvidia on software and make people feel AMD cards are not worth less than Nvidia cards. That's the same approach Intel needs to take with their foundries: make people feel they are not inferior to TSMC.
Helpdesk_Guy@reddit
That's a very good comparison actually, revealing that people are insincere about the whole thing and only want AMD to act as a price-kicker against Intel and Nvidia, just to get their beloved brand cheaper …
No, not even that helps. The worst part is that AMD had times when their cards were not only less expensive and featured more VRAM, but were more powerful than anything Nvidia had – and they were still left on the shelf, thanks to mind-share.
That is a train that left the wreck of a self-sabotaging station ages ago. It's not ever coming back to Santa Clara.
ResponsibleJudge3172@reddit
Rich from the same people who in every thread espouse dreams of AMD dominating both CPUs and GPUs.
Helpdesk_Guy@reddit
Yes, of course! That's also another issue at hand – constantly promising sunshine, lollipops, and rainbows, yet not even having the basics, like a process design kit (PDK), at hand to offer any foundry customers …
That was always the single biggest red flag in all of this: Intel itself goes to TSMC, yet asks for foundry customers.
100%. … and the very first step in becoming a foundry is offering PDKs so your processes can be designed for!
wonder_bro@reddit
The issue isn't throwing around money but rather the lack of PDKs for anything not called 18A, or more specifically 14A. Even if customers wanted to go with an Intel legacy node, there is simply no way to design for them.
Helpdesk_Guy@reddit
Exactly. It's mind-blowing that to this very day Intel STILL doesn't have a PDK for any of their older processes for external foundry customers, yet does nothing about it, with NO PDK at hand for those process nodes.
Only to lament heavy foundry-related losses every single quarter on their earnings calls – it's truly peak comedy.
Imagine holding onto a process for as long as possible (performant as it is), yet REFUSING for more than a decade straight to develop a PDK for external customers (so they could capitalize on it, and you could make bank), to actually make a living off such a Forever-Node™ like their 14nm± for once, or their golden 22nm.
… then complaining about vacant fabs on said nodes while being short on money! Santa Clara is nuts.
bestanonever@reddit
It's funny, but AMD has a lot more experience working with other companies on custom chips, and they don't even own their fabs anymore!
But there's a reason they won the majority of the console deals: they were willing to work on custom designs for cheap and survived long enough to actually offer the best CPUs right now.
Intel, way back then, probably wasn't looking for such meagre earnings. Joke's on them now.
Helpdesk_Guy@reddit
It's even more ironic that AMD's in-house expertise on custom chips and precious building blocks seems to have entered the house around the very time they had to axe the fabs due to financial constraints.
After all, they signed the big console deals shortly after, and had been in the Wii, Wii U, and others before.
Specialist-Hat167@reddit
Dying company. Sad to see
No_Sheepherder_1855@reddit
What $150 billion in stock buybacks does to an $80 billion company.
Helpdesk_Guy@reddit
The bad thing isn't even the $150Bn in buybacks (well, in a way it is, but you get the idea).
The actual worst part is that of the $152.05 billion Intel has spent on stock-buyback programs since 1990 (on a tanking stock that has basically just moved sideways since the dotcom bust in the 2000s!) …
… virtually A THIRD of that sum was wasted on buybacks just since AMD's 2017 launch of Ryzen, Threadripper, and Epyc – no less than $44.6 billion USD!
Quatro_Leches@reddit
Intel is cooked as the kids would say
Helpdesk_Guy@reddit
Intel is not just cooked for good … it's 100% toast, double-grilled and salt-spanked, and has been hanging in the smoker since.
It's truly remarkable how Intel has seemingly perfected the art of constantly sleepwalking into disasters.
3Dchaos777@reddit
Found the AMD senior janitor
NewMachineMan@reddit
This is what happens when a group/company focuses too much on politics or themselves instead of being progressive or innovative lol
Almost foreshadowing given the state of the country lol