Intel draws a line in the sand to boost gross margins — new products must deliver 50% gross profit to get the green light
Posted by Geddagod@reddit | hardware | View on Reddit | 270 comments
the_dude_that_faps@reddit
If Intel is going to stop bleeding market share to Apple, AMD, and ARM CPUs, they're going to need GPUs.
Maybe they temporarily abandon discrete GPUs, but it makes zero sense for them to abandon them altogether. APUs still need a good architecture, and they still need to catch up in software. And since devs won't pay attention to them if the market doesn't, they still need to convince us they have a good story to tell.
If they do all the legwork for good APUs and build a scalable architecture, it makes no sense not to go for discrete.
Their bigger issue with this new mandate is being able to produce something competitively priced. I don't see them being able to compete against AMD, let alone Nvidia, given that they all share the same process node and not even AMD has 50% gross margins on GPUs while also building more cost-effective parts than Intel.
Maybe if they can actually use Intel foundry?
OutrageousAccess7@reddit
well, can celestial and druid survive?
Exist50@reddit
They already canned Celestial. Unless they can find a way to do Druid very cheaply and/or a way to sell it for higher margins (AI? iGPU reuse?), sounds like it's not getting funded.
mockingbird-@reddit
The only way it can be done is if Intel can get silicon for cheap from its foundry for Arc.
No way can Arc compete if it's made at TSMC.
Exist50@reddit
Well Intel now has to pay market rate for even "internal" nodes. They might get some early adopter discount, but sounds like it will be difficult to get the kind of historic pricing they've had.
Why not? Nvidia and AMD do. Intel "just" needs to have competitive IP, SoC, and drivers.
Strazdas1@reddit
Market rate at IFS may be less than market rate at TSMC though. I'm sure IFS is offering lower rates to attract external customers.
Exist50@reddit
They surely are, but if it's actually good and cheap enough to be compelling, Nvidia/AMD could use it themselves and then that advantage disappears. Besides, the fact that Intel themselves continue to prefer TSMC (even over supposedly equivalent/better internal nodes) for graphics/AI indicates that whatever the pricing is, it needs to be better.
Not_Yet_Italian_1990@reddit
On the GPU side? AMD barely stays relevant. Single-digit market share.
Even if AMD were in a position to make a move in the GPU space, it would be impossible for them to gain substantial market share due to TSMC allocations not allowing for them to do so. They'd need to have some sort of breakthrough Nvidia-killing product and know far enough in advance that they could bet the whole company on buying up an enormous amount of TSMC allocation.
As long as TSMC is the only provider of cutting-edge nodes, we're going to see sky-high pricing and extreme scarcity. And we're also going to see zero movement within the GPU market.
Exist50@reddit
The problem isn't TSMC. You can find plenty of both Nvidia and AMD GPUs on shelves right now.
Not_Yet_Italian_1990@reddit
They'll basically all eventually be sold, though, even if it requires discounts.
If you go to PCPartPicker you can't even get a 7600 XT at MSRP. There are a couple 7700 XTs at MSRP. Basically all of the Ampere/RDNA 2 supply has dried up or is insanely priced. The cheapest 7800 XT is $100 over MSRP...
If both companies sell virtually all of their stock, and they probably will, the market share isn't going to change.
In order for AMD to start reclaiming market share, they're going to need to produce a lot more cards. And it's simply impossible to do that because they're feeding at the same trough as Nvidia.
Strazdas1@reddit
Why would you want a new 7600 XT now? It's feature-obsolete.
Not_Yet_Italian_1990@reddit
That was my point.
Exist50@reddit
You realize they're constantly producing more, right? At least of the newer generations. Supply for the older will dwindle as they're phased out, yes. There's no indication that it's lack of supply that limits AMD's marketshare.
Not_Yet_Italian_1990@reddit
If both companies are selling virtually all of their stock, it absolutely does mean that.
You just said supplies of the older cards will dwindle. Meaning that virtually all of them will eventually be sold, at some price or another. But AMD's market share will never substantially increase, because it can't substantially increase.
If Nvidia has 10x the consumer GPU allocation through TSMC every generation that AMD does, how can AMD ever catch up to them? It's a mathematical impossibility.
Exist50@reddit
Only because eventually they discount them, use them for returns, etc. The old gen must compete with the new.
Not_Yet_Italian_1990@reddit
Sure... but... that's my point.
If basically 100% of AMD cards and basically 100% of Nvidia cards are going to find their way into machines at some point, then market share just becomes a matter of how many cards are produced. Blackwell and RDNA 4, combined, aren't even 10% of the total market yet, I don't think.
Basically, if Nvidia has 90% GPU market share, it's because they make somewhere in the ballpark of 90% of GPUs.
If AMD gets into a position where their cards are much more desirable than Nvidia GPUs, then they'll just sell out/prices will increase. But their market share won't increase because they're supply constrained. There's some wiggle room to get extra allocation from TSMC, but not a lot, and it's more expensive (they'd probably need to buy it from someone else), and it's not just some switch that they can flip and, like... double their production overnight. They need to allocate wafers several months or even years in advance.
Does that make sense?
Exist50@reddit
They scale production with demand. It's not just a fixed number set in stone with TSMC. Granted, they absolutely can miscalculate, but I'd hardly suggest AMD's market share problem can be resolved by just ordering more.
Not_Yet_Italian_1990@reddit
To an extent they can scale production, but it's something that takes a lot of time. If they have a product that's super successful, they run into supply chain issues that take several months to a year to iron out before they can capitalize, and by that point it could be too late and interest may not be as high.
If they book a certain number with TSMC and they want more, they usually need to get that allocation from someone else if TSMC is all booked, which they basically always are these days. TSMC books and allocates more than a year in advance for their cutting-edge stuff. Apple always goes first, then everyone else follows.
I agree, though, that it's not just an issue of ordering more. The problem, though, is that they're always booking based upon the sales of the last generation which are always just a small fraction of what Nvidia books. They also have smaller margins than Nvidia, so overbooking is much more of a risk for them. Even if they expect UDNA to be really good, they're not going to be booking 2x or 3x what they did for RDNA4... it's just too much of a risk. They can try and scramble and find components and wafers later on, but it's not going to fundamentally change the market.
If AMD cards instantly sell out throughout an entire product run, they're still selling a lot fewer GPUs than Nvidia even if Nvidia cards are widely available throughout the entire run. For market share, it doesn't really matter that much if a card sits on the shelf for 3 years or it sells out in 3 days, as long as basically all of those cards eventually end up in machines.
Case in point: the 9070 XT is a very in-demand card. But it sells at way above MSRP right now because there just aren't enough to go around. (And, yeah, board partner and retailers play some part in this as well)
mockingbird-@reddit
Look at the specs. The Arc B580 should be right up there with the GeForce RTX 4070 in performance, but it's not even close.
Strazdas1@reddit
B580 suffers from things other than just pure architecture, like that CPU overhead that simply hardcaps the card on anything but the best CPUs out there. It's a big leap over Alchemist, but there's still a long way to go to catch up with the know-how other companies have accumulated over 30 years.
Exist50@reddit
Yes. They either need to radically close that gap, or they will be forced to leave the market. The fabs can't save them if that kind of PPA gap persists.
theevilsharpie@reddit
Source?
Exist50@reddit
The closest you'll get to a public acknowledgement is Gelsinger's remarks about refocusing on iGPUs. Beyond that, well they refuse to talk about Celestial for a reason.
theevilsharpie@reddit
That's a lot of words to say, "I made it up."
Exist50@reddit
No, just that I don't have something irrefutable I can link you. It's no more fake than N3 ARL is.
But you're perfectly free to wait around for it. Will just be in vain. You can see how quickly the "anti-Intel FUD" cycle becomes reality.
RandomFatAmerican420@reddit
Ya but one could argue that not having a whole dGPU lineup, and not putting them in laptops at all, is “focusing on iGPU”.
Exist50@reddit
I'm not basing that claim on Gelsinger's comment, just using it as an example of external evidence.
Strazdas1@reddit
Celestial is too far into the pipeline to cancel now. Druid who knows.
RobsterCrawSoup@reddit
I suppose that may depend on how they are going to amortize R&D costs. If they recognize the development of celestial and druid as also contributing to gaining long term GPU market share beyond the immediate product generations, then maybe it still looks good. The GPU market isn't cooling down at all and if Intel can break into the data center GPU market in the next few generations, there is a lot of money to be made there. Also Intel isn't going to be getting away from integrated GPUs, so developing discrete GPUs for the consumer market can rightly be seen as spending a bit extra to be able to capture some extra revenue in a different segment.
Of course, part of this depends on whether Intel is going to keep catching up to Nvidia or not. B580 has been encouraging, but the lack of a B770 was not, and both of these things are surely old news internally at Intel as they probably are already bullish or concerned about Celestial, while we haven't a clue yet.
If there's anything that is starting to feel like a threat to continued development of discrete consumer-focused GPUs, it's the recent emergence of unified-memory APUs with GPUs powerful enough to trade blows with lower-spec gaming GPUs. Apple led the way here, but now AMD has followed suit with the Strix Halo APUs, and while none of them can touch an RTX 5070 yet, it may very well be the new trend. That's only a maybe, but I wouldn't be surprised if Intel follows Apple and AMD's lead and introduces their own M-series/Strix Halo competitor.
upvotesthenrages@reddit
I've seen a few tests of local language models where the 5070 gets demolished by Apple.
I'm not super knowledgeable about this sector, but the specifics tests they did were all pretty compelling, though it could have been cherry picked (didn't seem like it).
They also tested a 5090 laptop against a Macbook pro, and the 5090 got completely destroyed.
Quite interesting to see how they work for AI tasks.
m0rogfar@reddit
It’s mainly the VRAM that causes the results to be so lopsided for AI. While Apple’s GPU in the M4 Max is certainly competitive against high-end discrete mobile GPUs, it’s not “completely destroys the 5090 Mobile” levels of impressive in most other tasks.
LLMs can just use an obscene amount of VRAM, to the point where even the Mac Studio with its 512GB of memory that can be used as VRAM can struggle with having enough on the latest highest-end models.
In that kind of scenario where memory capacity is king, the MacBook Pro with its up to 128GB VRAM and much faster swap to the internal disk since the SSD controller is on the same die was always going to win. The 5090 Mobile with only 24GB VRAM and a much slower swap flow, where the GPU first has to communicate with the CPU over a bus, and then the CPU has to communicate with the SSD controller over a bus simply never had a chance.
upvotesthenrages@reddit
Yeah, I figured it might be something like that. In this case the Macbook Pro had 32GB memory, vs a 24GB 5090.
Doesn't Windows 11 allow the GPU to read/write straight to the SSD and bypass the CPU? I could have sworn that was something I read a while ago.
Thanks for the awesome response btw!
Kryohi@reddit
It's still going to be super slow. Even 256-bit DDR5 is slow for big models; you need an Epyc with 12 channels to get good tok/s without a GPU.
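A quick way to sanity-check that claim (my own back-of-envelope sketch, not from the thread): for memory-bandwidth-bound LLM inference, token rate is roughly bandwidth divided by the bytes read per generated token, which for a dense model is about the model's size in memory. The bandwidth and model-size figures below are illustrative assumptions.

```python
def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Crude upper bound: each generated token streams the whole model once."""
    return bandwidth_gb_s / model_size_gb

# Illustrative numbers: 256-bit DDR5 (~180 GB/s) vs. a 12-channel
# Epyc (~460 GB/s), both running a ~70 GB (70B @ 8-bit) dense model.
consumer = est_tokens_per_sec(180, 70)   # roughly 2.6 tok/s ceiling
epyc_12ch = est_tokens_per_sec(460, 70)  # roughly 6.6 tok/s ceiling
```

Real systems land below these ceilings, but the ratio shows why wide memory buses matter so much once the model no longer fits a GPU's VRAM.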
upvotesthenrages@reddit
Sure, for large scale stuff.
For a local model it seems pretty damn great. A few $1000 and you can operate your own model with privacy intact and great performance for a few users.
I'm sure this'll all drastically change when we start doing local video stuff though.
dahauns@reddit
Technically it doesn't have to - with DirectStorage/"RTX IO" (the latter also being available in Linux AFAIK) the GPU would be able to use the fast flow as well, but my gut feeling says no one has probably implemented that yet.
GoblinEngineer@reddit
Nvidia also has its Tegra lineup. Although their primary use case is for Jetsons (used in robotics/IoT/SDCs), they may see the benefit of offering an Arm APU for the PC market soon. They already modified a Tegra APU for the Switch 2, why not open it up to other Steam Deck-like handhelds or what AMD is doing with their new APUs?
m0rogfar@reddit
Jensen has all but confirmed in interviews and investor calls that they’re gonna be doing ARM SoCs sooner rather than later, with leaks suggesting a launch of >100W ARM+Blackwell SoCs for laptops this fall.
phire@reddit
Good news: Gross margin doesn't include R&D at all, it's simply the cost of manufacturing/shipping the final product. (source)
Problem is, it's such a narrow metric that I suspect Arc will still run afoul of it. It doesn't leave room for any "we need to make this design so we can iterate on it later", unless that intermediate design can be sold with 50% gross margins.
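To make the gross-vs-net distinction above concrete (illustrative numbers of my own, not Intel's): gross margin nets out only the cost of goods sold; R&D and overhead come out of what's left.

```python
def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin: only cost of goods sold (silicon, boards, shipping) counts."""
    return (revenue - cogs) / revenue

def net_margin(revenue: float, cogs: float, rnd: float, overhead: float) -> float:
    """Net margin: R&D and overhead are subtracted after gross profit."""
    return (revenue - cogs - rnd - overhead) / revenue

# Hypothetical GPU line ($M): 1000 revenue, 500 COGS, 300 R&D, 150 overhead
gm = gross_margin(1000, 500)          # 0.50 -> clears a 50% gross-margin bar
nm = net_margin(1000, 500, 300, 150)  # 0.05 -> barely net profitable
```

Which is why a product can clear the 50% gross bar and still only just break even once R&D is amortized.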
RobsterCrawSoup@reddit
Oh yeah, I don't know why I was thinking about net when it says gross. I guess that means it will come down to how much performance they can get per mm2 of wafer (including yields) and how that compares to the competition. Of course, we may see some products not come to market, but that doesn't mean they aren't doing the R&D to try to have a more profitable offering in the future. Intel is stuck in the GPU business as long as they are in the APU business.
phire@reddit
Though, at what point are they making this "green light" decision?
Are they doing the bulk of the design, estimating the gross margin, and deciding to go ahead with taping it out? That might result in some generations being cancelled, but the actual design should still iterate forwards (though, really sucks for the driver people).
Or are the business people wanting this gross margin estimate before the R&D even starts? That could be pretty damaging.
I notice the article says "projects", not "products", being green-lit, which sounds more like the R&D, not the product.
Exist50@reddit
Before it starts in earnest, at least. They've been slashing RnD spending left and right. Last thing you'd want is to waste it on a project that never sees the light of day.
phire@reddit
You can kind of split R&D into two categories. Some R&D is specific to an exact product. If that product is cancelled, then that work is basically useless for anything else, except for engineering experience. The biggest example is the actual layout of transistors on the chip.
Other R&D is less specialised, it will carry over to future products in the same product line, or can be adapted to a related product line. For example, an engineering team improves the architecture of the shader cores (as HDL code), then that same code can be reused for the next generation, or improved even further.
What I hope is that Intel have these two sets of R&D somewhat separated at an organisational level, so that one team (or set of teams) can work on the generic R&D for their whole GPU technology set continually, and they only spin off a team to work on the specialised R&D when they have something worth making.
My understanding is that AMD have this kind of split for their GPUs: the architecture evolves continually, and you can sometimes end up with chips like the PS5's, which is mostly RDNA2 with a few missing features (and a few extra features too).
RobsterCrawSoup@reddit
Maybe project in this context just means a product line? I don't have a clue how much of the total cost of bringing these products to market is R&D and how much is the cost of the fab capacity. There's an opportunity cost to committing fab capacity to one product or another, so it makes sense that you would not commit as much of it to a product making a slim margin over one that makes a big margin, so maybe you trim your SKUs down and produce fewer of the ones you do make. Maybe that is why we have a B580 but no B770. It still doesn't make sense to me to have a "rule" like this, though. Obviously, they want to maximize profit, but it's not about absolute margins, it's about relative margins. If Intel has no products that will sell for a 50% gross margin, they still have to sell something.
Exist50@reddit
Intel has additionally promised multiple rounds of multi-billion dollar spending cuts. So there's not much money for RnD either.
Proglamer@reddit
If they thought of even more childish 2000s names, then maybe /s
SherbertExisting3509@reddit
Intel Arc Pro and Battlematrix should allow for Arc Celestial and Druid to survive
Exist50@reddit
That sells too few units and at too low a price to get the margins where they supposedly need to be. To have any chance, future dGPUs would need to be both much cheaper to develop and much more economical to sell.
SherbertExisting3509@reddit
Battlemage gets a pass because it's Intel's foot in the door with the local LLM market
There have been stories of people buying multiple expensive mac studios and mounting them to their walls to avoid paying for server time.
Stories of people scavenging 3090s for VRAM and creating Frankenstein 48gb 4090s. Local LLM market is desperate for a 48gb GPU under $1000.
Just look at the amount of hype for the Arc Pro B60 dual on the Local LLM subreddit.
I agree that Intel needs to quickly solve their PPA issues with Celestial. They can't run a business where they use 1.5x to 2x the silicon for the same performance as their competitors.
Exist50@reddit
It's just not that big a market. And in fact, the interest in Battlemage is exclusively because people don't want to pad Nvidia's margins. They just want as much hardware as cheaply as possible. If Intel holds to this 50% cutoff, then Battlemage wouldn't exist, and certainly not at these prices.
Raikaru@reddit
Nvidia has way more than 50% gross margins though?
Exist50@reddit
The exact number isn't particularly relevant to my point. The interest in BMG is intrinsically tied to its low price, pricing that is impossible at the margins Intel desires.
Raikaru@reddit
For Battlemage it's impossible, yeah, but if they improved PPA they could have margins and lower prices.
aminorityofone@reddit
AI is too important a profit source to give up. A GPU is just a means to an end for AI cards.
Exist50@reddit
Their client dGPU lines are wholly separate. They don't sell those for AI, at least not to a meaningful extent.
notam00se@reddit
In theory that changed with their Pro announcement, but we'll see if their ecosystem keeps up.
AI playground is a great first step for consumers, and their announcement to distribute docker/container files for linux support might work out.
They had a roadmap to combine Gaudi with Arc/XE/Max under oneAPI, but who knows if that is still in play.
Exist50@reddit
This tier of "Pro" cards is more about CAD and such than anything else. I'm sure someone will buy it for AI, but it's not enough to make a difference.
Gaudi is dead. Their datacenter AI solution is purely GPU-based going forward. What the software stack looks like still seems to be an open question, but that's more oneAPI vs openVINO etc.
dahauns@reddit
Dunno...at least from a marketing/product positioning angle they seem to lean quite heavily into AI for these cards, especially with things like Project Battlematrix.
Exist50@reddit
I think that says more about the state of 2025 marketing than the product itself.
6950@reddit
Intel's SW stack for AI is better than AMD's, tbf. Both are behind Nvidia, but the TOPS/bandwidth ratio for the B60 is nice. It's a compelling card, and at $500 it's not a loss maker. I can bet they will redirect B580 supply to the B60.
Far_Piano4176@reddit
the only people likely to be interested in their Arc Pro cards for AI are LocalLLM hobbyists on a budget.
SherbertExisting3509@reddit
Local llm subreddit went nuts over 48gb of VRAM under $1000
mockingbird-@reddit
Intel can keep making GPUs for AI.
DerpSenpai@reddit
Nope, Intel's design density is very, very low; it's impossible.
lusuroculadestec@reddit
A better question would be if the dGPU variations can survive. The architecture used in the iGPU has never been in danger of being cancelled. Intel will never kill off the iGPU, and the chances of them doing something other than Xe for the iGPU are even lower than them killing the iGPU.
Exist50@reddit
Celestial and Druid are specifically dGPU names.
mockingbird-@reddit
A person with stage 4 cancer has a better chance of surviving.
aLazyUsrname@reddit
If there’s one thing that always improves a product, it’s an extreme and myopic focus on immediate profits. I’m sure this won’t be toxic and anti-consumer in any way.
Exist50@reddit
Not even profit. This is margin. They're saying that even a product estimated to bring $1B/yr of profit at 30% margin would be canned.
zacker150@reddit
Keep in mind, this is gross margin, which excludes R&D and other overhead.
50% is basically the minimum to be net profitable.
Exist50@reddit
AMD makes a healthy profit at almost exactly that margin.
Strazdas1@reddit
I wouldn't call their profit healthy. If your profit is below a market index fund return, you are a bad investment.
Exist50@reddit
It's not.
zacker150@reddit
5-8%.
Exist50@reddit
Also, the more products you cut (because the incremental margin isn't >50%), the less you have to spread RnD over.
haloimplant@reddit
this is the thing people are missing, the point is to direct your R&D spend at things that make higher profits aka make the consumers want the products badly enough to pay a higher price for it
Exist50@reddit
This isn't Intel making necessary priority calls. They're just cutting things, full stop.
haloimplant@reddit
Cutting things is a valid call. If you don't have the talent or innovation to succeed in a space (or multiple spaces) you're in, something should be cut. Again, it's the foundry and its failure to find external customers that puts Intel in particular in a bind; a fabless company can scale down or shift focus without a huge physical cost overhead that's being wasted.
Exist50@reddit
Thing is, they're cutting businesses that are successful. Just not successful to an arbitrary metric.
Ironically, the one thing that's failed the most, the fabs, is also the one thing they're keeping.
shakhaki@reddit
This is literally Microsoft right now. All products have to meet a 20% or higher gross margin. Who cares that you could make $2B in gross margin dollars by selling at 10% when the target only lets you make $1B by selling at 20%?
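The arithmetic behind that comment, spelled out (all figures hypothetical): a margin target expressed in percent can filter out the plan that earns more absolute margin dollars.

```python
def margin_dollars(revenue_m: float, gross_margin_pct: float) -> float:
    """Absolute gross profit in $M for a given revenue and margin rate."""
    return revenue_m * gross_margin_pct

plan_low_margin = margin_dollars(20_000, 0.10)  # $20B revenue at 10% -> ~$2B
plan_high_margin = margin_dollars(5_000, 0.20)  # $5B revenue at 20% -> ~$1B
# The 20% plan passes the target, yet earns half the gross profit dollars.
```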
Strazdas1@reddit
At 20% gross margin I think the profit rate is negative for hardware products. The nondirect costs are never this low.
Hifihedgehog@reddit
Exactly. Books and books of organizational leadership platitudes and fancy business theories and they are missing the fundamentals that most any lay person can readily point out. Such is the hubris of corporate rot.
DifferentiationBy@reddit
The bean counters coming and telling the engineers to... get this... make money... is a sign of broken bean counters. Tbf, it could just be public info to calm down investors, since "only do things that make money" is an absurdly stupid way to run a business. They should try Intel drop shipping, I hear those margins are also high. Or maybe start fab terrorism, where they extract their protection money like the mafia.
Exist50@reddit
Generally, the history of Intel telling sweet lies to investors isn't a story that ends well.
shakhaki@reddit
I’m open to further viewpoints; it just seems to be lazy financial management to say everything has to hit a single target instead of understanding the market that the solution participates in. The only reason I could justify the viewpoint is efficiency of capital allocation, but if your firm's core competencies complement spaces where you can win and win well, 50% margin is astonishing in durable goods.
gumol@reddit
gross margin though. You can have a 30% margin and lose money on the product.
Strazdas1@reddit
No. Gross margin is margin. What people often incorrectly call profit margin isn't actually margin, just profit rate.
Exist50@reddit
Yes, but 50% is a pretty ridiculous target. AMD as a business is right around 50% gross margin, and they're in a much, much better state than Intel right now from both a product and expense standpoint. They could be trying to optimize for profitability, not arbitrary margin targets.
Alphasite@reddit
It’s the semi business. Broadcom has a gross margin of 68% (buoyed by software sure, but semi is 60% of the business) so 50% is doable.
Intels biggest problem has been an inability to focus and actually see things to completion.
Plank_With_A_Nail_In@reddit
Their two problems are that their foundry business shit the bed and they kept gimping products that looked to be competing with their golden goose x86 CPUs. Atom should have been way, way better, but nope. The fact that their CPUs aren't that good anymore might help with internal decision making.
Alphasite@reddit
It’s not even competing with x86. They can’t even do something other than CPUs.
Exist50@reddit
Broadcom is riding the AI wave hard right now. What does Intel have that can justify those kind of margins?
Then it's just going to get worse now. You think their incubation projects have 50% margin? If anything, they seem to be retreating to their fundamentals at the same time as they kneecap them.
Alphasite@reddit
For sure but historically Broadcom has had a very large and profitable hardware business.
Tbh I suspect this is mostly about bringing up average profit margins and trying to get more software into the mix since the margins there can hit 90%.
Exist50@reddit
They seem to have abandoned a lot of their SaaS efforts, but I guess they didn't really get far to begin with.
Alphasite@reddit
Intel? I didn't know they did any SaaS tbh. I thought in general they'd made excellent software but struggled to directly monetise it.
Exist50@reddit
They were making a big push in NEX for a while, and there were some side projects like Unison, but all of that got completely gutted by the layoffs and budget cuts. Hence their main software exec (Greg Lavender) leaving.
haloimplant@reddit
Right, and to be around 50% gross margin, they would be targeting that or better, and things don't always work out. If the plan is 40% gross margin from the start, it's not a good one.
gumol@reddit
which is probably why they're aiming for 50%.
Exist50@reddit
Maybe, but if Intel were actually competing well with them, that number would be lower for AMD. If 10 years ago AMD decided they'd only make products with 10-years-prior Intel margins, they wouldn't have made Zen or anything else.
It's like some cargo cult behavior. If we target margins like an industry leader would have, maybe leadership will magically come back? How on earth does that make any sense?
Hifihedgehog@reddit
The infamous bean counters and corporate red-tape bureaucrats who drew these awkward, arbitrary lines in the sand are the same myopic midwits who got them into this fix to begin with. Instead of focusing on making a quality product first and foremost, they are focusing on profit targets above all else, even at the expense of market leadership. This philosophy will only bleed them dry until all that is left are the easiest and cheapest things to produce, which ironically still offer less bang for your buck. If they focused on performance-per-dollar first and high profit margins second, they might have a chance. Flipped, those priorities just strip away all the value of the company to fill the pockets of investors who are really out to gut the company before hitting the bankruptcy eject button.
Konini@reddit
Perf-per-$ is not always the best target.
In the cpu market the x3D stuff is mostly not best in that category but they still sell like hot cakes strictly because x3D chips give specific advantages.
On the flip side in the gpu market the perf-per-$ crown usually goes to an AMD mid range card but nobody cares all that much. NVidia features win the day (with a big asterisk related to price fluctuations, but all things considered it’s what it usually boils down to). AMD finally started gaining some ground on that front.
Also remaining king of the hill with super expensive halo product works well for NVidia to drive sales lower down the stack too. Intel seemed to expect the same outcome for them but it appears that if that halo product does not have a significant lead over competition, it just doesn’t have as strong an effect.
I think the bottom line is quality product - doesn’t have to be the most efficient, economical or most powerful necessarily, but it has to be high up in every category. At least for gaming that is.
scytheavatar@reddit
It's a vicious cycle: lower margins mean less money available for R&D, which means your ability to be competitive with future products decreases. Intel right now is stuck in a cycle of Pyrrhic victories where even if they win with Alder Lake/Raptor Lake/Lunar Lake, they still continue their march towards irrelevance. Something needs to be done to break that cycle.
Exist50@reddit
Lower profit does, not lower gross margins. If anything, the pursuit of margins leads them to concentrate their RnD expenses into fewer and fewer products, instead of amortizing the cost over cheaper stuff.
MJ has been with Intel since 1996 and led the client group since 2022. She seems to be more part of the problem than the solution.
Her personal history aside, I fail to see how chasing after margin fantasies does anything to improve Intel's situation. They need to be realists now more than ever.
scytheavatar@reddit
You seem to think it is easy to make profits with cheap products, when the reality is that low margins means any sales drop rapidly turns a profit into a loss.
Exist50@reddit
It can be plenty reliable. See AMD's console revenue. And just as importantly, it denies competition an avenue to encroach on markets you care more about. Nvidia could just sell x70 and higher, but they make sure to offer just enough that AMD doesn't have a captive audience.
scytheavatar@reddit
AMD themselves have been happy to abandon the entry laptop market, cause it's a battle not worth fighting. Meanwhile, Qualcomm is not in a good position with their ARM laptops precisely because they don't want to make cheap ARM laptops. They all don't want to offer entry products cause it's not easy to make money in that segment.
Schemen123@reddit
Depends on your other costs... 50 percent gross can easily mean you are losing money.
Earthborn92@reddit
AMD gradually inched towards the 50% margins. Intel can't expect it to happen in a few quarters.
Exist50@reddit
Exactly. So either Intel's going to fail to reach this target, or they're going to cut most of their business because it doesn't make the cut. Or both.
Vb_33@reddit
I don't understand how this will affect their data center CPU business. Isn't that currently in "bleed profit in order to undercut and retain market share" mode? They point out that there are exceptions, so I guess the devil lies in the details.
Exist50@reddit
The way these things work in practice is that people will lie. Someone will massage the numbers to show 50% margin, even if it's based on complete nonsense, and use that to justify getting their project funded. Intel management stupid enough to set such unreasonable targets are also stupid enough to take those claims at face value, and they get paid many millions for doing so.
Remember the $500M of Gaudi accelerators Intel told investors it expected to sell? When in reality they sold a fraction of that? Same thing.
aLazyUsrname@reddit
This is why everything is terrible. It’s why companies do layoffs instead of not giving out bonuses to the c suite. Greedy fucking apes.
haloimplant@reddit
if you're setting prices correctly it actually does improve the products because they have to be more valuable for consumers to pay it
Exist50@reddit
That's reversing cause and effect.
haloimplant@reddit
nope if a product has low margin it sucks plain and simple
Exist50@reddit
Consoles and cell phones have extremely low margins. They all suck?
non_kosher_schmeckle@reddit
What? Markups on smartphones are like 50%, at least.
haloimplant@reddit
the chips inside probably have decent margins
at the consumer level things get weirder because once software gets involved the revenue starts getting shifted around. consoles being sold at break even or loss to keep consumers on a platform for example
Exist50@reddit
AMD absolutely does not make good margin on console chips. All that software revenue goes to MS/Sony.
haloimplant@reddit
true AMD had to go dumpster diving to get some revenue when they were desperate. now their GM is back up to 50% total which means if the gaming is running 35% the rest of everything is doing better than 50%
should intel go dumpster diving? maybe but for a company with a huge foundry that's supposed to be an advantage that's pretty bad and they might as well call it (break up)
Exist50@reddit
Low margin business is better than no business at all.
haloimplant@reddit
not with intel's foundry expenses it isn't. when their gross margin dropped below 40% they were no longer profitable at all; to have a profit margin over 20% they needed gross margin over 50%
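The thresholds above can be sketched with illustrative numbers (an assumption for the example, since the exact mapping depends on how overhead scales with revenue): with large fixed costs, the same gross margin maps to very different bottom lines.

```python
# Minimal sketch, hypothetical figures only: with fixed overhead (fabs,
# R&D, SG&A), net profit margin is gross profit minus that overhead,
# expressed as a share of revenue.

def profit_margin(revenue: float, gross_margin: float, fixed_opex: float) -> float:
    """Net profit margin given a gross margin fraction and fixed operating costs."""
    gross_profit = revenue * gross_margin
    return (gross_profit - fixed_opex) / revenue

# Assume overhead around 40% of revenue, loosely matching the thresholds above:
print(profit_margin(100.0, 0.40, 40.0))  # 0.0 -> 40% gross margin is break-even
print(profit_margin(100.0, 0.50, 40.0))  # 0.1 -> 50% gross margin leaves ~10% net
```

The exact numbers are invented; the point is that heavy fixed costs push the break-even gross margin far above zero.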
Exist50@reddit
The worst possible scenario for a fab is to sit empty. Intel should absolutely take any filler volume they can get. Also need to prove to customers they can actually operate as a foundry.
haloimplant@reddit
i have heard nothing good about working in intel process so I wish them luck with that
it's entirely possible this thing is cooked, that between the chip design stagnation and foundry failures they are in too deep of a hole. but planning to lose more money isn't a plan
billythygoat@reddit
That’s such a huge gross margin too
ExeusV@reddit
It's about growth
Asgard033@reddit
RIP the current consumer GPU strategy then. I highly doubt Arc cards are making that kind of margin.
CassadagaValley@reddit
Celestial is pretty much set, and work on Druid started last year. Intel was pretty happy with Battlemage, so if Celestial can compete with an XX70 or XX80 at a lower price, they have a pretty good shot at sneaking into the #2 spot.
Exist50@reddit
It was killed months ago, and that was far from finished. Why do you think it was done?
Strazdas1@reddit
It was not killed.
Exist50@reddit
It was. The same denial comes out every time Intel does something like this. I remember all the same insistence that 20A/18A was perfectly healthy and on track.
CassadagaValley@reddit
Celestial wasn't killed? Are you thinking of the B770 or something?
Celestial entered pre-validation last month.
Exist50@reddit
It was. Suppose it was technically one of the last major roadmap decisions Gelsinger made.
Not your fault, but the article you're thinking of is one of the most ill-informed things I've ever read. The "source" was someone's LinkedIn bio where they talk about pre-silicon validation work for Celestial, at some unknown point in time. Pre-silicon validation isn't a milestone or even really a phase of development. All it means is people used to be working on Celestial.
The irony is that the most likely reason this was found to begin with is that the engineer was laid off as part of Celestial's cancellation and updated their profile.
CassadagaValley@reddit
So I'm trying to find an article about Celestial being killed off but I'm not seeing anything, got anything you can send about it? Google is just giving me stuff about it being on track or it's integrated version being added onto Nova Lake
Exist50@reddit
Unfortunately, the media tends to lag hopelessly on this kind of stuff. See the history of 20A/18A for proof of that. So all I can tell you is it will become obvious in due time.
By definition, Celestial is a dGPU. Xe3/3p IP is another matter entirely.
Asgard033@reddit
But them margins, bro...
DoubleExposure@reddit
If they can figure out consumer GPUs they will make buckets of money.
Exist50@reddit
Even AMD doesn't make buckets of money in consumer graphics.
Strazdas1@reddit
Last time AMD has figured out consumer GPUs was what, a decade ago?
Exist50@reddit
Lol, AMD's GPUs are in a much better position than Battlemage. This isn't really up for debate.
DoubleExposure@reddit
Because no one can afford them, or there are none to buy, or (in the past) bad drivers, or their pricing model, or they lack features for the price point, AMD has not figured out consumer GPUs yet. If they cranked up the volume of their cards and made them available at MSRP they would make a mint.
Strazdas1@reddit
which is why sales are record high?
scytheavatar@reddit
If AMD drops the prices of their cards, Nvidia will just drop the prices of their cards too. AMD cannot beat Nvidia just by getting into a price war with them; in the end both sides will just lose. This is also why, if Intel wants to be serious with their GPUs, they need to start being like AMD and actually selling their cards with proper margins.
ElementII5@reddit
This is what happens if you price your products with 50% margin...
imaginary_num6er@reddit
Yeah but here is 1 thing AMD doesn’t need to worry about. Overhead from their fab business and not being able to order in volume from TSMC. AMD and Nvidia can lump the rest of their volume together with their TSMC orders, but Intel cannot do that.
Exist50@reddit
And pretty much all of that and more can be said for Intel.
haloimplant@reddit
as much as we wish this were the case, one way to lose a shit ton of money is dump it into an area that you're not competitive at and fail. part of these targets is avoiding that
AFlawedFraud@reddit
consumer GPUs are famously low margin products
Nuck_Chorris_Stache@reddit
RIP Intel's graphics card division
Kurgoh@reddit
New products must deliver 50% gross profit? Not just gpus, ANY product? I guess intel won't be producing anything anymore then lol.
Strazdas1@reddit
No. Gross margin. Margin and profit are different things.
cest_va_bien@reddit
Why not try to actually innovate and prove your reason for existing? If this isn’t a red flag for engineers to abandon ship I don’t know what is.
Exist50@reddit
What do you think the last year has been? Intel has a skeleton crew left at this point.
non_kosher_schmeckle@reddit
What happened to all the merger/joint venture rumors?
Everyone decided to just do nothing?
Strazdas1@reddit
They were just stock manipulation via media. None actually planned to do a merger/aquisition.
non_kosher_schmeckle@reddit
I guess so, since you moved on to cooking comments lol
Are you part of that skeleton crew? Guess so haha
logosuwu@reddit
Given that his insider knowledge of Intel was superficial at best I would say he was never part of Intel but probably worked for an Intel partner.
haloimplant@reddit
in a roundabout way that's what gross margin targets incentivize
a company i worked at targets gross margins of 60-80%, the products that aren't innovative enough at delivering value to customers have to be sold cheap, fail to meet the target and are cancelled in favour of better ones
Exist50@reddit
That assumes there's always something better to invest in.
haloimplant@reddit
if there isn't then you are cooked as a high tech company
Techhead7890@reddit
Right? When a metric becomes the target, it ceases to be a good metric.
get-innocuous@reddit
Intel has been looking enviously over at nvidia’s annual reports. Unfortunately guy, nvidia is much better at this than you.
JustHereForCatss@reddit
This means they 100% never recover imo. Intel is dead. NVIDIA and AMD can do this kind of thing, however Intel needs to innovate and R&D is expensive
Bavario1337@reddit
they were printing money for 20 years with their CPU monopoly. where did all their money go? Apparently not in R&D
Strazdas1@reddit
Stock buybacks and bad aquisitions is where the money went. Also RnD too, but one that failed. Like trying to shrink the physical size of transistor gates that they spent billions on and have nothing to show for it because the tech just didnt work.
Exist50@reddit
Gelsinger burned a lot of money with his failed manufacturing bet. They'd be in much better shape if that one decision was different.
6950@reddit
He should have spent less on things that were not needed. They clearly said somewhere that they will now build based on external customers, not in hopes of getting external customers, which should have been the goal from the start.
jeffscience@reddit
Pointless acquisitions like McAfee and Altera that led nowhere. BK was pumping the stock price while the fabs were in crisis.
Exist50@reddit
It's really funny to tally up all the AI startups they bought just to completely squander.
Vb_33@reddit
Investor dividends, poor acquisitions, golden parachutes and investments into fab tech that didn't work out.
chefchef97@reddit
A tale as old as time
Far_Piano4176@reddit
fab R&D, dividends, stock buybacks, Fab CapEx that they couldn't amortize across a functional foundry business, unsuccessful acquisitions, and failed bets. Intel spent a lot on R&D, just clearly not in the right areas
haloimplant@reddit
gross margin doesn't include R&D so these targets don't stop companies from investing in R&D, done correctly it directs that money to where it can deliver the best products
Exist50@reddit
No, but their billions of dollars in promised spending cuts do kill RnD.
FlyingBishop@reddit
Nvidia doesn't chase its tail trying to kill things with low margins, they invest and make the best products so nobody wants to pay Intel prices.
SightUnseen1337@reddit
Intel will get bailed out because the military-industrial complex needs US-based fabs for classified designs and dual sourcing for commodity parts in consumer PCs
genericusername248@reddit
And apparently ignoring the part where Nvidia spent several decades developing the market they're currently dominating.
Hifihedgehog@reddit
Business plan for failure, step 1: Fall for get-rich-quick scheme while explaining it away because you are too smart to make it fail like everyone who has tried it has.
Rome wasn’t built in a day, folks!
Strazdas1@reddit
It burned down in 3 days though.
DifferentiationBy@reddit
Yes but intel will be destroyed in 1
Hifihedgehog@reddit
That’s the typical outcome of get-rich-quick schemes, yes.
Impressive-Swan-5570@reddit
Who cares just give us good product.
megongaga2025@reddit
Intel hasn't been willing to take risks for decades now. If there were no AMD would mid-range Intel CPUs still be on quad-cores?
ResponsibleJudge3172@reddit
Alder lake and Lunarlake
megongaga2025@reddit
The i7-965 to i7-7700 (2017) were all quadcores. AMD started adding more cores to compete and forced Intel to add them later. AMD's Phenom II had six cores in 2008. AMD added more cores before Intel. AMD's six core 1600X in 2017 competed with Intel's i7-7700.
iDontSeedMyTorrents@reddit
Phenom II X6 was released in 2010.
Intel actually released a hexa-core CPU a little over a month before AMD. So technically Intel was first.
megongaga2025@reddit
Oh you’re right, Intel did narrowly beat them. I thought the Phenom II was hexacore, but it was the Phenom II X6.
Numerlor@reddit
Intel were trying 6 cores before Zen was a thing, they just kept failing, repeatedly, with 10nm
randomkidlol@reddit
no they had 6 cores for a very long time. ie the 3930k, 5930k, 7800x, etc. they just wanted you to pay exorbitant prices for the privilege of getting one. AMD made 6-16 core chips cheap and affordable.
Numerlor@reddit
Intel certainly weren't in a big rush, but AMD forcing their hand is just untrue. First things first, Zen 1 really was not good enough yet to be a big issue for Intel; it was a signal that AMD had made big changes and could have affected pricing to a degree, but in the consumer space it was just bad outside of MT.
Then Intel released a mid-rangey 6-core within the year; there's absolutely no way they would have had the time to redesign the architecture and produce a launch quantity of CPUs if it wasn't already the plan.
Cannon Lake on 10nm was supposed to be 8 cores after Skylake, but 10nm was such a huge mess that all it got was a paper-launched 2-core CPU in 2018, because they had to release it to not outright lie to investors.
Exist50@reddit
They took a risk on foundry. In the same sense that jumping off a building holding an umbrella is "taking a risk".
Kougar@reddit
Welcome to the new Intel. Same as the old Intel.
vandreulv@reddit
Wouldn't be an Intel thread without /u/Exist50 threadcrapping all over it.
SmashStrider@reddit
Surprised u/Helpdesk_Guy hasn't infiltrated this post yet.
Helpdesk_Guy@reddit
I usually don't throw punches when someone has already been beaten and is evidently going down.
hardware2win@reddit
Then he will delete all his posts cuz he leaks some shit xd
Burnsidhe@reddit
This is going to kill their market.
travelin_man_yeah@reddit
Too little, too late. They should have adopted that about 15 years ago, before spending $38 billion on Altera, Mobileye and McAfee on top of other silly ventures like drones, wearables and VR/volumetric.
The only things making decent margins are Xeon and client processors, but TSMC makes some of the client chiplets, so that's cutting into those margins. Client GFX are all made by TSMC, so pretty tight margins there. Unfortunately, the data center AI/GFX/Gaudi business is a complete train wreck and isn't the cash cow it should be.
And MJ, she's likely on her way out anyway. Just about all the other ELT under Gelsinger have departed and even though she's "product CEO", all the product division VPs now report up to Lip Bu Tan instead of MJ.
Exist50@reddit
We can only hope, for Intel's sake.
travelin_man_yeah@reddit
DC is actually making some money, think it was $3-4 billion last quarter. Pretty much all from Xeon since no one is buying Gaudi.
Jensen2075@reddit
They're deep discounting Xeon chips to compete with AMD. Not making much profit.
Exist50@reddit
You're thinking of revenue, not profit.
https://morethanmoore.substack.com/p/intel-2025-q1-financials
6950@reddit
More likely for the industry's sake. Nothing has driven the industry like they have, except for the last 8-10 years.
Bavario1337@reddit
looking at the margin history of intel overall, they really need geniuses if they want to get back to pre 2022 numbers quickly lol. sitting at 30% gross margin currently
Exist50@reddit
They need competitive products. Instead their plan seems to be to cut costs until they hit the desired margins. And the execs making that call will surely get out before it collapses.
ResponsibleJudge3172@reddit
Xeon is headed for competitive status. They shrunk a multi Gen gap to 1 Gen.
Exist50@reddit
They still have the problem that Venice and DMR will launch around the same time, and the former will have a node advantage. Also, with the Forest line and SMT killed, Intel won't have much of a response in high thread count workloads.
beeff@reddit
Source? AFAIK clearwater forest on 18A is simply delayed to '26 and SMT being removed is only for client P-cores.
Exist50@reddit
The CWF successor was killed alongside CWF-SP and SRF-AP.
Why do you think it's only for client?
6950@reddit
While the killing of CWF-SP is true, SRF-AP is being turned into a custom Xeon part for hyperscalers.
KitGuru had an interview regarding this: for servers HT makes sense, so it will be an optional part of the core design.
Exist50@reddit
Then where is it? And keep in mind that's a product that was essentially finished. The CWF successor never got serious investment to begin with. It's not happening.
Quite frankly, that interview was BS from an Intel rep that either didn't know what they were talking about, or didn't want to fully acknowledge the new "strategy".
6950@reddit
AWS/Google were the ones who are supposed to get them
I mean, the person was a Sr. Principal Engineer for P-cores, lol, so the idea that he doesn't know what he is talking about is nuts; hiding it makes sense.
Exist50@reddit
Understand that Hotard disdained CPUs, and especially E-core ones. His vision for Xeon was basically a legacy enterprise business. AI accelerators were the only thing he wanted to spend money on.
Gelsinger had some really bad hires.
6950@reddit
Yeah, he was an idiot; getting rid of him was good.
He made some good re-hires as well, but the bad ones did so much damage.
QuestionableYield@reddit
I thought Hotard left Intel to be CEO of Nokia.
I haven't heard that story before. Care to enlighten?
6950@reddit
QuestionableYield@reddit
You mean Christoph Schell?
6950@reddit
Yeah him lol
QuestionableYield@reddit
That's interesting. I thought the handling of it would go under Michelle Johnston Holthaus as the CCG owner. Her days are probably numbered too.
cyperalien@reddit
DMR will have 256 cores and i think the core will have an IPC advantage over zen 6 so it should be fine. Perf/w is the remaining question.
Exist50@reddit
So will Venice. Yes, they're the "dense" cores, but that's hardly a detriment at the power levels you'd run a 256c part at. Which happens to be the same environment the node disadvantage hurts the most.
Why do you think it'll have an IPC advantage? Each is getting a one-gen boost over Zen 5 vs LNC, and that doesn't exactly put Intel in the lead. Sounds like roughly parity.
Like, don't get me wrong, if it weren't for the node problem and SMT (for applicable markets), DMR would probably be the most competitive Intel part since pre-Rome. But those are problems they will have to deal with in practice.
Crisender111@reddit
I think this CEO is a trojan horse from AMD. LMAO.
Astigi@reddit
There goes their Xeons
6950@reddit
Xeons are not getting canned lol
Exist50@reddit
I'm sure someone can make a pretty slide showing Xeon at 50% margin, even if the requirements are impossible.
Plank_With_A_Nail_In@reddit
Condensing complex problems into simple rules always works. Lol, it's RCA all over again. Intel is a dead man walking.
humanmanhumanguyman@reddit
Well, rip ARC. The hope was nice while it lasted.
bluefalcontrainer@reddit
it was nice knowing ya intel
HorrorCranberry1165@reddit
I believe this is only financial speak for a conference where they participate.
A single 50% metric is very abstract; you can fit any product to it, depending on conditions.
In short, they are saying 'we are taking care of our profits', and that's all.
I think they will never again reach the high profits of the past, like 60%; everything is different now and much harder. The golden age is behind them.
Geddagod@reddit (OP)
Hopefully
ibeerianhamhock@reddit
Doesn't seem that crazy. I mean, ngreedia had an operating cost of $5B with over $40B in revenue last quarter. Not saying that's necessarily their profit margin, because I don't understand the specifics enough without a business background, but it definitely points to a very high profit margin.
Exist50@reddit
Intel does not have Nvidia-like products. That's the problem.
Homerlncognito@reddit
Corporate Ghoul 100% Speedrun.
lord_lableigh@reddit
What a bunch of BS. No "engineering-focused company" deals out something like this to their engineers.
I'm guessing this includes Celestial and Druid as well, and they expect NVL to pass this criterion. I'm not well versed enough in finance to know the difference between gross profit and gross margin. Are they the same?
For all the praise Lip-Bu Tan got, I think this will be unanimously weighed against him by people in tech circles.
Geddagod@reddit (OP)
I think they use this to justify dropping celestial and druid
SmashStrider@reddit
Gross Profit is the actual raw amount of money that the company earns (Revenue - COGS), while Gross Margin is gross profit as a percentage of revenue ((Gross Profit/Revenue) x 100%).
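The two definitions above can be written out directly (illustrative figures only, not any company's real numbers):

```python
# Sketch of the two definitions above, with made-up example figures.

def gross_profit(revenue: float, cogs: float) -> float:
    """Raw dollars earned above the cost of goods sold."""
    return revenue - cogs

def gross_margin_pct(revenue: float, cogs: float) -> float:
    """Gross profit expressed as a percentage of revenue."""
    return gross_profit(revenue, cogs) / revenue * 100

print(gross_profit(200.0, 100.0))      # 100.0 -> dollars of gross profit
print(gross_margin_pct(200.0, 100.0))  # 50.0  -> the 50% threshold in the headline
```

So a "50% gross profit" target, as the headline phrases it, is really a 50% gross margin target: COGS must be no more than half of revenue.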
Smooth_Value@reddit
Cool, by that logic, I will never work again. Fucking corporations.
SmashStrider@reddit
Here's what's gonna happen -
1) Intel implements this rule
2) Actually promising products get a red light
3) Their core products are hiked up in price to meet margin requirements
4) Shareholders are happy at first, but then no one buys their products, so their revenues plummet
5) The rest is history
Honestly, quite a boneheaded move coming from a company that, under the new CEO, is supposed to be more 'engineering focused'; this move really just seems to be shareholder appeasement. It's especially baffling considering that, with Intel moving a lot of their chip manufacturing back to their American fabs, they should be able to produce their chips at better margins, giving them a possibly big price advantage over the competition using TSMC (assuming 18A actually turns out well). But this just seems to take advantage of that and instead go in the opposite direction, prioritizing margins over value for consumers.
One silver lining if true -
Although again, this could just mean that PTL and NVL are to be quite expensive.
neutralityparty@reddit
They should invest in graphic cards. They got a huge option on that front but alas Intel
ConsistencyWelder@reddit
The weird thing most people don't realize is, Intel has been making video cards longer than AMD, they've just always half assed them.
The first one was so bad, they tried to force motherboard makers to bundle them with their motherboards, but no one wanted to do it.
HisDivineOrder@reddit
There go their discrete cards.
DifferentiationBy@reddit
Only thing that somewhat has a chance to be high margin this coming decade
Exist50@reddit
Not client dGPUs.
constantlymat@reddit
Yeah, even AMD has had to hide the numbers of its dGPUs behind the RDNA2.5 console chips and Intel sells even worse than they do.
ExeusV@reddit
It's about growth
TritiumNZlol@reddit
Grandma won't like this one.
Rye42@reddit
Yeah, expect a price increase on Celestial then, if it comes out as a consumer GPU. It will be priced comparably with Nvidia and AMD.
ConsistencyWelder@reddit
Could just be me, but I've always felt that most of their attempts at marketing new products have been half-hearted. Like they tried, but they didn't REALLY try, so they ended up with a bunch of pretty good products that just weren't taken over the final hurdle to become a market success.
I always figured there must be people working at Intel who are a bit defiant. Insisting that Intel is a CPU company, and "we don't need to do anything else, we should do that one thing we do well, extremely well".
I could be totally wrong though.
AstroNaut765@reddit
Oh no. This is super bad.
Intel does a lot of stuff that doesn't make money in the short term but creates a market for other products.
obthaway@reddit
wake up grand-intel, 4790k is 11 years old now
gburdell@reddit
I never got the emphasis on margin. Like isn’t more profit always better? It’s not like highly technical employees are fungible and can just be “moved over” to support higher margin work
512bitinstruction@reddit
Intel just sinks deeper and deeper. This will just make intel more conservative and less innovative.
hansrotec@reddit
Intel confirms plans to drive customers to other vendors, leading to bankruptcy
69_CumSplatter_69@reddit
Leading to taking that money anyways from taxpayers.
venfare64@reddit
ftfy
imaginary_num6er@reddit
They just have to launch at high margins and cut the margins the next quarter
Hairy-Dare6686@reddit
That's the strategy with which AMD continuously lost all of their market share in the GPU space to Nvidia over the past decade.
wh33t@reddit
RIP dGPU and compute cards for the home user :(
hardrock527@reddit
Maybe this means they have to put a real msrp on celestial and not the fake ones that are in the market today
Exist50@reddit
If you take this as a strict requirement, Celestial won't happen at all. No one will buy it at >AMD margins.
Professional-Tear996@reddit
Intel's gross profit margin has been low because of their high COGS and lost dominance in traditionally high margin business segments like data center.
The actual transcript talks about them looking to decrease COGS through making the validation phase post tape-in more efficient, among other things.
And the more important thing is that we got a hint of a semi-confirmation that Nova Lake's TSMC parts are not going to use the latest node TSMC offers. Why? The loss of market share in desktop was mentioned. Being able to move a lot of volume in a short time using a node that has ramped and has good yields, as well as the desktop market being elastic, was mentioned. And finally, Nova Lake for both laptop and desktop was mentioned.
Taken those together, I interpret it to mean that Nova Lake will likely not be using N2.
Exist50@reddit
I'm not sure how you've reached that conclusion. They're not using TSMC for volume; Intel Foundry is sufficient for that. They're using TSMC because N2 provides a generational PnP advantage that they need to sell to the high end market.
Rocketman7@reddit
The new CEO is a clown
steve09089@reddit
Well, that’s just asking for disaster.
DehydratedButTired@reddit
And there goes their GPUs.
Bavario1337@reddit
Also, yearly CPU releases should be dead under this requirement, since any CPU release that is not dumpstering AMD's CPUs will not hit a 50% margin.
Exist50@reddit
They've been working towards this for a while, and that included killing off the -N line.
zuperdo@reddit
If this is true, this is absolutely insane and will be the end of Intel within a decade.
Intel needs to face reality and start investing big into research and development, regardless of what shareholders want, so they can set themselves up for long-term success and strategic market victories. Doing things like cutting off all products that don't deliver a 50% gross profit will only ensure that Intel ends up in an unstoppable death spiral.
This is like Compaq all over again.
EdzyFPS@reddit
This is going to age like a finely aged pint of raw milk.
mockingbird-@reddit
I guess that Arc is toast.