Exclusive: How Intel lost the Sony PlayStation business
Posted by Helpdesk_Guy@reddit | hardware | View on Reddit | 239 comments
Traditional_Yak7654@reddit
How is this not a rumor? It’s all unnamed sources.
UrbanFight001@reddit
Because it is coming from Reuters…
Strazdas1@reddit
So?
Worldly_Apple1920@reddit
well, we will see very soon if PS6 comes out on AMD chips and not Intel.
wizfactor@reddit
If the dispute was over margin of all things, then this is a massive L for Intel. Intel needs every contract it can get to justify the existence of IFS. It’s such an existential issue that bickering over margin (in a market that is famous for loss leading) is a fatal error.
The console market was never going to print money for Intel. But it could have opened a new market for Intel and saved IFS to some degree. Those two upsides are worth more than the drop in gross margins that shareholders will look at.
Legal-Insurance-8291@reddit
Losing money on every chip you make is hardly a winning proposition. Maybe it would be better than the current situation, but definitely not good.
Quentin-Code@reddit
The margin is what you have remaining after expenses. It is not a loss. Reducing margin does not mean losing money, it means earning less money.
Intel is famous for trying to keep margins high, which is generally not a good practice in a competitive market, unless you think the market is not competitive, which suggests that Intel still does not seem concerned.
Legal-Insurance-8291@reddit
Margin can be negative too, and almost certainly would be in this instance, given how high Intel's manufacturing costs are. Obviously every company wants to make as much money as possible; that's literally the whole point of having a business. But you can't just compare Intel of the past, when they had a dominant position, to modern Intel, which is struggling to avoid bankruptcy.
Quentin-Code@reddit
A negative margin is called a loss in this context. It would be like speaking of a negative salary or a positive deficit: it "exists", but commonly speaking a margin is positive, just as a deficit is negative.
Strazdas1@reddit
Negative margin is a far more common term than the other examples you used. Yes, it is a synonym for loss.
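The distinction being argued above can be pinned down with two lines of arithmetic. A minimal sketch, with purely illustrative per-chip numbers (none of these figures come from the article):

```python
# Illustrative numbers only: "low margin" vs. "negative margin" (i.e. a loss).
def gross_margin(revenue_per_chip: float, cost_per_chip: float) -> float:
    """Gross margin as a fraction of revenue: (revenue - cost) / revenue."""
    return (revenue_per_chip - cost_per_chip) / revenue_per_chip

# Low margin: still profitable, just "earning less money".
low = gross_margin(revenue_per_chip=100.0, cost_per_chip=90.0)       # 0.10 -> 10% margin

# Negative margin: every chip sold loses money, i.e. a loss.
negative = gross_margin(revenue_per_chip=100.0, cost_per_chip=120.0)  # -0.20 -> 20% loss

print(f"low margin: {low:.0%}, negative margin: {negative:.0%}")
```

Both commenters are right in their own terms: a negative margin is a loss, but a "low margin" deal, the scenario the reporting describes, is still profitable.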
Legal-Insurance-8291@reddit
I feel like you're just trying to push a narrative here. We know that these console parts are usually sold with minimal profit, and we know Intel's fabs are FAR more costly than TSMC's. It makes perfect sense that AMD/TSMC would be able to offer a part Intel simply can't compete with on cost.
Boreras@reddit
It would've been reported as a loss if it was a loss. That is an enormous detail in the story, especially given the audience. It's possible the sources are wrong, but based on the reporting itself, the claim is that they were talking about profit margin, not a loss.
Legal-Insurance-8291@reddit
It's a freaking rumor dude. I wouldn't try and read a lot into minor wording choices like this.
spazturtle@reddit
Sony pays per working die. If the wafer has a high defect rate, then even if each chip carries a small margin, Intel could still take a loss per wafer.
Quatro_Leches@reddit
Not really, that's not how it works. Sony does not care about defect rates; Intel will sell the chip at a set price, and they will make a profit, mind you probably a lot smaller than if they sold it to a data center (although is that 100% of your capacity? If so, say no, though it doesn't seem like it right now). Sony is the one that will eat the loss if there is any.
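The per-working-die scenario spazturtle describes can be sketched with a standard Poisson yield model, Y = exp(-D·A) for defect density D and die area A. All numbers below are hypothetical, chosen only to show how the same per-die price flips from per-wafer profit to per-wafer loss as yield drops:

```python
import math

# Hypothetical numbers: how per-working-die pricing shifts yield risk onto the foundry.
def wafer_outcome(price_per_good_die: float, wafer_cost: float,
                  dies_per_wafer: int, defect_density: float,
                  die_area_cm2: float) -> float:
    """Foundry profit per wafer when the customer pays only for working dies.

    Uses the Poisson yield model: yield = exp(-defect_density * die_area).
    """
    yield_rate = math.exp(-defect_density * die_area_cm2)
    good_dies = dies_per_wafer * yield_rate
    return good_dies * price_per_good_die - wafer_cost

# Mature node, good yield: a small per-die margin still nets a per-wafer profit.
print(wafer_outcome(price_per_good_die=100, wafer_cost=15000,
                    dies_per_wafer=200, defect_density=0.1, die_area_cm2=2.5))

# Same pricing, poor yield: the foundry eats a loss on every wafer.
print(wafer_outcome(price_per_good_die=100, wafer_cost=15000,
                    dies_per_wafer=200, defect_density=0.8, die_area_cm2=2.5))
```

Under pay-per-good-die terms the customer's cost is fixed, so whether the wafer nets a profit or a loss depends entirely on the foundry's defect density, which is exactly the risk-allocation point being argued here.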
Real-Human-1985@reddit
Intel is paying TSMC for every worthwhile chip they make from now on, and can't afford to raise prices either, as AMD is kicking their nuts across the x86 space.
Legal-Insurance-8291@reddit
Someone needs to tell Pat that because he's still trying to make IFS a thing even though we all know it's DOA.
DaBIGmeow888@reddit
It is so dead on arrival. Intel has to formally separate IFS, GloFo style.
Legal-Insurance-8291@reddit
It's only a matter of time.
DaBIGmeow888@reddit
Low margin is low profit. Negative margin is negative profit. What was debated was whether it was low or high margin. Beggars can't be choosers if you are unprofitable on IFS.
Helpdesk_Guy@reddit (OP)
I think on these very long-term console deals, I see even titans like TSMC coming under a lot more pressure to get their yields right ASAP, once customers like Apple can begin to pressure TSMC into being paid only per working die and expressly not per wafer.
I think there will be a major shift in power at pure-play foundries, in that major customers will have the big semis over a barrel in fierce, cut-throat price negotiations, since the customer has the ability to guarantee big volume: a guaranteed loading schedule and load factor on the foundry's fabs, which the foundry needs to make a profit.
Meanwhile the foundry will be forced to swallow penny-pinching break-even deals from major customers, since the foundries are effectively forced to accept those: their backs are against the wall, and they'd otherwise be eaten alive by the maintenance costs of their own fabs …
Makes one think who's the actual winner here.
lupin-san@reddit
Chip contracts where the customer pays per working die only really applies to bleeding edge nodes because yields aren't mature enough. The customer is essentially a guinea pig for the foundry.
Consoles do not use the latest bleeding-edge nodes. The chips are on mature nodes where yield is a known quantity and isn't a big concern. Wafer prices aren't as high as on the bleeding edge either. So the goal is to make the chips as small as possible to get the most out of a wafer.
The winner here is whoever wins the chip design contract. They get a continuous revenue stream for the lifetime of the console and new tech they can use on their own products.
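The "make the chips as small as possible" point can be quantified with the standard dies-per-wafer approximation (wafer area over die area, minus an edge-loss term). The die sizes below are hypothetical, picked just to show the effect:

```python
import math

# Standard gross dies-per-wafer approximation with edge-loss correction:
# DPW ~ pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate gross dies per wafer for a square-ish die of the given area."""
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Shrinking a hypothetical die from 300 mm^2 to 200 mm^2 on a 300 mm wafer:
print(dies_per_wafer(300, 300))  # fewer candidate dies per wafer
print(dies_per_wafer(300, 200))  # considerably more
```

Since wafer cost on a mature node is roughly fixed, every square millimetre shaved off the die translates directly into more sellable chips per wafer, which is why the design win, not the wafer, is where the money is.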
shakhaki@reddit
I don't know why you say in the same breath they should take any contract they can get because they're bleeding money, but then say it's a loss leader market. Margin is the most important thing to the longevity of a business. It's better to make no money than it is to lose money.
wizfactor@reddit
I never said Intel had to sell chips at a loss.
Loss leading was in reference to Sony, not Intel. Consoles are sold at a loss, so if Intel was expecting laptop-like margins when selling their chips to Sony, it’s a deal-breaker.
shakhaki@reddit
There are components of your response I agree with, but consoles are not loss leaders like they were in the 2000s. Microsoft has a strict 20% GM threshold if any business line is to survive cuts from the CFO. If you're above that line, you get to continue operating. Sony can't afford to lose money on PlayStation either, as they're not really a software company in the way Microsoft is.
tset_oitar@reddit
Or maybe they just didn't want to lose money on consoles? Sure, this gives them some fab utilization, but that's it. The Arc graphics card series was delayed and massively scaled back for the same reason. Their graphics and CPU IP is inferior to AMD's in PPA. Sony stood to lose backwards compatibility, and then there's the power, the design implications and the overall risk. They probably wanted these chips free of charge. This was also happening alongside the whole Alchemist fiasco; Intel would likely have had to pay Sony to use their chips lol
AutonomousOrganism@reddit
Nonsense. Console chips were always high volume, low margin. I guess Intel thinks they'll get a higher margin producing something else. AMD, on the other hand, seems quite happy with a low-margin but stable income source; it kept them afloat during the Bulldozer era, after all.
randomkidlol@reddit
Considering the amount of repeat business AMD semicustom/ATi has been consistently getting over the last 25-ish years, I'd say there's more to it than just giving customers a good deal. There's probably a lot of experienced staff on that team who have built up long working relationships with staff on customer teams, so going back to AMD is kind of a no-brainer.
Evening_Feedback_472@reddit
Because AMD carries no risk: they design, and TSMC shoulders the risk of making it. In this case Intel wasn't confident yet; it was 2022 and they had barely started their IFS pivot.
BatteryPoweredFriend@reddit
Wafer contracts are signed and payment/initial deposits are given years in advance of production starting.
In the case of leading nodes, the first round of contracts are often formalised & paid for when they're still operating in low-volume or even risk production mode.
Zenith251@reddit
Lol wut? Who pays TSMC? Fab time is paid for up front and/or in advance. TSMC makes the chips they were paid to make, it's up to AMD to recoup their investment by selling the chips.
TSMC takes on near-ZERO risk as production is already paid for. AMD ends up with products they have to sell or go bust.
BatteryPoweredFriend@reddit
It's still quite ridiculous if that was their decision, since even Nvidia just took the L with the Tegra and gave it to Nintendo for pennies. And the X1 is now among the best-selling products Nvidia has made in its entire history.
Real-Human-1985@reddit
Intel wanting margin on Arc is wild, lol.
COMPUTER1313@reddit
Imagine if Intel had priced Arc "accordingly" for desktop GPUs when their drivers were still raw. Holy hell it would have been a bloodbath and made AMD's GPU software features look like a Michelin plate.
brand_momentum@reddit
Arc GPUs were priced "accordingly" at launch
KingStannis2020@reddit
To be fair to Intel, their GPU driver team was based in Russia, and the launch happened a few months after the invasion of Ukraine. Intel pulled out due to sanctions, so basically the entire team had to relocate to a different country or be let go.
Exist50@reddit
I think that was more of their MKL team. Their driver team seems to largely be China, iirc.
ABotelho23@reddit
Such a basic marketing thing too. Especially as a leader in other segments. Loss leader is just basic shit.
Dangerman1337@reddit
The thing is, they could've had good margins if Alchemist had worked and released way earlier. Imagine 3070 or 3070 Ti performance with 16GB of VRAM, sold at $500, where that was enough to make a decent profit per card sold.
cheapseats91@reddit
Honestly, if they had released like 6 months earlier, they would have been swept up by crypto miners (or gamers who had to deal with crypto miners and had no other options) even without working drivers. Alchemist cards came out like a month after the bubble popped, and no one had a need for a GPU with a half-baked software stack.
Dangerman1337@reddit
In early 2022 the crypto mining boom was cooling down a lot. Emphasis on if Alchemist worked.
soggybiscuit93@reddit
Alchemist was never going to be profitable. Selling at a higher cost would've helped it lose less money, though.
Alchemist had to shoulder all of the amortized NRE costs of spinning up a new division. One generation of product isn't enough to pay off that investment.
COMPUTER1313@reddit
Especially in the midst of the pandemic supply shock and crypto boom. Intel could have charged almost whatever prices they wanted as long as it was cheaper than AMD.
Legal-Insurance-8291@reddit
It says a lot about this sub that so many more people are willing to believe Intel executives are just complete idiots than are willing to believe Intel simply didn't have a competitive product at this price point.
nanonan@reddit
Well, we also have half a dozen other examples of Intel executives being complete idiots.
Legal-Insurance-8291@reddit
Fair.
gnocchicotti@reddit
We know that things are looking less rosy for Intel now than they did in 2022. That's the benefit of hindsight. Maybe Intel leadership wasn't stupid, but it appears they turned out to be wrong.
Legal-Insurance-8291@reddit
Maybe, but even if they had won the contract there's a decent chance Sony would have pulled out by now given all of Intel's struggles to actually deliver a working node and that would have only looked worse.
itsjust_khris@reddit
Lol right, why is the default assumption that we know better than whoever actually made the decisions, with information they had that we don't?
nanonan@reddit
Chip makers aren't the ones losing money on console sales.
Quatro_Leches@reddit
Part suppliers never lose money on consoles, but the console maker does. At the end of the day, Intel will get a profit from selling the part; however, it's a low-margin business.
Helpdesk_Guy@reddit (OP)
You mean like how Nvidia claimed back then that they let AMD win the PS4 deal, since Nvidia didn't want the lower margins?
jaaval@reddit
I don’t think this has anything to do with IFS. It was a chip deal, not a chip manufacturing deal.
-protonsandneutrons-@reddit
The first line of this article.
This subreddit, especially, has a problem with commenters not even opening the article.
ResponsibleJudge3172@reddit
AMD doesn't fabricate the chips themselves. They handle the process of getting them fabricated, though.
Same thing here. The Intel chip would be designed, and a fab chosen and the design ported to said fab, by Intel. Whether it be TSMC, IFS or Samsung.
jaaval@reddit
You have a problem with understanding the article.
Helpdesk_Guy@reddit (OP)
What?! Who do you think was supposed to manufacture these dies at Intel? The design branch?!
Of course it's a manufacturing deal about a chip, then hopefully designed by Intel and fabbed at their IFS.
That has everything to do with their manufacturing side of things, called Intel Foundry Services! SMH
jaaval@reddit
Intel is going to manufacture chips regardless. IFS doesn't need extra Intel chips; they need external orders.
Exist50@reddit
They need that too. From a foundry perspective, the two hold similar value.
jaaval@reddit
Not really. Intel can fill the foundry production lines (at least for higher-end stuff) if they so choose. But that's not what they need at the moment. Nobody is interested in how many chips they manufacture for themselves. The foundry business is dependent on getting external designs.
Though considering the console is mostly GPU, I would guess it would have been TSMC anyway.
Exist50@reddit
No, they can't. That's literally part of the problem. Their datacenter chips are basically selling at cost now, and they still have underutilized fabs.
It would surely have been for an APU fabbed at Intel. Or maybe chiplets, but no reason to split suppliers like that.
jaaval@reddit
Because they have chosen to use TSMC for client. That's a choice. I'm not sure why you think this would have been made at Intel. Did they even have GPU designs portable enough to be fabbed internally? Something like a Meteor Lake configuration is possible, of course.
Though it's a bit hard to believe Intel 3 is underutilized. Despite having to compete on price, Intel still sells a lot of server CPUs.
Exist50@reddit
RPL remains the majority of their client volume.
jaaval@reddit
Yes. There is also a lot more capacity for Intel 7 than there is for newer nodes.
I guess my point is this: if someone made a big order of Intel Xeons, would you say that is a victory for Intel Foundry Services? I wouldn't. Sure, the foundry would make some money, but it wasn't the foundry services that sold a product there.
Exist50@reddit
There are different perspectives here. On one hand, the only reason Intel's selling to 3rd parties in the first place is to drive more absolute volume and better utilization/amortization of legacy nodes. Greater Intel internal demand works towards that, same as external. On the other, however, Intel choosing their own fab is a much weaker signal of the objective health/strength of the node than a 3rd party doing so, and thus has less influence on future adoption by 3rd parties.
Helpdesk_Guy@reddit (OP)
Ah, okay. Now I get it. My bad! So you pre-emptively made the pretty fair assumption that a given Intel Arc GPU was about to be manufactured by TSMC anyway, if Intel made that console deal happen? I wasn't really getting what you were talking about.
You think?! You think that Intel would've gone with TSMC? I guess Intel could've manufactured a given graphics chip on their 10nm as a stopgap, and with that reduced their fabs' vacancy by quite a bit … Or do you think that would've been too uncompetitive a design?
jaaval@reddit
IIRC Intel 7 is not really portable with respect to chip designs. Moving designs between TSMC and Intel 3 is more believable.
Helpdesk_Guy@reddit (OP)
I honestly don't even get what he's talking about …
I mean, is he under the impression that Intel was just among the last two contestants to solely *design* the SoC, or what?
nanonan@reddit
Right, like from Sony.
jaaval@reddit
This would be intel ordering intel designed chips and then selling them to Sony. You know, the same way they now sell CPUs. Me buying a cpu is not an external order for intel foundry.
Helpdesk_Guy@reddit (OP)
Huh?! What else are Sony or Microsoft with their console deals but external contractors?
Are Tokyo and Redmond all of a sudden now considered Intel subsidiaries and NOT external foundry customers?
What are you even talking about? Do you even read the news here?!
Of course Intel was supposed to design the console's SoC and manufacture the said chips afterwards on the IFS side of things.
Dude … Whatever you take, take less of it. Or pass something up to the thirsty ones!
jaaval@reddit
Intel designed chip is still intel designed chip. It’s not an external order any more than someone ordering custom Xeon is an external order.
Helpdesk_Guy@reddit (OP)
My bad, you went with the assumption that an Arc graphics chip would've gone to TSMC regardless …
I don't know. An Intel SoC with a powerful iGPU, like an AMD APU, on their 10nm?
NeonBellyGlowngVomit@reddit
Read.
The.
Fucking.
Article.
jaaval@reddit
I don't think you understand the content of the article.
Wyzrobe@reddit
Quoting Spooky23 on Hacker News (June 4, 2015), on "Autodesk’s John Walker Explained HP and IBM in 199...":
Morris Chang, founder and ex-CEO of TSMC:
“You Americans measure profitability by a ratio. There’s a problem with that. No banks accept deposits denominated in ratios. The way we measure profitability is in ‘tons of money’. You use the return on assets ratio if cash is scarce. But if there is actually a lot of cash, then that is causing you to economize on something that is abundant.”
gburdell@reddit
The real issue is buried in the article. Intel also would have needed to have backwards compatibility with PS5. That’s a lot of extra overhead AMD didn’t have.
Helpdesk_Guy@reddit (OP)
They're both x86 anyway, no? Or what were you aiming at?
Strazdas1@reddit
Intel's drivers aren't great with older games.
xCAI501@reddit
GPU
Berengal@reddit
Weren't there rumors last year about xbox being in long talks with Intel before settling on AMD too?
imaginary_num6er@reddit
Yeah but that was before Xbox decided to exit the console business
chx_@reddit
Did they? I heard something in 2023 they might and this year that they won't. https://www.gamespot.com/articles/microsoft-isnt-exiting-the-console-business-xbox-boss-phil-spencer-reportedly-told-staff/1100-6520982/ but this is old, did I miss something?
Strazdas1@reddit
There is nothing beyond rumours.
Real-Human-1985@reddit
Imagine losing out on a Strix Halo type SoC to use an intel solution with a GPU that performs two tiers lower than its transistor and power budget.
brand_momentum@reddit
Nintendo is proof that aiming for top performance doesn't matter.
You see the same thing with the recent rise in PC gaming handhelds, Steam Deck outsells all of its competitors, even though performance wise many of them are more powerful.
Strazdas1@reddit
Nintendo is proof that nostalgia is one hell of a drug and that committing perjury on the regular is unpunishable.
Real-Human-1985@reddit
Intel’s lack of performance doesn’t come by design nor with any efficiency bonus, lmao. Dude, why don’t you go check how much power each Arc GPU uses?
brand_momentum@reddit
Nvidia proves nobody cares about power draw
conquer69@reddit
The 4090 is very power efficient.
brand_momentum@reddit
AMD has been beating the efficiency drum for the past few generations and Nvidia outsells them at every GPU tier
ToeSad6862@reddit
Intel has a history of bribing oems to where sales don't really matter anymore.
Baalii@reddit
Guess AMD had the better bribes this time around, or why else would Sony and Microsoft people side with them?
bob-@reddit
Because they are correct and righteous, AMD is special you see, they are the chosen ones fighting for the people!!
Different_Return_543@reddit
It's known that AMD led a rebellion https://www.youtube.com/watch?v=9R8F-aN6W4g https://www.youtube.com/watch?v=h8NUtM4sexc
Jeep-Eep@reddit
MS's most consistent millstone is really boneheaded hardware choices in console...
TheHumanConscience@reddit
For sure Intel would bring one big advantage. BROD instead of RROD or YROD.
Helpdesk_Guy@reddit (OP)
I mean, as others already said: if the price is right. It's not as if Tokyo and Redmond didn't get themselves a bloody nose with a couple of console generations and have to subsidize their consoles by $100+ per sale. Sony was on the brink of collapse back then due to their loss-making sales, and Microsoft already lost huge sums on the Xbox 360 and its Ring of Death debacle.
Also, it's known that Microsoft still subsidizes their current Xbox, as they lose around $100-200 with each sale of a $500 Xbox Series X, which Phil Spencer pointed out himself. Sony also likely loses money now, and already lost on every PS3/PS4 sold.
So I guess a console design based on Arc graphics and a Core CPU would've been quite possible, if the price was right.
Berengal@reddit
If the price is right... and according to those rumors Intel seemed very keen on getting that partnership... But then again people can make up rumors for whatever reason.
ThankGodImBipolar@reddit
I heard these rumors as well, but I believe it was always suspected that Microsoft was giving Intel the runaround. You may recall that during the original Xbox launch, there were a bunch of AMD staff in the audience who were very surprised when it was unveiled to be powered by Intel + Nvidia.
sharpshooter42@reddit
OG Xbox IIRC had Intel based Devkits too.
Helpdesk_Guy@reddit (OP)
Yes, dev-kits for software-developers. He was talking about prototypes from the original hardware-developers though.
sharpshooter42@reddit
Facepalm, I meant to write AMD devkits
Ok-Wasabi2873@reddit
Before they switched to Apple Silicon, I heard rumors that Apple would always schedule the Intel people for meetings right after the AMD people, and they would run into each other.
tupseh@reddit
Was this before or after they booted nvidia?
unityofsaints@reddit
Yes
Ok-Wasabi2873@reddit
After, nvidia and the whole BumpGate was around 2009(?). I heard this around Broadwell generation (2015).
brand_momentum@reddit
Another negative Intel article from Reuters... they have an agenda at this point.
i7-4790Que@reddit
No. You're just way too emotionally invested into Intel.
jayjr1105@reddit
Maybe if Intel wasn't a negative news producing machine.
_Mavericks@reddit
If I remember correctly, AMD once said that the partner designing the custom chip earns part of the intellectual property in that custom design and they can iterate on their own with the design. And also, they can build the chip wherever they want.
That's how Microsoft integrated on the same die a PowerPC and a Radeon GPU with the Xbox 360 slim.
I can't see a scenario where Intel (a control freak supplier) does the same.
randomkidlol@reddit
Yep, AMD semicustom is willing to share, license, and allow customers to add their own IP to existing AMD designs. Obviously this necessitates sharing the HDL code of AMD's GPUs, CPUs, and IO controllers, as well as driver source code.
Nvidia and Intel, being massive control freaks, would not even consider the idea of letting HDL or driver code leave their labs, let alone allowing someone outside the company to make modifications to it.
Present_Bill5971@reddit
Hard to imagine any vendor winning besides AMD for PlayStation and Nvidia for Nintendo. Backwards compatibility has become incredibly important with forever games and game catalog subscriptions. Sure, they're the vendors and can handle compiled shaders better than open-source emulator devs can, but it's still likely a major development expense.
mutantmagnet@reddit
I am extremely doubtful nvidia will still be a partner after the switch 2.
randomkidlol@reddit
Yeah, there's a very common pattern of Nvidia burning bridges with every business partner they ever work with. It's not a matter of if but when they'll burn the Nintendo partnership.
onlyslightlybiased@reddit
I mean, I'm sure they would have gone around to get quotes but realistically, why would Sony partner with Intel when they've been so successful with amd?
Traditional_Yak7654@reddit
It would have to be an incredible deal being offered for them to even consider it. Which makes the margins being too low sound about right. Even if intel pitched a chip built at TSMC it wouldn’t magically make them have a competitive gpu architecture. I just don’t see how, with a worse manufacturing process and gpu architecture, they could have put together anything that makes sense for either company.
reddit_equals_censor@reddit
We wouldn't know what process node the PS6 would use, TSMC vs Intel, and how it would turn out.
Crucially, the one reason that makes Intel suck as a foundry business right now is not necessarily the performance of the node, but everything around it.
Until now, Intel built process nodes FOR INTEL, and that's it. TSMC builds process nodes for their partners in business.
Point being, Intel creating an Intel APU on an Intel process node would be vastly less bad, or maybe equal at best, compared to how bad it would be if, let's say, AMD were to build an APU for Sony using an Intel process node.
Of course it still doesn't make any sense to go with Intel, unless Intel gave them the chips for free either way.
Exist50@reddit
Their GPU should be in much better state by Xe3/4. Trouble is getting someone to buy into such an unproven roadmap.
Exist50@reddit
Cost.
DaBIGmeow888@reddit
Intel costs would have been much higher, thus lower margins.
Exist50@reddit
Presumably they're comparing N3 or N2 to 18A. TSMC's margins are large enough that Intel should, in theory, have room to undercut even with the cost deficit.
fatso486@reddit
I wonder if any company besides AMD has a real chance of securing future console contracts, especially given the low-margin APUs. The RX 6650 XT has 11 billion transistors, while the Nvidia 4060 has 19 billion, yet they deliver similar performance. This highlights AMD's significant cost advantage, allowing them to lower prices and making it nearly impossible for competitors to compete. Intel, in comparison, is even further behind; their '3070 silicon' barely outperformed the 6600 XT the last time I checked, making AMD the clear choice.
Quatro_Leches@reddit
Well, Intel has APUs now. Lunar Lake actually beats AMD's newest chips in efficiency.
Xillendo@reddit
We don't know that until we have independent reviews. Also, AMD doesn't have any chip in the Lunar Lake range at the moment. The HX 370 is much bigger and very likely a lot faster in MT workloads. It's probably more power efficient as well in MT.
Very likely, Lunar Lake's sole win will be power efficiency at low TDP and in single-threaded work. It's probably going to lose in every other metric.
Lysanderoth42@reddit
I doubt nvidia cares about AMD having most of the shitty low margin console market now that it’s a $3 trillion market cap titan
AMD’s cost advantage matters little when nvidia is so dominant in upscaling, RT and overall featureset for the high end PC market
From-UoM@reddit
The 6600 XT compromised in areas like RT and on-die AI.
Now that those have been added on the PS5 Pro, you can see the price balloon.
Civil_Medium_3032@reddit
Intel never had business with playstation to begin with.
Helpdesk_Guy@reddit (OP)
The article also states the very reason …
INITMalcanis@reddit
Sounds like basically AMD wanted to lock in the volume to keep their overall COGS down, while Intel chased margin.
Azzcrakbandit@reddit
"while Intel chased margin"
Sounds similar to why nvidia doesn't make many console chips.
Quatro_Leches@reddit
Well, at least Nvidia is operating at capacity; in that case, it makes sense not to.
Intel is not. Not a whole lot is going on at IFS besides burned baked goods.
Helpdesk_Guy@reddit (OP)
Didn't Nvidia back then just pull the plug on Microsoft's original Xbox overnight, and with that effectively kill it?
HonestPaper9640@reddit
IIRC the truth is a little more complicated. Microsoft negotiated a pretty decent deal on the original run of Xbox GPUs from Nvidia, but when they went to get a second run, Nvidia figured they had them over a barrel and wouldn't really budge on price.
sharpshooter42@reddit
There was also the Xbox motherboard scrapping over security issues, which Nvidia had to eat a loss on when it was discovered the next runs had more vulnerabilities.
emrexis@reddit
Funny thing about Nvidia and their contracts:
Microsoft started with Nvidia (OG Xbox), then later went with AMD. Sony then used Nvidia for the PS3, and later went with AMD. Apple started using Nvidia for their high-end MacBook GPUs, then later went with AMD (then to ARM/Apple Silicon, of course).
Only Nintendo is still staying loyal to Nvidia.
Azzcrakbandit@reddit
I wouldn't really call it loyalty, since Nintendo doesn't consistently use Nvidia. They used it for the Switch because it was cheap and extremely efficient for what it was. If they use Nvidia again, I figure it's likely due to backwards compatibility.
Helpdesk_Guy@reddit (OP)
For the time being, yes. Nintendo made the mistake of going with Nvidia, and likely immediately regretted it.
By going with Nvidia for the Switch, they got granted a broken, overheating mess that was flawed from start to finish and handed Nintendo a nice bill afterwards for compensating their customers' broken/dying consoles. Nintendo not only had to initiate a large-scale recall program over busted batteries, image errors and freezing hardware (all due to the overheating Tegra), but also had to deal with a fundamental security flaw of the Tegra itself, which enabled a data leak by which millions of Nintendo accounts were compromised via stolen hardware DRM keys. The Switch sold a lot, though.
That was at a time when other manufacturers hadn't dared to poke that hot mess with a ten-foot stick for years, for good reason.
The funny thing is that many predicted exactly those troubles with Nvidia well beforehand, and many actually felt sorry for Nintendo having fallen for Nvidia's sweet honey-talk: Nvidia dumped their trashy Tegra on them for a fortune (when no one had wanted anything Tegra inside their products for half a decade).
Azzcrakbandit@reddit
The fuck are you talking about. Switch hardware failures were not that bad. I don't know why you have such a hate boner for that specifically.
wizfactor@reddit
There were always pragmatic reasons for Nintendo to stick with Nvidia.
The question is how nice was Nvidia when it negotiated with Nintendo over T239.
Helpdesk_Guy@reddit (OP)
Ironically, Nintendo mostly used ATi/AMD before, like in the Wii U (AMD Radeon Latte GPU), their 100M+-unit mega-seller Wii (ATi Hollywood GPU), and even its predecessor, the GameCube (ATi Flipper GPU).
cp5184@reddit
And the OG Xbox originally had an AMD CPU – even the demo Xbox had an AMD CPU – but Intel undercut AMD on price. Some AMD VP or other just let the pitch go by.
The AMD CEO made it clear that would never happen again.
hhkk47@reddit
It went the other way around. They used AMD/ATI chips from the GameCube up through the Wii U. At the time they were designing the Switch, AMD didn't really have a competitive SoC, and Nvidia's Tegra SoC from the Shield TV was pretty much the best choice for their use case.
Ghostsonplanets@reddit
The Tegra SoC was chosen before it was used in the Shield TV.
Azzcrakbandit@reddit
The Shield TV release date was 2015.
Ghostsonplanets@reddit
The Tegra X1 was chosen as early as 2014.
Azzcrakbandit@reddit
Source?
Ghostsonplanets@reddit
Gigaleak. Nvidia demoed TX1 to Nintendo in late 2013, and the contract was signed in early 2014. Nvidia even did some revisions on TX1 security for Nintendo.
The Tegra X1 Mariko (16nm revision) started being planned around 2016 but couldn't make the launch.
Azzcrakbandit@reddit
That's cool but not a source
Ghostsonplanets@reddit
The Gigaleak was a ransomware attack suffered by Nintendo, iQue, and Broadcom that resulted in tons of confidential files being released. It's literally from the source.
You won't get more than that short of asking God himself.
Azzcrakbandit@reddit
A source = a link
HandheldAddict@reddit
For now.........
Wouldn't be surprised if Nintendo eventually went with Qualcomm or AMD.
soggybiscuit93@reddit
I know there's rumors that AMD is working on an ARM based mobile chip, and AMD had that short lived partnership with Samsung, but I'm willing to bet money that Nintendo has no plans to switch from ARM for their handhelds.
rocketchatb@reddit
Nvidia violated numerous DirectX API specs on PC; Radeon didn't at the time. Microsoft wanted accurate HD graphics, not driver-level hacks, so ATi was the answer.
broknbottle@reddit
IIRC sales of the OG Xbox hit a threshold where console manufacturers usually offer a new price point, like $199.99 or $149.99. The deal they had with Nvidia made that challenging, as Nvidia was not willing to work with MS on the cost per GPU.
It's always seemed like Nvidia acts strictly as a supplier of a part or component and less as a "partner" with a vested interest in seeing the product or service succeed. In hindsight this may be one of the keys to their success, i.e. focusing on Nvidia's problems and not becoming distracted by everybody else's.
imaginary_num6er@reddit
I guess Nvidia doesn't make Tegra chips for Nintendo
Azzcrakbandit@reddit
Those have extremely small profit margins. I'm very surprised Nintendo went with them for a $300 handheld console in 2017, when the PS4 was $400 in 2013.
soggybiscuit93@reddit
The Tegra X1 was a gen old and didn't have much success when Nintendo launched the Switch. They probably got a really good deal on a product that was otherwise selling poorly for Nvidia.
Azzcrakbandit@reddit
I'm really excited to see under the hood of the switch 2 chip.
Real-Human-1985@reddit
Tegra T239.
Azzcrakbandit@reddit
I'll wait until it's officially confirmed
Real-Human-1985@reddit
The entirety of the GeForce Now leak has come to pass.
GrandDemand@reddit
Fair enough. I will say that details about T239 (at least, all of the credible ones) come from the 2022 Nvidia hack OR Nvidia's Linux4Tegra repo. It's not some Twitter leaker with a spotty-at-best track record, it's directly from Nvidia's files. But I totally understand the skepticism
imaginary_num6er@reddit
A $799 Switch 2, that’s what /s
Helpdesk_Guy@reddit (OP)
Except that Nvidia ramped up a new in-house division for semi-custom designs, aiming at a $30Bn+ market – turns out …
MSN.com - NVIDIA Has Been "Calling on Microsoft and Sony Every Week" about Returning to PlayStation and Xbox Consoles
Massive_Parsley_5000@reddit
I think it's more likely that they rather publicly burnt bridges with Microsoft, and soured Sony by woefully under-delivering on the RSX.
PainterRude1394@reddit
Probably has nothing to do with that. There are probably stronger actual business reasons like profit, margins, product risk, etc.
Azzcrakbandit@reddit
The RSX was more the result of it being closer to a last-minute integration. I'd say Sony's poor planning carries more of the blame on that one.
Jeep-Eep@reddit
That and being obnoxious about customization?
Sounds right, though I'd still take Intel over team green in that case between x86 and in house fabbing, if I was going for console silicon and AMD was off the table.
Helpdesk_Guy@reddit (OP)
I mean, it could've really helped Intel's foundry ambitions, I guess.
Such a contract would've been a low-effort, steady justification for keeping the foundry running, and a nice ramp-up 'commodity' to show other potential foundry customers that they're able to support long-term contracts.
INITMalcanis@reddit
I guess Intel hadn't quite finished swallowing their pride at being demoted to "value/volume" supplier status back then
wizfactor@reddit
This was back in 2022, probably when Pat Gelsinger still felt invincible. The following 2 years have been brutal on both Intel and Pat.
vegetable__lasagne@reddit
Broadcom? Would they have made the GPU too?
SPECTOR99@reddit
Imagination is still in business, I guess they could make one.
Helpdesk_Guy@reddit (OP)
It's a refreshing thought that a PowerVR-driven GPU from Imagination Technologies might have come close to current-level console GPUs, but I guess Broadcom got bold and went with: "You miss 100% of the shots you don't take."
That's some brownie points for trying, in my book!
College_Prestige@reddit
What is Broadcom doing in the bidding process? Do they even have something performant on the market, or was their entire bid based on "trust me bro, I can make something good"?
FumblingBool@reddit
They are a major player in designing AI processors (like google’s TPU).
Vushivushi@reddit
Probably the top custom designer. They probably asked for quite the premium.
COMPUTER1313@reddit
Probably trying to get ARM architecture into consoles and thus ensuring that future games can run on ARM natively.
Quatro_Leches@reddit
The PS Vita was ARM.
Helpdesk_Guy@reddit (OP)
I find it a bold move that maybe could've played out with a good dose of old-fashioned engineering, I guess?
I really like that Broadcom came in bold and actually wanted to bring ARM designs!
Majutsv@reddit
I think small profit for huge exposure wouldn't hurt, but yeah, it's Intel...
Substantial-Soft-515@reddit
Another Reuters and Max Cherney hit piece once the stock rises above $20... Reuters is losing a lot of credibility with these hit pieces... An article with almost zero named sources... one that's also about 2022... I hope Intel sues this Max Cherney person for all these hit pieces... He is definitely working with a competitor and trying to lower the stock price...
Helpdesk_Guy@reddit (OP)
You really need to calm down and stop putting Intel in the victim corner though … If you've already zeroed in on some particular news editor, I really think your obsession has got the best of you, dwelling on some nebulous enemy image.
It was never my intention to bring a 'hit piece' (whatever that's supposed to mean anyway…); I just tried to inform. I thought Intel even being in the running, among the last two possible suppliers, to bring a likely Arc- and Core-powered console would've been great news!
Though, as it seems, Intel shot themselves in the foot (again) by wanting to up the price tags – their greed backfired hard.
The last time that happened was when they told Apple to go kick rocks over the margins and refused to deliver Apple their iPhone SoC back in 2007 – with that stupid move, Intel single-handedly spawned the ARM universe as we know it today and gave life to the plethora of ARM powerhouses, multi-billion-dollar market heavyweights like Qualcomm, MediaTek, Broadcom, Samsung and others.
Of course, all of them went on to become Intel's biggest competitors, bringing fierce competition to Intel. So in retrospect it was likely one of Intel's biggest blunders – and as it turns out now, Intel just made the same stupid move again over margins …
Substantial-Soft-515@reddit
HelpDesk guy I hope you are able to get out of the Intel obsession sooner than later... Wish you the best :)
Helpdesk_Guy@reddit (OP)
Didn't know I had an obsession, but …
Worldly_Apple1920@reddit
Why is it a 'hit piece' if it reports news that you dislike? Why is it never Intel's own incompetence driving its debacles? It's always some personal angle.
Substantial-Soft-515@reddit
Look at the 4-5 articles Reuters has published about Intel by the same reporter... It is pretty obvious... I can see why you wouldn't think so, seeing as your comment history is entirely on articles about Intel... 😂
Helpdesk_Guy@reddit (OP)
You know that reporters and/or news editors usually have a desk and a given scope of news, right?
It's called having a ›Ressort‹ – a beat – the area of responsibility within which the editor/reporter usually brings the news.
John has bone-dry politics, Becky has the joyful gossip and tittle-tattle, Allison reports on the ever-endangered environment, Frank has technology locked in to report on, and so on …
It's really not that hard to understand: a given reporter/editor always bringing news on a given topic is called ›just doing his job‹!
So there's really no need to embark on conspiracy theories about someone shorting a stock and bringing 'hit pieces' or whatever.
DaBIGmeow888@reddit
Okay, we have Sony, Broadcom, SoftBank, Qualcomm, the list goes on and on.
If it smells like a duck, quacks like a duck, it must be a mismanaged Intel leadership.
Helpdesk_Guy@reddit (OP)
You know, victim-mentality is strong in some people these days …
Exist50@reddit
First time reading the news?
Maybe they just now got a good source, which would also explain the series of articles as a whole. And if it was so long ago as to be irrelevant, that should surely look better for the stock, no?
Lmao, they wouldn't dare risk discovery. Also, truth is an ironclad defense.
Son_of_Macha@reddit
Wouldn't the better way to put it be that Intel failed to get Sony to move to their chips? AMD has supplied the last two consoles.
Real-Human-1985@reddit
Intel would love to have a real client for their foundry right now.
-protonsandneutrons-@reddit
Per Intel, we should've seen at least Qualcomm on 20A:
https://www.anandtech.com/show/16846/intels-first-highprofile-ifs-fab-customer-qualcomm-jumps-on-board-for-20a-process
20A was also meant to prove "perf/W parity" in 2024:
https://videocardz.com/press-release/intel-announces-3b-dix-mod3-oregon-factory-expansion-and-new-manufacturing-roadmap
Sigh.
Real-Human-1985@reddit
Yup. 20A was supposed to have Qualcomm chips AND both Lunar Lake and Arrow Lake. Now 18A is the promised land.
Exist50@reddit
LNL, at least, was always N3, and ARL had at least one N3 die planned for a long time. The cancelation of 20A is still not a good omen, however.
Real-Human-1985@reddit
I got links of Intel saying LNL was coming on 20A.
Exist50@reddit
Where?
Qsand0@reddit
In his pants 😂😂😂
DaBIGmeow888@reddit
Ever shifting goal posts
Helpdesk_Guy@reddit (OP)
On which process was the PS5's chip made? 7nm?
Real-Human-1985@reddit
Yes, Zen 2 on 7nm.
Helpdesk_Guy@reddit (OP)
Thanks! So it would've been Intel 4 then, I guess?
Real-Human-1985@reddit
Intel's 7nm-class node is Intel 4; not sure what node they wanted to use. The PS6 isn't out yet, and 18A will hopefully be in full swing before it comes out.
Ghostsonplanets@reddit
TSMC N7P for Oberon
TSMC N6 for Oberon Plus
Helpdesk_Guy@reddit (OP)
Ah, okay. So the refresh got N6 then?
Ghostsonplanets@reddit
Correct. Viola is anyone's guess (N6 vs. N4P).
Next-Last-Next@reddit
Intel lost a similar deal with Apple for the initial iPhone chips. They learned nothing from it, repeated what they did then, and lost what could've been a steady revenue stream.
Brilliant executives, hopefully they got a good bonus for this decision. /s
COMPUTER1313@reddit
You joke, but back in the early to mid 2000's, IBM was chasing margins as part of their financial engineering and was willing to bleed customers by charging extremely high costs for the contracts.
And more recently the entire cable/streaming business (e.g. Disney and Netflix) is continuously hiking prices and taking away conveniences. A few of my friends have discussed going back to pirating content if the prices keep going up.
Next-Last-Next@reddit
This is a different dynamic from companies charging end users directly, I feel. You would think they would learn from the biggest miss of all, letting the iPhone go (not that they were guaranteed to make a good phone processor).
Helpdesk_Guy@reddit (OP)
You bet – they had one of the most potent ARM product lines at hand and at their disposal back then (XScale) to supply a given ARM design. They likely turned that down on purpose when losing out on the deal for Apple's iPhone SoC by insisting on higher margins.
Though I think Intel knifed their whole ARM-based department, and any in-house ARM competency whatsoever, fully on purpose when selling the XScale branch to Marvell in June 2006, only months before Apple introduced the iPhone in January 2007.
Without doubt, any discussions and plans for Intel to manufacture the iPhone SoC must have ended by then, and actual manufacturing must already have started at Samsung for the product launch six months later.
Thus, that move was likely a direct and immediate act of defiance after the iPhone deal fell apart over Intel's price demands. It was most definitely a demonstrative and deliberate statement from Intel that they expressly would not support anything ARM going forward.
Intel effectively said with that move that they wouldn't support anything ARM in the mobile space, and consequently considered it just another x86 realm to be conquered – Intel subtly but distinctly declared that their x86, and only their x86, was the way forward and the only viable solution to every problem arising in any foreseeable future, because Intel said so …
Likely the same as when Intel, a couple of months ago, sold all of their shares in Arm Ltd., which was (in my understanding) also a demonstrative and distinct move against supporting anything ARM – this time in the shape of Windows on ARM.
The mere suggestion by many (especially media outlets) that Intel's move was purely financially motivated (over a stake worth only $147M!) was laughable to begin with – I think it was again a strategic move, and a distinctly telling one.
I think Intel maybe feels kind of 'abandoned' by Microsoft – especially the fact that, of all companies, their direct ARM competitor Qualcomm got certified as the only viable and legit supplier for Microsoft's Copilot+ really stung a few at Intel, I guess?
Helpdesk_Guy@reddit (OP)
Shocking! Who would've thought that jacking up price tags in a free market as a non-monopolist may end up with customers going someplace else?! Does the Harvard Business School know about these new market mechanics?! xD
DaBIGmeow888@reddit
Intel bargaining like it's the 1990s. Which is just about where their stock price is at.
Helpdesk_Guy@reddit (OP)
Thx, spilled my coffee … really gave me a chuckle!
I don't know if out of pity or glee, though, but you likely nailed it. Seems they can't help themselves, I guess?
COMPUTER1313@reddit
"We made the next quarterly financial reports look great. Bonuses for us!"
Pulls cord on golden parachute before the wheels fall off the neglected vehicle
ProfessionalPrincipa@reddit
That's more an indicator of the land-grabbing era of streaming coming to an end.
Within the realm of established businesses, think Broadcom after their acquisition of Pat's old stomping grounds at VMware.
LeotardoDeCrapio@reddit
Most executives, with a few rare exceptions at the top level, operate under a principal guideline of maximizing profits. And there tends to be a correlation between maximizing margins and maximizing profits, so that is the bet most of them will follow.
It's part of the whole "fiduciary duty of a corporation's board members to act in the best interests of the shareholders' investment returns".
It can lead to catastrophic losses of opportunity, leaving a lot of potential profit on the table. All because one influential moron named Milton Friedman, who never created a business in his life, didn't understand that new markets and business models can not only be tapped but created as well.
A lot of Intel execs in the '00s were classically trained, so they couldn't understand the mobile market or the business model of being a for-pay foundry for third parties. Apple, on the other hand, was led by someone without that handicap. And the rest is history.
Next-Last-Next@reddit
Whether or not they're classically trained, they should be able to see the opportunity, sense the trends, and give the company direction. What's the point of being a highly paid executive otherwise?
LeotardoDeCrapio@reddit
The expectation is that they are either fired or replaced by the board when they fail to do so.
However, when companies reach a certain size or age, the people leading them tend toward much more conservative decision-making.
It's easy to see the great business opportunities with 20/20 vision of the past. But when it is happening in real time, it is far harder because of the tunnel vision of the present.
E.g., who is going to risk their career on a business model/opportunity whose value isn't clear, because nobody else has taken off there yet? How are they going to gather support within the organization? Etc., etc.
Reddit tends to have little connection with how things happen at those levels. It is not a matter of an executive just getting the millions of dollars needed to implement the iPhone SoC win out of thin air.
Next-Last-Next@reddit
They are the ones who have the decision making capability. They decide on the big ticket items. I get that you can’t do everything but how can you miss out on so many things? Mobile/Console/Foundry/AI/Discrete graphics. Anything I’ve missed?
Of course they’re fired with a golden parachute too.
LeotardoDeCrapio@reddit
In this case, Console is still not a very attractive business.
Real-Human-1985@reddit
Nonsense. Both Intel and Nvidia would LOVE console wins. That's why Nvidia is sticking it out with Nintendo despite not getting their usual price gouging. Those Tegra SoCs would be landfill without Nintendo, and they're not getting "margins" on a $199-$249 product, lol.
LeotardoDeCrapio@reddit
No company loves low-margin design wins that come with lots of pain.
Every contact I've had in AMD's semi-custom group couldn't wait to get out of that group.
Next-Last-Next@reddit
You did notice the report mentions this was for the PS6? If AMD wanted to get out of semi-custom, it could've done so. Zen was already successful, and they'd acquired Xilinx too, so why did they choose to go ahead?
LeotardoDeCrapio@reddit
I didn't mean that AMD wants to get out of the semi-custom business entirely, but rather that people working in that group want to get out of it, because of the pressure and execution schedules.
Xilinx was bought with stock, not cash.
AMD was burning cash for well over a decade; they only became truly profitable a few years after Zen. Furthermore, their GPU division is on life support, and without the console APUs the Radeon group would be SOL, since there is not enough revenue from consumer dGPUs to fund development and their data center GPU revenue is still not stable.
The semi-custom group provides consistent revenue, albeit at low margins, which AMD still needs.
Next-Last-Next@reddit
I listed more than just Console, seems like inherent issues with the decisions made at the top.
LeotardoDeCrapio@reddit
I focused on console because it was the topic of the article.
Obviously the responsibility is firmly on the leadership. I'm surprised their CEO and a bunch of key executives still have jobs honestly.
Next-Last-Next@reddit
Got it, just that it’s in line with everything else in a series of bad decisions. Agree with you on this.
imaginary_num6er@reddit
Just like how Pat lied about Intel having a "healthy dividend", then slashed it to a third a month later in Q1 2023.
DaBIGmeow888@reddit
Intel and gaslighting goes together like PB&J at this point.
Real-Human-1985@reddit
Lmao.
haloimplant@reddit
It's always hard to say how serious companies are about alternate suppliers, versus just presenting them as a possibility to get a better deal from their current one. It's still hard to see anyone competing with AMD on an integrated CPU+GPU solution: Nvidia doesn't have the CPU and Intel doesn't have the GPU.
jenesuispasbavard@reddit
For a second I thought it was written by Mark Cerny lol.
kingwhocares@reddit
They can always go for Xbox.
From-UoM@reddit
AMD would have been completely and utterly screwed in the gaming sector if they'd lost the PS6.
We've seen their gaming revenue tank with console sales slowing down this fiscal year. A huge portion of their gaming revenue and R&D funding comes from Sony.
If that went, oh boy.
HandheldAddict@reddit
Sony actually impacts AMD's graphics roadmap.
Maybe that's the reason Sony isn't budging on PS5 pricing.