Apple discontinues the Mac Pro with no plans for future hardware
Posted by iMacmatician@reddit | hardware | View on Reddit | 266 comments
Apple has also confirmed to 9to5Mac that it has no plans to offer future Mac Pro hardware.
Uptons_BJs@reddit
Makes sense, the Mac Pro was still only available with M2 variants.
End of the day, if the only graphics cards Mac OS will be supporting going forward are Apple integrated graphics, what advantage does a big tower have over a Mac Studio anyways?
jchill2@reddit
This is just a signal to peripheral providers that they can't expect any sort of support
algaefied_creek@reddit
Yeah, honestly there were some (admittedly niche) industry-insider sorts of folks excited about Intel graphics on Apple Silicon Macs.
But I think this year’s B70 graphics card was supposed to be out in like 2022….
So obviously Intel wasn’t able to deliver that product either.
---
Still no excuse for not having AMD support.
Not to mention nVidia has wonderful UNIX drivers and had wonderful Apple drivers in the past.
---
TL;DR: Apple does not want a device that could be considered anti-competitive if it doesn’t allow third-party add-in graphics/compute cards
VodkaHaze@reddit
You don't sound like someone who has mained nvidia graphics on linux. Nvidia drivers on linux have famously hellish compatibility
Plank_With_A_Nail_In@reddit
Nvidia Linux drivers are famous for being closed source, that's all; they are actually no better or worse than anyone else's for compatibility.
Half the problems with Nvidia's drivers are caused by the distro maintainers themselves.
Strazdas1@reddit
They are no more closed source than AMD. Both are partially open source partially binary blobs.
VodkaHaze@reddit
I have a workstation and home server running nvidia on various linux distros over many years.
I'm speaking first hand experience as someone who has wasted several days debugging broken nvidia driver updates, the latest one being a 580.x which didn't work on my linux kernel version.
Strazdas1@reddit
Nvidia drivers on linux are actually the best there is if you use them for things that aren't gaming.
turtleship_2006@reddit
They used to be, they are much better in recent years now, especially since they're (partly) open source now
talkingwires@reddit
I can’t allow my Fedora 43 install with a 2070 to ever suspend or power off the monitors because as far as Nvidia’s drivers are concerned the one connected via DisplayPort will cease to exist. 🫠
Caffdy@reddit
I have three monitors connected to my nvidia card on Fedora 41 and I can sleep, hibernate, connect, disconnect, and swap cables on my monitors with no issue. You probably have a configuration problem.
talkingwires@reddit
But are you using DisplayPort or HDMI? The monitor on the HDMI port works as expected, the one on DisplayPort does not.
Caffdy@reddit
I'm using both
talkingwires@reddit
Well, I am happy it works for you out of the box. I have no idea how one would diagnose the problem and the forum threads I found on the issue have posts from people killing the Gnome Display Manager process, cycling their monitor’s power, and other such remedies that don’t actually fix it.
Young_Maker@reddit
At work on Ubuntu I can't sleep it either, because all my containers lose CUDA support
Dark_Souls_VII@reddit
I'm writing this from my workstation (Xeon W-2295, 256GB, Quadro RTX 4000) with Debian 13. All I had to do was enable the non-free repository and do 'apt install linux-headers-amd64 nvidia-driver', and I'm happily using driver 550 without even thinking about it.
VodkaHaze@reddit
I'm well aware, you're using ancient drivers however.
Even last year I ran into broken 580 driver version updates on various distros (ubuntu 24 and 26, pop os 22 and 24) and GPUs (2060 and 5090).
If you pin your nvidia driver version to some ancient one that is known to work, good for you until you have to update your kernel version and it breaks or something like that.
Caffdy@reddit
Talk about hyperbolizing. The v550 drivers are only a year old, and you should already know that anyone using Debian has reasons to use stable releases of each piece of software, for stability and reliability. You yourself told us about your "broken 580.xx drivers" journey across several distros. Not everyone wants OR needs to keep Nvidia drivers updated to the latest.
Dark_Souls_VII@reddit
Yeah my system is not exactly a gaming PC indeed. The neat part about Debian (Stable) is that the package versions are not changing. Hence the name stable. I just wanted to share my experience here since it was about Workstations and I happen to use one that is at least somewhat comparable to a Mac Pro I guess.
algaefied_creek@reddit
I have CachyOS and nVidia drivers are autodetected, installed and configured during install
algaefied_creek@reddit
Nah just FreeBSD, Illumos, Arch. Everything is already nice and ready to go I guess for me, so they seem lovely.
luciddrummer@reddit
My 3080 has had zero issues since my swap to Linux. I’m on mint and it’s an older card so it’s well supported. 50xx series I might need to be in Arch, idk.
KodaKR-64@reddit
It wouldn't work well or make sense with their architecture at all.
The entire purpose of their integrated GPUs on package is that it can efficiently share memory with the CPU.
So instead of having a limited amount of dedicated GDDR on the graphics card that needs to be copied back and forth, both the CPU and GPU can share the same large pooled memory; you can have up to 256GB of graphics memory.
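Back-of-envelope on the copy cost (the link speed and working-set size below are assumed round numbers, just to illustrate):

```python
# Rough cost of staging a working set into a discrete GPU's GDDR
# versus touching it in place in unified memory. Assumed figures.
working_set_gb = 48        # assumed size of a big video project's assets
pcie4_x16_gb_s = 32        # ~usable GB/s on a PCIe 4.0 x16 link

copy_seconds = working_set_gb / pcie4_x16_gb_s
print(f"one-way copy into GDDR: ~{copy_seconds:.1f} s")  # ~1.5 s

# Unified memory skips the copy entirely: CPU and GPU dereference
# the same physical pages, so this cost is simply zero.
```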
deadgirlrevvy@reddit
There's a very good reason why discrete GPUs have their own RAM: performance. Video card RAM is much faster than shared system RAM, while also allowing the card to cache textures without having to constantly interrupt what the CPU is doing to request new assets. At literally no point in history has shared RAM had more performance than dedicated video RAM, and it never will.
KodaKR-64@reddit
Literally everything you're saying is incorrect, it's honestly kind of funny.
No, a GPU having its own memory is actually slower because the data needs to be copied back and forth between the system memory and the GDDR on the graphics card.
Yes, in fact, it does.
You have no clue what you're talking about. It's been benchmarked extensively.
Apple's new chips are by far the fastest for video editing.
algaefied_creek@reddit
That’s the whole point of above 4G decoding, Rebar, and stuff like Green-whatever’s extended nVidia drivers with memory-mapped pointers?
Treat the system as a cohesive whole
wpm@reddit
Even if the data isn't being shuffled around, it's slower for the GPU on the card to have to read from system RAM over PCIe than it is for the CPU to read system RAM over a DDR5 memory controller, or for the GPU to read its own GDDR over its likely very wide bus.
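Ballpark numbers for the three paths (all assumed round figures, not measurements):

```python
# Approximate bandwidth of each path described above. Illustrative only.
paths_gb_s = {
    "GPU -> its own GDDR (wide 384-bit bus)": 1000,
    "CPU -> system RAM (dual-channel DDR5)": 90,
    "GPU -> system RAM (over PCIe 4.0 x16)": 32,
}
for path, gb_s in paths_gb_s.items():
    print(f"{path}: ~{gb_s} GB/s")
# The PCIe hop is the narrowest tier by far, which is the point:
# crossing the expansion bus is the expensive part.
```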
algaefied_creek@reddit
Well, of course it is, but that's still slower than no actual expansion over time
MC_chrome@reddit
Apple dropped NVIDIA support after NVIDIA tried to blame failing GeForce GPUs in MacBook Pros on Apple, when most of the blame lay at NVIDIA's feet for a poor GPU design.
There was little chance of two multi-billion (now multi-trillion) dollar companies making up after such a public feud, especially when Apple decided to cultivate a relationship with AMD until Apple Silicon was ready.
Helpdesk_Guy@reddit
Yup, that infamous #Bumpgate of Nvidia back then, which most notebook manufacturers had to endure …
Virtually all notebook manufacturers suffered huge losses, including the big ones like Dell, HP, and Apple, because their notebooks' Nvidia GPUs died en masse within weeks to months, on a large scale, making logic-board replacements pretty much inevitable (often well before any warranty ran out).
The whole industry had to run extremely expensive exchange programs free of charge, on their own dime, causing losses worth billions. Nvidia blamed it on TSMC's processes, yet no other GPUs derived from those same processes failed the way Nvidia's did …
Ever since, Apple has had major beef with Nvidia because of #Bumpgate, and replaced Nvidia top to bottom with ATI.
Back then Apple was even legally forced by the courts to extend, and actually RE-open, its already very costly, long-running, and previously closed free replacement campaigns and exchange programs for its notebooks, about a year after those had already ended, after losing consumer lawsuits with major backing from consumer-advocacy groups.
IMHO, every vendor Nvidia supplied GPUs to should have sued it for billions (and downsized Jensen's ego with the awarded damages) for bricking millions of devices. They got away with it …
The whole industry was dumb enough to let itself be bribed with rebates on future Nvidia orders instead!
Creepy_Accountant946@reddit
If there was a case they would have been sued, source on the rebate conspiracy? Are you just a butthurt amd fan?
Helpdesk_Guy@reddit
What?! Where did you come up with anything about AMD, dude?? I didn't even write anything remotely close to that.
I was just explaining the background of Nvidia's infamous Bumpgate, that's all. And there were plenty of rumours back then of Nvidia just doing damage control on all fronts by handing out huge rebates.
AFAIK that was even confirmed by the same outlet that reported Apple ditching Nvidia in favor of ATI, 9to5?
Ekgladiator@reddit
It almost makes me wonder how apple would fare if they did the same thing to the GPU space as they did to the CPU space with the M chips. If anyone has the money and ability to knock Nvidia down a peg, I'd put my money on apple...
Obosratsya@reddit
What did M chips do exactly? Apple's market share in computing is still more or less the same. They didn't break into a new industry. All you got is more of the same.
derzemel@reddit
Pro/studio music production. Here is an example from a professional using the rack-mount version in a studio:
https://www.youtube.com/watch?v=kIQINCWMd6I
Niche, yes, not a small niche though.
phrstbrn@reddit
That use case can be serviced by thunderbolt to PCIe expansion racks, and professional audio companies already sell rackmount solutions to solve this problem. Audio interfaces don't need a lot of bandwidth so daisy chaining one thunderbolt port can multiplex a lot of PCIe cards. It's a non-issue.
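Quick math on why audio barely dents one Thunderbolt link (the channel count and rates are illustrative assumptions):

```python
# How much of a Thunderbolt 3/4 link a large audio session actually uses.
channels = 64             # assumed large studio session
sample_rate_hz = 192_000  # high-end sample rate
bits_per_sample = 32

audio_bps = channels * sample_rate_hz * bits_per_sample
tb_bps = 40e9             # Thunderbolt 3/4 nominal line rate

print(f"audio stream: ~{audio_bps / 1e6:.0f} Mbps")       # ~393 Mbps
print(f"share of one TB link: {audio_bps / tb_bps:.1%}")  # ~1.0%
```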
SharkBaitDLS@reddit
You could slot in non-GPU PCI accessories but that is niche upon niche at this point. The vast majority of people that wanted the horsepower of a Mac Pro will do just fine with a Studio and the few that were adding in PCI accessories will just move to Thunderbolt accessories.
pdp10@reddit
Additional and upgraded network interfaces, M.2 storage, and video capture are not niche on Mac or anywhere else.
JtheNinja@reddit
Many of those have Thunderbolt equivalents and have for many years now. If all else fails, there's always Thunderbolt PCIe chassis too. The Thunderbolt ones cost a bit more, but when the Mac Pro costs $2k more than a Mac Studio with the same stuff on the logic board, that kind of goes out the window.
wpm@reddit
And the Thunderbolt equivalents all come with the added cost of the Thunderbolt controller. And need their own DC power supply. And don't have a secure place to live on the desk. And have higher latency than a direct PCIe connection. And a direct PCIe slot can carry more lanes than Thunderbolt does.
No professional wants to live the dongle life. It's why the trashcan sold so fucking badly, and why there were aftermarket rack mount solutions that cost a fucking arm and a leg.
The Mac Pro as it was, yes, a dead end. But there would be a lot of non-GPU use cases for a Mac Studio with even one or two low-profile PCIe slots.
ScotTheDuck@reddit
Probably at this point the best use case for PCIe on the Mac Pro was custom video encode accelerators for codecs like RED, but even then, the media engine in the Apple chips is so good that it would mostly be redundant.
derzemel@reddit
also pcie cards for pro/studio music production. Here is an example from a pro: https://www.youtube.com/watch?v=kIQINCWMd6I
KodaKR-64@reddit
All of that works over Thunderbolt now.
Thunderbolt 5 is 80Gbps.
alexforencich@reddit
Except the cables are limited to like 6"
KodaKR-64@reddit
How is that worse than having a PCIe card mounted inside the computer itself?
alexforencich@reddit
Bandwidth. 80 Gbps is quite slow compared to what you can do with 16 lanes of PCIe.
KodaKR-64@reddit
Audio doesn't need anywhere near 80Gbps lol
A single track of PCM audio at CD quality is like 1.5Mbps.
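Sanity check on that figure:

```python
# CD-quality PCM: 44.1 kHz x 16 bits x 2 channels.
bitrate_bps = 44_100 * 16 * 2
print(f"{bitrate_bps / 1e6:.4f} Mbps")  # 1.4112 Mbps, i.e. "like 1.5Mbps"

tb5_bps = 80e9  # Thunderbolt 5 nominal
print(f"CD-quality tracks per TB5 link: {int(tb5_bps // bitrate_bps):,}")  # ~56,000
```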
alexforencich@reddit
Other things you might want to connect over PCIe do need that kind of bandwidth.
KodaKR-64@reddit
Such as? They dropped support for graphics cards.
Everything else was a very small niche. Apple said Mac Pro sales were a single digit percentage of all Mac sales.
alexforencich@reddit
Networking is probably the most obvious application. 100-400 Gbps NIC, to access your RED RAW video content on a NAS, or similar. Or maybe an array of SSDs - you can get gen 5 x4 nvme drives now, and you can put four of those in a bifurcated gen 5 x16 slot. In that case thunderbolt would be the bottleneck.
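Rough numbers (the per-lane rate is the usual post-encoding figure; the rest are assumptions):

```python
# Bifurcated gen 5 x16 slot with four x4 NVMe drives vs Thunderbolt 5.
pcie5_gb_s_per_lane = 3.94                 # ~GB/s per PCIe 5.0 lane after encoding
array_gb_s = 4 * 4 * pcie5_gb_s_per_lane   # 4 drives x 4 lanes each, ~63 GB/s
tb5_gb_s = 80 / 8                          # 80 Gbps -> 10 GB/s, ignoring overhead

print(f"NVMe array: ~{array_gb_s:.0f} GB/s")
print(f"Thunderbolt 5 ceiling: ~{tb5_gb_s:.0f} GB/s "
      f"(~{array_gb_s / tb5_gb_s:.0f}x bottleneck)")
```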
KodaKR-64@reddit
Most companies do NAS over 10Gb Ethernet, and I don't know of any smaller companies that use NAS at all.
I'm a video editor and haven't worked at any companies that used it.
Thunderbolt isn't a bottleneck for anything lol
alexforencich@reddit
Well, I do FPGA development and as part of that I do a lot of stuff with 100G Ethernet, and I have some stuff assembled for 400G. Everything needs PCIe x16 slots, thunderbolt doesn't provide anywhere near enough bandwidth. Yes I admit this is pretty niche, but it is still a data point. Flip side, none of the software I use runs on arm/macos, so apple hardware is not a consideration at all.
KodaKR-64@reddit
"It's a niche" is why Apple stopped selling the Mac Pro lol
That's my point. It just wasn't selling enough for them to justify the time and money they put into it.
The vast majority of professionals who use Macs are perfectly fine with their other laptops and desktops.
Believe it or not, my MacBook Air with no cooling fan can easily handle 4K video editing. It's pretty wild.
Things that used to require an iMac or Mac Pro can now easily be done on their laptops. I no longer need a desktop for any reason.
alexforencich@reddit
The thing is, software issues can be worked around to some extent (emulation, virtualization, etc.) but if the necessary IO capability is not there then you're completely screwed. So yeah I guess they work fine if you don't need any oddball high bandwidth IO... But for anyone who does need that, apple hardware is completely out of the question.
KodaKR-64@reddit
And that's such a small niche, Apple clearly doesn't care. No one is forced to buy a Mac.
0xe1e10d68@reddit
I'll take it seriously when an actual professional using the Mac Pro with 100 Gbps networking, who is dependent on it, speaks up
alexforencich@reddit
Great! I am an actual professional and I use 100G and 400G Ethernet in basically all of my computers, along with other PCIe hardware that requires lots of bandwidth. But, I will admit none of the software I use runs on arm or macos.
JtheNinja@reddit
Apple sells a 3m one, although it’s absurdly expensive
justjanne@reddit
Call me old fashioned, but the PCIe connector just feels more reliable than thunderbolt.
KodaKR-64@reddit
Good thing it's a free market, and no one's forced to buy a Mac.
But Mac sales seem to be doing very well, I'd say. They only continue to increase.
JohnPooley@reddit
People are apprehensive to use cables for audio because of the USB audio latency issues which don't exist on thunderbolt
KodaKR-64@reddit
No need for that at all. Even my MacBook Air without a fan can smoothly play back raw 6K RED footage, it's pretty insane. I was like speechless when I got my first M1 Air and saw how smoothly it handled the RED footage.
Try doing that on the 2019 Intel iGPUs and it probably would've burst into flames lol
A lot of things that used to require an iMac or Mac Pro can now be done on a MacBook Air.
wpm@reddit
There are other types of PCIe cards you can plug in you know. It's not just GPUs.
Ok-Parfait-9856@reddit
Audio cards, pcie storage, and fast io would be use cases but extremely niche. Thunderbolt 5 substitutes just fine anyways, it can tunnel pcie 4 x4
pmjm@reddit
As someone who used to run a Mac Pro RAID I wouldn't dare use thunderbolt for it. Too easy for the cat to unplug external drives and then you're fuggggged.
I've since moved on to a better config but it was nice having a huge raid connected to airdrop for a while lol.
turtleship_2006@reddit
I wonder how people would feel about a VGA style version of thunderbolt with the lil screws...
corvaxL@reddit
There are some devices that actually do this. Companies like Sonnet and OWC, on some of their higher end devices, have screw holes above the USB ports, which mount to collars that fit snugly on the cable.
xxTheAIguy@reddit
Does Apple make their own cards?
Uptons_BJs@reddit
Just the integrated GPU right? No discrete GPUs.
xxTheAIguy@reddit
I assume yes. Thank you.
This-is_CMGRI@reddit
Welp, it ded now, long live Mac Studio
jhenryscott@reddit
It been dead. Nobody paying $8,000 for 1/2 a Threadripper
KodaKR-64@reddit
Who cares about Threadripper? lol
A121314151@reddit
Workstation users.
It might not be in your scope (or mine for that matter, a Ryzen 9 is enough for the work I do), but for work that needs many more cores and more power, the Threadripper is not just good, it'll pay for itself in the span of 1-2 months. The amount of money they stand to earn from the work they do on it is so great that it has a much better value proposition.
Time is money in some cases.
simo402@reddit
Not even the cores, the biggest difference is the connectivity via pcie
KodaKR-64@reddit
Well I guess that's the point. It's such a small niche that Apple never really sold a ton of Mac Pros, so it wasn't worth the investment.
Years ago they said it was only a single digit percentage of their total Mac sales, and that most professionals actually were buying the MacBook Pro and their other desktops instead.
Most of their professionals are creative pros like video editors, and those things rely a lot more on good video decoding and encoding.
Apple's chips are by far the best I've seen for video editing. Even the MacBook Air without a fan can smoothly play back raw 6K RED camera footage, which is pretty insane.
Good luck doing that on an Intel iGPU.
A121314151@reddit
Yeah, as for the Mac Pro it's mostly useless in an ARM age. Just get a Mac Studio, it's much smaller form factor and has all the power you need either way.
Threadrippers are an entirely different class though. It's for things such as CAD, physics simulations, heavy calculations, 3D modeling among others in environments where Windows or Linux is preferred, and multi-core is necessary. Different use cases necessitate different devices.
KodaKR-64@reddit
I don't know when "multi-core" meant only more than 32 cores, but... lol
Yes, some tasks (not many for PCs) need more than 32 cores.
JtheNinja@reddit
Most tasks that need more than a few cores scale linear-ish to arbitrary numbers of cores. If going from 24 threads to 32 threads makes your workload 50% faster, there's a good chance you can quadruple it again with 128 threads
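The usual way to model that is Amdahl's law (the 95% parallel fraction here is just an assumed figure for illustration):

```python
# Amdahl's law: speedup = 1 / (serial + parallel/n)
def speedup(parallel_fraction: float, threads: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / threads)

for n in (24, 32, 128):
    print(f"{n:>3} threads: {speedup(0.95, n):.1f}x")
# ~11.2x, ~12.5x, ~17.4x -- and the closer the parallel fraction
# gets to 1.0 (embarrassingly parallel), the closer this stays to
# plain linear scaling out to arbitrary core counts.
```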
KodaKR-64@reddit
Does it? That's not really how most software is written.
Even benchmarking software isn't written to benchmark that many cores lol
JtheNinja@reddit
...yes it is? Have you ever used any kind of software that does some embarrassingly parallel operation? Raytracing, fluid simulation, etc? I assure you they support 128 cores, because their customers want to run them on 128 cores and the algorithms scale to that because again, they are embarrassingly parallel operations and the code is designed to just spawn arbitrary numbers of worker threads as cores are available.
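A minimal sketch of that pattern (render_tile is a made-up stand-in for the per-unit work):

```python
# Embarrassingly parallel work: independent units, no shared state,
# worker count just follows whatever the machine has.
from multiprocessing import Pool, cpu_count

def render_tile(tile_id: int) -> int:
    # Stand-in for an independent unit of work (one raytraced tile, etc.)
    return tile_id * tile_id

if __name__ == "__main__":
    with Pool(cpu_count()) as pool:           # 8 cores or 128 -- same code
        results = pool.map(render_tile, range(10_000))
    print(f"rendered {len(results)} tiles on {cpu_count()} workers")
```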
fullmetaljackass@reddit
Exact reason I'm typing this comment from a Threadripper workstation. Needed to run some industry-specific CAD software that can only render via its built-in CPU-based renderer, but it'll scale across as many cores as you throw at it.
KodaKR-64@reddit
Most software isn't written for that.
KodaKR-64@reddit
Are people using regular PCs for that?
A121314151@reddit
I do, on a Ryzen 9. My work doesn't make enough money to justify a Threadripper however.
NevergofullPJ@reddit
Yes. But preferably beefy workstations with threadrippers.
jhenryscott@reddit
All the people who might have bought the Mac Pro. In my case, BIM/CAD/workstation tasks are reliant on massive amounts of IO to accelerate simulation and modeling. A Mac Pro would be an option, but it's a really bad deal next to the TR and even the Xeon W
KodaKR-64@reddit
But almost no one was buying it. That's why they stopped selling it... lol
Same with workstations overall, really.
NevergofullPJ@reddit
There's still a market for workstations. Just because your only use case for a computer is browsing reddit and office apps doesn't mean there aren't companies who need that performance.
KodaKR-64@reddit
Not from Apple, clearly. Almost no one was buying it.
NevergofullPJ@reddit
Cause it was more expensive for less.
KodaKR-64@reddit
Even back when it was using Intel Xeons, few people were buying it.
Apple always said it was a single digit percentage of all Mac sales. It was never a big seller. Most people just don't need a workstation.
Even tons of professionals don't.
That's why most PC gamers and creative professionals are using chips like the Core i9 or Ryzen 9 or Apple M3 Ultra instead.
moofunk@reddit
Because it was shit when it came out. It was a terrible deal.
At the time, if you wanted to use it for simulation/rendering as a traditional multi-core workstation, it would be 2-3x the cost of a PC with the same performance. You could also buy dual-CPU workstations with more cores at the time for much more performance in the same box.
You could put any GPU in the PC, in up to a 7-GPU config, where the Mac Pro needed specialized AMD GPUs, and you could only fit two.
The Intel CPUs offered at the time for the Mac Pro were a tail-end generation of Xeons, and they were never updated. The Trashcan Mac Pro had the same problem: the CPUs it offered were simply too old and were never updated.
The first cheese grater Mac Pro didn't have this problem. It was the best design, with the most expansion options and the best cooling, and Apple just gave it up.
The only thing the 2019 Mac Pro had going for it was that you could put 1.5 TB of RAM in it.
There really were quite a few reasons the Mac Pro didn't sell. You could simply pack way more power into a traditional PC than you ever could into a Mac Pro.
KodaKR-64@reddit
I don't think the trash can Mac Pro was a bad design, it was just too ahead of its time.
The current Mac Studio is even smaller than that Mac Pro was, but Apple's chips run way cooler and use a fraction of the power.
Some people were bothered by the lack of PCIe slots, but I think that's the way things have been moving for a while now. Thunderbolt 5 is 80Gbps, and the Mac Studio has been really popular with creative professionals.
You also don't get any of the speed benefits or efficiencies of unified memory with a graphics card with its own GDDR.
Even the PC world is moving to SoCs like Qualcomm and Nvidia's upcoming ARM chips.
The only people who still care about huge 50 pound workstations and discrete GPUs are PC gamers lol
moofunk@reddit
What they do on SoCs is just a miniaturized version of multi-channel memory on a many-core workstation CPU; the speed is measured as the combined (aggregate) bandwidth from 4-8 channels of memory, which alluringly outcompetes standard desktop CPUs with dual-channel memory, but doesn't compete with Threadrippers and EPYCs with 8-12 channels of memory in much higher quantities.
The downside that the SoCs have is that they can't get anywhere near the same memory bandwidth as discrete GPUs on the graphics side, so they'll never be anywhere as fast inside the GPU/GDDR memory boundary. Completely forget about it, once there are multiple GPUs in the system.
The upside for SoCs is that they are fast enough for general use, but still not for specialized use.
Not at all. Anyone who needs the PCIe lanes for large GPUs or other specialized accelerator hardware will have to buy such a machine. If you need the cores for whatever reason, you will need to buy such a machine.
You simply can't get anywhere near real workstation performance with single SoCs or smaller enthusiast CPUs. Those things are literally 5-10x faster than the fastest Apple M CPU available.
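The arithmetic behind "aggregate bandwidth" is just bus width times transfer rate (the configs below are assumed, illustrative ones):

```python
# Aggregate memory bandwidth = (bus width in bytes) x (transfer rate).
def bandwidth_gb_s(bus_bits: int, gigatransfers_s: float) -> float:
    return bus_bits / 8 * gigatransfers_s

print(f"dual-channel DDR5-5600 (128-bit):     ~{bandwidth_gb_s(128, 5.6):.0f} GB/s")
print(f"8-channel workstation DDR5 (512-bit): ~{bandwidth_gb_s(512, 5.6):.0f} GB/s")
print(f"wide on-package LPDDR5 (1024-bit):    ~{bandwidth_gb_s(1024, 6.4):.0f} GB/s")
print(f"GDDR6X card (384-bit @ 21 GT/s):      ~{bandwidth_gb_s(384, 21):.0f} GB/s")
# ~90, ~358, ~819, ~1008 GB/s: the wide SoC bus beats desktop DIMMs,
# but a discrete GPU's GDDR is still well past it.
```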
KodaKR-64@reddit
Apple's unified memory is up to 820 GB/s.
The advantage of on-package memory is it can be pooled and shared with the CPU and GPU, data doesn't need to be copied back and forth between the main memory and the graphics memory like with a discrete GPU.
Also, soldered LPDDR5X is generally faster than regular DDR in DIMM sticks.
Apple's Mac Studio is actually by far the best for video editing. It performs better than a workstation with a discrete GPU.
Because the RAM can be fully used by the GPU as graphics memory, the Mac Studio can have up to 256GB of VRAM. That's far more than any graphics card.
moofunk@reddit
That's aggregate bandwidth for Apple's highest end CPU.
A single Nvidia RTX 6000 Pro is 1700 GB/s.
A Tenstorrent Blackhole Quietbox has 2+ TB/s aggregate bandwidth between its chips. It costs less than a Mac Studio.
This is really the weak point of such systems, even though it appears strong: you can't do any other work with it while your task occupies most of the memory. Your only option is to get another Mac Studio.
Dedicated GPUs can transparently occupy 100% of a task without you noticing at all. Yes, the M chip has more memory per chip, but most software that needs this much memory is built today to split memory across multiple GPUs, and it's not a problem to run 4 GPUs in a system for a single task.
It would be far more interesting if Apple made an M CPU on a PCIe card for workstations.
KodaKR-64@reddit
And yet people are having no issues at all with the performance of the Mac Studio, and it's been getting nothing but great reviews.
Apple primarily targets creative professionals with them.
No, they aren't better at everything. Neither are discrete GPUs.
Apple is clearly targeting people like video editors, music producers, etc. with it.
For example, ProRes is a very common pro video format, and Windows has pretty limited support for it, and Intel/AMD/Nvidia GPUs have no native support for it at all.
Apple's GPUs are so good that I can smoothly edit 6K raw RED camera footage on my MacBook Air. On Intel, that required a Mac Pro with an accelerator card.
Good luck doing that on an Intel iGPU...
moofunk@reddit
Because they use fixed-function hardware to encode and decode video. It's an overgrown version of what they spent a decade developing in-house for their phones. That makes them extremely good for video editing, but I have no clue why I would buy such a machine for workstation use.
The GPU will age out in a couple of years and then you'll have to buy a new machine to keep up with new software developments, and again, you still only have one GPU.
KodaKR-64@reddit
Who's telling you to buy one?
It's a free market. Buy or don't buy whatever you want.
Not for video editing, at least. 8K isn't really a thing that's happening. Video will be at 4K for the foreseeable future.
All TV manufacturers have stopped selling 8K TVs now, and no Hollywood movies or TV shows are being edited in 8K.
Their M1 chip from 6 years ago still works very well.
The new MacBook Neo basically has the same performance as the M1, and people have been able to do 4K video editing on that. Pretty wild that a $499 laptop can do that.
moofunk@reddit
I'm just going back to your claim that people would rather buy a Mac Studio over a many-core workstation.
I think you're not at all considering what many-core workstations are used for.
KodaKR-64@reddit
For 99% of people, a Mac Studio is more than enough performance.
Apple didn't care about the remaining 1% any more. That's not a significant loss of customers to them.
KodaKR-64@reddit
No it wasn't.
Apple literally compared the Mac Pro on stage in 2006 to a Dell workstation configured the same, and the Dell was over $1,000 more expensive.
It's just that most people don't actually need workstations, it's a niche.
moofunk@reddit
The machine demonstrated in 2005, not 2006, was a Power Mac, not a Mac Pro.
The 2008 Mac Pro was considered competitive with PCs at the time.
That doesn't change the fact that the 2019 Mac Pro was a bad deal compared to contemporary PC workstations.
KodaKR-64@reddit
No, look at WWDC 2006 on YouTube.
They compared the Mac Pro to the Dell Precision 690 with the exact same specs.
The Mac Pro was $2,499 and the Dell was $3,450.
Okay? But Mac Pros were always a niche, and never a big seller.
Apple always said they made up only a single digit percentage of all Mac sales, and that most professionals instead preferred their other computers like the MacBook Pro, iMac, and now the Mac Studio.
moofunk@reddit
And as I'm saying, that has no bearing on the actual popularity of many-core workstations like you claim it does.
I was generally focusing on the 2019 Mac Pro, sorry if that wasn't clear. It is the 2019 workstation that was considered a bad deal, not the older version.
KodaKR-64@reddit
PC workstations aren't hugely popular either. They're a niche.
Most people just don't need them.
A Core i9 or Ryzen 9 or Apple M Ultra is more than fast enough for 99% of people.
moofunk@reddit
Not speed. Parallelism. Do many things at once.
Your Core i9 can do your tasks fine, but the many-core workstation can do those same tasks, while you do several other things in the background.
The financial software we build at the company I work for can benefit from 128-core CPUs; it's a type of high-precision compute where you must run many different simulations to pick a good one, and that can take days on a regular CPU but a few hours on a very large one.
KodaKR-64@reddit
Yes, but those things are such a small market and niche that Apple didn't feel it was worth being in that market.
They considered making even larger chips, like 64 CPU cores or more, but decided that market was so small it wasn't worth the time and money for such a tiny market.
Apple isn't a chip company like Intel or AMD, and Apple doesn't really sell very much to enterprise.
Intel and AMD can afford to make those 128-core+ CPUs because they get purchased in volume to go into servers.
Apple doesn't sell their chips to anyone else, and isn't in the server business, so making those chips wouldn't make any sense for Apple.
moofunk@reddit
They aren't in the market, because they can't compete and haven't been able to for about 14 years.
They didn't make the larger chips, because they simply can't make them. Their M3 and M4 Max chips are built near the reticle limit. The M5 Max is already split in two dies because of the bigger GPU.
There are grave design and scaling costs to the M-chips: The low power requires many more transistors for the extremely deep design and very wide memory buses they use to achieve the performance.
This leaves very little room to scale up chips to go beyond prosumer level performance. If they were to do that, they'd have to do a traditional split between CPU/GPU into two separate packages.
KodaKR-64@reddit
They could compete, they just have no reason to because it's such a small market, and the costs of making chips like that would exceed the sales of the computers, since they don't sell in volume in servers.
Intel and AMD only make those chips because they get large orders of them for servers.
Workstations are a pretty small market.
They absolutely could.
Bloomberg even reported that Apple had an "M1 Extreme" planned, which was two M1 Ultras put together, but they decided it was such a small niche it wouldn't have made financial sense.
AMD and Intel only make those chips because they have economy of scale with large volume purchases for servers.
Apple would never make their money back on that chip just selling them in Mac Pros.
moofunk@reddit
It's not that small. Apple simply can't produce chips that perform in the same league as Epycs and Threadrippers, but if they could, people would certainly buy them.
There is an economy to it of course, but if specific workstation parts didn't sell, they wouldn't make them, and motherboard makers would not make workstation boards. That's the end of it.
Threadripper and Threadripper Pro aren't used in servers.
The chip was never more than a rumor. It would have been too complex to manufacture, would have hit the upper limit of what could be manufactured at the time, and M1 didn't have a PCIe controller suitable for the task.
That's the design cost I mentioned; it's simply not possible to scale those chips up beyond a certain core count without moving to a smaller node, which is how the newer chips are able to use more transistors and get more performance. Further advances will require an even smaller node.
Intel and AMD can make those chips because, AMD especially, they don't hit manufacturing limits, thanks to the chiplet design; hence their ability to hit very high core counts.
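The yield side of that argument in numbers (classic Poisson yield model; the defect density is an assumed round figure):

```python
# Poisson die-yield model: yield = exp(-area * defect_density).
from math import exp

defects_per_cm2 = 0.1   # assumed defect density

def die_yield(area_mm2: float) -> float:
    return exp(-(area_mm2 / 100.0) * defects_per_cm2)

for area in (100, 400, 800):
    print(f"{area} mm^2 die: ~{die_yield(area):.0%} yield")
# ~90%, ~67%, ~45%: small chiplets keep yields high, while a
# monolithic near-reticle die pays for every defect on the wafer.
```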
KodaKR-64@reddit
Yeah, it is. Workstations are a tiny percentage of PC sales.
Most high-end things are moving to server farms, like AI and heavy compute tasks.
And everything else can easily be done on a Core i9, Ryzen 9, etc.
Very high-end professional graphics rendering is all done on huge render farms, not locally on a workstation.
Again, yes, they could.
No, because Apple isn't a chip company. They don't have any interest in selling their chips to other companies.
They only make chips for their own products.
Mark Gurman is probably the most accurate Apple leaker, and is very well-connected inside the company.
They had designed the chip, but decided not to mass produce it due to the high cost and very limited potential customers.
They could do two different dies on the same package and interconnect them, like they do now, and AMD has been doing for years.
I don't think the rumor was ever that it would be one single die.
Apple uses chiplets now too with the M5 chips; they call it "Fusion Architecture".
moofunk@reddit
That is not necessarily true. There is certainly a lot of compute that has been established in server farms due to new AI software, but there are still many tasks that you can't move off the workstation, because the software doesn't allow it or the data is too sensitive to move out of the house or even the office it sits in.
The stuff we work with can't be moved like that, for example. It's always local to the workstation it runs on.
I don't think there has been any change to how such projects are done in recent years. The full weeks of renders may be done on server farms, but the main work is still done locally on machines with powerful GPUs.
Again, you can do that work on a Core i9, but workstations aren't based on those chips for a reason that you continue to miss.
The chip was only a rumor, sorry. They could have speculated on it as a concept of what it would have looked like, but they could not have produced it, because it would be a bigger chip than anything even produced today.
At best there would have been done an internal engineering study on it, long before M1 Ultra was ready.
They still haven't talked about any Mx Extreme chips, because they are plainly physically too large.
KodaKR-64@reddit
Call up Dell and ask them what percentage of their computer sales are workstations with Xeons or Threadrippers.
I'd bet money it's in the single digit percentage, just like Mac Pros were.
If that's how you want to describe sourced reporting from a journalist, sure.
But his reporting isn't "rumors". He talks to actual employees currently working at Apple and gets insider knowledge.
He reported on Apple's plans to switch to ARM in detail in 2018, a full 2 years before it was announced, and even reported accurately exactly when it would be announced.
The chip was planned, but they decided to shelve the idea.
No... once again, it's because the market is so small for those systems, especially Macs.
Apple could easily do it with chiplets. It would just be too expensive and would be a very low selling product. Not worth all the engineering for such a low volume product.
moofunk@reddit
So, that means there's a market for them? That Apple couldn't compete in?
You have to realize that Apple can feed any rumors they want to a "trusted reporter" and use him to leak information as close to official as possible, and said reporter can also earn good money on being "accurate" with information, because nothing conflicting was leaked.
The idea of an engineered M1 Extreme is unlikely, given the physical challenge of simply building the chip. I don't even think Nvidia could do it.
But, I won't put it past them that they thoroughly studied the problem, maybe in order to understand if future chips could scale that far up.
No, they could not have done it with chiplets. The dies are far too large for that.
The reason AMD can do chiplets is because they are no more than about 80-120 mm^2 in size each, which makes it easier to make cheaper high yield chips.
KodaKR-64@reddit
A very small one.
People were buying the Intel Mac Pros, but after they switched to ARM the product really made no sense any more, since it no longer supported graphics cards or upgradeable RAM.
The Mac Studio is much better.
You know literally nothing about Apple if you think they'd do that lmao
Care to show us all your degree in CPU engineering?
NevergofullPJ@reddit
So you're just stating the obvious. But you think nobody buys workstations? People who need workstations will buy them; if you don't buy them, you're not the target demographic.
You don't buy a formula 1 car to go grocery shopping either, but you don't want to be in a fiat panda when you go racing either.
KodaKR-64@reddit
Not enough for Apple to think it's worth being in the workstation market, especially making their own chips.
Apple isn't a chip company like Intel or AMD, selling their chips in tons of servers.
So it would make no financial sense for Apple to design and manufacture a 64 or 128-core workstation chip for such a tiny market, since they don't have the economy of scale of selling tons of server chips like Intel and AMD.
newhereok@reddit
Because it was a bad deal and couldn't compete
KodaKR-64@reddit
People weren't really buying it in large numbers when it had Intel Xeons either lol
Workstations have always been a niche, including Windows workstations.
newhereok@reddit
Of course, but it was still interesting enough for Apple to make them. Especially in the last few years, though, it didn't make sense at all
rome_vang@reddit
Anyone who needs more than 20 something PCI-E lanes. A graphics card can take 8-16 of them in one go.
KodaKR-64@reddit
And what percentage of the PC market is that? lol
rome_vang@reddit
I have no idea; enough to keep making Threadripper for DIY and workstation offerings through OEMs like Lenovo and HP.
KodaKR-64@reddit
Clearly not enough for Apple.
And what's AMD's market share?
rome_vang@reddit
Not sure what your point is. You asked who cares about Threadripper; well, enough people care for it to continue to exist. Otherwise it would have gone the way of the Mac Pro.
KodaKR-64@reddit
This is a discussion about the Mac Pro...
kabrandon@reddit
Companies that built iOS/MacOS apps and selfhosted their CI runners did. Mac Studio has been a pretty economical alternative option though.
kwirky88@reddit
That’s some serious walled garden.
lord_pizzabird@reddit
Till that gets killed too.
Such a strange time for mac, with the pro ecosystem dying, while the consumer end is absolutely exploding with the Neo and M-series laptops.
KodaKR-64@reddit
What? lol
The Mac Studio has been super popular with creative pros like video editors, and it costs a fraction of what the Mac Pro did.
The M3 Ultra is faster than the Core Ultra 9 285K or Ryzen 9 9950X
nbiscuitz@reddit
aww i need them $2000 wheelzzzz
noiserr@reddit
This shit is precisely why I abandoned the Apple ecosystem. My Mac Mini had died at one point and I couldn't buy a replacement that was of the same performance as the Mini that died. This was right before the ARM switch, and all they were selling were Intel dual cores, and my quad-core Mini had died.
ILikeJogurt@reddit
your mini died and u wanted to buy pro, not the studio? why?
noiserr@reddit
There was no Studio back then. The Pro was too expensive and pretty outdated as well.
nisaaru@reddit
The whole CPU switch by Apple was definitely badly executed. SW and HW support is badly lacking here. But CPU switches, even for Apple, aren't really the norm even though they went through several. At least I don't expect another radical one for a long time.
KodaKR-64@reddit
How was it badly executed?
All major Mac software has been ported to run natively on ARM by now, including professional software like Adobe CC.
And they have x86 emulation for the very few apps where the developer hasn't bothered to port them to ARM yet.
Maybe CPU switches should be the norm. Look at how much better the Qualcomm Windows laptops are vs. x86. They run cool and quietly, and have twice the battery life.
nisaaru@reddit
Badly executed in that they quickly stopped x86 OS and app support. Music, for instance, has huge UI issues, and that's an app to play content bought from their shop. This would have never happened under Jobs.
KodaKR-64@reddit
They didn't.
Intel Macs are still getting updates.
x86 Mac apps continue to open under emulation in MacOS.
What's the issue?
Lol, when they switched from PowerPC to Intel in 2005-2006, they dropped support for PowerPC very quickly. By 2009, they no longer supported PowerPC Macs.
nisaaru@reddit
The issue is that Music and also Safari have major issues on x86. There have been no updates since they bundled both with the OS itself, though in the case of Music it's also an important element of an Apple service.
KodaKR-64@reddit
No they don't.
nisaaru@reddit
Huh, what? These apps have no problems? Not fixing essential issues doesn't mean no support or what?
KodaKR-64@reddit
Them discontinuing a low-selling product after 20 years is why you abandoned Apple? lol
yycTechGuy@reddit
The Mac Pro sucked. The AMD Threadrippers of the day were way faster at 1/5th the cost. Apple pumped the market with every influencer they could find and then the Mac Pro fell flat on its face. Sales were abysmal.
KodaKR-64@reddit
Which model are you referring to? lol
yycTechGuy@reddit
All of them. There wasn't a single Mac Pro model that didn't have a corresponding Ryzen or Threadripper CPU that was faster and way cheaper.
KodaKR-64@reddit
What does that have to do with Apple, though?
Most of the Mac Pros used Intel Xeons, same as tons of Windows workstations and servers.
So what you're really saying is that AMD was faster than Xeons.
yycTechGuy@reddit
Threadrippers and EPYCs are much faster than Xeons. The only thing the Mac Pro did that a Threadripper or EPYC wouldn't do is run the Mac OS.
I know this because I was doing a simulation project that needed serious compute power. I set up an EPYC server.
KodaKR-64@reddit
None of that is really the target audience for Macs, though.
Apple's pro customers are overwhelmingly creative professionals. Video editing, music production, graphic design, etc.
The Mac Pro was great for those things. You'd have a hard time walking into a video editing suite or music recording studio and find one where they aren't using Macs.
The Mac Studio replaced the Pro, so the Pro became unnecessary since most people have no need for PCIe slots any more.
yycTechGuy@reddit
Whatever.
KodaKR-64@reddit
Maybe it sucked for your specific tasks, but it didn't suck for everyone lol
Stingray88@reddit
As a post production manager I’ve overseen the deployment of several hundred Mac Pros and Power Macs over my career… The Mac Pro was dead as soon as they released a version that didn’t support dGPUs and expandable memory.
And don’t get me wrong, I know exactly why the Apple Silicon Mac Pro exists in the state that it does. I just don’t understand who the hell would buy it. What it can do on top of the Mac Studio makes *very* little sense considering the price difference, and these are already astronomically priced machines.
RIP Mac Pro / Power Mac. You gave us a good run. Except the trash can, that thing sucked.
wpm@reddit
It's funny that everyone agrees the trashcan sucked but then half the people in the comments here are like "The modern equivalent of the trashcan is just fine as a replacement for the tower!" Like, come on. It has most of the same problems that the trashcan does. Like the issue with the trashcan wasn't "bad CPU" it was "Dongle-hell expensive thunderbolt shite instead of slots".
StrategyEven3974@reddit
Spoken like someone who never operated on the level that required a Pro solution in their life. The issue wasn't dongles, like, ever with the Trashcan. It's that Apple at the time bet on the wrong future for GPUs and cooling.
Apple thought the future was dual GPUs. And it wasn't. It also thought the future was unified cooling, and it wasn't. This led to underpowered GPUs, and crashing GPUs because of inadequate cooling.
wpm@reddit
How many heatsinks does the Mac Studio have?
StrategyEven3974@reddit
How many discrete GPUs manufactured by another company and slotted into PCIe lanes does the Mac Studio have?
C'mon bro, what are you even talking about right now
Stingray88@reddit
I agreed with you when the machine first released. I did not agree even as early as a year later, and definitely not several years later.
“Dongle-Hell” wasn’t a problem at all. We thought it would be, and in actuality it was the opposite… It was great. The first couple times we took hardware that was normally restricted to big desktops with PCIe slots and plugged it into not even just MacBook Pros… but MacBook Airs, for remote productions… I got it after that. Thunderbolt wasn’t hell, It was liberation.
The real reason the trashcan sucked is because of the cooling system. It was fundamentally inadequate, and the amount of lost time and productivity I spent trying to troubleshoot hard crashes on dozens of machines that always ended up being failed GPUs? That really, really sucked. That is why I hated them.
The Mac Studio is totally fine. It doesn’t have any of the problems the trash can did.
mduell@reddit
PCIe for networking, video capture, video acceleration?
Stingray88@reddit
All of that is more than capable over Thunderbolt. And most shops had already switched to those alternatives after being forced to with the 2013 Mac Pro. It was 7 years between the last PCIe-wielding Mac Pro in 2012 and the next one in 2019. So even folks that held onto their 2012 and older stock for a while still ended up transitioning to a 2013 trash can or an iMac.
mduell@reddit
Slower (1/8th the bandwidth), less reliable connection (no physical retention on Thunderbolt).
StrategyEven3974@reddit
Found the person who has never actually done pro work in their life
Stingray88@reddit
>Slower (1/8th the bandwidth),
Not remotely an issue for the use cases you mentioned. I ran a post facility with over 50 of the 2013 Mac Pros for 6 years. Each machine had 16Gbit Fiber Channel boxes, BlackMagic 4K UltraStudio, and sometimes other accelerators like RED Rocket cards in Thunderbolt enclosures. None of those use cases ever exceeded the 3x 20Gbps Thunderbolt 2.0 busses on those machines.
More modern Macs have Thunderbolt 3/4 with 40Gbps, or Thunderbolt 5 with 80Gbps. Bandwidth isn’t a problem.
>less reliable connection (no physical retention on Thunderbolt).
We’re talking about desktops here. This just isn’t an issue for the vast majority of people. We never had any disconnects with the hardware I mentioned above, and I haven’t had issues with it since then.
If you’re building out a DIT cart, sure, you might wanna ensure your connections are a bit more secure… but there are lots of existing solutions for this already. Just not a big issue in my experience.
PCIe is pretty much only necessary for regular GPUs in my industry… and Apple doesn’t support dGPUs anymore, so that’s a moot point.
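Rough math on why those rigs never saturated the busses (the device figures are nominal/assumed):

```python
# Worst case on one of the three 20 Gbps Thunderbolt 2 busses.
fibre_channel_gbps = 16                         # 16G FC link, nominal
uhd_capture_gbps = 3840 * 2160 * 30 * 20 / 1e9  # 4K 30p 10-bit 4:2:2, ~5 Gbps

total = fibre_channel_gbps + uhd_capture_gbps
print(f"combined: ~{total:.0f} Gbps vs a 20 Gbps bus")
# Only flirts with the limit if both devices run flat out on the SAME
# bus at once -- and the machine exposed three independent busses.
```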
77ilham77@reddit
When someone starts arguing spec numbers and paper figures, you know they never use these things in a professional capacity; just another "tech" enthusiast who froths at spec sheets.
Obosratsya@reddit
Apple stans will die defending Apple. But this is why they are a footnote in computing. It's moves like this that don't instill confidence. Why would any pro shop invest in Apple hardware when Apple can screw your entire segment over and not even blink?
lord_pizzabird@reddit
Being a 3D modeling nerd, I've known so many people who have switched away from Macs, and not by choice, for this reason.
On the GPU end they're better than ever, but the lack of dedicated GPU support makes them a non-starter for anyone seriously working in that field.
EmilMR@reddit
you can do all that through thunderbolt. Video cards haven't worked since the Intel Macs anyway.
heepofsheep@reddit
Can be done over Thunderbolt with PCIe enclosures… but it can get real messy quickly for more complicated setups. I would have loved to get some rack-mounted Mac Pros instead of Studios for machines that have a lot of I/O. Way cleaner, and fewer things to troubleshoot (Thunderbolt/cables).
mduell@reddit
Slower (1/8th the bandwidth), less reliable connection (no physical retention on Thunderbolt).
heepofsheep@reddit
Yeah, definitely not as reliable, since Thunderbolt and the cables themselves can cause issues. In terms of bandwidth… yeah, there's less than a real PCIe slot, but I don't actually need it for my use case. Nothing that I connect via TB4 comes close to utilizing all the bandwidth… there's a 16Gb fiber HBA, sometimes an additional 10GbE adapter, and a couple things that are negligible.
KodaKR-64@reddit
I used the 2013 (trash can) Mac Pro years ago, edited a few short films on it.
The design was just ahead of its time. It wasn't a good choice for Intel and AMD chips that ran way too hot for that design.
Ironically, the Mac Studio is even smaller than that Mac Pro was, it's just that the chips are so much cooler and efficient now.
deadgirlrevvy@reddit
It was regressive and consumer hostile. It removed most of the upgradability, which was the single biggest reason people bought Pro models. The trashcan was just another proprietary middle finger from a company with a history of exactly that sort of behavior.
KodaKR-64@reddit
And yet almost every video production company I'm aware of was using them during that time period...
You're awfully angry about a product you apparently didn't even use lmao
Stingray88@reddit
Definitely ahead of its time. I was optimistic about it at first too. But after years of use in the field… bleh… I was THRILLED when the 2019 model was introduced.
KodaKR-64@reddit
The issue was really that they couldn't refresh it with faster chips because of the poor cooling, so even in 2018 you were stuck using a CPU and GPU from 2013.
I didn't have any issue with the lack of PCIe expansion like some did, I think that's overall a small niche among pro Mac users.
But by 2018 it was really starting to chug with 4K/6K footage, even though our company had the top of the line 12-core model with the D700s and 64GB of RAM.
Exist50@reddit
There were many faster chips at the same TDP. Apple chose not to use them.
KodaKR-64@reddit
The existing chips didn't even run well with that design. They had a large recall and exchange program for the FirePro GPUs overheating, and then the replacements would often overheat too.
When doing a large render job, even with the fan at full blast, the GPUs would start to overheat and glitch out. You'd get artifacts like green horizontal lines across your exported video if the GPU was overheating.
We fixed it by putting an additional fan on top of the Mac Pro lol
Exist50@reddit
Well it wouldn't have been meaningfully different either way with more modern chips limited to the same TDP. And when you have stuff like a literal 2x efficiency difference between GCN 1.0 and Pascal, it's all kind of moot...
KodaKR-64@reddit
Yeah, but if the existing chips were overheating and causing problems due to the ineffective cooling, why would they want to continue updating the same flawed design and continue doing recalls?
Unless the newer chips had a lower TDP, they would've had the same overheating issues.
I don't know why it took them until 2019 to redesign it, but... even a refreshed model with a higher RPM fan probably would've fixed the problem.
It seems they finally accepted it was kind of an overkill product for their user base.
Exist50@reddit
Even assuming they don't take the opportunity to improve anything else, what's the alternative? Keep having recalls with the old design? Just not sell anything? It seems clear that they wanted to drop the Mac Pro entirely, and then at some point changed their mind... and then changed right back again...
KodaKR-64@reddit
Who knows. They've done some things I've disagreed with, and according to reports even their own executive teams don't always agree with each other, like about whether they should've done Vision Pro.
Even Steve Jobs made some mistakes.
If they have the money, it's pretty harmless to release a product that flops, other than maybe some negative press.
I think removing the ports and MagSafe from the laptops was a mistake, and they reversed that after a few years.
In hindsight, I think it could even be argued that switching to Intel was a mistake, but that's my hot take lol
deadgirlrevvy@reddit
The switch to Intel was what brought them back from the brink of bankruptcy, alongside the iPhone. I knew people who bought Macs solely because they were Intel-based, which meant they were able to run other OSes and had an easier time running VMs on native architecture. Intel Macs were several times faster than PPC models out of the gate. PowerPC chips peaked in the 90s and couldn't scale any further. If not for the switch to Intel, they wouldn't have gained what little market share they have now.
Prior to the switch, Macs were super niche. After the switch, they became somewhat more mainstream, but that was also a side effect of the iPhone. Even now, anyone who does something other than email or media production doesn't have a Mac. They only have 15% of the desktop market, which is only a percent or two more than Linux FFS.
KodaKR-64@reddit
I have no idea what you're talking about. Almost everything you're saying has been incorrect.
Um... no. They switched to Intel in 2005-2006, many years after they were profitable again.
Apple was on the verge of bankruptcy all the way back in 1997, but by January 1998 they were profitable again. Then they did the iMac a few months later, which was a huge success.
You're also forgetting the iPod, which by 2004 was outselling Macs.
Not true. The G5 was faster than the Pentium 4 and Xeons at the time it was introduced.
Macs have 25% market share in the US, but yes it varies by country.
Wait until you learn that the vast majority of people actually only use their laptops for pretty basic things like browsing and email and word processing lol
I know it's hard for Reddit to believe, but most people are actually not PC gamers or doing code compiling and things like that.
jeffy303@reddit
It's all geared for a transition towards cloud compute: use something like a Mac Studio for all the production work, and then the final render/mastering is sent to the cloud. And if (which is a big if) the AI bubble bursts, there will be a lot of cheap cloud compute. The obvious losers in this are workflows needing a lot of RAM, but high-speed SSDs make swap memory much less of a joke than in times past.
pppjurac@reddit
It looks good on a bookshelf, as a piece of industrial design and a warning.
americio@reddit
Not the PowerMac-era Apple anymore; they completely left the professional segment a decade ago. Now, if they could get away with charging a subscription and not even selling you hardware, they would.
ItsTheSlime@reddit
The last post place I worked at was using the trashcan mac as an actual trashcan lmao.
heepofsheep@reddit
I took home a couple trashcan Macs that we retired a few years ago. One of these days I'm going to convert one of them into a flower pot.
Stingray88@reddit
Considering the number of those I had fail on me, I should have started using them as actual trash cans!
heepofsheep@reddit
Yeah, I just buy Mac Studios now (sometimes Minis, depending on the use case)… the only reason to buy a Mac Pro that I can think of is that you can cleanly rack mount it, especially if you have fiber or other expansion cards. I always get a little annoyed at the mess of dongles and Thunderbolt enclosures surrounding our rack-mounted Mac Studios… but even then it's still hard to justify the additional cost of the Mac Pro when performance and capability are exactly the same.
laptopAccount2@reddit
Last time I had one of those was during the PowerPC days. I was always so impressed by the cases and general design. Haven't been near one since.
ConsistencyWelder@reddit
I remember how, for the longest time, they kept their "one-button mouse" alive. Not for a good reason; it was obviously a bad idea. It was just their way of "thinking different".
KodaKR-64@reddit
Laughing stock? I don't remember anyone caring really.
deadgirlrevvy@reddit
Good. Once they switched to the trash bin look-alike, the writing was on the wall that Apple had lost its way in relation to the Pro. Better for the line to die off than for its corpse to be trotted around in disgrace.
KodaKR-64@reddit
How do you define "pro"?
Apple has always heavily targeted creative professionals, like video editors.
The 2013 Mac Pro was very popular among that crowd, and widely used (even in Hollywood).
The Mac Studio they sell now is even smaller than that 2013 Mac Pro, and is fantastic for video editing.
deadgirlrevvy@reddit
The Mac Pro. I bought one in 2008, and even though I absolutely despise Apple and Mac OS, it was thoroughly amazing. It wasn't due to their operating system or any of their software, though. The hardware itself was server grade and extremely powerful for its time. Dual quad-core Xeons and registered DIMMs were a game changer. The quality of the hardware was amazing. I purged Mac OS and installed Windows on mine, and it was hands down the best Windows PC I have EVER owned, even by modern standards. Even though my current VR rig runs circles around it now, it still quietly toils away in my shop running my machines (CNC, laser, 3D printers, etc.).
The strength of the Intel-based Macs wasn't in Mac OS or any of Apple's widgets... it was the fact that the hardware was top notch and could run Windows instead. Now, though, not only have they lost Intel, but also any ability to expand the hardware, which is what extended its usable lifespan. I installed many different video cards, USB expansion cards, hard drives, etc. in my Mac Pro, which is one reason why it's still a viable machine 16 years later.
KodaKR-64@reddit
That's... odd lol
You bought a Mac just to run Windows? Why?
You could've had the exact same Intel hardware in any number of Windows PCs, probably for less money too.
Almost no one was buying Macs due to the Windows compatibility lmao
And their current Macs can run Windows, in a VM.
But I have no idea why you'd be buying a Mac just to run Windows, that really makes no sense at all.
Most of the same software is available for both Mac and Windows, so that's usually a non-issue.
deadgirlrevvy@reddit
Believe it or not, the Mac Pro was about $2k cheaper than if I had purchased the equivalent parts and built the machine myself! I was shocked at the time, but it was true.
That Mac Pro was the only prebuilt machine I have ever had. I built every single PC I had before and after it. I even built my first one, decades earlier (a 486DX at 33MHz), myself.
The Pro was the right computer, at the right time, for the right price. All the stars aligned for a brief moment and I do not regret it at all.
At the time, my best friend was a huge Apple guy. He had an earlier Pro, and I was so impressed by the performance that it sold me on the hardware. Like I said, I have never liked Mac OS, but I wanted the sheer horsepower the hardware provided. I actually gave Mac OS a chance: for one year after I bought it, I used only Mac OS, to give it a fair chance to win me over. 365 days to change my mind. I have hated Apple forever. Still do, frankly. But I was trying to be fair. I had never owned an Apple computer, but every decision, every design, every choice they had made for their previous hardware was so consumer-hostile, it made me despise them entirely. The iPhone was the foot in the door with me. I liked it at the time (there was nothing else like it; Android didn't exist yet). I liked Linux/Unix, and the fact that Mac OS was basically Unix with a better GUI was something I liked in theory.
Well, that first year on Mac OS was pure, unadulterated HELL. I hated every single second of it. I found almost no redeeming qualities in the OS. It was, in my opinion, years behind what my Windows machines could do (there's a laundry list of things that Windows had but Mac OS did not have at the time; I'm very much a Windows power user and I take advantage of every single feature, so I sorely missed the creature comforts I gave up). Every app substitute I tried was dog sausage in comparison to the Windows version. I couldn't play my games on Mac OS and had to use VMs, which were nowhere near as performant as running native apps on Windows. Mac graphics software pretty much consists of one app: Illustrator, which I vehemently despise (I have been a CorelDRAW user for decades because it's legitimately a better application in every way). I got so sick of running every single app I used in VMs that on day 366, at midnight, I installed Windows on the Pro and never booted back into Mac OS again. I hated the experience and it brought me nothing but misery and complication. Once the machine had Windows on it, though, I LOVED the Mac Pro. It's the best PC I have ever had.
Unix is fantastic for servers. It's not as good for desktop machines, because nothing good runs on it.
KodaKR-64@reddit
You don't sound very mature, to be honest.
I can't imagine hating a company. Comes across as childish.
I don't use Windows or Android, but I don't "hate" Microsoft or Google lol
They both make some good products I use too.
Um, how so?
Their products are extremely popular with consumers lol
Isn't it something like 90% of teenagers in the US use iPhones?
That's pretty funny, since Windows and Android were both literally copied from Apple lol
Literally most of the same software that runs on Windows is also available on MacOS, so I'm really not sure what you mean.
No, Macs aren't the best for gaming, but otherwise all major software is available for the Mac too.
But hey, some people love devices with bloatware and Candy Crush ads in the start menu, I guess.
el_f3n1x187@reddit
All they had to do was enable the PCIe ports to allow external GPUs for GPGPU tasks, but instead they doubled down on locking the system to Apple Silicon only.
KodaKR-64@reddit
Seems to be working well for them, their sales have only increased lol
Balavadan@reddit
Apple could sell repackaged shit and it would still sell
KodaKR-64@reddit
Really? So why isn't the Vision Pro selling?
Why didn't the iPhone mini sell?
Why didn't the Mac Pro sell?
Apple's had plenty of products in 50 years that flopped.
samcuu@reddit
I'm mostly interested in the case. Once the hardware becomes truly obsolete, I hope I can pick one up for cheap.
S0phon@reddit
If you're fine with Windows/Linux, you can get a McPrue Apollo.
But be warned, the price is brutal.
ChrisOz@reddit
Exactly, I currently have a repurposed G5 under my desk. These would be interesting alternatives.
reddit_user42252@reddit
I thought it was already, tbh. Yeah, it was way expensive, but what an amazing piece of engineering and design. Peak Apple.
Reasonable-Youth7794@reddit
Why not rename the Mac Studio the Mac Pro?
el_f3n1x187@reddit
I wonder what will happen to the very niche use cases that relied on the rack version for audio and video editing.
Remote-Combination28@reddit
You can buy third party rack mounts for them
el_f3n1x187@reddit
can you put recording PCIe cards into the studio?
Remote-Combination28@reddit
With a Thunderbolt adapter, yes.
justjanne@reddit
Can't get the same PCIe lanes though, even if you add janky Thunderbolt enclosures for your SDI cards.
InsaneNinja@reddit
Use the same amount of space/power and swap in four or six Studios. Or 18 Mac minis.
HomemadeBananas@reddit
Guess they’ll have to put Mac Studios or Minis somewhere else. Or buy some sort of rack mount for those.
THE_BURNER_ACCOUNT_@reddit
It's what they always wanted. All Macs are black boxes with no expansion, no upgradeability, no tinkering or using them beyond what they allow.
irrealewunsche@reddit
The Mac Pro wasn't really expandable. Yes, you could put cards in it, but apart from graphics cards, which it didn't support, what would you want to put in it that you couldn't add with Thunderbolt?
laffer1@reddit
It did before the ARM downgrade.
KodaKR-64@reddit
Downgrade? lol
Their chips are dramatically faster and more efficient, it's not even close.
laffer1@reddit
Yes, downgrade. Apple no longer competes at the high-end workstation level. There are no upgrade paths, and repair is difficult. Basically, they're throwaway systems, aka future e-waste.
The chips are great for laptops but suck for desktop and workstation class. Too slow. Not competitive.
nisaaru@reddit
It's only a matter of time until PC desktops use the same designs, with wide on-package memory buses and soldered-on LPDDR chips with no upgrade options. DIMMs, and soon CAMMs, are a bad compromise in both speed and real estate.
Most desktop consumers don't need dGPUs either, or can't afford them anymore. It's only a matter of time until "laptop" APUs like Strix Halo become the norm for desktop PCs too, because they're either good enough or even better at certain sweet spots.
KodaKR-64@reddit
Huh?
The M3 Ultra has 32 cores, and outperforms the Core i9-14900KS in multi-core performance.
That's more than fast enough for 99% of people, including video editors, which is probably the largest group buying the Mac Studio.
Workstations with Xeons or Threadrippers aren't a gigantic market overall, and Apple really only targets consumers and creative professionals with Macs.
I'm not sure what this means, but Macs usually have 7 years of software support. Few people keep their computers that long, but either way, Apple has a recycling program. Their computers are made of glass and metal, and everything is recycled.
laffer1@reddit
The Mac Pro shipped with Xeons back in the day. Had they gone with AMD Threadripper, we could have had 192-core CPUs with multi-terabyte memory configurations, upgradable RAM, and enterprise SSD options up to 60TB+.
Not these doubled-up beater laptop chips.
The 14900KS isn't the best consumer chip right now anyway. Depending on workload, the 9950X, the 9950X3D, or even Intel's 265K or 285K are better.
My 265K and Ryzen 7900 boxes can smoke my work M4 Pro laptop. They also have more RAM and storage, and better GPUs.
KodaKR-64@reddit
I'm confused why you think the M3 Ultra is a laptop chip. It's not.
Yeah, the Mac Pro shipped with Xeons, and it was a niche product. Apple said it was a single digit percentage of all Mac sales. That's why they stopped selling it. Almost no one was buying it.
The M3 Ultra also outperforms the Core Ultra 9 285K in multi-core Geekbench.
Why you think these are underpowered laptop chips is really strange, when they outperform Intel's fastest desktop chips.
RAM isn't an issue at all. The MacBook Pro has up to 128GB of RAM, and the Mac Studio has up to 256GB.
laffer1@reddit
Apple never scaled their ARM chips to workstation class. The Ultra really competes with high-end consumer chips.
For some workloads, they're good chips. However, they aren't really that impressive for general compute and compiler workloads, the things I need a real workstation chip for.
If you want to run a very large LLM on a budget setup, Apple has you covered. If you want to do content creation, they're fine. For other things, Intel and AMD smoke them.
KodaKR-64@reddit
But it's not a "laptop chip" lol
The M3 Ultra outperforms the Core Ultra 9 285K, which is a high-end desktop CPU that uses up to 250W of power...
And it's about to be replaced in a few months by the M5 Ultra, which will be much faster.
I'm not sure how you define "workstation", but tons of professionals use chips like the M3 Ultra or Core Ultra 9 285K.
Apple actually said most professionals were buying the MacBook Pro and 27" iMac, not the Mac Pro.
You can easily edit a Hollywood feature film in 8K on a Mac Studio. It's far from an underpowered laptop chip like you keep saying.
justjanne@reddit
Traditionally, a workstation meant a minicomputer, in contrast to microcomputers like the IBM PC.
We're talking about 128GB of RAM at the same time that PCs were still using 32-bit Windows.
KodaKR-64@reddit
Okay, but in 2026 I'm not sure there's a massive difference between a high-end desktop PC and a workstation, other than it might use a Xeon instead of a Core i9, and support more memory.
Tons of professional work has moved from workstations to just high-end PCs instead, as those chips get faster and faster.
KodaKR-64@reddit
To put things in perspective, their cheapest laptop, the $499 MacBook Neo, actually outperforms their top of the line Intel iMac from 2020 with the Core i9-10910.
That's pretty insane. You can easily do 4K video editing on their cheapest, slowest laptop.
Good luck doing that on a $499 Windows laptop.
laffer1@reddit
I don't care about beater laptops. I can't use a Neo for code any better than I can use a cheap PC laptop. I couldn't even run my dev tools on that thing with 8GB of RAM.
KodaKR-64@reddit
No one's forcing you to buy one lol I don't understand your point here.
I never said they were perfect for every task or every person.
Just disputing your claim that they're only slow laptop chips or that the chips suck.
Even massive Windows/Android fanboy Linus Tech Tips recommends their Macs now and agrees their chips are amazing.
pmjm@reddit
Large amounts of storage that's not easily accidentally yanked out.
tnoy@reddit
It's usually the people with things like multiple Avid HDX or the higher-end Blackmagic cards, though I'd imagine most of them have already abandoned Apple at this point.
mduell@reddit
The Mac Studio has Thunderbolt ports for expansion.
KodaKR-64@reddit
It's not really possible with ARM and the efficient design they have with on-package unified memory.
ARM Windows laptops do the same thing, and I'm sure Nvidia's upcoming ARM chips will be the same too. There are tons of performance and efficiency benefits to doing it that way.
Omniwar@reddit
Writing was on the wall last week when the infamous $700 wheel kit was on clearance for $199.
pppjurac@reddit
LOL what? $700 for something I can get at a farm supply store for €25, rated for 250kg, together with a can of silver paint…
KodaKR-64@reddit
Or the $999 display stand that they now include for free with their new displays lol
Oligoclase@reddit
That deal was briefly voted to the top of buildapcsales as a joke by the community.
ClickClick_Boom@reddit
At that price it's almost worth buying for the meme.
ProZoid_10@reddit
Amazing that the Mac Studio has enough cooling for future products. RIP.
kaden-99@reddit
RIP. It was a cool looking case.
Jayzbo@reddit
Yup, eventually I'll buy up an old one, gut the internals, and build a PC in one of those lol.
kuddlesworth9419@reddit
That is the best use case for any Mac Pro, to be honest. It would be cool if they sold their cases separately with standard internals. They would never do that, though, because Apple.
rizzninja@reddit
Looking at prices, these could be powerful self-hosted AI hardware using Apple Silicon.
Comfortable-Owl-7035@reddit
Apple Vision Pro is next.
JtheNinja@reddit
I think they'll keep that one around at least until they start shipping the glasses; it keeps interest in the platform alive. It might live on through some combo of industrial usage, glasses devkit, and the ever-popular "fancy toy".
(it's the coolest media viewer I've ever used. It just costs $3500 and you can't watch it with anyone else at the same time)
ConsistencyWelder@reddit
I prefer my gardens unwalled anyway.
KodaKR-64@reddit
Never used a Mac, huh?
ScienceMechEng_Lover@reddit
The Mac Pro has a 1400W power supply, and everything in it is designed for optimal airflow. It would be nice if we could get the pinout of the power supply connector so that it can be used as a PC case in the future.
JtheNinja@reddit
Can’t be that hard to figure out a PSU pinout with a multimeter, can it?
ScienceMechEng_Lover@reddit
Yeah, but what about things like knowing which rails have what transient response? It uses a fancy connector that plugs directly into the motherboard (like PCIe cards do), so we can't test each individual wire.
tnoy@reddit
You'd likely need to build your own adapter board. If you assume the 2019 Mac Pro power supply is similar, it's a single main 12.1V rail and an 11V standby rail, similar to what you'll find in some servers and PC workstations: the power supply just provides the 12V, and an adapter board or the motherboard breaks out any additional voltages.
Soldering long, thin wires to the power supply's blade connector to monitor it while it's plugged in would be a simple thing to do; you'd then see what signals are used on the smaller control wires.
ICC-u@reddit
It sounds like a jumbo jet
There are other airflow cases that don't cause so much turbulence
SyzygeticHarmony@reddit
As someone who actually owned one, I'm not sure what you're talking about. It's the quietest PC I've owned, much better than any air-cooled PC build I've made prior to it. It's dead silent.
spky-dev@reddit
Then build your air-cooled machines with Noctuas. I cannot hear my 9950X3D/5090 Astral rig.
ScotTheDuck@reddit
Apple reused the case from the 2019 Intel models. He might well be talking about those, which were jet engines.
loozerr@reddit
Post in thread 'Mac Pro 1.1 Psu to Atx Pc Psu Converisation.' https://forums.macrumors.com/threads/mac-pro-1-1-psu-to-atx-pc-psu-converisation.2445543/post-33634143
Pinout was a quick Google away
did_i_or_didnt_i@reddit
RIP. I loved my 2008 Mac Pro that I got in like 2015 lol.
scrndude@reddit
Man, what a shame. They were experimenting with another chip beyond the Ultra, but I guess the cost just never ended up making sense, especially now.
https://9to5mac.com/2024/03/11/apple-car-chip-four-m2-ultra/
I bet this was meant to be in the Mac Pro, and with the Apple Car using the same chip, maybe it could have been produced in enough quantity to be cost-effective. But the chip never made sense to mass-produce, and the car project got canceled. The Pro has been on life support for a long time.
KodaKR-64@reddit
Even years ago they said the Mac Pro was only a single digit percentage of Mac sales.
Probably made no sense to manufacture such an expensive chip for such a tiny market.
randomkidlol@reddit
This thing was DOA when they downgraded the maximum memory and blocked PCIe GPUs from working. Surprised they kept it around for this long.
Majortom_67@reddit
Shitty product policy
faizyMD@reddit
Damn, Mac Pro really got sunset like that… end of an era fr.
Slasher1738@reddit
RIP. Guess the Studio is its replacement.