Apple M5 Pro & M5 Max GPU Analysis - M5 Max GPU on par with the GeForce RTX 5070 and faster than Strix Halo
Posted by Antonis_32@reddit | hardware | View on Reddit | 340 comments
potatoears@reddit
Looks like the cheapest MacBook Pro config with the 40 GPU core M5 MAX is $4099
A desktop with a 5070 and a similar amount of RAM/storage should be under $2k, and a laptop with a 5080 (more comparable to the desktop 5070) would be under $3k
Educational_Yard_326@reddit
You can always get a cheaper PC, but it won’t have all of: trackpad, keyboard, speakers, hinge, display, not windows, battery life, design. You always have to pick 2 or 3 out of those 8 categories for windows laptops. Macs hit all of them.
N2-Ainz@reddit
That's just wrong
High-end Windows laptops do feature a haptic touchpad, a good keyboard, good speakers, a good hinge, a good display, a good design, and nowadays a good battery life; "not Windows" is not an argument
crystalchuck@reddit
ThinkPad X and Elitebook X speakers are not good even compared to my M1 MacBook Air (yes, I did compare them side by side), despite being substantially more expensive. The higher display options can be pretty good, but that's not what most people are selling or buying. Not Windows is a big argument for anyone who isn't forced to use it, as it is almost universally disliked these days. No, Windows laptop battery life isn't comparable, and sleep actually fucking works without problems on MacBooks.
Obosratsya@reddit
Speakers on a laptop being a selling point is wild to me. It's a fallback option. The very best of laptop speakers will still suck.
Dual booting Linux is always an option. Windows is getting more unusable in strides, hard to argue there, but jumping to the other extreme with macOS isn't a good option either.
PuffyCake23@reddit
Dual booting is a far greater extreme than switching to a different OS that just works properly.
padmitriy@reddit
I have absolutely no problem video calling from my MacBook Air, but I need to carry headphones to call from my work Dell 7770, because it is that bad. Yes, I expect top-tier laptops to have decent audio.
Terrh@reddit
No, some of us care about speakers.
And there is no need for small speakers to suck; there are plenty of examples of small speakers that don't.
N2-Ainz@reddit
The X9 has good speakers and the new X9 15p adds a six-speaker setup. They are very well liked. Samsung's new GB6U also features a six-speaker system.
A display is definitely important and part of what people buy.
Windows is only disliked by a Reddit minority; the majority still uses it, and I prefer it over macOS due to the restrictions macOS imposes on me. Also, Office on macOS is inferior to the Windows version, which is another point for Windows.
Windows battery life is comparable, just check out the X9 15, which achieves great results similar to the MBP 16" while having an 80Wh instead of a 99Wh battery. The new Lenovo IdeaPad features a 99Wh battery and the battery life on it is insane.
Sleep definitely is an issue with Windows to this day
crystalchuck@reddit
You said "high-end Windows laptops", now it's just the X9 that has good speakers? Also, the X9 speakers are good... for a Windows laptop. They're still worse than even a MacBook Air, which is much cheaper.
Yes, it does matter. What I meant is that, unless you're ordering from Lenovo and configuring yourself, you're typically not getting a good screen.
No, people genuinely dislike Windows, or are at least apathetic towards it. No one actually likes it. This is something I hear a lot from customers. If people liked it, Apple's market share wouldn't increase year after year.
Inferior how, and does it matter to most people?
It also doesn't even have the performance of a MacBook Air. That is how they achieve that battery life.
You don't think this is a big issue?
N2-Ainz@reddit
What do you mean, just the X9? I mentioned two devices; how many do you want me to mention? All of them? They were clearly two examples out of the options.
Also yes, you are typically getting a good screen. Samsung has the best ones out there and they are native 3K OLED in every Pro/Ultra order. There are also other manufacturers who nowadays put 3K OLEDs in their base configuration.
Apple's market share increases because their base devices have great value, especially in the USA. Their battery efficiency was basically unrivaled for the M1-M3 generations, and only LNL, which released around the M4, started to get into the battery life game that Apple offers. PTL now offers great GPU improvements while keeping the good battery life that LNL offered, which is really nice. However, the moment you need to upgrade RAM/storage, they become very unattractive, at least in Europe.
Obviously it does matter to most people because Office is the most used software in the world. It is inferior by lacking certain features that are only available on Windows. This can cause compatibility issues too when opening files on devices with another OS. That's one of the reasons why companies won't buy Apple but stick with Windows devices, though there are many others where Apple restricts the usage, e.g. attaching network storage, which is pretty bad on macOS.
Sure, the raw performance isn't as great as the M4 when it launched, but people don't buy a MacBook Air for its maximum performance but for it being the middle option that has decent pricing while offering decent specs. Most of them are used for office work and some do editing with it, but you generally opt for a Pro model if you regularly do editing and stuff like that. However, Intel is closing that gap with newer generations, as they only started to get out of the rabbit hole with LNL, which is not that old. On the other side, Snapdragon's new X2 apparently has way better performance than even PTL, which is going to be interesting. Windows on ARM has improved a lot over the years and a lot of stuff is now native to ARM; the average consumer won't notice apps not working, and more stuff keeps coming to ARM, e.g. KCD2 recently got a native ARM port.
And no, it does not work like that. You can achieve shitty battery life with shitty performance too. LNL is obviously a chipset that keeps good performance while running more efficiently. PTL improves on that, especially in the GPU department as said before, and it still improves on the efficiency. The times where Intel sucked at that are over; they finally understood how to improve their chips like they did before they went downhill with the generations from 10 years ago.
Sure, sleep is an issue, however it is not an issue that would stop you from buying a Windows device. A bigger issue would be e.g. the 60Hz LCD on the MacBook Air that is still a thing. At least they finally bumped the base storage up to 512GB.
crystalchuck@reddit
Sure, go for all of them. As it's going to be a very short list, it's not a lot of work.
That's not what typically means. There's still plenty of X1 Carbons or Elitebook Xs for 1,500 USD and more with FHD panels out there for instance, which is a travesty at this price point. You single out Samsung, but it's not like that's a very popular laptop brand in North America or Europe.
This might be true for some advanced VBA features and stuff like that, not relevant for most people though. Which is why there are millions of office workers typing away on MacBooks.
You can't seriously criticize compatibility issues with Office on macOS, which for most users are minor, if not completely irrelevant, without highlighting the more serious compatibility issues of Office on ARM. Also, basically anything requiring a driver is liable to not work on ARM, which is a lot of things. KCD2 isn't exactly a killer app and who would play KCD2 on a fucking Qualcomm chip?
The same way X1 was going to be amazing? I'll believe it when I see it.
Sure, go for the X9 instead, which is a couple hundred dollars more expensive, but at least you'll also get worse speakers, an ugly design (granted, subjective), worse build quality, and worse performance for a couple hours more battery life.
It does work like that, unless you reinvented physics? If you didn't fuck up the engineering, yes, lower performance means lower power consumption means more battery life (within a certain band of course).
Do you honestly believe people care more about HRR screens than their laptop actually sleeping correctly and having some battery life left when you take it out of your backpack 5 hours later???
Forsaken_Arm5698@reddit
Damn. That's a mountain of text.
boissez@reddit
The X9 15 is about M5 level performance. The M5 Pro and Max is on a whole other level. You can't even get the 16" MBP with a M5.
padmitriy@reddit
So funny. The top-tier Dell 7770 I work on has the worst keyboard I have ever used, and the same goes for the touchpad and speakers.
N2-Ainz@reddit
Then don't buy Dell
There are many companies that produce good devices, take a look at Lenovo's X9 15 or now the 15p (new gen with PTL) lineup
Also Samsung's latest Galaxy Book 6 Pro/Ultra are really good with the X7 chipset
You can't just use one company and claim "Windows bad", as that's not how this system works. Windows can be used by many companies and there will always be shit companies and good companies.
padmitriy@reddit
"work laptop - didn't buy myself"
So some high end windows laptop feature high end user experience, unexpected!
Silverado_@reddit
"not windows" is just factually wrong, you can install Linux on anything PC basically. Now try to install "not MacOS" on the Macbook.
Obosratsya@reddit
Except that you have to live with the Apple "ecosystem". Has Apple made displays beyond 60Hz? Cause it's really hard to replicate the smoothness of a decent display. How about ports?
No matter how well Apple builds their devices, the fact that you're stuck with limited functionality will always make them inferior purchases as far as computing goes. You always need some kind of niche or specific workflow, which is crazy as it's supposed to be a computer.
Educational_Yard_326@reddit
What niche workflow? I’m a photographer and a MacBook Pro is the number 1 recommendation for that. Nothing comes close. Apple has been using 120Hz displays for years and years. Just that statement alone is an indication of your knowledge level.
Strazdas1@reddit
whoever brainwashed you into thinking mac is good for photography....
Educational_Yard_326@reddit
Highest single core performance, highest memory bandwidth. Go to r/lightroom and have a scroll, read all the people complaining about their supposedly powerful windows PCs stuttering and struggling.
Themods5thchin@reddit
Don't bother they're a redditor whining about apple products, it doesn't have to align with reality it just has to make them feel right.
DM_Me_Linux_Uptime@reddit
Gamerbros and PCMR types have legit ruined r/hardware.
Obosratsya@reddit
I don't follow Apple and their advancements.
But you are proving my point. Photography is a specific use case. I don't have to consider my workflows or anything remotely close when buying a computer. My work can change next week and I can be certain that my computer will still work.
Sadly I do have to deal with Apple devices at work. I'm in IT and usually have to be the bearer of bad news that the new Mac they got isn't compatible with Logitech conference room equipment. Literally only Macs don't work. Even Android is better at this.
crystalchuck@reddit
I use Linux, an Android smartphone, and macOS. The Apple ecosystem, which I mostly ignore, has not been a limitation in any way. MacBooks support almost every use case, except for some quite specialized ones where software isn't compatible (some CAD programs come to mind, which are literally completely irrelevant to 99% of people). What limited functionality are you talking about?
Obosratsya@reddit
How about conference room equipment? A very common enterprise use case where Macs fail, and have been failing for a while. I get clients with Macs all the time who huff and puff that their toy isn't compatible, but I can run a whole presentation off of my Galaxy S25.
S0phon@reddit
Conference rooms don't have HDMI?
Obosratsya@reddit
We went USBC a while ago. Only devices having issues are Macs.
S0phon@reddit
You can connect any screen to your Macs...
droptableadventures@reddit
Uhh yes, for years now.
NeroClaudius199907@reddit
Razer Blades are the apple of PCs
Marble_Wraith@reddit
Not anymore.
The problem with Razer is they all got mandatory dGPUs + Synapse dogshit.
I mean OK, looking at the XPS, Macs still do battery design, and therefore power delivery, better. But aside from maybe a 1-2 hour total runtime difference, it's pretty darn close.
In addition, the new Panther Lakes have accelerated AV1 4:4:4 decode and encode. M5s don't.
If Dell stays headed in this direction and iterates on this design, I'll be happy to recommend the 2027 edition (or whatever) to people, and I'll probably buy 3 or 4.
SomeWonOnReddit@reddit
My PC laptop with a RTX 5080, OLED display was only $2500 lol. And this $4000 MBP doesn’t even have an OLED display.
Icy_Composer8357@reddit
The only gripe we have with Windows laptops is the battery life; otherwise they're very good for all tasks, from 3D to video editing.
Hour_Firefighter_707@reddit
Which one? Doesn't matter, really. Unless it was the Zephyrus G16, it is way bigger, way heavier, not as good looking, runs way hotter, way louder, doesn't have a massive haptic trackpad, has absolutely crap speakers and lasts 2 hours on battery editing a video and maybe 5 or 6 browsing the web. And it came with a power brick the same size as the laptop. And the 5080 laptop only has 16GB of VRAM.
Sure, it doesn't have OLED, and the pixel response times are really bad, even on the improved quantum dot panel, but it gets way brighter and Apple knows how to tune stuff so colour accuracy in both SDR and HDR is impeccable.
It's not as simple as comparing performance. They're not desktops. Laptops are a packaged deal and every component greatly affects the user experience
SomeWonOnReddit@reddit
Lenovo Legion Pro i7. And it runs very quiet and it outperforms my 16” M1 Max MacBook Pro easily. And you’re not bringing a $4000 MBP to Starbucks or the library, let’s be honest.
And most of the time when I look around in the library or Starbucks, pretty much everybody has their laptops connected to a power outlet.
The only reason why I bring my M2 MacBook Air outside instead of my Lenovo Legion Pro i7 or 16” M1 Max MacBook Pro is because it is a base spec and I don't care if it gets stolen.
Wisdomnaut@reddit
I might be missing something, but I often bring my MBP to a cafe and rarely plug it into power.
Icy-Pay7479@reddit
They’re probably a broke college kid. Someone who buys a $4k tool will treat it like one. You don’t trash it but it’s meant to be used.
MrRonski16@reddit
Like sure if you just look at the CPU/GPU specs you get the same power for less.
But you do have to understand that there are a lot of compromises for the Lenovo Legion Pro to get those specs at that price.
Shadow647@reddit
I literally have never taken my charger out of the bag not one single time (away from home) since I bought my MBP
Kowbell@reddit
...and if you do, every USB C charger you can find these days is smaller than "normal" laptop charging bricks.
Shadow647@reddit
Yep – my MBP14 manages to charge even from a 20W USB-C charger (while using it!), so there is no point for me to carry anything over 65W, which can be compact as hell. I have Ugreen's Nexode 3-port one with 2xUSB-C and 1xUSB-A with me when I travel, it's the size of a matchbox, and it has enough power to charge my laptop, phone, and headphones or watch at the same time quickly enough to not bother with 1lbs 200W bricks of the past.
oureux@reddit
I work from the public library on my M4 Max 64GB ram 2TB ssd every Monday. AMA
Lanky_Argument526@reddit
Yes the 7i is a 2025 machine. I'm glad it outperformed a laptop launched 4 years ago. (It still has poor battery life compared to said 4 year old model lol).
SomeWonOnReddit@reddit
At $2500, it rivals the $4000 2026 M5 max apparently, while being able to actually utilize the power with video games.
Wisdomnaut@reddit
I'm genuinely curious why you think battery life isn't important or that people don't bring a MBP to Starbucks. One of the main reasons I bought mine was so I could bring it anywhere and not worry about battery.
braaaaaaainworms@reddit
It's because garbage battery life is the biggest downside of gaming laptops
Lanky_Argument526@reddit
It rivals it in only one way: by having a GPU that beats it in video games. Virtually every other aspect of said laptop is downright inferior in extremely notable ways. I don't see how someone can look at a machine that has less than a third of the battery life of the competing machine and call it "competitive"
To summarise, the M5 Max MBP has:
1) 3x more battery life
2) a 20-30% faster CPU
3) a GPU that's superior in AI, video editing and 3D rendering, and only loses in gaming
4) 2x the display brightness and 90% higher resolution
5) 2x faster storage
6) 3x faster I/O
7) 50% more RAM that is also faster
8) a full metal body instead of one that's half plastic
9) similar performance away from the charger
10) significantly lower fan noise under load
This is not "competitive". This is an entirely different product category. Unfortunately irl, a laptop's price isn't just how well it can play video games. Lmao.
996forever@reddit
Asus ProArt P16 with 5090 and Tandem OLED appears to solve all of your complaints. On balanced or battery power plans (typically 40-60w TGP, similar to MBP’s sustained wattage) they are very quiet because the cooling solution is designed to handle well over 100w sustained.
agracadabara@reddit
You can’t even buy this laptop since it is out of stock almost everywhere. Also there are lots of complaints about HW and SW issues and bad support from ASUS.
A spec sheet doesn’t make a good purchase. Also the Macbook Pro provides full performance on battery while being quiet. Performance is going to be in the toilet with the 40-60w TGP on the ASUS. So it is not a comparable solution.
MrRonski16@reddit
Yep, because it has a way brighter mini-LED display.
And a billion other advantages that are not related to the GPU/CPU.
weazelhall@reddit
What’s the color gamut and pixel density of that screen?
jonydevidson@reddit
The difference is that you can take the MBP to an 8 hour train ride and do a full day's worth of work on it and the battery won't drop below 30%, and the laptop will have the same performance as it does when it's plugged in, and it will make zero noise unless you're rendering or running long compiles (or gaming).
Also, the 14" MBP in the link in the OP is less than 1.5kg and charges easily with a small 100W USB-C charger.
How heavy is your laptop, and how big is its charger?
The point is that you're not paying just for plugged-in power. If that's all that matters to you, then great, you can get the product you need for a lot less money.
If you need a laptop to run for 8 hours on battery and let you do the work at the same cadence as when it's plugged in, a Windows laptop that works as smoothly and is as performant as the MBP does not exist.
If you want your entire kit to be lighter than 1.7 kg and have that performance, the Windows laptop does not exist.
On top of that, add amazing speakers, amazing DAC for the headphones (better than 95% of professional audio interfaces' headphone amps), amazing display color accuracy, incredibly fast WiFi, incredibly fast drives.
RecordingHaunting975@reddit
Ok I like my wife's m1 and all since it can actually play a game for a few hours before needing to be plugged in, unlike basically any intel/amd laptop ever
But I keep seeing y'all mac people say shit like
Like who the fuck are you people and why are your 4-figure purchasing decisions always based on 8 hour train rides and being stuck in the airport bathroom for 46 years. You guys really whip out your Mac on the bus??? 20-40 minutes, strangers crowded around you pressing ass to your face, 3 homeless guys yelling in the back and you got your $4000 email machine out? Don't you guys have phones?
Also why do your trains and planes not have charging ports????
I feel like I'm going insane.
stryakr@reddit
it's an easy number to swallow.
say you start work and move between meetings, lugging around your device to each room, where you don't need to plug in.
Then you pack up to go home and you have an hour commute on transit, again no power, so you can just work or be productive.
ShortSightedMongoose@reddit
I just owned one from 2010 that only recently died, so I bought another one fully expecting it'll work for me for another decade+ without issue. The hardest thing mine does is a bit of audio work as a hobby, everything else gets done on my built rig.
jonydevidson@reddit
I personally don't do 8 hour train runs all that much, but even at 4x per year, if I can get a day's work done instead of wasting that time, that's 2% of my yearly working hours.
I cannot afford to drop that time if I have projects and deadlines.
996forever@reddit
Asus Zephyrus G14 with 5080.
jonydevidson@reddit
When you unplug it, performance drops by 50% and the battery life is nowhere near that of a MacBook pro.
Just idling with the dGPU on will drain it ridiculously fast.
996forever@reddit
Why the hell would a laptop with graphics switching idle with the dGPU on?
jonydevidson@reddit
Because you have to disable it manually and restart the computer for it to take effect, and people are just lazy.
Even if you ignore this point, which I'll concede is a bit iffy, the 50% performance tank on battery is unacceptable, and even more so given how the battery then doesn't last nearly as long as that of a MacBook.
996forever@reddit
No you don't, are you living in 2015? Even in 2015 Nvidia Optimus existed and graphics switching was on the fly. Are you talking about when the dGPU is directly driving the display, bypassing the iGPU? Even that has been solved with Advanced Optimus, and a MUX switch no longer requires a restart.
Putting a typical slim gaming laptop on battery/silent mode usually reduces the TGP of the dGPU to around 50w. According to the article the MacBook Pro 14” sustains 44w and the 16” 62w under load. If they do not lose any performance on battery that means it’s still 44w and 62w unplugged. How can it physically last much longer when everybody has the same battery size limits? A watt is a watt, a watt-hour is a watt-hour.
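To put numbers on the watt-hour point, here's a throwaway sketch (assumptions, not measurements: a ~100 Wh battery, i.e. the airline carry-on ceiling everyone designs against; the 44 W and 62 W sustained figures are the article's numbers quoted above, and 50 W is the dGPU-only TGP estimate from this comment):

```rust
/// Runtime in hours for a given battery capacity (Wh) at a sustained load (W).
fn runtime_hours(battery_wh: f64, load_w: f64) -> f64 {
    battery_wh / load_w
}

fn main() {
    // ~100 Wh is the practical battery ceiling (airline carry-on limit).
    let battery_wh = 99.6;
    // Sustained loads: the article's MBP 14"/16" figures, plus a typical
    // ~50 W dGPU-only TGP for a slim gaming laptop on a battery profile.
    for (name, load_w) in [("MBP 14\"", 44.0), ("MBP 16\"", 62.0), ("dGPU alone", 50.0)] {
        println!("{name}: {:.1} h at sustained load", runtime_hours(battery_wh, load_w));
    }
}
```

Under those assumptions, the identity holds: at equal sustained wattage and equal battery capacity, runtime is the same regardless of vendor. Any gap has to come from lighter loads, where idle efficiency differs.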
jonydevidson@reddit
Because you're not running sustained max load all the time.
When you're doing tasks that do not require your CPU to be working at full throttle, macOS and Apple Silicon have way better power optimization.
Estbarul@reddit
Are there actual benchmarks or is it all Apple feelings?
jenny_905@reddit
A lot of Apple evangelists seem to exist in this distant past. You hear some absolutely wild claims.
Quatro_Leches@reddit
am ngl. retina displays look better than your run of the mill oled in overall use, especially considering how people use laptops.
996forever@reddit
At mbp price range you’re not competing with “run of the mill oled” but tandem ones with up to 1000nits sustained full screen brightness.
Lanky_Argument526@reddit
It's nice that you chose not to name your laptop, because you immediately know how bad it would be in comparison in every other scenario.
SomeWonOnReddit@reddit
Lenovo Legion Pro i7 and is superior to my 16” M1 Max MacBook Pro. It even works with my Apple Studio Display.
You do realize that most people watch YouTube outside of the house on smartphones, right? They don't grab a $4000 laptop on the train to watch YouTube.
Lanky_Argument526@reddit
The Legion Pro 7i has a 275HX chip. Loll. I don't think I have to mention anything more. Yes, people don't buy laptops to watch YouTube. But the laptop you just quoted has utterly dogshit battery life compared to the M5 Max.
In reviews of laptops with the same chip, the battery life advantage is nearly 3-4x in favour of the Max. Forget about YouTube; unless you take a power brick everywhere with you, it's practically useless.
Never mind the multitude of other inferiorities.
The display is OLED, but significantly lower res (3.4K vs 2.5K) and half the peak brightness (1000 nits SDR vs 500 nits SDR). It does have double the refresh rate.
The laptop is half plastic, unlike the fully aluminium MacBook Pro.
It also seems to start with less memory (48GB vs 32GB), at least in the $2500 model you quoted.
It has less than half the storage speed (Gen 5 vs Gen 4)
It also has no support for TB5.
The trackpad's significantly worse.
I can go on and on.
SomeWonOnReddit@reddit
Who cares about just a "275HX", which is around M3 Max in performance. Real world is what matters.
And in the end, the Lenovo Legion Pro i7 is able to utilize the RTX 5080 in the real world because you can run games.
There is no point for me to upgrade my 16” M1 MacBook Pro or M2 MacBook Air, because what real world benefits will there be as there are hardly any games on Mac.
Lanky_Argument526@reddit
Maybe it's shocking, but there are people who buy these expensive machines and don't play video games on them.
If you're a creator whose workflow is dependent on 3D rendering (Blender/Cinema4D/After Effects), or video editing, or even local LLMs, the M5 Max is significantly better than the 5080/5090m in virtually every way
Method__Man@reddit
I bought a 5080, 285k, 64gb, 4tb desktop for $2200 this month. During the price apocalypse
Keulapaska@reddit
How?
Like even before the ram price craze that's pretty decent price for the full system with those specs, but with the ram price hike ram+storage+gpu alone is more than $2200.
MartiniCommander@reddit
Microcenter has the Alienware for this. I bought one to use for a golf simulator
Method__Man@reddit
There are still sales to be had. Especially open box stuff
Vathe@reddit
A little bit disingenuous to drop used prices in a thread about new prices, but whatever floats your boat.
chicken101@reddit
It also doesn't sound like that includes mobo, case and power supply
Method__Man@reddit
It's an entire system. The whole thing.
It was an MSI Aegis prebuilt. Easy to find. You can also find Acer and HP prebuilts.
Vathe@reddit
I'm guessing he bought an open box prebuilt, but yeah only something like that or a crazy sale approaches $2200.
The cheapest 5080/285k/64GB combo right now on pcpartpicker is about $2400. Not even including the rest of the PC.
Lanky_Argument526@reddit
For 1550? No offense, but that just sounds implausible even before the RAMpocalypse.
certainlystormy@reddit
i got a predator helios with a 275hx + 5070 ti and it was $1600 on sale in january.. yeah no 😭
Lanky_Argument526@reddit
Yea. That's a 5070ti, not a 5080 like you quoted in the previous comment. Nvidia segments the 5070ti and 5080 quite far apart due to their VRAM differences.
Pensive_Goat@reddit
There are Mac mini and Studio SKUs with the Pro, Max, and Ultra chips; those are going to be better value than MacBooks.
MartiniCommander@reddit
That’s a bit ignorant considering I’m using my MacBook on a flight to Phoenix right now
Real_Ebb_7417@reddit
The difference is that on a Mac you have unified RAM, while on a desktop VRAM and RAM are separate and communication between GPU and CPU is super slow. It depends on what you want to do with your PC. For many use cases a MacBook with lots of RAM will be 10x cheaper than e.g. a workstation with an Nvidia GPU with a comparable amount of VRAM.
LastChancellor@reddit
and the maxed out Asus ProArt P16 also costs $3999!
cookedart@reddit
But keep in mind that:
ARM_over_x86@reddit
Except it has 24GB of VRAM and the battery lasts 5 minutes
DrKenMoy@reddit
username checks out
KolkataK@reddit
The power draw when playing Cyberpunk is 86W on the M5 Max; the battery lasts a little over an hour. I cannot find any power draw figures for Blender on M5 chips; interested to see how battery life under peak load compares to other x86 laptops
ARM_over_x86@reddit
I expect the M5 to do better on productivity software, not on games. Also, you're gonna want to look for battery life on equivalent load between the two, not peak load
agracadabara@reddit
You got a link to where one can buy it? I can’t seem to find that configuration for sale on a quick search.
Skiddie_@reddit
I've been loosely following it since its release and I've never once seen it in stock on the ASUS website.
abhinav248829@reddit
But it runs Windows…
Who wants that
Plank_With_A_Nail_In@reddit
You are only comparing some of its features, not all of them; you are always going to find something if you cherry-pick like this.
d0m1n4t0r@reddit
Blame the title that's comparing it to the 5070. 5070 laptops are cheaper.
Terrh@reddit
$10,399 for a reasonably loaded one.
Snoo-42683@reddit
The test is vs a laptop 5070. A desktop 5070 absolutely smokes both. Laptop 5070s can be had for $1200 in the right places; hell, a 5070ti could be had for that price a month or two ago.
Does anyone in this subreddit actually read the articles?
MrRonski16@reddit
The issue with windows highest end laptops is that they are more like portable desktops rather than laptops. They really can’t do things properly with battery power.
DerpSenpai@reddit
Yeah in this scenario, it's best to use TB5 and put the workload into the desktop GPU lol
boissez@reddit
Laptops are and always have been more expensive for a given performance point. That's the price of efficiency.
What's remarkable about the MacBook Pros is the performance to power consumption ratio.
Which is also why they can cram RTX 5080 performance into a 14-inch laptop.
You pay a premium for that, just like you pay a premium for a laptop with rtx 5070 performance.
Lanky_Argument526@reddit
Irrelevant comparison. He's comparing gen-on-gen gains between the 16-inch M4 Max and the 14-inch M5 Max. Every other reviewer has the laptop coming in even faster than the laptop 5090 in Blender/Cinebench GPU-accelerated workloads
Cubanitto@reddit
No matter how impressive these MacBooks become, until I can run my 5000+ Steam games along with GOG, EA, Battle.net, and Ubisoft launchers natively, they’re just overpriced and useless to me.
PaulMetallic@reddit
MacBooks are not for gaming. They are productivity-focused machines.
deusXex@reddit
If you want proper productivity machine, buy high end desktop PC. It will wipe the floor with any Mac. And as a bonus, you can also play games.
makavili@reddit
Bros stuck in 2019
Cubanitto@reddit
I agree, very expensive productivity-focused machines
mr_tolkien@reddit
This is more on the devs than on Apple. A ton of great games run great natively on Metal, and with LLMs it really must be getting simpler and simpler to port the graphics stack to Metal.
cookedart@reddit
It also has a lot to do with the user base. Macs have about 15% of the worldwide desktop computing market share, and make up only around 3% of all devices. It's harder for devs to want to make custom ported games for a market whose penetration is that small. The argument flips on its head for iOS, where devs are enthusiastic because it has a major market share.
noiserr@reddit
You could dual-boot Intel Macs with Boot Camp. So it's on Apple if you ask me. They chose to use their phone architecture.
Cubanitto@reddit
Game developers face challenges with Mac due to smaller market share, technical differences in programming, and higher development costs. I hate to beat a dead horse, but again, this is why Macs suck for gaming.
mr_tolkien@reddit
Macs are not great for gaming but I played Baldur's Gate 3, a ton of DotA 2, Disco Elysium, and many great indie titles on my Mac.
I wouldn't say they outright suck outside of AAA gaming, and even then Capcom released RE Village for example.
If macOS market share continues to grow, I wouldn't be shocked to see more and more Mac versions of AAA games.
Loose_Skill6641@reddit
apple doesn't care about gaming and never has
Haunting-Public-23@reddit
Apple's chosen business model & tech is not profitable for PC gaming.
Their time/talent/effort/resources/money is better spent elsewhere like the $499-599 Macbook Neo.
Kougar@reddit
But they should. It's pretty silly to go through all the cost and effort of building a capable GPU and then not take full advantage of it; it's just money left on the table at this point.
midnightbandit-@reddit
It's a tiny amount of money compared to their core market
Brostradamus_@reddit
On the other hand, apple already makes five times more revenue from gaming than Steam does. They just do it all through the far, far bigger mobile market than the PC.
They're the single most valuable gaming company on the planet.
stryakr@reddit
casual gaming > serious gaming. which is true in so many spaces, so why spend more for less revenue
Deep90@reddit
Would have been a way easier way to expand their customer base compared to something like the Apple Vision if you ask me.
braaaaaaainworms@reddit
Gaming market is too small to care about
Cubanitto@reddit
You are incredibly uninformed. In fact, gaming is a larger entertainment segment than books, movies, and music combined.
Brostradamus_@reddit
Only if you're lumping the mobile gaming market into that number. And if you do that, well... Apple is then the biggest player in gaming.
Strazdas1@reddit
gaming market is literally the largest entertainment market in the world. Larger than books, movies and music combined. It is the second largest GPU use market after AI too.
NeroClaudius199907@reddit
Apple already generates a lot of revenue from gaming. That 30% App Store cut is handy
Strazdas1@reddit
Apple generates gaming revenue from the mobile marketplace, not from desktop software.
NeroClaudius199907@reddit
Guess it's just opportunity cost they're willing to forgo
Strazdas1@reddit
Yes. But this isn't because the gaming market is too small to care about, as the person I replied to claimed.
Cubanitto@reddit
That's a different segment; again you're trying to compare apples and oranges
NeroClaudius199907@reddit
Nothing is technically stopping them from supporting gaming; they simply don't want to because perf/$ will be bad. Just get a gaming laptop.
They could easily support Vulkan & support Proton to get access to nearly all games with a translation hit. But they won't; it's okay/bad.
JoshuaJoshuaJoshuaJo@reddit
And most of that is in mobile gaming, where the gaming capabilities of the hardware barely matter
Strazdas1@reddit
Mobile gaming is the largest part, yes. But hardware for mobile gaming does actually matter. There's a reason why mobile GPUs are now better than a 980.
Cubanitto@reddit
Exactly, that would be like Apple making a phone that can't make calls
mr_tolkien@reddit
Apple is literally the company that makes the most money from gaming in a given year, from App Store sales.
They care about gaming, but just mobile really.
Cubanitto@reddit
And that's why I don't care about Apple, never have
Strazdas1@reddit
Their loss.
forsakengoatee@reddit
Wouldn't be surprised if Apple looks to piggyback off the work Valve is doing
somekindofswede@reddit
While Valve are definitely a part of it, it’s misleading to suggest it’s their work that would be piggybacked. For MacOS, CodeWeavers are almost definitely a more important commercial contributor to e.g. Wine than Valve is.
Even for Linux, Valve are at most a fraction of all the pieces that make gaming on it better day by day.
superkickstart@reddit
These macs are basically video editing machines and nothing else.
Cubanitto@reddit
And for the price I'll stay with my AMD/Intel boxes instead because I'm not paying those prices
pashale@reddit
You know a GPU can be used for much more than gaming, right?
Cubanitto@reddit
My Lenovo does video editing, I work spreadsheets, and I use it for video conferencing for my job, so yeah my computer does a lot of cool stuff. But gaming that's my passion and apple sucks at it.
Sauceror@reddit
They said "useless to me". Your statement does not apply to their personal assessment of how useful it is to them.
pashale@reddit
Oh I missed that part! His comment is pointless anyway.
The equivalent would be me commenting how a Ferrari doesn't do what I want from a car, so it's a useless piece of garbage; like, my opinion on the matter.
braaaaaaainworms@reddit
People nowadays use social media as their personal websites
EastvsWest@reddit
What's the point in writing that? Does the world revolve around you?
Cubanitto@reddit
Technically the planet revolves I just live on it.
To quote: "To the world you may be one person, but to one person you may be the world" (often attributed to Dr. Seuss) and "One person can make a difference, and everyone should try" by John F. Kennedy
Lanky_Argument526@reddit
If you're buying a $4000 machine to play video games, well hey, it's your money.
SupermarketAntique32@reddit
Dang. When will Apple Silicon chips plateau? It keeps increasing every year.
IBM296@reddit
Apple's chip engineering team is probably the best in the industry.
Evilsushione@reddit
The same guy that designed Apple's silicon designed AMD's. There is one dude who is a legend and basically designs everyone's chips, and he hops between companies every few years. He gets there, and whoever he is working for ends up with huge gains for several years after he leaves, so he must set them up on a roadmap for years. He's literally worked for everyone, and when he does, you know it. Dude is 70 now though, so I don't know how much he has left in him. His newest thing is his own company designing AI chips.
voyager256@reddit
Then why do Apple chips still generally suck at graphics applications and especially gaming performance (even with native/optimized games)? The current fastest Apple chips, the M4 Max and M3 Ultra, are way slower than an RTX 5070 Ti or 9070 XT, let alone a 5090. The new M5 Pro/Max probably won't improve that much, except for AI workflows, of course (M5 Max/Ultra chips are supposed to be a few times faster at prompt processing).
And for AI applications, despite it being pushed so much on YT etc., Apple's M4 Max or even the fastest Mac Studio (M3 Ultra with 256GB or 512GB) is still painfully slow for any models that need 100GB+ of memory or so. Running other models in parallel is an option, but they will probably compete for, or be bottlenecked by, the GPU and memory bandwidth. So in general, unless you can also run other stuff on it like VMs, which mostly utilize the CPU rather than the GPU, it's not that great.
It's very efficient in terms of power usage, physical size etc., but this advantage is completely dwarfed in practice when you need serious performance. There's a reason Apple Silicon is nonexistent in the data centre/enterprise market.
You can cluster/connect, say, 4 such Mac Studios to get a decent local LLM rig, but that's not that cost efficient either. The M5 Ultra is expected to be released within a few months though, so we will see what its price/performance ratio will be.
thatonegamer999@reddit
Apple silicon is non existent in the datacenter market because they don’t sell their chips to datacenters.
The comparison was to a laptop 5090, which is significantly slower than a desktop 5090. I don't find it too hard to believe that the M5 Max is comparable to a desktop 5070; I have an M4 Pro and it's comparable with a desktop 3070
voyager256@reddit
?? They don't sell? So you think they ban datacenters? LOL
thatonegamer999@reddit
They don’t make products for datacenters. No apple silicon machine supports ECC memory, or remote management, or easy servicing, or rack mounting, or linux, or windows server, or redundant power supplies, or VGA output, or any hardware security standards.
They’re consumer machines, datacenters have no interest. There are ARM based servers like Ampere Altra which have seen rising adoption despite their cores not being quite as fast as modern x86 server chips. If apple were to make a proper server board + chip it’d sell like hotcakes. They have the fastest cores on the planet by a wide margin and they’re incredibly efficient.
voyager256@reddit
So why don't they make them if it'd apparently be profitable? They don't really need most of these things you mentioned, especially things like Windows Server or VGA output LOL.
Also, what does ARM architecture have to do with the topic? It's not like Apple CPUs or "x86 server chips" are any good for AI (and we are still talking about data centers, right?)
Anyway, it seems like you have no idea what you just pasted.
What cores? Certainly not GPU, since a single RTX 5070 ti is much faster than the fastest GPU in an Apple Mac Studio. An RTX 5090 runs circles around it in any application. As for data centers: yeah, Apple silicon is nonexistent.
CPU is more complicated, but e.g. for gaming the 9800X3D is faster.
M5 single core is certainly fast, but its main advantage is overall energy efficiency.
thatonegamer999@reddit
Apple doesn’t sell its SOCs to datacenters because they simply don’t care.
Datacenters != AI, and yes in datacenters features like vga are wanted. Most servers have a VGA port accessible on the front of the server, so technicians can plug in a monitor on a cart they wheel around.
Apple has no interest in producing dedicated GPUs for datacenters simply because any GPU compute likely needs CUDA, which they can’t offer. AMD gpus are also almost nonexistent in modern datacenters for the same reason.
Apple could absolutely compete if they released a datacenter cpu, but again they’d have to design an entire machine from scratch for a market segment they’re not interested in. Current apple products are simply not suitable for datacenters due to the missing features I outlined above. It’s also very hard to get into the datacenter market as customers are hesitant to swap providers. Intel still has market dominance despite AMD offering objectively superior products at lower prices for this reason.
A 9800X3D is not faster for gaming than an M5; they are not comparable in any sort of fair way. There's currently no way to run both CPUs with the same GPU, unless you want to use the integrated GPUs in both, in which case the 9800X3D gets slaughtered simply because its GPU is good enough for display output and not much else. An M5 would likely be faster though, as it has much higher memory bandwidth and significantly higher per-core performance.
voyager256@reddit
I love the downvote bots, but no actual counterarguments
Evilsushione@reddit
It’s what happens when you own the whole stack. You can make optimizations that no one else can.
HolyPrepuceOfJesus@reddit
Plus you don't have to deal with the compatibility baggage x86 carries generation over generation for decades now
ARM_over_x86@reddit
It's not that big of a deal; the internal engine of CISC processors has been running RISC since the 90s, and the cost of decoding CISC internally is offset by things like better caching.
Apple is ahead because of their vertical integration, but Panther Lake has shown it's not impossible to catch up.
Haunting-Public-23@reddit
But can they catch up at scale? Does Windows 11 in its latest build deliver performance-per-watt efficiencies that fully utilize Panther Lake?
Evilsushione@reddit
Intel is big on donating compiler tweaks to make their stuff run better. So I wouldn't be surprised if Intel made Windows run better.
ARM_over_x86@reddit
Couldn't tell you how well windows optimized it, but overall it was certainly a big step forward. It's now comparable to the arm based Snapdragon windows laptops, though they still both lose to M chips in perf/watt and single core perf. Battery life improved a lot, that's what most people are going to notice
Familiar_Resolve3060@reddit
It's not x86 or ARM, it's the PC industry making bad chips
dannybates@reddit
True, but doesn't all that also take up space on the CPU die?
Strazdas1@reddit
yes, but estimated total is less than 1% of the chip size, so not very significant impact.
R-ten-K@reddit
No. That is not how that works.
Apple silicon and the systems built on top of it still follow the same principles as the rest of the industry.
There is no black magic about vertical integration in terms of engineering or breaking the laws of physics.
The vertical integration benefits kick in more in terms of financial benefits, where Apple can extract revenue from the final product and distribute it in more adaptive ways than non-verticalized vendors can.
Evilsushione@reddit
No, it works exactly like that. You own the silicon, you make it for yourself, you don't worry about supporting every other possible vendor, and you can strip out all the things you don't need. You write your software for your silicon and you can concentrate on optimizing for your silicon. You can do things like optimize for your version of SIMD and forget about things like AVX512. One specific optimization Apple is able to make is a truly unified GPU/CPU memory model, where other companies can only do a flexible shared memory model. There are a ton of things you can do when you own the whole stack that make the whole greater than the pieces.
R-ten-K@reddit
you're confusing a reduced configuration space with performance.
The stuff that makes Apple silicon so fast is stuff like their microarchitecture and access to better nodes earlier, and one of the most competent uarch and silicon teams in the industry, etc.
sure they can control some parameters better, but Apple's performance advantage is genuinely not that dependent on their vertical integration engineering-wise. If Apple cared enough to support windows drivers, for example. Windows on ARM running on a Macbook M5 would still obliterate most Windows Laptops.
Evilsushione@reddit
A reduced configuration space makes room for more compute, so it absolutely makes a difference. Windows might work great on Apple silicon, but it won't run as well as Apple's OSes do, because they were made for each other.
R-ten-K@reddit
What I am saying is going over the top of your head. Got it.
R3Dpenguin@reddit
All the optimizations in the world are not going to save them if they keep dropping the ball on the software. I use a Mac at work and every update it keeps getting worse.
Haunting-Public-23@reddit
I expect Apple to do a Mac OS X Snow Leopard-style "zero new features" release before 2030 to remove x86-specific code. Hopefully it brings improved performance, greater efficiency and a reduction of the overall memory footprint.
Evilsushione@reddit
Yea, really dislike the new UI choices on my iPhone. I really wish they made a better OS and stopped locking people in. Don’t force people to use your ecosystem, make it the best so everyone chooses to use it.
smarlitos_@reddit
Yeah, they shoulda just stayed on Big Sur for a long time and added a few features like remote iPhone control as a default app. They change OSes too often imo
kirk7899@reddit
Yea, designing for one CPU stack and OS is probably easier than x86's wide range of processor/GPU and OS combinations.
Evilsushione@reddit
I have a project where I want really performant Rust code. I have to make platform-specific optimizations on every platform I support, so this is a really big deal. I could probably tweak performance even more if I only supported one platform.
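For illustration, a minimal Rust sketch of the kind of per-target dispatch being described (hypothetical code, not the commenter's actual project): one hand-written NEON kernel for aarch64, where NEON is a baseline feature, and a portable fallback everywhere else, selected at compile time with cfg.

```rust
/// Portable fallback: sum of squares of a slice (LLVM autovectorizes this).
fn sum_squares_portable(xs: &[f32]) -> f32 {
    xs.iter().map(|x| x * x).sum()
}

/// NEON path: NEON is baseline on every aarch64 target, incl. Apple Silicon.
#[cfg(target_arch = "aarch64")]
fn sum_squares(xs: &[f32]) -> f32 {
    use std::arch::aarch64::*;
    let chunks = xs.chunks_exact(4);
    let rem = chunks.remainder();
    let mut acc = unsafe { vdupq_n_f32(0.0) };
    for c in chunks {
        unsafe {
            let v = vld1q_f32(c.as_ptr());
            acc = vfmaq_f32(acc, v, v); // fused: acc += v * v
        }
    }
    unsafe { vaddvq_f32(acc) } + sum_squares_portable(rem)
}

/// Everywhere else: use the portable version.
#[cfg(not(target_arch = "aarch64"))]
fn sum_squares(xs: &[f32]) -> f32 {
    sum_squares_portable(xs)
}

fn main() {
    let xs: Vec<f32> = (0..1024).map(|i| i as f32).collect();
    println!("{}", sum_squares(&xs));
}
```

A cross-platform project has to write, test and tune one such branch per target (NEON, AVX2, AVX-512, ...); a vendor that only ships one architecture only ever maintains one.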
beefsack@reddit
Not just owning the whole stack, but having a huge amount of power and control at the manufacturing level. Very few companies have the ability to do fully custom silicon for their own retail products.
Specialist-Buffalo-8@reddit
Not that impressive once you realise that the flagship nvidia consumer chips are defective cut down versions of their enterprise products...
Nvidia's junk is the world's flagship.
IBM296@reddit
The 5080 consumes 300 Watts of power. The M6 Max being able to beat that with 100 Watts will be very impressive.
g33ksc13nt1st@reddit
too bad the macOS team is not...
HolyPrepuceOfJesus@reddit
German chip design and engineering teams are killing it
Haunting-Public-23@reddit
My guess is when die shrinks stop shrinking?
Of course chip microarchitecture matters but without our rapid pace to sub-1nm nodes this wouldn't be happening as fast.
vegetable__lasagne@reddit
I'm guessing the chips are getting larger and larger so eventually they'll hit a size limitation which will slow things down.
Pleasant_Witness_113@reddit
The A19 Pro delivered the exact same gains while shrinking in die size compared to the A18 Pro. So did the M5. They are not making the chips larger and larger; they achieved these gains while maintaining area on the same node.
TurnUpThe4D3D3D3@reddit
AI perf is expected to keep improving massively every gen. It seems like we’re nearing the limits of single core CPU perf though.
Hour_Firefighter_707@reddit
In 3D rendering and high end video editing it is very similar to a 5090 laptop. And it can do it all on battery.
If your workflow exists on mac, there is no reason to buy a windows laptop
Various-Inside-4064@reddit
Performance-wise, yes, sure, but maybe I want an OLED screen? Or possible upgrade options. I think that might be the only reason someone might buy Windows now!
Hour_Firefighter_707@reddit
Oh of course there’s reasons. I don’t have a Mac myself because I wanted a high refresh rate OLED screen and I got my Omen Transcend 14 for the same price as an M4 MacBook Air. It is also faster than the MacBook Air and can play games, and I can install Bazzite on it if I wanted to.
But if you’re a web developer or a designer or a 3D artist or a pro video editor and need a laptop, it should be a MacBook Pro. Unless you’re an architect. For some reason, Apple has never targeted architects. None of the software works
ItsTheSlime@reddit
Idk. On paper I've always kept seeing how it performs better in benchmarks and stuff, and then in practice it always seems to fall short.
Not to mention that the cost of that performance is roughly double the price of any equivalent machine, and unacceptable displays for that price.
crshbndct@reddit
What is wrong with Apple's displays? They aren't OLED, but they are still of very high quality with all the correct color grades etc.
ItsTheSlime@reddit
They're made to look nice to consumers, but are notoriously near useless when it comes to stuff where accurate colors are needed.
The resolution is also completely non-standard for zero reason which means that any color work needs to be scaled up in unorthodox ways, potentially messing up the image. Instead of a true 4k screen, where in a 4k project each pixel is reproduced at a perfect 1:1 ratio, you get all the pixels stretched out to a weird 1:1.2749 or whatever, making pixel peeping really difficult.
Add to that all kinds of software tweaking like True Tone that just shifts all your colors around.
Every single time I work with clients on MacBooks they say that the grade "doesn't look quite right" on their machine, only to come in person and say that it's perfectly fine when watching on a calibrated monitor.
Funnily enough, the iPad Pro displays are actually considerably superior in every way and make very decent (client) reference monitors, in spite of the weird resolution. If there was a way to hardwire one to a computer to use exclusively as a monitor, I'd probably get one.
agracadabara@reddit
Utter nonsense. MacBook Pros have been shipping with reference presets that are used for pro work, where True Tone and such processing is turned off.
https://support.apple.com/en-us/108321
They also ship with the Pro Display Calibrator, which can be used to calibrate the displays.
https://support.apple.com/guide/mac-help/customize-calibration-pro-display-calibrator-mchlff4659b7/mac
You clearly have not used a Mac for any studio work.
kuddlesworth9419@reddit
I worked at an architect company for a little bit and they all used Mac desktops but this was some 15 years ago or so. It was a pretty terrible experience on them to be honest which is probably one of the reasons why I hate using MacOS.
S0phon@reddit
Nvidia is still way faster in 3D modeling and rendering.
igenicoOCE@reddit
But you just said there's no reasons
Hour_Firefighter_707@reddit
If you're getting paid to do something, why wouldn't you get the best tool for the job? A video editor or a 3D Artist using Blender doesn't need an OLED screen, but if they're looking at a laptop, they probably want portability, battery life and good performance on battery.
All the reasons I mentioned to not get one comes down to either cost, compatibility or what you want as an enthusiast.
Educational_Yard_326@reddit
I work with architects who use Macs; they love it, but they use ArchiCAD whilst everyone else uses Revit, which is annoying for compatibility.
GhostReddit@reddit
Aren't OLEDs more power hungry? And do you really want that in a screen that's fixed to your device, where burn-in and stuff is more of a concern?
Good IPS screens on laptops can be very accurate and still do good color, and given a lot of them are used in brighter environments I'd argue are generally more useful.
S0phon@reddit
Then buy a monitor.
Various-Inside-4064@reddit
What type of useless answer is this? What if I want OLED on my laptop because I use it on the go? There are a ton of other scenarios
HotRoderX@reddit
You have to be insane to buy an OLED laptop, knowing that they will burn in and that doing static work on them isn't a good idea.
OLED is better than what it was. Better than what it was isn't the same as perfected or foolproof.
Not to mention the coatings used on OLED screens are insanely hard to clean. That screen's going to look like trash within a year or two if it's even remotely used as a laptop.
violet_sakura@reddit
LCDs are not immune to degradation as they tend to develop non uniform discoloration over time. As of now OLED isn't the most suitable for productivity but it is great for gaming due to low response time and true blacks compared to mini LED
jaju123@reddit
MacBook ultra later this year will be OLED.
Personally I've never seen any burn in on any OLED device I've ever owned in the past 10+ years
Hour_Firefighter_707@reddit
My laptop will be due a replacement long before I burn its screen in.
Various-Inside-4064@reddit
Nothing is foolproof. Most displays will eventually be useless, but people don't buy laptops to use for the next 10 years anyway? Also, the burn-in is real but exaggerated a lot. There are some real tests I saw, done by RTINGS, and it was worst-case testing. Still, most OLED TVs survived!
On a laptop, people aren't gonna leave a static image up for months anyway!
HotRoderX@reddit
An OLED TV isn't the same as an OLED monitor.
The tech's similar but not exactly the same. On top of that, a normal display will be easily cleanable and won't be as delicate. I mean, you do you though.
If you want an OLED, or anyone else does, more power to you. I owned two and found the downsides far outweigh the positives.
The way Reddit makes OLED sound, you'd think it was the second coming of CRTs, which picture-quality-wise are superior to anything we have today in a lot of cases.
alexandreracine@reddit
The thing is, NVidia will be faster with some codecs, and the M5 will be faster with others, so it's a "look at your workflow" game.
PM_ME_SQUANCH@reddit
Codecs aren't terribly relevant to 3D workflows; that's more a video editing concern. The CPU thrashes my 7950X, and the GPU is finally Nvidia tier. If I could get dual 5090 performance I'd probably be close to switching for my day to day
macgalver@reddit
On one hand, the Macbook pro is so reliable, great performance, super long battery, vivid screen, very portable, no bloated AI intruding into every single thing you do.
But on the other hand Blender is about to implement Nvidia DLSS to allow for realtime viewport rendering, so I might have to buy that AI bloated Windows machine instead.
Forsaken_Arm5698@reddit
So M5 Ultra will match RTX 5090 Desktop ?
Hour_Firefighter_707@reddit
Based on the Cinebench 2026 scores I've seen, maybe. It is scoring around half of what a 5090 scores. But Ultras never scale 100%, so it will probably fall ~20-25% short.
In the Blender Open Data Benchmark, multiplying the M5 10-core by 8 gets you to within 10k points of a 5090, but again, due to scaling, the gap will be bigger.
I would estimate it lands around where a 5090D V2 sits, but comfortably over a 4090 for 3D work
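For what it's worth, the estimate being made reduces to Ultra ≈ 2 × Max × scaling efficiency. A throwaway sketch of that arithmetic (the Max score below is a placeholder until Open Data gets M5 Max uploads; the 75-85% efficiency band is the assumption above, not a measurement):

```rust
/// Estimate an Ultra score from a Max score: two dies minus scaling losses.
fn ultra_estimate(max_score: f64, scaling_efficiency: f64) -> f64 {
    2.0 * max_score * scaling_efficiency
}

fn main() {
    let m5_max_score = 7000.0; // placeholder, not a measured number
    for eff in [0.75, 0.80, 0.85] {
        println!("{}% scaling -> {:.0} pts",
                 (eff * 100.0) as u32,
                 ultra_estimate(m5_max_score, eff));
    }
}
```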
rpungello@reddit
And keep in mind, whatever performance it hits will likely come while drawing significantly less power.
An M3 Ultra Mac Studio draws ~300W at full tilt, which is nearly half the 5090's power draw alone. Factor in the rest of the system the 5090 is installed in and you're probably more like 700-800W, which is 2.5x the M3 Ultra.
cookedart@reddit
Why does power consumption matter in this case? Isn't raw performance what matters more in a desktop configuration?
rpungello@reddit
Have you seen energy prices these days?
cookedart@reddit
And what is the cost of taking longer to do the same task? How does your wage compare to those energy prices?
rpungello@reddit
Many GPU-heavy tasks (like rendering/exporting) can be left to run overnight.
Yes if you're a full-time pro who is constantly hammering the thing a faster GPU is probably going to pay for itself, but if you're a hobbyist or just starting out, energy costs matter a lot. And for every watt of heat your system puts out, you need an extra watt of cooling potential. All adds up.
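As a rough illustration of how that adds up (all assumed numbers for the sake of the arithmetic, not measurements: a ~400 W draw gap between systems, 4 h of heavy load a day, $0.30/kWh):

```rust
/// Annual cost of an extra sustained power draw, in currency units.
fn annual_cost(delta_watts: f64, hours_per_day: f64, price_per_kwh: f64) -> f64 {
    delta_watts / 1000.0 * hours_per_day * 365.0 * price_per_kwh
}

fn main() {
    // ~400 W gap, 4 h/day of heavy load, $0.30/kWh -> about $175/year,
    // before counting the matching extra cooling load mentioned above.
    println!("${:.0}/year", annual_cost(400.0, 4.0, 0.30));
}
```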
cookedart@reddit
You're not incorrect, but I'm also not sure a hobbyist is getting a Pro Apple laptop in that case. The cost savings of getting a midrange desktop PC would also negate the energy savings.
rpungello@reddit
I guarantee you tons of hobbyists are buying MacBook Pros, even reasonably high-spec ones.
With RAM/storage prices being what they are these days, the cost differential isn't nearly as much as it once was. And with rumors that Windows 12 will require a subscription, I think Apple is only going to gain popularity amongst enthusiasts and hobbyists.
cookedart@reddit
I think you're missing a key point though. If a hobbyist has the budget for a $3500+ CAD computer setup for 'just starting out', energy cost and performance per watt are also not going to be their primary concern. I truly believe the performance-per-watt argument is only really critical if your workflow requires you to run on battery while doing creative tasks (editing and sending pictures/video from a sporting event to an editor comes to mind).
And I'm not sure I fully agree with your cost differential assessment: it's clear that for Apple to be on a level playing field with a 5070, you have to get the Max configuration of the 16", which is $5399 CAD. Most pre-built computers with a 5070 Ti aren't much more than $3500 CAD. I would say that is a significant price difference, even accounting for having to purchase a decent screen.
Caffdy@reddit
look at Mr. Moneybags here with unlimited access to power. Not everyone has access to cheap energy you know, there's a whole world out there with billions of people living in countries where energy is a luxury
SPACEXDG@reddit
Also won't matter since nvidia has cuda
SPACEXDG@reddit
Won't matter in the slightest once Nvidia starts putting their GPU architecture on the same efficient Arm cores Apple uses, in an iGPU. iGPUs draw less power than dGPUs anyway, whether from Apple or anyone else.
Pleasant_Witness_113@reddit
Ultras scaled to 80-85% of full performance at worst. It will almost definitely match a 5090. In Blender, it has a good chance of beating it.
Hour_Firefighter_707@reddit
According to the Blender Open Data Benchmark database, 80-core M3 Ultra is 75% faster than 40-core M3 Max
Pleasant_Witness_113@reddit
My conclusion was based on M2 Ultra scaling, it seems. My bad. Even then, in Open Data an M5 Ultra would be scoring around 13500. That's insane considering the GPU power consumption in an Ultra chip is around 150-180W compared to the 5090's rated 575W.
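If the scores really do come out roughly equal, the perf-per-watt comparison collapses to a simple power ratio. The 165W below is just the midpoint of my 150-180W estimate, not a measurement:

```python
# Efficiency ratio under the assumption of roughly equal Blender scores.
m5_ultra_gpu_watts = 165   # midpoint of the 150-180 W estimate above
rtx_5090_watts = 575       # Nvidia's rated power for the 5090

print(f"~{rtx_5090_watts / m5_ultra_gpu_watts:.1f}x perf/W advantage")  # ~3.5x
```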
Culbrelai@reddit
Maybe for 4 minutes before thermal throttling lmao
Sopel97@reddit
source?
notam00se@reddit
Blender Open Data hasn't had any M5 Pro/Max benchmarks uploaded yet; however, base M4 → M5 is almost 60% more rendering performance (1080 → 1737). The M4 Max 40c sits between the 5060 Ti and 5070 Ti, ~900 points lower than the 5070.
Applying that same 60% to the M4 Max would put the M5 Max comfortably over the 5090 laptop and knocking at the desktop 5080.
Blender 3.3 LTS is from 2022, two LTS versions back, and is EOL. It was also the first Blender release to get Metal RT support, which has gone through many updates and improvements since then (Blender got Metal and Metal RT support before Apple released hardware with hardware ray tracing)
QuickQuirk@reddit
Unfortunately not a valid comparison: the base M5 got more GPU cores and an increased power budget, which is responsible for a lot of the performance boost.
The M5 Max retains the same TDP and the same core count. There is a generational performance boost, but it will not be anywhere near the 60% figure you're quoting.
notam00se@reddit
Most of the performance comes from the neural accelerators in the GPU that are new for the M5. A direct comparison to the M4 cores' computational performance isn't applicable.
Draw Things, an AI image tool, benchmarked the M5 in the iPad outperforming the M2 Max in the Mac Studio with optimized models
Hour_Firefighter_707@reddit
Check other reviews, like from Hardware Canucks and iPhonedo. Apple sent a 14" M5 Max to Notebookcheck and that chassis cannot cool a Max chip. It is throttling badly. Everyone else's numbers are a lot higher
Different_Lab_813@reddit
And those results look fishy, and it's hard to analize them since methodology is not shared
v-a-g@reddit
analyze* lmao
doctorcapslock@reddit
HE DIDN'T STUTTER!
Pleasant_Witness_113@reddit
What methodology? Hardware Canucks clearly states the scene ("Lone Monk", a well-known Blender benchmark), the resolution (1440p), GPU acceleration, and also mentions that it is using Nvidia OptiX for hardware RT. What more do you need?
iPhonedo compared gen on gen with previous Macbooks.
Familiar_Resolve3060@reddit
Blender 3.3 has bad Metal support, yet it still reaps results.
And on the latest Blender it absolutely destroys anything.
Pleasant_Witness_113@reddit
The Blender 3.3 benches have obvious issues. The M5 Max is slower than the M4 Max in those rankings, which is obviously impossible. Other reviewers (Hardware Canucks) tested Blender and it's faster than the 5090 laptop GPU there.
sid_276@reddit
And with way more memory, not constrained by 32GB of VRAM. I think that is a big win for complex viewport-to-render pipelines with massive amounts of meshes. 32 feels limiting.
cookedart@reddit
If you're rendering that sort of work, does a laptop actually make sense? Also, in order to not "be constrained" by 32GB of VRAM on the MacBook Pro, you're looking at a configuration of $6k CAD at minimum. Possible to do, but a very questionable value proposition. And if pure performance is what you're after, desktop GPUs are still the way to go.
trichocereal117@reddit
That’s what the RTX Pro 6000 is for
996forever@reddit
No laptop variant exists. Mobile Blackwell tops out at 24GB no matter how much money you can throw at Nvidia.
NeroClaudius199907@reddit
That's really good. Apple caught up with Nvidia's greatest strength while keeping the battery advantage. That's a 10/10 strategy.
Seanspeed@reddit
Ah yes video editing, the #1 use case for high powered GPU's.
Nvidia is definitely gonna be hot and bothered about their #1 market of video editing being outcompeted by Apple.
I dare say that Nvidia's next architecture will likely be heavily focused on retaking the video editing crown, being that it's their #1 market.
PhonesAddict98@reddit
I mean being faster in video has way more to do with the fact that Apple uses dedicated ProRes video decoders built into their chips, that devs optimise around for the video editing suites they offer. Nvidia has Nvenc and even then, with how windows has been lately, even the world’s fastest gpu can’t work around Microsoft’s cluttered mess of an OS.
kuddlesworth9419@reddit
You encode video on the CPU not the GPU unless you want shit quality anyway.
PhonesAddict98@reddit
And decode on the ProRes accelerators built into chip.
kuddlesworth9419@reddit
I've not tried that yet; I know they recently added it to HandBrake. Might give it a go, although my system is going to be really slow with it.
azvnza@reddit
Nvidia doesn't care about consumer chips anymore anyway
Lanky_Argument526@reddit
It's nice that you have selective reading glasses and completely ignored 3D rendering, which he also mentioned. Nvidia has had the lead in 3D workflows for Blender ever since the launch of Turing in 2018 due to their support for RT acceleration.
And until now, in Blender and Redshift and other rendering workloads, they've always had the performance crown. With the launch of the M5 Max, that is no longer the case.
And if you think the number of people buying Nvidia GPUs for rendering workflows is small, well, I've got a bridge to sell you. Pre-AI era, after gaming, 3D rendering was the primary reason Nvidia GPUs were highly desired.
HighestLevelRabbit@reddit
https://au.pcmag.com/gaming-1/116218/with-revenue-share-shrinking-does-nvidia-need-gaming-anymore
Nvidia profit segments, for anyone as interested as I was.
Seanspeed@reddit
I also want to say I wasn't really being serious, even ignoring the sarcasm. Obviously video editing or 3d rendering are important to plenty of people.
Lanky_Argument526@reddit
Again, why is this being posted in a comparison of CONSUMER GPUs? I guess everyone in VFX who uses Nvidia cards should throw theirs out, considering they're not included as a separate bar in Nvidia's shareholder presentation.
Nvidia may not care, but it still doesn't make it untrue that their GPUs lost the lead in one of their predominant fields: 3D rendering and VFX.
g33ksc13nt1st@reddit
sarcasm overload
NeroClaudius199907@reddit
Do you know how much they're making from AI at the moment? There's no way they're not going to maximize profits by serving Rubin to data centers first.
The shareholders won't allow anything else
gusthenewkid@reddit
They’re being sarcastic
NeroClaudius199907@reddit
No, because Nvidia's next architecture won't close the gap if Apple continues at their pace. The M6 is coming this year as well, and the M7 next year. Maybe another 30-70% at similar power.
I don't think Nvidia's next architecture will be able to close the gap. Apple wants to run the same strategy with GPUs as with CPUs: they've widened the gap with Intel/AMD/Qualcomm and they won't let them catch up.
niccolus@reddit
I find this somewhat disingenuous. If not for the RAM crisis, we would be getting Super variants now, with a 6000 series later this year/early next year. Those would be based on Vera Rubin, which is going into data centers this year. Meaning Nvidia could not only have caught up, but possibly passed the M5. Apple pushing ahead with the M5 is because their bottom line is affected by it.
Proof: the 512GB variant of the Mac Studio is gone. The M5 launch is as much about ensuring they continue to refresh in time for sales in the fall.
Nvidia didn't release a consumer-focused GPU because its bottom line is affected by AI more than gaming. Thus, Vera Rubin "AI Factories".
NeroClaudius199907@reddit
Coulda woulda shoulda.
Apple is releasing the M6 this year and will extend their lead in perf/W.
niccolus@reddit
Vera Rubin hardware is being released this year too: fall 2026. Just not as desktop GPUs. That would hand Nvidia a lead Apple would need the M8 to catch up to, IF the rumors of Vera Rubin's performance are accurate. So we can revisit this after CES 2027.
NeroClaudius199907@reddit
What are Vera Rubin's performance numbers?
Hour_Firefighter_707@reddit
And Nvidia cannot/will not compete with Apple in the mobile space. Their dGPUs will always be way less efficient due to GDDR and PCIe inefficiencies, and even when they come out with their SoCs, those still won't be as efficient because
a) Nvidia is never going to use the latest node for their consumer products while Apple always will, and
b) Apple favours big arse dies at low clock speeds while Nvidia GPUs are tiny by comparison and run at twice the clocks using way more power.
An M4 Pro is about the same rendering speed as an RTX 4060 (laptop and desktop, since they're the same chip) but is way bigger and way more expensive.
Nvidia still has a price advantage though, especially at the lower end
NeroClaudius199907@reddit
GDDR is actually an advantage: it gives you higher bandwidth and lets you use smaller chips.
The CPU matters more for battery life, since you're not going to be using the GPU 90% of the time, as long as idle power is low, like Panther Lake's.
A laptop would use the LP cores for background work, simple streaming, etc., and the GPU would be used for heavy tasks. I think with the MediaTek CPU & 5070 N1 releasing soon, it might be the best of both worlds.
PhonesAddict98@reddit
To be honest, the big advantage of Apple silicon is that the RAM is seen as a unified resource by both the CPU and the GPU, and both can tap into it simultaneously. That is the backbone of what's known as a Unified Memory Architecture, or UMA.
Most are familiar with this unified memory approach, as it's been in use in games consoles since the 8th generation. Devs optimising their apps around such architectures use the unified memory more efficiently and get a big performance boost, as the full memory bandwidth is available to both CPU and GPU and latency is so low it's virtually non-existent.
In Windows laptops and desktops, the GPU has its own VRAM (GDDR) that only it can access, while the CPU can only see and access DRAM (DDR), the main system memory. VRAM and DRAM don't have the same latency, nor the same memory bandwidth; a GPU will always be faster thanks to its massive parallelism, far higher core counts, and dramatically higher memory bandwidth than the CPU has access to. If we can get this type of Unified Memory Architecture on laptops and get app developers to optimise around it, we'll get similar benefits to what MacBooks have. It's doable on laptops; on desktops, though, it remains to be seen. Only if we get the big three to join forces and create an SoC with new variants released annually. Will that ever happen? Probably not at present.
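To put a rough number on why zero-copy matters: here's a toy estimate of what just staging data into VRAM costs on a discrete card. The scene size and link speed are illustrative assumptions, not measurements:

```python
# Toy cost of copying a scene into VRAM over PCIe vs. zero-copy unified memory.
scene_gb = 24            # assumed working set: geometry + textures
pcie5_x16_gbs = 64       # ~theoretical PCIe 5.0 x16 bandwidth in GB/s

upload_seconds = scene_gb / pcie5_x16_gbs
print(f"dGPU: ~{upload_seconds:.2f} s per full upload")  # ~0.38 s, paid again on every re-sync
print("UMA:  effectively 0 s -- CPU and GPU share the same physical pages")
```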
NeroClaudius199907@reddit
Unified RAM is a benefit, but to me the average consumer won't use it; it will probably be overkill.
The point of GDDR7 is that it lets Nvidia scale down aggressively. They could trim the bus to just 64-bit, pair it with 36 Gbps GDDR7, and achieve 288 GB/s of bandwidth, just shy of the M5 Pro's 307 GB/s.
This gives OEMs & Nvidia more room to cut costs. For example, if one consumer prefers a fast GPU and a relatively fast CPU, OEMs can pair Intel's & AMD's HX parts with the 64-bit GPU.
Or a slow CPU with a 384-bit GPU. If a component has issues, CPU or GPU, swap it out and the production line continues.
Basically, to me a dGPU allows for greater repurposing than an APU (at similar perf), which should lead to cheaper products.
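The math behind that 288 GB/s figure, for anyone checking: bandwidth is just bus width in bytes times the per-pin data rate. The second config below is an illustrative guess at a 5070-class part, not something from the article:

```python
# GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(64, 36))    # 288.0 -- the hypothetical 64-bit GDDR7 config
print(bandwidth_gbs(192, 28))   # 672.0 -- e.g. a 192-bit bus at 28 Gbps
```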
makar1@reddit
Upcoming Intel+Nvidia and Mediatek+Nvidia products are also APUs with CPU and GPU on the same package.
Nvidia don't even make 64-bit GPUs any more, as they can't compete with the performance of existing APUs.
F9-0021@reddit
Both AMD and Intel are pushing for APUs in the mobile space. I think one of the big leaps forward for APUs in the coming years will be with utilizing GDDR instead of regular DRAM, at least for high end APUs with huge GPUs. Regular DDR is already starting to hold back chips like Strix Halo.
FatBook-Air@reddit
An aside: the N64 had unified RAM, too. It was too early with it, I guess.
floydhwung@reddit
Worst of both worlds. You’d get the Windows-on-ARM experience without the excellent x86 driver support and plugin libraries that exist for Adobe Premiere and After Effects.
NeroClaudius199907@reddit
Rome wasn't built in a day.
If it's the way I imagine it to be, I don't see why Nvidia wouldn't throw more resources at it.
floydhwung@reddit
If it’s up to nvidia alone I’m sure they will. Spark is a great example of what can be done if nvidia has total control. But WoA and vendor support like Adobe are still going to be problematic and would take years if not decades to mature. With the current direction that Windows is heading, I am not that optimistic
From-UoM@reddit
They have a laptop SoC coming soon which will have a desktop RTX 5070-equivalent GPU.
likamuka@reddit
The bubble will burst by 2026.
Plank_With_A_Nail_In@reddit
It's already been 2026 for two and a bit months now; we're already 20% done with 2026.
upbeatchief@reddit
Oracle and OpenAI cancelled a $50 billion data center after they failed to secure funding from banks. If that isn't a bubble-burst alarm, I don't know what is.
Creepy_Accountant946@reddit
Old news that gets upvoted by Reddit bots and clueless redditors
upbeatchief@reddit
https://www.alphaspread.com/market-news/corporate-moves/oracle-openai-scrap-expansion-of-texas-ai-data-center-as-company-rebalances-ai-buildout
The cancellation of the project is from a week ago.
NeroClaudius199907@reddit
Inshallah
TheCh0rt@reddit
Yep, Nvidia may lose its consumer lead to Apple next gen. But I don’t think Nvidia cares much about consumers anymore
Specialist-Buffalo-8@reddit
Doesn't mean that Nvidia's chips would be worse than the M6 or M7 Max...
Even their current flagship graphics cards aren't 100% of the silicon die; they're cut-down, defective pieces not suitable for enterprise lol
obvithrowaway34434@reddit
You people seem to live in your own bubble. Wonder what happens if that one bursts first? Can you handle it, or will you keep living in complete denial like now?
NeroClaudius199907@reddit
Companies will go bankrupt when the bubble bursts, the government will cut interest rates, there will be job losses.
For Nvidia, they'll still have clients for DC products, and they'll reallocate resources to the client segment.
If it's a bubble, why shouldn't it burst sooner rather than later?
thefirelink@reddit
It's not even close to a 5090 laptop. Where are you seeing this?
Culbrelai@reddit
lol
Too bad for apple that Cuda is an industry standard and they don’t support it.
Oh, and Vulkan.
And Dx12.
It's irrelevant if it matches a 5070 when Apple silicon has no software support for the things GPUs do
Exepony@reddit
Cuda and DX12 are proprietary, obviously, and no one cares about Vulkan. Modern game engines have support for Metal, and we're seeing more and more ports come out. Seriously, at this point there are probably more games with Metal renderers than with Vulkan renderers out there.
There are also plenty of ways to inference ML models locally on Mac, which is the other big thing you might want to use a GPU for.
Why in the absolute fuck is this your benchmark? Emulation is more than enough for software from 2003.
xkaoticwolf@reddit
Cuda is from Nvidia, and Apple does try to support running software via Metal instead.
LastChancellor@reddit
i would get a Mac.... if I liked Mac keyboards at all
Mac keyboards don't have key travel; their keys just teleport between on and off
Hour_Firefighter_707@reddit
That's true. The Air is like typing on a cutting board. Pros are a bit better
DrBhu@reddit
That 5070 grew up pretty fast
grtk_brandon@reddit
This is my first time paying attention to Mac releases and the coverage has been an absolute mess so far. YouTubers are basically just spouting from the spec sheet. The benchmarks here are just incomplete since they're testing the Max in the smaller chassis and vice versa with the Pro. Finding Pro benchmarks in general is impossible thanks to Apple's naming convention (MacBook Pro M5 Pro??) and Google ruining search. Is it always this hard to find quality info?
low_end_AUS@reddit
...sounds like every Mac release ever.
996forever@reddit
Peaking at 72W but dropping to 44W means the 14” is overdue for a chassis redesign with much improved cooling and power delivery. A 44W SoC TDP is pathetic for a 14” laptop of that weight class in this day and age. Rivals can usually sustain almost 100W CPU+GPU combined in their balanced performance profile at a similar fan noise level.
Consistent-Leave7320@reddit
Or maybe Apple should stop increasing power consumption each gen? I'd rather efficiency not get worse.
996forever@reddit
It’s a universal trend to increase power consumption among all vendors, including Qualcomm, Intel, AMD, and Nvidia, because of the diminishing gains from new nodes. The TDPs of Macs are extremely low compared to others already.
Consistent-Leave7320@reddit
Yes, I know they all do it, but I was hoping Apple wouldn't follow. I'd rather have less progress in speed when it comes at the cost of battery.
996forever@reddit
Battery life is related to idle power management, not the maximum TDP of the device. Battery life has not regressed across generations despite the increase in maximum draw.
PlsDntPMme@reddit
Such a great point. Anyone opting for a Max 14” is just wasting money in comparison to the 16”.
996forever@reddit
It’s so bad. The difference between a zephyrus G14 vs G16 is much smaller than this.
SPACEXDG@reddit
Lmao, it won't get anywhere near 5070 results in gaming
TheNiebuhr@reddit
Such a misleading title by Notebookcheck; it's on par with the laptop-ized 8GB 5060 Ti. And let's not talk about die size...
WhoTheHeckKnowsWhy@reddit
And using Steel Nomad these days is very questionable, given you cannot really bench Nvidia GPUs with it anymore. The results are completely out of whack with actual gaming performance, as the CP2077 results show.
Zalack@reddit
Notebook check only got the 14” model, this is the 16” model.
Pleasant_Witness_113@reddit
In gaming. In 3D rendering workloads like Blender, it shits on it.
WingSK27@reddit
Nvidia really needs to bring back the "M" designation for their laptop models. Every time a headline like this comes out, for a split second you think they're talking about matching up to desktop GPUs. It plays right into Apple's disingenuous marketing too.
LoonSecIO@reddit
I can buy a strix halo laptop with 128gb of ram for $3k. Even if the M5 is a bit better hard to pay close to 1.5-2x for a similar outcome.
noiserr@reddit
And you can run Linux for much better developer experience (native Docker and fast filesystem), and still be able to game on it (thanks to Steam Deck).
LoonSecIO@reddit
I still much prefer MacOS for software development.
noiserr@reddit
There is nothing macOS is better at than Linux for software dev these days, and Linux you can customize infinitely.
LoonSecIO@reddit
Thought of one, by the way: you can unwrap a MacBook direct from the factory and be compliant to work in FedRAMP or other HITRUST spaces.
It's trivial to prove compliance with even the strictest end-user standards. Even easier than Windows, thanks to the universally used macOS Security Compliance Project.
noiserr@reddit
That's not a common use case though. I'm speaking in generalities.
LoonSecIO@reddit
Yes it is. Common control sets in SOC 2 and ISO, or even bidding on a government contract with CMMC.
noiserr@reddit
it's not
LoonSecIO@reddit
No, it's literally not uncommon. UK Cyber Essentials: all software installed on an end-user device must be explicitly allow-listed, and you have 2 weeks to patch all software. Try complying with that on Linux.
LoonSecIO@reddit
Prefer is a statement of personal choice.
satysat@reddit
You can buy a strix halo for $3k which will be slower, throttle faster, lose half its muscle while unplugged, have a worse screen, lose resale value faster and last you a couple of years less than a Macbook, yes.
Windows laptops are a horrible value proposition. If you need a laptop, get a Mac. Desktop? Probably PC.
LoonSecIO@reddit
I was making the point that it's a little unfair to compare a $6k MacBook Pro to a sub-$3k device. Strix Halo dollar-for-dollar outperforms it. The Strix Halo at $3k will outperform the same-priced $3k MacBook, which will likely put you at a 15c/16g M5 Pro with only 48GB of RAM. Even if you consider resale, you're better off putting that extra $3k into an ETF than trying to argue resale values on a machine that costs twice as much.
Sure go here... I will update the link when the M5 Pro and Max get released.
https://www.cpubenchmark.net/compare/6981vs6403vs6345vs6348/ARM-Apple-M5-10-Core-vs-AMD-Ryzen-AI-Max+-395-vs-Apple-M4-Pro-14-Core-vs-Apple-M4-Max-16-Core
Even with a 20% uplift from the M4 Pro to the M5 Pro, Strix Halo likely trashes it.
Also, the M5 Max thermally throttles in 14-inch MacBooks. You are seeing this in the threads right now: the 16-inch outperforms the 14-inch by a pretty good margin. Apple is back to its Intel days of hamstringing a device with inadequate cooling.
This also doesn't even get to the point that Apple is claiming to beat a CPU that was released well over a year ago... My Z Flow has been sitting on my desk, daily driven next to my Mac, for over a year. I actually prefer to run models on it compared to an M4 Max, simply because RAM quantity is king: whether a complex model runs at all on same-cost hardware is pass/fail. Sure, if I take, say, a quantized Qwen3 that can run in 30GB of RAM, it will run faster on my Mac than on the Z Flow, but on the Z Flow I can run a 2-3x larger model with 5-10x the context window, and that is far more important than raw token-generation speed.
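The RAM-is-king point is easy to quantify with the usual rule of thumb (weights only; KV cache and runtime overhead come on top, so treat these as floor estimates, and the model sizes are just illustrative):

```python
# Rough floor for model memory: params (billions) * bits per weight / 8 = GB.
def weights_gb(params_billions: float, bits_per_weight: int) -> float:
    return params_billions * bits_per_weight / 8

print(f"32B @ 4-bit: ~{weights_gb(32, 4):.0f} GB")  # ~16 GB -- fits a 30 GB budget
print(f"70B @ 4-bit: ~{weights_gb(70, 4):.0f} GB")  # ~35 GB -- wants the 128 GB box
print(f"70B @ 8-bit: ~{weights_gb(70, 8):.0f} GB")  # ~70 GB
```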
satysat@reddit
Yeah, fair, I always skip the 14 inch cause of poor temp performance.
But I guess we're simply coming from different places. I work in motion graphics, video editing, 3D, etc., and there's genuinely nothing that comes close to the M chips if you need mobility. Yes, some Windows laptops will render 3D faster, but that's it; everything else is consistently faster on Macs for me. Hell, my desktop 7950X is slower than my friend's M4 Max for After Effects.
I know that some workflows are well optimized on Windows, but as a rule of thumb, Macs are an insane value proposition at the moment. Especially with the recent price increases, I could spend more money on a desktop PC than on an M5 Max and get worse performance in several workflows.
sdwvit@reddit
Can it run Stalker 2 at 5070 levels of performance?
jocnews@reddit
Looks like it's actually slower than laptop RTX 5070 in games. Apple only seems to be faster in synthetics and some compute tasks in that test.
gabeandjanet@reddit
Well, that's not good at all; a laptop 5070 is equivalent to a desktop 4060 Ti...
So slower than that is very weak.
What a stupid headline lol.
Pleasant_Witness_113@reddit
I don't think anyone's buying a MacBook to game. The compute tasks are what matters.
jenny_905@reddit
No.
In gaming tests we've seen how Apple Silicon actually performs, and while it's competent, it's nowhere close to these claims.
Lanky_Argument526@reddit
Does Stalker 2 even have a native port? Max Weinbach tested Cyberpunk 2077 at 1440p with Psycho RT and no upscaling. It got 22.91 fps, compared to 26.95 fps for a 5080M.
sdwvit@reddit
S2 works really well through proton
GhostReddit@reddit
It's a shame Apple doesn't do anything to support games on MacOS, it's not like they don't have the capability.
LeoNatan@reddit
Cool story. If only it had games to play.
Fr1tzOS@reddit
A slightly misleading headline.
In some 3D rendering and video editing workflows the M5 Max might be on par with a 5070, which is impressive for an integrated GPU with a fraction of the power consumption.
But in gaming it still won’t touch it, assuming (as ever) that the game you want to play is even available on MacOS. Not that people buy a Mac for gaming.
deusXex@reddit
Until Apple comes up with a competitive alternative to NVIDIA's DLSS4, they are out of the game. M5 Max is also significantly more expensive than PC alternatives. So what is the point? Native resolution benchmarks are nice, but the world has moved on.
voyager256@reddit
Regular RTX 5070 or 5070 ti?
foxfox021@reddit
surpassing nvidia hardware..... so what? it cant even run gamez..... welp, good for productive works tho...
Stilgar314@reddit
So, can it run Monster Hunter Wilds?
msolace@reddit
At that price point, why would you avoid the Nvidia options though?
Apple ecosystem lock-in is a real downside...
I do love the Mini; when it was on sale for $499, that was a heck of a buy, and I have zero Apple loyalty.
jenny_905@reddit
Heard that before.
Not true in the real world.
ishsreddit@reddit
The 16 is still being tested. It uses around 40-50% less power than the 5090 MOBILE while maintaining 70-80% of the performance. That is INSANE lol.
Idk why people are trying to make it out as even better than it already is and making things up. Like my dudes, there is a hard-worked, well-written article right here.
Lanky_Argument526@reddit
The entire comparison is stupid. We have the M5 Max losing to the M4 Max in Blender, which just makes it sound extremely inaccurate. Every other reviewer has the M5 Max significantly faster than the M4 Max in Blender and other 3D workloads.
The decision to write a performance analysis article while comparing chips in different chassis is baffling.
IBM296@reddit
10-15% improvement from M4 Max. Looks like next year's M6 Max will be on par with the RTX 5080.
Lanky_Argument526@reddit
No, it's a lot higher. For some stupid reason, they chose to compare the M4 Max in the 16-inch to the M5 Max in the 14-inch, severely limiting its thermals.