Apple's M4 Max GPU (40 core) almost on par with (mobile) RTX 4080 in Blender, beats 3080 Ti
Posted by RealTaffyLewis@reddit | hardware | View on Reddit | 138 comments
Tenelia@reddit
For scientists and engineers, the bonus of the M4 is having all that unified memory available for large datasets or any kind of AI/ML work (ignoring that GenAI slop though). With a 32GB RAM machine, your NVIDIA GPU still only has 6GB of VRAM...
INITMalcanis@reddit
Amazing hardware. If only one didn't have to deal with Apple to use it.
JoshRTU@reddit
honest question, what does an average consumer have to "deal" with?
moofunk@reddit
Indirectly, API support/stability levels, so if you're using some 3rd party app that depends on OpenGL or other non-Apple API like that, the reliability will be subject to how well Apple decides to support it that week/month.
Of course OpenGL is being phased out now, but I do remember that support in 3D apps would vary quite a lot. I remember MODO being absolutely broken and crashy on the Mac Pro Trashcan; it was never fixed, and that made a lot of users angry.
I remember that the Intel chipset on my Macbook supported OpenGL 2.0 in Windows, but only 1.2 on MacOS, so I could not run certain apps that required OpenGL 2.0, unless I had bought a Mac with a dedicated GPU.
That said, I don't see why the same wouldn't be the case for supporting more modern non-Apple APIs.
JoshRTU@reddit
Are there any common use cases that would impact an average consumer?
moofunk@reddit
You keep asking about the "average consumer". The average consumer doesn't need a Mac. Anything that can run a web browser would be enough.
Ionic-Nova@reddit
Average delusional r/hardware power-user who thinks the average person just wants the equivalent of a $200 Chromebook and can’t comprehend the benefits of ultrabooks (Apple AND premium Windows devices)
moofunk@reddit
The term “average consumer” wasn’t defined, and made the question hard to answer. Anyone will eventually run into the idiosyncrasies of a platform, once you use it long enough.
I presented the ones I ran into, namely rather shaky driver support for 3D modeling tools, because they uniquely depended on Apple providing them, they didn’t do that reliably, and they didn’t fix bugs to benefit 3rd party apps for years and I don’t see that changing.
Ionic-Nova@reddit
Using big pseudo-intellectual words doesn't make your point any less stupid. In no possible way is an "average consumer" someone who actively worries about or experiences what you mentioned previously (however valid it may be in your use-case).
And to imply that the average consumer just needs/wants e-junk that can simply browse the web is hilariously out of touch with reality. The majority of college students own ultrabooks like Macbooks, Dell XPS, Lenovo Carbon X1s, Microsoft Surface Laptops, etc. People prefer using devices with top tier keyboards, trackpads, and screens. Shocker.
moofunk@reddit
These two statements are really in conflict. Who is the "average consumer" of a base MacBook Air or a $15,000 Mac Pro? Apple's product spread and the types of people they sell to are fairly wide. You will run into different issues as a low-end or high-end user over time.
That's why the question isn't really answerable with anything but what I said.
You're just talking about build quality, which is unquestionably high for Apple, but if your Mac can't run the software without investing in pricey VM solutions, what's the point? And now with ARM Macs, do they run the x86 software for Windows or don't they? That's a degree of knowledge you don't need for a Windows machine.
Anecdotally, as an engineer and software developer for over 25 years, I didn't experience many issues with my Apple machines professionally. It was in the hobby area that it fell apart over the course of 7 years.
Ionic-Nova@reddit
Having to create a hypothetical of what an “average consumer” entails should indicate how nonsensical your argument is.
The average consumer primarily uses web based applications and suites like Adobe which are supported.
There are less than a million licensed engineers in the US. That’s not even remotely close to “average”.
moofunk@reddit
Do you have anything more interesting than that to say?
Ionic-Nova@reddit
Nope! You are just very out of touch with reality if you think the average person will have to deal with any of the things you mentioned!
Please go outside and touch grass 😁
moofunk@reddit
Same goes back to you. You are wasting far too much time on this.
Ionic-Nova@reddit
Your mental gymnastics go so far as to call 30-second responses “wasting too much time”
Lol
moofunk@reddit
Go away kid. Blocked.
skinlo@reddit
You obviously go to a college with people with wealthy parents. That's not an average situation. Most people watch YouTube, check emails and maybe browse the internet a little. I got through uni fine with a £500 laptop a decade ago, as did everyone else.
Ionic-Nova@reddit
How presumptuous and wildly incorrect of you.
500 British pounds in 2014 is worth 700 British pounds now, which is roughly $900 USD. Every single ultrabook I mentioned can be purchased for that price.
Not sure why there’s this completely uninformed misconception on tech forums that MacBooks are wildly expensive. Entry level MacBooks, which are good enough for the vast majority of “average users” regularly are on sale for $800-900. Most premium windows laptops are priced the same.
Your laptop from a decade ago is quite literally the same price when accounting for inflation.
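A quick back-of-envelope of that inflation claim, as a sketch (the ~40% cumulative UK inflation figure and the exchange rate are rough assumptions, not official numbers):

```python
# Rough back-of-envelope for the inflation claim above.
# Assumed figures (not official): ~40% cumulative UK inflation 2014-2024,
# and an exchange rate of ~1.28 USD per GBP.
price_2014_gbp = 500
cumulative_inflation = 0.40   # assumption
usd_per_gbp = 1.28            # assumption

price_today_gbp = price_2014_gbp * (1 + cumulative_inflation)
price_today_usd = price_today_gbp * usd_per_gbp

print(f"£{price_2014_gbp} in 2014 ≈ £{price_today_gbp:.0f} today ≈ ${price_today_usd:.0f}")
# £500 in 2014 ≈ £700 today ≈ $896
```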
INITMalcanis@reddit
I buy hardware for me, not for them
JoshRTU@reddit
fair, but then it would be "I didn't have to deal with" and not "One". Also what specific problem does using Apple cause you?
INITMalcanis@reddit
Specifically that it can't run the OS I prefer.
Also I resent the absolutely unjustifiable price-gouging on decent amounts of RAM. I looked at the M4 mini by way of comparison, and an upcharge of £210 to increase RAM by 8GB is insane.
Also I despise their practice of essentially eliminating 3rd party repair, making the hardware basically disposable once the warranty period is over.
monkeyDwragon@reddit
Use virtualisation, it runs Windows/Linux pretty fast
INITMalcanis@reddit
Or I could not use virtualisation and just run Linux pretty fast on x86 hardware. And add 8GB of RAM for less than £210 if I feel like it.
pc0999@reddit
I really dislike MacOS, many times it is pricier than the competition, it does not play well with many accessories, it is too much of a closed ecosystem...
zerostyle@reddit
Do you want to pay $800 for 12tb of storage for your photos instead or $150?
JoshRTU@reddit
What average customer needs 2TB for?
INITMalcanis@reddit
what average customer needs an M4 at all for?
account312@reddit
Churning through the piles of JavaScript between them and the comment section.
zerostyle@reddit
Fine, do 512GB or 1TB then. Also incredibly expensive per GB
titan_hs_2@reddit
The same consumer that would need RTX 3080 Ti rendering speed in Blender, so me. Lots of media files and cache
account312@reddit
I don't store photos on my desktops/laptops, I keep them on my NAS. And all OEMs charge lots for RAM.
zerostyle@reddit
Even then, 256GB is a tiny amount of working storage if you're doing anything semi professional (photo/video editing, large app installs, any type of game, etc).
Ploddit@reddit
A solid product and great customer service? Horrifying.
Ionic-Nova@reddit
Redditors constantly nagging them for making the “wrong” purchase /s
cloud_t@reddit
Apple and the price tag.
kiwiiHD@reddit
The price tag is 599 or 499 with discount, not exactly steep.
cloud_t@reddit
What? Are we talking about buying an m4 max chip (and mandatory computer around it)?
kiwiiHD@reddit
Eh whatever
Probably still a good deal and you were never going to buy one in the first place
cloud_t@reddit
I was adding that one has to deal with Apple (to use the M4 Max, as the other user said) and the steep price tag. You can buy a 4080 laptop way cheaper than the cheapest M4 Max MBP (and you actually need the 16" model to even take advantage of it, plus more RAM than stock, so factor in a good extra 2k over the already 1.5k extra).
Aggressive_Ask89144@reddit
And if you need all of that performance anyway for 3.5k.
Insert 4090 and Ada cards. You can take a 9950X and chock it full of the RAM you need as well. Portability can honestly be solved by simply using a nicer ITX case tbh. I'm not sure why you would need such power on a laptop and be constantly moving with it to the point where the laptop form factor is a necessity.
Personally, I just have a workstation + business class laptop. The 5825U has enough spice for anything I need to do while I'm out and about, comes with 16 gigs of RAM, SSD, and so on, and it was 299. Absolutely wonderful machine considering the laptop I used until a few months ago had a... Celeron 3060 💀
cloud_t@reddit
Yep
dampflokfreund@reddit
Let's hope Nvidia really makes an M4 Max competitor next year with great emulation and lots of native arm software. So, so much potential.
crystalchuck@reddit
Not gonna happen. Whose cores would they even use? The only one remotely in the same league would be Qualcomm's, however that's excluded right off the bat.
Wrong-Historian@reddit
Lol. You want to go from having to deal with Apple to having to deal with Nvidia? There is like no improvement in that
dampflokfreund@reddit
I just care for good products, not the price. Nvidia would ensure good emulation layers and software support.
skycake10@reddit
MacOS really isn't all that closed off, there just aren't that many games available.
Wrong-Historian@reddit
cries in not having a different browser engine on iOS
ChemicalCattle1598@reddit
AMD could have done this. Jim Keller worked on such a processor.... If they had followed through, it could be as mature as Ryzen. Probably like M5 going on M6 performance, give or take.
Raikaru@reddit
Don't really think that's probable though. If Nvidia creates a M4 Max tier processor with unified memory and all that wouldn't it just literally cannibalize a lot of their consumer LLM/Blender/Anything high VRAM market? Even paying Apple Tax is less expensive than buying GPUs with massive VRAM.
dampflokfreund@reddit
It would not be cheap of course. But there's a clear demand for high performance laptops with great battery life and efficiency.
996forever@reddit
What makes you think nvidia is interested in big die low margin consumer market lol
battler624@reddit
Honestly if only it supported Vulkan too instead of just metal.
CarbonatedPancakes@reddit
Lot of comments focused on the price, but the bigger story is that it can do this without turning the laptop containing it into a brick (and doubling the size of the power adapter) and/or making the machine scream like a banshee when you’re actually using that power.
I owned an Asus G15 with a mobile 3080 Ti in it for a bit, but I couldn’t deal with the thing constantly sounding like its fans were struggling to keep it from frying itself alive, so I returned it and used the money to build a desktop instead.
Obviously MacBooks aren’t gaming laptops but I’d kill for a reasonably thin and light gaming laptop that could stay quiet when in use.
Also, if this is just the Max, what’s the Ultra (which should double the GPU core count) in an M4 Mac Studio/Pro going to be capable of? With how much RAM those machines can be configured with it’s going to be nuts.
sk3tchcom@reddit
I travel with a MacBook Air M2 and a gaming laptop - together - in one laptop bag.
Windows thin & light gaming laptops have been viable for years. The most recent ones are incredible. Earlier this year I had an ASUS Zephyrus G14 RTX 4060 and I currently run a Razer Blade 14 2024 RTX 4070.
dampflokfreund@reddit
Even the G14 and Razer Blade only last 1-2 hours while doing demanding work. They are loud and framerates completely tank when you unplug them. The MacBook keeps its performance and usually lasts 3-4 hours in demanding workloads.
sk3tchcom@reddit
You can’t play games on a MacBook. I never use my windows gaming laptops unplugged - no point. I game at the airport in the lounge or in my hotel room.
dampflokfreund@reddit
There's native games on Mac. Not many, but you can play games on a Mac these days.
And now imagine you would not have to carry around huge charging bricks, not have to search for a plug in that airport lounge, not bother other passengers with your literal jet turbine, and still get great performance. And also never running out of VRAM (8 GB is very tight these days)
Next year, Nvidia might release an M4 Max competitor with unified memory. It's a very bad time to buy gaming laptops now.
sk3tchcom@reddit
GaN chargers ftw - just toss it in my luggage.
I’d love to have one device but I don’t foresee it in the near future. Especially because I want my work stuff on one laptop and my stuff separate.
8GB VRAM as an issue is overblown.
lcirufe@reddit
The killer app of the Macs though is it can do all that while being relatively quiet and cool. And with unmatched power efficiency. Windows laptops aren’t quite there yet unfortunately.
sk3tchcom@reddit
Totally agree - but you can’t play any of the big games on Mac reasonably well. MacBook is a requirement for me at work.
thicchamsterlover@reddit
With the adoption of Thunderbolt 5 in Windows machines, there are two very compelling options in the future for sleek power laptops. On the one side you have the MacBook with this extreme power in a very small and elegant package, though program compatibility and gaming is a tough one (for a Windows user that is). On the other hand we might just be able to have a slim Windows laptop with a powerful CPU that‘s able to not drain the whole battery when watching a video, connected to a well cooled, full size eGPU via Thunderbolt 5, if the need for heavy tasks arises.
I am super excited and hope that this can be Windows‘ answer to the MacBooks… otherwise it‘ll be my time to switch.
g-nice4liief@reddit
That has been the case for quite a while now on Windows laptops. It just depends on where you're shopping.
BrotherO4@reddit
Yeah, but to get silence you tend to get a weaker laptop. With Apple you get everything... everything but gaming.
g-nice4liief@reddit
Laptops nowadays come with liquid metal applied from the factory. Like I said before, it depends on where you shop. There are plenty of boutique laptop builders where you can customize the build of your machine.
devinprocess@reddit
I don’t know, looking at the new M4 stuff, the “banshee” is coming to Apple as well and everyone is figuring new ways to defend it.
Meanwhile non-Apple laptops are slowly getting cooler and quieter.
SudoUsr2001@reddit
Check out XMG. 140W 4070, Ryzen 8845HS or 14900HX. Core 15/Fusion 15.
Paraphrasing_@reddit
Okay, and how do these compare in terms of pricing?
Crazyirishwrencher@reddit
The cheapest 4080 laptop I could quickly find was just under $2000 USD, while the M4 Max starts at $3700. I assume the more impressive comparison is accomplishing this speed while using so much less power/generating so much less heat. Plus, you could configure the M4 Mac with 128gb unified ram and have something really special in a portable package.
TwelveSilverSwords@reddit
Also that MacBook Pro probably has a better screen and speakers than that $2000 4080 laptop.
jonydevidson@reddit
There are no laptop speakers like the ones in MBP from M1 onwards. There's just no competition in how clear and stereophonically accurate they are.
As for the screen, it's a 1000-nit full-window array-lit IPS with wide viewing angles. I have seen many screens in my life and this one absolutely takes the cake. Only OLED screens come close in text clarity and contrast (ASUS array-lit Nebula screens have huge contrast issues with text and fine details), but laptop OLEDs are no more than 250 nits full-window.
OppositeArugula3527@reddit
The OLEDs on the Asus are way better than any MBP screen. OLED is a must these days.
jonydevidson@reddit
I have an OLED TV and an OLED desktop monitor and the MBP stands proudly next to them in colors and contrast. Blooming is only visible in white-on-black scenarios which are almost non-existent except for some movie titles or credits.
Motion is bad, around 50ms black-to-white response. I don't game on it so I don't really notice it. The impact of 1000 nits full-window cannot be overstated.
I don't think you've heard the speakers on the recent Macs. There's just nothing like it in the laptop market.
OppositeArugula3527@reddit
You're lying through your teeth. OLED is technically superior to LED. That's not for you to say anecdotally. It's a fact lol.
ReiBacalhau@reddit
You could remove the speakers from laptops and most people wouldn't care; everyone has buds or headphones
jonydevidson@reddit
False.
996forever@reddit
For the price you can get a laptop 4090 with a proper display without dogshit response times, though.
Glebun@reddit
They fixed that on the new ones
NeroClaudius199907@reddit
No its still 70ms
Glebun@reddit
I've heard that they quietly upgraded the panel, resulting in better response times among other things - do you know what it was before? Or is it the same?
NeroClaudius199907@reddit
Nvm, it's actually 41ms
https://www.notebookcheck.net/Apple-MacBook-Pro-16-2024-review-Enormous-battery-life-and-better-performance-of-the-M4-Pro.917793.0.html
Glebun@reddit
Is that better than before?
marcanthonyoficial@reddit
response times are irrelevant in a mobile workstation
HIGH_PRESSURE_TOILET@reddit
Yeah color accuracy and resolution are more important and the macbooks outshine most laptops in these regards
marcanthonyoficial@reddit
and brightness, where macs are best in class
MuTron1@reddit
Also, in a mobile workstation where it’s assumed that you’ll be spending a lot of time with a desktop OS onscreen, the fact that it’s not an OLED is also quite relevant. Burn in is still a problem
GreenDifference@reddit
no one cares about speakers on a laptop
Crazyirishwrencher@reddit
Maybe. The 14" form factor can be pretty limiting. Didn't look at the larger MacBook.
therinwhitten@reddit
Yeah, but Windows laptop manufacturers went OLED because you would buy a 4K Lenovo Legion 9i and get a MiniLED with bloom issues and microstutters.
Say what you want about Macs, their QC is miles ahead of others.
CarbonatedPancakes@reddit
I wish more laptop manufacturers took QC more seriously. It sucks that things like case panel fitment, screen panel light bleed, etc are a lottery for so many laptops.
therinwhitten@reddit
Def agree.
Deep90@reddit
I can't imagine what Apple charges for 128GB of RAM.
cloud_t@reddit
128GB is going to set you back another 3k at least, off the top of my head from what Apple usually charges for upgrades at checkout. I assume this is what you meant by "truly special", as in truly screwing with your wallet.
Rudradev715@reddit
https://www.bhphotovideo.com/c/product/1811267-REG/lenovo_83de0008us_16_legion_pro_7.html
4080 2k
4090 2.5k
CJKay93@reddit
I mean, the battery life on this has got to be at most like 5 hours while doing basic web browsing.
devinprocess@reddit
You don’t get 24 hours running blender so it would be nice if people were accurate.
On my m2 pro mbp I don’t even make 14 hrs if I am constantly editing or running something intensive like Xcode and blender.
CJKay93@reddit
You say that as if you'd get five hours doing that on a Legion lol. My Zephyrus only gets an hour on battery while doing anything even remotely challenging, if I'm lucky.
pluckyvirus@reddit
That is the sale price though isn’t it?
Rudradev715@reddit
Yes even after tax it is still cheaper than M4 max
MuTron1@reddit
Except one gets you a GPU and the other gets you a laptop with best in class build quality.
You can’t really compare the price of a component vs the price of a premium build laptop
Rudradev715@reddit
ok
Large_Armadillo@reddit
If only Apple could get those pesky game developers to port their games, no one would buy windows machines.
Coolider@reddit
Too little. With the 5090 on the horizon bringing 2x if not ~3x the perf of the M4 Max, it's Apple who's playing catch-up here. Unless the M5 Max somehow brings a 2x improvement next year (which is very unlikely looking at gen-to-gen records, and considering that Apple's power budget is very limited), it will still lag far behind the offerings from NVIDIA.
UGMadness@reddit
Apple still hasn't revealed the M4 Ultra (presumably) that will be used on the highest SKU Mac Studio and Mac Pro.
NeroClaudius199907@reddit
If the M4 Ultra scales the same way the M1 and M2 Ultras did, then we can already guesstimate where perf will land.
OkBrick4260@reddit
Out of curiosity, if you're comparing x090, shouldn't you compare with m4 ultra? (which isn't out either?)
Coolider@reddit
Thing is, it's quite possible that even the 5090 Laptop will bring over a 2x improvement over the 4090 Laptop, which will be extremely hard for Apple to catch up with.
moofunk@reddit
On a timescale longer than 1 GPU lifetime, say, over the next 7 years, you'll be buying 2 or 3 Nvidia GPUs to get renewed performance, where on the Apple side, you'd need to entirely replace the machine to do the same.
Even if M-chips eventually match or slightly surpass the fastest Nvidia GPUs, the ongoing replacement costs to maintain top performance over the years are wildly prohibitive.
nuke_489@reddit
Apple has improved a lot in the GPU department. Although I wonder how much difference the process node actually makes (N5 vs N3E).
Zednot123@reddit
Just compare other companies against themselves. The GTX 980 Ti has a 25% larger die than the 1080 Ti and delivers roughly half the performance.
Not every node delivers as big a GPU performance gain as the jump from 28nm to 16nm did, but it is generally substantial. Transistor budget and efficiency gains are the pillars of GPU scaling.
ThankGodImBipolar@reddit
Remember that this was technically a double node jump due to TSMC fumbling their 20nm process, which I don’t think ever made (or mass produced, not sure) any chips bigger than what you’d put in a smartphone. If anything, that jump was actually less pronounced than you’d expect because Maxwell was already so efficient on 28nm when compared to Kepler. Pascal was an extension of that.
Zednot123@reddit
No, it was not.
16nm FF is based on 20nm. It is essentially 20nm with FF transistors rather than planar, which is supported by the density numbers.
It's the main reason why Intel 14nm was so superior compared to Samsung and TSMC's nodes this generation. They simply were not the same generation at all.
PhilosopherJust3075@reddit
Did some digging on your comment. The GTX 980 Ti does have a 25% larger die compared to the GTX 1080 Ti, but you ignore many other factors.
GTX 980 Ti (Maxwell Architecture): Die Size: 601 mm². Process Node: 28nm (TSMC). Transistor Count: 8 billion.
GTX 1080 Ti (Pascal Architecture): Die Size: 471 mm². Process Node: 16nm (TSMC). Transistor Count: 12 billion.
The GTX 980 Ti's die is roughly 27.6% larger than the GTX 1080 Ti's die, mainly due to the older 28nm manufacturing process, which required more space for fewer transistors.
Performance gap (FP32): GTX 980 Ti - 5.6 TFLOPS; GTX 1080 Ti - 11.3 TFLOPS.
That's nearly double the FP32 performance of the 980 Ti.
This massive performance boost was due to the improved Pascal architecture, higher clock speeds, more CUDA cores, and the smaller, more efficient 16nm process. The GTX 980 Ti's larger die size came from the inefficiency of the 28nm process, which limited transistor density and increased power consumption.
The GTX 1080 Ti, with a smaller die, benefited from the 16nm node, which allowed a higher transistor count and better energy efficiency, resulting in much higher performance.
Now we have the RTX 4080 Mobile and the M4 Max 40C:
M4 Max 40C: Die Size: ~200-250 mm² (unconfirmed). Process Node: 3nm (TSMC). Transistor Count: 28 billion. Density: ~197-216 million / mm² (unconfirmed).
RTX 4080 Mobile: Die Size: 294 mm². Process Node: 5nm (TSMC). Transistor Count: ~35.8 billion. Density: ~121.8 million / mm².
The RTX 4080M does have a larger die size (~20-30% larger) and more transistors (~28% more), but it operates on a less advanced 5nm process compared to the 3nm process of the M4 Max. Then you have all the other tech, like CUDA and RT.
The M4 Max benefits from the 3nm process, which increases transistor density, reduces energy leakage, and improves thermal performance. With a denser chip, you spend less energy per operation since the signals travel shorter distances. Larger density also means Apple can integrate more specialized components like AI accelerators, video and rendering units, or additional cache within a compact die size.
I do wonder what RTX 5000 series would turn to be, maybe Nvidia can give larger VRAM size rather than being cheap pricks.
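As a quick sanity check on the Nvidia figures quoted above (the ones with confirmed die sizes), here's a small sketch that just divides transistor count by die area to show how much density each node jump bought:

```python
# Transistor density = transistor count / die area, using the figures quoted above.
# Only the Nvidia chips are included because their die sizes are confirmed;
# the M4 Max die size is still an unconfirmed estimate.
chips = {
    "GTX 980 Ti (28nm)":     (8.0e9, 601),    # (transistors, die area in mm^2)
    "GTX 1080 Ti (16nm)":    (12.0e9, 471),
    "RTX 4080 Mobile (5nm)": (35.8e9, 294),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6   # millions of transistors per mm^2
    print(f"{name}: ~{density:.1f}M transistors/mm^2")
# ~13.3, ~25.5, and ~121.8 M/mm^2 respectively
```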
Zednot123@reddit
Transistor counts on GPUs are now much harder to compare due to cache. Cache has higher transistor density than logic, and MUCH higher than I/O like memory controllers.
Ada built with the same cache amount as Ampere would have a considerably lower transistor count.
PhilosopherJust3075@reddit
Oh, didn't know that info.
chapstickbomber@reddit
AMD out here with 4nm CCD and 6nm last gen IO, Mike Vining ass chip.
996forever@reddit
The process node difference probably already helped with the massive difference in power draw.
Meekois@reddit
Man. At this point, I'm not going to buy another GPU. Just a Mac studio...
pc0999@reddit
I really would love a similar piece of hardware from the competition.
I want this power/form factor, and especially the energy efficiency, on Linux. And without the Apple tax.
spiteful_fly@reddit
It's a real shame that CAMM2 will only have a max bus width of 128 bits. We're not going to see good GPUs like this on non-Apple machines for a long time. At this point, my dream would be to have an SoC like Apple's, but using RISC-V with a 256 to 512 bit memory bus to compete.
Kryohi@reddit
LPCAMM2 supports a 256-bit width, using two modules. And LPDDR6 LPCAMM should increase this limit to 384-bit.
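For a sense of why the bus width matters, a rough peak-bandwidth sketch (the LPDDR5X-8533 transfer rate is an assumption; actual platforms vary):

```python
# Peak theoretical bandwidth = (bus width in bits / 8) * transfer rate in MT/s.
# Assumes LPDDR5X-8533 as the transfer rate; real platforms vary.
transfer_rate_mt_s = 8533  # assumption

configs = [
    ("CAMM2 / LPCAMM2, one module (128-bit)", 128),
    ("LPCAMM2, two modules (256-bit)", 256),
    ("M4 Max-class wide bus (512-bit)", 512),
]

for label, bus_width_bits in configs:
    gb_per_s = bus_width_bits / 8 * transfer_rate_mt_s / 1000
    print(f"{label}: ~{gb_per_s:.0f} GB/s")
# ~137, ~273, and ~546 GB/s respectively
```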
spiteful_fly@reddit
I am hopeful, but skeptical about the industry adopting multiple CAMM2 modules, because space is at a premium. I can now understand why manufacturers started soldering the memory chips onto the board. I've been wondering if it would make more sense if we had sockets for inserting memory chips onto the board directly.
Famous_Wolverine3203@reddit
It seems Apple is the only possible alternative to Nvidia in the 3D rendering space at a much higher price. The fastest AMD GPU is behind the M4 Max. AMD needs to dedicate more resources toward its RT cores.
But looking at the PS5 Pro, it doesn’t seem to be enough.
randomIndividual21@reddit
Last time they bragged about benchmarks, with the M2, I recall their non-mobile games still running horribly.
Kant-fan@reddit
Intel Arc GPUs actually had decent raytracing, especially when compared to AMD's RX 6000 series at the time, and they improved it with Battlemage in LNL as well, but I doubt they'll be competitive when their dGPUs launch in the coming months.
Successful_Bowler728@reddit
I was looking for some tests of the M4 but I can't find any real test: an M4 on a desk running a workload, with video footage of the M4's screen.
Tradeoffer69@reddit
Would the almost-par score be because the M4 Max ran a lot fewer tests than the 4080? Or shouldn't that affect its metrics?
ayoblub@reddit
Desktop 4070
skycake10@reddit
It shouldn't, it says "Median Score" for the ranking
johannbl@reddit
I wonder how much of this outcome is tied to software optimization, since Apple Silicon is still new… and how much more performance could be obtained by giving the chip more power.
Darlokt@reddit
Very little, based on the previous M series chips. Apple are amazing at building their chips for the low power envelope they need, which gives them amazing performance per watt etc., but the downside is that they don’t scale to higher wattages/frequencies. There are fundamental microarchitecture differences between high frequency, high power chips and low power, high efficiency chips.
iCruiser7@reddit
What? Apple has been steadily increasing their clock frequency and power draw with each generation. M4 now sits at 4.5GHz compared to M1 which could only do 3.2GHz. Geekerwan's review highlighted this trend.
johannbl@reddit
Ah ok interesting. In some cases it’s way more interesting to focus on lower power usage as it opens up use cases.
Kursem_v2@reddit
no, it's actually tied to the accelerators inside the M series GPU that are specifically tailored for tasks macOS users are likely to run, which can be fully utilized by Adobe for their editing software.
that doesn't mean M4 Max is faster than RTX 3080 Ti, it's just competitive in some tasks.
bik1230@reddit
The vast majority of benchmarks only use the CPU and GPU. This Blender benchmark, for example, uses the GPU. It is entirely comparable to the RTX 3080 Ti.
Kursem_v2@reddit
in this regard, yes, as Blender is an actual application, so its benchmark reflects the performance you'd expect on an actual workload.
you reiterate what I'm saying, tho.
johannbl@reddit
Ah! I assumed adobe and blender would have been part of the same type of task regarding this. Is gpu 3d rendering that much different from how after effects uses the hardware? Maybe that’s why I’ve always found adobe tools to be slow at what they are supposed to do?
Greenecake@reddit
They've come a long way from the pretty poorly performing M1 GPUs in Blender.