Samsung Exynos 2600 Reportedly Boasts 75% GPU Lead Over Apple A19 in Internal Testing
Posted by raill_down@reddit | hardware | 77 comments
FollowingFeisty5321@reddit
Presumably a lot of this benefit will actually be derived from 2nm and LPDDR6, which Apple will also have soon enough.
-protonsandneutrons-@reddit
Early LPDDR6 bins aren't likely a major speed improvement over the newest LPDDR5X bins. Maybe power savings, but Samsung is rarely one to exploit those well.
Qualcomm and MediaTek are both launching 2026 chipsets with LPDDR5X. Maybe Samsung Semiconductor is giving Samsung Electronics a leg up.
Plus-Candidate-2940@reddit
They should just use LPDDR5T
-protonsandneutrons-@reddit
IIRC, they already are.
LPDDR5X-8533 - original LPDDR5X max bin
LPDDR5T-9600 - adopted into JEDEC as LPDDR5X-9600
10700 and 12700 bins, IIRC, are not in JEDEC LPDDR5X.
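For scale, the peak-bandwidth arithmetic for those bins works out as below. This is a quick sketch; the 64-bit (4x16-bit) phone bus configuration is an assumption.

```python
# Peak bandwidth for the LPDDR5X bins above, assuming the usual phone
# configuration of a 64-bit bus (4x16-bit channels).
BUS_BITS = 64
for mts in (8533, 9600, 10700, 12700):
    gb_s = mts * 1e6 * BUS_BITS / 8 / 1e9  # MT/s x bytes per transfer
    print(f"LPDDR5X-{mts}: {gb_s:.1f} GB/s peak")
```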
MissionInfluence123@reddit
Samsung renamed its second-gen SF3 to SF2, and although the first gen is indeed better than SF4, it's not much better than TSMC's N4P.
Geddagod@reddit
Apple (and TSMC) very likely already has the benefit of Samsung's "2nm" with N3P.
DerpSenpai@reddit
Samsung might have a power benefit but lower density
Educational_Yard_326@reddit
Or just 3x the wattage
badmintonGuy45@reddit
This is a nothing-burger clickbait article. Any chip can post faster performance if given more wattage. Exynos chips are terrible and have bad performance; it's why people prefer Snapdragon CPUs. What needs to be looked at is performance per watt.
Apple_The_Chicken@reddit
The Samsung fabs have improved massively
Tasty-Traffic-680@reddit
Using a Samsung fabbed snapdragon in my phone right now... So figure that one out.
Apple_The_Chicken@reddit
Both the SD888 and the SD8 Gen 1 (which were the last two flagship Snapdragon chips on Samsung fabs) were awful and overheated like crazy. The fab matters tremendously.
ChoyLeh@reddit
Let's not forget the Exynos 990 in my S20+. It overheats easily, even from scrolling the Play Store or FB. Buying Exynos is like gambling: there may be a good one, but there's a chance you'll receive a bad one too.
Tasty-Traffic-680@reddit
Has max power draw been reduced, or are phones just equipped with better cooling since then? The only reason the SD8G5-equipped phone Geekerwan tested was able to sustain 18 watts was that it had liquid cooling/a heat spreader, an active cooling fan built in, and another cooler strapped to the back. No way that's happening on a mainstream non-gaming phone.
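For a sense of what 18 W sustained means for a phone, here's a back-of-envelope runtime calculation; the battery capacity and nominal cell voltage are assumptions.

```python
# Back-of-envelope: runtime of a typical phone battery at 18 W sustained.
# 5000 mAh capacity and 3.85 V nominal cell voltage are assumptions.
capacity_wh = 5.0 * 3.85          # 5000 mAh = 5.0 Ah, times nominal volts
draw_w = 18.0
print(f"~{capacity_wh / draw_w * 60:.0f} minutes to empty, ignoring losses")
```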
Apple_The_Chicken@reddit
The 8 Elite is a monster power consumer, but a monster performer likewise. You technically can get to that level of consumption, but regular tasks consume much less, hence the gradual improvements in battery life and thermals we've been seeing.
I switched from an SD888 S21 FE to an 8 Elite Xiaomi 15. Granted, the newer one has a vapor chamber, but it's almost cool to the touch at graphics settings with which my last phone overheated to the point I couldn't hold it.
I can also max out all graphic settings on some games on a 30º day and it'll also get dangerously hot, but why would I do that?
InformalAd202@reddit
Those were 4nm; "allegedly" the newer ones are much better.
TotalManufacturer669@reddit
Except there is also no source for this "news" except Samsung's own press release.
Every time Samsung is about to release a new node or a new Exynos, there are always "leaks" stating how good they are going to be this time. And still people like you fall for it rofl.
raill_down@reddit (OP)
Tesla recently chose them for AI5 and AI6. A bunch of startups are choosing them too.
Their 4nm is also on a roll, getting orders from Hyundai, the Exynos 2500, and the Qualcomm 6s 4th Gen.
MissionInfluence123@reddit
Yes, Tesla chose them but it's actually dual-sourced with TSMC.
raill_down@reddit (OP)
Not AI6. That's exclusively Samsung
Geddagod@reddit
Has Tesla historically used leading edge nodes?
If it's anything like Tenstorrent choosing Samsung, it could very well be that Samsung had invested in the company, making them much more likely to choose Samsung.
All I've seen was Nvidia saying Samsung Foundry was their partner, nothing for specifically 2nm or anything.
For cheap, lower-end stuff. Which isn't bad per se, but doesn't tell us much about their leading edge stuff either.
Also Exynos 2500 is Samsung's 3nm node too. But it also should be considered an "internal" customer, like how Intel's server products use IFS.
MissionInfluence123@reddit
I'm surprised that Geekerwan has no review of it.
Geddagod@reddit
yeah, only rice reviews on bilibili has one.
Apple_The_Chicken@reddit
I'm not basing this on Samsung press, I'm basing this on reports I've seen of their processes having caught up with tsmc. I can't find the source anymore, I tried, sorry.
ML7777777@reddit
I love how people demand a source when it's positive but don't give a damn if it's negative.
Plus-Candidate-2940@reddit
Because people are biased. You don't have a bias or preference?
Geddagod@reddit
There was plenty of skepticism around the "18a 10% yield" negative rumors.
But also, Samsung (and Intel) has a decently long history of problems on their new nodes. So is it really surprising that people would be more skeptical of good news than bad news?
badmintonGuy45@reddit
source?
Olde94@reddit
Breaking news! Old i7-2600k still on par with modern computer chips!! (compares 95W desktop cpu with 7W laptop cpu)
Death2RNGesus@reddit
It won't be on par when comparing IPC.
Olde94@reddit
I never said that. I actually kinda said the opposite
damien09@reddit
Just don’t compare it to the m series apple chips at 7w or even the new Samsung arm laptop chips. I believe it gets smoked by even an m1 air at 7w
Olde94@reddit
Yeah i was having a cheap celeron in mind or something like that
Exist50@reddit
Other way around. The best Sandy Bridge at huge overclocks still loses to the M1 in raw perf.
Olde94@reddit
Did you even read what i wrote?
Exist50@reddit
I assumed when you said you were thinking of a Celeron, that implied the higher end of the Sandy Bridge lineup would look better.
Olde94@reddit
All in all it was a bad joke because i went too far back
Ok_Spirit9482@reddit
I've actually tested an 8850U, and the CPU-Z benchmark indicates it's on par with the 2600K when I limit the CPU package power (which includes the GPU) to 2W. I suspect for the M1 it's even lower.
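For anyone wanting to reproduce that kind of cap on an AMD mobile APU under Linux, a sketch using RyzenAdj (https://github.com/FlyGoat/RyzenAdj) follows; the flag names and milliwatt units are per its README and should be treated as assumptions.

```python
# Sketch: capping an AMD mobile APU's package power with RyzenAdj.
# Flag names/units (milliwatts) are assumptions from the RyzenAdj README;
# this needs root and a supported APU.
import subprocess

LIMIT_MW = "2000"  # ~2 W package limit, matching the test described above
subprocess.run(
    ["ryzenadj",
     f"--stapm-limit={LIMIT_MW}",   # sustained power limit
     f"--fast-limit={LIMIT_MW}",    # short-term boost limit
     f"--slow-limit={LIMIT_MW}"],   # average power limit
    check=True,
)
```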
Olde94@reddit
Phew! A saviour in all of this!
Ok_Pineapple_5700@reddit
They improved a lot at lower wattage, but as soon as you try gaming or push the chip, it sucks a lot of power and heats up. You also get random stutters every now and then
Rocketman7@reddit
Isn't Samsung using AMD GPUs in their SoCs?
Aware-Bath7518@reddit
Xclipse is Samsung's tiler combined with RDNA IP.
On the software side, they use a modified amdgpu kernel module and a fork of the AMDVLK PAL codebase.
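If you want to see this from the device side, dumping the Vulkan identification strings shows the driver lineage. A rough sketch, assuming the Vulkan SDK's `vulkaninfo` tool is on PATH:

```python
# Print the device/driver identification strings from vulkaninfo output.
# Assumes `vulkaninfo` (Vulkan SDK) is installed; on an Xclipse device the
# driver strings should reflect the AMD-derived stack.
import subprocess

out = subprocess.run(["vulkaninfo"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if any(k in line for k in ("deviceName", "driverName", "driverInfo")):
        print(line.strip())
```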
InformalAd202@reddit
So you're just making up the whole tiler thing?
Aware-Bath7518@reddit
This is something I got from the OpenGL->OpenGLES wrapper developer when talking about Xclipse architecture.
Samsung themselves mentioned they did some bandwidth optimizations for mobile as well.
InformalAd202@reddit
Is that conversation available on any forum? Bandwidth optimizations don't necessarily mean a tiler, much less a binning pass for each render pass.
Aware-Bath7518@reddit
Yes, if we consider the public Discord server as a forum.
InformalAd202@reddit
Can you send a link?
Aware-Bath7518@reddit
It's in this project's README: https://github.com/MojoLauncher/MojoLauncher
InformalAd202@reddit
Actually found the message where he said it. Still most likely not true though, as he doesn't provide any proof, yet Samsung claims they are doing immediate-mode rendering.
InformalAd202@reddit
They don't have a tiler. It's just RDNA. But low clocked
InformalAd202@reddit
They are. RDNA 2 for the 2200, RDNA 3 for the 2400, and maybe the 2500, though the render back-end and core count suggest RDNA 3.5. The Xclipse 940 is a downclocked 780M.
InformalAd202@reddit
Though this one will probably be RDNA 3.5 based. Adreno and Mali are bringing even more tile-based rendering improvements while Samsung is using immediate mode. Will love to see how they compare, though the mode difference will obfuscate most of it.
Aware-Bath7518@reddit
Xclipse GPU is TBDR, AFAIK.
InformalAd202@reddit
Actually just ran Geekerwan's triangle bin test on my notebook's 1660 Ti and it behaved exactly like an Adreno GPU, but with smaller blocks since it only has 1.5MB of L2 cache. Maybe AMD has been doing some sort of tiling like that as well, so it may not be fully "immediate mode". Though you claimed Exynos has a "Samsung made tiler" and also that Samsung has "console like immediate mode rendering". The only way to know for sure is to run triangle bin on an Exynos GPU.
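For readers unfamiliar with the triangle bin test: it colors primitives by the order they finish rendering, which exposes whether the GPU walks the screen in tiles. Below is a CPU-side toy illustration of that idea; a real test does this on the GPU with an atomic counter and reads the image back, so everything here is a simplified stand-in.

```python
# Toy illustration of what the "triangle bin" test visualizes: record the
# order each pixel-sized primitive finishes, comparing immediate-mode
# order (submission order) with a tiled order (tile by tile).
import numpy as np

W, H, TILE = 64, 64, 16

imr = np.zeros((H, W), dtype=int)
order = 0
for y in range(H):                     # immediate mode: finish order
    for x in range(W):                 # tracks submission order
        imr[y, x] = order
        order += 1

tbr = np.zeros((H, W), dtype=int)
order = 0
for ty in range(0, H, TILE):           # tiled: finish one 16x16 tile
    for tx in range(0, W, TILE):       # completely before the next
        for y in range(ty, ty + TILE):
            for x in range(tx, tx + TILE):
                tbr[y, x] = order
                order += 1

# A tiled GPU shows a big jump in finish order across a tile boundary;
# an IMR GPU shows a smooth gradient.
print("IMR across x=15..16:", imr[0, 14:18])   # [14 15 16 17]
print("TBR across x=15..16:", tbr[0, 14:18])   # [14 15 256 257]
```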
InformalAd202@reddit
No, it's AMD's immediate-mode design. Samsung even bragged about it as a "console like" feature. Read the "Mobile gaming ecosystem, expanded" paragraph:
https://semiconductor.samsung.com/technologies/processor/gpu-technology/
Aware-Bath7518@reddit
Qualcomm Adreno is also IMR (unlike Mali/PVR/AGX).
Actually, Xclipse has a custom tiler tacked onto the RDNA IP, so it's TBDR to some extent.
InformalAd202@reddit
Watch Geekerwan's video on the Snapdragon 8 Elite. He shows that once scene depth complexity rises, the Adreno 840 switches to tile-based rendering.
InformalAd202@reddit
Adreno is still a hybrid that toggles between the two modes and picks whichever is fastest. The recent Adreno 840 even has a custom extension that improves TBDR by giving devs more access to tile RAM. Would be great to post any source on this "tiler", since even Samsung said they are doing immediate mode.
VastTension6022@reddit
So they're claiming it has a >200 TOPS neural engine and an over 100% GPU performance increase leading to a multi-gen lead over apple, mali, and adreno? Sure.
InformalAd202@reddit
The actual Korean global site mentions a 29% faster GPU compared to the Adreno 840. The Adreno 840 is like 5% behind Mali.
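Chaining those two cited ratios gives a rough idea of where that would land relative to Mali; this is pure arithmetic on the claims above, not a measurement.

```python
# Chain the cited ratios: 29% faster than Adreno 840, which is ~5%
# behind Mali, so relative to Mali:
exynos_vs_adreno = 1.29
adreno_vs_mali = 0.95
print(f"~{exynos_vs_adreno * adreno_vs_mali - 1:.0%} faster than Mali")
# -> ~23% faster than Mali
```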
dmaare@reddit
This is almost surely a cherry-picked result from a single specific GPU test, possibly taken from a GPU compute test. Mali and Adreno pretty much suck in GPU compute tests because these mobile GPUs are not really meant for GPU compute (servers are used for that). Because Exynos is using a GPU based on AMD graphics cards, it will for sure destroy both Mali and Adreno in GPU compute. Graphics rendering performance and real game performance will be a different story.
InformalAd202@reddit
Geekbench's GPU benchmarks are just OpenCL and Vulkan compute. Previous Exynos AMD GPUs didn't "destroy" them. Also, no, both Adreno and Mali have been improved for compute tasks as well.
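For context, those compute tests boil down to launching arithmetic-heavy kernels and timing them. A minimal PyOpenCL sketch in that spirit follows; it assumes pyopencl and a working OpenCL driver, and the kernel and loop count are arbitrary choices, not Geekbench's actual workload.

```python
# Minimal OpenCL compute microbenchmark, loosely in the spirit of a GPU
# compute test. Assumes pyopencl is installed and an OpenCL driver exists.
import time
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

n = 1 << 22
a = np.random.rand(n).astype(np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void fma_loop(__global const float *a, __global float *out) {
    int i = get_global_id(0);
    float acc = 0.0f;
    for (int k = 0; k < 256; k++)   /* arithmetic-heavy on purpose */
        acc = fma(a[i], 1.0001f, acc);
    out[i] = acc;
}
""").build()

t0 = time.perf_counter()
prg.fma_loop(queue, (n,), None, a_buf, out_buf)
queue.finish()
dt = time.perf_counter() - t0
print(f"~{2 * 256 * n / dt / 1e9:.1f} GFLOP/s (very rough)")
```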
Tasty-Traffic-680@reddit
I wonder how much power it will suck down to achieve those numbers. The SD8G5 was pulling 18+ watts in some of Geekerwan's PC game emulation tests.
Geddagod@reddit
Samsung seems to have given up on trying to match the Fmax of TSMC-fabbed ARM cores with the Exynos 2500, though with the Exynos 2400 they let the P-core guzzle power to try to match competitors' single-core scores.
Not sure if this is because they are unable to do so, or a strategic shift.
Plus-Candidate-2940@reddit
And the article completely skipped over single core performance.
DerpSenpai@reddit
If RDNA is able to do this, it's the best uarch around, but seeing laptop AMD vs Apple, I'm skeptical!
Kryohi@reddit
Laptop AMD Chips are severely bandwidth starved
InformalAd202@reddit
And since smartphones are even more bandwidth-starved than laptops, for power consumption reasons, that just makes it even more dubious.
Kryohi@reddit
Not really. LPDDR5 is LPDDR5.
MissionInfluence123@reddit
Phones use a 64-bit bus (4x16-bit).
InformalAd202@reddit
Phones can't use that much bandwidth on the GPU anyway. I mean, Mali only uses 12 GB/s in Genshin Impact, and most games use less, because bandwidth = power consumption. But again, GPU performance improvements only benefit benchmarks, because Play Store games are either thermally or vsync limited.
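Put against the peak numbers from earlier in the thread, that 12 GB/s figure is indeed a small fraction; the 64-bit LPDDR5X-8533 bus here is an assumption.

```python
# Rough utilization of the cited 12 GB/s Genshin figure against an
# assumed 64-bit LPDDR5X-8533 phone bus (~68 GB/s peak).
peak_gb_s = 8533e6 * 64 / 8 / 1e9
print(f"~{12.0 / peak_gb_s:.0%} of peak bandwidth")  # -> ~18%
```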
KR4T0S@reddit
That and yield issues seem to be the two curses plaguing them. I do think they are due their breakthrough moment but who knows when it'll happen.
raill_down@reddit (OP)
Their sample yield is 50% and they're commencing mass production of the 2600 next month so...
I guess they're in a better position than Intel
Geddagod@reddit
PTL and the Exynos 2600 seem to be launching in volume at around the same time (early 2026), but I think 18A is going to be more competitive vs N3 than Samsung 2nm is tbh.
There's a pretty large gap between N3E and Samsung 3GAP, meaning Samsung 2nm, which is a renamed Samsung 3nm+, is likely at best an N3E equivalent. Though, because Samsung 2nm is just an iteration of their 3nm node, maybe yields are better for it than 18A.
Meanwhile I think the baseline for 18A is ~N3B, based on how PTL looks. Lower Fmax than ARL-H, sure, but the core area is also rumored to be a bit smaller... though the core perf/power curve is going to be interesting to look at too.
throwawaymask01@reddit
Internal testing = Marketing
Exynos internal testing = Desperate attempt to mitigate the chip's market rejection.
I simply don't get Samsung pushing Exynos on flagship galaxy devices.
Do it on FE, M, A models but not on S.
And before naysayers come all hard, I have both S24+ (Exynos version) and an S25+.
The sheer inefficiency of the Exynos forced me to buy another phone because it simply cannot handle editing videos in dedicated apps, it heats up significantly compared to the S25+, and this is the second time I've gone through this with Samsung claiming "guys, this time our Exynos chip is so good, we fixed it!"
Spoiler alert: they didn't every time.
Yes, it handles daily tasks just fine, but I purchase top-of-the-line devices for heavier tasks: editing videos for social media on the go, editing pictures, testing applications, etc. And I regret it every time I get something Exynos. My S24+ is nearly useless for the truly heavy tasks.
InformalAd202@reddit
For regular gaming use this is probably irrelevant. You just need to watch some YouTube gaming benchmarks to see that even the 8 Elite needs only 300 MHz on even the most intensive Android games. Samsung even caps that to 250 MHz because more would break the 6 W TDP limit.
PhonesAddict98@reddit
I mean, yeah, of course this means jack to us; we aren't the ones testing the chip, so we don't know the actual performance metrics and power consumption curves. We can't gauge actual, accurate performance when the tests are done in air-conditioned labs on test boards.