RIP Intel: AMD Ryzen 7 9800X3D CPU Review & Benchmarks vs. 7800X3D, 285K, 14900K, & More
Posted by b-maacc@reddit | hardware | View on Reddit | 130 comments
996forever@reddit
What a good day it has been, with all these reviews.
adolftickler0@reddit
wink wink.
996forever@reddit
I do not know what you mean
SmashStrider@reddit
See Intel, THIS is how you do a launch.
NightFuryToni@reddit
To be fair, AMD learned from their disastrous launch not long ago.
CatsAndCapybaras@reddit
I'm not sure AMD learns from their terrible launches, rather I think they just get lucky every once in a while. I'm not saying they get lucky by having good products, they actually have been making great things. But their launches have been consistently bad (except this one).
tangosmango@reddit
Any reason to upgrade now from a 7700X? I was initially holding out to upgrade to the 5090 and the 9900X3D or 9950X3D.
I'm running AW3423DWx3 so I'm not sure if the 9800x3D will even benefit me all that much.
Framed-Photo@reddit
TechPowerUp did higher-resolution testing for their review; I think you'll find the numbers do not support you upgrading your system haha.
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/19.html
At 1440p the 7700x is 8% slower on average, relative to the 9800X3D. That's an FPS difference of 148 vs 161.
At 4k it's even smaller, you'd be within 2.3% on average, hardly outside margin of error at that point.
They also do minimums testing, which is where you would normally expect the largest difference? But even then at 4k you're basically within margin of error compared to the 9800X3D, and at 1080p you're behind by a little over 10fps.
So considering you're using a resolution between 1440p and 4K, yeah, don't bother upgrading. I would say it's a waste of money unless you play games that specifically see large boosts from X3D, or the 9800X3D specifically. Games like Factorio or Tarkov do that, for example.
tangosmango@reddit
Oh wow...very interesting. I wonder how much this would change if you couple it with a stronger GPU like 5090.
I am getting back into VR as well and a stronger CPU even if it's 8% stronger would help. Every FPS helps in VR.
Framed-Photo@reddit
It likely won't change much with a 5090, most games are nowhere close to getting CPU bottlenecked at these higher resolutions.
VR as well: a headset like a Quest 3 or an Index is rendering well over 1440p once you combine both screens' resolutions, closer to 4K in the case of the Quest 3. You won't see much of a change there either.
You're already on a really good CPU for your current workloads, you don't need to upgrade. Don't get fomo'd haha. There's a LOT you can do for $500 USD.
tangosmango@reddit
Any idea what I can upgrade? I've been attempting to upgrade something. Closest I came to was upgrading the AW3423's to a 49"
Framed-Photo@reddit
If you have a 4090 and one of the best monitors on the market? There's not a whole lot tbh haha.
If you're not already rocking a top tier mouse/keyboard/headphones/speakers I'd look into those well before upgrading your already good CPU.
Your computer is only as good as the peripherals connected to it. Doesn't matter how good the GPU/CPU are if your monitor is shit, or if your mouse jumps around, etc. Places like r/headphones, r/mechanicalkeyboards, and r/mousereview are good places to check if that interests you.
Otherwise, you could always just...save the money haha. You're already rocking pretty much the best of the best, you're not gonna be missing anything by just sticking with it.
tangosmango@reddit
Well, I have multiple headphones, plus a DAC/amp with Fostex TH909, and so many mice and mechanical keyboards lol
I was hoping someone could recommend me a monitor since my middle AW has OLED burn-in on the bottom (taskbar burn in). Not noticeable in games but I would love to have an excuse to replace the multi monitor setup :D
Thanks for your input!
Framed-Photo@reddit
If you're getting burn-in then you can't really go OLED again unfortunately. There are a few mini-LED monitor options around but honestly not that many good ones. Could be worth exploring though.
porcinechoirmaster@reddit
Unless you're running simulation games, it's almost certainly not worth it. You're going to be GPU limited in pretty much all modern titles, even with a 4090.
Sim games are a different story, because the performance you're optimizing around there isn't tied to the framerate, and it benefits from a faster CPU regardless of the slideshow you're displaying.
Slyons89@reddit
If you're planning to go with a 5090 then yeah it would be worth it IMO. The 9800X3D will probably stretch its legs even further with a GPU as fast as the 5090 should be.
tangosmango@reddit
Yeah, that's what I decided on. I'll just deal with draining the system again if the future ones turn out to be notable upgrades.
Thanks man!
capybooya@reddit
The 9900X3D and 9950X3D will have the added complexity of thread prioritization driver and core parking. We'll just have to wait and see, crossing fingers that it will work well.
tangosmango@reddit
Thanks bro! Cheers!
PlasticComplexReddit@reddit
That is a much bigger improvement than most people expected.
djent_in_my_tent@reddit
Zen 5 is choked by the I/O system; the extra cache likely helps mitigate that.
IC2Flier@reddit
I guess this just adds more to the proof that feeding these cores has become the real challenge now (alongside thermals).
i_love_massive_dogs@reddit
People working in high-performance engineering have understood this for decades. There's a reason why naive matrix multiplication is like 4 lines of C code, but it's multiple orders of magnitude slower than an optimized version that's 50-100 lines long, despite having the exact same computational complexity.
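For reference, a minimal sketch of the naive triple loop being described, assuming row-major square matrices (the function name and layout are illustrative, not from the comment). The inner loop strides down a column of B, which is exactly the cache-hostile access pattern the rest of this sub-thread is about:

```c
#include <stddef.h>

/* Naive O(n^3) matrix multiply: the "4 lines of C" version.
 * A, B, C are row-major n x n matrices; C must be zeroed by the caller. */
void matmul_naive(size_t n, const double *A, const double *B, double *C) {
    for (size_t i = 0; i < n; i++)
        for (size_t j = 0; j < n; j++)
            for (size_t k = 0; k < n; k++)
                /* B[k*n + j] walks down a column: roughly one cache miss per step for large n */
                C[i * n + j] += A[i * n + k] * B[k * n + j];
}
```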
Atheist-Gods@reddit
Does the optimized version really have the same complexity? There are algorithms with reduced computational complexity, although I don't know the real world performance comparisons.
another_day_passes@reddit
In practice it's the implementation that is optimized, i.e. a constant-factor speed-up. Conceptually it's still a triple for loop (O(n^3)).
MiyaSugoi@reddit
And the optimized algorithm takes special care about memory allocation etc.?
i_love_massive_dogs@reddit
Aside from the obvious multi-threading and SIMD, it's mostly about memory access patterns, reusing data in registers, and leveraging pipelining. Basically making sure that the data is as close to the registers as possible before you do any computations with it. This is often extremely unintuitive. Like, instead of doing the obvious straight pass through a matrix, doing these wacky Morton-code paths through the data can be much more performant.
skinpop@reddit
SIMD, working in L1-sized blocks, etc. Memory allocation is the trivial part.
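As a rough illustration of the blocking idea mentioned above, here is a minimal sketch (the tile size, names, and the assumption that n is a multiple of BLOCK are placeholders, not a tuned kernel). The arithmetic and the O(n^3) complexity are identical to the naive loop; only the order of memory accesses changes, so each tile is reused while it is still hot in cache and the innermost loop runs stride-1:

```c
#include <stddef.h>

#define BLOCK 64  /* tile edge; tune so the working set of a few tiles fits in L1/L2 */

/* Cache-blocked O(n^3) matrix multiply. Row-major n x n matrices,
 * n assumed to be a multiple of BLOCK, C zeroed by the caller. */
void matmul_blocked(size_t n, const double *A, const double *B, double *C) {
    for (size_t ii = 0; ii < n; ii += BLOCK)
        for (size_t kk = 0; kk < n; kk += BLOCK)
            for (size_t jj = 0; jj < n; jj += BLOCK)
                /* update one BLOCK x BLOCK tile of C at a time */
                for (size_t i = ii; i < ii + BLOCK; i++)
                    for (size_t k = kk; k < kk + BLOCK; k++) {
                        double a = A[i * n + k];  /* reused across the whole inner loop, stays in a register */
                        for (size_t j = jj; j < jj + BLOCK; j++)
                            C[i * n + j] += a * B[k * n + j];  /* stride-1 over B and C */
                    }
}
```

Real BLAS kernels go further (SIMD micro-kernels, packing, prefetch), but blocking for cache is where a large share of the gap to the naive version typically comes from.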
frankchn@reddit
Has been for a while, I think. DRAM latency hasn't improved all that much even as bandwidth has increased.
nero10578@reddit
It actually got worse
No_Share6895@reddit
Significantly worse. Look at the CAS latency of 8000MHz DDR5. If you had that kind of latency on DDR4, let alone DDR3, people would think you're insane. For a lot of processes the increased bandwidth makes up for it, but for some, especially latency-sensitive stuff, it doesn't. Which is why large caches, and hopefully soon L4 cache, will become standard.
einmaldrin_alleshin@reddit
I just compared a bunch of DDR5-8000 to DDR4-4000 modules and didn't really see big differences. 18 cycles for 4000, 38 to 40 for 5000. That's a 5 to 10% increase in latency at double the clock speed.
Keep in mind, CL 36 at 8000 would be equivalent to CL 18 at 4000.
BookPlacementProblem@reddit
Did you mean to say 8000 here?
einmaldrin_alleshin@reddit
Yes
AtLeastItsNotCancer@reddit
You do know that latencies are typically specified in terms of clock cycles, right? Clockspeed goes up by x%, stated latencies also increase by x%, in the end you end up with practically the same effective latency (in terms of time taken).
I don't see how latencies have gotten "significantly worse", they've stayed relatively consistent for as long as I can remember, all the way back to the original DDR.
MdxBhmt@reddit
Both statements are correct, yours is in absolute terms, theirs is in relative terms.
BeefistPrime@reddit
6000MHz DDR5 CAS 30 is the same latency as 3000MHz DDR4 CAS 15.
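A quick way to check these equivalences: first-word latency in nanoseconds is the CAS cycle count divided by the memory clock, which is half the DDR transfer rate. A small sketch using the numbers quoted in this thread (the function name is illustrative):

```c
#include <stdio.h>

/* CAS latency in ns = cycles / memory clock (MHz) * 1000,
 * where the memory clock is half the DDR transfer rate (two transfers per clock). */
static double cas_latency_ns(double cas_cycles, double transfer_rate_mts) {
    return cas_cycles / (transfer_rate_mts / 2.0) * 1000.0;
}

int main(void) {
    printf("DDR4-3000 CL15: %.1f ns\n", cas_latency_ns(15, 3000)); /* 10.0 ns */
    printf("DDR5-6000 CL30: %.1f ns\n", cas_latency_ns(30, 6000)); /* 10.0 ns */
    printf("DDR4-4000 CL18: %.1f ns\n", cas_latency_ns(18, 4000)); /*  9.0 ns */
    printf("DDR5-8000 CL38: %.1f ns\n", cas_latency_ns(38, 8000)); /*  9.5 ns */
    return 0;
}
```

Same wall-clock latency, roughly double the cycle count, which is why both "latency got worse" and "latency stayed the same" can be argued depending on the units.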
No_Share6895@reddit
Which is why, if Intel wants a chance, they have to bring back L4 cache.
Hendeith@reddit
Intel is working on their own implementation of stackable cache to increase L3 capacity, although there's no info on when it will be available.
I doubt Intel will bring back L4; they can't just go back to the old implementation, and it doesn't seem like they've really worked on this or have a new solution planned.
Incredible that competition from AMD in recent years didn't really push them to do their best. They still believe that once they regain the node lead, everything will just work itself out.
tusharhigh@reddit
Intel gave the gaming leadership to AMD. They don't intend to take it back, it seems.
Zednot123@reddit
I doubt Intel developed base tiles with cache solely for Ponte Vecchio. It may be generations away still, but that is probably where things are going on the Intel side.
Agreeable-Weather-89@reddit
I wonder how well an APU would do with on-die memory like Apple.
jmlinden7@reddit
Apple doesn't use on-die memory, they use on-package memory.
IC2Flier@reddit
The thing I wanna see AMD get crazy doing is a Threadripper but the other half of the chip is a GPU and HBM2 stack. Or something but on an AM5 chip (like just take the PS5 chip but the SOC is now fed straight-up from graphics memory)
No_Share6895@reddit
It's been this way for a long time, ever since CPUs became faster than the RAM that supplies the data. It just keeps getting more and more apparent. Sure, a few workloads took longer to make it obvious than others, but still.
Kiriima@reddit
One starts to wonder about doubling that cache in future cpus.
djent_in_my_tent@reddit
Intel needs to play cache-up lol
SignalSatisfaction90@reddit
Reddit is a parroting echo chamber, people’s speculation on here doesn’t come from their own thoughts, but mimicking the thoughts of others. It’s very cringe for me to be typing this but it’s more true than ever unfortunately
blazesquall@reddit
I need a plug-in that collapses all the group think and low effort comments.
SignalSatisfaction90@reddit
an AI use for good
Z3r0sama2017@reddit
X3d is not zen 5% lol
f3n2x@reddit
Zen 5%x3
Terepin@reddit
Wut. 9800X3D is literally a Zen 5 chip.
Framed-Photo@reddit
10% on average still isn't what I'd call a "good" uplift, especially considering how much more expensive it is compared to the low prices the 7800X3D hit just a few months ago, and especially if you're not playing some of the outlier titles like BG3. But it's better than the rest of the 9000 series for gaming at least, and isn't actively embarrassing for AMD.
Still, if someone wasn't impressed with the 7800X3D, I can't imagine the 9800X3D is enough better to suddenly change their mind, right?
regenobids@reddit
I reckoned it had to do 10% to respect the x3d brand. AMD clearly thought the same. Shows on the power consumption too
Slyons89@reddit
The clock frequency had a much larger than expected change. The rest of Zen 5 had basically the same clocks as Zen 4.
redm00n99@reddit
So do I get the 9800x3d or buy a cheap used 7800x3d when people sell off their old CPU after upgrading. Decisions decisions
Framed-Photo@reddit
If you're at anything greater than 1080p, 7800X3D is literally within margin of error for a lot of tests.
crashck@reddit
I never understand why they spend so much time at 1080p in these videos without going up in resolution. It's a $480 CPU. People spending that much should not have a 1080p monitor. I get that the fps difference falls off fast at other resolutions, but what about 1% lows and stutters?
Jensen2075@reddit
b/c upping the resolution makes it GPU bound. You want to test the strength of the CPU not the GPU so lowering the resolution removes that variable from interfering with the benchmark.
redm00n99@reddit
Cool. I'll probably go for it then. Also because it's funny having a 7800xt and 7800x3d
Framed-Photo@reddit
I mean you can look at the benchmarks. If you're okay with paying $500 for a 10% improvement only when you're CPU bound, then by all means go for it.
I think it's a gigantic waste of money but it ain't my money.
No_Share6895@reddit
I really hope this makes intel get their heads out of their asses and at least go back to l4 cache...
CarbonTail@reddit
Intel is pretty much done, sadly. They need a miracle for a turnaround.
I predict they'll get acquired or massively restructured in the next year or two.
aecrux@reddit
AMD's Bulldozer era was significantly worse, but at this pace Intel will be there as well if they don't get their shit together soon.
SailorMint@reddit
Bulldozer (2011) at least served a purpose as one of the biggest lessons in CPU architecture.
No_Share6895@reddit
yeah bulldozer makes the 285k look good.
f3n2x@reddit
Bulldozer even makes the Pentium 4 look good, LMAO.
Hundkexx@reddit
AMD was beaten far worse and made it back, with vastly fewer resources and a tanking GPU division they had just bought for way too much money.
I'm not worried for Intel, yet.
RZ_Domain@reddit
AMD made good business decisions under Rory Read and Lisa Su by securing consoles, etc., and the development of Zen under Jim Keller.
At the same time Intel was too busy with financial engineering for investors. Even with Pat Gelsinger and Jim's short stint, what do we have? A waste of sand and Bible passage tweets.
HorrorCranberry1165@reddit
But Intel has factories, so it can't fall as low as AMD did. AMD found buyers for its factories, but Intel won't find a buyer for theirs, so big financial problems probably mean bankruptcy.
teutorix_aleria@reddit
Having leading-edge fabs operational in the West is a massive geopolitical issue. Either Intel will be kept on life support by Western governments, or their fabs will be spun out and kept on life support by Western governments. Either way, Intel's fab business will not be the thing that drags them down.
DeliciousPangolin@reddit
Intel is like Boeing or GM. They might end up firing a ton of people, wiping out their shareholders, whatever - but they are never, ever going out of business.
5662828@reddit
They need a new socket :))
No_Share6895@reddit
if thats what it takes to bring l4 cache back...
OGigachaod@reddit
Intel is way too top heavy.
HOVER_HATER@reddit
Nova Lake on an 18A-class node should be their comeback if that node is good enough, but in the short term it's essentially an AMD monopoly for the next 12-18 months.
imaginary_num6er@reddit
Anything on Intel 18A is a start. Right now Intel 18A is vaporware
scytheavatar@reddit
Nova Lake will need to compete with Zen 6, which will be on TSMC 2nm. Not clear to me where Nova Lake's advantage will be just because it's on 18A.
HorrorCranberry1165@reddit
Zen 6 will be on N3(P), probably late next year. Zen 7 will be on N2 / A16 and probably a new socket.
No_Share6895@reddit
I don't have any hopes until Intel actually dramatically increases L3 cache size or returns to L4 like Broadwell had.
HOVER_HATER@reddit
As long as the memory latency we see with ARL is fixed and there's a node advantage with good IPC improvements, Intel should have a competitive product in everything besides 1080p CPU-bottlenecked gaming scenarios. To come out on top in gaming they would indeed need to develop something new, like a special series of CPUs without an iGPU, fewer cores/no E-cores (perhaps even a separate cache die), and simply throw a ton of cache at it.
No_Share6895@reddit
There was a post recently that showed the E-cores may be better for gaming than the P-cores at this point... it's crazy. And yeah, a separate cache die like Broadwell had would be awesome.
HOVER_HATER@reddit
Yeah I know, I just threw out random ideas Intel could try. At the same time, they would probably be just fine with a good all-around CPU that beats AMD in multitasking, perhaps in efficiency with its superior node, and has at least 90% of the gaming performance.
mtbhatch@reddit
Yes. We need competition.
Modaphilio@reddit
I am old enough to remember the old days when AMD Bulldozer was getting annihilated by Intel Sandy Bridge 2500K/2700K. It's a funny world.
Few_Net_6308@reddit
Something is very wrong with the 7950X3D results in their charts. It's losing to the vanilla 7950X in every game, which is nowhere near how it performs in reality. Did they accidentally confuse it with another CPU?
nbates66@reddit
Might be the location in the game they use for testing. I believe Gamers Nexus specifically chooses a CPU-load-intensive spot in-game.
Handarand@reddit
Okay, this still looks like the 265K is a better option for my use case. Although the 9800X3D looks like a great product by AMD!
Kapps@reddit
Curious how the 9950X3D will play out or when that would come. If it had similar gaming performance with better compiling performance, that would be pretty sweet.
imaginary_num6er@reddit
I’m more worried about the tariffs if they launch it too late
Drakyry@reddit
Okay, so I get that this is Reddit and hence unreadable on these topics, but tariffs are for things that the US itself does not produce. Get it? The idea is to make the Japanese cars cost more to the Americans so that the American-made cars are more competitive; it's not to duh just make you pay more for electronics because le drumpfs le bad.
Shifty-Pigeon@reddit
But this chip is one of those things America does not produce?
LeadToSumControversy@reddit
Damn, I thought they'd upgraded ChatGPT to be capable of basic logic?
itemluminouswadison@reddit
probably gonna upgrade from my 5800X to this 9800X3D
wondering if I should upgrade my RTX 3070 instead though
GlammBeck@reddit
GPU for sure, a CPU upgrade would be wasted on that card.
signed7@reddit
I'm planning to upgrade both and will be running a 9800X3D with my 3070 for a bit lol just waiting until the 5000 series comes out
itemluminouswadison@reddit
good point, thanks
neat_shinobi@reddit
Both, or either one would be bottlenecked. I'm on a 5900X and RTX 3070 and was thinking of upgrading to a 5090 or 5080 plus this CPU; however, it would probably require a PSU upgrade as well...
RoninSzaky@reddit
Depends entirely on the games you play.
Heck, I am running a 7800X3D, yet still tempted to upgrade to improve the tick rates in my favorite Paradox grand strategies.
Platypus_Imperator@reddit
The only reason I'm so excited by this one
Fauked@reddit
Upgrading your GPU is almost always the better option.
7GreenOrbs@reddit
The 9800X3D is faster in BG3 by 26% over the 7800X3D, and an astounding 60% faster than the Core Ultra 9 285K.
Hendeith@reddit
It shows how great AMD did and how hard Intel fumbled. The 285K feels like it's 3 or 4 gens behind Zen 5 X3D.
No_Share6895@reddit
It is. It can't even reliably beat a 5700X3D, let alone the two-gen-newer 9800X3D.
NeverForgetNGage@reddit
Imagine telling people 10 years ago that Intel's Q4 2024 chips would be struggling against AMD's Q2 2022 chips on an end-of-life platform.
Insanity.
COMPUTER1313@reddit
And that Q4 2024 CPU would be running on +8000 MHz CU-DIMM DDR5 while the 5700X3D is chugging along with some budget DDR4-3200/3600 kit.
No_Share6895@reddit
Real-time simulation and pathfinding for all those characters takes time, and cache to store it all in. Higher speeds make it take less time, and more cache to keep data closer to the cores is king. Like, I'm not ready to replace my 5800X3D yet, but man, this is getting so close.
MwSkyterror@reddit
MMO, multiplayer, and sim games aren't often benchmarked for a variety of reasons, so this is some very hopeful news if any of that performance can be generalised.
MarxistMan13@reddit
This is exactly how I feel. The 5800X3D still does really well most of the time, but seeing 30-50% gains is pretty wild.
aecrux@reddit
I just started playing factorio, I feel you there
JapariParkRanger@reddit
Avoid Gleba
spazturtle@reddit
MMOs also massively benefit. My 5800X3D still lets me max out my current monitor in the MMOs I play, but there is no way my next chip is not a 3D chip.
No_Share6895@reddit
Yeah, especially in the 1% lows; 3D cache helps raid night like crazy.
ebnight@reddit
I'm ready. My 5800X3D has been amazing, but now my wife can enjoy it as well, upgrading from her 5600X :D
No_Share6895@reddit
My wife and I both already have a 5800X3D, otherwise I'd upgrade and give mine to her. We'll both probably grab a 12800X3D though.
cram_a_slam@reddit
Really excited for this but still waiting to see what the 9950X3D can do
potato_panda-@reddit
Thank God, finally some good generational gains
Framed-Photo@reddit
Good compared to what we've been getting recently, but really not that good compared to the rest of history.
Better than nothing though.
OGigachaod@reddit
Still not as good as the jump between 5800x3d and the 7800x3d.
Bingus_III@reddit
Yeah. I'm really interested in seeing how they're going to find a solution to the I/O restrictions with the next series.
No_Share6895@reddit
new IO die for one
Yommination@reddit
That was from a whole new socket change and DDR4 to DDR5
AnthMosk@reddit
Well, maybe I can get a 9800X3D for under $500 in the next 13 months or so.
Danishmeat@reddit
Maybe, and likely. The 7800X3D was under $400 within 3 months while also being the fastest.
OGigachaod@reddit
Maybe, but unlikely.
TechnologyForTheWin@reddit
Wow! Way better than I thought it would be
NightFuryToni@reddit
Steve smiling in the thumbnail is all you need to know it's good.
SiegeDmg@reddit
This CPU will be so popular. The new King.
Court_esy@reddit
Tech Jesus has spoken, open the pre-order gates already!