AMD Ryzen 7 9800X3D Review, An Actually Good Product!
Posted by Ravere@reddit | hardware | View on Reddit | 331 comments
SmashStrider@reddit
Wow, this is actually really good. Considering the disappointment of Zen 5, the 9800X3D has pretty much alleviated this by being faster than what AMD claimed. And sure, it does consume more power, but that's kinda expected considering the higher boost clocks. This thing is gonna sell REALLY well. Intel pretty much NEEDS V-Cache if they want to compete in gaming at this point.
zippopwnage@reddit
I can't watch the video now, but is the power consumption that high? I'm planning on getting one of these for my PC, but I also don't wanna blow out my electricity bill. I'm kinda a noob when it comes to this.
BadMofoWallet@reddit
I don't know where you live, but if in the USA, you're more likely to run your electricity bill higher by leaving your coffeemaker on than you are by going from a processor that consumes 30 more watts
peakdecline@reddit
The hyper-fixation on "efficiency" in reviews seems misplaced. Particularly when AMD spent a significant portion of the design effort on this product to allow it to be "less efficient." The real world impact from the increased power consumption is basically nil. The gains in performance are significant though. It's the absolute right decision.
PastaPandaSimon@reddit
This is your take vs someone else's who may not agree that ~80% more power for ~20% more performance, or 40% more power for 7-8% gaming performance, is worth it.
I think it's absolutely good to cover efficiency as it matters a lot to me and others. And let people ignore it if they don't care.
Maleficent-Salad3197@reddit
They dialed back performance to prevent premature wear. ?????
peakdecline@reddit
What's your power cost? Unless it's insanely high then no, that power increase simply doesn't matter. The heat generation is also not significant. For the vast majority of the world, particularly anyone buying a top of the line CPU, this increase in power cost is basically totally lost in how many cups of coffee you might drink in a month. It's nothing.
I don't think people would actually care if it wasn't for the hyper-fixation in reviews. I think it's mostly a made-up narrative largely used to fluff the amount of content in a review. It isn't something we should ignore but the impact to the vast, vast majority of people is basically nil.
puffz0r@reddit
Efficiency absolutely matters to me, my apartment circuits aren't doing too hot and I can't go much over 800w on a single plug without tripping a breaker. My landlord isn't gonna pay thousands of dollars to rewire the place and I'm certainly not paying for it either.
rubiconlexicon@reddit
For me perf/W is the most interesting benchmark for new CPU and GPU launches because I feel that it's the true measure of technological progress. You can achieve more performance by throwing more transistors/die area/clock speed at it, but achieving more perf/W requires real advancement.
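To put a toy example on that metric (the numbers below are made up purely for illustration, not taken from this review):

```python
# Perf/W is just a benchmark score divided by package power during the run.
# Both entries use invented numbers purely to illustrate the metric.
parts = {
    "last-gen part": {"score": 1000, "package_w": 90},
    "new part": {"score": 1200, "package_w": 160},
}

for name, p in parts.items():
    print(f"{name}: {p['score'] / p['package_w']:.1f} pts/W")

# The newer part is faster outright yet scores lower on pts/W, which is the
# gap between raw performance and the kind of progress described above.
```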
PastaPandaSimon@reddit
It absolutely matters for many reasons. Firstly, I'd rather have a single free coffee every month than a mere 17% faster MT compute. Secondly, I'm not eco-crazy, but I care about the planet enough to feel guilty that I could've burned half the fossil fuels for nearly the same PC experience. Thirdly, many people use small cases, including ITX. It absolutely matters that you dump 80% more heat from the CPU into it, and few would choose to do it for just 17% more performance.
You're saying that you don't care about efficiency. The fact that reviewers care, users talk about it, businesses talk about it, and Intel itself made huge performance sacrifices to increase efficiency, suggests that people have many reasons to care, and it's not just a whim overhyped by reviewers. Again, users who don't care can absolutely ignore those charts like so many people already ignore pieces of information that are not relevant to them.
peakdecline@reddit
Pretending you care about this cost difference when you're buying a ~$500 USD CPU is the peak of what I'm getting at... I don't think there's a rational conversation to be had with those who have that mindset, frankly. Likewise the difference this makes to fossil fuels is a rounding error within a rounding error and you know this.
This is the peak of making a mountain out of a molehill. This isn't remotely like cars because the actual impact here is a fraction of a fraction of a fraction of that. You could extrapolate your millions of users and that's probably less of an environmental impact than one dude deciding to delete the emissions on his diesel truck.
About the closest to an actual argument here is very compact PC cases but again... the real thermal differences here are not actually limiting the vast majority of ITX setups. I know, I've been doing ITX builds for over a decade.
PastaPandaSimon@reddit
I see, the issue here is that you've vastly underestimated the actual impacts of using less power efficient PC parts.
For example, 100 extra watts per PC, times say, just 10 million users, already means the global grid now needs an extra gigawatt worth of power.
It takes 1.8 million photovoltaic panels to generate this much power. It takes burning through 160 tons of coal in conventional power plants to produce a gigawatt of power for just one hour. Certainly many orders of magnitude more than one guy with a diesel truck.
peakdecline@reddit
Except you're suggesting this is happening 24/7. But it's not. And you're using the absolute worst case scenario. Each of these people would have to be gaming 24/7 in the most demanding possible scenario to achieve those numbers. And these are the peak numbers you're using, not remotely the real sustained load. The scenario you're painting is not remotely close to the reality.
PastaPandaSimon@reddit
No, I did not. Solar power plants also run for just a couple of hours a day at such an efficiency, which I thought was a good comparison. And the other comparison, the 150-160 tons of coal it takes to absorb the 100W power increase across 10 million users, was per hour.
If I assumed all of those users were gaming 24/7, they'd have burned ~3840 tonnes of coal in a single day. That'd be ~38 train cars worth of coal burned every single day.
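For anyone who wants to check the arithmetic, here's the back-of-envelope version (the inputs are the same assumptions as above, so treat them as illustrative rather than authoritative):

```python
# Back-of-envelope check of the figures above; inputs are the commenter's assumptions.
extra_watts_per_pc = 100        # assumed extra draw per machine, in watts
users = 10_000_000              # hypothetical number of affected users
coal_tonnes_per_gwh = 160       # tonnes of coal per GWh, as assumed above

extra_load_gw = extra_watts_per_pc * users / 1e9
coal_per_day = coal_tonnes_per_gwh * extra_load_gw * 24  # if run flat out 24/7

print(f"Extra grid load: {extra_load_gw:.1f} GW")                  # 1.0 GW
print(f"Coal burned per day at 24/7 load: {coal_per_day:.0f} t")   # ~3840 t
```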
peakdecline@reddit
And less than 1% of gamers are using the top end CPU. And their GPUs are making a bigger impact. Hell, their overuse of case fans is probably a bigger issue. Because in reality the CPU peak you're basing this on is never actually hit in the GPU-bound scenarios the vast, vast majority of these chips will run in. Not turning off their monitors when they're not using their PCs is a bigger issue.
PastaPandaSimon@reddit
As you can see, you are vastly underestimating the issue to make a point that it's overblown, while it's most certainly not at all.
peakdecline@reddit
Yes, it does need to be the top of the line CPU. The argument was/is entirely "excess" usage. Good grief. If this is the direction you're wanting to take it then you should be on a crusade against all of gaming.
PastaPandaSimon@reddit
Excess usage does not mean what you think it means. The 14600K is a mid-range CPU that uses a lot more power than the 7800X3D for lower gaming performance. Both are mid-range CPUs, yet one is far less efficient.
Also, now you're moving goal posts and jumping between extremes. Nobody is crusading against gaming. My entire point was that people should be informed if a CPU uses 40+% more power to reach a similar performance, or uses nearly twice the power for a small performance gain. My point was that it matters, against yours that it's irrelevant.
peakdecline@reddit
This discussion was very clearly centered around a specific comparison. It's you who's moved these goal posts.
And no, I'm trying to hold you to the ideology you want to profess. But you have shown repeatedly that you're happy to draw the line wherever it suits you. Convenient as always from those who argue along these lines. My stance is simple... none of this matters in any context that isn't taken to an absurd extreme, which is what you've done repeatedly to try to make your case. You're quibbling over a difference that doesn't matter to you in all reality, and if it did you'd, as I say, take your crusade beyond the lines that you're comfortable with.
nanonan@reddit
It's very likely you can downclock the 9800X3D to get similar efficiency and still have a bump in performance, so I don't really see the problem. You can now choose: efficient, stock or overclocked.
PastaPandaSimon@reddit
I've got no problems with the 9800x3d. My entire point was that efficiency matters to a lot of people. Against the poster I was responding to saying that it's not something anyone should care about.
But I can also say that the overwhelming majority will likely use the 9800x3d as is, with no changes to its stock behaviour with whatever Mobo they get.
Fromarine@reddit
The average price per kWh in the US is about 15 cents, meaning you'd have to run literally 100 hours of Blender per month to even get to a $1.50 difference. In the actual use case 99% of people buying this gaming CPU will have, GAMING, it's over 200 hours just to hit $2.
So no, unless you game under full load (not just playtime) for 12 hours a day, every day, you're not buying a coffee with your savings every month.
Also do you guys not have solar panels what?
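Rough sketch of that math if anyone wants to plug in their own rates (the power deltas are assumptions, not measured figures):

```python
# Reproducing the cost math above; the power deltas are rough assumptions.
price_per_kwh = 0.15      # USD, roughly the US average quoted above
blender_delta_w = 100     # assumed extra draw under an all-core load
gaming_delta_w = 67       # assumed extra draw while gaming (backed out from the $2 figure above)

def extra_cost(delta_w: float, hours: float) -> float:
    return delta_w / 1000 * hours * price_per_kwh

print(f"100 h of Blender: ${extra_cost(blender_delta_w, 100):.2f}")  # ~$1.50
print(f"200 h of gaming:  ${extra_cost(gaming_delta_w, 200):.2f}")   # ~$2.00
```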
UGH-ThatsAJackdaw@reddit
Some of us have different use case parameters than you. All my electricity comes from solar; I'm off grid. My mini PC also cares very much about thermals. Power efficiency is a deciding factor for me, and the difference in efficiency is probably going to make me stick with my 7800X3D. 11% gains in performance for 43% more power? No thanks.
peakdecline@reddit
If you're entirely off grid I'd say it matters even less. Unless you've severely under-specced your solar setup, this difference doesn't cost you anything and it's not enough to actually be an issue.
The "small case" argument matters some, but it's also not the issue multiple of you are making it out to be. And for the record my last... well, 10 years of PCs have all been mini-ITX.
Let alone the absurdity of why you were even considering the upgrade at all... you don't need an upgrade.
ProfessionalPrincipa@reddit
TPU tested on Windows 11 Professional 64-bit 23H2
INITMalcanis@reddit
Maybe. It's not just about spending a few extra £/$ a year to run the CPU (although Lord knows, that ain't getting any cheaper). It also means you need a more expensive PSU, a motherboard with higher spec VRMs, a bigger and more expensive cooler, more case fans, and for a lot of people, more money running the A/C in the room the PC is in.
The reaction started because Intel were cheerfully selling CPUs that sucked down 300W (and at that rate the power bills can start to add up a bit)
Fromarine@reddit
No, you still don't, because basically every AM5 motherboard that exists can handle an 8 core, and if you only specced your PSU to handle the lowest power CPU we've had in like a decade then you're just a moron lmao.
Also the 9800x3d is literally easier to cool than the 7800x3d so literally every point you made is moot
INITMalcanis@reddit
Did you read the second paragraph of the post you're replying to?
Fromarine@reddit
What does that even mean? It's not 300W, so what's your point?
peakdecline@reddit
This difference is not nearly enough to cause the shifts you're suggesting it does. All the motherboards you would remotely consider for any of these CPUs have more than enough VRM headroom. Same with PSUs (I mean really... what GPU are you even pairing this with to act like you're going to need a bigger PSU)...
This is precisely what I'm getting at. You're making this difference out to be a far more significant issue than it is in reality.
Sleepyjo2@reddit
Upwards of twice the power use depending on workload, 20-50% more in games, compared to a 7800x3d. It is a not insignificant drop in overall efficiency if that’s your concern. It wouldn’t blow out your bill but still.
Fromarine@reddit
The power use is still nothing compared to your GPU. You're acting ridiculously short-sighted with this argument.
Sleepyjo2@reddit
Different things have different power expectations. The 7800X3D offers more than enough performance for all but a miniscule niche of people and it does so with less power. Only problem with it is the price spiking over the past year or however long.
There are, however, GPUs with quite low power requirements. The top 3 GPUs on Steam all use less than 200W, and two of them are close to 100W. That's not what I have, but that's beside literally any point in this entire conversation.
Note that I didn't say the 9800 is bad, just that the 7800 would be an arguably better product in literal response to someone worried about power consumption.
nanonan@reddit
The 7800X3D is insanely efficient for a desktop part. The 9800X3D isn't being pushed hard at all, it's being pushed the typical amount. The 7 series X3D is an exception, being clocked slower and having overclocking disabled to keep temps under control. You can always run the 9800X3D at slower clocks if you want to trade performance for efficiency.
Sleepyjo2@reddit
By "being pushed hard" I mean "outside an efficient voltage curve". Not detrimentally hard or unusually hard. Most chips are past the most efficient part of that curve because it makes for better marketing.
Its power increases are typically double its performance increases for the same core count. I'd love to see some undervolting numbers but I don't think there are any reviews out there that touched on that. (There are some that touched on extreme OC, in which it puts out some wild numbers.)
The 7000 series in general, barring the X chips, are all very efficient. It, and the pricing, was the whole center point of discussion around the 9000 launch for a reason.
Presumably the non-X versions of the 9000 chips would also be sat comfortably on that curve too if they ever release any, but the 3D now has to be tinkered with if you want to be closer to its predecessor.
There's nothing *wrong* with that. Just, once again, pushing the power makes the marketing better. It's a good chip, but it's also notably more power-hungry in order to be what it is; this only really doesn't matter because it has no competition anyway.
Fantastic_Start_2856@reddit
Imagine buying a Bugatti and worrying about how much gas it consumes.
Seriously, what a stupid ass question
UGH-ThatsAJackdaw@reddit
Spoken like someone who doesn't pay their own way through the world.
NavinF@reddit
https://www.reddit.com/r/hardware/comments/1gkz6oq/amd_ryzen_7_9800x3d_review_an_actually_good/lvu7jxt/
zippopwnage@reddit
Yea, one costs 1 million dollars or more, and one is 500 euro.
One can be achieved by being a millionaire or billionaire and one by saving money for a few months or a year to build a new PC. If for you it's a stupid question, good. I have to worry about electricity costs, especially during the summer when my AC goes on almost all day long in 40C+ heat, on top of my wife's PC.
But sure, what a stupid question. I'm sorry I'm not as rich as you.
NavinF@reddit
If you can afford a $500 CPU, you can also afford to pay $30/yr in electricity. It's a relative comparison, not an absolute comparison to your personal wealth
SmashStrider@reddit
It's higher, but still far below other AMD non X3D and Intel CPUs in gaming. You will be fine.
laffer1@reddit
Some reviews show it slightly lower than the new Intel 285k or whatever it’s called. (With better performance of course)
Life-Duty-965@reddit
Presumably it only uses power when it needs it.
So if it's just idle on windows it's not doing a lot
But when you need power it will draw it
nanonan@reddit
The testing was in two games, 18:48 in the review.
Drakyry@reddit
You might wanna invest like 1 minute of your time into asking claude how much your appliances consume then
For reference the CPU's max power usage is 160 watts. That's the maximum; 99% of the time, even in gaming, it probably won't be using that much. Your kettle, when it's on, likely consumes about 2500 watts (that's 15 times more, if you're not into maths). That's just for comparison.
In general if your flat has a fridge, and like a washing machine, and maybe if you're really advanced an AC, then your PC would generally have a negligible impact on your power bills.
Mundashunda_@reddit
The power-to-fps ratio is actually better than the 7800X3D's since you get more frames proportional to the extra energy consumed
Atheist-Gods@reddit
It's still an AMD CPU with far better efficiency than Intel CPUs. It's just that it's no longer power limited and is in line with non-X3D parts.
ByGollie@reddit
$20-30 yearly compared to a 13700K
Individually, not that critical.
If you had a data centre full of high-power drawing chips that need extensive cooling in turn, then the bills would add up
Hence a lot of cloud providers experimenting with and evaluating custom ARM and RISC-V boards.
lysander478@reddit
Depends on where you live I guess, but AMD's main issue is high idle power consumption, as opposed to the power consumed while actually running, which tends to be in a better spot; and even then the cost of the idle consumption shouldn't be too huge.
Last I checked, something like a 7800X3D would end up costing me at most ~$20 more per year to run than a 13700K, since power is cheap right now for me. From what I'm seeing currently, it looks like the 9800X3D actually should have slightly lower idle consumption than the 7800X3D, and while its normal consumption is higher than the 7800X3D's, so is the performance, so it kind of becomes a question of whether it's completing the task and going back to idle faster too. Or for something like gaming, if you cap the performance to a similar level it shouldn't end up worse than the 7800X3D either. Looks like TPU doesn't do a v-sync test for CPU power efficiency to check for sure, but I imagine it shakes out like that at least.
No_Share6895@reddit
It's higher because it's boosting for longer and getting more work done
cookomputer@reddit
It's still top 2-3 with 7800x3d when it comes to fps/watt even with the slightly higher power draw
chaddledee@reddit
It's high only compared to 7800X3D, but it's still more efficient than a non-X3D chip, and miles ahead of Intel on efficiency.
cuttino_mowgli@reddit
So the flipped cache works!
INITMalcanis@reddit
>And sure, it does consume more power, but that's kinda expected considering the higher boost clocks.
And by recent standards it doesn't actually consume all that much power anyway. It's just that the 7800X3D is absurdly efficient. The 9800X3D consumes a similar amount to, e.g., a 5800X
a94ra@reddit
Tbf, Zen 5 performance is higher in productivity stuff. Sure, most of us gamers need gaming performance, but Zen 5 actually produces significantly higher performance in servers despite a bottleneck in cache. AMD probably think it's only a minor sacrifice in gaming performance anyway, and that they will unleash true gaming performance by slapping on some 3D cache
Mako2401@reddit
I have a 7800x 3d and have become a preacher of the gospel of AMD. Truly a marvelous product, reminds me of the 1080 ti.
xdamm777@reddit
TBH if I were to upgrade from my 11700K today I'd get the 7800X3D. Mighty efficient and more than fast enough for my 4K120 gaming, even with a crappy A620 motherboard, since it doesn't need robust power delivery.
11700K on Z590 has been rock solid since 2021 though so I’ll keep using it until I truly need an upgrade around Zen 7.
Euruzilys@reddit
Last year I upgraded from the 9-year-old i5 4690K to a 7800X3D. It was really worth it, and also unbottlenecked my 1080Ti lol.
xdamm777@reddit
That’s a mighty fine upgrade! 4th gen intel core were the goat though.
Euruzilys@reddit
I'm probably holding on to this 7800X3D until the last gen for AM5 platform.
Wanted to upgrade out of the i5 4690K for a few years, but didn't want the hassle of buying a new mobo/RAM/cooler. So I waited more years.
Then the 5800X3D showed itself to be legendary. And with the 7800X3D coming out last year I just went with it after reviews. I hope it will last me for at least 6 years! Or however long AMD decides to support the AM5 platform. I hope the mobo can work with the last AM5 CPU.
BobSacamano47@reddit
This is ridiculous. This cpu will be remembered.
Euruzilys@reddit
AMD has been cooking with X3D; pretty much all of the 5800X3D, 7800X3D, and 9800X3D are really good products!
ConsistencyWelder@reddit
I'm hoping everyone will have forgotten tomorrow, when I'll be trying to buy one :P
Nameless_Koala@reddit
this cpu killed Intel
milkasaurs@reddit
Well, I'm excited! Been wanting to upgrade out of my 13600k, so this looks like a good jumping point.
noiserr@reddit
This CPU is a love letter to the gamers. Especially like the frame per dollar chart.
A_Neaunimes@reddit
The intragen difference in gaming performance between the non-3D and 3D parts is really interesting from 7000 to 9000: the 7800X3D is +18% faster on their averaged results vs the 7700X (while at lower clocks), and the 9800X3D is +30% faster vs the 9700X (same clocks); that difference can't be explained by the relative clock increase alone.
Also the fact that the 9800X3D is noticeably faster in many nT workloads (Cinebench, Blender, Corona) than the 9700X despite being identical down to the frequencies, save for the extra cache.
Really points towards a bottleneck somewhere in the Zen5 uarch that 3D cache alleviates.
venfare64@reddit
iirc, someone said that the IOD is the suspect behind the lackluster Ryzen 9000 uplift compared to the 7000 series.
detectiveDollar@reddit
That explains why the Vcache was helping so much in workloads that were typically not cache sensitive like Cinebench. If the IOD is causing a memory bottleneck, the cache means the system doesn't have to pull from memory as often.
Also explains why Strix point's uplift was so much larger than desktop Zen 5, as Strix point is monolithic.
Rumors are that Zen 6 will be redesigning the IOD, so Zen 6 non-X3D uplift is going to be partially derived from that. In theory, AMD could redesign the IO die and launch it with Zen 5 on desktop, but I don't think they'll do it.
BlackenedGem@reddit
The big question really is whether or not the next gen IO die coincides with a platform change. There's some 'easy' wins for Zen 6 by redesigning the IO die and using N3E (probably N3P in actuality). But from AMD's perspective they'd prefer to do the IO die redesign with AM6 and DDR6.
Jeep-Eep@reddit
Eh, they may course correct considering they're talking about AM5 having an AM4-level lifespan, and they may steal Intel's dual-format idea as well...
BatteryPoweredFriend@reddit
There's still another option if AMD doesn't want to overhaul the IOD, at least for their 1×CCD variants, and that's to implement the wide GMI link layout like they already do for the low core-count Epycs. It would increase the number of IF lanes to the CCD, increasing its memory bandwidth.
Jeep-Eep@reddit
Yeah, but this suggests it may not be a choice.
INITMalcanis@reddit
Wendell from Level1Techs is banging this drum. It's one reason why - although I'm pleasantly surprised by the 9800X3D - I'm still holding out for Zen 6.
No_Share6895@reddit
man zen 6 with better IO die, cache on all 16+ cores... i may have to do it
INITMalcanis@reddit
And hey - if it's a flop, I can pick up a cheap 9800X3D!
lnkofDeath@reddit
it also indicates the 9950X3D could be incredible
porcinechoirmaster@reddit
I called this outcome a couple months back, even!
All of the core architectural changes for Zen 5 require the ability to keep the thing fed to benefit, and the IO die - which wasn't great for Zen 4 - was kept the same for Zen 5. That meant memory bandwidth and latency was going to be an even more pronounced bottleneck for desktop/game perf, ensuring that vanilla Zen 5 fell flat while Zen 5 X3D could really haul.
No_Share6895@reddit
Yeah, both teams launched with shitty IO this gen. It's just that one of them, AMD, is willing to put extra cache on to help alleviate it. Intel should have brought back L4 cache
A_Neaunimes@reddit
That’s also Steve’s hypothesis in this review.
CouncilorIrissa@reddit
Zen 5 is a much larger core. It's only natural that given the same memory subsystem it's much more memory bottlenecked than its predecessor.
HTwoN@reddit
nT uplift is due to higher power consumption. Efficiency is lower.
A_Neaunimes@reddit
Higher power usage alone doesn’t increase performance. In the video Steve notes that the 9700X and 9800X3D run at the same frequencies in nT workloads, at least that’s what I understand from this section. He explains the higher power draw by the extra cache.
I couldn’t find 9700X vs 9800X3D frequency validations in Techpowerup or GN’s reviews, though maybe other reviewers have done it that I’m not aware of.
HTwoN@reddit
Higher power did increase nT performance in the case of the 9700X.
yflhx@reddit
Yes it uses more power and has better performance... But it runs at the same clocks, so it's not as simple as you make it out to be.
HTwoN@reddit
Same boost clock or same stable clock? Those are 2 different things.
yflhx@reddit
Why though? It's best to compare stock to stock behaviour.
HTwoN@reddit
To show the nT gain wasn’t from the cache. At least not the vast majority of it.
cowoftheuniverse@reddit
Clock + power + some IPC and possibly something else, versus the 9700X's memory bottleneck caused by the IOD, and the 7800X3D maybe being somewhat power starved.
Aleblanco1987@reddit
IOD is fucked, that's why zen5 on server looks much better.
WarUltima@reddit
Higher boost clocks, due to higher power, are what's realizing the difference in benchmarks.
bctoy@reddit
And to think AMD still has the low-hanging fruit of going to a 16C CCD and improving the IO die, or maybe even doing a custom chip without it, along with CUDIMM at 10GHz+
noiserr@reddit
They are also a node behind the competition. Another low hanging fruit.
NeroClaudius199907@reddit
26.5% vs the 14900K? What the hell, that's super generational. X3D is too OP
misteryk@reddit
Shitting on intel might be fun but I hope they'll cook something next gen, I don't want another GPU market situation
Aggrokid@reddit
Intel still has far larger x86 market share overall, especially in prebuilts and laptops. To reach that GPU market situation, it would take many generations of landslide AMD wins.
SmashStrider@reddit
True. Even if the 9800X3D does sell like hotcakes (which it will), it's going to be a tiny dent to Intel's overall market share, as deals with OEMs and prebuilts are going to carry the bulk of Arrow Lake's sales. However, it still sends a message to Intel, a message from AMD that says, 'Hey Intel, I'm coming for you, and I'm coming for you FAST.'
peioeh@reddit
It's not just the gaming enthusiasts that are switching though https://www.tomshardware.com/pc-components/cpus/for-the-first-time-ever-amd-outsells-intel-in-the-datacenter-space
Intel is still a massive company and they can come back, AMD managed to do it with Ryzen after being pretty much useless for a really long time. But they really need to come up with something special because they're just losing more and more battles right now.
t3a-nano@reddit
As a cloud infra engineer, AMD is a no-brainer when selecting server type.
Even AWS's info page just says it's 10% cheaper for the same performance.
You can get further savings if you're willing to re-compile your stuff for ARM, but switching to AMD is as trivial as doing a find-and-replace (ie m6 becomes m6a).
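As a rough illustration of how trivial that swap can be in something like Terraform (the m6i/c6i/r6i to m6a/c6a/r6a mapping is an assumption; check which families your fleet actually runs):

```python
# Rewrite Intel instance families to their AMD siblings across a Terraform file.
# The family mapping below is an assumption; verify it against your own setup.
import re
from pathlib import Path

FAMILY_MAP = {"m6i": "m6a", "c6i": "c6a", "r6i": "r6a"}

def to_amd(text: str) -> str:
    # Only touch the family prefix of an instance type, e.g. "m6i.large" -> "m6a.large".
    pattern = re.compile(r"\b(" + "|".join(FAMILY_MAP) + r")(?=\.\w+)")
    return pattern.sub(lambda m: FAMILY_MAP[m.group(1)], text)

tf = Path("main.tf")  # hypothetical file holding the instance_type definitions
tf.write_text(to_amd(tf.read_text()))
```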
But AMD being "useless" was in part due to Intel pulling some illegal and anti-competitive shit (i.e., giving deep discounts to companies willing to be Intel-exclusive); they got fined over a billion dollars for that shit.
I'll admit I do have a strong AMD bias, investing in them in 2016 effectively got me my house in 2020 (As a millennial in Canada, so no easy feat).
But my bias was also out of bitterness towards Intel as an end-user. If you wanted more than 4 cores, feel free to pay a fortune for the special X99 motherboard, or even their need to change the damn socket every generation.
peioeh@reddit
It was definitely a great time for consumers when AMD came back with Ryzen. After 10 years of not even knowing what their CPUs were called (do you know a single person who used a Phenom chip? I don't), I was glad to go with them in 2019 and to pay a very reasonable price for a 6c/12t chip.
Which is why I hope Intel comes up with something. If AMD keeps dominating for 5-10 years they will also start resting on their laurels and offering less and less value to consumers. Just like Nvidia has been doing for too long now.
puffz0r@reddit
I used a phenom ;_;
Quantumkiwi@reddit
As someone working in HPC for a 3-letter acronym, every single one of our supporting systems (100s) in the last 2 years has had an AMD cpu.
The large clusters are a different story entirely and are about split in thirds between Nvidia ARM, Intel, and AMD.
olavk2@reddit
To be clear though, the AMD datacenter numbers are CPU + GPU while Intel's are iirc CPU only, so not really a good comparison
peioeh@reddit
Good point, although Intel also makes GPUs :D
amusha@reddit
Nova lake isn't coming out until 25-26 so it's a long time before Intel can respond. But yes, I hope they can cook something up.
Geddagod@reddit
I would imagine it's going to be late 2026. Intel usually launches products in Q3/Q4. I wonder if the situation is dire enough though that they just rush development as fast as they can and get a RKL like situation where they launch it in the middle of the year, but given the cost cutting Intel is doing, they might not even have that option.
AK-Brian@reddit
I find myself wondering if they have anyone internally who has attempted to get creative with multiple compute tiles on an Arrow Lake class part (similar to how an alleged dual compute tile Meteor Lake-S prototype was floating around).
It wouldn't provide any benefit for the enthusiast crowd, but could at least give them a pathway to a decisive multi-threading win. At this point they'd probably take what they can get.
ClearTacos@reddit
With how good Skymont seems to be, an all-ecore compute tile with loads of cores could be very compelling for some use cases.
jocnews@reddit
2026, not 2025-2026
BeefistPrime@reddit
I agree, but it does make me wonder how long their mindshare can keep them afloat. Like in 2-3 years, if they can't even compete with what AMD is offering now, will they still control 70-80% of the market because of their contracts with suppliers?
Danishmeat@reddit
I don't think they can bridge a 30% gap in gaming; that is further behind than Zen 1 was, and it took until Zen 3 and a struggling Intel for AMD to take the lead in gaming. Intel needs to rethink its strategy and offer superior value, as they simply cannot compete well on performance
Fritzkier@reddit
Agreed, tho it's not that bad for Intel. They're still leading in laptops against AMD and Qualcomm, and their GPU project still looks promising too.
SmashStrider@reddit
Mostly Agreed. I was quite hopeful of Arrow Lake, but it ultimately ended up failing. Again, competition is always good for the consumer, and we should hope that Intel can get their shit together as fast as possible.
But, as some may say, one should also maintain realistic expectations, and deliver criticism where criticism is due. And right now, Intel has been making a TON of questionable decisions, which is why they are getting so much hate to begin with. You can argue that they might be getting more hate than they should, but there is a reason for everything.
But who knows? Maybe Panther Lake, 18A and Nova Lake can reverse this downward trend Intel is in.
NeroClaudius199907@reddit
It's not possible. AMD will use 3nm and Intel 18A in the best-case scenario, and Intel still has no 3D cache technology. The best thing to do is just to focus on laptops and consolidate power with OEMs
No_Share6895@reddit
Heck, they may not even need 3D cache; bringing back L4 would be enough to make some of us at least happy
No_Share6895@reddit
High clocks, plus high IPC, plus thicc cache. Intel needs to bring back their L4 cache if they want a chance anymore.
BlackStar4@reddit
I like thicc cache and I cannot lie, you other brothers can't deny...
viti---@reddit
I don't mean to derail this thread but these lyrics never made sense
What if you genuinely didn't like big butts? Now you have this man claiming, falsely, that you do like them but you're lying to yourself and your friends and family.
What if you had nothing to deny?!
Thaeus@reddit
stop denying
viti---@reddit
Slander! Lies and accusations!
I will not go down with this big butt ship!
OmicronNine@reddit
That's the best part, you don't have to!
She boyent. 😁
arguing_with_trauma@reddit
stop lying little bro
pmjm@reddit
A lot of simps won't like this song.
dragenn@reddit
My AM5 Don't... Want... None... unless you got cache hun!!!
Bonzey2416@reddit
Intel had L4 cache in i5-5675C and i7-5775C, which were great for gaming in 2015.
Onceforlife@reddit
What was the last gaming cpu from intel that had the L4 cache?
No_Share6895@reddit
broadwell
https://www.anandtech.com/show/16195/a-broadwell-retrospective-review-in-2020-is-edram-still-worth-it
and look how well it made that hold up.
Raikaru@reddit
That didn’t really make it hold up well though? Anandtech just doesn’t use fast ram
that_70_show_fan@reddit
They always use the speeds that are officially supported.
Stingray88@reddit
Broadwell, 10 years ago
INITMalcanis@reddit
It'll take more than that tbh. They'll have to do something like quad channel memory.
Fat_Sow@reddit
Cache is king
JensensJohnson@reddit
It comes down to which games the reviewers choose to benchmark with. If you pick enough games that like the X3D cache you'll see big gains; otherwise they'll be less impressive
Geddagod@reddit
That's like 2 generations of a lead AMD has in gaming pretty much tbh.
puffz0r@reddit
With Intel's generations that's like 5 generations of lead
OwlProper1145@reddit
9800X3D being able to maintain high clock speed helps a lot.
polako123@reddit
I'm swapping it in to replace the 7700X on my B650 board, and I'm probably good for 5 years.
fatso486@reddit
*15
CatsAndCapybaras@reddit
With how video cards have been going, I fear you may be correct.
desijatt13@reddit
This is the one and only CPU one should buy for gaming. There is no doubt anymore. RIP Intel.
TalkWithYourWallet@reddit
This is wrong, not everyone needs a $450 CPU for a gaming PC. It depends on the total budget
There are plenty of other options for different GPU performance tiers. Such as the 12400F/5600 and 7500F/7600
DiCePWNeD@reddit
As always, it depends. But seriously, I'd rather buy this and get a 4080 vs a 7500f and 4090 which will be overkill for a 1440p/165hz display
WealthyMarmot@reddit
definitely true. Though I think these results are so good that a lot of used 7800X3Ds might become available at prices affordable to mid-market gamers, and that’s still an elite gaming chip.
desijatt13@reddit
Why would one look at this CPU if it is out of their budget? What I meant is, even if you have an infinite budget and you only want to game, then there is nothing better.
basil_elton@reddit
If you have infinite budget, you buy a 5090 and pair it with whatever CPU can keep it from being underutilized for the games you play.
desijatt13@reddit
If I have infinite budget, no matter what GPU I get I would get this CPU if I only want to game and nothing else.
basil_elton@reddit
Game with a $2000 GPU at 1080p with avg FPS in the hundreds while still having higher system latency than using a higher resolution for better image quality and capping the frame rate below your monitor's refresh rate.
desijatt13@reddit
I do not understand if you are agreeing with me or not. I am not getting your argument.
basil_elton@reddit
The point of having a fast 'gaming' CPU is not to deliver the most average FPS but keeping the render queue fed.
Which basically means that you cannot have frame-to-frame latency shorter than your render queue.
I just tested this out on a lowly laptop with a quad-core i7 and an MX450 using the Half-Life 2 Lost Coast benchmark.
Everything low/disabled except multi-core rendering at 1080p:
desijatt13@reddit
Thanks for the information. There is much more to a gaming CPU than just Avg FPS but is there any other CPU that is faster than this in keeping the render queue fed?
basil_elton@reddit
The render queue depends on the GPU and graphical settings, including resolution.
That's why the usual way of benchmarking CPU gaming performance tells fuck-all about your actual in-game experience.
desijatt13@reddit
I think these reviews show that when there is no GPU bottleneck, i.e. when newer GPUs release with better performance and cause less of a bottleneck at the same settings and resolution as the previous generation, this CPU will not be the bottleneck.
basil_elton@reddit
The bottleneck has to do with the CPU filling up the render queue at a rate faster than the GPU can execute each frame within the queue - GPU Bottleneck.
Or the GPU being so fast that it doesn't matter how long the render queue is because the time taken by the CPU to process a frame is multiple times the length of the render queue - CPU bottleneck.
It is not about faster, more powerful GPUs rendering the game at a higher average FPS.
There exist tools which measure these things nowadays. But reviewers use the same old stale methodology for game benchmarks.
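A toy model of that point, with completely made-up frame times, just to show why the slower of the two per-frame costs sets the frame rate:

```python
# Toy model of the argument: with a bounded render queue, steady-state frame
# rate is set by whichever of the CPU or GPU is slower per frame; the queue
# mostly adds latency, not throughput. Timings are made-up illustrative numbers.
def steady_state_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

scenarios = {
    "GPU-bound (native 4K)": (4.0, 16.0),   # (CPU ms/frame, GPU ms/frame)
    "CPU-bound (1080p low)": (4.0, 2.5),
}

for name, (cpu_ms, gpu_ms) in scenarios.items():
    limiter = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{name}: {steady_state_fps(cpu_ms, gpu_ms):.0f} fps, {limiter}-limited")
```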
desijatt13@reddit
So what are you trying to say? Is this CPU too fast for any current GPU in filling the render queue? Or is it too slow compared to the GPU?
I tried understanding how the render queue works, and what I can understand is that how the render pipeline is filled depends on the CPU, the GPU and the software. So how can one tool test general performance?
TalkWithYourWallet@reddit
When comments such as the below say:
desijatt13@reddit
I will try to be as clear as possible next time.
Kiriima@reddit
If you play AAA games in 4K then staying on the AM4 platform, buying a 5700X3D and just pouring everything into a GPU is what you should do.
virgnar@reddit
Unfortunately for those wanting to play Monster Hunter Wilds, this looks to be the only viable CPU to own.
344dead@reddit
I think it depends on what type of gaming you do. I mainly do 4X, colony builders, city builders, grand strategy, etc. This is going to be a great upgrade for me from my 5800X. Stellaris is about to get bigger. 😂
NeroClaudius199907@reddit
If you can afford a 7500F you can afford a 4090... I mean 9800X3D
Brawndo_or_Water@reddit
Good thing we don't all only game in 1080P.
desijatt13@reddit
Is there any better gaming CPU at 4k?
Brawndo_or_Water@reddit
I don't want to lose all the productivity power just for a small gain in gaming. I would be jumping all over that CPU from my 13900KS if it was a bit better at everything else, but I understand that for people who only game, this is the one right now if you are due for an upgrade.
desijatt13@reddit
Okay but I was never referring to your use case. I was talking about only gaming. If you were building for pure gaming and nothing else then would you still choose something over this?
szczszqweqwe@reddit
It's the best, but not the only option; you wouldn't put a $480 CPU in a $1000 PC, right?
baskinmygreatness@reddit
Some of us game at 4K where the CPU barely matters and can use higher multicore performance for productivity, but don't let me interrupt your circle jerk
desijatt13@reddit
I explicitly said only gaming and nothing else. But go with whatever makes you happy.
scytheavatar@reddit
Can someone explain to me why AMD has a habit of cherrypicking and overpromising when they have a bad product but sandbag and underpromise when the product is actually good?
RedditorWithRizz@reddit
Marketing
deh707@reddit
Let's say I have 2 PCs.
PC A) 7600X + 32gb ddr5 + Rtx 4090
PC B) 7600X + 32gb ddr5 + Rtx 4080S
If I upgrade PC B's CPU to the 9800X3D, would it fall short of, match, or exceed the gaming performance of PC A?
RedditorWithRizz@reddit
At 1080p you would be CPU bottlenecked cuz of the 7600X, so the 9800X3D + 4080S would more or less perform better than PC A (1% lows and avg framerate)
On higher res, PC A takes the cake
Substantial_Lie8266@reddit
Slower than my 14900K in Windows 10. For example you lose 70fps in Cyberpunk 2077 by going from Windows 10 to Windows 11 24H2. There is something seriously broken with Windows 11 on Intel CPUs.
RedditorWithRizz@reddit
Something wrong with your windows installation
Fresh-Ad3834@reddit
LUL 14900K
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hey NoStructure5034, your comment has been removed because it is not a trustworthy benchmark website. Consider using another website instead.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hey Yebi, your comment has been removed because it is not a trustworthy benchmark website. Consider using another website instead.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
b-maacc@reddit
Username checks out.
etfvidal@reddit
Does AMD even need a marketing/sales team to sell this CPU?
0gopog0@reddit
Yes because mindshare and brand recognition is a hell of a drug
COMPUTER1313@reddit
"Nobody ever got fired for buying IBM" was very much true until IBM themselves threw in the towel for the hardware market.
A coworker told me about how their previous company used to be an IBM-only shop until they literally couldn't buy new IBM hardware (Token Ring network cards and cables to match Gigabit Ethernet, laptops, etc).
0gopog0@reddit
True, though IBM's circumstances were a bit different.
Even with the problems, in my eyes Intel is still in a better spot (on the CPU front) than AMD was during the worst of Bulldozer. Between AMD not being a great company to work with according to some OEMs and shipping a smaller number of chips, Lunar Lake being decent enough to stave off any inroads from Qualcomm, and Intel being in a stronger market share position, they can still stumble for a while yet before I'd say they were out of it.
Zacisblack@reddit
A really bad day for things that are Blue.
Qaxar@reddit
As some reviewers have noted, this chip proves how Zen 5 is hamstrung by its I/O die. AMD could release a Zen 5+ with no change other than the I/O die and it would result in a great uplift. They could do that next year and push desktop Zen 6 to 2026. It's not like Intel is catching up anyway.
desijatt13@reddit
These reviews have shown, with the uplift of the 9800X3D over the 7800X3D, that Zen 5 has huge potential and is held back by maybe the I/O die or something else that we are not sure about. If AMD puts 3D V-Cache on both dies of the 9950X3D maybe we will get a true monster in gaming and productivity. Maybe 15-20% better than the 7950X3D in productivity and similar to the 9800X3D in gaming. One can only hope.
IJNShiroyuki@reddit
How are they going to name it? 9950X6D?
szczszqweqwe@reddit
Yup, they got my hopes high for ZEN6.
Beautiful-Active2727@reddit
I think this will happen only on zen6 with new packaging and IOD
AnthMosk@reddit
:-( When will I be able to afford this?! Will we ever see it sub-$400 in the next 6-12 months?
CatsAndCapybaras@reddit
Likely. The 7800X3D was top for gaming until this, and it fell from $450 to ~$300. I bought one at $350 in January.
Even though it doesn't really have competition in gaming, the $480 gaming CPU market is only so big. They will have to drop the price after that market is tapped.
RedTuesdayMusic@reddit
There's also those of us in that market waiting for the 9950X3D
WealthyMarmot@reddit
what do you have right now? Secondhand 7800x3ds may become a legitimate option if enough high-end folks upgrade
AnthMosk@reddit
2080 Ti and 8700K, so I would need a complete system overhaul
nanonan@reddit
This isn't going down in price any time soon, so when you learn to save.
SJEPA@reddit
It won't be sub 400 for a while. This thing is going to sell really well as there's literally no competition.
veryjerry0@reddit
Although others have cited what has happened to x3D chips historically, I think this one is quite a bit different since AMD is clearly in the lead thanks to 24H2 improvements and actual hardware wins. It even has much better production capability this time. If it sells like a hot cake, which is likely, I don't see them lowering the price.
conquer69@reddit
I don't think so. There is no cheap 7800x3d stock anymore.
No_Share6895@reddit
most likely. probably within 6
AnthMosk@reddit
Fingers crossed
PiousPontificator@reddit
I don't think you should be concerning yourself with this purchase if $80 is what makes or breaks being able to purchase it.
Darkomax@reddit
I would have said yes if AMD wasn't now 2 generations ahead of Intel in gaming (or rather Intel went back one gen), with absolutely no contest. Idk if 3D chips will drop in price anytime soon, or as low as they used to.
TrantaLocked@reddit
TechPowerUp found a 65W average / 75W with PBO for their gaming tests, yet we're seeing about 90W for gaming here. Why the discrepancy?
1234VICE@reddit
Looks like most gains vs the 7800X3D could be explained by higher clock frequencies, enabled by an improved thermal design and an increased power budget.
nanonan@reddit
Seems it is also allowing the strong >10% IPC uplift seen in other workloads to be effective in games.
lintstah1337@reddit
Is the performance uplift from 7800X3D due to the 200MHz higher boost clock? If so could you get the same performance if you overclock 7800X3D with a mobo with external clock generator?
autumn-morning-2085@reddit
No it isn't, the cache just allows the Zen 5 cores to express its ~12% IPC gain. Ofc a better IO die would likely improve things even further.
nanonan@reddit
Will be interesting to see, but yeah I suspect at equal clocks there will still be a strong advantage.
detectiveDollar@reddit
Iirc Zen 6 is rumored to redesign the IO die, so that will give an uplift next time too.
TheAgentOfTheNine@reddit
ipc uplift too. 200MHz is less than 5% increase in performance.
lintstah1337@reddit
It turns out 9800X3D actually has 400MHz higher sustained max boost clock than 7800X3D.
https://www.youtube.com/watch?v=s-lFgbzU3LY&t=367s
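Back-of-envelope on how much of the gaming gain clocks alone could explain (the sustained clock below is an assumption, and the ~11% figure is the average uplift quoted elsewhere in this thread):

```python
# How much of the gaming gain can frequency alone explain?
# The sustained clock below is an assumption, not a measured figure.
assumed_7800x3d_clock_ghz = 4.8

for delta_mhz in (200, 400):
    gain = (delta_mhz / 1000) / assumed_7800x3d_clock_ghz
    print(f"+{delta_mhz} MHz on {assumed_7800x3d_clock_ghz} GHz: {gain:.1%}")

# Either figure (~4% or ~8%) falls short of the ~11% average gaming uplift
# quoted elsewhere in the thread, so clocks alone don't account for all of it.
```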
Lenininy@reddit
Worth the upgrade on 4k? I get why the benchmarking process uses 1080p for isolating the performance of the cpu, but practically speaking for 4k, what is the uplift vis a vis 7800x3d?
nanonan@reddit
Only in specific cases, likely MSFS will benefit for instance.
Framed-Photo@reddit
No.
Slafs@reddit
Are you actually playing at native 4K though? Many people who have a 4K display, myself included, use a lot of upscaling, so while it isn't exactly 1080p it's closer to 1080p than 4K.
EnsoZero@reddit
Better to save up money for a GPU upgrade than it is to upgrade CPU at 4k, and even for most 1440p titles on max settings.
Qaxar@reddit
If you plan to get an RTX 5000 series card then it's absolutely worth it, since the GPU will be less of a bottleneck and CPU performance will have a much larger impact.
funny_lyfe@reddit
At 4k you could probably get by with a 9700x and not feel that much of a dip.
Only_Marzipan@reddit
Not at all worth.
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
inyue@reddit
My 12700K at 89% at 1440p while paired with a $1600 GPU... I guess I'm fine with my 4070 Ti, right?
Z3r0sama2017@reddit
Depends on the game. You a generalist? 7800x3d good enough. You play lots of sims that hammer cpu even @4k? 9800x3d no brainer.
RainyDay111@reddit
According to techpowerup at 4K with a RTX 4090 it's 0.3% faster than 7800X3D and 2.1% faster than 5800X3D https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
baron643@reddit
not worth the money
Ravere@reddit (OP)
I LOVE how he isn't just not standing, he is lying down on the sofa!
nanonan@reddit
Gonna need a hammock for the 9950X3D.
From-UoM@reddit
Excellent gains vs 7800x3D
One minor gripe is ~50% additional power usage. Which makes it less efficient than the 7800x3d. Still far below anything intel has
nanonan@reddit
Put on Eco mode if you want lower power.
Aussie_Butt@reddit
Excuse my ignorance, but are there any gains to be had vs the 7800x3d at higher resolutions? Or is just pretty much negligible due to GPU bottlenecks? Every review I see is only looking at 1080p differences.
WealthyMarmot@reddit
The 1% lows are significantly better in many games, which means a noticeably smoother experience in practice and IMO is sometimes even more important than average FPS. Even seriously GPU-bound games like Cyberpunk at 4k often suffer from intermittent CPU bottlenecks that show up as microstutter, so players at all resolutions should see benefits.
Shrike79@reddit
Yes there are gains.
When reviewers do 4K native benchmarks they usually put everything on max/ultra settings to isolate GPU performance as much as possible. Turn it down a notch to high or use optimized settings and you shift that burden to the CPU, increasing fps.
Then there's upscaling which makes the game render at a lower internal resolution, shifting even more of the burden to the cpu.
In short, having an underpowered cpu because "it doesn't matter at 4k" means that you'll remain bottlenecked even as you turn down settings and use upscaling.
AK-Brian@reddit
It's also highly dependent on the game in question, something which seems to escape an alarming number of people. If you're playing iRacing, ARMA 3, Squad, DayZ, GW2, Baldur's Gate 3, Anno 1800, Civilization, Stellaris, FF XIV, WoW, MSFS 2020/2024, Tarkov or a number of other titles, the CPU is almost always the limiting factor - even at 4K. The larger cache does wonders for spaghetti code engines. In the case of something like iRacing or MSFS, a faster CPU also allows you to maintain greater detail for things like mirror view or multi-viewport rendering for VR or surround setups.
It's fair to state that a faster CPU generally won't help most people at higher resolutions, I just wish people would ask the important question when it comes up - "What are you wanting to run?" :)
Shrike79@reddit
Sure, if you're on a budget or are building something to run a specific title you should get what makes sense for what you want to do. Generally speaking though if someone just wants the best performance possible for a variety of current and future releases then you can't really go wrong with a 7800x3d or 9800x3d.
I mean even the 5800x3d is still doing really well, especially on titles that really love v-cache.
mauri9998@reddit
No there are not. I genuinely don't know why this subreddit focuses so hard on gaming benchmarks for CPUs when the bottleneck is gonna be the GPU in 90% of situations.
Wild_Fire2@reddit
Reviewers who use gaming benchmarks typically set the graphics to the lowest setting @ 1080p, removing the potential for GPU bottleneck.
I know that LTT's review has 1080p low settings for their review, along with running a few benchmarks with graphics maxed out at 1440p and 4k. As expected, the 1080p low test shows big differences with the CPUs, while the 1440p and 4k tests reveal little difference between having a 5800x3d, 7800x3d or 9800x3d.
CatsAndCapybaras@reddit
Depends on what games you play. For the vast majority, you will be GPU bound at 4k. There are some games that are CPU bound even there, but very few.
From-UoM@reddit
Close to no gains at 4K as you are GPU bound in many games.
The rtx 5090 might have something to say though lol
Aussie_Butt@reddit
lol yeah true, thanks for the reply
cookomputer@reddit
How are the temps? Does it run hotter since it's using more power
WealthyMarmot@reddit
It uses more power, but it has a much easier time of actually getting that heat to the heatsink now. Even an average cooler should be able to handle the increase without even bumping the fan speeds much.
ManWalkingDownReddit@reddit
They've shifted the cache from the top to below the cores, so the heatsink is in direct contact with the die and it runs about the same
Wild_Fire2@reddit
It runs cooler, actually. At least, that's what the LTT review showed.
FuzzyApe@reddit
Much cooler. Der8auer's review shows improvements of around 20 degrees Celsius. It has excellent temperatures
ffpeanut15@reddit
It runs even cooler than Zen4 now. The new cache design makes it much easier to cool, even at higher power usage
SmashStrider@reddit
Power usage isn't too big of a problem. It's still well below most parts, and it has a good generational gain. It was to be expected though, since it did increase clocks mainly, and Zen 5 isn't much more efficient than Zen 4 in gaming, if not the same efficiency.
ATangK@reddit
Definitely not a big problem when you consider intel exists. And that these are desktop systems at the end of the day.
SmashStrider@reddit
Exactly. Power consumption isn't really a problem at all in desktops unless it's like more than 50-100W higher. It's likely not going to add all that much to your electricity bill. Power consumption more so matters in Mobile and Servers. In desktop, power consumption should be used as a metric for judging how good an architecture is.
detectiveDollar@reddit
It's mainly because the Gen1 3D cache forced them to use more efficient voltage/clock targets since the structural silicon sat on top of the cores.
You can dial this one's clocks back and get a more efficient part than the 7800X3D if you want.
WarUltima@reddit
The efficiency still beats Intel alternatives. So I wouldn't call it bad.
TheForgottenOne69@reddit
I've not followed AMD that much, but with the rumors and the uplift this processor has compared to the non-X3D variant, would a 9950X3D with 3D cache on both CCDs finally increase performance compared to this one?
Weary-Perception259@reddit
11% gains over 7800x3d is decent. Price is a bit painful though. The real winners are those who got a 7800x3d for £300 last summer.
50% more money now for 11% more frames isn’t looking so hot.
WealthyMarmot@reddit
I think the gains in 1% and 0.1% lows are more meaningful. If the 7800X3D has a weakness at all, it's the occasional microstutter in some games when the cache couldn't make up for the low clock speed. So that's what I'm more excited about.
gobaers@reddit
The price performance ratio isn't linear, it's exponential. There's always a premium to climb to the higher tiers.
But you're totally right: that $330 7800x3D is looking really sweet.
Firefox72@reddit
A complete stomp across the board.
retiredwindowcleaner@reddit
I hope they can use this momentum to do similar stomping of Nvidia now. And I don't mean in the AI/DL sector, but for gaming at least.
Although afaik the fastest supercomputer runs on tens of thousands of Radeon Instincts actually...
Artoriuz@reddit
AMD GPUs aren't bad for compute, their software ecosystem just can't match Nvidia's.
retiredwindowcleaner@reddit
As I said... the fastest cluster computer on earth currently runs on AMD GPUs.
I think the software ecosystem is actually only really relevant for the SOHO sector, since real heavy-load compute mostly has a full framework written specifically for the workloads.
Difficult-Alarm-3895@reddit
I think you're looking at outdated stuff. Elon has built a supercluster recently (xAI Colossus) with 100000 H100s, which should be at least double the FLOPS of the current Radeon cluster with only 36500 GPUs
SirActionhaHAA@reddit
u/eight_ender, u/yflhx, u/battler624
9800x3%d bros where we at? This didn't meet the 3% expectations.
yflhx@reddit
I never said it's going to be just 3% faster, if you had any decency to quote me instead of implying I said something I didn't.
That being said, I did indeed say it's unrealistic to expect 10% or more, and obviously I was wrong.
Woodworkingbeginner@reddit
Damn, that is a nice release. It's always nice to see a good product raise the bar.
Snobby_Grifter@reddit
Intel engineers should be embarrassed. This is the third generation of 3D cache, and the largest gaming deficit between competitors since Skylake vs Bulldozer. I wouldn't even take a 285K for free at this point.
tangosmango@reddit
Any reason to upgrade now from a 7700X? I was initially holding off to upgrade to the 5090 and the 9900X3D or 9950X3D.
I'm running AW3423DWx3 so I'm not sure if the 9800x3D will even benefit me all that much.
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hey detectiveDollar, your comment has been removed because it is not a trustworthy benchmark website. Consider using another website instead.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
LordMohid@reddit
Intel was only down before; now they need to start digging their grave, if they haven't already.
Roseking@reddit
I am going to have to go complete zen mode to not impulse buy this.
This is a slaughter.
SJEPA@reddit
Why go Zen mode when you can go Ryzen mode? 😏
LightShadow@reddit
How can I justify the 7950X3D -> 9950X3D for work...all that sweet sweet "productivity."
letsgoiowa@reddit
You'd be going complete Zen mode either way :P
Roseking@reddit
Genuinely unintentional.
It's a sign.
stesha83@reddit
Argh, no 4K benchmarks. I need to know if it's worth upgrading my 5800X when I mostly play at 4K native (occasionally DLSS Quality), because it entails upgrading the motherboard and RAM too.
oup59@reddit
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
stesha83@reddit
Thanks, that's awesome. And it confirms it's probably not worth the upgrade yet, unless I come across an extremely CPU-bound game or I plan to play everything at DLSS Performance (1080p-ish).
oup59@reddit
Looks that way. I am building a new rig for 4K and honestly a 7600X/9600X would just work for me, but I want this haha :)
stesha83@reddit
Nice. Just wondering why my comment saying "no 4K benchmarks" is being downvoted when the video has no 4K benchmarks in it, lol. Reddit: downvoting reality.
Paulpanzer32@reddit
Just check your GPU usage once in a while in game. If it's dropping below ~90-95%, you're likely bottlenecked by the CPU; otherwise I wouldn't worry about it!
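A minimal sketch of that check, assuming an NVIDIA GPU with nvidia-smi on the PATH (poll utilization and flag dips below ~90%):

```python
# Poll GPU utilization via nvidia-smi; sustained readings well below ~90%
# while gaming suggest a CPU bottleneck rather than a GPU one.
import subprocess
import time

def gpu_utilization() -> int:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=utilization.gpu",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().strip().splitlines()[0])  # first GPU only

while True:
    util = gpu_utilization()
    flag = "  <- possible CPU bottleneck" if util < 90 else ""
    print(f"GPU utilization: {util}%{flag}")
    time.sleep(5)
```

In practice an overlay like MSI Afterburner/RTSS shows the same number in-game without any scripting.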
wizfactor@reddit
The numbers don’t lie:
Crocodile Dundee cache layout is the best layout.
AK-Brian@reddit
Reverse 3D V-Cache. The Thunda Down Unda.
broken917@reddit
Wow... that nearly 30% lead over the 14900K means Intel will probably need two generations to beat this one.
Danishmeat@reddit
And that’s if AMD stands still, which they probably won’t do
Brawndo_or_Water@reddit
A version of the X3D, perhaps. At 1440p and 4K the 14900K is not that far behind, and it's better at everything other than gaming.
ConsistencyWelder@reddit
They need to stop regressing in performance first. That should be step 1.
broken917@reddit
Yeah, I should have said two actually good generations.
Silver-Substance-692@reddit
All these reviews are ridiculous to me. I don't care about the frames obtained at 1080p; even a 13600K won't bottleneck the new 5090, so everything they say is worth zero to me. I'm interested in how stable it is, and most importantly how compatible it is with Nvidia cards, because I suspect that if it dominates the market, things will not be too good for those with Nvidia cards, and at the moment I don't even want to hear about AMD video cards.
So think carefully and take these aspects into consideration.
Kurtisdede@reddit
...why?
shitrod@reddit
This looks amazing but I'm gonna hold out with my 5950x for as long as I possibly can.
Ploddit@reddit
Well, good to know buying RAM faster than 6000 is completely pointless.
MobiusTech@reddit
Amd fuckin killed it… holy shit.
SmashStrider@reddit
Killed Intel? More like bulldozed through them (pun intended)
AveryLazyCovfefe@reddit
Makes the arrow they took to their knee look just fine.
ConsistencyWelder@reddit
Makes the memory of 13th and 14th gen high-end CPUs degrade a little.
ADtotheHD@reddit
Can't wait to see if they do X3D cache on both CCDs of the Ryzen 9 versions.
ConsistencyWelder@reddit
They say they're going to bring V-Cache to Threadripper soon, and we know they're not just gonna put it on one CCD...
Fixer9-11@reddit
Well, Steve is sitting comfortably and not standing so I know that it's gonna be good.
szczszqweqwe@reddit
He is just playing with us at this point.
ConsistencyWelder@reddit
And that couch he's reclining on was probably a hassle to get into his studio. Worth it though, it's a funny gag.
AK-Brian@reddit
He's earned a good, relaxing stretch.
lnkofDeath@reddit
9950X3D looks to be an incredible product
InAnimaginaryPlace@reddit
Do we know what time these get listed? Or is it just a matter of being around tomorrow at the right moment?
Omniwar@reddit
Newegg is 6am Pacific tomorrow, would assume it's the same at the other retailers. Doesn't mean someone won't jump the gun and list them at midnight though.
InAnimaginaryPlace@reddit
Great, thank you. That's really helpful.
detectiveDollar@reddit
Usually the review embargo is 24 hours before the launch, so probably 9AM
InAnimaginaryPlace@reddit
Thanks, that's helpful.
bimm3ric@reddit
I wish you could just pre-order. I've got a new AM5 build ready to go, so I'm hoping I can get an order in tomorrow.
VanWesley@reddit
Now to wait for that Microcenter bundle to drop.
conquer69@reddit
Steve is more excited about this than I am. It's less power efficient when unlocked like this, and the performance gains aren't near those of previous Zen generations.
DeathDexoys@reddit
Intel slaughtered, bulldozed, destroyed, and straight-up stomped in gaming.
Amazing results, and the 12- and 16-core parts might be something to look forward to.
WTFAnimations@reddit
Just gonna stick this in here...
ResponsibleJudge3172@reddit
Well, well, well, X3D deserves to be called 2nd gen this time.
karatekid430@reddit
AMD has been making good stuff for a while now. Intel on the other hand....
Mas_Turbesi@reddit
Pretty good, but I'm gonna keep my 7800X3D till AM5 EOL.
oup59@reddit
I think I don't need this for my new 4K gaming rig, but I may just deploy it with an X870E and forget about it for 4-5 years.
szczszqweqwe@reddit
8% crowd, where are you guys?
Beautiful-Active2727@reddit
Zen 6 is looking even more interesting now, since AMD said it will use new packaging and a new IOD (maybe an 8+16-core part as the best gaming and productivity CPU).
nismotigerwvu@reddit
I think this bodes well for future Zen generations. It shows both just how much the changes in Zen 5 raised the performance ceiling and, just as importantly, where they're bottlenecked.
chown-root@reddit
I'm never going to be able to buy one of these. Damn it. Too good. Bullshit.
tvtb@reddit
I’m glad we have a strong product coming from AMD. I would also very much like it, for the USA’s national security, for Intel to remain a competitive chip designer AND fabricator. Intel is circling the drain right now and they need to right the ship. Or we’re gonna be beholden to TSMC which is going to be controlled by China probably within the next 4 years.
el_pinata@reddit
You served well, 5800X3D, but it's time for the new shit.
throwawayerectpenis@reddit
Holy shit, the madmen at AMD actually did it 😲
elbobo19@reddit
Finally, a good piece of hardware this year. Also really curious to see what the 9900X3D and 9950X3D can do.
DeeJayDelicious@reddit
Happy HUB?
What year is it?
bushwickhero@reddit
Can’t wait to upgrade from my 9600k early next year.
Mordho@reddit
I don’t even want to think about how expensive the 9950x3D is going to be 😭
TopdeckIsSkill@reddit
Great product, but I think I'll just upgrade my 3600 to the 5700X3D that costs €220, since I'll only play at 4K.
The difference should be 5% at most.
No_Share6895@reddit
Holy shit... amd fuckin killed it.
Zerasad@reddit
I'm willing to eat my words here. I expected another flop, but somehow AMD pulled it off. Hats off.