Reminder that CPU benchmarks are run at 1080p low in order to remove GPU bottlenecks. Make sure to look up your "real world" benchmarks as well.
Posted by dealerofbananas@reddit | hardware | 127 comments
9800X3D crushes the 5800X3D across the board in pretty much every benchmark.
Does that mean an AM4 user should look to upgrade? Probably not, since if you bump the resolution up to 1440p ultra, or even 4K, the performance differences are minuscule and you'd be much better off just saving for a next-gen GPU.
A graph saying +20% across the board will not translate to anything close to that in real-world usage for the majority of users.
There are other reasons to upgrade, such as productivity, "future proofing," better thermals, etc., so take all of those into account. Do not fall for performance hype when, in most situations, the gains will be negligible at best for the target audience of a high-end CPU (high resolutions, ultra settings).
Onceforlife@reddit
Man, I play at 4K 120Hz and I was doing it with a 4090 and a 12700K; no games I played were limited by the CPU, it was either the GPU or the refresh rate of my monitor (LG C4 42-inch). But some dumbfucks just can't understand that. Maybe in the games you play I'd be bottlenecked, but I was not bottlenecked in my games with the settings I use. Why is it so hard to understand??? Everyone jumps on the keyboard and wants to tell me I'm an idiot. At least now I know some people, at least in this sub, understand.
I've since upgraded to a 7800X3D because it was a good deal on AliExpress back in May or so. I don't think a 5090 will be bottlenecked by it unless I switch to 4K 240Hz. Even then I doubt it, because I'll be playing the latest games at 4K ultra without DLSS, which will shred the GPU to pieces.
Framed-Photo@reddit
For reference, the 12700K performs within 4 FPS on average of the 9800X3D at 4K. Unless folks are literally only playing strategy games or other gigantic outliers in terms of how CPU-limited they are, upgrading from the 12700K just for gaming at 4K doesn't make much sense.
Even when new GPUs come out, one would have to be so much faster than the 4090 that it's at least a few generations away.
TechPowerUp's 720p average for the 9800X3D with a 4090 is 231 FPS; at 1080p it's 200, and at 4K it's 101. We'd need to see at least a 50% boost over the 4090 across the board, probably more, to start seeing significant CPU bottlenecking at 4K. And even at 720p the 5800X3D is just 22% behind, at 182 on average.
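To put numbers on that (a back-of-the-envelope sketch; the fps figures are the TPU averages above, and the linear-scaling assumption is mine):

```python
# Rough CPU-headroom check using TechPowerUp's averages quoted above.
# Assumption (mine): 4K fps scales linearly with GPU speed until it
# hits the CPU-bound ceiling measured at 720p.
cpu_ceiling_fps = 231   # 9800X3D average at 720p (CPU-bound)
fps_at_4k = 101         # 9800X3D average at 4K (GPU-bound)

required_gpu_speedup = cpu_ceiling_fps / fps_at_4k
print(f"~{required_gpu_speedup:.1f}x a 4090 before 4K goes CPU-bound on average")
# -> ~2.3x, so "at least 50%, probably more" is if anything conservative
```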
inyue@reddit
How much did you pay for your 7800X3D on AliExpress?
Onceforlife@reddit
365 Canadian, or I think around 260 USD; this is after taxes and shipping.
inyue@reddit
Pretty cheap o.o
Did you randomly find it, or is there a price-alert or deal-alert subreddit or Discord?
bestanonever@reddit
But also, remember that as soon as we get more powerful GPUs, the difference between these CPUs comes back at higher resolutions, too. The current fact that there's barely a difference at 4K is not set in stone. The 720p and 1080p results are much more representative of what we'll be getting a few years from now, with more powerful GPUs.
And somebody buying the very best gaming CPU is not that far removed from upgrading to the very best graphics cards from Nvidia or AMD as soon as they drop, and they'll be better off with a 9800X3D than a 5800X3D. Hell, if you're building a PC right now, just get the 9800X3D instead of the 7800X3D; it's worth it (if you're at that range of CPU performance and budget).
It all comes down to how often you upgrade and how powerful your GPU is. Have a good GPU with the 5800X3D or regular Ryzen 7000 series and will ride both of them until the end of time/next platform reveal? Good, you don't need the upgrade. But if you are one to upgrade your GPU every two or three years, and particularly at the high-end, get the best CPU you can now because that performance margin will come back in full force.
On the other hand, there's nothing wrong with using a 5800X3D/5700X3D until all your games drop below 30 FPS or whatever. Just upgrade when you need to, not when there's something new around (there's always something newer around).
masterfultechgeek@reddit
So yes but also... no.
https://tpucdn.com/review/amd-ryzen-7-9800x3d/images/average-fps-1280-720.png
The cheapo five-year-old R5 3600 is still able to feed a 4090 with over 100 FPS at low resolution.
Arguing "I'll just future proof" is often questionable... the performance levels we've gotten in the last 5 years have just been... awesome.
It'll probably take a new gen of consoles before CPU starts to matter much again and even then...
bestanonever@reddit
That's the thing. Most people don't upgrade their CPUs as often as their GPUs, so why wouldn't you get the best you can now and forget about it for a good bunch of years?
Taking a lesson from history, Sandy Bridge was a great buy... at the high end. i3 Sandys didn't age as gracefully as the i7 2600K/2700K did.
Framed-Photo@reddit
Because the 9800X3D costs $500 and is not twice as good as chips that are less than half its price?
Yeah it's a good chip, but it's nowhere close to being the best value on the market, not even in the AM5 lineup.
Likewise with your example: the Sandy Bridge i3s were not the best value in those lineups, and the i5s, and even the i7s if memory serves, did not cost $500. The 2700K MSRP'd at like $350 or something, and that might be too high.
bestanonever@reddit
That's the part of "The best you can" I wrote about, lol. Not everyone can afford or want to get the very best gaming CPU there is.
If it's between the 7800X3D and the 9800X3D, there isn't as big a difference in price, so you might get the 9800X3D while you're at it. But if you're on, say, a $250 budget, then a 7700 would be more forward-looking than going all in on any AM4 CPU that isn't X3D, for example, even if the rest of the platform is cheaper. Hell, you might even get a nice high-end Intel 12th Gen CPU as an in-between compromise and use the savings for a better GPU.
It's just that some guys are talking like "There's no difference at 4K, don't get the best" and that's misguided in so many cases. Just a touch of extra care about the CPU you get can pay gaming dividends down the line.
masterfultechgeek@reddit
Is it fair to claim your argument is "The CPU matters so little in gaming that people don't bother to upgrade it. Because it matters so little you should spend more money now to get something that's overkill."
If so, that doesn't make a ton of sense.
Even in your i3 Sandy Bridge example, there's NOTHING that stopped people from plopping in a $50ish i5 or i7 off of ebay later on.
I wouldn't have recommended an i3 at the time (not enough performance for "basic use"), but the main choice would've been i5 vs i7 for most enthusiasts.
A 2500k was still usable 5 years later though it was showing its age.
There's nothing stopping people from upgrading their CPUs. I went from a 1700 ($300) to a 3900x. This was NOT for gaming reasons though. If I wanted to I could 2x the MT performance by plopping in a 5950x or 5900XT but I don't need to at this time. The 3900x gets about the same gaming performance as the 3600 which was WAY cheaper. In that specific case, "future proofing" did nothing for gaming (though it was good for productivity).
FrewdWoad@reddit
>Even in your i3 Sandy Bridge example, there's NOTHING that stopped people from plopping in a $50ish i5 or i7 off of ebay later on.
Exactly what I did. By the time an i3 wasn't close to the i5 in games anymore, the i5 was literally $30.
Future-proofing is always a fool's errand.
bestanonever@reddit
That's not exactly what I said. CPU matters, and the more powerful it is today, the longer you can keep it!
As I said before, people (not everyone, of course) upgrade CPUs way less often than GPUs. So it makes sense to get as much CPU as you can, within reason: if you're buying GPUs in the range of the RTX 4060, you shouldn't be getting a 9800X3D, lol, but maybe a Ryzen 5 7600 is a better buy than an R5 5600, because the former will have more headroom later on, even if you upgrade to another budget GPU, like a potential RTX 6060.
masterfultechgeek@reddit
How is it different?
And how does CPU matter?
https://tpucdn.com/review/amd-ryzen-7-9700x/images/average-fps-3840-2160.png
Looking at this, the difference between a top-of-the-line CPU and a meh CPU is only a few FPS, and the frame rate is STILL 100+.
Does 3 FPS matter?
---
>As I said before, people (not everyone, of course) upgrade CPUs way less often than GPUs.
And why don't CPUs get upgraded as much? Explain why people don't think a CPU upgrade would be worth the performance uplift.
>but maybe a Ryzen 5 7600 is a better buy than a R5 5600, because the former will have more power later on
Total platform costs for a 7600 are around 1.5-2x those of a 5600. It's not going to last twice as long.
Tuxhorn@reddit
Very simple. You likely need a new motherboard, and maybe there's a new generation of RAM; the CPU is easily the worst part to upgrade if you move to a new generation.
masterfultechgeek@reddit
AM4 lasted a LONG time. AM5 is looking to as well.
In many cases you could buy a budget CPU and then, a few years later, plop in the same higher-end CPU you could've bought at launch, off eBay for a relatively low price... or a better one.
A 3600 ($100 on sale) + 5700X3D ($180) combined, without even reselling the 3600, cost LESS than the "faster" 9900X and 3900X parts, and there wasn't any real loss in frame rates unless you had a $1000+ video card and ran at low settings (why buy a high-end video card and NOT use it fully?)
teutorix_aleria@reddit
That was my logic with the 7800X3D. I paired it with a 7800XT, but I expect it to last me through 3 or 4 GPU generations.
bestanonever@reddit
It makes perfect sense. Today you can get by with the 7700 (non-X), but if you had the money for the X3D, it's an immediately faster CPU and it will last you a touch longer. So you're even saving money, amortizing the difference over every extra year you keep it.
masterfultechgeek@reddit
"it depends" - if you bought a 5800x and a cheaper board you'd have been about $300ish ahead. If you invested that 1 year ago... that'd be $400ish right now. in a few years out... by the time you'd "want" a faster CPU you'd be able to buy it with the funds you set aside. AND you'd still have the old set up. A card like the 7800XT is going to bottleneck the CPU anyway so... no real loss in the short to mid run.
Raikaru@reddit
This is assuming one has an infinite budget. If you don't, why would you spend more on a CPU instead of a GPU, when the second gets you better performance?
Bluedot55@reddit
Ehh, it really does matter what you play. A 100 FPS average is more than fine, but if that varies from one game doing 30 and another doing 300, that may not be a great situation if you really like the game that winds up stuck at 30. The new Monster Hunter beta, for example, would definitely stick a Zen 2 part at around 30-40 FPS, no matter the GPU.
masterfultechgeek@reddit
It's around 100ish across the entire range of titles TPU uses and it's similar on other sites as well.
There WILL be edge cases where CPU matters. I don't see any in the regularly reviewed titles that people obsess about when screaming "it's only a few percent better"
Bluedot55@reddit
Some of them have started to get a bit more coverage recently, with things like Stellaris and Factorio occasionally surfacing in benchmarks. But conducting those benchmarks the same way as a regular FPS benchmark isn't terribly accurate, since you aren't just playing a light Stellaris world and caring about how fast the game runs when you hit max time acceleration. You're figuring out how big and complex a galaxy you're allowed to have before performance becomes unacceptable, and it's a similar story for Factorio.
But also, the problem is that a lot of CPU-bound games just don't benchmark well. Even some very popular games/situations, like a big MMO raid or juiced maps in Path of Exile, are very hard to replicate, even though I have seen Path of Exile get CPU-bound at around 20 FPS on a 7800X3D while other party members on older hardware were getting numbers you could count on one hand.
Valmar33@reddit
Some terrible game engines will somehow be CPU-bottlenecked even at 4K with a very beefy GPU like a 4090. YouTuber Daniel Owen noticed this with Star Wars Outlaws.
bestanonever@reddit
Unreal Engine is like that Jurassic Park quote, it...uhhh...finds a way to bottleneck your CPU.
Valmar33@reddit
Unreal Engine is a plague that seriously needs to die. Monocultures have never been great for, well, anything. Everything ends up having the same underlying issues, and so people just think that's how it is, normalizing perceptions in very bad ways.
Raikaru@reddit
There will be new, even more powerful CPUs out by the time GPUs are bottlenecked by something like a 5800X3D at 4K, so why would that even matter?
Winegalon@reddit
Because if your CPU is still good enough, you might not need to buy a new CPU by then.
Raikaru@reddit
This doesn't even make sense. Spending money now, when you don't need to, in order to maybe be in a good spot at some point in the future?
Winegalon@reddit
Spend more money now on a more capable CPU, knowing you'll most likely be able to keep it for longer. Or buy a cheaper option now and upgrade sooner. I think both are valid choices.
bestanonever@reddit
This is all I was saying. If you buy smart, as long as you have the budget, you can ride your current CPU for a year or two (or more!) longer than a slower CPU from the same generation. It does make a difference in the long run.
firaristt@reddit
+1. And better performance for the whole lifespan. If you need to, you can OC it and use it some more time on top of that. If I had bought an i5 2500K, I probably couldn't have waited for the 8700K release. Even if I could have, and had saved money by getting an i5 8600K or even an i5 8400, I'd have had to upgrade earlier, way before the 7800X3D. Instead I went 2600K, then 8700K, now possibly 9800X3D, and each CPU with two GPUs. So IMO it's worth spending a bit more to go for a bit more powerful chip.
2600K with HD 6950 Crossfire and GTX 1070.
8700K with GTX 1070 and RTX 3080.
9800X3D(?) with RTX 3080 and RTX 5080(?).
If I had gone with an i5 six years ago, I'd have had to upgrade the CPU to get a 3080, otherwise it would bottleneck really badly. Same for the 1070 and the 2600K. And most CPU upgrade paths, apart from AM4, didn't offer a good option either. The most I can upgrade to right now is an i9 9900KS, which would hold the next GPU back, so it's a worthless upgrade. Otherwise I have to change the motherboard and possibly the RAM too, which adds up over time.
Raikaru@reddit
Unless it's going to last twice as long, you'll never get your money's worth that way, since the platform and CPU costs are way higher.
bestanonever@reddit
Because, usually, you upgrade the CPU less often than the GPU. So it's realistic to think you could be using an RTX 60-series card with your current build, and CPU performance will matter more for stretching your platform a bit longer.
Anyway, as I said in my other post, nobody needs to upgrade as long as their current PC plays the games they want. But of course I'm going to celebrate a CPU that improves gaming performance, instead of the boring regular Ryzen 9000 or the downgrade-ish Core Ultra 200 gen.
TranslatorStraight46@reddit
Those future GPUs need to be 2-3x more powerful before you'll observe CPU bottlenecking at 4K. By then these games will be old and no one will care how they run, as the fate of all games is to eventually be CPU-limited.
The CPU bottleneck meme is just getting tiresome. It was relevant with Bulldozer, which was genuinely crippled in games like Arma 3, StarCraft 2, etc., but it just isn't worth worrying about anymore.
Alternative-Sky-1552@reddit
Not true. There are already games struggling to get 100 FPS at any resolution with these processors, and there will be more in the future. Hogwarts, BG3, etc.
TranslatorStraight46@reddit
Oh no, only 100 FPS. 🤣
spazturtle@reddit
My 5800X3D can struggle to maintain 100fps in ESO when there are a few hundred players close together doing an event.
Bluedot55@reddit
It really does depend on what you play, though. Level1Techs did some testing of the 9800X3D, 7800X3D, and 285K with a 4090 and a 7900 XTX in BG3. At 4K, the 285K + 4090 actually ended up behind the 9800X3D + 7900 XTX combo, just by being so CPU-bound.
So some games can definitely wind up CPU-bound at higher resolutions, which is amplified even more by DLSS. Like the new Monster Hunter beta, which would make a 7800X3D + 4090 CPU-bound at around 70-80 FPS if you were using DLSS at higher resolutions.
nathris@reddit
You can see this in the benchmarks now. Just pick a game that's a few years old. You can reasonably assume that the 4090 performance you're seeing now will be the 5080 or the 6070 performance in a few years.
The results I'm seeing tell me I don't need to worry about my 5800X3D for years to come.
TheAgentOfTheNine@reddit
If you are waiting for better GPUs, you can wait for better CPUs, too. If you are buying now, you aim to have the most fps for a given budget. Going 7800X3D instead of 9800X3D may allow you to go one tier up on the GPU.
bestanonever@reddit
Totally, your budget is the most important part for a balanced build. That's why the 5000X3D series is so interesting for gamers on AM4. It makes much more sense to spend $200-$300 to upgrade the CPU if you are on AM4, instead of spending even more on the CPU + a new motherboard + DDR5 and potentially a new CPU cooler, too.
But for new builds, the top of the line now is the 9800X3D, though of course the previous top dog is still fantastic and a great performer. And I'd choose the older CPU, too, if it meant the difference between getting an RTX 4070 and a 4080, for example.
gentlecuddler@reddit
I understand prioritizing 1080p benchmarks to show the biggest differences, but a few slides in 1440p would be nice.
KaiEkkrin@reddit
It also matters what games you play.
A lot of graphics-heavy, 3D games appear to have a similar load profile and strongly favour low memory latency and lots of cache -- this is the kind of load the X3D chips are especially good at.
Some are different, though. Strategy games may spend a lot of CPU time calculating AI moves or running a simulation, and may favour different chips, such as ones with more cores. Emulation may make heavy use of CPU vector instructions and see a large benefit from AVX-512. Remember to seek out benchmarks of the games you play, or similar ones, rather than assuming performance in the FFXIV benchmark and Borderlands 3 is representative of all other games :)
9800X3D is looking super nice though.
Jeep-Eep@reddit
There are also IRL factors, like these tariffs in the pipe; you may want to have virgin silicon before the prices moon.
Anfros@reddit
X3D in general, and the 9800X3D in particular, seems to really shine in sim-heavy loads. They perform really well in benchmarks for Stellaris, Factorio, X4, etc.
masterfultechgeek@reddit
Adding to this...
A lot of the "Zen 5 sucks at gaming" comments forget that Zen 5 is a BIG CPU that needs a lot of bandwidth (and ideally at low latency) to be fed well. Zen 4's IOD is 2 years old at this point and designed for a weaker core.
In applications that aren't memory sensitive... Zen 5 is getting upwards of +30% at times.
So yeah... feeding the beast matters more and 3DVcache matters more.
(but 99% of people are still GPU bottlenecked if they're gaming).
OGigachaod@reddit
More than 50% of gamers are still at 1080p; are they also GPU-bottlenecked?
Bluedot55@reddit
Most of them are probably not using a top-end GPU, so possibly.
100GbE@reddit
Hey man, your info isn't even 28% legitimate unless you have a % somewhere in your post at least 17% of the time.
brunocas@reddit
You're 100% correct!
soggybiscuit93@reddit
That's all gamers, many of whom are on low-end hardware. What's the overlap between those buying the best CPU (and presumably a very high-end dGPU) and 1080p?
masterfultechgeek@reddit
Loosely speaking, if your GPU is about 1/3 the speed of a 4090 and you're at 1080p, you should probably be looking at 4K benchmarks instead of 1080p benchmarks to get a feel for how much CPU performance will matter for you. (YMMV, it'll vary by title.)
https://cdn.mos.cms.futurecdn.net/BAGV2GBMHHE4gkb7ZzTxwK-1200-80.png.webp
Doing crude math, something like a 4060 Ti / 3070 / 6750 XT / 7600 XT / 2080 Ti is about 1/3 as fast.
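One way to sanity-check that rule of thumb (my own framing; it assumes frame time scales with pixel count divided by GPU speed, which is only roughly true):

```python
# Why a ~1/3-speed GPU at 1080p is about as GPU-bound as a 4090 at 4K.
# Crude model (assumption): frame time ~ pixels / GPU speed.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160          # 4x the pixels of 1080p
gpu_speed = 1 / 3                # relative to a 4090

relative_load = (pixels_1080p / gpu_speed) / pixels_4k
print(f"GPU load vs a 4090 at 4K: {relative_load:.2f}x")
# -> 0.75x: same ballpark, so the 4K CPU-scaling chart is the better proxy
```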
SimpleNovelty@reddit
Probably, because they aren't running a 4090. If you're still gaming at 1080p I sincerely doubt you would be buying a top end CPU or GPU.
Z3r0sama2017@reddit
Yeah I game @4k and going from a 5950x to 7800x3d gave me a sweet boost even with a 4090. Can't wait to swap it for a 9800x3d and see if I can hit a stable 60 in Zomboid.
anomoyusXboxfan1@reddit
Build 42 runs super fast from what I've heard: 4K at hundreds of FPS. The way the game's code uses hardware at the moment is super inefficient, from what I understand.
tr2727@reddit
So essentially get 9950x3d
Michelanvalo@reddit
GN tested turn time on Stellaris for this reason too.
KaiEkkrin@reddit
Yeah, I'm very happy to see tech Jesus testing Stellaris :)
Blacky-Noir@reddit
In this day and age, the CPU is not just for gameplay and simulation computation. Outside of the basic task of driving the GPU, CPUs are also used extensively for advanced graphics work such as ray-tracing BVH construction.
BatteryPoweredFriend@reddit
Simulation games are far more 1T and latency-sensitive than graphics-heavy games.
1eejit@reddit
I wonder why more CPU benchmarks don't include something like an endgame huge-map Civ6 save and look at turn timers. That's at least as relevant as how a CPU affects fps in low-res shooters.
metakepone@reddit
That requires benchmarkers to actually play CIV6 to the endgame
ClearTacos@reddit
Yeah, you can't expect your average FPS in every game to go up by 20%, but that's not the only thing about CPU performance that matters.
Since cross-gen console games stopped being released, we've seen more CPU-heavy titles that lock to 30 FPS on consoles and run at around 60-80 in intensive areas on the best CPUs today. That's the other thing: most games have areas or situations where they're heavier on the CPU and your FPS dips more than usual; that's where you can see those 20%, and where they'll be most appreciated. Not to mention every other big release is a stutterfest lately; Zen 5 X3D certainly won't smooth all of them out, but it'll help. There are also sims and strategy games that either have turn times that speed up with faster CPUs or are heavily CPU-bound, especially late game.
I have never seen this many "you'll never get that uplift in real scenarios!!!" reactions before. Having a faster CPU is good if you value a smooth experience beyond average FPS.
john1106@reddit
So, as someone who has a 5800X3D and has only been on the AM4 platform for two years, do I need to change my entire PC to AM5 and get the 9800X3D so that I won't get CPU-bottlenecked even at 4K resolution?
capybooya@reddit
Yep, no matter the GPU, those CPU-limited areas/scenarios will always drop. And it's quite jarring with a very fast GPU to have some areas down in, say, the 70s compared to the usual 130+.
If you're short on cash or have other priorities, sure, ride it out with the older or current CPU. But some people just don't know this and assume the GPU is the limitation all the time, when it's really just most of the time.
BedNervous5981@reddit
Can we please stop this stupid discussion every other week? Those benchmarks are there to show differences in gaming performance in CPU-bound scenarios. As someone who mainly plays strategy games, it's very valid information to see that the 9800X3D will run Anno 1800 or Frostpunk 2 massively faster than my 5800X3D, even while I'm using a 4090. It's simply faster at calculating the simulation.
Framed-Photo@reddit
Most people aren't doing calculations like that for the games they play; that's the issue, and it's why posts like this are good.
Most folks see "oh, this is the best gaming CPU" and don't really understand what that actually means or how to apply it to the games they play. In fairness, the same goes for GPUs. It ends up with folks wasting a ton of money on shit they don't need.
Unless you're playing at 1080p (and if you're considering a $500 CPU just for games, what the heck are you doing at 1080p?), or you play games that specifically see a big gain from X3D chips like you do, the 9800X3D is a lot of money spent for not much gain.
It's clearly a great chip, but do we really think it's worth a little more than double the price of chips like the 5700X3D or the 7600? Unless it's someone like you who plays incredibly CPU-heavy games that aren't the norm, or you're an insane latency freak who wants 500+ FPS at 1080p, it probably won't be worth it, and you can save hundreds of dollars to put toward a GPU, a 4K monitor, etc.
MrElendig@reddit
Not to mention 1% and 0.1% lows in some games.
Framed-Photo@reddit
If you've got the money to spend $500 on a CPU for gaming only, you should be playing at 1440p minimum, if not higher.
At those resolutions, the CPU matters far less outside of some specific games like Factorio, Tarkov, etc.
john1106@reddit
Yes, correct. I'm still on the 5800X3D and I'd rather save money to upgrade to a 5090 than upgrade my whole motherboard and RAM just to get the 9800X3D. Plus I'm playing on a 4K TV, so the CPU bottleneck shouldn't be that bad. And I can use DLDSR to render above 4K to make games more GPU-bound.
mckirkus@reddit
It really does matter for those 8 of us running PC VR, where you can't drop below 90 or 120 FPS without judder. There is no VRR in VR, so 0.1% lows matter.
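The frame-budget math behind that (a quick sketch; the halve-to-reprojection behavior is how most VR compositors handle a missed frame, as far as I know):

```python
# Why 0.1% lows matter in VR: every single frame must beat the budget.
for hz in (90, 120):
    budget_ms = 1000 / hz
    print(f"{hz} Hz headset: {budget_ms:.1f} ms per frame, no exceptions")
# 90 Hz -> 11.1 ms, 120 Hz -> 8.3 ms. With no VRR, a frame that misses
# the budget is typically reprojected, i.e. you momentarily run at half
# rate (45/60 fps), which you feel as judder.
```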
Jeep-Eep@reddit
I don't have PCIe 4, and that's starting to be painful, and I'd pay roughly as much again for a good cooler and a RAM upgrade. The math isn't as good in my parts.
2560x1080p@reddit
Helldivers 2 is a costly game; it's the game that's fcking over my entire build right now. I recently upgraded to a 6950 XT and then I met Helldivers 2, and man, it wrecked my rig; now I possibly have to get a 7900 XT or better. I can run it at 3440x1440 native, but having used supersampling for so long, I feel like I'm not getting the full value of the game.
Valmar33@reddit
Sorry, but even at 4K with a powerful GPU, games still need good CPUs to perform well, as there will ALWAYS be moments where the game is quite CPU-demanding, in spite of that 4K.
You always want to be GPU-bottlenecked, preferably, so a good CPU is necessary to keep the lows from being atrocious and the frametimes stable.
Archerofyail@reddit
Well sure, but if you're looking to upgrade from a 5800X3D and you see the 1080p low charts, it looks like an incredible uplift. But looking at 4K, or even some 1440p ultra benchmarks, shows there's basically no difference, even in 1% lows.
Superb_Raccoon@reddit
Only one bed in an ambulance.
0rinx@reddit
The main game I play (FFXIV) is heavily CPU-bottlenecked, so upgrading my CPU will have a larger impact than changing my GPU.
inyue@reddit
What are your specs? My 4070 Ti and 12700K get CPU-bottlenecked inside Limsa at less than 60 FPS, but in any meaningful gameplay session, like an 8-man raid, I get 175 FPS.
COMPUTER1313@reddit
Civilization 6 with heavy mods (City Lights and Urban Complexity) and an extra-huge map still runs fine on an RX 570 4GB at 1080p high detail settings.
The turn times? They were chugging hard on my Ryzen 1600 by mid-game until I upgraded to a 5600, and even then they're noticeably long in the late game.
Iaghlim@reddit
I've tried to find a specific kind of review: a gaming review done while running a lot of other things at the same time.
As an example, sometimes I'm working, with Teams, Spotify, Chrome, Excel, PowerPoint, Word, and lots of other apps open, and for 20-30 minutes I decide to play something real quick (yeah, thanks home office). And MAYBE (honestly, I have no idea) a 16-core would be better for gaming than an 8-core in this specific kind of usage.
Also, I've never seen someone do these comparisons while streaming or under other mixed loads.
That would be great for people who have specific situations like mine.
colxa@reddit
Congrats on watching the LTT video
mechkbfan@reddit
As someone who does VR, the 4K 1% lows are what I care about the most.
Glad it's included in at least one review
Neofarm@reddit
On the contrary, a lot of games are CPU-heavy. You don't need a "next gen" GPU at all to push frame rate, but a better CPU like the 9800X3D, no matter the resolution. For example, most MMO, RPG, shooter, strategy, and simulation games... The only type of game where the GPU matters more is the single-player, graphics-heavy game at high resolution.
SJGucky@reddit
1080p is actually a good real-world resolution.
Most people are using upscaling, and from my tests everything with a base resolution of about 1080p looks good.
That means DLSS Quality at 1440p or DLSS Performance at 4K.
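For reference, here's what the commonly cited DLSS scale factors work out to (a sketch; the factors are the usual published ones: Quality 0.667, Balanced 0.58, Performance 0.50 per axis):

```python
# Internal render height for common DLSS modes at popular output resolutions.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
for out_h in (1440, 2160):
    for mode, scale in modes.items():
        print(f"{out_h}p {mode}: ~{round(out_h * scale)}p internal")
# 1440p Quality -> ~960p, 4K Performance -> 1080p, 4K Quality -> 1440p:
# so "1080p-class" base resolution is a fair description of both presets.
```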
GoldPanther@reddit
Any good sources for this? I've often wondered how much my 9900K is bottlenecking my 4090 at 4K.
Gambler_720@reddit
What's up with the insecurity of the 5800X3D owners having to validate themselves every time on how they don't need to upgrade?
The 7800X3D was already a pretty substantial upgrade over the 5800X3D; someone being too poor for such an upgrade doesn't mean it isn't a valid upgrade. The 9800X3D is an even bigger jump from the 5800X3D. BTW, I say that as someone who is also too poor to buy a 9800X3D, even though it represents a pretty good upgrade over my 7700. But I won't go around saying that no one needs to upgrade from a 7700.
masterfultechgeek@reddit
I don't have an RTX 4090.
I don't play at 1080p.
I'm not looking at 1080p or 720p benchmarks with a 4090.
I'm not looking at benchmarks where the 1% lows are WELL over 100FPS.
TheGillos@reddit
I want to see 320x240 benchmarks.
KirillNek0@reddit
Yes, yes...
People want real-world reviews, not "theoretical max" from CPUs.
conquer69@reddit
A "real world review" is completely different from a cpu comparison, which is what regular cpu reviews are.
What you want is a system review and you will need to find someone with the exact specs you want, testing the same games you are going to play.
I'm sure you can see why that is a dumb expectation to have from a tech reviewer.
KirillNek0@reddit
But we do have tech-tubers that do/did these reviews. This sub usually shits on them.
Qaxar@reddit
The irony is that a next-gen GPU (at the higher end) would more than likely alleviate the GPU bottlenecking in some games, which would make CPU performance more important and widen the gap between the 9800X3D and other processors.
GTRagnarok@reddit
Yeah, I game at 4K with my 4090 and my 13700K is doing just fine. It's undervolted and averages 60-80W while gaming which is not much more than AMD's CPUs. And Intel's idle power consumption is better which pretty much cancels things out.
dparks1234@reddit
Today the 9800X3D and the 9800X perform the same in 4K. Looking at the data one would think you’re paying more money for nothing. 3 years from now when CPU requirements increase the 9800X3D will perform better than the 9800X.
If you don’t test in a CPU-limited scenario like all-low 720p then you won’t be able to know these differences.
dealerofbananas@reddit (OP)
12600k is within 5% of a 9800X3D at 4k
This is a budget CPU from Jan 2022 compared to a state of the art gaming chip in November 2024, almost 3 years.
CPUs for 4K do not matter as long as you are using something somewhat modern.
Bluedot55@reddit
It really matters what you play, though. There are many games that do just wind up GPU-bound at 4K with basically any modern CPU, and if you play mainly those, that's fine. An average of 100 games where 95 are GPU-bound is only relevant if you mainly play those 95.
But there's also a lot more CPU-heavy stuff lately, where it can start to matter, even more so as upscaling becomes very good. Looking at recent popular stuff on Steam, you have Factorio and BG3 both in the top 20 most played games.
BG3, in the Level1Techs review for example, was getting notably lower FPS at 1080p (on the 285K) than a 9800X3D was getting at 4K. That game even got to the point where a 7900 XTX at 4K was beating a 4090 at 1080p purely from the CPU swap. Is that game alone worth upgrading hardware for? Probably not. But if that game is important to someone, it can matter whether you're in the 70s or maintaining 4K 120, or whatever.
Or Factorio, or any of a long list of sim games, where how much you can do and how fast you can do it is literally tied to CPU speed.
Or some of these new games designed for 30 FPS on the current console gen, like the new Monster Hunter beta that just got a bunch of hype over the weekend: a 12600K was stuck in the 40s, which isn't exactly ideal.
So if you play games where you're GPU-bound and you know it, yeah, there's not much point to a really fast CPU. But there's also a lot out there where a 100-game average showing near-identical performance is going to miss a bit.
RunTillYouPuke@reddit
I assume you meant the 9700X. Actually, there is a noticeable difference in 4K at max settings in many games between those two. For example:
4K benchmark
Source
vedomedo@reddit
Yeah, basically, until the 5090 is released, gaming at 4K is kind of «pick whatever top-tier CPU». I was contemplating getting the 9800X3D but there's no point as of yet.
Michelanvalo@reddit
I tried making this point recently and got absolutely blasted by this sub for it. Lab results are great, but very few people are buying a 9800X3D/4090 to play games at 1080p. Practical setup results have their value too.
damien24101982@reddit
It depends a lot on the games you play.
vialabo@reddit
That is literally how you control for the CPU variable but sure.
tuvok86@reddit
Turns out future proofing is much cheaper when done...in the future
OGigachaod@reddit
Yeah hindsight is awesome.
MrAldersonElliot@reddit
Joke's on you, that's real world for me: I only play competitive shooters online, e.g. CS2 and Valorant. So those are as-real-as-it-gets settings.
baron643@reddit
Nowadays you're using either DLSS or FSR at least 80% of the time, and you're probably upscaling from a lower resolution, so yes, 1080p benchmarks matter.
Raikaru@reddit
Lower resolution =/= 1080p. 4K DLSS Quality and Balanced still render higher than 1080p.
Winegalon@reddit
It's still a fair point to make. 1440p is a very popular resolution, and if you use DLSS it renders below 1080p even on the Quality setting.
TranslatorStraight46@reddit
If your card is CPU limited at 1080p you won’t need upscaling for 4K performance.
Most people should be running variable refresh rate monitors with their framerate capped to their monitor’s highest refresh rate instead of worrying about CPU bottlenecks.
mb194dc@reddit
Absolutely. If you're using a modern resolution (1440p+) and don't have a 4090, gains from changing the CPU are going to be minimal.
Mystikalrush@reddit
Any 1440p and/or ultrawide reviewers?
dragenn@reddit
If it drops the price on the 9xxx-series CPUs, I'm just as happy.
Even grab a massive discount from an owner who needs higher 0.1% lows...
🤣
matolati@reddit
Most people who can't afford the price are trying to discredit the chip and the benchmarks...
RealPjotr@reddit
By "most situations" you basically mean gaming with a GPU?
For the rest of us not gaming, it's an even bigger upgrade.
dealerofbananas@reddit (OP)
The target audience for an X3D chip is gaming with a high end GPU btw.
matolati@reddit
No, it isn't. The audience for X3D chips is those who want competitive gaming with very high framerates and stable 0.1% lows, or enthusiasts who want the best available.
toalv@reddit
Why would you pick an x3d cpu for productivity?
AK-Brian@reddit
Some non-gaming workloads also take advantage of large cache. Code compilation, databases, web server / homelab service containers (useful for prototyping load simulations), fluid motion simulation, etc. The original 3D V-Cache chips were designed for the Epyc Milan-X series of server CPUs.
Check out where the 9800X3D landed on Phoronix's charts for PyTorch workload responsiveness, as an example:
https://openbenchmarking.org/test/pts/pytorch
Nice!
mauri9998@reddit
The X3D CPUs are made for gaming; they don't expect anyone but gamers to buy them.
FitCress7497@reddit
Why would people who are not gaming buy this instead of a 9900x?
SpitneyBearz@reddit
Just wait for the MSFS 2024 user tests... What if a non-3D 16-core CPU gets better results in it?
lebrowski77@reddit
Most modern AAA single-player games are made with 4K 60 FPS console and TV gaming in mind. What I really wanna know with these X3D chips is how much they help reduce stutter: how smooth the frametime graph looks compared to other chips when locked to 4K 60. Elden Ring would be the perfect game for this, as it's plagued with all kinds of stutter.
handsupdb@reddit
Also remember what games you play - can be exactly the opposite of what you're saying.
Not everyone plays GPU eye-candy flavor of the month games. Some people play WoW, LoL or CS2 and just want more stability that can come from just more frames.
People who don't know better read this type of shit, just buy a GPU instead, and end up with the same damn framerate.
It's called N U A N C E
tangosmango@reddit
Any reason to upgrade now from a 7700X? I was initially holding out to upgrade to the 5090 and the 9900X3D or 9950X3D.
I'm running an AW3423DW x3 setup, so I'm not sure the 9800X3D will even benefit me all that much.
conquer69@reddit
In Geekerwan's review, CoD BO6 is GPU-bound at any resolution, even 360p, probably because DLSS has a fixed frametime cost. When you're getting 300 fps, that's a frametime of 3.33ms.
DLSS costs about 1ms. That means it can tank performance by up to 70 fps, which it seems to be doing.
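The arithmetic, as a quick sketch (taking the flat ~1 ms cost above at face value):

```python
# How a fixed ~1 ms upscaling cost eats fps at high frame rates.
def fps_with_fixed_cost(base_fps: float, cost_ms: float = 1.0) -> float:
    return 1000 / (1000 / base_fps + cost_ms)

for base in (300, 144, 60):
    print(f"{base} fps -> {fps_with_fixed_cost(base):.0f} fps")
# 300 -> 231 (a ~69 fps hit, hence "up to 70"), 144 -> 126, 60 -> 57:
# the same 1 ms barely matters once you're GPU-bound at lower fps.
```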
FitCress7497@reddit
This! Also worth noting that those benchmarks are with a currently $2000 4090. If you have a weaker GPU, the results are much closer.