Reminder that CPU benchmarks are run at 1080p low in order to remove GPU bottlenecks. Make sure to look up your "real world" benchmarks as well.
Posted by dealerofbananas@reddit | hardware | View on Reddit | 225 comments
9800X3D crushes the 5800X3D across the board in pretty much every benchmark.
Does that mean an AM4 user should look to upgrade? Probably not, since if you bump up the resolution to 1440p ultra, or even 4K, the performance differences are minuscule and you'd be much better off just saving for a next-gen GPU.
A graph saying +20% across the board will not be even close to that in real world usage for the majority of users.
There are other reasons to upgrade such as productivity, "future proofing," better thermals, etc., so take all of those into account. Don't fall for performance hype when, in most situations, it'll be negligible at best for the target audience of a high-end CPU (high resolutions, ultra settings).
XGenGamer007@reddit
Only games that have proven to scale with CPU performance should be used with 1080p low settings.
These games are usually older and well-optimized for console and PC.
All the more recent games show a small or no gap between the Ryzen 7800X3D and 9800X3D, which clearly highlights the inappropriateness of 1080p Low settings testing for all games.
So if your purpose is to show true RAW performance, stop showing irrelevant results for poorly optimized games from lazy developers.
Why not do the real work of isolating CPU intensive settings versus GPU intensive settings for each game and benchmark all games with maxed CPU intensive settings and disable or set to the lowest the GPU intensive settings?
And if your purpose is to show the true EXPERIENCE of gamers, then show the highest graphics settings at 1080p, or high settings at 1440p.
These youtubers, however, constantly berate critics and defend their 1080P low testing while doing neither.
This is to expose the ignorance of, and complicity in, the false market created by CPU manufacturers to justify new sales and higher prices with petty incremental improvements.
jassco2@reddit
Tell that to HWUB. They keep complaining about this being pointless, and I've given up trying to help them understand that taking 5 minutes to move a resolution slider, post the damn chart, and say "it doesn't matter for these people at these resolutions" could solve their stress levels. They refuse. Keep being frustrated, guys, and yelling at your viewers.
bestanonever@reddit
But also, remember that as soon as we get more powerful GPUs, the difference between these CPUs comes back at higher resolutions, too. The current fact that there's barely a difference at 4K is not set in stone. The 720p and 1080p results are much more representative of what we are going to get a few years from now, with more powerful GPUs.
And somebody that's buying the very best gaming CPU isn't that far removed from upgrading to the very best graphics cards from Nvidia or AMD as soon as they drop, and they will be better off with a 9800X3D than the 5800X3D. Hell, if you are building the PC right now, just get the 9800X3D instead of the 7800X3D; it's worth it (if you are at that CPU range of performance and budget).
It all comes down to how often you upgrade and how powerful your GPU is. Have a good GPU with the 5800X3D or a regular Ryzen 7000 series, and plan to ride both until the end of time/the next platform reveal? Good, you don't need the upgrade. But if you are one to upgrade your GPU every two or three years, particularly at the high end, get the best CPU you can now, because that performance margin will come back in full force.
On the other hand, there's nothing wrong with using a 5800X3D/5700X3D until all your games drop below 30 FPS or whatever. Just upgrade when you need to, not when there's something new around (there's always something newer around).
StaysAwakeAllWeek@reddit
This argument held when you had to swap out your motherboard for a cpu upgrade and the gpu didn't cost several times more than the cpu. These days it makes just as much sense to drop in upgrade a cpu as it does a gpu, and we don't see reviewers limiting their gpu reviews to 4K max even for low end parts
bestanonever@reddit
The only company that allows more than 2 (real) generations per socket is still AMD. And, so far, that's only for AM4; AM5 might get a new gen (Zen 6), but it hasn't been confirmed just yet. Of course, in terms of performance, one might argue that X3D counts as its own generation. In that case, we already have 3 generational levels with AM5, so it's getting pretty good in terms of choice.
Intel, though, is still stuck with 2 real generations per socket.
Agree with you that GPUs are freaking expensive, though, lol. Particularly ones that'd be bottlenecked by older CPUs.
dfv157@reddit
Heh, LGA1851 is a 1-generation socket. Meteor Lake is cancelled. ARL Refresh is cancelled. Nova/Panther is designed for a new socket.
bestanonever@reddit
Well, you can always do even worse! Lol. Go home, Intel, you are drunk.
Funny to think that 10 years ago, I wouldn't have recommended any AMD CPU unless you were on a very strict budget. Who knows what the next 10 years are going to look like.
Raikaru@reddit
There will be new even more powerful CPUs out by the time GPUs are bottlenecking on something like a 5800x3d in 4k so why would that even matter?
Winegalon@reddit
Because if your CPU is still good enough, you might not need to buy a new CPU by then.
Raikaru@reddit
This doesn’t even make sense. Spending money now when you don’t need to in order to maybe be good at some time in the future?
Winegalon@reddit
Spend more money now on a more capable CPU, knowing that you most likely will be able to keep it for longer. Or buy a cheaper option now and upgrade sooner. I think both are valid choices.
Daffan@reddit
For me the latter is definitely more valid heh. The 9800x3d for example is $800, a 13600k is only $330. It's insane!
bestanonever@reddit
This is all I was saying. If you buy smart, as long as you have the budget, you can ride your current CPU for a year or two (or more!) longer than a slower CPU from the same generation. It does make a difference in the long run.
firaristt@reddit
+1. And it performs better for its whole lifespan. If you need to, you can OC it and use it some more time on top of that. If I'd had an i5 2500K, I possibly wouldn't have been able to wait for the 8700K release. Even if I could, and I'd saved money and gotten an i5 8600K or even an i5 8400, I'd have had to upgrade it earlier, way before the 7800X3D. But I got a 2600K, then an 8700K, and now possibly a 9800X3D. And each CPU with 2 GPUs. So, IMO, it's worth it to spend a bit more and go for a slightly more powerful chip.
2600K with HD6950 Crossfire and GTX1070
8700K with GTX 1070 and RTX 3080.
9800X3D(?) with RTX 3080 and RTX 5080(?).
If I'd gotten an i5 six years ago, I'd have had to upgrade the CPU to get the 3080, otherwise it would have bottlenecked really badly. Same for the 1070 and the 2600K. And most CPU upgrades except AM4 didn't offer a good upgrade path either. I can upgrade to an i9 9900KS at max right now, which would hold the next GPU back, so it's a worthless upgrade. Otherwise, I have to change the motherboard and possibly the RAM too, which adds up over time.
Raikaru@reddit
Unless it's going to last twice as long, you'll never get your money's worth that way, since the platform and CPU costs are way higher.
bestanonever@reddit
Because, usually, you upgrade the CPU less often than GPUs. So, it's realistic to think you could be using the RTX 60 series with your current build, and CPU performance will matter more to stretch your platform a bit longer.
Anyway, as I said in my other post, nobody needs to upgrade as long as their current PC plays the games you want. But, of course, I'm going to celebrate a CPU that improves gaming performance instead of the boring regular Ryzen 9000 or downgradish Core 300 gen.
Parrelium@reddit
I usually wait until I’m having framerate issues with my current cpu and there’s deals on the one I want.
I went 3570k->1700x->3800x->5600x->7800x3d.
When I went 5600X, I should have bitten the bullet, paid the extra $150 or whatever it was at the time, and gotten a 5800X3D. The reason was Tarkov, btw. That game sucks to play on anything but X3D.
bestanonever@reddit
On the bright side, the jump from 5600X to 7800X3D was bigger.
Parrelium@reddit
Yes it was actually quite noticeable, even at 1440p.
Strazdas1@reddit
CPUs are already the bottleneck in 4k if you play CPU-intensive games.
Raikaru@reddit
The whole context is if you weren’t bottlenecking now
Valmar33@reddit
Some terrible game engines will be somehow CPU bottlenecked even at 4K with a very beefy GPU like a 4090. Youtuber Daniel Owen noticed this with Star Wars Outlaws.
bestanonever@reddit
Unreal Engine is like that Jurassic Park quote, it...uhhh...finds a way to bottleneck your CPU.
Valmar33@reddit
Unreal Engine is a plague that seriously needs to die. Monocultures have never been great for, well, anything. Everything ends up having the same underlying issues, and so people just think that's how it is, normalizing perceptions in very bad ways.
PMARC14@reddit
Idk, it feels like a correlation-equals-causation thing. While I don't want everything to become Unreal, are things really unoptimized because they chose Unreal, or were they choosing Unreal Engine because they were lazy and cheap, so they were never going to give it proper optimization and make it use resources efficiently? I think it is much more the latter, but Unreal doesn't help, as it's still notorious for micro-stuttering and way too much TAA.
bestanonever@reddit
Thing is, as good as the devs can be, it's the directors and company executives who are choosing to work with Unreal instead of nurturing an in-house engine tailor-made for their needs. I'd bet they just want the devs to start creating levels and coding routines and stuff ASAP instead of spending time working on the engine, or else they'd have stayed with the custom-built engine.
So, a lot of talent is probably not allowed enough time to polish Unreal for their particular needs and we end up with stuttery messes. They are probably doing what they can and it's enough, sometimes.
Valmar33@reddit
Unreal has plenty of issues with optimization in various areas ~ it simply cannot be as good as a custom in-house engine developed for a very specific style of game.
Strazdas1@reddit
From my limited information on interviews and personal talks with game devs, primary reason to choose Unreal is that there is a huge talent pool to hire from and even new people just from uni will have some hands-on experience with it. This is high contrast to having to onboard people to inhouse engines that may be different.
Strazdas1@reddit
Doesn't have to be terrible, just sim-heavy. For example, in Paradox games you are CPU-bottlenecked and GPU load doesn't even reach 50%, let alone 100%.
masterfultechgeek@reddit
So yes but also... no.
https://tpucdn.com/review/amd-ryzen-7-9800x3d/images/average-fps-1280-720.png
The cheapo 5-year-old R5 3600 is still able to feed a 4090 with over 100 FPS at low resolution.
Arguing "I'll just future proof" is often questionable... the performance levels we've gotten in the last 5 years have just been... awesome.
It'll probably take a new gen of consoles before CPU starts to matter much again and even then...
Strazdas1@reddit
The testing suite they used includes only one CPU-heavy game in total.
bestanonever@reddit
That's the thing. Most people don't upgrade their CPUs as often as their GPUS, so why wouldnt you get the best you can now and forget about it for a good bunch of years?
Taking a lesson from history, Sandy Bridge was a great buy...at the high-end. I3 Sandys didn't age as gracefully as the i7 2600K/2700K did.
Framed-Photo@reddit
Because the 9800X3D costs $500 and is not twice as good as chips that are less than half its price?
Yeah it's a good chip, but it's nowhere close to being the best value on the market, not even in the AM5 lineup.
Likewise with your example: the Sandy Bridge i3s were not the best value in those lineups, and the i5s, and even the i7s if memory serves, did not cost $500. The 2700K MSRP'd at like $350 or something, and that might be too high.
bestanonever@reddit
That's the part of "The best you can" I wrote about, lol. Not everyone can afford or want to get the very best gaming CPU there is.
If it's between the 7800X3D and 9800X3D, there isn't as big of a difference in price, so you might get the 9800X3D while you are at it. But if you are more on a, say, $250 budget, then a 7700 would be more forward-looking than going all-in on any AM4 CPU that isn't X3D, for example, even if the rest of the platform is cheaper. Hell, you might even get a nice high-end Intel 12th Gen CPU as an in-between compromise and use the savings for a better GPU.
It's just that some guys are talking like "There's no difference at 4K, don't get the best" and that's misguided in so many cases. Just a touch of extra care about the CPU you get can pay gaming dividends down the line.
Framed-Photo@reddit
That's kinda what I'm trying to get at. I don't think a chip like the 9800X3D makes sense for most people, even those that can afford it easily.
Like you said, if there's not a big price disparity between something like the 7800X3D and the 9800X3D then get the 9800X3D. But if you're at 4k or aren't a huge FPS snob and are ok with just one tier down the ladder in terms of performance, then you can save literally hundreds and still have a fantastic computer.
Something like the 7700 is significantly cheaper and still a top tier performing chip, for example.
According to TechPowerUp's review suite, yeah, there literally is no difference at 4K, and you should not get the best, lol. They tested averages and lows in separate charts; the difference between even a budget chip like the 5600 and the 9800X3D was 92.8 vs 101.4, and the difference in lows was 70.6 vs 77.8. I don't think anyone would argue that a difference that small is worth 5x the price.
There are exceptions, of course: games like Tarkov/Factorio/MSFS, targeting high FPS with super low settings, etc. But even then, chips like the 7700 or the 5700X3D are still less than half the price and get really damn close.
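To put numbers on the value argument in this comparison: a quick sketch using the FPS figures quoted above, with the rough prices mentioned in this thread (the $500 and ~$100 figures are this thread's numbers, not current MSRPs):

```python
# Relative uplift and price ratio for the 9800X3D vs the 5600 at 4K,
# using the TechPowerUp numbers quoted in this thread.

def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Relative performance gain, as a percentage."""
    return (new_fps / old_fps - 1) * 100

avg_5600, avg_9800x3d = 92.8, 101.4   # 4K average FPS
low_5600, low_9800x3d = 70.6, 77.8    # 4K 1% lows
price_5600, price_9800x3d = 100, 500  # rough prices from this thread (USD)

print(f"average uplift: {uplift_pct(avg_9800x3d, avg_5600):.1f}%")  # ~9.3%
print(f"1% low uplift:  {uplift_pct(low_9800x3d, low_5600):.1f}%")  # ~10.2%
print(f"price ratio:    {price_9800x3d / price_5600:.0f}x")         # 5x
```

So the point in numbers: roughly a 9-10% frame-rate gain at 4K for about 5x the CPU price.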
masterfultechgeek@reddit
Is it fair to claim your argument is "The CPU matters so little in gaming that people don't bother to upgrade it. Because it matters so little you should spend more money now to get something that's overkill."
If so, that doesn't make a ton of sense.
Even in your i3 Sandy Bridge example, there's NOTHING that stopped people from plopping in a $50ish i5 or i7 off of ebay later on.
I wouldn't have recommended an i3 at the time (not enough performance for "basic use"), but the main choice would've been i5 vs i7 for most enthusiasts.
A 2500k was still usable 5 years later though it was showing its age.
There's nothing stopping people from upgrading their CPUs. I went from a 1700 ($300) to a 3900X, though this was NOT for gaming reasons. If I wanted to, I could double the MT performance by plopping in a 5950X or 5900XT, but I don't need to at this time. The 3900X gets about the same gaming performance as the 3600, which was WAY cheaper. In that specific case, "future proofing" did nothing for gaming (though it was good for productivity).
FrewdWoad@reddit
>Even in your i3 Sandy Bridge example, there's NOTHING that stopped people from plopping in a $50ish i5 or i7 off of ebay later on.
Exactly what I did. By the time an i3 wasn't close to the i5 in games anymore, the i5 was literally $30.
Future-proofing is always a fool's errand.
bestanonever@reddit
That's not exactly what I said. CPU matters, and the more powerful it is today, the longer you can keep it!
As I said before, people (not everyone, of course) upgrade CPUs way less often than GPUs. So, it makes sense to have as much CPU as you can, within reason: if you are buying GPUs in the range of the RTX 4060, you shouldn't be getting a 9800X3D, lol, but maybe a Ryzen 5 7600 is a better buy than an R5 5600, because the former will have more power later on, even if you upgrade to another budget GPU, like a potential RTX 6060.
masterfultechgeek@reddit
How is it different?
And how does CPU matter?
https://tpucdn.com/review/amd-ryzen-7-9700x/images/average-fps-3840-2160.png
Looking at this, the difference between a top-of-the-line CPU and a meh CPU is only a few FPS, and the frame rate is STILL 100+.
Does 3 FPS matter?
---
>As I said before, people (not everyone, of course) upgrade CPUs way less often than GPUs.
And why don't CPUs get upgraded as much? Explain why people don't think a CPU upgrade would be worth the performance uplift.
>but maybe a Ryzen 5 7600 is a better buy than a R5 5600, because the former will have more power later on
Total platform costs for a 7600 are around 1.5-2x vs the 5600. It's not going to last twice as long.
Tuxhorn@reddit
Very simple: you likely need a new motherboard, and maybe there's a new gen of RAM. It's easily the worst part to upgrade if you go into a new gen.
masterfultechgeek@reddit
AM4 lasted a LONG time. AM5 is looking to as well.
In many cases you could buy a budget CPU and then, a few years later, plop in the same higher-end CPU that you could've bought off eBay for a relatively low price... or a better one.
A 3600 ($100 on sale) + 5700X3D ($180) combined, without any resale of the 3600, cost LESS than the "faster" 9900X and 3900X parts, and there wasn't any real loss in frame rates unless you had a $1000+ video card and ran at low settings (why buy a high-end video card and NOT use it fully?).
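That staggered-upgrade arithmetic can be written out explicitly. The $100/$180 figures are the sale prices quoted above; the $499 comparison point is the Ryzen 9 3900X launch MSRP, and no resale value for the outgoing 3600 is assumed:

```python
# Budget-now, upgrade-later vs buy-big-up-front, per the comment above.
cost_3600 = 100      # R5 3600 on sale
cost_5700x3d = 180   # drop-in AM4 upgrade years later
staggered = cost_3600 + cost_5700x3d

buy_big = 499        # Ryzen 9 3900X launch MSRP

print(staggered, "vs", buy_big)   # 280 vs 499
assert staggered < buy_big        # cheaper overall, similar late-life gaming perf
```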
teutorix_aleria@reddit
That was my logic with the 7800X3D. I paired it with a 7800XT, but I expect it to last me through 3 or 4 GPU generations.
bestanonever@reddit
It makes perfect sense. Today, you can get by with the 7700 (non-X), but if you had the money for the X3D, it's an immediately faster CPU and it will last you a touch longer. So, you are even saving money, amortizing the difference with every extra year you keep it.
masterfultechgeek@reddit
"it depends" - if you bought a 5800x and a cheaper board you'd have been about $300ish ahead. If you invested that 1 year ago... that'd be $400ish right now. in a few years out... by the time you'd "want" a faster CPU you'd be able to buy it with the funds you set aside. AND you'd still have the old set up. A card like the 7800XT is going to bottleneck the CPU anyway so... no real loss in the short to mid run.
Raikaru@reddit
This is assuming one has an infinite budget. If you don't, why would you spend more on a CPU instead of a GPU, when the second gets you better performance?
Bluedot55@reddit
Ehh, it really does matter what you play. 100 FPS average is more than fine, but if that varies from one game doing 30 and another doing 300, that may not be a great situation if you really like the game that winds up stuck at 30. The new Monster Hunter beta, for example, would definitely stick a Zen 2 part at around 30-40 FPS, no matter the GPU.
masterfultechgeek@reddit
It's around 100ish across the entire range of titles TPU uses and it's similar on other sites as well.
There WILL be edge cases where CPU matters. I don't see any in the regularly reviewed titles that people obsess about when screaming "it's only a few percent better"
Bluedot55@reddit
Some of them have started to get a bit more coverage recently, with things like Stellaris and Factorio occasionally surfacing in benchmarks. But conducting those benchmarks the same way as a regular FPS benchmark isn't terribly accurate, since you aren't just playing a light Stellaris world and caring about how fast the game goes when you hit max time acceleration. You're figuring out how big and complex a galaxy you're allowed to have before performance becomes unacceptable, and it's a similar story for Factorio.
But also, the problem is that a lot of CPU-bound games just don't benchmark well. Even some very popular games/situations, like a big MMO raid or juiced maps in Path of Exile, are very hard to replicate, even though I have seen Path of Exile get CPU-bound at around 20 FPS on a 7800X3D while other party members on older hardware were getting numbers you could count on one hand.
TranslatorStraight46@reddit
Those future GPUs need to be 2-3x more powerful before you will observe bottlenecking at 4K. By then these games will be old and no one will care how they run, as the fate of all games is to eventually be CPU-limited.
The CPU bottleneck meme is just getting tiresome. It was relevant with Bulldozer which was genuinely crippled in games like Arma 3, StarCraft 2 etc but it just isn’t worth worrying about anymore.
Strazdas1@reddit
I can observe CPU bottlenecking at 4K today. It's all about choosing the right game to test.
Alternative-Sky-1552@reddit
Not true. There are already games struggling to hit 100 FPS at any resolution with these processors, and it'll be more so in the future. Hogwarts, BG3, etc.
TranslatorStraight46@reddit
Oh no, only 100 FPS. 🤣
spazturtle@reddit
My 5800X3D can struggle to maintain 100fps in ESO when there are a few hundred players close together doing an event.
Bluedot55@reddit
It really does depend on what you play, though. Level1Techs did some testing of the 9800X3D, 7800X3D, and 285K with a 4090 and 7900XTX in BG3. At 4K, the 285K+4090 actually ended up behind the 9800X3D+7900XTX combo, just by being so CPU-bound.
So some games can definitely wind up CPU-bound at higher resolutions, which is amplified even more by DLSS. Like the new Monster Hunter beta, which would make a 7800X3D+4090 CPU-bound at like 70-80 FPS if you were using DLSS at higher resolutions.
nathris@reddit
You can see this in the benchmarks now. Just pick a game that's a few years old. You can reasonably assume that the 4090 performance you're seeing now will be the 5080 or the 6070 performance in a few years.
The results I'm seeing tell me I don't need to worry about my 5800X3D for years to come.
TheAgentOfTheNine@reddit
If you are waiting for better GPUs, you can wait for better CPUs, too. If you are buying now, you aim to have the most FPS for a given budget. Going 7800X3D instead of 9800X3D may allow you to go one tier up on the GPU.
bestanonever@reddit
Totally, your budget is the most important part for a balanced build. That's why the 5000X3D series is so interesting for gamers on AM4. It makes much more sense to spend $200-$300 to upgrade the CPU if you are on AM4, instead of spending even more on the CPU + a new motherboard + DDR5 and potentially a new CPU cooler, too.
But for new builds, the top of the line now is the 9800X3D, though of course the previous top dog is still fantastic and a great performer. And I'd choose the older CPU, too, if it meant the difference between getting an RTX 4070 vs a 4080, for example.
handsupdb@reddit
Also remember what games you play - can be exactly the opposite of what you're saying.
Not everyone plays GPU eye-candy flavor of the month games. Some people play WoW, LoL or CS2 and just want more stability that can come from just more frames.
People who don't know better read this type of shit, just buy a GPU instead, and end up with the same damn framerate.
It's called N U A N C E
Strazdas1@reddit
Some people just want 60 fps in CS2. That is, Cities Skylines 2.
GruntChomper@reddit
And they'll be waiting for AMD to create an X4D processor then
Strazdas1@reddit
Point being, there are very legitimate reasons to upgrade to this and to want even better CPUs. Even for gaming.
j_a_guy@reddit
Unity games like Rust and EFT also benefit tremendously from X3D. It’s always good to research the games you play before making a purchase.
Strazdas1@reddit
I'm not sure there is anything that can benefit EFT. That game is just stuck so badly on 4 threads.
Valmar33@reddit
Sorry, but even at 4K with a powerful GPU, games still need good CPUs to perform well, as there will ALWAYS be moments where the game is quite CPU-demanding, in spite of that 4K.
You always want to be GPU-bottlenecked, preferably, so a good CPU is necessary to keep the lows from being atrocious and the frametimes stable.
Framed-Photo@reddit
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/21.html
Even in the minimum FPS test at 4k in that second link, the 5600x is getting 70.6 on average compared to the 9800X3D's 77.8.
Average FPS difference is 101.6 vs 92.8 in favor of the X3D chip.
The 5600X can be had for less than a quarter the price of the 9800X3D. Good CPUs are important, and yes, there are some games that are CPU-bound even at 4K, but there aren't that many, and the differences really aren't large enough to warrant the upgrade IMO.
If you needed something like X3D in games like factorio or tarkov, the 5700X3D gets you that for almost a third the price.
XelNika@reddit
If you look at the 285K review from Digital Foundry, the 5600X fails to deliver 60 FPS averages in Dragon's Dogma 2 (using the 5900X as a proxy), Baldur's Gate 3, MS Flight Sim 2020, Cyberpunk 2077. The 7800X3D easily manages above 60 FPS average in all of them.
Looking at the same review, the 5600X 1% lows hover around 40 FPS in Starfield while the 7800X3D manages above 60 FPS. The 5600X is at just 25 FPS in Baldur's Gate 3 1% lows while the 7800X3D is hitting 50 FPS.
All of these titles were big, commercially successful releases. If you're building a new system and play any one of these, the cost of a 3D V-cache CPU could be worth it for that experience alone.
79215185-1feb-44c6@reddit
Turn off AA or one of the various CPU heavy settings that do nothing for visual fidelity like AO or Chromatic Aberration and wow. Suddenly 60FPS.
XelNika@reddit
AA as in anti-aliasing? AA and AO are GPU-heavy while chromatic aberration is a post-process filter that costs practically nothing.
I know that in the case of Baldur's Gate 3 at least, there is no setting to fix the issue. The game just stumbles in act 3 due to the CPU load.
Daffan@reddit
I've been on 4K since like 2015, and yeah, I only ever get budget CPUs, lmao, for exactly this reason. The 12400 was $150 two years ago; a 9800X3D today is $800.
Valmar33@reddit
I'm sorry, but at 4K, you still need a good CPU that can deliver good 1% and 0.1% lows, and more importantly, stable frametimes.
The 5600X will be a complete joke even at 4K compared to the 9800X3D ~ especially in areas of games that are CPU-intensive. 4K isn't enough to make up for the CPU intensity of some areas of some games.
Archerofyail@reddit
Well sure, but if you're looking to upgrade from a 5800X3D, and you see the 1080p low charts it looks like an incredible uplift. But looking at 4K, or even some 1440p ultra benchmarks shows that there's basically no difference, even in 1% lows.
Valmar33@reddit
It depends partially on the reviewer and how they test their games. Do they use built-in benchmarks? Do they just test the start of the game? Do they test areas of the game that they know to be CPU-demanding?
Some games are bad even at 4K in some areas.
Minimum-Account-1893@reddit
I always find it funny when people upgrade to these 200fps capable 8 core x3ds at 1080p, just to be graphically bottlenecked at 100-120fps on a 120fps display.
No doubt only a minority have high-refresh 1080p displays, but the majority buy these CPUs expecting gaming performance at the cost of productivity, while being completely limited.
Becomes a lose/lose for so many. Also, when the transition begins from 8 cores to more, the 8-core won't be as hot once optimization targets 12-16 cores.
A few games actually did want higher than 8-core counts for ultra settings, but I never see those kinds of games tested on an 8-core X3D.
KaiEkkrin@reddit
It also matters what games you play.
A lot of graphics-heavy, 3D games appear to have a similar load profile and strongly favour low memory latency and lots of cache -- this is the kind of load the X3D chips are especially good at.
Some are different, though. Strategy games may spend a lot of CPU time calculating AI moves or running a simulation, and may favour different chips, such as ones with more cores. Emulation may make heavy use of CPU vector instructions and see a large benefit from AVX512. Remember to seek out benchmarks of the games you play or similar ones, rather than assume performance in FFXIV Benchmark and Borderlands 3 is representative of all other games :)
9800X3D is looking super nice though.
Anfros@reddit
X3D in general, and the 9800X3D in particular, seem to really shine in sim-heavy loads. They perform really well in benchmarks for Stellaris, Factorio, X4, etc.
masterfultechgeek@reddit
Adding to this...
A lot of the "Zen 5 sucks at gaming" comments forget that Zen 5 is a BIG CPU that needs a lot of bandwidth (and ideally at low latency) to be fed well. Zen 4's IOD is 2 years old at this point and designed for a weaker core.
In applications that aren't memory sensitive... Zen 5 is getting upwards of +30% at times.
So yeah... feeding the beast matters more and 3DVcache matters more.
(but 99% of people are still GPU bottlenecked if they're gaming).
OGigachaod@reddit
More than 50% of gamers are still at 1080p, are they also GPU bottlenecked?
Bluedot55@reddit
Most of them are probably not using a top-end GPU, so possibly.
100GbE@reddit
Hey man, your info isn't even 28% legitimate unless you have a % somewhere in your post at least 17% of the time.
Hendeith@reddit
Truth is, 100% of comments on Reddit that mention % are pulling data out of their asses.
100GbE@reddit
I think you're 100% right 33% of the time, tripled.
acssarge555@reddit
Love the joke, but 57% of Steam users use 1080p according to Steam's hardware survey. So the guy was telling the truth.
brunocas@reddit
You're 100% correct!
Aggressive_Ask89144@reddit
Prebuilts make up the vast majority, and it's going to be i5s for most people lol. The Steam survey gives 6-core CPUs and a 3060 as the average, which makes sense nowadays lol.
SimpleNovelty@reddit
Probably, because they aren't running a 4090. If you're still gaming at 1080p I sincerely doubt you would be buying a top end CPU or GPU.
SomniumOv@reddit
That would be a fire CS1.6 rig though. 480p very high hz CRT, top end GPU and highest single-thread performance CPU.
Strazdas1@reddit
Yes because they will almost all use old GPUs.
soggybiscuit93@reddit
That's all gamers, many of whom are on low-end hardware. What's the overlap between those buying the best CPU (and presumably a very high-end dGPU) and 1080p?
masterfultechgeek@reddit
Loosely speaking, if your GPU is 1/3rd the speed of a 4090 and you're at 1080p... you should probably be looking at 4K benchmarks instead of 1080p benchmarks to get a feel for how much CPU performance will matter for you. (YMMV, it'll vary by title)
https://cdn.mos.cms.futurecdn.net/BAGV2GBMHHE4gkb7ZzTxwK-1200-80.png.webp
doing crude math... something like a 4060Ti/3070/6750/7600XT/2080Ti is about 1/3rd as fast.
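One way to see why this rule of thumb works: when GPU-bound, frame rate scales roughly with GPU speed and inversely with pixel count, so a flagship GPU at a higher resolution sits at a similar frame rate (and thus similar CPU load) to a slower GPU at 1080p. A rough sketch — the linear-scaling model here is a simplification, not a measurement:

```python
# Estimate which benchmark resolution on a flagship GPU approximates
# your own GPU's frame rates at your resolution. Assumes frame rate
# scales linearly with GPU speed and inversely with pixel count.

def equivalent_pixels(gpu_speed_vs_flagship: float, your_pixels: int) -> float:
    """Pixel count at which the flagship hits similar FPS to your GPU."""
    return your_pixels / gpu_speed_vs_flagship

pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400

eq = equivalent_pixels(1/3, pixels_1080p)   # ~6.2M pixels
# Lands between 1440p and 4K pixel counts, closer to the 4K chart than
# to the 1080p one -- hence the advice above.
print(eq / pixels_4k)
```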
Z3r0sama2017@reddit
Yeah, I game at 4K, and going from a 5950X to a 7800X3D gave me a sweet boost even with a 4090. Can't wait to swap it for a 9800X3D and see if I can hit a stable 60 in Zomboid.
anomoyusXboxfan1@reddit
Build 42 runs super fast from what I've heard, 4K at hundreds of FPS. The way the game's code uses hardware at the moment is super inefficient, from what I understand.
SagittaryX@reddit
Build 42 isn’t real till it hits public release, it’s been almost 3 years since build 41 :(
(This is just a meme)
Z3r0sama2017@reddit
I know, I saw the youtube videos of it, but It's been years. Until I'm playing it, it may as well be a unicorn.
Wrekh@reddit
Also World of Warcraft is usually CPU bound.
Strazdas1@reddit
They perform wonderfully when the sim fits into cache. They start running into the same problems as other CPUs when the sim exceeds cache and you're going out to memory. Still a vast improvement.
Fullyverified@reddit
And DCS / microsoft flight simulator
Jerithil@reddit
Some sim games can be hit and miss for the x3d chips as they can run super fast on small maps but on late game maps sometimes other CPUs do better.
1eejit@reddit
I wonder why more CPU benchmarks don't include something like an endgame huge-map Civ6 save and look at turn timers. That's at least as relevant as how a CPU affects FPS in low-res shooters.
metakepone@reddit
That requires benchmarkers to actually play CIV6 to the endgame
chmilz@reddit
Civ6 has the worst endgame of all the Civs. It's just the worst Civ (for me), period. 7 is looking really good though; it appears to solve every gripe I have with 6.
So maybe I'll get to see late-game turn improvement with my latest upgrades.
Strazdas1@reddit
Because most benchmarkers don't actually play games, so they don't know what to look for. Especially rare to find one that plays sim games.
PMARC14@reddit
I believe a benchmark is out there but lately I have been seeing more Stellaris late game turn time tests as the preferred comparison. Maybe when 7 releases we will see that.
1eejit@reddit
Civ 7 will be seeing a lot of patching for a while, 6 should be stable at this point
Jeep-Eep@reddit
There are also IRL factors, like these tariffs in the pipe; you may want to have virgin silicon before the prices moon.
tr2727@reddit
So essentially get 9950x3d
Michelanvalo@reddit
GN tested turn time on Stellaris for this reason too.
KaiEkkrin@reddit
Yeah, I'm very happy to see tech Jesus testing Stellaris :)
Blacky-Noir@reddit
In this day and age, the CPU is not just for gameplay and simulation computation. Outside of the basic "driving the GPU" task, CPUs are also used extensively for advanced graphics work such as ray-tracing BVH computation.
BatteryPoweredFriend@reddit
Simulation games are far more 1T and latency-sensitive than graphics-heavy games.
mechkbfan@reddit
As someone that does VR, the 4K 1% lows is what I care about the most.
Glad it's included in at least one review
matejdro@reddit
From what I can see, not much difference in 4K, even at 1% lows?
mechkbfan@reddit
No that's correct.
I was previously on 3700x and upgraded to 5800x3d for 1%.
This at least confirms that was still a good upgrade and I don't need to upgrade further.
Forza is another title that had some pretty big improvements in 1% lows. I play Assetto Corsa in VR, so it might be worth investigating further.
The other thing is that it's not always clear to users which games still do better with a newer CPU. Testing only at 1080p implies everything else is GPU bound, but then look at Baldur's Gate 3.
Glum-Sea-2800@reddit
There should definitely be more VR benchmarks in both cpu and gpu reviews. Doesn't have to be many but at least one. VR is a lot more latency and 0.1~1% lows sensitive than the regular flat titles tested.
My 5800x is great until it suddenly finds out it needs to give a 100% spike for absolutely no reason.
enigmatic_esoterik@reddit
I am glad you named the reviewer.
lizardpeter@reddit
Anyone who plays any games competitively knows that the ultra-low settings and resolution benchmarks are king. No one who’s serious about gaming should be in a GPU bottlenecked situation - ever. I’m glad to see this CPU is actually bringing meaningful upgrades. If someone is GPU bottlenecked, simply upgrade the GPU.
BausTidus@reddit
You seem to forget that better CPUs will improve 0.1% lows, which will still happen at higher resolutions and arguably have more impact than average fps.
RealPjotr@reddit
By "most situations" you basically mean gaming with a GPU?
For the rest of us not gaming, it's an even bigger upgrade.
mauri9998@reddit
The X3D CPUs are made for gaming, they don't expect anyone but gamers to buy them.
RealPjotr@reddit
9800X3D beats 16 core 9950X in some non-gaming software. Who wouldn't buy the cheaper faster CPU, even if AMD markets it for gaming?
FitCress7497@reddit
Why would people who are not gaming buy this instead of a 9900x?
RealPjotr@reddit
Because in a lot of workloads it is simply the fastest consumer CPU on the planet, for example FFMPEG, Google Draco, Ngspice:
https://www.phoronix.com/review/amd-ryzen-7-9800x3d-linux/7
dealerofbananas@reddit (OP)
The target audience for an X3D chip is gaming with a high end GPU btw.
matolati@reddit
No, it isn't. The audience for X3D chips is those who want competitive gaming with very high framerates and stable 0.1% lows, or enthusiasts who want the best available.
RealPjotr@reddit
Or people running FFMPEG, Google Draco, Ngspice and a number of other workloads where the 9800X3D beats all Intel and AMD consumer CPUs, including the 16 core 9950X:
https://www.phoronix.com/review/amd-ryzen-7-9800x3d-linux/7
toalv@reddit
Why would you pick an x3d cpu for productivity?
AK-Brian@reddit
Some non-gaming workloads also take advantage of large cache. Code compilation, databases, web server / homelab service containers (useful for prototyping load simulations), fluid motion simulation, etc. The original 3D V-Cache chips were designed for the Epyc Milan-X series of server CPUs.
Check out where the 9800X3D landed on Phoronix's charts for PyTorch workload responsiveness, as an example:
https://openbenchmarking.org/test/pts/pytorch
Nice!
BedNervous5981@reddit
Can we please stop this stupid discussion every other week? Those benchmarks are there to show differences in gaming performance in CPU-bound scenarios. As someone who mainly plays strategy games, it's very valid information to see that the 9800X3D will run Anno 1800 or Frostpunk 2 massively faster than my 5800X3D, even while I'm using a 4090. It's simply faster at calculating the simulation.
Framed-Photo@reddit
Most people aren't doing calculations like that for the games they play, that's the issue, and it's why posts like this are good.
Most folks see "oh this is the best gaming CPU" and don't really understand what that actually means or how to apply it to the games they play. In fairness, same goes for GPU's. It ends up with folks wasting a ton of money on shit they don't need.
Unless you're playing at 1080p (which if you're considering a $500 CPU just for games, what the heck are you doing at 1080p), or if you play games that specifically see a high gain from X3D chips like you are, then it's a lot of money spent for not much gain to get a 9800X3D.
It's clearly a great chip, but do we really think it's worth a little more than double the price of chips like the 5700X3D or the 7600? Unless it's someone like you who plays incredibly CPU-heavy games that are not the norm, or you're an insane latency freak who wants 500+ fps at 1080p, it probably won't be worth doing, and you can save hundreds of dollars to put towards a GPU, or a 4K monitor, etc.
MajorTankz@reddit
Obviously not. We're talking about high end hardware here. This stuff is not actually supposed to be "good value".
All decent reviews will cover some metric on value or performance/dollar so people who are looking for that will get the answer they want easily. Others who just want the best will be happy to pay for it.
Kashinoda@reddit
People aren't generally aware of when they might be CPU bottlenecked to be honest, it's not the worst philosophy to simply get the best if it makes sense in your budget.
Good example I ran into the other day was playing Hell Let Loose, my friend and I spent some time in the training zone dialing in DSR on an offline map (both have 4080 Supers). The second we connected to an online match he struggled to get 40-50fps on a 5800X whilst I was sitting pretty at 80-90fps on a 7800X3D, neither of us had trouble getting 120+ offline.
Another example is Arma 3. The game is ancient, but if you're playing on a busy server it's all CPU limited. There are lots of scenarios like this which won't make it into a typical review.
Framed-Photo@reddit
Again, what I'm saying is that for most people they simply do not need the best, and won't even notice it.
Your friend could likely replace their CPU with even just a 5700X3D and likely see a large boost in games like you're talking about without getting "the best". That's what would happen if they played tarkov for example. And that's a chip you can easily get for sub $200 USD.
Meanwhile, if your friend was going to get a 7800X3D even when it was available for low prices on amazon earlier this year at $350 (which was limited, they would have had to wait around or get it for a smidge higher), add on any of the recommended mobos from PC part picker ($100 even for a shitty one, you'll probably wanna spend $150 for a decent one), and throw in 32gb of ram ($85), and you're already well over double, close to triple the price. If you wanted the 9800X3D the MSRP alone is nearly $500, and based on what else released it'll probably end up over that, you're nearly quadruple the price.
Heck if they still wanted AM5, chips like the 7600 and the 7500f have been available for under $200 consistently as well, and slightly outperform the 5700X3D.
I ain't saying these high end chips are bad, but no I really don't think it's justifiable for most people. Most would be better served by MUCH cheaper options that perform almost as well.
BedNervous5981@reddit
You started your post with a clickbait title essentially saying: „low resolution tests are useless, look at high resolution tests“. What did you expect.
If you want to make the post „look at the best fps/$", then that's a totally different and valid argument. For me personally, already owning the gaming top dog on AM4, I would get enough oomph out of the new chips to consider an upgrade next year together with a 5090. Someone on a budget and maybe still rocking a Zen 2 chip might be better served by getting a 5700X3D instead of completely upgrading the platform.
Framed-Photo@reddit
Ok see the problem here is that you're replying to a comment that was not a reply to you lol.
I replied to you, and I said that people who primarily play strategy games are not the norm. The thing I tried to imply there was that, yeah it's fine to upgrade to a new CPU if the games one plays are almost entirely CPU bound regardless of resolution.
Like if you're a big factorio player, that game scales directly with an upgrade like this. Other strategy games you play as well scale directly, I'm not denying that.
As I said too, I'm not denying that some people are insane latency freaks who want to play on 500Hz+ monitors, and a CPU like this goes a long way for that. For those people, the CPU means a little more than their GPU. Settings can be turned down to get better GPU performance, but it's hard to get around a CPU bottleneck.
Most people do not fall into these two categories, and for those people, posts like the one OP made are good discussion points. Most people don't need a 9800X3D and will almost never see their games getting CPU bottlenecked hard enough to matter. If you want to disagree for your own personal use case that I already carved out an exception for, then fine. If you want to say that your use case is the norm, I would strongly disagree.
Successful_Ad_8219@reddit
The Intel white knights have to white knight though.
MrElendig@reddit
Not to mention 1% and 0.1% lows in some games.
matolati@reddit
Most people who can't afford the price are trying to discredit the chip and the benchmarks...
Successful_Ad_8219@reddit
Bingo.
Successful_Ad_8219@reddit
It's funny how this "reminder" seems to happen when AMD does well and Intel flops. It also entirely misses the point that just because there exists a 1440p or 4k bottleneck with the GPU, doesn't mean that will always be the case. This CPU is a strong contender for long term ownership just like the 5800x3d.
But what ever. The Intel white knight will ride the white horse right?
wichwigga@reddit
1% lows affect gaming experience more than people think, well maybe not at 4k especially if you have an under powered GPU.
xXMadSupraXx@reddit
It is 40% faster even in 1440p. Even if it is a third of that, it is not "minuscule".
semitope@reddit
40%? Whose benchmarks?
xXMadSupraXx@reddit
The 7800X3D is roughly 20% faster, so the 9800X3D is roughly 40% faster than the 5800X3D.
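The "roughly 40%" figure comes from chaining two roughly 20% gains. Strictly speaking, speedups compound multiplicatively rather than adding, so two 20% uplifts work out to about 44% (the per-step figures below are the comment's rough estimates, not measured data):

```python
# Chained speedups multiply rather than add: two ~20% uplifts
# compound to ~44%, which is why "roughly 40%" is a fair shorthand.
gain_5800x3d_to_7800x3d = 1.20   # assumed ~20%, per the comment
gain_7800x3d_to_9800x3d = 1.20   # assumed ~20%

total = gain_5800x3d_to_7800x3d * gain_7800x3d_to_9800x3d
print(f"{(total - 1) * 100:.0f}%")  # -> 44%
```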
semitope@reddit
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/27.html
16% at 1080p. weird
xXMadSupraXx@reddit
I'm not really sure what to make of this, they must have updated tests? My results say the 78 is ~20% faster than the 58 but yours says the 98 is only like 8% faster than the 58?
I was guessing based on benchmarks I'm already aware of between three.
dparks1234@reddit
Today the 9800X3D and the 9800X perform the same in 4K. Looking at the data one would think you’re paying more money for nothing. 3 years from now when CPU requirements increase the 9800X3D will perform better than the 9800X.
If you don’t test in a CPU-limited scenario like all-low 720p then you won’t be able to know these differences.
RunTillYouPuke@reddit
I assume you meant 9700X. Actually there is a noticable difference in 4K on max settings in many games between those two. For example:
4K benchmark
Source
dparks1234@reddit
Was just a theoretical example, I never looked up the numbers. Point is that CPU limited scenarios need to be tested so that buyers can see the theoretical difference between CPUs even if they appear equal in their current use-case
dealerofbananas@reddit (OP)
The 12600K is within 5% of a 9800X3D at 4K.
This is a budget CPU from January 2022 compared to a state-of-the-art gaming chip in November 2024, almost 3 years later.
CPUs for 4K do not matter as long as you are using something somewhat modern.
Bluedot55@reddit
It really matters what you play though. There are many games that do just wind up gpu bound at 4k, with basically any modern cpu. And if you play mainly those, then that's fine. An average of 100 games where 95 are gpu bound only is relevant if you mainly play those 95.
But there's also a lot more CPU-heavy stuff lately, where it can start to matter, even more so as upscaling becomes very good. Looking at recent popular stuff on Steam, you have Factorio and BG3 both in the top 20 most played games.
BG3, in the Level1Techs review for example, was getting notably lower fps at 1080p than a 9800X3D was getting at 4K. That game was even to the point that a 7900 XTX at 4K was beating a 4090 at 1080p with a CPU switch. Is that game alone worth upgrading hardware for? Probably not. But if that game is important to someone, it can matter whether you're getting fps in the 70s vs maintaining 4K120, or w/e.
Or factorio or any long list of sim games, where how much you can do and how fast you can do it is literally tied to the cpu speed.
Or some of these new games that are designed for 30 fps on the current console gen, like the new monster hunter beta that just got a bunch of hype over the weekend. A 12600k was stuck in the 40s, which isn't exactly ideal.
So if you play games where you're gpu bound and you know it, yea- theres not much point to a really fast cpu. But there's also a lot of things out there where a 100 game average showing near identical performance is going to be missing a bit.
AzysLla@reddit
I am downgrading from 7950X3D to 9800X3D…
Martin321313@reddit
Here this is interesting comparison ... :)
12 y.o CPU + RTX 4070 vs 14900K + RTX 4060
https://www.gpucheck.net/en-usd/compare/nvidia-geforce-rtx-4070-vs-nvidia-geforce-rtx-4060/intel-core-i7-2600k-3-40ghz-vs-intel-core-i9-14900k/ultra/ultra/-vs-
0rinx@reddit
The main game I play (FFXIV) is heavily CPU bottlenecked, so upgrading my CPU will have a larger impact than changing my GPU.
DarthV506@reddit
Yep, MMOs are games that really like the extra cache.
COMPUTER1313@reddit
Civilization 6 with heavy mods (City Lights and Urban Complexity) and extra huge map still runs fine with a RX570 4GB at 1080p high details setting.
The turn times? It was chugging hard on my Ryzen 1600 by mid game until I upgraded to a 5600, and even then it's noticeably long in the late game.
Strazdas1@reddit
Doesn't Civ 6 with heavy mods just cease to function, on account of how they designed the engine limitations?
COMPUTER1313@reddit
Yeah, I only have the Gathering Storm and Rise&Fall DLCs. Even then I can't use other heavy mods with those two, without causing a crash.
inyue@reddit
What are your specs? My 4070 Ti and 12700K get CPU bottlenecked inside Limsa with less than 60 fps, but in any meaningful gameplay session like an 8-man raid I get 175 fps.
Omotai@reddit
Honestly, that goes for most hardware in FFXIV. The thing that causes CPU bottlenecking in that game (and most MMOs) is large numbers of player characters on-screen, which means things like crowded towns and large-scale content (in FFXIV specifically things like hunts). When the player count is constrained (as in dungeons or raids) there isn't much of a bottleneck.
Confident_Hyena2506@reddit
Maybe people don't actually care about the graphics - but just want their Factorio or other strategy game to run faster. This is the best use of x3d chips!
Terepin@reddit
I play at 1440p ultra and I still suffer from CPU bottleneck in several games. And I have "only" 4070 Ti.
Relative-Pin-9762@reddit
Saw a few tests with the 9800X3D vs the 7800X3D with a 4090 running at 1080p... almost no one uses that setup.
79215185-1feb-44c6@reddit
Hopefully one day I am so blessed to have $2500 to buy a Graphics Card.
79215185-1feb-44c6@reddit
Spending less hours working > Spending more hours playing video games.
I still do not regret my purchase of the 7950X3D. 8C/16T in 2024 is laughable.
masterfultechgeek@reddit
I don't have an RTX 4090.
I don't play at 1080p.
I'm not looking at 1080p or 720p benchmarks with a 4090.
I'm not looking at benchmarks where the 1% lows are WELL over 100FPS.
TheGillos@reddit
I want to see 320x240 benchmarks.
Gippy_@reddit
256x144 go go go. That's the lowest YouTube resolution. So people use it to either save phone data, or when their internet is choking.
retiredwindowcleaner@reddit
Almost all of the big outlets do their recent CPU benchmarks at 720p / 1080p ultra (max fidelity).
No one uses low settings, because many of those settings also increase CPU load (contrary to purely upping resolution).
CB, TPU, HWUB... etc.
zarafff69@reddit
I wish reviewers would check out performance on heavy ray tracing games, or just new games that are very cpu heavy. Does anybody care if you get 400 or 600 fps at rocket league? Just show me the results for like the Witcher 3 with ray tracing on. That shit is heavily cpu limited. Lots of new games are cpu limited, but it seems like they are all just testing old games at 1080p, who cares about that?
GoldPanther@reddit
Any good sources for this. I've often wondered how much my 9900k is bottlenecking my 4090 at 4k.
BeefistPrime@reddit
You can use the nvidia app to show an in-game overlay that shows CPU and GPU utilization. Whichever one is capped at 99% is whatever the bottleneck is. Other apps have this function like afterburner but it's a little more complicated.
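The "whichever is capped at 99%" heuristic from this comment can be sketched as a tiny function. The threshold and labels are illustrative assumptions, and note the caveat in the comments: a single pegged core can bottleneck a game even while average CPU utilization looks low.

```python
# Minimal sketch of the overlay heuristic: whichever processor sits
# pinned near 99-100% utilization is the bottleneck. Caveat: games are
# often limited by one pegged CPU core while *average* CPU utilization
# stays low, so per-core readings are more telling than the average.

def bottleneck(cpu_util: float, gpu_util: float, threshold: float = 97.0) -> str:
    if gpu_util >= threshold:
        return "GPU-bound"
    if cpu_util >= threshold:
        return "CPU-bound"
    return "neither pegged (frame cap, VRR limit, or single-core limit?)"

print(bottleneck(cpu_util=62.0, gpu_util=99.0))  # -> GPU-bound
```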
mb194dc@reddit
Absolutely. If you're using a modern resolution (1440p+) and don't have a 4090, gains from changing CPU are going to be minimal.
Strazdas1@reddit
I have CPU bottlenecked games at 4k on a 4070S. It all depends on what you play.
SpitneyBearz@reddit
Just wait for the MSFS 2024 user tests... What if a non-3D 16-core CPU gets better results in it?
Strazdas1@reddit
MSFS isn't the best test; it's very GPU-bound at higher resolutions.
Onceforlife@reddit
Man, I play at 4K 120Hz, and I was doing it with a 4090 and a 12700K; no games I played were limited by the CPU, it was either the GPU or the refresh rate of my monitor (LG C4 42 inch). But some dumbfucks just can't understand that. Maybe in the games you play I'd be bottlenecked, but I was not bottlenecked in my games with the settings I use. Why is it so hard to understand??? Everyone jumps on the keyboard and wants to tell me I'm an idiot. At least now I know some people, at least in this sub, understand.
I've since upgraded to a 7800X3D because it was a good deal on AliExpress back in May or earlier this year. I don't think a 5090 will be bottlenecked by it unless I switch to 4K 240Hz. Even then I doubt it, because I'll be playing the latest games at 4K ultra without DLSS, which will shred the GPU to pieces.
NoBeefWithTheFrench@reddit
Also on 12700k and 4k 120hz. Planning to buy a 5090 as soon as it comes out but still can't justify upgrading CPU.
inyue@reddit
How much did you pay for your 7800 3d on AliExpress?
Onceforlife@reddit
365 Canadian or I think around 260 usd, this is after taxes and shipping
inyue@reddit
Pretty cheap o.o
Did you randomly find it or is is there a price alert or deal alert subreddit or discord?
Onceforlife@reddit
Mostly following r/bapcsalescanada, not a lot of people trust AliExpress tho, so it’s not like they nabbed it up super quickly. As long as you checked daily in the summer before July I think there were like 2 or 3 sales. It went as low as 230 usd I’ve heard
Framed-Photo@reddit
For reference, the 12700k performs within 4 FPS on average of the 9800X3D at 4k. Unless folks are literally only playing strategy games or other games that are gigantic outliers in terms of how CPU limited they are, then upgrading from the 12700k just for gaming at 4k doesn't make much sense.
Even when new GPU's come out, it would have to be so much faster than the 4090 it would be at least a few generations out.
TechPowerUp's 720p average for the 9800X3D with a 4090 is 231 fps; at 1080p it's 200, and at 4K it's 101. We'd need at least a 50% boost over the 4090 across the board, probably more, to start seeing significant CPU bottlenecking at 4K. And even at 720p the 5800X3D is just 22% behind, at 182 on average.
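The percentages in this comment can be reproduced from the fps figures it quotes (the numbers below are taken from the comment itself, not independently verified):

```python
# Reproducing the arithmetic from the comment above, using the fps
# averages it quotes for a 4090 test system.
fps_9800x3d = {"720p": 231, "1080p": 200, "4k": 101}
fps_5800x3d_720p = 182

# How far the 5800X3D trails the 9800X3D at 720p:
deficit = (fps_9800x3d["720p"] - fps_5800x3d_720p) / fps_9800x3d["720p"]
print(f"{deficit * 100:.0f}%")  # -> 21%, i.e. "22% behind" give or take rounding

# GPU headroom: at 1080p the CPU can feed nearly double the 4K frame rate,
# which is why a much faster GPU is needed before 4K becomes CPU limited.
headroom = fps_9800x3d["1080p"] / fps_9800x3d["4k"]
print(f"{headroom:.2f}x")  # -> 1.98x
```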
Yebi@reddit
"Real world" benchmarks are an unrealistic idea with a misleading name. You're not going to find reliable data with good methodology of your exact system. What you actually do is look up benchmarks of your CPU, then look up benchmarks of your GPU at the resolution you want, and expect slightly less than the lower of the two
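The estimation method described above amounts to taking the lower of the two benchmark numbers with a small haircut. A minimal sketch, with placeholder fps values and an assumed overhead factor (neither is real benchmark data):

```python
# Sketch of the method above: look up a CPU-only benchmark (low-res,
# top GPU) and a GPU benchmark at your target resolution, then expect
# slightly less than the lower of the two.

def estimate_fps(cpu_benchmark_fps: float, gpu_benchmark_fps: float,
                 overhead: float = 0.95) -> float:
    """'Expect slightly less than the lower of the two.'"""
    return min(cpu_benchmark_fps, gpu_benchmark_fps) * overhead

# e.g. CPU chart says 200 fps, your GPU at your resolution says 120 fps:
print(round(estimate_fps(cpu_benchmark_fps=200, gpu_benchmark_fps=120)))  # -> 114
```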
ClearTacos@reddit
Yeah, you can't expect your average FPS in every game to go up by 20%, but that's not the only thing about CPU performance that matters.
Since cross-gen console games stopped being released, we've seen more CPU-heavy titles that lock to 30fps on consoles and can run around 60-80 in intensive areas on the best CPUs today. That's the other thing: most games have areas or situations where they're heavier on the CPU and your FPS dips more than usual. That's where you can see those 20% and where they'll be most appreciated. Not to mention every other big release is a stutterfest lately; Z5X3D certainly won't smooth all of them out, but it'll help. There are also sims or strategy games that either have a turn time that speeds up with faster CPUs or are heavily CPU bound, especially late game.
I have never seen this many "you'll never get that uplift in real scenarios!!!" reactions before, having a faster CPU is good if you value a smooth experience beyond average FPS.
john1106@reddit
So as someone who has a 5800X3D and only got onto the AM4 platform 2 years ago, do I need to change my entire PC to AM5 and get the 9800X3D so that I won't get CPU bottlenecked even at 4K resolution?
ClearTacos@reddit
Are you feeling limited by your current system, specifically in CPU bound scenarios? Can you notice these performance drops?
If yes and you have the money/desire to buy a new one, sure, if no, then don't, I can't make a decision for you.
capybooya@reddit
Yep, no matter the GPU, those CPU-limited areas/scenarios will always drop. And it's quite jarring with a very fast GPU to have some areas down, say, in the 70s compared to usually 130+.
If you're short on cash or have other priorities, sure, ride it out with the older or current CPU, but some people just don't know this and assume that the GPU is the limitation all the time, when it's rather most of the time.
john1106@reddit
So as someone who has a 5800X3D and only got onto the AM4 platform 2 years ago, do I need to change my entire PC to AM5 and get the 9800X3D so that I won't get CPU bottlenecked even at 4K resolution?
Framed-Photo@reddit
If you've got the money to spend $500 on a CPU for gaming only, you should be playing at 1440p mimimum, if not higher.
At those resolutions, the CPU matters far less outside of some specific games like factorio, tarkov, etc.
comthing@reddit
People who primarily play strategy games look at new CPUs the same way that action players look at GPUs. It's not just some specific games that benefit, but rather to do with the scale of gameplay.
john1106@reddit
Yes, correct. I'm still on the 5800X3D and I'd rather save money to upgrade to a 5090 than upgrade my whole motherboard and RAM just to get the 9800X3D. Plus I'm playing on a 4K TV, so the CPU bottleneck shouldn't be that bad. And I can use DLDSR to render above 4K to make the game more GPU-bound.
2560x1080p@reddit
I'm in your exact predicament. Right now my build is:
i7 12700F
6950XT
32GB DDR4 3000 mhz
I have a choice of either upgrading my 6950 XT to a 7900 XT, getting a full AM5 build with a 7600X3D/DDR5 and keeping the 6950 XT, or just waiting till the 7900 XTX drops in price. I fucking upgraded my 9th-gen i9 like a month before the 7600X3D came out, and I can't return anything.
What's really encouraging me to upgrade is one game that's bothering me, because I used to run it with supersampling but can't since I changed monitors to 3440x1440, up from 2560x1080.
Gambler_720@reddit
What's up with the insecurity of the 5800X3D owners having to validate themselves every time on how they don't need to upgrade?
The 7800X3D was already a pretty substantial upgrade over the 5800X3D, if someone is too poor for such an upgrade then that doesn't mean it isn't a valid upgrade. The 9800X3D is an even bigger jump going from 5800X3D. Btw I say that as someone who is also too poor to buy a 9800X3D even though it represents a pretty good upgrade over my 7700. But I won't go around saying that no one needs to upgrade from a 7700.
Framed-Photo@reddit
I own a 5700X3D, not a 5800X3D just to preface. I only bought it after the 9000 series was announced.
People can buy what they want, I don't need others to validate my purchase. For my resolution/games/GPU I'm fairly certain I made the best choice, and for others usecases it could be different.
The thing that confuses me is that we can all see the numbers, we can see how good most of these chips are, and we know how expensive something like the 9800X3D is compared to the performance you get, especially at resolutions over 1080p where the differences shrink drastically.
So while you might not say nobody ever needs to upgrade from a 7700, I would argue that if you're at 4k and you're not playing games like factorio or msfs where they are clearly outliers in terms of CPU requirements, then no you definitely don't need to upgrade lol. The 7700 performs within like 2% on average at 4k compared to the 9800X3D lmao.
vedomedo@reddit
Yeah basically, until the 5090 is released, gaming at 4K is kind of «pick whatever top tier CPU». I was contemplating getting the 9800X3D but there's no point as of yet.
Framed-Photo@reddit
The 5090 would have to be at least 50% faster for any sort of bottlenecks to start showing at 4k, and even then they'd still be small.
gentlecuddler@reddit
I understand prioritizing 1080p benchmarks to show the biggest differences, but a few slides in 1440p would be nice.
mckirkus@reddit
It really does matter for those 8 of us running PC VR where you can't drop below 90 or 120 FPS without judders. There is no VRR in VR so 0.1% lows matter.
Jeep-Eep@reddit
I don't have PCIE 4, and that is starting to be painful, and I pay roughly as much for a good cooler and a RAM upgrade. Math isn't as good in my parts.
2560x1080p@reddit
Helldivers 2 is a costly game. It's the game that's fcking over my entire build right now. I recently upgraded to a 6950 XT and then I met Helldivers 2; ah man, it wrecked my rig. Now I may have to get a 7900 XT or better. I can run it at 3440x1440 native, but having used supersampling for so long, I feel like I'm not getting the full value of the game.
Superb_Raccoon@reddit
Only one bed in an ambulance.
Iaghlim@reddit
I have tried to find a specific kind of review: a gaming review done while running lots of other things at the same time.
As an example, sometimes I'm working with Teams, Spotify, Chrome, Excel, PowerPoint, Word, lots of apps open, and for around 20-30 minutes I decide to play something real quick (yeah, thanks home office), and MAYBE (honestly, I have no idea) a 16-core would be better at gaming than an 8-core in this specific kind of usage.
Also, I never see anyone doing these comparisons while streaming or under other loads.
That would be great for people who have specific situations like mine.
colxa@reddit
Congrats on watching the LTT video
Neofarm@reddit
On the contrary, a lot of games are CPU heavy. You don't need a "next gen" GPU at all to push frame rate, but rather a better CPU like the 9800X3D, no matter the resolution. For example, most MMO, RPG, shooter, strategy, and simulation games... The only type of game where the GPU matters more is a single-player, graphics-heavy game at high resolution.
SJGucky@reddit
1080p is actually a good real world resolution.
Most people are using upscaling, and from my tests everything with a base resolution of 1080p looks good.
Meaning DLSS Quality at 1440p, or DLSS Performance at 4K.
KirillNek0@reddit
Yes, yes...
People want real-world reviews, not "theoretical max" from CPUs.
conquer69@reddit
A "real world review" is completely different from a cpu comparison, which is what regular cpu reviews are.
What you want is a system review and you will need to find someone with the exact specs you want, testing the same games you are going to play.
I'm sure you can see why that is a dumb expectation to have from a tech reviewer.
KirillNek0@reddit
But we do have tech-tubers that do/did these reviews. This sub shits on them usually.
Qaxar@reddit
The irony is that a next gen GPU (at the higher end) would more than likely alleviate some of the gpu bottlenecking of some games, which would make cpu performance more important and widen the gap between 9800x3d and other processors.
GTRagnarok@reddit
Yeah, I game at 4K with my 4090 and my 13700K is doing just fine. It's undervolted and averages 60-80W while gaming which is not much more than AMD's CPUs. And Intel's idle power consumption is better which pretty much cancels things out.
Michelanvalo@reddit
I tried making this point recently and got absolutely blasted by this sub for it. Lab results are great, but very few people are buying a 9800X3D/4090 to play games at 1080P. Practical set up results have their value too.
damien24101982@reddit
it depends alot on the games you play.
vialabo@reddit
That is literally how you control for the CPU variable but sure.
tuvok86@reddit
Turns out future proofing is much cheaper when done...in the future
OGigachaod@reddit
Yeah hindsight is awesome.
MrAldersonElliot@reddit
Joke's on you, that's real world for me. I only play competitive shooters online, e.g. CS2 and Valorant, so those are as real-world as settings get.
baron643@reddit
Nowadays you are using either DLSS or FSR at least 80% of the time, and you're probably upscaling from a lower resolution, so yes, 1080p benchmarks matter.
Raikaru@reddit
Lower resolution =/= 1080p. 4K DLSS Quality and Balanced still render higher than 1080p.
Winegalon@reddit
It's still a fair point to make. 1440p is a very popular resolution, and if you use DLSS it renders below 1080p even at the Quality setting.
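Both comments check out against the commonly documented DLSS per-axis render scales (roughly 0.667 for Quality, 0.58 for Balanced, 0.5 for Performance; treat these as approximate, and games can override them):

```python
# Internal render heights for the usual DLSS modes, using the commonly
# documented per-axis scale factors (approximate; games can override).
# 4K Quality/Balanced stay above 1080p, while 1440p Quality dips below it.

DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for output_h in (1440, 2160):
    for mode, scale in DLSS_SCALE.items():
        print(f"{output_h}p {mode}: renders at ~{round(output_h * scale)}p")
```

For example, 4K Quality renders internally at about 1440p, while 1440p Quality renders at about 960p, which is why the two comments above don't actually contradict each other.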
TranslatorStraight46@reddit
If your card is CPU limited at 1080p you won’t need upscaling for 4K performance.
Most people should be running variable refresh rate monitors with their framerate capped to their monitor’s highest refresh rate instead of worrying about CPU bottlenecks.
Mystikalrush@reddit
Any 1440p and/or ultrawide reviewers?
dragenn@reddit
If it drops the price on the 9xxx series CPU, I'm just as happy.
Or even grab a massive discount from an owner that needs higher 0.1% lows...
🤣
lebrowski77@reddit
Most modern AAA single player games are made with 4k 60 fps console and tv gaming in mind. What I really wanna know with these x3d chips is how much they help reduce stutter, how smooth the frame time graph looks compared to other chips when locked to a 4k 60. Elden ring would be the perfect game for this, as it's plagued with all kinds of stutter.
tangosmango@reddit
Any reason to upgrade now from a 7700X? I was initially holding out to upgrade to the 5090 and the 9900X3D or 9950X3D.
I'm running three AW3423DWs, so I'm not sure the 9800X3D will even benefit me all that much.
conquer69@reddit
In Geekerwan's review, COD BO6 is GPU-bound at any resolution, even 360p. Probably because DLSS has a fixed frametime cost. When you are getting 300 fps, that's a frametime of 3.33 ms.
DLSS costs ~1 ms. That means it can tank performance by up to 70 fps, which it seems to be doing.
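The frametime arithmetic here is worth spelling out: a fixed ~1 ms cost hurts high-fps scenarios disproportionately, because 1 ms is a large slice of a small frame budget. A quick sketch using the comment's numbers:

```python
# A fixed per-frame cost (e.g. ~1 ms for upscaling) converts to a much
# bigger fps loss at high frame rates than at low ones.

def fps_after_fixed_cost(fps: float, cost_ms: float = 1.0) -> float:
    frametime_ms = 1000.0 / fps          # 300 fps -> 3.33 ms per frame
    return 1000.0 / (frametime_ms + cost_ms)

print(round(fps_after_fixed_cost(300)))  # -> 231, a drop of ~70 fps
print(round(fps_after_fixed_cost(60)))   # -> 57, barely noticeable
```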
FitCress7497@reddit
This! Also worth noting that those benchmarks are with a 4090 that currently costs $2,000. If you have a weaker GPU, the results are much closer.