[Hardware Unboxed] Real World 9800X3D Review: Everyone Was Wrong! feat. satire
Posted by imaginary_num6er@reddit | hardware | View on Reddit | 157 comments
Gippy_@reddit
While this was a tongue-in-cheek response to everyone wanting 4K benchmarks, there actually is a bit of merit behind this.
At 4K, the GPU is clearly more important than the CPU. Now the question is, how low of a CPU can you go before the CPU significantly matters? Will you still get the same bottleneck with a Ryzen 5600 or an Intel 9900K? The oldest CPU tested here was the 12900K, which did show that for 4K gaming the 12900K is still virtually identical to the 9800X3D.
RentedAndDented@reddit
Yes and no. He avoided testing any game that might be CPU limited at 4k, as they do exist.
SupportDangerous8207@reddit
I play a lot of Helldivers
At 4K it brings my 7800x3d to its knees even though my gpu is pretty weak.
SHOLTY@reddit
This is what I'm saying. The games I regularly play are the ones I immediately thought of when thinking of CPU-heavy games that will crush almost any CPU.
I know for a FACT those 3 games alone are VERY CPU dependent and there's no way they are getting the same fps paired with a 5090 @ 4K (especially since those games really can't max out a 5090 @ 4K lol, so you're not GPU bottlenecked). Helldivers in particular was going down to like 50 fps on my old 5800X3D, and that is literally like half the performance my 9800X3D gives me in that title.
I play those games on a 9800x3d/5080 now and just recently upgraded from a 5800x3d/3080.
The difference with JUST the CPU was massive while I was trying to get a 5080. It literally was night and day in Helldivers 2, Darktide, and Escape from Tarkov.
If anyone watches this and goes away thinking that there is no reason to upgrade your CPU at all, I beg you to reevaluate and understand your use case. If you like to play CPU-demanding games like the ones listed above, or other early access and unoptimized games, consider upgrading for sure.
Like ofc if you are GPU bottlenecked and refuse to turn down settings, the CPU isn't going to be the biggest bottleneck in most games. But I'm willing to bet even in those games the frametimes and your 1%/.1% lows are going to have a massive difference.
But there are a ton of games that are just poorly optimized, that will never max out your GPU and will absolutely dunk on your CPU. I'm thinking mostly of early access games by indie devs here, but think about your use case, people!
GarbageFeline@reddit
The problem with most online games is that they're tricky to benchmark in a repeatable manner.
A good example as well is FF XIV. The official benchmark always makes it look like incredible frame rates can be achieved. Get into a hunt train with 200-300 people and you'll see the best CPUs plummet to 60-80 fps.
SupportDangerous8207@reddit
I’m not even convinced Helldivers is poorly optimised because it hits my cpu so hard it’s difficult to believe
Bad software will overload one thread
Helldivers will use like half of my cores to their fullest extent it’s insane
SHOLTY@reddit
Yeah, my bad, I wasn't trying to throw shade or imply Helldivers is one of those unoptimized games, lol.
Manordown@reddit
I agree 100%, any online game benefits from a faster CPU. Modding games also benefits. Lastly, VR needs all the single-threaded performance you can give it.
BulletToothRudy@reddit
Yeah but these are outliers, TW Attila will grind any cpu to dust even at 720p, but this doesn't really matter for general discussion.
Strazdas1@reddit
They are not outliers. They are typical experience in certain genres.
Plank_With_A_Nail_In@reddit
The genres are outliers themselves.
madmk2@reddit
It has always been like that. Remember all those years when intel was a couple percent better sucking up twice the power for twice the money? People bought it anyway.
For one reason or another, people like to have the best thing. For GPUs that has become unobtainable for most, but spending a little extra on a CPU you don't really need isn't going to bankrupt you the same way.
At the end of the day, it's still a hobby for most and is supposed to be fun. Not every decision has to be logical.
Vb_33@reddit
With modern AAA games, the faster the CPU you have, the less of a stutter fest you will have. Too many things hammer the CPU nowadays, from data streaming to decompression to shader compilation to ray tracing. It's not the 2600K days anymore.
Plank_With_A_Nail_In@reddit
Maybe reviewers could provide proof of this instead of us just guessing?
Vb_33@reddit
They do. The reason I listed so many is because it depends on the game. For example, FF7 Rebirth has the industry's best traversal stutter in that there are 0 traversal stutters despite being an open-world UE game; that's a first. But the game does suffer from shader comp stutter unfortunately, and since it's a UE4 game it's even harder for them to resolve it.
To see what I mean, just look at the latest DF review: The Last of Us Part 2. And guess what, that game hammers the CPU with shader compilation to the point where you're CPU-limited on a 9800X3D to 80fps with a 5090. I feel most people here don't follow the AAA gaming scene and just assume every game is CS2 or Doom 2016.
raydialseeker@reddit
The HUGE elephant in the room is upscaling.
Suddenly 4k dlss4 performance actually looks insanely good and more than doubles frame rates. Now the cpu just gets nuked
If you have a 5090 you probably also have a 4k 240hz monitor. Dlss4 performance lets you hit that frame rate in a lot of AAA titles. Native res gaming is dead with how good upscaling has gotten.
Plank_With_A_Nail_In@reddit
I'm kinda hoping frame gen kills the desire for ever higher frame rates and we can go back to getting better image quality. The Xbox One era made the PC gaming crowd only value fps, as that's all they had over consoles; it's been a disaster for video game progress.
raydialseeker@reddit
Ever seen Alan Wake 2, Cyberpunk 2077, or Black Myth: Wukong with path tracing? Games have never looked better.
honeybadger1984@reddit
Upscaling is a good argument for a stronger CPU. If you play at 4K but render at native 1080P, suddenly it matters.
wanescotting@reddit
This is the way. I skipped the 5090 and upgraded my platform.
cart0graphy@reddit
This is fundamentally testing two different things. It is essentially not testing the product, but testing scenarios in which the product cannot reasonably perform to its specifications.
If 4K gaming is the only workload you have, then yes, I agree that at this certain point in time you can't capitalize on the potential of a better CPU (but it is not a guarantee that this will continue to be the case).
Sipas@reddit
Not even that. I've been burned by reviews like this before because they can never fully cover real life scenarios, like MMORPGs, online shooters, simulators, virtual reality, etc., and even if they try, it's not representative of actual gameplay. On paper the Ryzen 3500 was practically on par with the 3600 in gaming, but it was a horrible experience for me. Upgrading to the 3600 was a night and day difference. I'm 100% sure there are games that'll choke most of the CPUs in the video at 4K in certain realistic scenarios.
cart0graphy@reddit
Oh yeah I agree, I bought my 9800x3d exclusively for WoW and cpu bound games.
msshammy@reddit
80% of my gaming time is spent in WoW. The 9800 x3d was the biggest upgrade ever.
Mastotron@reddit
This was my biggest reason for upgrading from a 12900k and it was very clear after launching the game.
Capable-Silver-7436@reddit
yep the extra cache helps mmos so much especially on raid night
Swaggerlilyjohnson@reddit
I had a 7800X3D build and a 4070S on a 4K monitor. Looking at most reviews you would think there is zero chance I could be CPU bound in essentially 99.9% of games.
I was actually frequently CPU bound in many games like Elden Ring, Helldivers, Black Ops 6, etc.
Why? because I was mostly playing at all low settings and using dlss perf or even ultra performance. No one really tests games like that and people would say "well it's not 4k you are heavily upscaling" true but the fact remains I was CPU bottlenecked. I wanted really high framerates and CPUs matter more for that. In some games you can 5x or even more your framerate with different settings and upscaling.
Reviewers can't test every configuration. I wouldn't ask a reviewer to always test like I was playing and everyone would be saying it's dumb to test like that because who is going to buy a 4k monitor and play like that but I still get useful information from the 1080p and 720p CPU game testing because it tells me the framerate I can get with a CPU if I change the settings to make it happen.
What determines how fast a CPU needs to be for you is more about what framerate you want than your monitor resolution or even your GPU (within reason). If you want 100+ fps on a 4070, even on a 4K monitor, in many games you can actually make it happen (no framegen either). But even on a 5090 I wouldn't get a consistent 150 in Helldivers, because even the 9800X3D isn't fast enough no matter what you do.
There are tons of ways to make your bottleneck the CPU and maybe 1080p is not "real world" but neither is all ultra settings or no upscaling if I had to guess.
IshTheFace@reddit
Daniel Owen did a video about this showcasing Baldurs Gate 3 where the bottlenecking happened on both cpu and GPU depending on the scene.
Framed-Photo@reddit
HUB is probably my favorite tech review outlet, but their refusal to admit there's even some merit to testing like this kinda rubs me the wrong way. Especially after the whole B580 scaling fiasco, where they themselves managed to show that not only does the B580 scale horribly even when supposedly 100% GPU bound, but even AMD and Nvidia cards can also see decent performance variance while GPU bound.
HardwareUnboxed@reddit
Couple of things here.
Firstly, you are confusing GPU reviews with CPU reviews; this video is about CPU reviews, not GPU reviews. Even so, your B580 example is an outlier; this issue, at least to that degree, is not a thing with Radeon or GeForce GPUs.
As for the CPU testing, asking the reviewer to arbitrarily GPU-limit performance to represent 'real-world' performance is neither real-world nor useful.
The only right choice here is to minimize the GPU bottleneck, not try and manage it to a degree that you think makes sense. GPU-limited CPU benchmarking is misleading at best.
ryanvsrobots@reddit
Maybe if that's the only test you did but no one is asking for that. But if it's supplemental with the obvious context of "I want to know what to expect at 4k" I don't see how it's misleading.
It's totally ok to just not want to do the extra work but calling it misleading at best is... misleading.
CatsAndCapybaras@reddit
If you want to know what the performance is in a GPU bound scenario, you would watch the GPU review. Even as a supplemental addition, it provides no new data to test CPUs at GPU limited scenarios.
CPU reviews are to help people choose between CPUs when they are buying, not as a way to estimate how many frames you will be getting.
ryanvsrobots@reddit
But this video actually proves that upgrading my CPU would be a waste of money. The CPU review would mislead me into spending money for nearly zero benefit.
yo1peresete@reddit
"zero benefit" - in non cpu limited games - or even scenes, for example 5090 showed over 70 fps in stalker2 with 9800x3d - you know what's funny? There's plenty of scenes and story moments where 9800x3d drops below 60fps in stalker2. and stalker2 is not only poorly performing CPU game + more game's to come.
CatsAndCapybaras@reddit
It does nothing of the sort. This video only tells you that you can play AAA titles at ultra 4k with shit framerates if you have a 5090. If that's what you want to do, then go for it.
ryanvsrobots@reddit
It does and I have no idea why you're so salty about it.
HardwareUnboxed@reddit
No idea why you have been downvoted here, you are correct, this is the intelligent answer.
madmk2@reddit
If you're ever feeling bored, I would still love to see a deep dive on how much CPU performance is required for certain breakpoints. It can be pretty hard to accurately gauge what someone should buy if they were playing at 1440p with a 9070 XT, for example.
Strazdas1@reddit
Playing what? I can give you games where a 9800x3D will choke before a 3050 does at 4k.
conquer69@reddit
That varies on a per game basis and per scene inside each game. Some things can run well at 4K on a 9070 xt. Others need 720p.
There isn't a good way to get that data without spending hundreds of hours testing. The best way so far is subscribing to multiple reviewers that each test different things.
capybooya@reddit
Exactly. Those who insist on getting a brand new GPU for their older CPU and playing at high resolution completely ignore the fact that the frame rate will completely tank in various scenarios. It's completely game dependent how often, but it's extremely noticeable, shows up in the 1% and 0.1% lows, and often also impacts the average somewhat.
Strazdas1@reddit
If you are getting GPU limited then you are using software unfit for CPU benchmarking.
Numerlor@reddit
I think the disconnect here is that you're doing CPU only reviews (or GPU only), while people are looking into these trying to buy a whole system. There's a portion of viewers that enjoys the reviews for purely entertainment value or to stay up to speed, but the other portion just wants to buy a computer, and showing a CPU as a clear winner on most stats will get people to buy it, even if they don't need it. Think of e.g. a parent buying their kid a computer and the kid getting all info from the reviews
I can guarantee that most people buying the 9800X3D, or the 7800X3D/14900K/13900K previously, did not need the power at all and would've gotten similar performance from a cheaper CPU. Right now I'm seeing a lot of people with a 9800X3D. It sure is a great CPU, but with the demand its price is also very inflated, and the FPS increase won't nearly be worth it on a lower-end GPU compared to, say, a 9700X.
honeybadger1984@reddit
Some of it is the audience. Reviewers and online enthusiasts aren’t shy about discussing the CPU sitting idle at 4K frame rate wise, or barely any difference at 1440P. But people see bigger number better must buy, and ignore the context of synthetic benchmarks or 1080P.
The discussion does get muddled if people with high end GPUs use upscaling for more frames, rendering at 1080P performance.
WuWaCamellya@reddit
Also, side note, but for people who want to get an idea of whether a certain CPU will bottleneck their GPU at 4K: they could just watch your CPU reviews and look at the 1080p average, cross-reference that with your GPU review of their card's 4K average, and get a fairly good (while not perfect) idea of whether it is a sensible pairing. E.g., if the 7600X has a higher 1080p average in similar games than the 5080 has at 4K, then you literally have all the info you need to know that pairing the two is reasonable if you intend on gaming exclusively at 4K. People keep complaining and begging for the information when it is already there if they really want it.
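The cross-referencing method above boils down to taking the lower of two review numbers. A minimal sketch, with made-up placeholder FPS figures rather than real review data:

```python
# Estimate 4K performance for a CPU + GPU pairing from two review numbers:
# the CPU's uncapped 1080p average and the GPU's 4K average. You roughly
# get the lower of the two, since the slower component is the bottleneck.

def expected_fps(cpu_1080p_avg: float, gpu_4k_avg: float) -> float:
    """Return the estimated 4K frame rate for the pairing."""
    return min(cpu_1080p_avg, gpu_4k_avg)

# Hypothetical numbers for a 7600X + 5080 pairing (placeholders, not real data):
cpu_avg = 180.0   # 7600X, 1080p CPU-review average
gpu_avg = 110.0   # 5080, 4K GPU-review average

est = expected_fps(cpu_avg, gpu_avg)
print(f"Estimated 4K FPS: {est}")  # GPU-bound here, so the pairing is sensible
```

As the later replies note, real games don't always scale this cleanly, so treat it as a sanity check for spotting nonsense pairings, not a precise prediction.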
Framed-Photo@reddit
I agree that in theory if you have something like a 7600x at 1080p you can just use that data combined with the 5090's 4k data to see where you'll be limited. That's basically what HUB has suggested viewers do if I'm not mistaken.
In practice though, it sometimes doesn't work that well because of some quirk in how the game performs or when using certain hardware combinations. Sometimes games just... scale unpredictably with different CPUs, or certain settings have noticeable CPU performance hits that might not have been caught in the benchmarking, etc.
It's just part of the problem with using games as a metric for trying to test objective hardware performance. Most games don't ONLY tax one part of your system, even if we try to minimize variables as much as possible. The CPU is still a variable in a GPU-bound scenario and vice versa, and depending on the hardware and the game tested, that difference can be minimal or huge.
WuWaCamellya@reddit
Which is why I said that it is not perfect. It is sufficient to get a solid idea of when a pairing is complete nonsense though.
Framed-Photo@reddit
I guess we can have a difference of opinion there. I don't believe it to be sufficient, at least not all the time. It can actually be quite misleading depending on the game and how the separate CPU and GPU benchmarks were performed.
VastTension6022@reddit
"erasing a real world bottleneck is the only way to get real results"
What's really misleading is promoting expensive CPUs promising extra frames that don't exist.
Like I'm sorry, but do you actually hear yourself?
HardwareUnboxed@reddit
The frames are very real and they can be unlocked using a number of configurations. You seem to have misunderstood what a CPU review is and how important this data is for purchasing the best value or best performance CPU. Perhaps this small section of a recent video will help you understand a little better: https://youtu.be/5GIvrMWzr9k?si=4lzygZG-wGSSTRox&t=1745
soggybiscuit93@reddit
Because there are too many permutations of CPU + GPU combos. If the game is limited by dGPU performance, you're not actually testing the CPU. And you can figure out if the game would be limited by the dGPU by just watching the dGPU review, comparing the CPU and GPU FPS figures for a particular game, and recognizing that you'd be getting the lower of the two if you bought them.
GPU limited CPU reviews are just asking to be spoon-fed the info of those specific games that were tested. There are plenty of games that are CPU limited that aren't used in reviews because it's very hard to consistently replicate the test between runs - stuff like MMORPGs or simulators, etc.
Gippy_@reddit
The problem is that GPU reviews done by the big names aren't done with any sort of CPU scaling. They are done with the best CPU and then are compared against older GPUs. This ends up having the "9800X3D with a 1080Ti" scenario that people laugh at. However, people don't tend to upgrade CPUs as often as GPUs due to platform limitations. So the reverse situation is more likely: Will that RTX 5090 work well on your legendary 14-year old i7-2600K @ 5.0GHz (+47% OC) Sandy Bridge battlestation?
There are certainly smaller YouTube channels that take the time to test new GPUs with old CPUs and vice versa, but usually that info comes out weeks or months later, and the data takes a bit more effort to find.
soggybiscuit93@reddit
I do think something along these lines would make for a great video that I would definitely like to watch.
I do, however, think it's just not realistic for launch day reviews and will need to be a video released at a later point.
basil_elton@reddit
Or you could just take a representative card for its appropriate resolution - like RTX xx60 for 1080p, RTX xx70 for 1440p and RTX xx80/90 for 4K and then give us the data for which CPUs fail to make the cut for delivering a reasonably high FPS target like 120 FPS average, at high settings, without upscaling.
It will be far more useful than saying "CPU X is 20% faster than CPU Y" because that is only applicable for those who have the fastest GPU in that particular circumstance.
If the temperature at noon is 30°C and at night is 20°C, we don't say that it was 50% hotter in the day than at night.
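The temperature analogy works because Celsius has no true zero, so percentage comparisons on it are meaningless. A quick check on an absolute scale:

```python
# The "50% hotter" day, recomputed on an absolute temperature scale.
# Celsius is an interval scale (arbitrary zero), so ratios of Celsius
# values are meaningless; Kelvin has a true zero, so ratios make sense.

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

noon, night = 30.0, 20.0
naive_ratio = noon / night                                   # 1.5 -> "50% hotter"
true_ratio = celsius_to_kelvin(noon) / celsius_to_kelvin(night)

print(f"Naive Celsius ratio: {naive_ratio:.2f}")
print(f"Kelvin ratio: {true_ratio:.4f}")   # ~1.0341, i.e. only ~3.4% "hotter"
```

FPS, unlike Celsius, does have a true zero, so the analogy is about how misleading bare percentages can feel, not a claim that FPS ratios are invalid.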
timorous1234567890@reddit
1080p is more relevant than ever with more and more upscaling being used. 4K at Performance upscaling renders internally at 1080p, and 1440p at Quality is sub-1080p.
So no, just test at 1080p native with a top line GPU and compare CPU performance.
That is the only way to know if a CPU can push a particular game at a desired frame rate. If you want 120 fps and no CPU can manage that mark then you need to wait for patches to improve performance or for new CPUs to brute force it because no amount of tuning settings will overcome a CPU bottleneck.
basil_elton@reddit
There are actual games that are both performant and CPU-heavy that do not need any patches to improve performance.
Have you considered that Alan Wake 2 with ultra settings at 4K DLSS Balanced, i.e. roughly 1080p internal, is irrelevant to someone with an RTX 4060, yet that doesn't mean such a person isn't playing games where CPU differences show up without needing an RTX 5090 to "eliminate the GPU bottleneck"?
Framed-Photo@reddit
The issue I see in most modern benchmarks is the lack of scaling testing. As you guys showed in this video here, and this one too, the scaling we're seeing is not 100% predictable and/or consistent, for both CPUs and GPUs.
I can elaborate on what I mean if you want; maybe you'd be willing to give some insight?
althaz@reddit
There is *ZERO* merit to testing CPUs at higher resolutions. Best-case scenario it's a negative. When you're testing CPU performance, you need to test CPU performance. You cannot do that if the GPU is getting in the way.
However there is *absolutely* room for additional content that's far removed from CPU reviews where you look at how systems should be balanced, where and when different components matter, etc.
And then there's the other side which is benchmarking *software* (which is not something I think HUB does, I am not across all of their content so please correct me if I'm wrong?). There you do want to use a variety of hardware and a variety of settings as well. But that is the absolute opposite of what you want from a CPU review.
Strazdas1@reddit
testing CPU in higher resolution is the most useful form of testing. If you are getting GPU limited thats a signal you are testing something thats not fit for a CPU test in the first place.
Framed-Photo@reddit
I would agree if the software being benchmarked was entirely CPU bound, but video games are not. They will always have SOME variance based on what GPU you test with, and that variance isn't always predictable.
Like for a synthetic benchmark it obviously makes no sense to do that with a 4090 and then a 4060 or some shit, but games scale in weird ways that often aren't that easy to predict, so getting hard numbers instead of guessing and hoping things scaled as you thought they would, could be nice.
soggybiscuit93@reddit
I'd like to see that, but often times reviewers just don't have enough time to get their benchmarks done in time between when they receive a sample and when embargos lift.
I would like to see a 2nd, followup review that comes out when they complete it that includes more detailed information.
Or at least some more CPU-bound games. I imagine they use comically-high-FPS esports benchmarks as a fill-in that's easily reproducible. Would like to see something like Bannerlord with maximum units or a Cities: Skylines 2 late-game population test (idk, I'm sure there's something they can find)
SubmarineWipers@reddit
Except the merit is mostly imaginary. I just upgraded from a 12700 with DDR4 to a 7700X with DDR5, and while these two look almost identical in internet benchmarks, my CPU load in Veilguard with RT dropped from a constant 100% (on all 12 cores) to 40-60%. The game is much more fluid and less choppy in extremes (not even 1% lows, maybe 0.1%, which nobody tests).
It also makes a load of difference for input lag using Framegen - previously Stalker2 was almost unplayable due to input lag, and now it is mostly okay.
FG with path tracing in Cyberpunk2077 is better, but still too slow for me - I suppose an X3D would make another massive improvement for this - looking forward.
No tests truly cover how much your gaming experience improves with a newer generation CPU.
Strazdas1@reddit
It could. It wouldn't even be hard. They just have to stop being braindead and stop testing the most GPU-limited games they can find for a CPU test.
soggybiscuit93@reddit
A 12700K with low latency, high performance DDR5 is generationally faster than a 12700K with DDR4.
puffz0r@reddit
Yeah but if you're going to move to a new platform you might as well go to one with longevity... And not one that is 13th/14th gen
soggybiscuit93@reddit
12700K -> 7700X is basically a side grade. Then if he upgrades to Zen 6, that just seems like a really expensive, roundabout way of slightly improving performance every 2 years.
Would've just been better off originally going with ADL and a DDR5 board and waiting until something more substantial of an improvement came out.
Like, if you're gonna go through all the cost and effort of switching from ADL to AM5, why bother with non-X3D?
greggm2000@reddit
If the commenter was like me, they got 12700K + DDR4 at launch, when DDR5 was only available at 4800 MT/sec, was really expensive, and was slower than the DDR4 available at the time.
Myself, I plan to go Zen 6 X3D when it arrives.
SubmarineWipers@reddit
Exactly like this, DDR5 platform was way too expensive in the beginning.
For the previous commenter: I saw no point in investing in a dead platform. Instead I sold the old one, added 300 USD, and bought something that works well now and can be upgraded to X3D when they reach normal prices (~400 USD instead of the 600 it is now).
ExplodingFistz@reddit
Upgraded to the 7700x from a 10400f. Surprised how hot the Ryzen chip runs, but it is a beast for gaming
Strazdas1@reddit
if GPU is more important than CPU at 4k, you are using wrong game to test CPU. Thats it.
Tuna-Fish2@reddit
... Note that while the 12900K is a CPU that works on a DDR4 platform, it performs much worse if used on one, iirc by ~20%. To the point where, while a 12900K on DDR5 isn't a bottleneck, the same CPU on DDR4 would be.
ansha96@reddit
Depends what DDR4 RAM you use; 4000 C14 B-die is still faster than any DDR5 in many games that are more sensitive to latency....
Gippy_@reddit
Are you referring to this video? Once again, this used 1080p testing.
I'm willing to bet that at 4K, the effect of DDR4 vs. DDR5 is negligible.
Morningst4r@reddit
Problem is, no one is just setting every game to 4K ultra and putting up with whatever framerate they get. I suppose if anyone is doing this, it would explain why so many people complain about "bad optimisation" even in games that run pretty well if you change settings. If I had a 5090 I definitely wouldn't be happy playing Cyberpunk and AW2 at 30 fps.
At 4k DLSS performance and high/very high settings I can guarantee the 9800X3D will be noticeably better in many games.
Also, if you want to see price/performance numbers that would really confuse the complainers, take a look at RAM. 8GB will get you 99% of the average frame rate in most games for less than 1/8th of the price of 64GB! (and is obviously a terrible idea unless you're at the absolute minimum budget).
The8Darkness@reddit
Then you're testing games, not CPUs. If you're a game benchmark channel that's valid. If you're a hardware benchmark channel it's not.
Also, it shouldn't be hard to figure out: if a CPU can provide 100 fps at 1080p and 100 is enough for you, it will also provide 100 at 4K if the GPU can keep up.
Aleblanco1987@reddit
I'd like to see frametimes at 4k. Maybe the average is the same but there are smoother cpus
Game0nBG@reddit
1% lows are affected more than average fps. Also, if you're changing the whole system together, then get a mid-tier CPU like the 7600 and pump all the money into the GPU. If you change parts as you ideally should, you want your CPU to hold out for the next GPU after your current one.
gokarrt@reddit
imo there is a growing need for more qualitative analysis of this gear. testing without features that almost everyone uses (upscaling), is growing increasingly disjointed from the user experience.
back in the day, hardocp used to try something like this. they would establish a performance baseline (say, 4K (effective) @ 60fps in game X), and then they'd tell you what settings you could use on each GPU to achieve that baseline. i think about that a lot, and i think modern reviews will start to move into something similar - i know DF has talked about it several times.
No_Guarantee7841@reddit
You mean at 4k native. 4k can still have a lot different internal render resolutions. Just like any other res in that regard tbh.
BrightCandle@reddit
While people buy a CPU(MB+RAM) and a GPU and these are technically separate purchases the interplay between the two clearly impacts performance. There is and always has been value in determining when a component upgrade makes sense.
Anandtech and a lesser extent Tomshardware used to always include older very popular CPU products in their GPU testing for this reason, how the GPUs scale and which had worse driver overhead mattered. The way the Arc driver overhead problem was hidden despite people noticing it is symptomatic of a blind spot in how GamersNexus, HardwareUnboxed and the other youtubers test things.
There is value in knowing the worst CPU that can still game at 1440p and 2160p without hampering a brand new GPU too much, because it's a real-world scenario many people find themselves in, as they don't have the money to just upgrade the entire computer every couple of years. We keep old SSDs, motherboards, CPUs, and RAM around for as long as they're still good enough, because we are budget constrained.
The way HUB and GN behave, we are budget constrained on purchases, yet old products don't really exist beyond the prior generation. In contrast, yesterday Tomshardware did a GPU comparison going all the way back to the Riva 128.
sidEaNspAn@reddit
So I actually have some data on this! Although just a single data point.... I have a 4090 and play at 4k
I upgraded from a 9700k to a 9800x3d, using 3dmark steel nomad as a benchmark I am seeing almost a 100% increase in frame rate (not overall score!) during the benchmark run.
capybooya@reddit
I mean, not very low at all. There are always some CPU-bound scenarios, even in relatively 'simple' games. In those areas the frame rate will skydive. If you're one of those people who stubbornly stays on Coffee Lake or Zen 3 (or god forbid Sandy Bridge) with a 4000 or 5000 series GPU and can live with those moments, which get more frequent in newer games, by all means keep riding that delusion into the sunset.
TheMegaDriver2@reddit
This is pretty much why I got a used 12900K to replace my 12400F. Much cheaper than going AM5. Used 13th/14th gen are out of the question since you never know if they are good or not.
But the 12900k is perfectly capable of not being the bottleneck.
Zednot123@reddit
There have also been some straight-up platform differences in performance when GPU limited in the past, where you could see measurable and repeatable 1-2% performance differences.
Just because you are not CPU/memory limited doesn't mean there can't still be latency bottlenecks that affect performance.
Daffan@reddit
This is actually why I bought a 14600k on sale lol. It was $340 whereas in my country, the 9800x3d is $889. I already had a 4k 144hz monitor.
uBetterBePaidForThis@reddit
Well, 5600x was not enough
kazenorin@reddit
I remember there was a hardware review site that uses target graphical fidelity to test stuff. That could be kind of useful. The thing is that they'll have to test both combinations.
timorous1234567890@reddit
That was [H]ardOCP. They used a maximum playable settings metric so they would have a target FPS (say 60 or 120 or whatever) and then tune the settings to provide the best IQ possible at that frame rate.
Cheeze_It@reddit
I miss that era of computing.
LuminanceGayming@reddit
unless you consider not using super mega ultra graphics and instead (i know this is illegal here but still) using high graphics.
timorous1234567890@reddit
After the intro they should have gone straight for Civ 7 at 4K native and just done a turn-time test. Then do a Paradox grand strategy game at 4K native but measure tick rate.
eubox@reddit
he purposefully chose the titles to be as GPU limited as possible; that's why at the end of the video he says sarcastically: "and I'm glad it's now on me, to determine how GPU limited your GPU gaming should be, and I've decided: as GPU limited as possible"
timorous1234567890@reddit
Tbh if they want it as GPU limited as possible, they might as well use the iGPU on the 9800X3D's I/O die.
anders_hansson@reddit
The problem with using the iGPU when testing a CPU is that it competes with CPU resources (especially memory bandwidth, but also the power budget I would assume), so your testing would be irrelevant for a dGPU user.
eubox@reddit
then you'll just be comparing iGPUs, and some CPUs don't even have one (Ryzen 5000 series and older)
Morningst4r@reddit
Easy, just put big FAIL marks on those - totally unusable
Strazdas1@reddit
that would be an actual useful test instead of april fools one.
superamigo987@reddit
You know some morons are going to take this seriously lmao
inyue@reddit
I didn't watch it yet but is this a fake video with fake results for 4/1?
Szmoguch@reddit
real video with real results
inyue@reddit
Hnn, so I wonder why it's wrong to take it seriously.
eubox@reddit
because it's a CPU comparison in heavily GPU-limited scenarios, just to make fun of people crying for 4K max-settings benchmarks in CPU reviews (in these benchmarks the 9800X3D is equal to the 285K, 14900K, and even the 7600X)
Strazdas1@reddit
if you are GPU limited you are using wrong software to test CPU in the first place.
eubox@reddit
yes and that is what this video is making fun of
Strazdas1@reddit
no, this video is making fun of himself, because instead of switching to the correct software, he's just gimping GPU use and still using the wrong software in his non-joke tests.
eubox@reddit
what software are you talking about? the games themselves or what?
Strazdas1@reddit
Yes, the games he chooses to test it on.
Embarrassed_Club7147@reddit
Because we are testing different tires on a car that's swimming. Our conclusion is that the tires don't affect our swim speed, therefore we can use any tires on the car even when it's driving. It's not wrong data, but it's useless.
Hefty-Click-2788@reddit
It's highlighting the absurdity of people complaining that CPU doesn't matter because at 4K you're GPU limited on basically any modern CPU in most games.
The truth is that most people play at 1440p, and people who play at 4K are almost always using upscaling tech at an effective res of ~1080p-1440p. You have to contrive this silly scenario to get the results these people claim and want to see to validate their purchasing decisions.
If you actually only play at 4K native res on these types of games (no simulators, 4x, MMO, etc) then I guess this video is right up your alley and the results can be taken at face value.
terraphantm@reddit
I imagine a larger percentage of people spending >$1k on a GPU have 4K monitors and play games at 4K. There is some merit to knowing for sure whether your existing CPU is good enough for the game. Or, for example, whether it's reasonable to skip the 3D cache because you do other things that would benefit more from having more cores.
Hefty-Click-2788@reddit
Yeah, more information is always good. The video does show that even with a 5090, you will be well below 60FPS in games with path tracing at 4K. Those folks are more likely to play with upscaling enabled, at which point the CPU performance will be more of a factor than it is in this extremely GPU limited example. While it's interesting to see, I don't think the examples are really useful for anyone making a purchasing decision.
OliveBranchMLP@reddit
Incomplete data, limited testing methodology, biased sample.
LEMental@reddit
People forget what day it is.
Zaziel@reddit
That’s the best part of it.
R1ddl3@reddit
I unironically think this is info that should at least be mentioned/emphasized in serious CPU reviews though. People see the 1080p graphs thinking that's the difference they can expect if they were to upgrade, without realizing that at higher resolutions the CPU matters way, way less. Clearly a ton of people come away from CPU reviews with that misconception, based on comments you see all over the internet.
Hefty-Click-2788@reddit
What would be useful is to bench 4K using DLSS/FSR4 performance mode and 1440p quality. A realistic and very common real-world use case.
CodeRoyal@reddit
Isn't performance mode basically 1080p with some overhead?
Hefty-Click-2788@reddit
Yeah basically. But I still think it'd be a good answer to people who complain about 1080 benchmarks not being relevant, even if the results are about the same.
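Since the effective-res point comes up repeatedly in this thread, the per-axis render scales behind the upscaler modes make it easy to check what the CPU actually has to feed. A minimal sketch, assuming the commonly published DLSS/FSR scale factors (the balanced value is approximate):

```python
# Per-axis render scale for common upscaler modes (assumed defaults;
# DLSS and FSR both use roughly these ratios).
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,
}

def internal_res(output, mode):
    """Return the (width, height) actually rendered before upscaling."""
    s = SCALE[mode]
    return round(output[0] * s), round(output[1] * s)

# 4K output in Performance mode renders internally at plain 1080p,
# so the CPU load looks a lot like a 1080p benchmark.
print(internal_res((3840, 2160), "performance"))  # (1920, 1080)
print(internal_res((2560, 1440), "quality"))      # (1707, 960)
```

So "performance mode is basically 1080p with some overhead" checks out: the overhead is the upscaling pass itself, not the render resolution.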
alpharowe3@reddit
I feel like we go through this every CPU launch, so as long as you are in the hardware space for more than 1 launch you would know this.
Strazdas1@reddit
Then why do reviewers never learn?
R1ddl3@reddit
Eh, there are always a ton of first time pc builders watching reviews. Also a lot of people aren't enthusiasts who follow hardware for fun. They build their pc and then don't follow hardware until it's time to upgrade a few years down the road.
alpharowe3@reddit
Yeah, but you also have to consider this video takes dozens of man-hours, maybe more, probably isn't popular enough content to make the money back, and doesn't reveal any new or interesting information.
R1ddl3@reddit
I'm not saying they should actually run their full suite of tests at higher resolutions. They should just very clearly spell out that the differences are going to be much smaller at higher resolutions and maybe include 1 or 2 graphs to drive the point home. Like very clearly saying "if you meet x criteria, you probably won't see much benefit from this cpu".
CodeRoyal@reddit
That is mentioned at every CPU launch cycle.
Slyons89@reddit
If those noobies read any of the comments, they will be made privy to this information dozens if not hundreds of times on every single video and post about CPU reviews, because a small contingent of commenters always fails to understand the point of CPU reviews. It's become so meta at this point that the channels making a video don't need to bother, because it's always debated in the comments.
Xplt21@reddit
While it does matter way less, one of the points of the video is that these cases aren't how the games are usually played, despite people saying they are. Unless you are buying a high-end GPU to play at 40-60 fps, you will be using upscaling or lower RT settings, which will boost the frame rate and make the CPU more important. With that said, a 9800X3D or 7800X3D isn't making much of a difference for most use cases when playing at 4K, but it will probably age well, and if you find one close to MSRP it probably won't be that bad of a deal compared to other CPUs (and if it's 4K gaming, the budget is probably reasonable anyway, so might as well make it last).
capybooya@reddit
Games are complex and have very varying workloads depending on the surroundings, materials, number of NPCs, etc. Those 1080p graphs are relevant because performance will completely tank back to the CPU's baseline, even if it's only 1%, 5%, 10%, or 20% of the time. That is very noticeable.
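That "tank back to the baseline of the CPU" behavior falls out of a toy bottleneck model: each frame costs some CPU work and some GPU work, and the slower stage caps the delivered frame rate. A sketch with invented numbers (it ignores pipelining and overlap, so it's only illustrative):

```python
def delivered_fps(cpu_fps_ceiling, gpu_fps_ceiling):
    # Whichever stage is slower sets the frame rate you actually see.
    return min(cpu_fps_ceiling, gpu_fps_ceiling)

# Quiet scene at 4K: GPU-bound, so a fast and a slow CPU look identical.
print(delivered_fps(180, 70))  # 70
print(delivered_fps(90, 70))   # 70

# Busy scene (crowded city, big raid): CPU ceilings drop, and suddenly
# the 1080p-benchmark gap between the two CPUs shows up for real.
print(delivered_fps(140, 110))  # 110
print(delivered_fps(45, 110))   # 45
```

Which is why the low-res graphs work as a proxy for the dips: they measure the CPU ceiling that those worst moments collapse onto.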
EmilMR@reddit
you can find it in every techpowerup review.
conquer69@reddit
1080p has only gotten more relevant with the advent of decent upscalers. 1080p still looks great at 27" upscaled to 4K with DLSS or FSR4.
GOOGAMZNGPT4@reddit
I appreciate satire, and I appreciate self-deprecation, and I appreciate self-awareness.
But I still see Steve as being quite belligerent in this.
There was a time when it was a standard for reviewers to do 3 resolution testing (720, 1080, 1440 first, and 1080, 1440, 4k later on).
Steve was very calculated in this video - by choosing 4k only, on max settings, with RT, in known GPU-centric titles. It was a tongue-in-cheek contrived test to prove himself 'right' and say 'see I told you so', when he and everyone knew what the results were before any test was ever conducted.
None of the CPU-testing critics are asking to create GPU-limited-only testing scenarios for CPU-testing. None.
1080p testing on $4000 worth of hardware is still stupid and not representative of any real-world use case. Swinging in the complete opposite direction out of spite is not a good alternative either (though of course the video was done out of jest, not presented as a true solution).
I also understand that thorough testing is an impossible job, because every variable that is testable would literally multiply the amount of testing (labor hours) required.
We know 2 things:
1080p low testing with insane hardware just gives fantasy, exaggerated, unreal results. Might as well do 720p testing to really exaggerate those bar graphs.
4k max testing will commonly result in useless data. (not useless if the data is contextualized properly. Yes, if you are GPU limited and if a consumer is presented with those results it could inform them that a CPU upgrade is not necessary.)
The valuable information for consumers and enthusiasts is going to lie in the middle.
A happy medium would be two 1440p test runs, with the assumption that the end user would be aiming for satisfying a 120hz monitor.
Test Run 1 would be something like a 'low-medium' settings, or perhaps rasterization oriented, results. The idea being that the user is going to accept compromises on video settings so that their system can output a higher framerate. Perhaps we leave premium features like FG, DLSS, FSR, RT out of these tests.
Test Run 2 would be something like a 'medium-high' settings. The idea here being maybe the end user wants to dabble in, but not max out, some featuresets. Maybe the lowest level of RT, maybe DLSS quality, maybe the highest texture quality setting, high shadow quality, etc. (DLSS / FSR being critical here; the idea being can the CPU support the higher framerates, at higher quality, enabled by Nvidias software tricks which are now industry standard.)
So instead of testing the farthest left of the spectrum which is 1080p low settings, and instead of testing the farthest right of the spectrum which is 4k max gpu-limited, instead we aim to test the two counter balanced points in the middle (imagine a horizontal line divided into 3 equal parts by equidistant vertical lines).
The results would show tighter margins than 1080p low testing, and are unlikely to present GPU-limited results (but may indicate just how close we are to being GPU-limited). However, it just miiiiight expose some failures or successes of different CPU SKUs. Like, maybe a 9800x3d dominates 1440p with DLSS in a way that maybe a 12600k doesn't, or maybe the opposite is true and the 3D vcache is exposed as irrelevant past 1080p.
And of course, infinitely closer to real-world system use where everyone is making compromises, and no one is playing at absolute maxes and mins.
I feel that everyone would benefit from and be more informed by this testing than 1080p low testing. Even if this isn't used as de-facto product release review testing methodology, it's at least worth a 1-off video to gauge the viewer appetite for this type of testing, or at least evaluate the state of mid-range gaming in 2025.
1080p low bar graph-maxxing is a masturbatory exercise.
If 'real world' testing tells us that the 7800X3D and the 9800X3D and the 14900k and the 285k are all equivalent, well then, so be it and thanks for saving us $500 and thanks for being the pioneer that was willing to do more than the bare minimum to combat mindless consumerism.
Sevastous-of-Caria@reddit
Why these reviewers not testing cpus with HDR. Who even uses 4k smh
eubox@reddit
yeah I only watch CPU reviews which are done at 8k HDR max settings, you know, real world scenarios
tngsv@reddit
Thank goodness, I was wondering where the fellow reasonable gamers were at
Minimum-Account-1893@reddit
Yep, same ole story. I see people misled into buying a 9800X3D with their older GPU quite often, and then giving their 9800X3D all the credit for their 60fps graphics.
It's stupid how many have been duped into thinking an X3D is responsible for graphics (rather than just being cache).
Rapogi@reddit
well to be fair, it depends on the game. Heavily single-threaded games can still get benefits from going from a 5800X3D to a 9800X3D. At 1440p with a 6800 XT, I saw a pretty big bump in fps in something like WoW: a wonky 90 fps went to a pretty stable 110 fps. By wonky I mean pretty big frame drops in 30-man raids; the 9800X3D pretty much solved all that!
so I can def see a scenario where someone is suddenly getting very stable frames after upgrading their CPU, leading to a very noticeably smooth experience
Strazdas1@reddit
if your raid experience goes from slideshow to 90 fps, it's completely useless, because you see, in some different game it was GPU bound, so we shouldn't test it.
godfrey1@reddit
duped and misled into buying the best CPU on the market KEKW
Stark_Reio@reddit
Jesus, how is it that there are unironically so many people asking for 4K results in CPU reviews? On an unrelated note: I find it hilarious how, even in April Fools' joke benchmarks, Intel still manages to land at the bottom of the list (cost per frame).
Strazdas1@reddit
because a CPU should be tested in CPU-bound scenarios. If you are getting GPU bound at 4K, then you are using the wrong software.
WJMazepas@reddit
He did this video as a joke, but I do see the value in it.
I totally believed that at least the 9800X3D would guarantee a lot better 1% lows than the other CPUs, but in so many games it didn't matter at all.
Now, of course, I would be running those games with DLSS set to Quality if they were running lower than 60 fps, rendering them at 1440p, and then maybe we would see a good difference in the results.
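On the 1% lows point: reviewers compute them in slightly different ways. A common convention is the average fps over the slowest 1% of frames (some tools instead report the inverse of the 99th-percentile frametime). A minimal sketch under that first convention, with made-up frametimes:

```python
def one_percent_low(frametimes_ms):
    """Average fps over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)       # slowest 1% (at least one frame)
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms              # ms per frame -> fps

# 990 smooth frames at 10 ms (100 fps) plus 10 stutters at 40 ms:
# the overall average is ~97 fps, but the 1% low is only 25.
frames = [10.0] * 990 + [40.0] * 10
print(round(one_percent_low(frames)))  # 25
```

Which is why two CPUs can post the same average fps in a GPU-bound run and still feel very different to play on.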
CodeRoyal@reddit
Why not simply test at 1440p ?
5iveBees4AQuarter@reddit
It's a joke. He's making a point, but it's still a joke. He intentionally picked games that don't benefit from a faster CPU at 4K. He also intentionally used native 4K, which is increasingly less relevant with high-quality upscalers.
Tee__B@reddit
Except it does matter in many many games, even before you start using DLSS. Check through Hardware Canucks 9950X3D review.
srjnp@reddit
the actual joke is that steve still stubbornly thinks this isn't valid to show in a review.
CodeRoyal@reddit
Why would he increase his workload by 50% to show that CPUs achieve similar performance in GPU bound scenarios?
honeybadger1984@reddit
It’s April Fools but he’s not wrong? Depending on the games and the higher resolutions, the CPU matters less than the GPU. You’re better off with a more humble CPU and throwing more money at the GPU.
yzmydd123456@reddit
Although this is a joke, a 5090 at 4K causes all CPUs to produce the same result, and the same thing might happen with a 4070 Super at 2K. I have seen a lot of people buy a 9800X3D paired with a 4070 Super or 4070 Ti Super running at 2K. These people expected a 20% fps boost from the CPU, but in reality the performance boost is very minimal. If they had just bought a 9700X and upgraded their GPU to a higher tier, they would have gotten a better overall fps result. No matter what people say, testing only at 1080p is definitely misleading some people.
yourdeath01@reddit
I play only 4K graphics-heavy games, so I downgraded from a 7800X3D to a used 7600X for $140, sold my 7800X3D for $360, and performance is the same.
SVWarrior@reddit
I am running a 7900X, and while this is a kickass older AM5 processor, I cannot justify the price to performance cost that the 9800X3D and 9950X3D ask for while running games in 4k over what I currently have.
Kougar@reddit
Perfect video for the perfect day, I needed that laugh! And as a 7700X @ 4K gamer I particularly enjoyed this video... Stellaris sim time aside, I'm happy with my chip and will wait for a Zen 6 X3D.
billwharton@reddit
It makes no sense to play at native 4K unless you have a shit CPU and like wasting power. Drop the internal res and play at 120fps.
onlinenow81@reddit
april fools day video
shimszy@reddit
I get that it's a satirical video, but unironically, maxing out your hardware can really improve frame pacing (you often see this with stuff like well-tuned memory OCs).
Gohardgrandpa@reddit
They never should've made this video. "Hey, we have nothing to do, so we'll post some bullshit to get views and make some $."
Stennan@reddit
Lighten up; it is once a year.
I would have liked them to release a serious video, but with AI dubbing using Steve's voice. They actually did, accidentally, 6 months ago, and you could only get the video in French/German/Italian/Spanish? 😆
Gohardgrandpa@reddit
Yeah til some dumb shit believes this and wants to come on here and argue with everyone
INITMalcanis@reddit
Then we can laugh at them and they can learn a little lesson about critical thinking