Gamers Are Wrong About 1440p vs 1080p CPU Benchmarking
Posted by DyingKino@reddit | hardware | View on Reddit | 289 comments
vanebader-2048@reddit
u/HardwareUnboxed I'm curious about another aspect of CPU testing.
My understanding is that, nowadays, AMD drivers have lower CPU overhead than Nvidia drivers (at least on some APIs, I can't remember which ones specifically). That means in CPU-bound scenarios, we'd see a 7900 XTX/9070 XT outperforming the 5090 in these games due to this overhead advantage.
Did that factor in your test bench choices? Did you guys have to consider using AMD instead of Nvidia even though their flagship isn't as fast? If AMD had a GPU as fast as the 5090, would you use that one instead just because of the driver overhead situation?
NDCyber@reddit
I think with a CPU bottleneck AMD would be faster. I think Iceberg Tech once compared an RTX 3070 and an RX 6700 and the AMD GPU was faster while being CPU bottlenecked, but I can't remember the specific video.
Flaimbot@reddit
The part about the overhead is a fair question. I think it was omitted from this video to avoid confusing people who still struggle to understand that a CPU delivers the same FPS at all resolutions, provided the GPU is fast enough not to be bogged down at that resolution.
Regarding the decision between AMD and Nvidia: yes, they always pick whatever is fastest. It's just that in recent history AMD hasn't been able to keep up with Nvidia.
Laziik@reddit
A 5090 can at the absolute worst be equal to a 9070XT in a very heavily CPU demanding game, such as any Total War game or any Paradox game (Europa Universalis, Hearts of Iron etc) but it will never be slower because the limit is your CPU.
In 99% of other games the 5090 is simply always faster. Even in games such as Helldivers 2, where the game is still CPU bound but not to the extent of the other games I mentioned before, the GPU still comes into play. It just has too much raw performance; there is no driver overhead that would bring a 9070 XT/7900 XTX up to a 5090.
vanebader-2048@reddit
That's not how any of it works, genius. You fundamentally do not understand how driver overhead works.
Driver overhead is something that affects CPU performance, not GPU performance. It's the CPU that handles API and draw calls, not the GPU itself. That's why, in this test they did years ago, you see the 6900 XT outperforming the RTX 3090 by a wide margin when paired with lower-end CPUs. They are all CPU-limited in these tests, so the performance of the GPU itself doesn't come into play. But the overhead of the Nvidia drivers impacts those CPUs harder, which is why you end up with the AMD system being faster even though it shouldn't be based on GPU specs.
Laziik@reddit
Driver overhead was fixed eons ago, why are you linking me GPUs released in 2020 as proof? Link a 5000-series GPU compared to a 9000-series equivalent and let's see, for example 9070 XT vs 5070 Ti.
vanebader-2048@reddit
Citation needed.
You link it. I haven't seen any, which is why I'm asking Steve about it. You're the one who inserted yourself in the conversation while having no clue what you're talking about.
Laziik@reddit
Brother he aint in this thread, why are you tagging him, he dont know you 😭😭😭
vanebader-2048@reddit
I'm tagging him so he sees it. If you look at that account's comment history, he does engage with reddit threads about his videos. I've talked to him on this sub before.
Laziik@reddit
False, which is exactly what I wrote in my post. A test being CPU limited does not mean it doesn't utilize the GPU at all; it still utilizes it to some degree, and as the very test you linked shows, the overhead mostly comes into play between GPUs that are similar in rasterization. There's a reason the 5700 XT is under the 3070 there: because the 3070 is faster, right? Otherwise what would be the reason? Same CPU, obviously more FPS with a 3070.
And well, the gap between the 5090 and a 9070XT is much bigger than the 3070/5700XT.
Flynny123@reddit
I have a lot of love and respect for Steve, but I find this endless debate tiresome and think he could stand to take a slightly chiller approach, even though he's of course almost entirely correct.
Yes, low resolution testing draws out how much more the CPUs can do at their best, instead of deriving GPU limited results. But it’s also legit for consumers and enthusiasts to want to know ‘ok there’s a theoretical 15% delta between these CPUs at peak, but what am I going to see if I game at [my preferred resolution] in some modern titles - might I be fine with a cheaper one for the foreseeable?’. That’s where testing at other resolutions is really helpful.
For the exact same reason, I'd like to see more GPU reviews include a test configuration with a midrange CPU (especially for midrange GPUs) and vice versa - alongside the 'peak measured difference' results with those bottlenecks controlled for as far as possible.
Castielstablet@reddit
Absolutely agree. There are plenty of viewers who just wanna see "oh, at 4K the difference is 2% between a top-of-the-line CPU and a CPU similar to mine because it's GPU bottlenecked, good to know", even if they understand how CPU benchmarking works.
errdayimshuffln@reddit
If they understood, they wouldn't need to see it to know the difference will be small.
Castielstablet@reddit
It's like watching shorts instead of long form content. You can infer things yourself, or you can just get exactly what you wanted to know, like a pill. People who comment on HUB videos saying they want 4K CPU testing probably want the pill :)
Ok_Emu7050@reddit
Yeah, I think that's the perfect analogy. I wish HUB would change their messaging to be more like... "I understand you guys are looking for a shortcut metric, but for our channel we test this way because it allows us to put out the most content that is relevant to everyone. We recommend XYZ channel/website if you are looking for those benchmarks."
Instead they basically repeat ad nauseam that the people looking for higher resolution benchmarks are idiots.
Castielstablet@reddit
Absolutely agree with this. I feel like Steve either misses this aspect or ignores it to make more videos about the topic and get views/comments. You can 100% understand what Steve talks about in the video but still want that shortcut. It's valid reasoning, but Hardware Unboxed as a channel is not interested in providing that, so that type of viewer should look elsewhere.
errdayimshuffln@reddit
That's impossible to do because everyone has a different configuration. When you reintroduce the CPU, now you have two-part combos you have to test. That's not multiplying the testing workload by two, that's squaring the workload.
In case you don't catch my drift, the question of whether a 5800X3D is decently close to a 9800X3D at 2K depends on your GPU. So they would have to retest the entire game suite for multiple GPUs and multiple CPUs just to find exactly where it stops mattering what CPU you have at 2K.
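To put rough numbers on that squaring point (hypothetical suite sizes, just to illustrate how the workload scales):

```python
games = 12   # hypothetical benchmark suite
cpus = 10    # CPUs covered in the review
gpus = 6     # GPUs you'd need to map where the bottleneck moves

# Current CPU-review approach: every CPU paired with one flagship GPU
runs_single_gpu = games * cpus          # 120 benchmark passes
# Full CPU x GPU matrix needed to say "where it stops mattering" for any card
runs_full_matrix = games * cpus * gpus  # 720 benchmark passes

print(runs_single_gpu, runs_full_matrix)
```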
Flynny123@reddit
Of course of course, I do understand why that’s tough, I’m simply suggesting showing the differences with a representative midrange part, not cross-analysing hundreds of different CPU and GPU combos.
errdayimshuffln@reddit
But how useful will that be? As soon as you have a different GPU, you mess with the bottleneck and the info no longer applies
EdliA@reddit
That's extremely useful. If the bottleneck is that big, that tells people not to bother upgrading their CPUs or buying the best there is. That's the whole freaking point.
Ok_Emu7050@reddit
HUB's point is you should watch another video and/or watch more videos to figure that out. They aren't wrong but they are also overtly smug/principled about this.
Framed-Photo@reddit
They proposed doing this for the GPU reviews specifically, so if they swap out the GPU then it's a different review haha.
The idea would be like, you review a 5090 and use the best possible CPU to isolate the GPU as the variable, then you also do a few scenarios with a lower end chip to see how it fares.
To answer how that would be useful: it's an additional data point you wouldn't otherwise get, that could give you an idea of how useful that GPU upgrade would be on a less-than-ideal CPU.
errdayimshuffln@reddit
But the minute you use a weaker CPU, you could swap out the GPU for another and you'd get the same result. In other words, the data has no relevance to the GPU you are testing. Let me explain this another way that's mathematically equivalent.
Imagine the CPU is red and the GPU is blue, and you are using RGB with a separate alpha channel for red and blue.
You want to compare one blue (GPU 1) to another blue (GPU 2). To compare the two blues, you make the red transparent (via the alpha channel) for each. What you are suggesting is to add a few scenarios where you make the blues way more transparent and the red more opaque and then compare the blues! You see? You are not comparing the blues anymore! What you end up comparing is the reds! That's the whole point, and why reviews have been done like this since the beginning of CPU/GPU reviews (even before YouTube existed).
hugeyakmen@reddit
I can see how it would get tiring to repeat on every review, and it has to be set up and framed as a very general comparison because of how much it is influenced by the other components, but mostly it just serves as a quick reality check to help remind people that many single-component upgrades within a few generations aren't worth it. Sometimes reviewers do throw in a comment along those lines, but it's rare to see it put into clear numbers.
I have a lower-midrange system and got interested looking at the reviews of a newer CPU and its 20% better performance. Then I had to spend a while tracking down various YouTube videos from random gamers with an RTX 3060 who benchmarked games, so I could find out that the CPU upgrade would net maybe 2-3 fps. For most single-component upgrades, I've found that the only relevant info comes from YouTube benchmarks like that, and it's painful compared to a well-written review!
errdayimshuffln@reddit
That's because people refuse to believe it when they are told what logic to apply to their circumstances. They want other people to waste time, money, and energy re-proving what has been shown a million times before. If you are CPU bottlenecked, don't waste money on a better GPU. If you are GPU bottlenecked, don't waste money on a better CPU.
This means that if you game at a higher resolution, a more powerful GPU is more beneficial than a more powerful CPU. You don't need someone to bench your specific CPU/GPU combo/scenario.
hugeyakmen@reddit
For example, with a 10600k and a 4060 and upgrading only one of them, how far can you upgrade either one before you hit the bottleneck wall and it becomes pointless? Standard reviews don't really help with that, but that's the real decision that the shoppers are trying to make
hugeyakmen@reddit
Bottlenecked isn't a purely yes or no scenario though, and you might only become CPU bottlenecked after you upgrade a GPU and only if you upgrade it too far. A GPU upgrade without a great CPU can still have a positive effect, and a bigger positive effect in some games and tools than others.
Framed-Photo@reddit
When they upload the "RTX 6090 Review" video, the GPU is not going to be swapped out at any point, that video will stand as-is forever.
The 6090 review gets uploaded, most of it stays the same, they throw in a few extra charts with a weaker CPU to prove their point, that's it.
It's not to try and evaluate the performance of the 6090, it's to show how games scale with different hardware platforms that a buyer might be on, which is entirely relevant for every potential buyer.
errdayimshuffln@reddit
The answer is whatever the result was when they benched the weaker CPU. That's what a GPU bottleneck means. A 6090 won't change anything.
Framed-Photo@reddit
Whatever is popular when they make the benchmark. For right now it would probably be some budget AM5 chip or a 5800XT or something.
Testing it to see if it is too weak is the entire point. If they test a 6090 with the 9800X3D and it's great, but then they do their handful of tests with a 7500F and it matches a 5070 Ti, then that's useful data for all those thousands of commenters who keep asking for testing like that.
But in a much more accessible format, available directly with all the other data they'd need to make an informed purchase.
Because they get thousands of comments a month from people who are less knowledgeable than you or me, who won't know how to infer data from multiple benchmark videos, who want to be able to watch a review of a product and know how that product will do for them instead of in a vacuum.
Which way is better to deal with that: make a long-winded video every few months shitting on commenters and redoing an entire benchmark suite AGAIN to try and explain a point you've made a dozen times before? Or could they just... run a couple of tests on one more CPU in their next GPU review to shut them up? I'm not talking about doing their entire benchmark suite on two CPUs every single time a GPU comes out; they'd literally just need to test the GPU being reviewed in like 3 games to get the point across and then move on with the review.
Flynny123@reddit
Yes this is precisely what I meant. As part of a GPU product review of an upcoming 6090, I of course want to see that benchmarked with the best possible x3D part to see peak real world differences, exactly as they do it now. As they already do for GPUs, I'd like to see that at 1440p and 4k too because that's sensible. HUB are really good at this stuff and I enjoy their videos.
But if they have the time - possibly in a follow-up, because I understand this takes time and it isn't the main headline - I'd also love to see a comparison of the same uplift using e.g. a 7600X (and if they felt like it, possibly another one or two CPUs at different positionings), what the delta is, whether you see considerably less uplift in some games as a result, and which ones they are. Between those data points, if you have any reasonable sense of the CPU hierarchy, you have enough to infer how much difference a new GPU might make in your use case, in a way you don't otherwise for this particular product.
Similarly for midrange GPUs this is potentially even more valuable, because when a 6060 comes out the number one question on PCPartPicker is going to be "do I need to upgrade from my R5 3600 / 5700X / 12600K to avoid bottlenecking a 6060?", and those are the kinds of questions HUB can be (and sometimes is!) really helpful at answering using their methodologies. A follow-up video on a GPU review giving you a sense of approximately how much CPU you need to see worthwhile benefit out of the GPU - even if only covering 1-2 additional representative CPUs - would be great.
errdayimshuffln@reddit
Testing to see if it's too weak should not be done in every GPU or CPU video. Finding the bottleneck is game dependent!
Now I get HUB's tone. Most people asking for this shit aren't doing the work and don't realize how dumb it is.
I have built a fuck ton of PCs; I used to be a seller on eBay as a side hustle when it was more profitable.
Ask for game-by-game bottleneck benchmarks. Otherwise, it makes no sense to do one in every CPU or GPU review. This deserves no more than a comment, in my opinion. There is no helpful info you didn't already have access to.
Framed-Photo@reddit
It does not need to be full testing. As I've said, a 3-game sample with just the product being tested would be more than enough. Like the aforementioned 6090 on a 9800X3D for the full suite, with 3 games tested on a 7500F; or to flip it, a 9800X3D tested with a 5090 for the full suite, with 3 games tested on a 5060 Ti.
That's all it has to be. 3 additional benchmark passes on a different setup, one single slide in their video showing how the performance compares just so viewers can get an idea of how things stack up when not using TOTL hardware.
But no, apparently HUB would rather make a bi-monthly 30-minute video directly calling out commenters through screenshots and redoing entire benchmark suites. That's totally within their right.
Ok_Emu7050@reddit
As with all things, people are way, way too analog. Even HUB doesn't 100% believe the goal of the test is to only test the CPU without bottlenecks. If they did, they would test at lower resolutions. They presumably use 1080p because virtually nobody plays below 1080p, so they do care about real-world applications.
HUB is correct - their current methodology allows folks to make intelligent decisions with some legwork and watching multiple videos.
They also come across as overly principled/smug. 1440p results are not useless. They are a data point with a more limited use case.
JShelbyJ@reddit
I assume their testing suite is automated because they keep testing the same games, so I doubt it would be an issue. If they were manually benchmarking games like Fortnite, The Finals, or CoD that don't have automated benchmarks, then I would agree.
Like, part of me agrees with them, but another part of me is suspicious that they hold AMD stock or hate Intel. Because, while they might be right about performance in games, it really doesn't matter for most gamers if most gamers would be GPU limited with the mid-tier CPU offerings from both manufacturers. So why spend so much effort to anoint a winner when, by their own admission, it doesn't matter?
Either CPU choice doesn't matter because we are GPU bound, in which case why do they make such a big deal about it, or it does matter, in which case they should test all resolutions - but they don't.
Seanspeed@reddit
This is very wrong. HUB usually dont use in-game benchmarks, and specifically find heavier parts of a game's gameplay sections to benchmark things manually.
Seanspeed@reddit
HOW ARE PEOPLE JUST NOT GETTING THIS?! lol
Not just different hardware configuration, but also using different resolutions and using different graphical settings and whatnot.
Flaimbot@reddit
CPUs deliver the exact same FPS in a given scene at literally all resolutions. If it runs 120 fps at 1080p, then it runs 120 fps at any other resolution you set. From 1x1, to 1920x1080, 2560x1440, 3840x2160 and even 2873456789732465x8723658749, the CPU will still deliver 120 fps.
What your final FPS is is then determined by how capable your GPU is. If the GPU delivers 70 fps at 720p, that 70 fps is what you get displayed, due to the GPU bottleneck. If the GPU is fast enough to deliver 300 fps at 4K, then you're still limited to 120 fps, because now the CPU can't serve more to the GPU.
If you want to make an educated decision, look at the CPU review and the GPU review (which has the different resolutions), write down the number the CPU delivers and the GPU number at your desired resolution. The smaller number is what you get.
it's not rocket surgery.
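A minimal sketch of that "smaller number wins" rule, using made-up review numbers (not from any actual benchmark):

```python
def expected_fps(cpu_limited_fps, gpu_limited_fps):
    """Whichever component is slower caps the final frame rate."""
    return min(cpu_limited_fps, gpu_limited_fps)

cpu_review_fps = 120  # hypothetical CPU-limited result from a 1080p CPU review
gpu_review_fps = {"1080p": 300, "1440p": 180, "4K": 70}  # hypothetical GPU review results

for resolution, gpu_fps in gpu_review_fps.items():
    print(resolution, expected_fps(cpu_review_fps, gpu_fps))
# 1080p 120 and 1440p 120 (CPU bound), 4K 70 (GPU bound)
```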
Vivid-Software6136@reddit
And then you get the people who complain that including 1440p means nothing because you are testing with a 5090, so it's still not realistic.
So to satisfy everyone you need to benchmark 3 resolutions for 10 CPUs, each with 5 different GPUs, bringing you from 10-30 benchmark runs per game to 150+ benchmark runs per game. All because people are too lazy and/or dumb to consider both a CPU test and a GPU test at the same time and compare the data from both.
Two reviews with minimized bottlenecking, one for the CPU and one for the GPU, provide the most useful data with the least amount of testing. If you want to benchmark specific individual system configurations, go out and spend 100k on all the different hardware and do it yourself. Clearly there's an audience for it, but is it worth the cost and the effort?
S4luk4s@reddit
There will always be countless videos on YouTube comparing different CPUs with the same GPU, or vice versa, to look at exactly this. Just search for "7500f vs 7800x3d 9070xt"; I didn't do it, but I guarantee you there are multiple videos with this exact information. But HUB, GamersNexus etc. are not looking for this information, and they don't need to. They want to create pure, reproducible data that everyone can agree on and that is applicable universally, not test how specific setups do in specific games with specific settings compared to other specific setups. You can't have a useful database for this.
Bitter_Leather_8319@reddit
Seriously... if someone still doesn't get it in 2026, go play on console.
NeroClaudius199907@reddit
Who plays natively anymore? I think the majority of people are using DLSS Quality at 1440p-4K at minimum.
Flaimbot@reddit
If you understood how this stuff works you wouldn't be saying such stupid things.
It doesn't matter. DLSS only changes what underlying resolution you're rendering at (with a mostly irrelevant overhead to scale the picture back up to the output resolution, e.g. 4K).
You can run the game at 1080p native or at 4K DLSS Performance. Same base resolution, i.e. 1080p. Guess what: for the most part they also have the same FPS. Shocking revelation, I know. The only difference is how pretty/crisp it's then displayed on the 4K screen.
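For reference, a rough sketch of how the usual upscaler presets map output resolution to internal render resolution (these per-axis scale factors are the commonly cited DLSS defaults; individual games can override them):

```python
# commonly cited per-axis scale factors for DLSS presets (assumption: game uses the defaults)
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def render_resolution(out_w, out_h, preset):
    """Internal resolution the GPU actually renders before upscaling."""
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080): 4K Performance renders at 1080p
print(render_resolution(2560, 1440, "Quality"))      # (1707, 960): below 1080p internally
```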
NeroClaudius199907@reddit
1080p native and 4K DLSS Performance don't have the same FPS.
S4luk4s@reddit
Doesn't matter for CPU comparability.
Laziik@reddit
Why would I play with DLSS/FSR Quality when I can play almost every game in existence at 1440p 120 FPS+? What benefit do I get going from 120 FPS native to 150 FPS upscaled, worse visuals for 30 FPS? I can just enable DLSS/FSR Native and enjoy a more premium AA option without sacrificing my resolution and my base performance.
If the difference is going from 50 to 70 at 4K then that's fine in my opinion, especially since it's been proven that the higher your base resolution the better the upscaler works. But for 1440p, if you're already above 120 it's completely pointless; you'll be able to go above 120 FPS in every esports game, and no triple-A game needs even more than 60 in my opinion, let alone 120.
NeroClaudius199907@reddit
What's native 1440p, oh you mean DLAA? Then that's fine to play at 1440p. If a game doesn't have DLAA, anybody would just use DLSS Quality since it looks better than native 1440p TAA. Didn't know people here had already come to accept that native 1440p TAA is a worse experience than DLSS Quality.
Logical-Database4510@reddit
There are plenty of luddites out there that will play natively at 4k with objectively worse IQ and drastically worse performance because DLSS is "fake resolution" or whatever.
hallese@reddit
Some of us luddites don't understand either of these posts but are too ashamed to ask for an explanation, so I will do it for them. Can someone please translate/explain these two comments for my luddite friend?
BandicootKitchen1962@reddit
DLSS Balanced will look better than native TAA, with higher fps. Not applicable to 1080p or the out-of-date DLSS versions that come with older games. You have to learn which preset best suits your GPU and game combo and apply it through the Nvidia app.
ReliantG@reddit
In many instances DLSS at quality or balanced looks better than native resolution, because the Anti-Aliasing it uses is better than the anti-aliasing in the game engine. So with DLSS on, you get a better image than native in these cases.
NonEuclidianMeatloaf@reddit
This just in: luddites turn off normal mapping because “fake polygons”
Luddites disable texture filtering because “fake surfaces”
Luddites disable shaders because “fake photons”
zakats@reddit
The games I play aren't super demanding, so I ignore upscaling on all three brands of GPU that I own. It's not particularly beneficial to me, and I kinda resent the whole hype train around AI upscaling for games, which has been used as a value-add for their AI businesses since gaming became an even lower priority for these companies.
It's fine to have upscaling as a feature, but raster isn't as much of a business priority when most of your money comes from enabling billionaire egos and a surveillance state.
Keulapaska@reddit
DLSS SR is also the best form of TAA; you don't need to upscale with it, you can just run it at native.
NeroClaudius199907@reddit
Upscaling is better than TAA. If more games used FXAA or MSAA or an alternative, upscaling wouldn't be as hyped. Blame bad TAA.
qwertyqwerty4567@reddit
Still haven't played a single game that has DLSS in it, so: me.
Keulapaska@reddit
So essentially, you play old/indie/mobile games, cause there really ain't many big-budget games in the past... 5? years that don't have DLSS. Helldivers 2 and TW:WH3 come to mind, but I can't really think of others off the top of my head, and both of those could really use DLSS SR as their AA options are kinda trash.
NeroClaudius199907@reddit
Seems like you don't play single-player games. With DLSS being a plugin in many engines now, it comes by default in many AAA games.
If you play natively then look at
Intel Core Ultra 7 270K Plus Review - Intel's Fastest Gaming CPU - Game Tests 1440p/RTX 5090 | TechPowerUp
russia_delenda_est@reddit
Who plays anymore? Nobody can afford to buy a pc
HuckleberryWeird1879@reddit
The ones that already have one.
NeroClaudius199907@reddit
With other hobbies and housing becoming relatively expensive, a PC won't look like as horrible of a deal. A PC is actually more expensive than it was 10 years ago, but there are more PC players now.
NeroClaudius199907@reddit
Didn't PC shipments increase by like 2.5% this quarter? Seems like there are people who can afford one.
Sevastous-of-Caria@reddit
55% of all displays are 1080p. And below the Quality preset the ghosting becomes nerve-wracking. Just switched to 4K myself and realised the upscaler's full potential.
NeroClaudius199907@reddit
1080p is understandable, TAA can beat it in a lot of games. At 1440p+ I don't see anyone not activating DLSS Quality at a bare minimum.
BNSoul@reddit
1440p vs 1080p testing aside, I wonder why they never bother testing heavy CPU bottlenecks in mainstream games, like the first chainsaw fight arriving in the village in the Resident Evil 4 remake, or the iconic cabin event with Luis later on. There you can see that the 9800X3D is 70+ fps faster than a 5800X3D with ray tracing enabled, regardless of the resolution you're playing at (except 4K, I guess).
Puiucs@reddit
time and repeatability of the test.
BNSoul@reddit
Understandable, but they went and tested the 5800X3D vs the 9800X3D in BF6 (online multiplayer) even with repeatability being technically impossible. I guess it's just time consuming compared to built-in benchmarks and scripts.
seraphinth@reddit
IIRC they had both systems running at the same time on the same private map server when they benchmarked the BF6 beta, with both players walking together to experience the same effects on screen.
BandicootKitchen1962@reddit
I think they made a custom scenario with bots.
Puiucs@reddit
they are testing demos in most online games, and only do what you said if it can't be done at all.
resetallthethings@reddit
They also do precious little testing with super CPU-bound games and non-5090 GPUs.
Nvidia driver overhead is still a thing.
I think that stuff is interesting; there are some games where a 9070 XT and a 5090 will both be CPU bound, but because of the lower overhead on the AMD card, the FPS will be better.
SagittaryX@reddit
They do? I forget all the examples, but they do call out testing some games in specific areas to focus on aspects like that. AFAIK for the last Tomb Raider game they always tested the village section to be more CPU demanding.
JudgeCheezels@reddit
Because “benchmarkers” aren’t gamers. They don’t play the games they talk about every week.
Bet they don’t even know they could get into a CPU bottleneck in merc mode simply by baiting a bunch of enemies into a house just to grenade them for a chain kill bonus - which is a legitimate gameplay mechanic, which they would argue is “unrealistic” because no one plays like that.
Pugs-r-cool@reddit
It’s not a repeatable test. If you run a test like you mentioned 10 different times, you’ll get 10 different results with the same hardware. That’s why we have built in benchmarks, even if they aren’t perfectly representative of gameplay, if you run them 10 times you know you’ll get the same result every time.
JudgeCheezels@reddit
You can script it lol.
Stop making excuses.
Pugs-r-cool@reddit
Scripting is far from perfect and still features run to run variance. If it was as simple as just “script it lol”, reviewers would already be doing it.
JudgeCheezels@reddit
You all are ok with imperfect and flawed benchmarks but have issues with minor run variances.
Nice contradiction. 👏
BNSoul@reddit
maybe unrealistic but still a way to measure CPU performance in games, more realistic to me than playing built-in benchmarks and scripts.
JudgeCheezels@reddit
Maybe people here have comprehension issues or something, I’m downvoted for spitting facts. So idk 🤷♀️
S4luk4s@reddit
I mean what exactly are you testing this situation for? What information are you trying to extract there that you're not getting somewhere else already?
BNSoul@reddit
I mean that, in my experience, multiple games are CPU bound with specific settings and in particular areas and they're super useful in order to measure CPU performance in real world scenarios, differences more apparent and useful than built-in benchmarks and scripted runs at 1080p.
S4luk4s@reddit
I didn't watch the video yet, but isn't this one of the exact things that Hardware Unboxed has tried to explain for the last 10 years? That you need to force a heavy CPU bottleneck by lowering the resolution to make the CPUs accurately comparable. They also already test CPU-heavy areas/situations in their benchmark runs, as long as it's easy to replicate. I don't see a reason why differences would be more apparent there.
Benchmarking guys don't care if CPU X gets 130 or 160 fps in a situation with specific settings etc; there is nothing you can do with that information. They just want to get the best data to compare different CPUs.
fiah84@reddit
Well yes, but I think the point BNSoul is making is that there are also scenes in games out there that are CPU bottlenecked even at high resolutions. It is true that lowering the resolution is good for testing CPU performance in all scenes, and the minimum FPS seen in those tests probably correlates closely with the CPU-heavy scenes BNSoul describes. But it would have been nice if HUB could demonstrate that by picking such a scene and showing that those 1080p results correlate with a "real world" 1440p/4K example.
In my personal experience, such scenes are sometimes not actually bottlenecked by CPU performance per se, but more by memory latency and/or bandwidth. That would of course also show up in 1080p as well as 1440p/4K tests. If I recall correctly, the second Shadow of the Tomb Raider in-game benchmark has such a scene.
It'd satisfy my curiosity if publications like HUB tried to benchmark such things, but for the most part I don't think it really provides any additional information vs. the regular 1080p benchmarks.
S4luk4s@reddit
Okay, I get what you mean, but just play any CPU-heavy game and you'll see the difference "in the real world". Play a few rounds of BF6 at 1440p with a 9070 XT/5070 Ti and you will see the difference between a 7500F, 7800X3D and 9800X3D. Or any real-time strategy game, etc... I didn't watch the HUB video yet, but yeah, I guess it would be nice to demonstrate exactly this. But also, they do test for things like this from time to time; it's just not part of their regular CPU reviews/comparisons.
Framed-Photo@reddit
He's right, but the commenters are also right, just for different reasons. HUB is there to test the hardware; a lot of viewers are there to see the games.
Yes, if you're just using games as benchmarks for running CPUs against each other, then testing at 1080p medium or 720p low or whatever else is the best way to isolate the CPU as the variable.
The problem is that games are not benchmarks to most people, and they somewhat reasonably want to see how well the games actually run under different circumstances.
bubblesort33@reddit
They are right in wanting that data. They are wrong coming to a "CPU review/benchmark" video for that data on a specific game. Hardware Unboxed does occasionally do benchmark videos on a specific game with dozens of GPUs, or CPUs.
Someone probably types into youtube "Cyberpunk 2077 5800x3D vs 3800x" and ends up on his 5800x3D review video, which just happens to maybe test that game.
NeroClaudius199907@reddit
But some people will wonder why include all resolutions for gpu reviews but not cpu? Techpowerup (goat) includes 720p/1080p/1440p/4k in theirs
Seanspeed@reddit
Techpowerup are actually kinda bad for CPU reviews. But great for GPU stuff, for the most part.
CPU testing is also so much more difficult and tedious than GPU testing.
NeroClaudius199907@reddit
why is tpu actually bad for cpu reviews? are their averages an outlier vs meta data?
Flaimbot@reddit
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
Framed-Photo@reddit
I'm gonna have to disagree strongly with you on that one. The channels ALL test games in all of their hardware reviews; it makes sense that people go to the hardware reviews to see how the games perform.
People like HUB care more about testing the specific hardware in a vacuum, using the game as just another software benchmark, whereas a lot of viewers don't give a shit about the hardware because it's just a tool used to play games. Both takes are valid, and it's up to HUB if they want to accommodate that or not, and clearly they don't. Doesn't bother me.
bubblesort33@reddit
You have to test the hardware in a vacuum as much as possible, because if you don't, there are thousands of possible combinations that contaminate your data.
If they come to these reviews with expectations of how the game will perform, only the people using a 5090 will get accurate results. So now you're stuck making 99% of people unhappy anyway, or misrepresenting what they can expect just as much. If they "don't give a shit about hardware", they shouldn't look at hardware reviews, but at software reviews.
You can say it's valid to want a soldering kit or screwdriver review from YouTube, but it's not the review's fault if it doesn't give you a review of the soldering kit or screwdriver they are using while reviewing an Arduino robotics kit from Amazon, just because it requires soldering or screwing things together. The games are a tool for testing hardware, and he's reviewing hardware, not the tools used for testing the hardware.
Framed-Photo@reddit
You have to test your hardware in a vacuum if the goal is to exclusively test the hardware, but a lot of viewers clearly want more than that. Doing so would involve including different kinds of testing, like how things scale on different systems. If the testing is redundant in some cases (which it will be) then they don't need to include a whole lot of it, just a chart or two would be more than enough to answer the question.
Ok_Emu7050@reddit
This actually reminds me a lot of the famous "plane on a conveyor belt" internet debate. People fiercely argue about it and think each other are idiots, but really they are just talking past each other because what they understand the question to be is different. xkcd does a funny takedown of it: https://blog.xkcd.com/2008/09/09/the-goddamn-airplane-on-the-goddamn-treadmill/
droopy_ro@reddit
And here is the issue: low and medium presets sometimes reduce things that actually put a load on the CPU, like unit size in Total War games or crowd density in Cyberpunk. Now, if someone gets a game, sets it to ultra and finds out that the cheap 4/6-core CPU with castrated cache is underperforming, they will start blaming the CPU, not the benchmark. HUB seems to understand this, by running the games both at medium and at ultra/ray tracing settings. GN and other tech tubers seem to not get it.
NeroClaudius199907@reddit
Techpowerup is goated. All that information is there
zomboza@reddit
Genuinely asking, how do you extrapolate the data to a different GPU? In every test they're using a 5090, and there's indeed a gain at 1440p going from the 5800X3D to the 9800X3D, but what about an RTX 5070 Ti or RX 9070 XT? Would it guarantee the same percentage of uplift? I kinda remember that Nvidia has this overhead issue which makes it even more misleading. I fully agree that reviewing CPU + GPU combos would take forever compared to just the CPU or the GPU, but so far most reviewers test with the best CPU or GPU, which most people can't afford. Likewise, I'm pretty sure that what most people in those polls meant was with their midrange GPU, and whether upgrading from a 7500F to a 7800X3D can raise it from 60 to 80 rather than upgrading their GPUs (the memory shortage really doesn't help in these trying times).
Danny_ns@reddit
They've explained this previously in detail, and quickly touched on this in the video. But it is kind of simple. You have to look at what FPS each part gets and the lowest FPS will be your final performance.
So a quick example, if you look at Cyberpunk numbers, the 5800X3D in 1440p got 91fps, the 9800X3D got 94fps. Both with a RTX5090.
If an RTX 5070 Ti can only get 40fps with those same settings according to their GPU benchmark, then in Cyberpunk you will get no uplift going from a 5800X3D to a 9800X3D, because you will be GPU limited to 40fps. You can even downgrade to a 3800X and still be stuck at 40fps because all three CPUs could deliver more fps than the 5070 Ti.
zomboza@reddit
Okay, so for example in Marvel Rivals at 1440p native, the 5800X3D got 125 fps and the 3800X got 84 fps. I checked their 5070 Ti vs 9070 XT review, and with a 9800X3D it gets 86 fps at the same settings. So even if the 5070 Ti is paired with something below a 9800X3D, it will still be at 86 fps with a 5800X3D (or 7800X3D), and only be bottlenecked with the 3800X, with only a 2 fps difference. Am I correct?
Danny_ns@reddit
Yes, correct!
zomboza@reddit
Thanks, now I finally know that all I need is a 7800X3D equivalent in performance instead of burning money on a 9800X3D. Nonetheless, this comparison was only possible because there's actually a data point for each CPU with a 5090 at 1440p, which they won't do ever again. Luckily TechPowerUp does it, and it confirms some games that don't gain fps (BF6, just like this review confirms) and those that actually do (Baldur's Gate 3, and only for CPUs above the 14700K/5800X3D). Cheers!
deadheadkid92@reddit
You basically have to look at the CPU review and the review of the GPU you want and whichever has the lowest framerate will be the bottleneck in the system once they're combined.
So if HUB says the CPU can do 200fps but the GPU you're looking at can only do 90fps at the resolution you want, the game will only run at 90fps.
zomboza@reddit
I just realized something: we only know how each of these CPUs (3800X, 5800X3D) does with a 5090 at 1440p exactly because of this "dumb" request. Now if someone needs to compare, say, a 7500F to a 7800X3D, or even Intel, they need a different data point.
Battlefield 6 is a really good example here, where at 1080p there's a gain with each CPU upgrade but not at 1440p across the CPUs tested. Their BF6 CPU review is only at 1080p thus far, so I looked elsewhere. TechPowerUp is really great in this regard.
Seanspeed@reddit
Watch a general GPU comparison and you'll get a good idea of how these GPUs relate to each other in terms of performance. It's not hard to extrapolate from that.
emn13@reddit
A simple mental model: bottlenecks are like speed limits. Whichever speed limit is lower de facto applies. Whether that's your CPU, or your GPU - the lower FPS cap applies. And that means specifically that if you are already CPU bottlenecked, then a faster GPU is completely useless. If you're GPU bottlenecked, then a faster CPU is completely useless.
Game settings affect how much load each component gets, but almost always they affect CPU load only a little if at all, and GPU load quite a lot - and that means that in practice you can max out your GPU simply by using heavier settings - higher quality or higher res, or less upscaling. But you can't do much about a CPU limit: sometimes a few game quality settings help a bit, but beyond that you're stuck.
There are details that muddy that simple story, but thinking simply about the slower speed limit applying works pretty well. And that's why you want to test the CPU speed when you're putting the GPU under basically no load and use a fast GPU too; and test GPU speed by putting the GPU under high load and use a fast CPU too.
nightstalk3rxxx@reddit
You just have to look at the framerate your GPU/CPU can produce in any given game.
Lets say 5800X3D (100fps) -> 9800X3D (200fps)
Now you add the 5090, which can also do 200 fps, and you will see an increase of 100% with a CPU upgrade, because the 5800X3D was bottlenecking your GPU.
Now you test it with a 5070ti instead which can only produce 150 fps, your increase will only be 50% with a cpu upgrade.
This is why people test CPU's with the lowest possible settings / resolution, so you can see the theoretical max fps it can deliver before being a bottleneck.
HavocInferno@reddit
Look up a GPU benchmark of the specific GPU in that game. That's the max fps you'll get with that GPU in that game. Is the number lower than for CPU X? Then the GPU is the limit and a faster CPU won't gain anything. Is the number higher than for CPU X? Then a faster CPU can gain. That simple.
Extrapolating here is literally just taking the lower number of the results for your desired cpu+gpu combo.
Puiucs@reddit
I've spent way too much time explaining to people that even at 1440p/4K, if you enable DLSS you reintroduce the CPU into the equation. And this is ignoring the obvious mid/low settings for competitive games.
smarlitos_@reddit
Honestly, even games like Fortnite benefit from a great (recent, 6 cores or more) CPU.
I find that old CPUs will stutter and cause frame drops more than recent ones, even in Fortnite.
Can you max out at 144 or 240 fps with an old CPU and a decent GPU in Fortnite? Sure, but the frame drops will be painful and frequent.
frostygrin@reddit
Fortnite is actually one of the most CPU-demanding games if you enable the advanced lighting settings. Meanwhile, a single-player game with smaller levels may end up running perfectly fine on a 10-year-old CPU, even if you max out the graphics.
bizude@reddit
Yeah, let's be realistic here - Fortnite will run on a potato with lower settings. Not everybody is trying to run advanced max ultra everything :D
frostygrin@reddit
I can't compare because personally I played it only when they added the improved lighting. But I find it hard to believe that it runs on a potato when it's a big world plus online multiplayer.
smarlitos_@reddit
I’m talking about performance mode tho. Most people try to max their fps, at least to their monitor’s max refresh rate, and minimize frame drops if playing on pc. DX12 perf mode and dx11 perf mode, either way, feel like old cpu = stutters and frame drops.
Fortnite is really graphically demanding if you crank graphics settings all the way up, don’t know how cpu intensive it is when you do that, but I imagine it’s also v cpu intensive
True, single player games w small levels or simple mechanics not so hard to run, makes sense.
frostygrin@reddit
The thing is, CPU bottlenecks often result in some degree of stuttering. It's not necessarily a problem. If you're getting 240fps with CPU stuttering, you usually can limit to ~180fps and reduce stuttering. Even the fastest CPUs still can end up with occasional framerate drops. So the actual choice is between a framerate limit, and a GPU bottleneck.
MITBryceYoung@reddit
It also just straight up ignores that many games are heavily CPU dependent too... Helldivers, Harry Potter, Baldur's Gate, MOBAs like League/Dota, CS, Civ, most Paradox grand strategy games...
glizzygobbler247@reddit
Also GTA 5 and Spider-Man 2 when you enable RT.
Way too many people have a caveman understanding of CPUs and think that 1080p uses the CPU and that 1440p/4K uses the GPU.
SagittaryX@reddit
Isn't GTA 5 just engine limited with any modern CPU?
glizzygobbler247@reddit
Not with RT. The engine limit is 188 fps; with a console-equivalent CPU you're around 40-50, which is also why the consoles likely won't run GTA 6 at 60 fps, at least not with RT, due to CPU limitations caused by the way the engine handles RT.
fajarmanutd@reddit
Baldur's Gate Act 1-2 and Act 3 are like different games lol. My CPU was fine in Act 1-2 but stuttered like crazy in Act 3.
Heymelon@reddit
Huh, this is something that I have done with 4K for a long time without ever thinking of this phenomenon, but it immediately makes sense. Might be time to upgrade my 7600X.
resetallthethings@reddit
at 4k with a 7600x there's probably less than 5 games where you might see ANY benefit with a better cpu
Heymelon@reddit
So far so good. But I hadn't really taken upscaling into account, and that the actual render resolution can bring the CPU more into play with DLSS and whatnot.
Jon_TWR@reddit
The next gen of AMD CPUs should still be AM5 compatible, so a nice X3D cpu when they release would be a tremendous upgrade in all workloads!
Heymelon@reddit
That would be baller. Such a pain to have to swap mobo if you don't need to. So I suppose you would suggest I wait for next gen then, vs buying an upgrade now?
Jon_TWR@reddit
I mean if you find a 9800x3d for a good price, it would still be a significant upgrade.
But if rumors hold true and the next gen is 12 cores per CCD, the X3D version would be a hell of an upgrade in literally all workloads.
Heymelon@reddit
Seems like the 9800X3D goes for $500 atm here in Sweden, and I'm guessing that's not a "steal". But I'll keep my eye out for sales as well as news on the next gen. Thx for the tips.
Jon_TWR@reddit
Va fan!? I went to Sweden a long time ago, and that's about the extent of the Swedish I remember, tack sa myket.
Sounds like the around the normal price, I think. Last I saw, rumors were predicting late 2026/early 2027 for Zen 6, and it should be a drop in upgrade. Even a non-X3D chip should still be a significant upgrade for you.
Just keep your eyes open and upgrade when it makes sense.
Heymelon@reddit
Lol, you picked up on the important parts of Swedish well!
Aight, ty again. I'm running fairly well right now but depending on the news I'll definitely consider some current gen upgrades.
StaysAwakeAllWeek@reddit
Honestly it's getting to the point where even GPU benchmarks should only be done at 1080p. Who honestly still uses a base resolution higher than 1080p with DLSS4/FSR4 available? I certainly don't with my 4090
Plank_With_A_Nail_In@reddit
So test that then.
Puiucs@reddit
i did test. i have an 9800x3d and 5070ti and i play at 1440p. this also includes CS2 at competitive settings.
Appropriate_Bottle44@reddit
I'm going to try to explain this with an example for the many people who still don't get it:
Say I have an Intel Whizbang CPU. Hardware Unboxed tests it with a high-end GPU at 1080p medium and gets 150 fps in the game Happy Fun Time.
My GPU is an Nvidia SuperDuper. Hardware Unboxed tests it with a high-end CPU at 1080p ultra and gets 200 fps, and they also test it at 1440p ultra and get 120 fps in Happy Fun Time.
If I run Happy Fun Time at 1080p ultra it will run at 150 fps because I am CPU-limited. If I run Happy Fun Time at 1440p ultra I will get 120 fps because I am GPU-limited. If I use DLSS, the resolution it is upscaling from will drop down to 1080p or lower, so my cap will move up to 150 fps again.
There are some very small caveats to this stuff, like the CPU overhead of upscaling, but don't miss the forest for the trees! This is how you're supposed to use hardware reviews to figure out what your system can do, and it should give you a very accurate picture if you have the relevant information.
MolassesSad8089@reddit
It’s not that there is CPU overhead to upscaling it is GPU related. It typically adds a fixed frametime cost for the upscale (like 2ms for example).
BasedOnAir@reddit
Here it is in fewer words lol:
They used to say “4k does not need a strong cpu” but that is no longer true because with dlss, 4k games actually render at 1080p and now you’re back to needing a 1080p cpu (aka strong) … even at 4k.
That’s what they say gamers aren’t getting
frostygrin@reddit
I don't think gamers think of "4K with DLSS" as "4K". They know that DLSS improves performance by rendering at a lower resolution.
is-this-a-nick@reddit
I would think less than 10% of people know that if you select 4K in the graphics settings and switch DLSS between Balanced, Quality or Performance, you vary the actual resolution by a factor of 2.
BasedOnAir@reddit
Clearly not enough of them do lol, to warrant this video lol
frostygrin@reddit
It's not about the specific resolution anyway. These days games can get very GPU-heavy even at 1080p. So when you're lowering the settings, you still might end up hitting - or not hitting - the CPU bottleneck. Especially when we have DLAA and DLDSR.
mechkbfan@reddit
The issue, I think, for a lot of people is that reading it, you might infer the wrong outcome because there's no data there to counteract that view.
What I'd love to see is a reality-check benchmark. It'll never happen because it's more work, but the following would help:
Use the CPU with several GPUs at 1440p or 4K across several games.
GrandmasLilPeeper@reddit
I can't wait until this drama face thumbnail crap is gone. It's saturated and stale.
kuug@reddit
HUB or GN have said at one point they do it because that’s what draws people in. Until the viewers’ instincts change, the thumbnail marketing stays the same
bizude@reddit
Viewers' instincts won't change unless someone takes a stand; those guys are big enough that they'll be getting plenty of views with or without these lame thumbnails.
ThankGodImBipolar@reddit
LMG has said that before too
capybooya@reddit
I feel most of their videos keep that energy throughout the whole video and not just in the thumbnail, while HUB and GN aren't that bad yet.
Neosantana@reddit
Every YouTuber has said that. But people want to rake HUB and GN over the coals.
cloud_t@reddit
When you have a good content channel, you're bound to have elitists who think only good content will get you ahead in the attention economy. They couldn't be more wrong and joke's on those elitists.
Suitable_Elk6199@reddit
People probably hold them to a higher standard because of the quality and style of the actual content. Not saying it's right/wrong, but the bar for popular opinion is a moving target.
theunspillablebeans@reddit
The more obvious answer is that their core audience remembers a time before they used such thumbnails. I haven't yet come across anyone that likes it for one type of content but not for GN and HUB.
meodd8@reddit
That’s what he just said.
Pristine-Emotion3083@reddit
This is why I hope YouTube continues with the whole thumbnail-changing development; on more than one occasion I've seen a stupid facial expression on a video and no longer wanted to watch, because it makes me feel like I'm having keys shaken in front of me.
It already is slightly a thing, but if they can figure out a full system where the algorithm learns I am far less likely to click on something with some stupid shit on it, it'll be much better.
soggybiscuit93@reddit
The algorithm cares mostly about CTR (click-through rate) and AVD (average view duration), and how those relate to various other metrics (like core audience vs new audience).
There's also thumbnail A/B testing, so a lot of YouTubers will give a video multiple thumbnails and titles, and the algorithm will show all of these combos and give feedback on which were the most successful.
Unfortunately, soyface thumbnails and clickbait titles have the best CTR, and if your business depends on views, you're gonna optimize for that.
RHINO_Mk_II@reddit
The algorithm is dense as hell, I have told it at least 100 times that I'm not interested in videos with "reacts" in the title or channel name and it keeps pushing that shit on me.
Pokiehat@reddit
Those Mr Beast-esque thumbnails provoke a response from people, which means they generate engagement.
YouTube's algorithm cares not whether that engagement is positive or negative; it's enough that you click (even to hate-watch), so long as there's an ad spot.
If you make a career out of uploading YouTube videos, you have to play the game, or viewers will just see someone else's face contorted into a stupid expression in front of a stock chart with a red line tumbling down into a flaming dumpster.
Puiucs@reddit
they talked about it. this is what the YT algorithm prefers so they have to use it.
EdliA@reddit
Because they have cultivated a certain kind of follower type
Puiucs@reddit
if you are talking about what YT wants, then yes. if you are talking about what CCs want, then no.
Fatigue-Error@reddit
And the algorithm prefers it because it works on people. Our eyes are drawn to it.
Seanspeed@reddit
It's never gonna be gone. It's literal psychology that we are hard-wired to pick out faces. Scrolling through a bunch of visual information, our brains will go to a face before anything else basically every time.
No way to fix that.
AnechoidalChamber@reddit
I fear this crap will be gone when humans stop having emotions... I wouldn't hold my breath for that one.
And it's kind of justified for this one, if I were him I'd grab my head too on this subject.
Illustrious-Delay884@reddit
HUB once again have to slowly and painfully explain the difference between benchmarking a CPU and benchmarking a system configuration
capybooya@reddit
It's clearly needed; I still see people saying stuff like "at 4K the CPU doesn't matter at all". And someone without a lot of money has an incentive to believe that, and might put all their budget into a GPU that is mismatched with their ancient platform. If they're a WoW player or the platform is way too old, they have probably made an expensive mistake with regards to the actual value they get. In the current economic climate people are more likely to make mistakes out of desperation; some people are years overdue on their planned upgrades.
imaginary_num6er@reddit
More like spending money on a 4K monitor than a better CPU, since a worse monitor will be more CPU dependent
bizude@reddit
Believe it or not, you can play less graphically demanding games like e-sports titles or (gasp!) older titles and still hit CPU-demanding framerates at 4K.
I'm running LG's 5K2K at 165 Hz and there are plenty of games I play where my 9950X3D is pushed to its potential with "only" an RTX 4070 Ti Super!
airmantharp@reddit
It’s… really all relative.
Maurhi@reddit
And yet, this thread is full of people still complaining and not getting it.
Michelanvalo@reddit
I'm just sick of them making a video about this every 6 months.
airmantharp@reddit
But you’re not sick of the people refusing to understand the most basic principle of the scientific method?
cowoftheuniverse@reddit
Likely downvoted because many think a sub about general hardware shouldn't be about this very specific topic endlessly.
soggybiscuit93@reddit
I would oppose this content being posted if the sub was in agreement, but many here still don't get it.
angry_RL_player@reddit
This is really only because Nvidia users are not tech savvy, and that has overall lowered the quality of knowledgeable discussion in the PC gaming hardware community.
Like, this would be common sense, but the average green team user just buys a prebuilt PC with an Nvidia card like a console, and simultaneously has no clue what they're talking about while acting superior for owning an Nvidia card.
Clean_Experience1394@reddit
That sure is a hot take
Plank_With_A_Nail_In@reddit
They need to show that at the resolutions and settings people actually use, they are GPU bound, not CPU bound; people on 3600Xs are GPU bound. It needs to be a disclaimer at the start of all CPU reviews.
Michelanvalo@reddit
While I don't disagree with you in general, that's not the point of a CPU review. How to determine your best upgrade path is a different kind of video. It's not part of a testing review for any individual component.
Seanspeed@reddit
They are in no position to tell people this sort of thing, though. Everybody and their grandma has a different GPU and uses a different resolution and different graphics options and whatnot. HUB cannot tell people when they will or wont be GPU-bound. At some point, this requires some basic common sense from the user to understand what they're looking at with these benchmarks.
And also to understand that there's a difference between a CPU/GPU comparison benchmark, and one that is more meant to test a game's specific performance profile(and even then, different scenes and whatnot can provide different results, so should not always be taken as gospel, either).
Vivid-Software6136@reddit
If you need your hand held that much, buy a PS5. A CPU review shouldn't have to start with a massive caveat of "if you own any of these GPUs, don't upgrade your CPU"; that should be a given based on the GPU reviews done by the same outlet or others.
The people this video is targeted at simply lack the mental horsepower to synthesize data from two different sources and make a decision based on that. They want their specific machine benchmarked by a youtuber because they are too dumb to do even the basic comparison themselves.
This is bean soup theory for pc hardware.
Jaybonaut@reddit
you mean 5090
KGon32@reddit
And it's not even hard to understand the CPU benchmarks: they show the highest FPS you can get independently of GPU, resolution, and most in-game settings.
We can't know what a user's preference is and what's useful to them. You may be a crazy guy with a 3060 who wants to play at 1080p with DLSS Performance and low settings to get the highest FPS, so for that user a 9800X3D may actually be fully utilized, or you can find a 3060 user who wants to play at native 4K, and an R5 3600 will be more than fine.
juiceous@reddit
I think the worst thing for a user is spending so much time choosing the correct parts for a new PC, and so much money, just to discover it will microstutter.
hackenclaw@reddit
I wish they also tested DLSS at 1080p. DLSS at 1080p is still a thing.
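For context (a quick sketch using the commonly cited DLSS scale factors; individual games and driver versions can deviate from these), 1080p output with DLSS is already rendering internally well below 1080p, which is the territory these CPU tests are probing:

```python
# Commonly cited DLSS scale factors (assumption: real titles may use different values).
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution and mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(1920, 1080, mode))
# Quality at 1080p output renders at roughly 1280x720 internally.
```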
Sylanthra@reddit
So if the whole point is to highlight the differences in CPUs and since upscaling uses lower resolutions anyways, why not test at 720p? Heck why not 360p. What is it about 1080p that makes it the "correct" resolution to test at?
SagittaryX@reddit
He addresses it at the end of the video. He says they stopped with 720p once there was basically no difference between 1080p and 720p results for most games, and even some games saw performance regression at the lower resolution.
Flaimbot@reddit
720p used to be the go-to around the 2010s and only got phased out due to slowly being removed from settings menus and reviewers becoming tired of this exact discussion.
technically, yes, the lower the resolution, the more precise the cpu benchmark is, because a cpu delivers the exact same fps at all resolutions in the same scene.
from 1x1 to 7680x4320 (and higher), the cpu always delivers the same fps. it's only on the gpu whether it can keep up and fill all those pixels, which is why gpu fps scales with resolution.
Appropriate_Bottle44@reddit
Because it's generally a supported resolution.
There's no reason you couldn't test CPUs at 720p, but if it's not a default supported resolution then you're just adding potential instability or bugs to the test, that's it, that's the reason.
zghr@reddit
https://www.reddit.com/r/hardware/comments/kysuk6/ive_compiled_a_list_of_claims_that_simply/
I've compiled a list of claims that simply changing resolution in certain games also changes the draw distance, making the CPU load different from resolution to resolution. What do you think of this? Should reviewers be careful about these cases?
Da_Obst@reddit
If changing the resolution would alter the draw distance, then using an upscaler should show that effect. I just made a quick test with Pragmata and changing between native UHD and FSR-UltraPerformance (720p) makes no difference for draw distance. There's only less resolution and detail, but the amount of drawn objects remains the same.
exsinner@reddit
Except that it actually does; there is a reason why different DLSS modes have different LoD bias values, and the value goes lower the lower the resolution is.
dudemanguy301@reddit
Resolution can affect LoD, MIP maps, and draw distance, as on-screen size counted in pixels and its relation to texel density can be used as a metric for heuristics. A primitive (triangle) is flagged as degenerate when it is smaller than a pixel, which can feed into culling decisions. A virtualized system like Nanite specifically tries to merge down to 1 triangle per pixel.
When using temporal upscaling, game engines specifically introduce an LoD bias to counteract this relationship. For an example of a game that forgot to do this, see Digital Foundry's PC review of Nioh 2.
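As a rough sketch of that counteracting bias (this mirrors the commonly recommended log2 formula in upscaler integration guides; exact per-engine offsets vary and are an assumption here):

```python
import math

def mip_lod_bias(render_width: int, display_width: int) -> float:
    """Negative texture LoD bias typically applied when rendering below output
    resolution, so upscaled frames sample roughly the mip detail a native frame
    would. Engines often add a small extra offset on top of this."""
    return math.log2(render_width / display_width)

# Example: 1440p internal render upscaled to a 4K output.
print(mip_lod_bias(2560, 3840))  # ~ -0.58
```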
f3n2x@reddit
I'm not sure what the point would be. When you change camera direction or scene, CPU load will also change because you'll see different objects, but it will change relatively similarly for all CPUs, with the ranking probably staying almost identical. That's just a more awkward and inconsistent version of testing a specific CPU-demanding scene/area in the game.
hawkleberryfin@reddit
Surely this will result in nuanced discourse from the most reasonable of all tech consumers, gamers.
Seanspeed@reddit
Gamers™ really are the fucking worst.
mostrengo@reddit
Once a distant family member, as we were discussing hobbies said to me "so, you are a gamer, eh?" to which I flatly replied "No. I'm not a gamer, I play video games".
Anezay@reddit
Gamers don't deserve rights.
edflyerssn007@reddit
I love how a lot of the discussion here ignores ram and how ram can have an effect too. Just forgetting to set the OC profile correctly can limit your fps as well.
Big-Rip2640@reddit
It's the same story with the games selected for benchmarks.
They used to benchmark Ashes of the Singularity, but they dropped it because no one was actually playing that game.
exsinner@reddit
That didn't stop them from using Strange Brigade over and over again until Turing cards arrived, because Turing is actually good with Vulkan.
Kairukun90@reddit
I used to fret over shit like this then I got married and I just buy either mid-high tier or high-tier shit and then just forget about it for 5-10 years. The only thing I really feel like I need to upgrade at this point is storage but I can’t justify 3-4x the price it was a year ago
Jumpy-Dinner-5001@reddit
HUB once again trying to gaslight their fans into believing their take on the situation.
They’re doing benchmarks for fan base who want to argue over who is better, not giving consumers an idea of what they can expect from a product. I just think they should be more open about that.
And I think the word "review" shouldn’t be used for videos like that. But that’s a much bigger problem overall.
HavocInferno@reddit
But their take is objectively correct.
CPU testing at high res just gives you dirty data with partial GPU limit. Then you've essentially not benchmarked the CPUs anymore.
If you want an idea of what to expect from a specific real system configuration, take the lower value of cpu-fps vs gpu-fps for the given scenario.
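A minimal sketch of that "take the lower value" rule, with entirely made-up review numbers just for illustration:

```python
# Hypothetical figures: the CPU review gives a CPU-limited fps (resolution-independent),
# the GPU review gives a GPU-limited fps at your target resolution.
cpu_limited_fps = {"9800X3D": 210, "5700X3D": 150}          # made-up numbers
gpu_limited_fps_1440p = {"RTX 4070": 120, "RTX 5090": 240}  # made-up numbers

def expected_fps(cpu: str, gpu: str) -> int:
    """Whichever component runs out of headroom first sets the frame rate."""
    return min(cpu_limited_fps[cpu], gpu_limited_fps_1440p[gpu])

print(expected_fps("5700X3D", "RTX 4070"))  # 120 -> GPU bound, a CPU upgrade won't help
print(expected_fps("5700X3D", "RTX 5090"))  # 150 -> now the CPU is the limit
```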
StickiStickman@reddit
But this is literally the opposite, because the most "real" data is simply what actual FPS you get.
HavocInferno@reddit
Do you expect reviewers to benchmark literally every possible combination of hardware? If not, clean data per major component is the next best (and only feasible) thing.
StickiStickman@reddit
Data that's entirely useless because it's so divorced from reality and the setups and settings people actually have.
soggybiscuit93@reddit
They give you max FPS for the CPUs. And they give you max FPS for the GPUs. And it's on you to extrapolate that data and understand you'll get the lower of the two values for your combo.
It's not on HUB to test 1000+ hardware permutations to spoonfeed you an answer so you can save 10 minutes of using your brain.
HavocInferno@reddit
Duuuuuude why do you people still not get this in 2026?? There are sooo many articles and videos explaining it by now, including this one you're commenting on.
You gotta put in a tiny modicum of effort to make the clean data useful for your actual setup.
Your "real" benchmarks you're asking for would describe exactly one system's performance and actually be useless to nearly everyone else. Because it'd be tainted data that you can't extract anything else from. What, do you want them to run these tests for a hundred different configs?
If you have a clean cpu result set and a clean gpu result set, you can determine any combination between the two.
Stoooop asking for shit data just because you're incapable of putting two (!) damn numbers together.
Framed-Photo@reddit
Their take is objectively correct if you accept their premise that CPUs can only be reviewed in terms of how well they perform against other CPUs.
For a lot of folks though, it doesn't really matter how much better a 9800X3D is compared to a 9600X; it matters whether the chip can hit X performance in Y game, and you don't currently get that with a lot of modern benchmarking practices.
As you said, you'd have to find HUB's review of the CPU you want, find their review of the GPU you want, then just... pray that they tested the same games in both with the same scenario, ignore the possibility of driver/software updates changing performance, and then ballpark your performance entirely.
OR HUB could look at how they test parts like this to make them more representative.
Both sides of this argument have points, and if HUB likes how they test then they can keep testing like that. But I don't think the critiques are entirely invalid.
zerinho6@reddit
The performance of a CPU is how it performs compared to another CPU. If a game has lower gains than expected between those same two CPUs, that's an issue with the developer's architecture (or lack thereof) that you're asking HUB to cover, and that the average PC gamer maybe also expects to see covered, but that was never the point of HUB/GN/LTT CPU tests, and I doubt it ever will be. The target of those tests is not such an uneducated audience.
The majority of games released today are not CPU-optimized, for more than 1000 reasons, and each game will have a different reason that might make a much weaker CPU sit closer to a stronger one than it should. (GPUs are a totally different story; engines are making quite good use of available GPU resources.)
According_Spare7788@reddit
I think the problem isn't that. The data a lot of people want is essentially "what CPU should I get?" and "should I upgrade my CPU?". So they go watch "reviews" and they are met with data that... doesn't really answer that straightforwardly, so they get confused.
Take for example: "I use a low-end system, and I've upgraded my GPU from a 2060 to a 5060 Ti. Does it make sense to go from a Ryzen 5 3600 to a Ryzen 5 7600X3D?"
They go watch the 7600X3D review, and it's being tested on a 4090 or 5090 at 1080p low. They're trying to maybe target 1440p with the 5060 Ti upgrade, maybe boost the fps with DLSS Balanced at 1440p. 7600X3D-with-5090-at-1080p data does very little to help that scenario. That's why some people are frustrated.
Problem is, it's complicated. One thing I very much agree with Steve on in the video is "not every gamer is the same, not everyone's expectation of the right level of performance is the same, and not every game is the same at every setting".
So usually when people ask those questions, my answer is "it depends on what games you play and what settings you use." However, HWU takes on this attitude that kind of headbutts people's confusion and keeps doubling down on the "you don't want this data" rhetoric, when in reality what they are presenting and what people want are fundamentally not the same. No one is able to realistically answer those questions because of all the variables.
kwirky88@reddit
I go to YouTube to find the heroes who make the zero commentary videos of obscure hardware combination recordings.
Keulapaska@reddit
Yeah, no single reviewer is going to cover every possible GPU/CPU combo in a single review, or even every CPU/GPU "tier" combo, because it would be an insane amount of data. Also, people upload all kinds of CPU/GPU combos in random game benchmarks to YouTube, so you can always broaden your horizons, not rely on a single source, and gather more data over the years that way.
Nor do users need the absolute perfect CPU/GPU combo either; like your 3600 -> 7600X3D with a 5060 Ti question, the answer can be yes, no, or "try to find a cheap AM4 X3D", depending on budget, prices, and games played.
Jumpy-Dinner-5001@reddit
No, it’s not.
HavocInferno@reddit
Yes it is and you will struggle to argue otherwise.
Jumpy-Dinner-5001@reddit
Because you don’t care about objective facts or anything.
HavocInferno@reddit
I do, which is why I am confident in my statement. Go ahead, actually make your case. So far all you've done is complained.
Jumpy-Dinner-5001@reddit
You literally haven’t made a single argument and ignore mine.
Flaimbot@reddit
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do, because that's the component that scales with resolution.
GOOGAMZNGPT4@reddit
HUB has been belligerent over this for years. They will not change their stance.
They gaslight, they mischaracterize, at the end of the day the problem is Testing = Labor = Cost.
They just want to take 2 days running 1080p testing and presenting exaggerated bar graphs to sell you CPU vs CPU fights. It's infotainment.
They can't take a month testing 50 games at 3 resolutions, 2 DLSS configurations, 3 quality settings, 3 GPU manufacturers, RTX on vs off, and real-use simulation testing while maintaining a sensible review timeline and favorable business model. They don't care if their users want more data to make more informed purchasing decisions.
Yes, we know, you can contrive testing methodology to show any result that you desire even if that result doesn't have a basis in real-world use cases. That is what they are doing. (That is what most hardware 'reviewers' are doing.) Personally I think they should test at 240p + 6090 to really sell those bar graphs.
There are a lot of ways you could contrive better testing. But that would take thought and effort and hours (labor costs).
Flaimbot@reddit
saying that you can't look at 2 numbers and tell which is smaller without saying that you can't look at 2 numbers and tell which is smaller.
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do, because that's the component that scales with resolution.
S4luk4s@reddit
Only you can decide what games you play, what settings you use, what resolution you use, how much fps you need, what GPU you're pairing with your specific CPU, etc... How could they even test for all of this? I play CPU-heavy games, so I absolutely care about having a strong CPU. Maybe you don't need one; I don't care, HUB doesn't care, only you care about that, and you need to make the decision for yourself. This is not gaslighting, wtf.
No_Guarantee7841@reddit
Consumers only want to hear that their 10-year-old CPU still delivers top-tier performance and is all that's needed for a 5090. So they will instinctively bootlick every garbage-testing-methodology site like TechPowerUp and flame everybody testing correctly like HUB.
ShadowsGuardian@reddit
They should just be open that they want to show % differences, even if a 5090 or playing at 1080p isn't what everyone will use.
I find it interesting that, with the amount of time they have, they wouldn't pick 2 or 3 common system combos to showcase CPU benchmarks... even if the result would be that the fps doesn't change.
Why do I care that x cpu is better, if that's using a 5090 at 1080p? I get what they're doing, but they could leverage a different set of videos, to still make their community happy.
Some people will sometimes think "I really need a 9800x3d, it's the best to get that fps!"... but then they don't factor in their resolution and that their gpu may not even output as many fps.
TLDR: I understand what they do, they're right, but there's a potential for another type of video series, to accommodate more approximate/realistic scenarios.
HavocInferno@reddit
Except that other scenario would shift with literally every tier of GPU viewers may still be using. Hence the need for clean data with as little GPU limit as possible.
Even 3 common configs would at best be relevant to a low double digit fraction of users.
No, people just need to learn that there's multiple factors to real system performance and they themselves need to put in the bare minimum work to figure out the numbers for their desired config.
According_Spare7788@reddit
Yeah. I think this is true.
HWU CPU "reviews" are essentially which CPU is the fastest with their testing methodology. I have no problem with that. But it's not "what CPU should you buy for your given needs and price range", because there are just way too many variables to give a concise recommendation.
ecktt@reddit
Honestly, HUB misses the whole point.
If we were to accept their arguments, we might as well use 720p to test the CPU, as upscaling could use those resolutions.
Flaimbot@reddit
it used to be the go-to around the mid to late 2010s and still should be. it only shifted because of these pointless never-ending arguments with people who don't understand that a cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do, because that's what scales with resolution.
ecktt@reddit
If running 1080p is meant to eliminate the GPU as the bottleneck, 720p should achieve that even better... since it is a CPU review.
Also, as previously mentioned, upscalers use 720p as the performance option to upscale from.
What HUB does not address is how a CPU responds as it transitions out of being the bottleneck, i.e. at 1440p and 4K. Saying one could "imply" or "assume" does not substitute for a real-world test.
People don't just watch these videos out of pure love of hardware. Many are trying to build systems on a budget. Knowing the average fps for CPU A to CPU Z with 5090 @ 1080p and GPU A to GPU Z with 9800X3D leaves the audience having to make a best guess at what (eg) a 9700X+9070XT would do at 1440p.
I have all the gratitude for HUB producing the results of the test they have but...
... ridiculing your own audience (or at least more than 1/2 the people polled) also isn't a professional way to respond, whether you believe they are wrong or right.
They said it. They already justified previously why they do things their way. Why troll the audience? Is tech news that slow?
constantlymat@reddit
After decades of opting for the best value-for-money solutions and upgrading/selling my old platform at strategically ideal times, I bought a 9800X3D so I don't have to consume a single CPU benchmarking article/video for hopefully the next five+ years.
NeroClaudius199907@reddit
Why? I feel like Nova Lake blk vs V-Cache might be even more debatable. AM5 will have platform availability, cheaper boards, support*, while Intel wants everything: ST, MT, gaming, platform. Ultra 4 will have 24C vs the Ryzen 9800X3D. If gaming is tied, content creation will come back into play.
TophxSmash@reddit
no it won't, it was never in play. That's a niche within a niche.
michaelsoft__binbows@reddit
I wonder if compiling code is also a niche within a niche according to you
TophxSmash@reddit
Uh yeah, most people who compile code do it on whatever laptop they have, and last I checked laptops aren't DIY.
Sopel97@reddit
??????????
do programmers these days really subject themselves to such torture of developing on a laptop?!
TophxSmash@reddit
Maybe not but surely its a company provided device
Relliker@reddit
It's absolutely a niche, but both personally and professionally I compile and test locally on desktops I specced, and most of my coworkers do as well.
Assaulter@reddit
"You mean majority of people with a pc aren't programmers or video editors???"
turtleship_2006@reddit
What percentage of people do you think are programmers?
NeroClaudius199907@reddit
I think it will; a lot of people will make a big deal out of Ultra 4 MT performance vs Zen 5. The Ultra 7 might be 2.7x faster than the 9800X3D. Intel would be silly not to take advantage of that in their marketing.
Friendly_Top6561@reddit
Zen 6 version of 9800x3D will have a 12-core CCD, so no.
exscape@reddit
If you're looking at next-gen Intel, shouldn't you compare with next-gen AMD?
And of course, someone buying today should consider the 9800X3D.
NeroClaudius199907@reddit
Zen 6 comes in 2027. Nova Lake is coming later this fall... Reviewers are only going to compare with products on the market.
exscape@reddit
That's not confirmed, is it? Rumors still say late 2026 or early 2027 for Zen 6. The delay itself is a rumor as far as I can tell? And though I believe it likely that it is released in 2027, the roadmap still says 2026.
Friendly_Top6561@reddit
AMD received the first Zen 6 silicon a year ago, Lisa showed Epyc Zen 6 at CES, and the node entered HVM at the end of last year, so they could potentially launch it whenever they want.
TSMC is targeting 120,000-150,000 wafer starts per month on N2 by the end of 2026, so they are probably around 60,000-75,000 now, and AMD and Apple are the biggest customers on N2.
Traditionally they go with server first, since that's where the big money is and it gives them time to bin out high-frequency parts and build stock before a desktop launch, but if they want to preempt Intel it seems they can.
Jon_TWR@reddit
You’ll be fine…I’m hanging onto my $130 5700x3d for the next 5+ years too, lol.
PrimaryRecord5@reddit
Your cpu didn’t blow up?
TheFondler@reddit
You've made this comment at least 5 times (hiding your post history is meaningless), aren't you bored of it yet? The actual failure rate of 9800X3Ds was extremely low, even in ASRock boards where there were a few hundred examples out of hundreds of thousands (millions?) of units sold.
iFarmGolems@reddit
The key is not to have ASRock Mobo.
Source: proud owner of 9800X3D
Seanspeed@reddit
I mean, I watch these kinds of things simply cuz they're interesting, not specifically as buying advice.
horatiobanz@reddit
I understand why reviewers test at 1080p, but I also think this is kind of a predatory thing to do. Most consumers will look at these numbers and think that upgrading their CPU is going to get them 25% increases in framerates in the games they play, when in reality the hundreds of dollars spent on upgrading the CPU get them a couple of fps at most.
Flaimbot@reddit
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do, because that's the component that scales with resolution.
Seanspeed@reddit
HUB has made many videos now, including this one, trying to explain this to people so this isn't the case. It's not their fault gamers are uninformed about this stuff.
capybooya@reddit
Upgrading CPU can often give a subjectively better experience that the average frame rate doesn't necessarily tell the whole story about. Periodic frame drops are really annoying and are often from CPU limitation in game mechanics. So while on average a GPU upgrade is more often the best course of action in a system that is relatively balanced, I think people underestimate CPU because a new GPU seems like the most flashy and sexy option.
Martiopan@reddit
They should stick to console gaming honestly if they can't even spend a few minutes to understand the relationship between screen resolution, GPU and their CPU (or lack thereof to be more accurate)
horatiobanz@reddit
You make it sound like it's easy to watch some review of a new processor and translate that into what upgrading would do for your system. You can understand just fine how the relationship between GPU and CPU works, and yet the relentless reviews showing absolutely massive increases in performance make you think it will in some way translate to massive increases in performance in your system, when it won't.
Martiopan@reddit
That's why you understand the BASICS. The CPU is in charge of things like game logic, AI, physics, player input, world updates, scripting. None of the graphical things the GPU is in charge of. So, knowing this, does increasing/decreasing screen resolution (a graphics thing) affect CPU performance? No. So how does one use a CPU benchmark video to make an informed decision? Cross-check it with a GPU benchmark video. If my GPU can't exceed the CPU's performance when not GPU bottlenecked, then I don't need to upgrade my CPU. But then again, you shouldn't be upgrading any component while you're still getting performance you're happy with.
Da_Obst@reddit
Imho a fair point, but still CPU tests only exist to compare different CPUs. Instead of breaking the test with a GPU limit, people should be educated on how to spot a CPU- or GPU-limit in their games, so they can make an informed decision about which component should be upgraded to improve performance.
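A rough rule-of-thumb version of that spotting exercise (heuristic only; the 95% utilization cutoff is an assumption, and frame-time graphs or per-core load tell you more):

```python
def likely_bottleneck(gpu_utilization_pct: float, fps: float, target_fps: float) -> str:
    """Crude heuristic: a GPU sitting well below full load while fps misses the
    target usually points at something upstream (typically the CPU)."""
    if fps >= target_fps:
        return "neither - already hitting the target"
    if gpu_utilization_pct >= 95:
        return "likely GPU limited - lower settings/resolution or upgrade the GPU"
    return "likely CPU limited - lowering resolution won't help much"

print(likely_bottleneck(gpu_utilization_pct=70, fps=90, target_fps=120))
```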
TophxSmash@reddit
you can lead a horse to water
StunningIndividual83@reddit
Why don't we just test at 2K, which is the modern standard, rather than forcing 1080p just because it makes AMD look better? Yes, we get that the 3D cache is good, but let's keep it relevant.
Flaimbot@reddit
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do, because that's the component that scales with resolution.
Appropriate_Bottle44@reddit
"Why don't we just test at 2k"
Because you will then frequently get useless GPU-limited results.
tilted0ne@reddit
This whole thing is built on trying to differentiate CPUs in ways that have so little basis in reality. Like okay, there's a difference if you take the fastest GPU and test it at a resolution nobody with said card will actually play at. People need to be sold on something, and reviewers need something to do other than tell people that in gaming, what CPU you have barely makes a difference in 99% of games.
Flaimbot@reddit
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do, because that's the component that scales with resolution.
errdayimshuffln@reddit
If you are gpu limited why even test a new CPU? Just test the old CPU, it will be the same result +/- 2-5%.
tilted0ne@reddit
I have a problem with how these benchmarks create a false reality. There are obviously going to be differences, but if you're going to just bench at 1080p, everything is going to look dramatic, not to mention the fact that not everyone has a 5090, but they see these crazy differences and think they need to have X, Y, Z CPU. I don't expect things to change since it is against their interest as content creators, and I respect that they have constraints.
Berntam@reddit
It's not creating a false reality. It's a tool that one can use together with others to make an informed decision. Benchmarking a CPU at its purest is more useful for a wider range of people because not everyone wants the same thing from their games. Some people prioritize performance, others prioritize graphics. I have an RTX 5080, but that doesn't mean I want to play at 4K max settings in every game. I like to have at least 100 fps and wouldn't mind turning down settings, but if a video only tests the CPUs while they are GPU bottlenecked at 60 fps, then how would I know which CPU can give me what I want?
errdayimshuffln@reddit
The problem is that you cannot do both a CPU and a GPU bench at the same time. Your realistic scenario isn't benchable, and if it is, the info gained has minimal applicability.
That tells me you don't get what a GPU bottleneck means and how testing works. Content creators didn't come up with this scheme for views. This is how it has been done since before YouTube existed. For good reason.
Framed-Photo@reddit
Personally, I want to see CPU reviews focus specifically on finding the CPU bottleneck points rather than just testing with a standardized suite. We kind of get that now, but in a lot of testing they end up being GPU bound anyway and just kind of throw in the towel, even when there are ways to reduce the GPU load.
Like in path-traced or RT workloads, how high could a 9800X3D go if you just kept lowering the resolution scale? You wouldn't know from watching HUB because they just test at 1080p native or maybe DLSS Quality and that's really it.
At least with testing like that you could know for a specific CPU that your cut-off with X game is at Y framerate, no messing with trying to figure out if they were GPU bound in that test or not.
bubblesort33@reddit
Only way to do that is 480p testing lol. Make Steve's life worse.
Framed-Photo@reddit
Or to just try to adjust their benchmark passes so that the current GPU they use doesn't get GPU bound on modern CPUs?
For example, in this video they tested Cyberpunk with RT Ultra at 1080p native and found it to be GPU bound, so why not use a lower resolution to avoid that? Then we could see where the CPU is limited in that scenario, which would be helpful when more powerful GPUs come out.
S4luk4s@reddit
If you buy a better cpu, it won't be bottlenecking your Gpu for a longer time. Also, there are many cpu heavy games, which makes having a strong cpu absolutely a real world scenario.
bubblesort33@reddit
Sometimes you build a system that you might want to hold onto for 6 years. In that case you want to know if you should get a 5800x3D or a 12700k.
NeroClaudius199907@reddit
CPUs do make a difference... It depends on the games you're playing. You saw that even at 1440p native, in Space Marine 2 the 9800X3D was 38% faster than the 5800X3D. The gap might increase if the user gets a 6090.
But yeah, a 7500F OCed is enough for the vast, vast majority of gamers.
tilted0ne@reddit
I never said CPUs don't make a difference. It's overstated and warped by using a 5090 at lower res. 5090 is a 4k card and the performance tax from 1440p to 4k is huge.
NeroClaudius199907@reddit
You're right, but you also have to understand that people will be upscaling, and some games will be CPU bound no matter what. So CPUs are still important. People aren't hiding that the higher you go, the more GPU bound you'll be.
Techpowerup 4k tests
Intel Core Ultra 7 270K Plus Review - Intel's Fastest Gaming CPU - Game Tests 4K/RTX 5090 | TechPowerUp
tecedu@reddit
Something that gets forgotten when people ask for higher-res benchmarks is that we simply want to know what is the best/cheapest option for our resolution, e.g. "a 5800X3D and a 270K perform the same at 4K"; everyone says that, and then the benchmarks are missing. Not to mention the fact that in some games it does matter; inferring it based on 1080p only gets you the fastest CPU, not what is good enough.
Flaimbot@reddit
cpu delivers the same fps at all resolutions in the same scenario.
it's on the gpu if it can deliver that much or not, which is based on the resolution.
the lower number of these two is what you see on the screen.
to figure out which of the two numbers is smaller you only need the gpu review to cover the different resolutions, which they do.
cadaada@reddit
This has been a tiresome discussion, but when buying a 12100F/12400F I wanted to see it paired with a 3060/3070/4060 to understand exactly what I would get at both resolutions, nothing else...
emn13@reddit
That's only really all that helpful if all else is also equal, most importantly game settings and game selection, but potentially including other stuff like RAM speed and amount, and sometimes game/windows/driver versions. Since it's not that hard to adjust GPU load by changing settings, it's just not that useful to test exactly your hardware combo. It's still not going to tell you what you actually want to know because you're almost certainly going to eventually choose to play different games than were tested, and doing that means you're conflating GPU with CPU bottlenecks so that you'll never know that (say) game X might have run 50% faster by changing the graphics settings but game Y was already at its cap since it was CPU limited.
It's just really not helpful to test specifically both bottlenecks at once unless you're really careful about the software settings and versions and stuff, and let's be honest, nobody that cares for this simplification actually is going to be.
cadaada@reddit
Fair enough.
But that's my point. Not knowing even roughly where the bottlenecks are for a low-end system gives no knowledge to anyone trying to buy it either.
Getting an RX 6600 with a 5600 (which was a common combo at the time) and doing some benchmarks would give many people an idea of what their system might handle, because they will not go a tier higher in either CPU or GPU, so it does not matter which part of the system is bottlenecking.
bubblesort33@reddit
That's such a weird, and specific niche case, like 1 out of 1000 people into hardware would actually be interested in that. It's like asking what a chocolate, lemon, vanilla cake would taste and look like.
cadaada@reddit
That's why there are thousands of videos out there with specific combinations...
I'm talking about realistic testing. No one will buy a 4090 for a 12100/12400/12600 setup; they might get a 3070, 4070, etc., but it's not useful to see 700 fps on a 12100F if these benchmarkers are testing games that would get capped by the GPU first.
S4luk4s@reddit
Then search for it on YouTube and don't expect hub to test your specific setup ideas.
cadaada@reddit
i'm just asking for realistic testings.
iDontSeedMyTorrents@reddit
Yes, everyone would love to see their exact system configuration benchmarked in all the games they actually play and with the same settings they use.
And that's useless information to everyone else, a colossal waste of time and resources, and is a system benchmark, not an individual component test.
NeroClaudius199907@reddit
techpowerup tested 12900k with 3080
Intel Core i9-12900K Review - Fighting for the Performance Crown - Game Tests 1440p / RTX 3080 | TechPowerUp
With a 12400-12700K paired with a 3060-3070 you'll maybe see a 5% perf difference, and a bit more in CPU-demanding games.
But 3060-3070 users will mostly be playing at 1080p-1440p with upscaling. So maybe up to an 8% difference.
Intel Core i9-12900K Review - Fighting for the Performance Crown - Game Tests 1080p / RTX 3080 | TechPowerUp
JonWood007@reddit
I admire Steve for putting in the effort here, but I wish he kinda wouldn't bother. The people he's arguing with are idiots and bad-faith actors looking to justify their purchasing decisions or push their brand tribalism.
I'm pretty sure these talking points originated as AMD marketing back around 2017 when the first gen ryzens kinda sucked at gaming compared to intel so they spun it as "well, you dont need high frame rates because you're GPU bottlenecked anyway!"
The problem is that your CPU is your ceiling, and as games become more demanding over time, what's an acceptable CPU today becomes an underpowered one that bottlenecks you tomorrow, and what's an overpowered CPU today becomes an acceptable one tomorrow. When you're buying a CPU, it's good to know the full capabilities of it. Intelligent people are supposed to take for granted that yeah, if their GPU can only run a game at 100 FPS and their CPU can do 200, they'll only get...100 FPS....
But here's the thing. Imagine the next title gets like 50 FPS. Okay, well, if your CPU can only do 100 last time and now it's doing 50, that's bad. Meanwhile the 200 FPS CPU is now doing 100. See? Your CPU lasts longer when you overbuy. So many people make a mistake with their builds to overemphasize GPU and underemphasize CPU. They'll pick the cheapest CPU that can barely run their existing GPU at high-ultra settings, not understanding that in 2-4 years, games are gonna become more demanding and their barely adequate 2026 CPU struggles to run 2029 era games.
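A toy illustration of that headroom argument (every number is invented, and the yearly growth in CPU demand is purely an assumption):

```python
# Toy model: assume CPU demand in new releases grows ~20% per year (pure assumption;
# real games vary wildly). The CPU with more headroom today stays above a 60 fps
# floor for more years.
GROWTH = 1.20

def years_above(cpu_fps_today: float, floor_fps: float = 60.0) -> int:
    """Number of yearly 'demand bumps' a CPU can absorb before dropping below the floor."""
    years = 0
    fps = cpu_fps_today
    while fps / GROWTH >= floor_fps:
        fps /= GROWTH
        years += 1
    return years

print(years_above(100))  # 2 years under this toy model
print(years_above(200))  # 6 years under this toy model
```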
There is some nuance here. More cores can shake things up, historically. Games used to only use 1 core... until they required 2; they used to use 2 cores until they required 4. They used to only use 4 until they required 6, then 8, then 12. And now 12 threads is the sweet spot, but I wouldn't rule out cheap 6c/12t CPUs being bottlenecked in the future while 8/16 ones handle things better. It's the nature of progress. Games become more demanding, and if you want to buy hardware that lasts the longest amount of time while hitting target frame rates, it's better to emphasize a stronger CPU and a weaker GPU.
If anything, while comparing GPUs at ultra settings at 4K is useful for stressing them and seeing the differences between them to make an informed purchasing decision, I'm the kind of gamer who just wants to run 60+ FPS smoothly, and I'll dial back quality settings to do it. I won't run games at ultra, because I don't care about ultra graphics and definitely won't run them that way at 30 FPS. I'll play at medium or something instead. Or even low. Or low with FSR on, etc. Whatever it takes. But knowing how GPUs compare in head-to-head contests when you buy them helps you make informed purchasing decisions about what will likely last longer. For example, a 6650 XT will last me longer than, say, a 3050 (Nvidia's closest competition price-wise at the time) would. It doesn't matter if it's above my framerate or if both do the job. I want the one that does the better job, because I tend to hold on to GPUs an average of 5 years and CPUs an average of 6-7.
I wanna know, from reviews, how my hardware will likely age compared to other products. As such, bigger number is generally better, with some nuance. We can argue VRAM, APIs, and drivers can impact GPU longevity, but at the same time, core count, instruction sets, and general IPC/single thread performance in games can influence CPUs as well.
But if you look at older CPUs and GPUs...and how THEY aged, you can get a good idea of how future ones will age.
No one NEEDS a X3D CPU now, but that extra single thread oomph will generally make them more long lasting, assuming they have enough multithreaded oomph to back it up. Compare this to how a 7700k beat the 1600s/1700s at the time. Or how the 2500k beat the FX 8150.
Alternatively, we can look at how the 7600k vs the 1600x would age just by comparing the E8400 and Q6600 debate of the past, and I think that was a pretty accurate comparison.
If you're comparing a 5800X3D vs a 9800X3D, it's like comparing say, an i7 920 vs an i7 3770k. One lasted about 6 years, the other closer to 8, etc.
Ya know?
So many people make stupid short sighted decisions in building. And the people who think that 1080p or even better IMO, 720p doesn't matter are the same people who historically thought that a 7600k was fine because "4 cores is all you need right now", or that the 1700 was the GOAT because eventually it'll beat the 7700k because "moar coars". On the one hand, going from a quad core standard, we very quickly went up to a 12 thread standard within like 3 years. So you got burned if you thought 4 threads was enough. And because battlefield is relevant here, let's discuss that series in particular.
Battlefield is my main multiplayer game. The current BF title is very well optimized, but that's after MASSIVE backlash over 2042's many performance issues. Historically, BF games are the big CPU killers. And if I'm gonna pour hundreds into a game like that, I wanna run it smoothly.
If you bought a 7600k and then bought BF5 like 1.5 years later, you'd have realized you made a major mistake. The game was horribly bottlenecked on quad core CPUs and stuttered like mad. I turned off the 7700k's hyper threading just to see what it was like on a true quad core and it was a horrible stuttery mess. Same with warzone, which launched in 2020.
That's ALWAYS been the case though. BF2042 was the game that pushed my 7700k where that started stuttering. People who had the venerated 2500k started struggling with battlefield 1, while 2600k owners had a good experience.
I struggled like mad on my Phenom II X4 965 on BF4.
BF3 and Bad Company 2 were the games that ultimately settled the E8400 vs Q6600 debates. People kept emphasizing dual cores because "moar FPS", but then the quad cores had more power and could run the games smoothly.
Point is, if you listen to idiots online and their CPU advice, I'm going to be blunt, if you wanna play battlefield, you're gonna be hurting.
GPU, not so much. That GPU that barely handles ultra today, well, you can run stuff on high, medium, low, very low, very low with a resolution scale or upscaling on, etc. It's a scalable experience. But as someone who spent hundreds of hours stuttering because of a CPU bottleneck in BF4 back in the day, NO AMOUNT of settings changes will get you from 30 to 60 FPS. You get what you get and if your CPU doesn't do the job, it doesn't do the job.
Choose your next CPU wisely. It might mean the difference between a smooth 80 FPS experience or a stuttery 50 FPS one. And gee, maybe knowing one CPU does 200 FPS today while the other does like 130 would help with that somewhat (barring nuance like core counts).
McCullersGuy@reddit
While Steve is generally correct on this bottlenecking point, enough pretending HUB is a pure-data YouTube channel. It's a buying-guide channel based on their previous actions, and this view on CPUs falls in line with that, as most owners will never see anywhere close to the huge % improvements they show in a 5090-at-1080p setup.
Seanspeed@reddit
They do both, ffs. Why do people always have this need to fit everything into some single black and white box? It's so lazy.
ishsreddit@reddit
The people who want 4k heavy RT/PT to be a standard for CPU limited benchmarks are the same people who buy high sell low. And also buy more to save more
gemmy99@reddit
That's why I like the other Steve, TechPowerUp, and CB for finding the best CPU/GPU combo for me (best value), because I don't buy top-of-the-line CPUs/GPUs, and I still use 1440p, or 4K on the TV.
They just do tests that prove which is the best CPU/GPU. Other combinations, not so much.
ResponsibleJudge3172@reddit
Test 5 year old games on 720p with 5 year old CPUs vs 1080p with today's CPUs. You would be surprised
lutel@reddit
Millions of flies cannot be wrong.